diff --git a/models/nlp/plm/bert_base_squad/ixrt/README.md b/models/nlp/plm/bert_base_squad/ixrt/README.md
index 901ba72764a1691a6b1501fed1bfc34463a06f9d..8f595b3b40224f36999375d55a46a5f8f1a327a1 100644
--- a/models/nlp/plm/bert_base_squad/ixrt/README.md
+++ b/models/nlp/plm/bert_base_squad/ixrt/README.md
@@ -90,4 +90,4 @@ bash script/inference_squad.sh --bs 32 --int8
 
 ## Referenece
 
-- [bert-base-uncased.zip](https://drive.google.com/file/d/1_DJDdKBanqJ6h3VGhH78F9EPgE2wK_Tw/view?usp=drive_link)
+- [bert-base-uncased.zip](https://drive.google.com/file/d/1_q7SaiZjwysJ3jWAIQT2Ne-duFdgWivR/view?usp=drive_link)
\ No newline at end of file
diff --git a/models/nlp/plm/bert_large_squad/ixrt/README.md b/models/nlp/plm/bert_large_squad/ixrt/README.md
index dcb9f8c5cebd56e7fa5b33c15ce1d615a405a70f..26b1231fc0e1400b2d8d4e9b102544d70ddaa7bd 100644
--- a/models/nlp/plm/bert_large_squad/ixrt/README.md
+++ b/models/nlp/plm/bert_large_squad/ixrt/README.md
@@ -15,8 +15,7 @@ BERT is designed to pre-train deep bidirectional representations from unlabeled
 
 ### Prepare Resources
 
-Get `bert-large-uncased.zip` from [Google
-Drive](https://drive.google.com/file/d/1eD8QBkbK6YN-_YXODp3tmpp3cZKlrPTA/view?usp=drive_link)
+Get `bert-large-uncased.zip` from [Google Drive](https://drive.google.com/file/d/1eD8QBkbK6YN-_YXODp3tmpp3cZKlrPTA/view?usp=drive_link)
 
 ### Install Dependencies