
modelee / t5-large-ssm

README
---
language: en
datasets:
- c4
- wikipedia
license: apache-2.0
---

Google's T5 for Closed Book Question Answering.

The model was pre-trained using T5's denoising objective on C4 and then further pre-trained using REALM's salient span masking objective on Wikipedia.
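
As a rough illustration, the checkpoint can be loaded with the Hugging Face transformers library. The model ID `google/t5-large-ssm` below is an assumption based on the upstream checkpoint name; substitute a local path to this repository if needed.

```python
# Minimal sketch of loading the pre-trained checkpoint with transformers.
# "google/t5-large-ssm" is assumed to be the upstream model ID; replace it
# with a local clone of this repository if you are working offline.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-large-ssm")
model = T5ForConditionalGeneration.from_pretrained("google/t5-large-ssm")
```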

Note: This model should be fine-tuned on a question-answering downstream task before it is usable for closed-book question answering.
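
A minimal sketch of what such fine-tuning looks like: the input is the bare question and the target is the answer string, with no retrieved context. The example pair and single training step below are illustrative only, not the paper's training setup.

```python
# Hedged sketch: closed-book QA fine-tuning trains on (question -> answer)
# text pairs with no supporting passage in the input.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-large-ssm")
model = T5ForConditionalGeneration.from_pretrained("google/t5-large-ssm")

# Illustrative QA pair; a real run would iterate over a QA dataset
# such as Natural Questions, WebQuestions, or TriviaQA.
inputs = tokenizer("When was Franklin D. Roosevelt born?", return_tensors="pt")
labels = tokenizer("January 30, 1882", return_tensors="pt").input_ids

# One training step: the tokenized answer serves directly as the labels.
loss = model(**inputs, labels=labels).loss
loss.backward()
```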

Other Community Checkpoints: here

Paper: How Much Knowledge Can You Pack Into the Parameters of a Language Model?

Authors: Adam Roberts, Colin Raffel, Noam Shazeer

Abstract

It has recently been observed that neural language models trained on unstructured text can implicitly store and retrieve knowledge using natural language queries. In this short paper, we measure the practical utility of this approach by fine-tuning pre-trained models to answer questions without access to any external context or knowledge. We show that this approach scales with model size and performs competitively with open-domain systems that explicitly retrieve answers from an external knowledge source when answering questions. To facilitate reproducibility and future work, we release our code and trained models at https://goo.gle/t5-cbqa.

[model image]
