---
datasets:
- squad_v1
license: mit
---

# LONGFORMER-BASE-4096 fine-tuned on SQuAD v1

This is the longformer-base-4096 model fine-tuned on the SQuAD v1 dataset for the question answering task.

The Longformer model was created by Iz Beltagy, Matthew E. Peters, and Arman Cohan from AllenAI. As the paper explains:

> Longformer is a BERT-like model for long documents.

The pre-trained model can handle sequences of up to 4096 tokens.
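As a quick sanity check, the tokenizer of the upstream base checkpoint reports this limit. A minimal sketch; `allenai/longformer-base-4096` is the base model this card builds on, and the printed value is the expected one:

```python
from transformers import AutoTokenizer

# load the tokenizer for the upstream base checkpoint
tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
print(tokenizer.model_max_length)  # expected: 4096
```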

## Model Training

This model was trained on a Google Colab V100 GPU. You can find the fine-tuning notebook here: Open In Colab.

A few things to keep in mind while training Longformer for the QA task: by default, Longformer uses sliding-window local attention on all tokens, but for QA all question tokens should have global attention. For more details on this, please refer to the paper. The `LongformerForQuestionAnswering` model sets this up automatically for you. To allow it to do that:

1. The input sequence must have three sep tokens, i.e. the sequence should be encoded like this: `<s> question</s></s> context</s>`. If you encode the question and context as an input pair, the tokenizer already takes care of this, so you shouldn't have to worry about it (see the sketch after this list).
2. `input_ids` should always be a batch of examples.
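A minimal sketch of point 1, using illustrative example strings: encoding question and context as a pair produces the required three sep tokens automatically.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("valhalla/longformer-base-4096-finetuned-squadv1")

# encoding question and context as a pair inserts the extra </s></s> separator
encoding = tokenizer("What is Longformer?", "Longformer handles long documents.", return_tensors="pt")
print(tokenizer.decode(encoding["input_ids"][0]))
# expected: <s>What is Longformer?</s></s>Longformer handles long documents.</s>
```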

## Results

| Metric      | Value   |
|-------------|---------|
| Exact Match | 85.1466 |
| F1          | 91.5415 |

## Model in Action 🚀

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("valhalla/longformer-base-4096-finetuned-squadv1")
model = AutoModelForQuestionAnswering.from_pretrained("valhalla/longformer-base-4096-finetuned-squadv1")

text = "Huggingface has democratized NLP. Huge thanks to Huggingface for this."
question = "What has Huggingface done ?"
encoding = tokenizer(question, text, return_tensors="pt")
input_ids = encoding["input_ids"]

# default is local attention everywhere
# the forward method will automatically set global attention on question tokens
attention_mask = encoding["attention_mask"]

outputs = model(input_ids, attention_mask=attention_mask)
start_scores, end_scores = outputs.start_logits, outputs.end_logits
all_tokens = tokenizer.convert_ids_to_tokens(input_ids[0].tolist())

answer_tokens = all_tokens[torch.argmax(start_scores) : torch.argmax(end_scores) + 1]
answer = tokenizer.decode(tokenizer.convert_tokens_to_ids(answer_tokens))
# output => democratized NLP
```
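If you ever need to set global attention yourself (for example, with other Longformer heads that don't do it automatically), the forward pass accepts a `global_attention_mask`. A minimal sketch reusing the tensors from the snippet above; which tokens to mark is your choice, the `<s>` token here is just an example:

```python
import torch

# 1 marks tokens that get global attention; 0 keeps local sliding-window attention
global_attention_mask = torch.zeros_like(input_ids)
global_attention_mask[:, 0] = 1  # e.g., give the <s> token global attention

outputs = model(
    input_ids,
    attention_mask=attention_mask,
    global_attention_mask=global_attention_mask,
)
```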

`LongformerForQuestionAnswering` isn't yet supported in `pipeline`. I'll update this card once the support has been added.
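Once that support lands, usage should presumably follow the standard question-answering pipeline pattern. A sketch under that assumption only, not something this card claims works today:

```python
from transformers import pipeline

# assumption: pipeline support for LongformerForQuestionAnswering has been added
qa = pipeline("question-answering", model="valhalla/longformer-base-4096-finetuned-squadv1")
result = qa(
    question="What has Huggingface done ?",
    context="Huggingface has democratized NLP. Huge thanks to Huggingface for this.",
)
print(result["answer"])
```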

Created with ❤️ by Suraj Patil

MIT License
-----------

Copyright (c) 2020 Suraj Patil (https://huggingface.co/valhalla/longformer-base-4096-finetuned-squadv1)

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
