# vert-papers
**Repository Path**: greitzmann/vert-papers
## Basic Information
- **Project Name**: vert-papers
- **Description**: This repository contains code and datasets related to entity/knowledge papers from the VERT (Versatile Entity Recognition & Disambiguation Toolkit) project, by the Knowledge Computing group at Microsoft Research Asia (MSRA).
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2020-11-10
- **Last Updated**: 2020-12-19
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
This repository contains code and datasets related to entity/knowledge papers from the VERT (**V**ersatile **E**ntity **R**ecognition & Disambiguation **T**oolkit) project, by the [Knowledge Computing](https://www.microsoft.com/en-us/research/group/knowledge-computing/) group at Microsoft Research Asia (MSRA).
# Recent Papers:
* [UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data](https://www.ijcai.org/Proceedings/2020/543), *Qianhui Wu, Zijia Lin, Börje F. Karlsson, Biqing Huang, Jian-Guang Lou*, IJCAI 2020.
Repository: **https://github.com/microsoft/vert-papers/tree/master/papers/UniTrans**
* [Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language](https://arxiv.org/abs/2004.12440), *Qianhui Wu, Zijia Lin, Börje F. Karlsson, Jian-Guang Lou, Biqing Huang*, ACL 2020.
Repository: **https://github.com/microsoft/vert-papers/tree/master/papers/SingleMulti-TS**
* [Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources](https://arxiv.org/abs/1911.06161), *Qianhui Wu, Zijia Lin, Guoxin Wang, Hui Chen, Börje F. Karlsson, Biqing Huang, Chin-Yew Lin*, AAAI 2020.
Repository: **https://github.com/microsoft/vert-papers/tree/master/papers/Meta-Cross**
* [Improving Entity Linking by Modeling Latent Entity Type Information](https://arxiv.org/abs/2001.01447), *Shuang Chen, Jinpeng Wang, Feng Jiang, Chin-Yew Lin*, AAAI 2020.
* [Towards Improving Neural Named Entity Recognition with Gazetteers](https://www.aclweb.org/anthology/P19-1524/), *Tianyu Liu, Jin-Ge Yao, Chin-Yew Lin*, ACL 2019.
Repository: **https://github.com/lyutyuh/acl19_subtagger**
* [CAN-NER: Convolutional Attention Network for Chinese Named Entity Recognition](https://arxiv.org/abs/1904.02141), *Yuying Zhu, Guoxin Wang, Börje F. Karlsson*, NAACL-HLT 2019.
Repository: **https://github.com/microsoft/vert-papers/tree/master/papers/CAN-NER**
* [GRN: Gated Relation Network to Enhance Convolutional Neural Network for Named Entity Recognition](https://arxiv.org/abs/1907.05611), *Hui Chen, Zijia Lin, Guiguang Ding, Jian-Guang Lou, Yusen Zhang, Börje F. Karlsson*, AAAI 2019.
Repository: **https://github.com/HuiChen24/NER-GRN**
# Related Software:
* **[microsoft/Recognizers-Text](https://github.com/microsoft/Recognizers-Text)** - Open-source library that provides recognition and normalization/resolution of **numbers**, **units**, **date/time**, and **sequences** (e.g., phone numbers, URLs) expressed in multiple languages.
# Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a
Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide
a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.