# RadBERT

**Repository Path**: modelee/RadBERT

## Basic Information

- **Project Name**: RadBERT
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 3
- **Forks**: 0
- **Created**: 2023-05-24
- **Last Updated**: 2025-06-16

## Categories & Tags

**Categories**: llm

**Tags**: None

## README

---
widget:
- text: "low lung volumes, [MASK] pulmonary vascularity."
tags:
- fill-mask
- pytorch
- transformers
- bert
- biobert
- radbert
- language-model
- uncased
- radiology
- biomedical
datasets:
- wikipedia
- bookscorpus
- pubmed
- radreports
language:
- en
license: mit
---

RadBERT was continuously pre-trained on radiology reports, starting from a BioBERT initialization.

## Citation

```bibtex
@article{chambon_cook_langlotz_2022,
  title={Improved fine-tuning of in-domain transformer model for inferring COVID-19 presence in multi-institutional radiology reports},
  DOI={10.1007/s10278-022-00714-8},
  journal={Journal of Digital Imaging},
  author={Chambon, Pierre and Cook, Tessa S. and Langlotz, Curtis P.},
  year={2022}
}
```
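The `fill-mask` widget text in the metadata above shows how the model can be queried for masked-token predictions. A minimal sketch using the `transformers` pipeline follows; the hub identifier `StanfordAIMI/RadBERT` is an assumption (this page is a mirror, so substitute the checkpoint's actual path if it differs):

```python
from transformers import pipeline

# NOTE: the model identifier is an assumption; replace it with the
# actual hub path of this checkpoint if it differs.
fill = pipeline("fill-mask", model="StanfordAIMI/RadBERT")

# Query with the widget example from the model card. [MASK] is the
# BERT-style mask token used by this uncased vocabulary.
predictions = fill("low lung volumes, [MASK] pulmonary vascularity.")

# Each prediction is a dict with the filled token and its score.
for p in predictions:
    print(f"{p['token_str']!r}  (score={p['score']:.3f})")
```

By default the pipeline returns the top five candidate tokens for the `[MASK]` position, each with a softmax probability and the completed sentence.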