# Subspace-Tuning

**Repository Path**: xuyangyan/Subspace-Tuning

## Basic Information

- **Project Name**: Subspace-Tuning
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-09-25
- **Last Updated**: 2024-09-25

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README
## Introduction

Welcome to our repository, which contains a diverse collection of subspace tuning methods for Parameter-Efficient Fine-Tuning (PEFT). Subspace tuning methods are essential for adapting large pre-trained models to specific tasks with minimal changes to the original parameters. Subspace tuning endeavors to identify the maximal projection of the optimal weight $\mathbf{W}^{*}$ onto the subspace spanned by the bases of $\phi(\mathbf{W})$, where $\phi(\mathbf{W})$ denotes the subspace transformation of the original frozen weight $\mathbf{W}$. For more details, please refer to [the original paper](https://arxiv.org/abs/2407.05417).

We aim to provide a comprehensive resource for researchers and practitioners in this field and to facilitate easy integration into your projects. Whether you are here to find resources for your projects or to contribute, we hope this repository will be a valuable and inspiring part of your research journey.

### Information Box

This repository also contains some of the other projects we have worked on, which might have led you here.

- [**LoRA-Dash**](https://chongjiesi.site/full-publications/2024-arxiv-lora-dash/): Unleashing the Power of Task-Specific Directions in Parameter-Efficient Fine-Tuning.
- [**FLoRA**](https://chongjiesi.site/full-publications/2024-arxiv-flora/): FLoRA: Low-Rank Core Space for N-dimension.

## News

- **[2024.09.04]** Added ***method*** LoRA-Dash and ***task*** Subject-driven Generation to our repo!
- **[2024.08.18]** Added ***task*** Math Reasoning to our repo!
- **[2024.07.22]** Added ***methods*** PISSA, MiLoRA, and Spectral Adapter to our repo!
- **[2024.07.09]** Repository constructed!

## Todo List

- Nothing to do yet.

## Usage

To use the algorithms in this repository, clone the repository and install the necessary dependencies.

1. Clone this repository:

   ```bash
   git clone https://github.com/Chongjie-Si/Subspace-Tuning.git
   cd Subspace-Tuning
   ```

2. Follow the instructions in each folder.
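For intuition, the projection described in the introduction can be sketched numerically. The snippet below is our own illustration (all variable names are ours, not from this repository), assuming the bases of $\phi(\mathbf{W})$ are given as the columns of a matrix:

```python
import numpy as np

# Hypothetical illustration: project an "optimal" weight W_star onto the
# subspace spanned by the columns of phi_W, a transformation of the frozen
# weight W. Shapes and names are ours, chosen only for demonstration.
rng = np.random.default_rng(0)
W_star = rng.standard_normal((8, 8))  # stand-in for the optimal weight W*
phi_W = rng.standard_normal((8, 3))   # stand-in for the bases of phi(W)

# Orthonormal basis Q for span(phi_W), then the projection Q Q^T W_star.
Q, _ = np.linalg.qr(phi_W)
projection = Q @ Q.T @ W_star

# The projection is the closest subspace element in Frobenius norm,
# so the residual is orthogonal to every basis direction.
residual = W_star - projection
assert np.allclose(Q.T @ residual, 0.0)
```

Subspace tuning methods then differ in how they choose or learn $\phi(\mathbf{W})$ so that this projection captures as much of the optimal weight as possible.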
## Tasks

We support several tasks, including:

- Natural Language Understanding ([NLU](./NLU/))
- Natural Language Generation ([NLG](./NLG_QA/))
- Question Answering ([QA](./NLG_QA/))
- Commonsense Reasoning ([CR](./CR_MR/))
- Math Reasoning ([MR](./CR_MR/))
- Subject-driven Generation ([SdG](./SdG/))
- ...

## Algorithms

Based on subspace tuning theory, PEFT methods are classified into three categories: reconstruction-based, extension-based, and combination-based. We implement the different methods mainly in [loralib/](./loralib/loralib/).

| Category | Algorithm | Code | Paper |
|---|---|---|---|
| Reconstruction | SAM-PARSER | Code | 2024 AAAI |
| | IA3 | Code | 2022 NeurIPS |
| | SSB | Code | 2024 arXiv |
| | SSL | Code | 2024 arXiv |
| | BitFit | N/A | 2022 ACL |
| | Prefix-tuning | Code | 2021 ACL |
| | Prompt-tuning | Code | 2021 EMNLP |
| | P-tuning | Code | 2022 ACL |
| | PISSA | Code | 2024 arXiv |
| | MiLoRA | Code | 2024 arXiv |
| Extension | LoRA | Code | 2022 ICLR |
| | AdaLoRA | Code | 2023 ICLR |
| | FLoRA | Code | 2024 arXiv |
| | MoSLoRA | Code | 2024 arXiv |
| | TriLoRA | Code | 2024 arXiv |
| | Adapter (Houlsby) | N/A | 2019 ICML |
| | Adapter (Pfeiffer) | N/A | 2021 ACL |
| | Parallel Adapter | Code | 2022 ICLR |
| Combination | DoRA | Code | 2024 ICML |
| | SVDiff | Code | 2023 ICCV |
| | Spectral Adapter | Code | 2024 arXiv |
| | LoRA-Dash | Code | 2024 arXiv |

More algorithms and updates are continually added...
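To make the extension-based category concrete, here is a minimal sketch of the low-rank update that methods such as LoRA apply to a frozen linear layer. This is our own illustration under stated assumptions (class and attribute names are hypothetical), not the actual API of this repository's `loralib/`:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Minimal LoRA-style layer (our sketch, not this repo's loralib API):
    y = x W^T + (alpha / r) * x A^T B^T, with W frozen and A, B trainable."""
    def __init__(self, in_features: int, out_features: int, r: int = 4, alpha: int = 8):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad = False               # freeze W
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))  # zero init: starts as identity update
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)

layer = LoRALinear(16, 8)
out = layer(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 8])
```

Because `B` is zero-initialized, the layer reproduces the frozen base output at the start of training, and only the small `A` and `B` matrices receive gradients.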