MindSpeed-LLM
MindSpeed-LLM is a distributed training framework for large language models built on the Ascend ecosystem. It provides Huawei Ascend chip ecosystem partners with an end-to-end LLM training solution, covering distributed pre-training, distributed instruction fine-tuning, distributed preference alignment, and the accompanying development tool chain: data preprocessing, weight conversion, online inference, and benchmark evaluation.

Note: the repository was renamed from ModelLink to MindSpeed-LLM, and the package was renamed from modellink to mindspeed_llm.


NEWS !!! 📣📣📣

🚀🚀🚀 Day-one support for the Qwen3 model series! 🚀🚀🚀

Qwen3 model series 😊

🚀🚀🚀 DeepSeek-R1 series features are rolling out! 🚀🚀🚀

DeepSeek-R1-ZERO Qwen-7B 😊

DeepSeek-R1-ZERO Qwen-32B

🚀🚀🚀 The full DeepSeek-V3-671B model suite is now available! 🚀🚀🚀

Data processing: pre-training and instruction fine-tuning 😊

Weight conversion 😊

Pre-training 😊

Fine-tuning: full-parameter, LoRA, and QLoRA fine-tuning 😊

🚀🚀🚀 The DeepSeek-R1-Distill model series is now available! 🚀🚀🚀

DeepSeek-R1-Distill-Qwen 😊

DeepSeek-R1-Distill-LLaMA 😊

Version compatibility

The dependencies of MindSpeed-LLM are listed below; for installation steps, see the installation guide.

| Dependency | Version |
| --- | --- |
| Ascend NPU driver | in-development version |
| Ascend NPU firmware | in-development version |
| Toolkit (development kit) | in-development version |
| Kernel (operator package) | in-development version |
| NNAL (Ascend Transformer Boost acceleration library) | in-development version |
| Python | >= 3.8 |
| PyTorch | 2.1 |
| torch_npu plugin | in-development version |
| apex | in-development version |

Note: the master branch depends on in-development versions of the driver and the CANN package, so some new features on master may not work with older dependency versions. For a stable release, switch to a commercial-release branch and install the matching dependency versions.

Built-in models

MindSpeed-LLM ships with built-in support for pre-training, fine-tuning, and preference alignment of more than a hundred widely used dense and MoE large models. The built-in model list is given below.

| Model category | Model list |
| --- | --- |
| Dense models | Dense |
| MoE models | MOE |

Training schemes and features

MindSpeed-LLM provides distributed pre-training, distributed fine-tuning, and distributed preference-alignment training schemes.

  • legacy is the early Megatron scheme and differs from the newer mcore scheme in code design. The legacy scheme supports neither MoE models nor the long-sequence CP (context parallelism) slicing scheme; we recommend preferring the mcore scheme.

  • Released indicates whether a feature has been commercially released. Features that are not commercially released are still under internal development and are not recommended for developers to use.

Distributed pre-training

Measured pre-training performance on MindSpeed-LLM:

| Model series | Model | Hardware | Cluster size | MFU |
| --- | --- | --- | --- | --- |
| LLAMA2 | LLAMA2-7B | Atlas 900 A2 PODc | 1x8 | 69.0% |
| LLAMA2 | LLAMA2-13B | Atlas 900 A2 PODc | 1x8 | 64.7% |
| LLAMA2 | LLAMA2-70B | Atlas 900 A2 PODc | 4x8 | 44.1% |
| Mixtral | Mixtral-8x7B | Atlas 900 A2 PODc | 8x8 | 31.7% |

Using the dense GPT3-175B model, we ran MFU and scaling-linearity experiments while scaling from 128 NPUs to 7,968 NPUs; the figure below shows the measured data.

The figure reports the MFU at each cluster size together with the overall cluster linearity. The formulas have been published to the community; for reference see the MFU formula and the linearity formula.
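The two metrics can be reproduced with the commonly used formulas sketched below. This is a hedged sketch: the `6 * n_params` FLOPs-per-token approximation for a dense decoder-only transformer (forward plus backward) and the linearity ratio are community conventions, not necessarily the exact formulas MindSpeed-LLM publishes (follow the links above for those), and the function names are ours.

```python
def mfu(tokens_per_sec: float, n_params: float, peak_flops_per_sec: float) -> float:
    """Model FLOPs Utilization: achieved FLOPs/s over hardware peak FLOPs/s.

    Uses the common ~6*N FLOPs-per-token approximation for training a dense
    decoder-only transformer with N parameters (fwd + bwd passes).
    """
    achieved_flops_per_sec = 6.0 * n_params * tokens_per_sec
    return achieved_flops_per_sec / peak_flops_per_sec


def linearity(base_throughput: float, base_devices: int,
              scaled_throughput: float, scaled_devices: int) -> float:
    """Scaling linearity: achieved speedup divided by ideal linear speedup."""
    ideal_throughput = base_throughput * (scaled_devices / base_devices)
    return scaled_throughput / ideal_throughput
```

For example, a cluster doubling from 8 to 16 devices while going from 100 to 190 samples/s has a linearity of 0.95.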

Pre-training schemes

| Scheme | Legacy | Mcore | Released | Contributor |
| --- | --- | --- | --- | --- |
| Sample concatenation | | | | 【Ascend】 |
| Sample pack | | | | |

Acceleration features

| Scenario | Feature | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| SPTD parallelism | Tensor parallelism | | | | 【Ascend】 |
| SPTD parallelism | Pipeline parallelism | | | | |
| SPTD parallelism | Virtual pipeline parallelism | | | | |
| SPTD parallelism | Sequence parallelism | | | | |
| SPTD parallelism | Noop Layers | | | | |
| Long-sequence parallelism | Ascend Ring Attention long-sequence parallelism | | | | |
| Long-sequence parallelism | Ulysses long-sequence parallelism | | | | |
| Long-sequence parallelism | Hybrid long-sequence parallelism | | | | |
| MOE | MOE expert parallelism | | | | |
| MOE | MOE token-permutation communication optimization | | | | |
| Memory optimization | Parameter copy reuse | | | | |
| Memory optimization | Distributed optimizer | | | | |
| Memory optimization | Swap Attention | | | | |
| Memory optimization | Recomputation | | | | |
| Memory optimization | Norm recomputation | | | | |
| Memory optimization | O2 BF16 Optimizer | | | | |
| Fused operators | Flash attention | | | | |
| Fused operators | Flash attention variable length | | | | |
| Fused operators | Fused rmsnorm | | | | |
| Fused operators | Fused swiglu | | | | |
| Fused operators | Fused rotary position embedding | | | | |
| Fused operators | GMM | | | | |
| Fused operators | Matmul Add | | | | |
| Communication optimization | Gradient-reduce communication/computation overlap | | | | |
| Communication optimization | Recompute in advance | | | | |
| Communication optimization | Weight all-gather communication/computation overlap | | | | |
| Communication optimization | MC2 | | | | |
| Communication optimization | CoC | | | | |
| Communication optimization | Ascend Gloo checkpoint-save optimization | | | | |

Distributed fine-tuning

Measured instruction fine-tuning performance on MindSpeed-LLM:

| Model | Hardware | Cluster | Scheme | Sequence | Throughput | MFU |
| --- | --- | --- | --- | --- | --- | --- |
| llama2-7B | Atlas 900 A2 PODc | 1x8 | Full-parameter | dynamic | 15.87 samples/s | - |
| llama2-7B | Atlas 900 A2 PODc | 1x8 | Full-parameter | 16K | 1.14 samples/s | 37.4% |
| llama2-7B | Atlas 900 A2 PODc | 1x8 | Full-parameter | 32K | 0.51 samples/s | 48.4% |
| llama2-13B | Atlas 900 A2 PODc | 1x8 | Full-parameter | dynamic | 50.4 samples/s | - |
| llama2-70B | Atlas 900 A2 PODc | 1x8 | LoRA | dynamic | 15.2 samples/s | - |

Fine-tuning schemes

| Scheme | Mcore | Legacy | LoRA | QLoRA | Released | Contributor |
| --- | --- | --- | --- | --- | --- | --- |
| Single-sample fine-tuning | | | | | | 【Ascend】 |
| Multi-sample pack fine-tuning | | | | | | 【NAIE】 |
| Multi-turn dialogue fine-tuning | | | | | | 【Ascend】 |

Acceleration features

| Scenario | Feature | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| LoRA fine-tuning | CCLoRA | | | | 【Ascend】 |
| LoRA fine-tuning | Fused_MLP | | | | 【Ascend】 |
| QLoRA fine-tuning | CCLoRA | | | | 【NAIE】 |
| QLoRA fine-tuning | Fused_MLP | | | | 【NAIE】 |
| Long-sequence fine-tuning | Long-sequence CP scheme | | | | 【Ascend】 |
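CCLoRA and the other LoRA features above build on the basic LoRA idea: keep the base weight W frozen and learn only a low-rank update B·A scaled by alpha/r. A minimal pure-Python sketch of the forward pass follows; the helper names are ours and real implementations use fused NPU/GPU kernels, so this is illustrative only.

```python
def matvec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(mi * vi for mi, vi in zip(row, v)) for row in m]


def lora_forward(W, A, B, x, alpha=1.0, r=1):
    """y = W x + (alpha / r) * B (A x).

    W: frozen base weight (d_out x d_in)
    A: trainable down-projection (r x d_in)
    B: trainable up-projection (d_out x r), typically zero-initialized
    """
    base = matvec(W, x)                 # frozen path
    low_rank = matvec(B, matvec(A, x))  # trainable low-rank path
    scale = alpha / r
    return [b + scale * l for b, l in zip(base, low_rank)]
```

Because B is usually zero-initialized, the model's output is unchanged at the start of fine-tuning, which is what makes LoRA a safe drop-in adapter.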

Development tool chain

Weight conversion

MindSpeed-LLM supports converting weights among the huggingface, megatron-core, and megatron-legacy formats, and supports merging LoRA weights. For the conversion parameters and usage instructions, see the weight-conversion documentation.

| Source format | Target format | Slicing features | LoRA | Contributor | Released |
| --- | --- | --- | --- | --- | --- |
| huggingface | megatron-core | tp, pp, dpp, vpp, cp, ep, loop layer | | 【Ascend】 | |
| huggingface | megatron-legacy | | | | |
| megatron-core | huggingface | | | | |
| megatron-core | megatron-legacy | tp, pp, dpp, vpp, cp, ep, loop layer | | | |
| megatron-core | megatron-core | | | | |
| megatron-legacy | huggingface | | | | |
| megatron-legacy | megatron-core | tp, pp, dpp, vpp, cp, ep, loop layer | | | |
| megatron-legacy | megatron-legacy | | | | |
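To illustrate what the tp slicing listed above does during conversion: a weight matrix is split into equal shards, one per tensor-parallel rank, and converting back merges the shards in rank order. The helpers below are our own minimal sketch (row-wise split of a 2-D weight held as a list of rows), not the converter's actual API, which also handles pp/vpp/cp/ep layouts.

```python
def tp_split(rows, tp_size):
    """Split a 2-D weight (list of rows) into tp_size equal row shards."""
    assert len(rows) % tp_size == 0, "row count must divide evenly by tp_size"
    shard_len = len(rows) // tp_size
    return [rows[i * shard_len:(i + 1) * shard_len] for i in range(tp_size)]


def tp_merge(shards):
    """Inverse of tp_split: concatenate rank shards back into one weight."""
    return [row for shard in shards for row in shard]
```

The key property a converter must preserve is that merge(split(W)) == W for every sliced tensor.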

Data preprocessing

MindSpeed-LLM supports data preprocessing for pre-training, instruction fine-tuning, RLHF, and other tasks.

| Task | Dataset | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| Pre-training | Pre-training data processing | | | | 【Ascend】 |
| Fine-tuning | Alpaca style | | | | |
| Fine-tuning | ShareGPT style | | | | |
| DPO | Pairwise dataset processing | | | | 【NAIE】 |
| SimPO | | | | | |
| ORM | | | | | |
| PRM | PRM dataset processing | | | | 【Ascend】 |
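For reference, the Alpaca and ShareGPT styles named in the table follow these community-standard record shapes. The field contents here are made-up examples, and the exact fields MindSpeed-LLM expects should be checked against its data-preprocessing documentation.

```python
# Alpaca style: one instruction/input/output triple per record.
alpaca_record = {
    "instruction": "Translate the following text to French.",
    "input": "Hello, world.",
    "output": "Bonjour, le monde.",
}

# ShareGPT style: a multi-turn conversation, alternating human/gpt turns,
# which suits the multi-turn dialogue fine-tuning scheme listed earlier.
sharegpt_record = {
    "conversations": [
        {"from": "human", "value": "What is tensor parallelism?"},
        {"from": "gpt", "value": "Splitting a layer's weights across devices."},
    ],
}
```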

Online inference

| Feature | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- |
| Streaming inference | | | | 【NAIE】 |
| Chat dialogue | | | | 【NAIE】 |
| YaRN context extension | | | | 【Ascend】 |
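"Streaming inference" above means tokens are returned to the caller as they are generated, rather than after the full reply completes. A toy sketch of the control flow follows; `step_fn`, our illustrative stand-in for a real model forward pass, maps the token sequence so far to the next token id.

```python
def stream_generate(prompt_tokens, step_fn, max_new_tokens=8, eos=-1):
    """Yield newly generated token ids one at a time (a Python generator).

    prompt_tokens: initial token ids
    step_fn:       callback returning the next token id given all tokens so far
    eos:           sentinel token id that stops generation early
    """
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        next_token = step_fn(tokens)
        if next_token == eos:
            break
        tokens.append(next_token)
        yield next_token  # caller sees each token immediately
```

A chat front end would consume this generator and render each token as it arrives, which is why streaming noticeably lowers perceived latency.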

Open-source benchmark evaluation

| Feature | Dataset | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| Open-source benchmark evaluation | MMLU | | | | 【NAIE】 |
| Open-source benchmark evaluation | CEval | | | | 【NAIE】 |
| Open-source benchmark evaluation | BoolQ | | | | 【NAIE】 |
| Open-source benchmark evaluation | BBH | | | | 【NAIE】 |
| Open-source benchmark evaluation | AGIEval | | | | 【NAIE】 |
| Open-source benchmark evaluation | HumanEval | | | | 【NAIE】 |

Performance profiling

| Scenario | Feature | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| Performance profiling | Collect profiling data on Ascend chips | | | | 【Ascend】 |

High availability

| Scenario | Feature | Mcore | Legacy | Released | Contributor |
| --- | --- | --- | --- | --- | --- |
| High availability | Enable deterministic computation on Ascend chips | | | | 【Ascend】 |

Version maintenance policy

MindSpeed-LLM versions go through the following five maintenance phases:

| Phase | Duration | Description |
| --- | --- | --- |
| Planning | 1-3 months | Plan features |
| Development | 3 months | Develop features |
| Maintenance | 6-12 months | Merge all resolved issues and publish releases. The maintenance policy differs by MindSpeed-LLM version: regular releases are maintained for 6 months, long-term-support releases for 12 months |
| Unmaintained | 0-3 months | Merge all resolved issues; no dedicated maintainers and no releases |
| End of life (EOL) | N/A | The branch no longer accepts any changes |

Maintenance status of released MindSpeed-LLM versions:

| MindSpeed-LLM version | Tag | Maintenance policy | Current status | Release date | Next status | EOL date |
| --- | --- | --- | --- | --- | --- | --- |
| 2.0.0 | \ | Regular release | Maintained | 2025/3/30 | Unmaintained from 2025/09/30 (planned) | |
| 1.0.0 | v1.0.0 | Regular release | Maintained | 2024/12/30 | Unmaintained from 2025/06/30 (planned) | |
| 1.0.RC3 | v1.0.RC3.0 | Regular release | EOL | 2024/09/30 | End of life | 2025/3/30 |
| 1.0.RC2 | v1.0.RC2.0 | Regular release | EOL | 2024/06/30 | End of life | 2024/12/30 |
| 1.0.RC1 | v1.0.RC1.0 | Regular release | EOL | 2024/03/30 | End of life | 2024/9/30 |
| bk_origin_23 | \ | Demo | EOL | 2023 | End of life | 2024/6/30 |

Acknowledgements

MindSpeed-LLM is jointly contributed by the following Huawei departments and Ascend ecosystem partners:

Huawei:

  • Computing Product Line: Ascend
  • Public Development Department: NAIE
  • Global Technical Service Department: GTS
  • Huawei Cloud: Cloud

Ecosystem partners:

  • China Mobile Cloud (移动云): Dayun Zhenze intelligent-computing platform (大云震泽智算平台)

Thank you for every PR from the community; contributions to MindSpeed-LLM are welcome.

Security statement

MindSpeed-LLM security statement

Disclaimer

To MindSpeed-LLM users

  1. The models provided by MindSpeed-LLM are for non-commercial use only.
  2. For each model, the MindSpeed-LLM platform only suggests, for your information, datasets that may be used for training; Huawei does not provide any datasets. If you train on these datasets, please take particular care to comply with each dataset's license, and Huawei assumes no liability for infringement disputes arising from your use of these datasets.
  3. If you find any problem while using MindSpeed-LLM models (including but not limited to functional or compliance issues), please file an issue on Gitee; we will review and resolve it promptly.

To dataset owners

If you do not want your dataset to be mentioned in connection with the models in MindSpeed-LLM, or you want the description of your dataset in MindSpeed-LLM updated, please file an issue on Gitee; we will remove or update the description of your dataset as requested. Thank you for your understanding of and contribution to MindSpeed-LLM.

License statement

For the models provided by Ascend MindSpeed-LLM: if a License exists in the model directory, that License governs. If no License exists in the model directory, the model is licensed under the Apache 2.0 license, whose text is available in the Ascend MindSpeed-LLM root directory.

The following applies to all files unless otherwise noted: # Copyright (c) 2024, Huawei Technologies Co., Ltd # Copyright (c) 2022, NVIDIA CORPORATION. All rights reserved. # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions # are met: # * Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright # notice, this list of conditions and the following disclaimer in the # documentation and/or other materials provided with the distribution. # * Neither the name of NVIDIA CORPORATION nor the names of its # contributors may be used to endorse or promote products derived # from this software without specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY # EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR # PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR # CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, # EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, # PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR # PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY # OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -- This repository also contains code from Hugging Face Inc., Google Research, Facebook (from their Fairseq and Dino projects), Microsoft(from their Swin-Transformer project)and Philip Popien. Files from these organizations have notices at the top of each file. Below are licenses used in those files, as indicated. 
------------- LICENSE FOR Facebook, huggingface and Google Research code -------------- Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. 
For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the 
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. 
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. 
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. ------------- LICENSE FOR Facebook Fairseq code -------------- MIT License Copyright (c) Facebook, Inc. and its affiliates. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
