| name | about | labels |
| --- | --- | --- |
| Bug Report | Use this template for reporting a bug | kind/bug |
mixtral network: single-card simulated compilation fails on 910B
Hardware Environment (Ascend/GPU/CPU) / 硬件环境 (Mandatory / 必填): Please delete the backend not involved:
/device ascend
Software Environment / 软件环境 (Mandatory / 必填):
- MindSpore version (e.g., 1.7.0.Bxxx): master_20240510061514_c6a1400a90
- Python version (e.g., Python 3.7.5):
- OS platform and distribution (e.g., Linux Ubuntu 16.04):
- GCC/Compiler version (if compiled from source):
Execute Mode / 执行模式 (Mandatory / 必填) (PyNative/Graph): Please delete the mode not involved:
/mode pynative
/mode graph
Testcase: mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_000
Expected behavior: the network's simulated compilation succeeds.
```
Traceback (most recent call last):
  File "run_mindformer.py", line 263, in <module>
    main(config_)
  File "/home/jenkins/bmz/Solution_Special_Test/cases/01special_test/mixtral/test_ms_mixtral_8x7b_910b_0001/mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_001/mixtral_ascend910b_mixtral_8x7b_4096_64_0013_000/mindformers/tools/cloud_adapter/cloud_monitor.py", line 44, in wrapper
    raise exc
  File "/home/jenkins/bmz/Solution_Special_Test/cases/01special_test/mixtral/test_ms_mixtral_8x7b_910b_0001/mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_001/mixtral_ascend910b_mixtral_8x7b_4096_64_0013_000/mindformers/tools/cloud_adapter/cloud_monitor.py", line 34, in wrapper
    result = run_func(*args, **kwargs)
  File "run_mindformer.py", line 39, in main
    trainer.train()
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/_checkparam.py", line 1372, in wrapper
    return func(*args, **kwargs)
  File "/home/jenkins/bmz/Solution_Special_Test/cases/01special_test/mixtral/test_ms_mixtral_8x7b_910b_0001/mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_001/mixtral_ascend910b_mixtral_8x7b_4096_64_0013_000/mindformers/trainer/trainer.py", line 424, in train
    is_full_config=True)
  File "/home/jenkins/bmz/Solution_Special_Test/cases/01special_test/mixtral/test_ms_mixtral_8x7b_910b_0001/mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_001/mixtral_ascend910b_mixtral_8x7b_4096_64_0013_000/mindformers/trainer/causal_language_modeling/causal_language_modeling.py", line 120, in train
    **kwargs)
  File "/home/jenkins/bmz/Solution_Special_Test/cases/01special_test/mixtral/test_ms_mixtral_8x7b_910b_0001/mixtral_ascend910b_mixtral_8x7b_4096_64_False_8_001/mixtral_ascend910b_mixtral_8x7b_4096_64_0013_000/mindformers/trainer/base_trainer.py", line 778, in training_process
    initial_epoch=config.runner_config.initial_epoch)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/train/model.py", line 1087, in train
    initial_epoch=initial_epoch)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/train/model.py", line 115, in wrapper
    func(self, *args, **kwargs)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/train/model.py", line 637, in _train
    cb_params, sink_size, initial_epoch, valid_infos)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/train/model.py", line 721, in _train_dataset_sink_process
    outputs = train_network(*inputs)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/nn/cell.py", line 695, in __call__
    out = self.compile_and_run(*args, **kwargs)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/nn/cell.py", line 1013, in compile_and_run
    self.compile(*args, **kwargs)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/nn/cell.py", line 997, in compile
    jit_config_dict=self._jit_config_dict, **kwargs)
  File "/home/miniconda3/envs/ci/lib/python3.7/site-packages/mindspore/common/api.py", line 1642, in compile
    result = self._graph_executor.compile(obj, args, kwargs, phase, self._use_vm_mode())
RuntimeError: Compile graph kernel_graph1 failed.

----------------------------------------------------
- Ascend Error Message:
----------------------------------------------------
E19999: Inner Error!
E19999: 2024-05-10-19:14:30.521.527 Call OptimizeGraphPrepare failed, ret:1343225860, engine_name:hccl_graph_optimizer, graph_name:kernel_graph1[FUNC:OptimizeOriginalGraphForQuantize][FILE:graph_optimize.cc][LINE:269]
TraceBack (most recent call last):
[Call][PreRun] Failed, graph_id:2, session_id:0.[FUNC:CompileGraph][FILE:graph_manager.cc][LINE:4409]
[Compile][Graph]Compile graph failed, error code:1343225857, session_id:0, graph_id:2.[FUNC:CompileGraph][FILE:ge_api.cc][LINE:1165]
(Please search "CANN Common Error Analysis" at https://www.mindspore.cn for error code description)

----------------------------------------------------
- C++ Call Stack: (For framework developers)
----------------------------------------------------
mindspore/ccsrc/plugin/device/ascend/hal/hardware/ge_graph_executor.cc:948 CompileGraph
```
Routing this to 周亚强 (@zhouyaqiang0).
After lowering max_device_memory from 59GB to 57GB, the test case passes. The error was caused by MindSpore reserving too much device memory, leaving insufficient memory for TDT and HCCL. Please retest on a single card in a real environment that can run the case and check whether the problem persists.
@zhouyaqiang0 Is there any rule of thumb for setting max_device_memory? From the error message alone, it is not apparent that the failure was caused by an unreasonable device memory allocation.
There are minddata-related errors slightly earlier in the log, and enabling plog reveals additional memory-related errors.
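As a rough sketch of the sizing logic (not an official rule): whatever MindSpore does not reserve is left for the CANN runtime, TDT (the data channel) and HCCL (collective communication). Assuming a 910B card with 64 GB of HBM, an assumption not stated in this issue, the headroom is simply total HBM minus max_device_memory:

```python
# Hedged sketch of the max_device_memory trade-off.
# TOTAL_HBM_GB is an assumption (a 910B card with 64 GB HBM);
# check the actual hardware before relying on these numbers.
TOTAL_HBM_GB = 64

def runtime_headroom_gb(max_device_memory_gb: int) -> int:
    """GB left for TDT/HCCL/CANN after MindSpore's reservation."""
    return TOTAL_HBM_GB - max_device_memory_gb

print(runtime_headroom_gb(59))  # -> 5, the failing setting in this issue
print(runtime_headroom_gb(57))  # -> 7, the setting that made the case pass
```

Under these assumptions, dropping the reservation by 2GB doubles the headroom available to TDT and HCCL, which matches the observed fix.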
Appearance & Root Cause:
MindSpore reserved too much device memory, leaving insufficient memory for TDT and HCCL.
Fix Solution:
Lower max_device_memory from 59GB to 57GB, and verify on a single card in a real environment that can run the case.
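For reference, a minimal sketch of where this setting might live in a mindformers training YAML; the key names below are assumed from common mindformers configs, so adjust them to the actual file used by this test case:

```yaml
# Hypothetical excerpt of the training config; only the context block is shown.
context:
  mode: 0                     # 0 = GRAPH_MODE
  device_target: "Ascend"
  max_device_memory: "57GB"   # lowered from "59GB" to leave room for TDT/HCCL
```

Outside of mindformers, the equivalent knob in plain MindSpore scripts should be `mindspore.set_context(max_device_memory="57GB")`.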
Relation PR:
Not applicable
Selftest Result:
Not applicable
Self-test Report & DT Review:
Not applicable