From e8faddd82b287e3117708aab32d8dca20e2d006b Mon Sep 17 00:00:00 2001
From: zhanghanLeo
Date: Mon, 17 Nov 2025 16:33:53 +0800
Subject: [PATCH] update moe_token_unpermute docs.

---
 docs/map_from_buildin_to_custom.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/map_from_buildin_to_custom.md b/docs/map_from_buildin_to_custom.md
index e89d717..1f47895 100644
--- a/docs/map_from_buildin_to_custom.md
+++ b/docs/map_from_buildin_to_custom.md
@@ -10,4 +10,5 @@
 | ops.auto_generate.fused_add_topk_div | [ms_custom_ops.fused_add_topk_div](../ops/c_api/fused_add_topk_div/fused_add_topk_div_doc.md) | Interface is identical |
 | ops.auto_generate.paged_cache_load | [ms_custom_ops.paged_cache_load](../ops/c_api/paged_cache_load/paged_cache_load_doc.md) | Added support for key and value having different dtypes; removed the in-place update of the key and value outputs, which are now returned directly as outputs |
 | ops.auto_generate.quant_batch_matmul | [ms_custom_ops.quant_batch_matmul](../ops/c_api/quant_batch_matmul/quant_batch_matmul.md) | Added the x2_format parameter for specifying the format of x2; renamed the parameter `pertokenScaleOptional` to `pertoken_scale`; renamed the parameter `dtype` to `output_dtype` |
-| ops.auto_generate.apply_rotary_pos_emb | [ms_custom_ops.apply_rotary_pos_emb_atb](../ops/c_api/apply_rotary_pos_emb_atb/apply_rotary_pos_emb_atb.md) | Added the ATB operator apply_rotary_pos_emb_atb as a replacement for ops.auto_generate.apply_rotary_pos_emb; note that rotary_coeff and cos_format have changed, see the [API](https://gitee.com/mindspore/ms_custom_ops/blob/master/ops/c_api/apply_rotary_pos_emb_atb/apply_rotary_pos_emb_atb.md) |
\ No newline at end of file
+| ops.auto_generate.apply_rotary_pos_emb | [ms_custom_ops.apply_rotary_pos_emb_atb](../ops/c_api/apply_rotary_pos_emb_atb/apply_rotary_pos_emb_atb.md) | Added the ATB operator apply_rotary_pos_emb_atb as a replacement for ops.auto_generate.apply_rotary_pos_emb; note that rotary_coeff and cos_format have changed, see the [API](https://gitee.com/mindspore/ms_custom_ops/blob/master/ops/c_api/apply_rotary_pos_emb_atb/apply_rotary_pos_emb_atb.md) |
+| ops.moe_token_unpermute | [ms_custom_ops.moe_token_unpermute](../ops/c_api/moe_token_unpermute/moe_token_unpermute.md) | Interface parameters are identical, but note: the ops interface supports only A2 training chips, whereas ms_custom_ops supports only Atlas inference products; in the ms_custom_ops case, currently only `padded_mode = false, restore_shape = None` is supported, topK supports 1, 2, 4, and 8, and hidden_size supports 2048, 5120, and 7168. |
--
Gitee
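For reviewers unfamiliar with the operator, the following NumPy sketch illustrates the commonly documented unpermute semantics (scatter permuted rows back to token order, then take a probability-weighted sum over the topK expert copies). This is an illustrative assumption about the math, not the ms_custom_ops kernel; the function name `moe_token_unpermute_ref` and the exact argument layout are hypothetical, and the authoritative signature is in the linked API doc.

```python
import numpy as np

def moe_token_unpermute_ref(permuted_tokens, sorted_indices, probs=None):
    """Reference sketch of MoE token unpermute (assumed semantics).

    permuted_tokens: [num_tokens * topk, hidden] rows grouped by expert
    sorted_indices:  [num_tokens * topk] destination row (token order) of
                     each permuted row
    probs:           [num_tokens, topk] combine weights, or None (topk == 1)
    """
    total, hidden = permuted_tokens.shape
    # Scatter each permuted row back to its original token-order position.
    unpermuted = np.empty_like(permuted_tokens)
    unpermuted[sorted_indices] = permuted_tokens
    if probs is None:
        topk = 1
        weights = np.ones((total, 1), dtype=permuted_tokens.dtype)
    else:
        topk = probs.shape[1]
        weights = probs
    # Weighted sum over the topk expert copies of each token.
    unpermuted = unpermuted.reshape(-1, topk, hidden)
    return (unpermuted * weights[:, :, None]).sum(axis=1)

# Toy example: 2 tokens, topk=2, hidden=2. Each token was duplicated once
# per selected expert and the copies were regrouped by expert.
permuted = np.array([[1., 1.], [2., 2.], [1., 1.], [2., 2.]])
sorted_indices = np.array([0, 2, 1, 3])  # where each permuted row belongs
probs = np.array([[0.25, 0.75], [0.5, 0.5]])
out = moe_token_unpermute_ref(permuted, sorted_indices, probs)
```

With the weights of each token summing to 1 and both expert copies equal, the output reduces to the original tokens, which makes the round trip easy to check in a unit test.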