lumin

@lumin17

lumin has not added a bio yet.

lumin17@huawei.com
Watched repositories (22)

    omniai/omni-npu

    A vLLM out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    Last updated: 6 hours ago

    omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 1 day ago

    lumin/omni_infer (forked from omniai/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 8 days ago

    夏逸斐/omni-npu (forked from omniai/omni-npu)

    A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    Last updated: 8 days ago

    lishicheng_0/omni_infer (forked from omniai/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 1 month ago

    lumin/lumin_test

    This repository is just for testing.

    Last updated: 3 months ago

    lumin/omni_infer_pangu72b_RL

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 4 months ago

    lumin/omni_infer_jinhuan (forked from zhengjinhuan/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 4 months ago

    lumin/omni_infer_hebin (forked from 何斌/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 5 months ago

    lumin/omni_infer_lishicheng (forked from lishicheng_0/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 5 months ago
