lumin

@lumin17


lumin17@huawei.com
Forks

    夏逸斐/omni-npu forked from omniai/omni-npu

    A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    lishicheng_0/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    He Jian/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Lin Weizhe/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    bigpumpkin/communication_softbus_lite forked from OpenHarmony/communication_softbus_lite (Closed)

    Implementation code for soft bus discovery, networking, and transmission
