钟哲涛

@zhong-zhetao

钟哲涛 — no bio yet

zhongzhetao@huawei.com

    钟哲涛/omni-npu forked from omniai/omni-npu

    A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    钟哲涛/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_2 forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_refactor forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_LLT forked from zhengjinhuan/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_3 forked from chengda_wu/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_longcat forked from pengbeicheng/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    He Jian/omni_infer forked from omniai/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_hj forked from He Jian/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_1 forked from He Jian/omni_infer

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_lm forked from lumin/omni_infer_hejian

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/omni_infer_hejian forked from lumin/omni_infer_hejian

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    钟哲涛/Zzt_repo

    Reinforcement-learning control of a VRX unmanned surface vessel.
