@lvfangbo
吕方博 (no bio yet)
A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).
omniinfer: storage for test-case configuration files.
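As context for the plugin repo above: vLLM discovers out-of-tree platform plugins through the `vllm.platform_plugins` setuptools entry-point group, where each plugin exposes a registration function. The sketch below shows the general shape of such a hook under that documented mechanism; the module and class names (`my_npu_plugin`, `NPUPlatform`) are hypothetical placeholders, not the actual plugin's code.

```python
# Minimal sketch of a vLLM out-of-tree platform-plugin hook, declared in
# pyproject.toml under the "vllm.platform_plugins" entry-point group.
# All module/class names here are hypothetical placeholders.

def register() -> "str | None":
    """Entry-point function vLLM calls at startup.

    Returns the fully qualified name of a Platform subclass when the
    target hardware (here, an Ascend NPU via torch_npu) is available,
    or None so vLLM falls back to its built-in platforms.
    """
    try:
        import torch_npu  # noqa: F401  # importable only on Ascend machines
        return "my_npu_plugin.platform.NPUPlatform"  # hypothetical class path
    except ImportError:
        return None
```

The corresponding entry point would be declared in the plugin's `pyproject.toml` under `[project.entry-points."vllm.platform_plugins"]`, pointing at this `register` function.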