[Local LLM] LiteChat: a lightweight local LLM chat WebUI with vLLM support
May 5, 2026 · user
https://github.com/zsj1029/LiteChat
Aimed at internal enterprise deployments. The frontend was lifted from llama-cpp's webui and modified locally to add vLLM support. The whole rework was done with Qwen3.6 27B (served via vLLM) and Claude in VS Code.
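Since vLLM exposes an OpenAI-compatible HTTP API, a WebUI originally built for llama-cpp mainly needs to send its chat requests to a different endpoint in the same request shape. A minimal sketch of building such a request (the base URL and model name are placeholders, not taken from the LiteChat code):

```python
import json

# Hypothetical local vLLM endpoint; vLLM's server speaks the OpenAI API.
VLLM_BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, user_message: str, stream: bool = True) -> dict:
    """Build an OpenAI-compatible /chat/completions payload for vLLM."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,  # streaming keeps the chat UI responsive
    }

# Example payload the WebUI backend would POST to f"{VLLM_BASE_URL}/chat/completions"
payload = build_chat_request("placeholder-model", "Hello")
print(json.dumps(payload))
```

Because the wire format is shared with llama-cpp's own OpenAI-compatible server, swapping backends can be as small a change as pointing the frontend at the vLLM base URL.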