---
name: Bug & Question & Get Help
about: Describe the problem you encountered.
title: "[GET HELP] "
labels: question
assignees: ''
---

### 1. Checklist

- [ ] I have removed sensitive information from configuration files and logs.
- [ ] I have checked the [FAQ](https://docs.llmvtuber.com/docs/faq/) and [existing issues](https://github.com/Open-LLM-VTuber/Open-LLM-VTuber/issues).
- [ ] I am using the latest version of the project.

---

### 2. Environment Details

- How did you install Open-LLM-VTuber?
  - [ ] git clone
  - [ ] release zip
  - [ ] exe (Windows)
  - [ ] dmg (macOS)
- Are you running the backend and frontend on the same device?
- If you are using a GPU, please provide its model and driver version:
- Browser (if applicable):

---

### 3. Describe the bug

What exactly is happening? What did you expect to see? How can the problem be reproduced?

---

### 4. Screenshots / Logs (if relevant)

- Backend log
- Frontend settings (General)
- Frontend console log (F12)
- If using Ollama, the output of `ollama ps`

---

### 5. Configuration

> Please provide the relevant configuration files, with sensitive information such as API keys removed.

- `conf.yaml`
- `model_dict.json`, `.model3.json`