Open WebUI: Best Frontend for Local LLMs on Your Homelab
Why Open WebUI Beats the Alternatives for Local LLM Hosting

You've got Ollama running on your homelab (solid choice), but you're stuck using CLI commands to chat with your models. Open WebUI changes that. It's a self-hosted web interface built specifically for local LLMs,