# Open WebUI + Ollama - LLM Interface

A web interface for running and interacting with local LLMs.
## Quick Facts
- URL: https://webui.collabrains.eu
- Backend: Ollama (llama2, mistral, etc.)
- Status: ✅ Running
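Ollama exposes a simple REST API alongside the web interface. As a minimal sketch, assuming the backend is reachable on Ollama's default port `11434` and a model such as `mistral` has already been pulled, a prompt can be sent like this (the endpoint and payload shape follow Ollama's `/api/generate` API; the URL and model name here are assumptions, not taken from this deployment):

```python
import json
import urllib.request

# Assumption: Ollama's REST API listens on localhost:11434 (its default).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama backend and return the response text."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Uncomment with a running Ollama instance and a pulled model:
    # print(generate("mistral", "Say hello in one sentence."))
    print(build_generate_request("mistral", "Say hello in one sentence."))
```

The same request works for any model listed by `ollama list`; set `"stream": True` to receive the response incrementally.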
## Features
- 🤖 Chat interface
- 📝 Prompt management
- 🔧 Model selection
- 💾 Conversation history
## Note
The container is configured without GPU access, so inference is CPU-only.
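If GPU inference were desired later, the container config could be extended along these lines. This is a sketch, not the deployment's actual config: it assumes Docker Compose with the NVIDIA Container Toolkit installed on the host, and the service name `ollama` is a placeholder.

```yaml
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            # Expose all host NVIDIA GPUs to the container.
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

Without this stanza (as in the current setup), Ollama falls back to CPU-based inference.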