[AI] Ollama (OpenWebUI)
Ollama is an open-source tool for downloading LLM models and running them locally. A separate application called OpenWebUI connects to Ollama and provides a web UI for the user.
Details
Application: Ollama and OpenWebUI
Type: Linux Direct Install
Ollama Website: https://ollama.com/
OpenWebUI Website: https://docs.openwebui.com/
Demo Dashboard: https://ai.valleycommunity.co.za
Usage: Website or Install the Progressive Web App
Self Hosting Installation
1. Log into your Linux machine using ssh.
2. Install Ollama using the installation script
curl -fsSL https://ollama.com/install.sh | sh
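Once the script finishes, the ollama CLI should be on the PATH. A quick sanity check and first model download might look like this (the model name is an example; any model from the Ollama library works):

```shell
# Confirm the CLI installed correctly
ollama --version

# Download a model (llama3.1 is an example; browse https://ollama.com/library for others)
ollama pull llama3.1

# Run a one-off prompt against the downloaded model
ollama run llama3.1 "Say hello in one sentence."
```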
3. Allow External Hosts
Add the following line to /etc/systemd/system/ollama.service under the [Service] section:
Environment="OLLAMA_HOST=0.0.0.0"
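Editing the unit file directly works, but the file can be overwritten on upgrade. An equivalent approach (a sketch, same OLLAMA_HOST value) uses a systemd drop-in override, which survives reinstalls:

```shell
# Open an override file for the ollama unit; systemd reloads it automatically on save
sudo systemctl edit ollama
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
```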
4. Restart Ollama
The install script starts the service, so reload systemd and restart it for the new environment variable to take effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama
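Before moving on, it is worth confirming Ollama is up and reachable. Ollama listens on port 11434 by default, and its root endpoint replies with a plain status string:

```shell
# Should print "Ollama is running"
curl http://localhost:11434

# With OLLAMA_HOST=0.0.0.0 set, it should also answer on the server's LAN address
# (192.168.1.10 is a placeholder — substitute your server's IP)
curl http://192.168.1.10:11434
```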
5. Install OpenWebUI
Replace https://example.com with the address of your Ollama server (the Ollama API listens on port 11434 by default):
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
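If the container started correctly, it publishes OpenWebUI on port 3000 of the host. A quick check:

```shell
# Confirm the container is up
docker ps --filter name=open-webui

# Tail the logs if the UI does not load or cannot reach Ollama
docker logs -f open-webui
```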
6. Add OpenWebUI to Reverse Proxy
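The exact reverse proxy setup depends on what the server already runs. As one sketch, an nginx server block could look like the following (the domain and certificate paths are assumptions; OpenWebUI was published on host port 3000 above, and it needs WebSocket upgrade headers for streamed chat output):

```nginx
server {
    listen 443 ssl;
    server_name ai.example.com;  # placeholder domain

    ssl_certificate     /etc/ssl/certs/example.crt;  # placeholder paths
    ssl_certificate_key /etc/ssl/private/example.key;

    location / {
        proxy_pass http://127.0.0.1:3000;
        # WebSocket upgrade headers, required for streamed responses
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```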
This instance is hosted locally on the Valley Community Server, and registration is open to the public.
💻Server: Anton
🔑Access: Approval takes a few hours
🤖Models:
- Llama2
- Phi3
- Llama3.1