[AI] Ollama (OpenWebUI)

(@andre)
Posts: 56
Member Admin
Topic starter
 

 

Ollama is an open-source LLM manager: it lets you download large language models (such as Meta's Llama family) and run them locally. A separate application, OpenWebUI, connects to Ollama and provides a web UI for the user.

Details

Application: Ollama and OpenWebUI

Type: Linux Direct Install

Ollama Website: https://ollama.com/

OpenWebUI Website: https://docs.openwebui.com/

Demo Dashboard: https://ai.valleycommunity.co.za

Usage: Website, or install the Progressive Web App

Self Hosting Installation

1. Log into your Linux machine using SSH.

2. Install Ollama using the installation script

curl -fsSL https://ollama.com/install.sh | sh
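Once the script finishes, Ollama can be exercised from the CLI. A minimal sketch, wrapped in a helper function so it can be called once the service is up (llama3.1 is just an example model name from the Ollama library; any listed model works):

```shell
# Pull and test-run a model to confirm the install works end to end.
bootstrap_model() {
    model="${1:-llama3.1}"                           # example model name
    ollama --version                                 # confirm the binary is on PATH
    ollama pull "$model"                             # download the model weights
    ollama run "$model" "Say hello in one sentence"  # one-shot prompt to test inference
}
```

Call it as `bootstrap_model` (or e.g. `bootstrap_model phi3`) after step 4.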

3. Allow External Hosts

Add the following line to /etc/systemd/system/ollama.service under the [Service] section, so that Ollama listens on all network interfaces rather than only on localhost:

Environment="OLLAMA_HOST=0.0.0.0"
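For context, after the edit the [Service] section looks something like this (the ExecStart line is illustrative; keep whatever your installation already has):

```ini
[Service]
ExecStart=/usr/local/bin/ollama serve
Environment="OLLAMA_HOST=0.0.0.0"
```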

4. Reload systemd and Start Ollama

Since the unit file was edited, reload systemd before (re)starting the service:

sudo systemctl daemon-reload
sudo systemctl restart ollama
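A quick way to confirm the service is up, sketched as a helper (11434 is Ollama's default port; running it with your server's address from another machine also verifies the OLLAMA_HOST change took effect):

```shell
# Query Ollama's HTTP API; prints the installed models as JSON when the service is up.
check_ollama() {
    host="${1:-localhost}"
    curl -fsS "http://${host}:11434/api/tags"
}
```

Use `check_ollama` locally, or `check_ollama <server-ip>` from another machine.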

5. Install OpenWebUI

Run OpenWebUI as a Docker container, pointing OLLAMA_BASE_URL at your Ollama server (https://example.com is a placeholder; for the install above this is typically http://<server-ip>:11434):

docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
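For reference, the same container can be expressed as a Docker Compose file — a sketch using the same placeholder URL; adjust OLLAMA_BASE_URL to point at your Ollama server:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                            # host port 3000 -> container port 8080
    environment:
      - OLLAMA_BASE_URL=https://example.com    # placeholder; point at your Ollama server
    volumes:
      - open-webui:/app/backend/data           # persist users, chats, settings
    restart: always

volumes:
  open-webui:
```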

6. Add OpenWebUI to Your Reverse Proxy

Expose the container's host port (3000) through your reverse proxy so OpenWebUI is reachable over HTTPS.
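As an example, a minimal nginx server block — the hostname and certificate paths are placeholders, and the Upgrade/Connection headers are needed because OpenWebUI streams chat responses over WebSockets:

```nginx
server {
    listen 443 ssl;
    server_name ai.example.com;                               # placeholder hostname

    ssl_certificate     /etc/ssl/certs/ai.example.com.pem;    # placeholder cert paths
    ssl_certificate_key /etc/ssl/private/ai.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:3000;                     # OpenWebUI from step 5
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;               # WebSocket support
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```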

This topic was modified 5 months ago 2 times by Andre Tiltman
 
Posted : 15/07/2024 9:19 am
Valley Community Public AI

 

This public instance is hosted locally on the Valley Community Server, and registration is open to the public.

💻Server: Anton

🔑Access: Approval takes a few hours

🤖Models: 

  • Llama2
  • Phi3
  • Llama3.1

 

This post was modified 4 months ago by Andre Tiltman
 
Posted : 22/08/2024 10:24 am