This gist shows how to get started with Open WebUI and Ollama using Docker Compose, and how to interact with the models hosted by Ollama through both the web UI and the API.
Everything here runs on CPU only; no GPU is required.
We define two services in our compose definition:
- Ollama: runs Ollama inside a container and persists the downloaded models to a volume on disk.
- Open WebUI: points to the Ollama service and disables CORS so the UI can reach the API.
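The two services above can be sketched in a `docker-compose.yaml` like the one below. This is a minimal sketch, not the gist's exact file: the image tags, port mappings, and volume name are assumptions, and `CORS_ALLOW_ORIGIN='*'` is one way Open WebUI exposes CORS configuration; check the Open WebUI docs for the variables your version supports.

```yaml
services:
  ollama:
    image: ollama/ollama           # official Ollama image (assumed tag: latest)
    volumes:
      - ollama:/root/.ollama       # persist pulled models across restarts
    ports:
      - "11434:11434"              # expose the Ollama API on the host

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    depends_on:
      - ollama
    environment:
      # point the UI at the ollama service by its compose DNS name
      - OLLAMA_BASE_URL=http://ollama:11434
      # allow requests from any origin (assumed variable; effectively disables CORS)
      - CORS_ALLOW_ORIGIN=*
    ports:
      - "3000:8080"                # web UI available at http://localhost:3000

volumes:
  ollama:                          # named volume backing the model store
```

Bring the stack up with `docker compose up -d`; the named `ollama` volume is what keeps models on disk when the containers are recreated.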