Ollama Proxmox LXC script

Ollama is a tool for running large language models locally on your own computer. This lets you experiment with and use AI models without an internet connection or reliance on cloud-based services. It simplifies downloading, managing, and running models, keeping your data private and potentially reducing latency. You can use Ollama to build local chatbots, conduct AI research, develop privacy-focused AI applications, and integrate AI into existing systems.

To create a new Proxmox VE Ollama LXC, run the command below in the Proxmox VE Shell.
To update Ollama, run the command below (or type update) in the LXC console.

bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVE/main/ct/ollama.sh)"

Config file location: /usr/local/lib/ollama


Default settings

CPU: 4 vCPU
RAM: 4 GB
HDD: 35 GB
Default Interface: IP:11434
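Once the container is running, the Ollama HTTP API listens on port 11434. As a minimal sketch (the host address below is a placeholder for your LXC's IP, and llama3 is an example model name that must already be pulled inside the container with ollama pull), a non-streaming completion request against the /api/generate endpoint looks like this:

```python
import json
from urllib import request, error

# Placeholder: replace with your LXC's IP address.
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

# Non-streaming request body; "llama3" is an example model name.
payload = json.dumps({
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,
}).encode("utf-8")

req = request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

try:
    with request.urlopen(req, timeout=30) as resp:
        # The non-streaming reply is a single JSON object whose
        # "response" field holds the generated text.
        print(json.loads(resp.read())["response"])
except error.URLError as exc:
    # Server not reachable: wrong IP, container stopped, etc.
    print(f"Could not reach Ollama: {exc}")
```

The same port also serves endpoints such as /api/tags (list pulled models), so a quick reachability check from any machine on the network is: curl http://IP:11434/api/tags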