Ollama

Overview

Ollama provides a local runtime for large language models. To ensure compatibility with HostedAI, you must install a specific Ollama version validated for our stack. This allows you to run models like gpt-oss, Gemma 3, DeepSeek-R1, and Qwen3 within your HostedAI environment.

Ollama Installation

Install Ollama Version 0.12.3 for HostedAI Compatibility

Install Ollama version 0.12.3 to ensure optimal performance and stability with HostedAI. Execute the following command to install this validated version:

sudo curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.12.3 sh

This action configures your environment to run large language models successfully.
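After installation, it is worth confirming that the pinned version is actually on your PATH. The sketch below is a minimal check; the parsed line is stubbed with the typical `ollama --version` output format, which is an assumption for illustration — in practice, replace the stub with the real command as shown in the comment:

```shell
#!/bin/sh
REQUIRED="0.12.3"

# Stubbed for illustration; in practice use: VERSION_LINE="$(ollama --version)"
VERSION_LINE="ollama version is 0.12.3"

# Take the last whitespace-separated field as the version number
INSTALLED="${VERSION_LINE##* }"

if [ "$INSTALLED" = "$REQUIRED" ]; then
  echo "OK: validated version $INSTALLED is installed"
else
  echo "WARNING: found $INSTALLED, expected $REQUIRED"
fi
```

Running this after a correct install should report the validated 0.12.3 version; any other output means the environment is not on the validated release.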

Default Installs Latest Version

The standard Ollama install command fetches the latest version, not the validated 0.12.3. Using the latest version can lead to performance issues with HostedAI. Always use the specific command provided to install version 0.12.3.

Avoid Ollama Version 0.12.5 and Later Versions

Ollama version 0.12.5 and later releases have not been validated for HostedAI and may cause performance issues or instability. Do not use the default installation command

sudo curl -fsSL https://ollama.com/install.sh | sh

because it installs the latest version, which is not guaranteed to be compatible with HostedAI.
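One way to guard against accidentally running an unvalidated release is a preflight check that compares the installed version against the 0.12.5 cutoff. This is a sketch, assuming `sort -V` (GNU version sort) is available; the installed version is stubbed for illustration and would normally be parsed from `ollama --version`:

```shell
#!/bin/sh
CUTOFF="0.12.5"

# Stubbed for illustration; in practice parse this from `ollama --version`
INSTALLED="0.12.3"

# sort -V orders version strings numerically; if the cutoff sorts first
# (or ties), the installed version is at or beyond the unvalidated range
LOWEST="$(printf '%s\n' "$CUTOFF" "$INSTALLED" | sort -V | head -n1)"

if [ "$LOWEST" = "$CUTOFF" ]; then
  echo "Unsupported: $INSTALLED is >= $CUTOFF; reinstall 0.12.3"
else
  echo "Supported: $INSTALLED is below the $CUTOFF cutoff"
fi
```

Using `sort -V` rather than a plain string comparison matters here: it orders `0.12.3` before `0.12.5` and also handles multi-digit components (e.g. `0.12.10`) correctly.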