detect_ollama()
Detect whether a local Ollama instance is running.
Usage
detect_ollama(
    url=None,
    *,
    timeout=2.0,
)

Pings the Ollama HTTP API and returns connection status, server version, and the list of available models.
Parameters
url: str | None = None
    Base URL for the Ollama API. Defaults to http://localhost:11434. Can also be set via the OLLAMA_HOST environment variable.

timeout: float = 2.0
    Connection timeout in seconds.
Returns
OllamaStatus
    Status object with availability, version, and model list.
Examples
import talk_box as tb

status = tb.detect_ollama()
if status.available:
    print(f"Ollama {status.version} running with {len(status.models)} models")
    for model in status.models:
        print(f"  - {model}")
else:
    print(f"Ollama not available: {status.error}")