ModelProfile

Capability profile for a specific LLM model.

Usage


ModelProfile()

Parameters

provider: str

Provider name (e.g., "anthropic", "openai", "ollama").

model: str

Model identifier (e.g., "claude-sonnet-4-6", "gpt-4o").

display_name: str = ""

Human-readable name for display in tables and reports.

context_window: int | None = None

Maximum context window size in tokens.

max_output_tokens: int | None = None

Maximum output tokens the model can generate, if known.

supports_tools: bool | None = None

Whether the model supports tool/function calling.

supports_vision: bool | None = None

Whether the model can process image inputs.

supports_structured_output: bool | None = None

Whether the model supports structured (JSON schema) output.

supports_streaming: bool | None = None

Whether the model supports streaming responses.

cost_tier: CostTier | None = None

Relative cost tier for the model.

knowledge_cutoff: str | None = None

Knowledge cutoff date string (e.g., "2025-04"), if known.

notes: str = ""

Any additional notes about the model.

Attributes

Name   Description
key    Canonical provider:model key.
name   Display name, falling back to model identifier.

key

Canonical provider:model key.

key: str


name

Display name, falling back to model identifier.

name: str

Methods

Name         Description
supports()   Check whether a capability is supported.

supports()

Check whether a capability is supported.

Usage


supports(capability)

Parameters

capability: str

One of "tools", "vision", "structured_output", "streaming".

Returns

bool | None

True/False if known, None if unknown.

Raises

ValueError

If the capability name is not recognised.