
Model settings

ModelSettings dataclass

Settings to use when calling an LLM.

This class holds optional model configuration parameters (e.g. temperature, top_p, penalties, truncation, etc.).

Not all models/providers support all of these parameters, so please check the API documentation for the specific model and provider you are using.
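As a quick illustration, here is a minimal sketch that reproduces a trimmed subset of the dataclass from the listing below (not the full class). Every field defaults to None, so only parameters you set explicitly carry values; everything else is left for the provider's defaults:

```python
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class ModelSettings:
    # Trimmed reproduction for illustration; see the full source below.
    temperature: float | None = None
    top_p: float | None = None
    max_tokens: int | None = None


settings = ModelSettings(temperature=0.2, max_tokens=1024)
# settings.top_p is still None, so the provider's default top_p applies.
```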

Source code in src/cai/sdk/agents/model_settings.py
@dataclass
class ModelSettings:
    """Settings to use when calling an LLM.

    This class holds optional model configuration parameters (e.g. temperature,
    top_p, penalties, truncation, etc.).

    Not all models/providers support all of these parameters, so please check the API documentation
    for the specific model and provider you are using.
    """

    temperature: float | None = None
    """The temperature to use when calling the model."""

    top_p: float | None = None
    """The top_p to use when calling the model."""

    frequency_penalty: float | None = None
    """The frequency penalty to use when calling the model."""

    presence_penalty: float | None = None
    """The presence penalty to use when calling the model."""

    tool_choice: Literal["auto", "required", "none"] | str | None = None
    """The tool choice to use when calling the model."""

    parallel_tool_calls: bool | None = None
    """Whether to use parallel tool calls when calling the model.
    Defaults to False if not provided."""

    truncation: Literal["auto", "disabled"] | None = None
    """The truncation strategy to use when calling the model."""

    max_tokens: int | None = None
    """The maximum number of output tokens to generate."""

    store: bool | None = None
    """Whether to store the generated model response for later retrieval.
    Defaults to True if not provided."""

    agent_model: str | None = None
    """The model from the Agent class. If set, this will override the model provided
    to the OpenAIChatCompletionsModel during initialization."""

    def resolve(self, override: ModelSettings | None) -> ModelSettings:
        """Produce a new ModelSettings by overlaying any non-None values from the
        override on top of this instance."""
        if override is None:
            return self

        changes = {
            field.name: getattr(override, field.name)
            for field in fields(self)
            if getattr(override, field.name) is not None
        }
        return replace(self, **changes)

temperature

temperature: float | None = None

The temperature to use when calling the model.

top_p

top_p: float | None = None

The top_p to use when calling the model.

frequency_penalty

frequency_penalty: float | None = None

The frequency penalty to use when calling the model.

presence_penalty

presence_penalty: float | None = None

The presence penalty to use when calling the model.

tool_choice

tool_choice: Literal["auto", "required", "none"] | str | None = None

The tool choice to use when calling the model.

parallel_tool_calls

parallel_tool_calls: bool | None = None

Whether to use parallel tool calls when calling the model. Defaults to False if not provided.

truncation

truncation: Literal["auto", "disabled"] | None = None

The truncation strategy to use when calling the model.

max_tokens

max_tokens: int | None = None

The maximum number of output tokens to generate.

store

store: bool | None = None

Whether to store the generated model response for later retrieval. Defaults to True if not provided.

agent_model

agent_model: str | None = None

The model from the Agent class. If set, this will override the model provided to the OpenAIChatCompletionsModel during initialization.

resolve

resolve(override: ModelSettings | None) -> ModelSettings

Produce a new ModelSettings by overlaying any non-None values from the override on top of this instance.

Source code in src/cai/sdk/agents/model_settings.py
def resolve(self, override: ModelSettings | None) -> ModelSettings:
    """Produce a new ModelSettings by overlaying any non-None values from the
    override on top of this instance."""
    if override is None:
        return self

    changes = {
        field.name: getattr(override, field.name)
        for field in fields(self)
        if getattr(override, field.name) is not None
    }
    return replace(self, **changes)
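A worked sketch of the overlay semantics (again using a trimmed reproduction of the dataclass, not the installed package): fields that are None in the override keep the base value, while non-None fields win. Note the corollary: because only non-None values overlay, `resolve` cannot be used to reset a field back to None.

```python
from __future__ import annotations
from dataclasses import dataclass, fields, replace


@dataclass
class ModelSettings:
    # Trimmed reproduction for illustration.
    temperature: float | None = None
    top_p: float | None = None
    max_tokens: int | None = None

    def resolve(self, override: ModelSettings | None) -> ModelSettings:
        """Overlay any non-None values from `override` on top of `self`."""
        if override is None:
            return self
        changes = {
            f.name: getattr(override, f.name)
            for f in fields(self)
            if getattr(override, f.name) is not None
        }
        # replace() returns a new instance; neither input is mutated.
        return replace(self, **changes)


base = ModelSettings(temperature=0.7, max_tokens=512)
merged = base.resolve(ModelSettings(temperature=0.2))
# merged.temperature == 0.2 (overridden), merged.max_tokens == 512 (kept)
```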