bridge.services.protocols.llm_provider module#
Typed chat message model and an LLMProvider protocol.
- class bridge.services.protocols.llm_provider.ChatMessage(**data)[source]#
Bases: BaseModel

A message in a chat conversation.
- Parameters:
role (Literal['system', 'user', 'assistant'])
content (str)
- role#
The role of the message sender. One of:
- “system”: Instructions or global context for the LLM assistant.
- “user”: A message from the user.
- “assistant”: A message from the LLM assistant.
- Type:
Literal[“system”, “user”, “assistant”]
- content#
The content of the message.
- Type:
str
- content: str#
- role: Literal['system', 'user', 'assistant']#
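A minimal usage sketch of the model above. To keep the example dependency-free, `ChatMessage` is re-created here as a hypothetical dataclass stand-in that mirrors the documented fields and role validation; the real class is a pydantic `BaseModel` and should be imported from `bridge.services.protocols.llm_provider`.

```python
from dataclasses import dataclass
from typing import Literal

Role = Literal["system", "user", "assistant"]


@dataclass(frozen=True)
class ChatMessage:
    """Stand-in mirroring the documented ChatMessage fields."""

    role: Role
    content: str

    def __post_init__(self) -> None:
        # Reject roles outside the documented literal set,
        # as pydantic validation would for the real model.
        if self.role not in ("system", "user", "assistant"):
            raise ValueError(f"invalid role: {self.role!r}")


# Build a short conversation, oldest message first.
conversation = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="What is the capital of France?"),
]
print(conversation[0].role)  # system
```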
- class bridge.services.protocols.llm_provider.LLMProvider(*args, **kwargs)[source]#
Bases: Protocol

Protocol for LLM providers that generate chat-based responses.
- async generate(messages)[source]#
Generate a chat-based response from the model.
- Parameters:
messages (list[ChatMessage])
- Return type:
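A sketch of implementing this protocol. Because `LLMProvider` is a structural `Protocol`, any class with a matching async `generate` method satisfies it; no inheritance is needed. The docs do not show `generate`'s return type, so the `str` annotation and the `EchoProvider` class below are assumptions for illustration, and `ChatMessage` is replaced by a dataclass stand-in to keep the sketch self-contained.

```python
import asyncio
from dataclasses import dataclass
from typing import Literal, Protocol, runtime_checkable


@dataclass(frozen=True)
class ChatMessage:
    """Stand-in for the documented pydantic ChatMessage model."""

    role: Literal["system", "user", "assistant"]
    content: str


@runtime_checkable
class LLMProvider(Protocol):
    """Sketch of the documented protocol. The return type is not
    shown in the docs; str is assumed here."""

    async def generate(self, messages: list[ChatMessage]) -> str: ...


class EchoProvider:
    """Hypothetical toy provider: satisfies LLMProvider structurally
    by echoing the content of the last message."""

    async def generate(self, messages: list[ChatMessage]) -> str:
        return f"echo: {messages[-1].content}"


async def main() -> None:
    # Structural typing: EchoProvider is accepted wherever an
    # LLMProvider is expected, without subclassing it.
    provider: LLMProvider = EchoProvider()
    reply = await provider.generate(
        [ChatMessage(role="user", content="ping")]
    )
    print(reply)  # echo: ping


asyncio.run(main())
```

`@runtime_checkable` is optional; it only enables `isinstance` checks against the protocol's method names at runtime.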