Conversation

Manages sequences of messages in conversational AI interactions. Conversation instances are primarily returned by ChatBot chat methods.

USAGE

Conversation(conversation_id=None)

Conversation instances are created and returned automatically by ChatBot chat methods, serving as the primary container for multi-turn conversations between users and AI assistants. Users typically start by configuring a ChatBot and then work with the Conversation objects it returns, which handle message history, context management, and persistence.

Integration with ChatBot:

This design ensures that users naturally discover conversation management capabilities through normal ChatBot usage, while providing advanced users direct access to conversation-level operations for specialized workflows.
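
As a minimal sketch of that flow (assuming, as described above, that the chat method returns a Conversation; the preset name and prompt here are illustrative):

from talk_box import ChatBot

# Configure a chatbot; the Conversation is created for you by the chat call
bot = ChatBot().preset("technical_advisor")

# The returned Conversation carries the message history for this exchange
conversation = bot.chat("How should I structure a Python package?")

print(len(conversation))                        # number of messages so far
print(conversation.get_last_message().content)  # the assistant's latest reply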

The class provides comprehensive functionality for message management, context window control, conversation persistence, and history analysis. It automatically handles chronological ordering, role-based filtering, and context trimming so that conversations stay within model limits.

Conversations serve as containers for sequences of Message objects, providing both high-level convenience methods and fine-grained control over message operations. The class supports automatic context window management to stay within model limitations, conversation serialization for persistence, and extensive querying capabilities for conversation analysis and processing.

Parameters

conversation_id : str = None

Unique identifier for the conversation. If not provided, automatically generated using uuid4() to ensure global uniqueness. This enables reliable conversation tracking and referencing across systems and sessions.
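
For example (the explicit identifier below is purely illustrative):

from talk_box import Conversation

# Omit conversation_id to receive an auto-generated uuid4 identifier
auto_conversation = Conversation()
print(auto_conversation.conversation_id)

# Supply an identifier to tie the conversation to an external record
tracked_conversation = Conversation(conversation_id="support-session-42")
print(tracked_conversation.conversation_id)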

Returns

Conversation

A new Conversation instance with empty message history and default settings.

Core Message Operations

The Conversation class provides multiple ways to add and manage messages:
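
For instance, a minimal sketch using the role-specific helpers (the message text is illustrative):

from talk_box import Conversation

conversation = Conversation()

# Role-specific helpers append messages in chronological order
conversation.add_system_message("You are a concise, helpful assistant.")
conversation.add_user_message("What does this class do?")
conversation.add_assistant_message("It stores and manages the conversation history.")

# Optional metadata can be attached to any message
conversation.add_user_message("Thanks!", metadata={"sentiment": "positive"})

print(len(conversation))  # 4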

Message Retrieval and Filtering

Query and filter messages using flexible retrieval methods:
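
A short sketch of the retrieval helpers, continuing the conversation built in the sketch above:

# All messages, in chronological order
all_messages = conversation.get_messages()

# Only messages from a particular role
user_messages = conversation.get_messages(role="user")

# The most recent message, optionally restricted by role
last_message = conversation.get_last_message()
last_user_message = conversation.get_last_message(role="user")

print(len(all_messages), len(user_messages))
print(last_user_message.content)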

Context Window Management

Control conversation length and model context efficiently:

Context windows automatically manage conversation length by keeping only the most recent messages when conversations exceed the specified limit. This ensures optimal performance with language models while preserving conversation continuity.
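
A minimal sketch, assuming the behavior described above and demonstrated in the Examples section:

conversation = Conversation()
conversation.set_context_window(6)  # keep the most recent messages in context

for i in range(10):
    conversation.add_user_message(f"Question {i + 1}")
    conversation.add_assistant_message(f"Answer {i + 1}")

# Compare the full history with the trimmed context view
print(len(conversation.get_messages()))
print(len(conversation.get_context_messages()))  # at most 6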

Serialization and Persistence

Save and restore conversations using built-in serialization:

The serialization format preserves all conversation metadata, timestamps, and message history, enabling reliable persistence across sessions and systems.
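
A brief sketch of the round trip (the same pattern appears in the persistence example below):

conversation = Conversation()
conversation.add_user_message("Remember this for later.")

# Serialize to a plain dictionary, store it wherever you like, then rebuild
data = conversation.to_dict()
restored = Conversation.from_dict(data)

print(restored.conversation_id == conversation.conversation_id)
print(len(restored) == len(conversation))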

Examples


Creating and managing basic conversations

Create a conversation and add messages using convenience methods:

from talk_box import Conversation

# Create a new conversation
conversation = Conversation()

# Add messages using role-specific methods
conversation.add_user_message("Hello! I need help with Python programming.")
conversation.add_assistant_message("I'd be happy to help! What specific Python topic would you like to explore?")
conversation.add_user_message("How do I create and use classes in Python?")

print(f"Conversation has {len(conversation)} messages")
print(f"Last message: {conversation.get_last_message().content}")

Working with system messages and context

Use system messages to provide context and instructions:

# Create conversation with initial system context
tech_conversation = Conversation()

# Set up the assistant's behavior
tech_conversation.add_system_message(
    "You are a senior Python developer. Provide detailed, practical answers with code examples."
)

# Add conversation context
tech_conversation.add_user_message(
    "I'm building a web API and need help with error handling.",
    metadata={"project": "web_api", "experience_level": "intermediate"}
)

tech_conversation.add_assistant_message(
    "For robust API error handling, I recommend using structured exception handling...",
    metadata={"code_examples": True, "topics": ["exceptions", "api_design"]}
)

# Get all assistant messages
assistant_responses = tech_conversation.get_messages(role="assistant")
print(f"Assistant provided {len(assistant_responses)} responses")

Context window management for long conversations

Control conversation length to work within model limitations:

# Create conversation with context window
long_conversation = Conversation()
long_conversation.set_context_window(10)  # Keep only last 10 messages

# Add many messages (simulating a long conversation)
for i in range(15):
    long_conversation.add_user_message(f"User message {i+1}")
    long_conversation.add_assistant_message(f"Assistant response {i+1}")

# Check total vs context messages
all_messages = long_conversation.get_messages()
context_messages = long_conversation.get_context_messages()

print(f"Total messages: {len(all_messages)}")
print(f"Context messages: {len(context_messages)}")
print(f"First context message: {context_messages[0].content}")

Conversation analysis and filtering

Analyze conversation patterns and extract insights:

# Create a conversation with varied message types
analysis_conversation = Conversation()

# Add messages with rich metadata
analysis_conversation.add_user_message(
    "What's the weather like?",
    metadata={"intent": "weather_query", "urgency": "low"}
)

analysis_conversation.add_assistant_message(
    "I don't have access to real-time weather data...",
    metadata={"capability": "limitation", "suggestion": "weather_api"}
)

analysis_conversation.add_user_message(
    "Can you help me debug this code?",
    metadata={"intent": "code_help", "urgency": "high", "topic": "debugging"}
)

# Analyze conversation patterns
user_messages = analysis_conversation.get_messages(role="user")
urgent_messages = [
    msg for msg in user_messages
    if msg.metadata.get("urgency") == "high"
]

code_related = [
    msg for msg in analysis_conversation.get_messages()
    if "code" in msg.content.lower() or msg.metadata.get("topic") == "debugging"
]

print(f"Urgent user requests: {len(urgent_messages)}")
print(f"Code-related messages: {len(code_related)}")

Conversation persistence and restoration

Save and restore conversations for session management:

import json
from pathlib import Path

# Create and populate a conversation
original_conversation = Conversation()
original_conversation.add_user_message("Save this conversation")
original_conversation.add_assistant_message("I'll help you save this conversation data")

# Serialize to dictionary
conversation_data = original_conversation.to_dict()

# Save to file (example - you might use databases, APIs, etc.)
save_path = Path("conversation_backup.json")
with save_path.open("w") as f:
    json.dump(conversation_data, f, indent=2)

# Later: restore from file
with save_path.open("r") as f:
    loaded_data = json.load(f)

# Reconstruct conversation
restored_conversation = Conversation.from_dict(loaded_data)

print(f"Original ID: {original_conversation.conversation_id}")
print(f"Restored ID: {restored_conversation.conversation_id}")
print(f"Messages match: {len(original_conversation) == len(restored_conversation)}")

# Clean up
save_path.unlink()

Multi-turn conversation workflows

Build complex conversation workflows with branching logic:

# Customer support conversation workflow
support_conversation = Conversation()

# Initial system setup
support_conversation.add_system_message(
    "You are a helpful customer support agent. Gather information before providing solutions."
)

# Customer inquiry
support_conversation.add_user_message(
    "My order hasn't arrived yet",
    metadata={"category": "shipping", "sentiment": "concerned"}
)

# Support response with information gathering
support_conversation.add_assistant_message(
    "I understand your concern. Can you provide your order number so I can check the status?",
    metadata={"action": "information_gathering", "next_step": "order_lookup"}
)

# Customer provides information
support_conversation.add_user_message(
    "Order #12345",
    metadata={"order_id": "12345", "info_provided": True}
)

# Check conversation flow
last_assistant_msg = support_conversation.get_last_message(role="assistant")
next_step = last_assistant_msg.metadata.get("next_step")
print(f"Next action needed: {next_step}")

# Determine conversation flow based on metadata
user_messages = support_conversation.get_messages(role="user")
has_order_info = any(msg.metadata.get("info_provided") for msg in user_messages)

if has_order_info:
    support_conversation.add_assistant_message(
        "Thank you! I'm looking up order #12345 now...",
        metadata={"action": "order_lookup", "order_id": "12345"}
    )

Integration with chatbot systems

Use conversations as the foundation for chatbot interactions:

from talk_box import ChatBot

# Create chatbot and conversation
bot = ChatBot().preset("technical_advisor")
bot_conversation = Conversation()

# Set up conversation context
bot_conversation.add_system_message(
    "You are a technical advisor specializing in Python and machine learning."
)

# Simulate chat interaction
def chat_with_bot(user_input: str) -> str:
    # Add user message
    bot_conversation.add_user_message(user_input)

    # Get bot response (simplified - real implementation would use bot.chat())
    response = f"Technical response to: {user_input}"
    bot_conversation.add_assistant_message(response)

    return response

# Use the chat function
response1 = chat_with_bot("What is machine learning?")
response2 = chat_with_bot("How do I start with scikit-learn?")

print(f"Conversation now has {len(bot_conversation)} messages")
print(f"Latest response: {bot_conversation.get_last_message().content}")

Advanced Features

Message Metadata: Each message can carry rich metadata for application-specific data, conversation analysis, and workflow management.

Automatic Timestamps: All messages include creation timestamps for chronological analysis and conversation timeline reconstruction.

Context Management: Intelligent context window handling ensures conversations stay within model limits while preserving important context.

Role-based Organization: Standard roles (user, assistant, system, function) provide clear conversation structure for AI processing.

Extensible Design: The conversation format supports custom roles, metadata schemas, and integration patterns for specialized use cases.

Memory Efficiency: Context windows and message filtering enable efficient handling of very long conversations without memory issues.

Integration Notes

  • Thread Safety: Conversations are not thread-safe; use external synchronization for concurrent access (see the sketch after this list)
  • Memory Usage: Long conversations should use context windows to manage memory efficiently
  • Serialization: All message content and metadata must be JSON-serializable for persistence
  • Timestamps: Uses local time zone-naive datetime objects; consider UTC for distributed systems
  • ID Uniqueness: Conversation and message IDs are globally unique UUID4 strings
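
A minimal sketch of one way to add that synchronization, using a standard threading.Lock (an illustration, not a talk_box API):

import threading

from talk_box import Conversation

conversation = Conversation()
conversation_lock = threading.Lock()

def record_turn(user_text: str, assistant_text: str) -> None:
    # Guard every mutation of the shared Conversation with the same lock
    with conversation_lock:
        conversation.add_user_message(user_text)
        conversation.add_assistant_message(assistant_text)

threads = [
    threading.Thread(target=record_turn, args=(f"Question {i}", f"Answer {i}"))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(conversation))  # 8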

The Conversation class provides the foundation for sophisticated conversational AI applications, enabling everything from simple chat interfaces to complex multi-turn workflows with rich context management and analysis capabilities.