Conversation
Manages sequences of messages in conversational AI interactions, primarily returned by ChatBot methods.
USAGE
```python
Conversation(conversation_id=None)
```
The Conversation class is automatically created and returned by ChatBot chat methods, serving as the primary container for managing multi-turn conversations between users and AI assistants. This integration with ChatBot creates a natural progression where users start with chatbot configuration and receive Conversation objects that handle message history, context management, and persistence.
Integration with ChatBot:
- Automatically created by ChatBot.chat()
- Returned by ChatBot.start_conversation()
- Updated by ChatBot.continue_conversation()
This design ensures that users naturally discover conversation management capabilities through normal ChatBot usage, while providing advanced users direct access to conversation-level operations for specialized workflows.
The class provides comprehensive functionality for message management, context window control, conversation persistence, and history analysis. It automatically handles chronological ordering, role-based filtering, and efficient context management for optimal AI performance.
Conversations serve as containers for sequences of Message objects, providing both high-level convenience methods and fine-grained control over message operations. The class supports automatic context window management to stay within model limitations, conversation serialization for persistence, and extensive querying capabilities for conversation analysis and processing.
Parameters
conversation_id : str = None
    Unique identifier for the conversation. If not provided, one is automatically generated using uuid4() to ensure global uniqueness. This enables reliable conversation tracking and referencing across systems and sessions.
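The default identifier behavior can be sketched with the standard library's uuid4; this is a minimal illustration of the documented contract, not talk_box's actual internals (the helper name is hypothetical):

```python
from uuid import UUID, uuid4

# Sketch: how a default conversation ID might be chosen when none is supplied
def new_conversation_id(conversation_id=None):
    return conversation_id if conversation_id is not None else str(uuid4())

auto_id = new_conversation_id()
custom_id = new_conversation_id("support-session-001")

# Auto-generated IDs are valid UUID4 strings; explicit IDs pass through unchanged
assert UUID(auto_id).version == 4
assert custom_id == "support-session-001"
```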
Returns
Conversation
    A new Conversation instance with empty message history and default settings.
Context Window Management
Control conversation length and model context efficiently:
- set_context_window(): Set the maximum message count
- get_context_messages(): Get messages within the window
Context windows automatically manage conversation length by keeping only the most recent messages when conversations exceed the specified limit. This ensures optimal performance with language models while preserving conversation continuity.
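The windowing behavior described above amounts to keeping a tail slice of the message history. A minimal sketch of that logic, with plain strings standing in for Message objects (not the library's implementation):

```python
def apply_context_window(messages, max_messages):
    """Keep only the most recent max_messages entries, preserving order."""
    if max_messages is None:
        return list(messages)
    return messages[-max_messages:]

history = [f"message {i}" for i in range(1, 16)]  # 15 messages total
window = apply_context_window(history, 10)

# The window holds the 10 most recent messages; older ones are dropped
assert len(window) == 10
assert window[0] == "message 6"
assert window[-1] == "message 15"
```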
Serialization and Persistence
Save and restore conversations using built-in serialization:
- to_dict(): Convert to dictionary format
- from_dict(): Create from dictionary data
The serialization format preserves all conversation metadata, timestamps, and message history, enabling reliable persistence across sessions and systems.
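Because every field in the serialized form is JSON-serializable, a dump/load round trip is lossless. A sketch with a hypothetical dictionary shape (the real to_dict() schema may differ):

```python
import json
from datetime import datetime, timezone

# Hypothetical serialized shape; field names are illustrative assumptions
conversation_data = {
    "conversation_id": "example-conversation-id",
    "messages": [
        {
            "role": "user",
            "content": "Hello!",
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "metadata": {"intent": "greeting"},
        }
    ],
}

# JSON-serializable content round-trips without loss
restored = json.loads(json.dumps(conversation_data))
assert restored == conversation_data
```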
Examples
Creating and managing basic conversations
Create a conversation and add messages using convenience methods:
```python
from talk_box import Conversation

# Create a new conversation
conversation = Conversation()

# Add messages using role-specific methods
conversation.add_user_message("Hello! I need help with Python programming.")
conversation.add_assistant_message("I'd be happy to help! What specific Python topic would you like to explore?")
conversation.add_user_message("How do I create and use classes in Python?")

print(f"Conversation has {len(conversation)} messages")
print(f"Last message: {conversation.get_last_message().content}")
```
Working with system messages and context
Use system messages to provide context and instructions:
```python
# Create conversation with initial system context
tech_conversation = Conversation()

# Set up the assistant's behavior
tech_conversation.add_system_message(
    "You are a senior Python developer. Provide detailed, practical answers with code examples."
)

# Add conversation context
tech_conversation.add_user_message(
    "I'm building a web API and need help with error handling.",
    metadata={"project": "web_api", "experience_level": "intermediate"}
)
tech_conversation.add_assistant_message(
    "For robust API error handling, I recommend using structured exception handling...",
    metadata={"code_examples": True, "topics": ["exceptions", "api_design"]}
)

# Get all assistant messages
assistant_responses = tech_conversation.get_messages(role="assistant")
print(f"Assistant provided {len(assistant_responses)} responses")
```
Context window management for long conversations
Control conversation length to work within model limitations:
```python
# Create conversation with a context window
long_conversation = Conversation()
long_conversation.set_context_window(10)  # Keep only last 10 messages

# Add many messages (simulating a long conversation)
for i in range(15):
    long_conversation.add_user_message(f"User message {i+1}")
    long_conversation.add_assistant_message(f"Assistant response {i+1}")

# Check total vs context messages
all_messages = long_conversation.get_messages()
context_messages = long_conversation.get_context_messages()

print(f"Total messages: {len(all_messages)}")
print(f"Context messages: {len(context_messages)}")
print(f"First context message: {context_messages[0].content}")
```
Conversation analysis and filtering
Analyze conversation patterns and extract insights:
```python
# Create a conversation with varied message types
analysis_conversation = Conversation()

# Add messages with rich metadata
analysis_conversation.add_user_message(
    "What's the weather like?",
    metadata={"intent": "weather_query", "urgency": "low"}
)
analysis_conversation.add_assistant_message(
    "I don't have access to real-time weather data...",
    metadata={"capability": "limitation", "suggestion": "weather_api"}
)
analysis_conversation.add_user_message(
    "Can you help me debug this code?",
    metadata={"intent": "code_help", "urgency": "high", "topic": "debugging"}
)

# Analyze conversation patterns
user_messages = analysis_conversation.get_messages(role="user")
urgent_messages = [
    msg for msg in user_messages
    if msg.metadata.get("urgency") == "high"
]
code_related = [
    msg for msg in analysis_conversation.get_messages()
    if "code" in msg.content.lower() or msg.metadata.get("topic") == "debugging"
]

print(f"Urgent user requests: {len(urgent_messages)}")
print(f"Code-related messages: {len(code_related)}")
```
Conversation persistence and restoration
Save and restore conversations for session management:
```python
import json
from pathlib import Path

# Create and populate a conversation
original_conversation = Conversation()
original_conversation.add_user_message("Save this conversation")
original_conversation.add_assistant_message("I'll help you save this conversation data")

# Serialize to dictionary
conversation_data = original_conversation.to_dict()

# Save to file (example - you might use databases, APIs, etc.)
save_path = Path("conversation_backup.json")
with save_path.open("w") as f:
    json.dump(conversation_data, f, indent=2)

# Later: restore from file
with save_path.open("r") as f:
    loaded_data = json.load(f)

# Reconstruct conversation
restored_conversation = Conversation.from_dict(loaded_data)

print(f"Original ID: {original_conversation.conversation_id}")
print(f"Restored ID: {restored_conversation.conversation_id}")
print(f"Messages match: {len(original_conversation) == len(restored_conversation)}")

# Clean up
save_path.unlink()
```
Multi-turn conversation workflows
Build complex conversation workflows with branching logic:
```python
# Customer support conversation workflow
support_conversation = Conversation()

# Initial system setup
support_conversation.add_system_message(
    "You are a helpful customer support agent. Gather information before providing solutions."
)

# Customer inquiry
support_conversation.add_user_message(
    "My order hasn't arrived yet",
    metadata={"category": "shipping", "sentiment": "concerned"}
)

# Support response with information gathering
support_conversation.add_assistant_message(
    "I understand your concern. Can you provide your order number so I can check the status?",
    metadata={"action": "information_gathering", "next_step": "order_lookup"}
)

# Customer provides information
support_conversation.add_user_message(
    "Order #12345",
    metadata={"order_id": "12345", "info_provided": True}
)

# Check conversation flow
last_assistant_msg = support_conversation.get_last_message(role="assistant")
next_step = last_assistant_msg.metadata.get("next_step")
print(f"Next action needed: {next_step}")

# Determine conversation flow based on metadata
user_messages = support_conversation.get_messages(role="user")
has_order_info = any(msg.metadata.get("info_provided") for msg in user_messages)

if has_order_info:
    support_conversation.add_assistant_message(
        "Thank you! I'm looking up order #12345 now...",
        metadata={"action": "order_lookup", "order_id": "12345"}
    )
```
Integration with chatbot systems
Use conversations as the foundation for chatbot interactions:
```python
from talk_box import ChatBot

# Create chatbot and conversation
bot = ChatBot().preset("technical_advisor")
bot_conversation = Conversation()

# Set up conversation context
bot_conversation.add_system_message(
    "You are a technical advisor specializing in Python and machine learning."
)

# Simulate chat interaction
def chat_with_bot(user_input: str) -> str:
    # Add user message
    bot_conversation.add_user_message(user_input)

    # Get bot response (simplified - real implementation would use bot.chat())
    response = f"Technical response to: {user_input}"
    bot_conversation.add_assistant_message(response)
    return response

# Use the chat function
response1 = chat_with_bot("What is machine learning?")
response2 = chat_with_bot("How do I start with scikit-learn?")

print(f"Conversation now has {len(bot_conversation)} messages")
print(f"Latest response: {bot_conversation.get_last_message().content}")
```
Advanced Features
Message Metadata: Each message can carry rich metadata for application-specific data, conversation analysis, and workflow management.
Automatic Timestamps: All messages include creation timestamps for chronological analysis and conversation timeline reconstruction.
Context Management: Intelligent context window handling ensures conversations stay within model limits while preserving important context.
Role-based Organization: Standard roles (user, assistant, system, function) provide clear conversation structure for AI processing.
Extensible Design: The conversation format supports custom roles, metadata schemas, and integration patterns for specialized use cases.
Memory Efficiency: Context windows and message filtering enable efficient handling of very long conversations without memory issues.
Integration Notes
- Thread Safety: Conversations are not thread-safe; use external synchronization for concurrent access
- Memory Usage: Long conversations should use context windows to manage memory efficiently
- Serialization: All message content and metadata must be JSON-serializable for persistence
- Timestamps: Uses local time zone-naive datetime objects; consider UTC for distributed systems
- ID Uniqueness: Conversation and message IDs are globally unique UUID4 strings
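Because conversations are not thread-safe, concurrent writers need external synchronization. A minimal sketch of the lock-around-writes pattern, with a plain list standing in for a shared Conversation:

```python
import threading

messages = []            # stands in for a shared Conversation
lock = threading.Lock()  # external synchronization, as the notes above require

def add_message_safely(content):
    # Serialize writes so concurrent threads never interleave appends
    with lock:
        messages.append(content)

threads = [
    threading.Thread(target=add_message_safely, args=(f"message {i}",))
    for i in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert len(messages) == 8  # every append landed exactly once
```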
The Conversation class provides the foundation for sophisticated conversational AI applications, enabling everything from simple chat interfaces to complex multi-turn workflows with rich context management and analysis capabilities.