"""
This example builds upon example 16 (multi_model_consensus.py) but gives more agency to
the response-combiner. While example 16 uses a fixed set of models defined in the main
function, this version lets the response-combiner itself decide which models are most
appropriate for the question.
This example demonstrates agent delegation, where one agent (the response-combiner) can
dynamically invoke another agent (get_model_response) via tools. By providing the ask_model
function as a tool, the response-combiner can:
1. Choose which models to query based on the nature and complexity of the question
2. Adapt its strategy based on initial responses (e.g. asking specialized models for clarification)
3. Use its own reasoning to determine when it has enough perspectives to synthesize an answer
This hierarchical approach allows the response-combiner agent to orchestrate multiple model
queries by delegating to the get_model_response agent through tool calls. The response-combiner
acts as a "manager" agent that can strategically coordinate with "worker" agents to gather
the insights needed.
"""
import asyncio

import pytest
from pydantic import BaseModel, Field

import workflowai
from workflowai import Model

class AskModelInput(BaseModel):
    """Input for asking a question to a specific model."""

    question: str = Field(description="The question to ask")
    model: Model = Field(description="The model to ask the question to")


class AskModelOutput(BaseModel):
    """Output from asking a question to a model."""

    response: str = Field(description="The model's response to the question")

# This function acts as a tool that allows one agent to delegate to another agent.
# The response-combiner agent can use this tool to dynamically query different models
# through the get_model_response agent. This creates a hierarchy where the
# response-combiner orchestrates multiple model queries by delegating to get_model_response.
async def ask_model(query_input: AskModelInput) -> AskModelOutput:
    """Ask a specific model a question and return its response."""
    run = await get_model_response.run(
        MultiModelInput(
            question=query_input.question,
        ),
        model=query_input.model,
    )
    # get_model_response.run() returns a Run[ModelResponse], so we need to access the output
    return AskModelOutput(response=run.output.response)

class MultiModelInput(BaseModel):
    """Input model containing the question to ask all models."""

    question: str = Field(
        description="The question to ask all models",
    )


class ModelResponse(BaseModel):
    """Response from an individual model."""

    response: str = Field(description="The model's response to the question")


class CombinerInput(BaseModel):
    """Input for the response combiner."""

    original_question: str = Field(description="The question to ask multiple models")


class CombinedOutput(BaseModel):
    """Final output combining responses from all models."""

    combined_answer: str = Field(
        description="Synthesized answer combining insights from all models",
    )
    explanation: str = Field(
        description="Explanation of how the responses were combined and why",
    )
    models_used: list[str] = Field(
        description="List of models whose responses were combined",
    )

@workflowai.agent(
    id="question-answerer",
)
async def get_model_response(query: MultiModelInput) -> ModelResponse:
    """
    Make sure to:
    1. Provide a clear and detailed response
    2. Stay focused on the question asked
    3. Be specific about any assumptions made
    4. Highlight areas of uncertainty
    """
    ...

@workflowai.agent(
    id="response-combiner",
    model=Model.GPT_4O_MINI_LATEST,
    tools=[ask_model],
)
async def combine_responses(responses_input: CombinerInput) -> CombinedOutput:
    """
    Analyze and combine responses from multiple models into a single coherent answer.
    You should ask at least 3 different models to get a diverse set of perspectives.

    You are an expert at analyzing and synthesizing information from multiple sources.
    Your task is to:
    1. Review the responses from different models (at least 3)
    2. Identify key insights and unique perspectives from each
    3. Create a comprehensive answer that combines the best elements
    4. Explain your synthesis process
    5. List all models whose responses were used in the synthesis

    Please ensure the combined answer is:
    - Accurate and well-reasoned
    - Incorporates unique insights from each model
    - Clear and coherent
    - Properly attributed when using specific insights
    - Based on responses from at least 3 different models
    """
    ...

@pytest.mark.xfail(reason="Example is flaky")
async def main():
    # Example: Scientific explanation
    print("\nExample: Scientific Concept")
    print("-" * 50)

    question = "What is dark matter and why is it important for our understanding of the universe?"

    # Let the response-combiner handle asking the models
    combined = await combine_responses.run(
        CombinerInput(
            original_question=question,
        ),
    )
    print(combined)


if __name__ == "__main__":
    asyncio.run(main())