
Commit 75ffd6a

Add an example of the augmented LLM

1 parent bb08b34 commit 75ffd6a

File tree

3 files changed: +78 −0 lines changed


README.md

Lines changed: 32 additions & 0 deletions

@@ -167,6 +167,36 @@ coagent translator -H type:ChatMessage --chat -d '{"role": "user", "content": "

## Patterns

(The following patterns are mainly inspired by [Anthropic's Building effective agents][4] and [OpenAI's Orchestrating Agents][5].)

### Basic: Augmented LLM

An **augmented LLM** is an LLM enhanced with capabilities such as retrieval, tools, and memory. Current models can actively use these capabilities: generating their own search queries, selecting appropriate tools, and determining what information to retain.

<p align="center">
  <img src="assets/patterns-augmented-llm.png">
</p>

**Example** (see [examples/patterns/augmented_llm.py](examples/patterns/augmented_llm.py) for a runnable version):

```python
from coagent.agents import ChatAgent, ModelClient, tool
from coagent.core import AgentSpec, new


class Assistant(ChatAgent):
    system = """You are an agent who can use tools."""
    client = ModelClient(...)

    @tool
    async def query_weather(self, city: str) -> str:
        """Query the weather in the given city."""
        return f"The weather in {city} is sunny."


assistant = AgentSpec("assistant", new(Assistant))
```

### Workflow: Chaining

**Chaining** decomposes a task into a sequence of steps, where each agent processes the output of the previous one.

@@ -423,3 +453,5 @@ triage = AgentSpec(

[1]: https://docs.nats.io/nats-concepts/jetstream
[2]: https://modelcontextprotocol.io/introduction
[3]: https://docs.nats.io/running-a-nats-service/nats_docker/nats-docker-tutorial
[4]: https://www.anthropic.com/research/building-effective-agents
[5]: https://cookbook.openai.com/examples/orchestrating_agents
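Stripped of the framework, the augmented-LLM pattern is a loop: the model either answers directly or requests a tool call, the runtime executes the tool, and the result is fed back to the model. The following is a minimal, framework-free sketch of that loop; `fake_model` and the dictionaries it returns are illustrative stand-ins, not part of coagent:

```python
from typing import Callable, Optional

# Registry of tools the model may call, analogous to methods marked @tool.
TOOLS: dict[str, Callable[[str], str]] = {}


def tool(fn: Callable[[str], str]) -> Callable[[str], str]:
    """Register a function as a callable tool."""
    TOOLS[fn.__name__] = fn
    return fn


@tool
def query_weather(city: str) -> str:
    return f"The weather in {city} is sunny."


def fake_model(prompt: str, tool_result: Optional[str] = None) -> dict:
    """Stand-in for an LLM client: it first requests a tool call,
    then answers once the tool result is fed back."""
    if tool_result is None:
        return {"tool": "query_weather", "arg": "Beijing"}
    return {"answer": f"Based on the tool result: {tool_result}"}


def run(prompt: str) -> str:
    reply = fake_model(prompt)
    while "tool" in reply:  # the model asked for an augmentation
        result = TOOLS[reply["tool"]](reply["arg"])
        reply = fake_model(prompt, tool_result=result)
    return reply["answer"]


print(run("What's the weather like in Beijing?"))
```

A real implementation replaces `fake_model` with an LLM client and derives the tool schema from the registered functions, which is what the `@tool` decorator in the README example does.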

assets/patterns-augmented-llm.png

46.9 KB

examples/patterns/augmented_llm.py

Lines changed: 46 additions & 0 deletions

@@ -0,0 +1,46 @@

```python
import asyncio
import os

from coagent.agents import ChatAgent, ModelClient, tool
from coagent.agents.messages import ChatMessage
from coagent.core import AgentSpec, new, set_stderr_logger
from coagent.runtimes import LocalRuntime


class Assistant(ChatAgent):
    system = """You are an agent who can use tools."""
    client = ModelClient(
        model=os.getenv("MODEL_NAME"),
        api_base=os.getenv("MODEL_API_BASE"),
        api_version=os.getenv("MODEL_API_VERSION"),
        api_key=os.getenv("MODEL_API_KEY"),
    )

    @tool
    async def query_weather(self, city: str) -> str:
        """Query the weather in the given city."""
        return f"The weather in {city} is sunny."


assistant = AgentSpec("assistant", new(Assistant))


async def main():
    async with LocalRuntime() as runtime:
        await runtime.register(assistant)

        result = await assistant.run(
            ChatMessage(
                role="user",
                content="What's the weather like in Beijing?",
            ).encode(),
            stream=True,
        )
        async for chunk in result:
            msg = ChatMessage.decode(chunk)
            print(msg.content, end="", flush=True)


if __name__ == "__main__":
    set_stderr_logger("TRACE")
    asyncio.run(main())
```
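The example reads its model configuration from the environment. Assuming an OpenAI-compatible endpoint, an invocation might look like the following; all values are placeholders to substitute with your own provider details:

```shell
# Placeholder values -- substitute your own provider details.
export MODEL_NAME="gpt-4o-mini"
export MODEL_API_BASE="https://api.openai.com/v1"
export MODEL_API_KEY="sk-..."
# MODEL_API_VERSION is only needed for providers that version their API.

python examples/patterns/augmented_llm.py
```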
