Releases: microsoft/autogen

python-v0.7.4

19 Aug 18:50
00d6e78

What's Changed

New Contributors

Full Changelog: python-v0.7.3...python-v0.7.4

python-v0.7.3

19 Aug 08:07
2f3981d

What's Changed

New Contributors

Full Changelog: python-v0.7.2...python-v0.7.3

python-v0.7.2

07 Aug 00:29
c145ace

What's Changed

  • Update website 0.7.1 by @ekzhu in #6869
  • Update OpenAIAssistantAgent doc by @ekzhu in #6870
  • Update 0.7.1 website ref by @ekzhu in #6871
  • Remove assistant related methods from OpenAIAgent by @ekzhu in #6866
  • Make DockerCommandLineCodeExecutor the default for MagenticOne team by @Copilot in #6684
  • Add approval_func option to CodeExecutorAgent by @ekzhu in #6886
  • Add documentation warnings for AgentTool/TeamTool parallel tool calls limitation by @Copilot in #6883
  • Add parallel_tool_call to openai model client config by @ekzhu in #6888
  • Fix structured logging serialization data loss with SerializeAsAny annotations by @Copilot in #6889
  • Update version 0.7.2 by @ekzhu in #6895
  • Adds support for JSON and MARKDOWN in Redis agent memory by @justin-cechmanek in #6897
  • Add warning for MCP server docs by @ekzhu in #6901

Full Changelog: python-v0.7.1...python-v0.7.2

python-v0.7.1

28 Jul 08:43
1ca7419

What's New

OpenAIAgent supports all built-in tools

Support nested Team as a participant in a Team

  • Supporting Teams as Participants in a GroupChat by @ekzhu in #5863
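
For illustration, here is a minimal sketch of nesting a team as a participant. It assumes an inner RoundRobinGroupChat can be passed directly in the participants list of an outer team; the agent names and termination conditions are illustrative, not from the release notes.

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")

    # Inner team: a writer and a reviewer taking turns.
    writer = AssistantAgent("writer", model_client=model_client)
    reviewer = AssistantAgent("reviewer", model_client=model_client)
    inner_team = RoundRobinGroupChat(
        [writer, reviewer], termination_condition=MaxMessageTermination(4)
    )

    # Outer team: the inner team participates alongside a summarizer agent.
    summarizer = AssistantAgent("summarizer", model_client=model_client)
    outer_team = RoundRobinGroupChat(
        [inner_team, summarizer], termination_condition=MaxMessageTermination(8)
    )

    result = await outer_team.run(task="Draft a tagline for a coffee shop.")
    print(result.stop_reason)


asyncio.run(main())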

Introduce RedisMemory

Upgrade to latest MCP version

Upgrade to latest GraphRAG version

include_name_in_message flag to make the name field optional in chat messages sent via the OpenAI client.

  • Add include_name_in_message parameter to make name field optional in OpenAI messages by @Copilot in #6845
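
A minimal sketch of the flag, assuming it is passed to the OpenAIChatCompletionClient constructor; check the model client config reference for the exact placement.

from autogen_ext.models.openai import OpenAIChatCompletionClient

# When include_name_in_message is False, the "name" field is omitted from the
# chat messages sent to the OpenAI-compatible endpoint. Useful for providers
# that reject the field.
model_client = OpenAIChatCompletionClient(
    model="gpt-4.1-nano",
    include_name_in_message=False,
)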

All Changes

  • Feat/OpenAI agent builtin tools 6657 by @tejas-dharani in #6671
  • Setup publishing for pyautogen package by @ekzhu in #6813
  • In Add required termination condition and missing agent_e by @dave-howard in #6809
  • Fix JSON serialization of team state by handling datetime objects in message dump by @Copilot in #6797
  • Upgrade_mcp_version by @victordibia in #6814
  • Update AGS (Support Workbenches ++) by @victordibia in #6736
  • feat: add timeout for http tools by @lo5twind in #6818
  • Expand MCP Workbench to support more MCP Client features by @tylerpayne in #6785
  • Deprecating openai assistant agent. Apply version conditioned import for open ai version < 1.83 by @ekzhu in #6827
  • Fix OpenAI UnprocessableEntityError when AssistantAgent makes multiple tool calls by @Copilot in #6799
  • fix: use correct format when adding memory to mem0 by @savy-91 in #6831
  • Adds Redis Memory extension class by @justin-cechmanek in #6743
  • Add support for "format": "json" in JSON schemas by @onematchfox in #6846
  • docs: correct function spelling by @savy-91 in #6849
  • Add include_name_in_message parameter to make name field optional in OpenAI messages by @Copilot in #6845
  • upgrade graphrag sample to v2.3+ by @victordibia in #6744
  • fix: load agent correctly in test service by @zrquan in #6860
  • Update installation guide in _openai_assistant_agent.py by @ekzhu in #6863
  • fix: use ```sh consistently by @zrquan in #6864
  • Supporting Teams as Participants in a GroupChat by @ekzhu in #5863
  • Update version to 0.7.0 by @ekzhu in #6865
  • Bring back OpenAIAssistantAgent by @ekzhu in #6867
  • Update version to 0.7.1 by @ekzhu in #6868

New Contributors

Full Changelog: python-v0.6.4...python-v0.7.1

python-v0.6.4

09 Jul 17:52
9f2c5aa

What's New

More help from @copilot-swe-agent in this release.

Improvements to GraphFlow

Now it behaves the same way as RoundRobinGroupChat, SelectorGroupChat, and others after a termination condition is hit: it retains its execution state and can be resumed with a new task or an empty task. The execution state is reset only when the graph finishes execution, i.e., when there is no next agent left to choose from.

Also, the inner StopAgent has been removed, so there is no longer a final message coming from the StopAgent. Instead, the stop_reason field in the TaskResult carries the stop message.
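
A rough sketch of the resume behavior described above; the agent names and the termination condition are illustrative only.

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    a = AssistantAgent("A", model_client=model_client)
    b = AssistantAgent("B", model_client=model_client)

    builder = DiGraphBuilder()
    builder.add_node(a).add_node(b)
    builder.add_edge(a, b)
    team = GraphFlow(
        participants=[a, b],
        graph=builder.build(),
        termination_condition=MaxMessageTermination(2),
    )

    # First run stops early when the termination condition hits.
    result = await team.run(task="Summarize the benefits of unit tests.")
    print(result.stop_reason)

    # The graph has not finished, so the execution state is retained and the
    # run can be resumed without providing a new task.
    result = await team.run()
    print(result.stop_reason)


asyncio.run(main())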

Improvements to Workbench implementations

McpWorkbench and StaticWorkbench now support overriding tool names and descriptions. This allows client-side customization of server-side tools for better adaptability.

All Changes

New Contributors

Full Changelog: python-v0.6.2...python-v0.6.4

python-v0.6.2

01 Jul 00:09
556033b

What's New

Streaming Tools

This release introduces streaming tools and updates AgentTool and TeamTool to support run_json_stream. The new interface exposes a tool's inner events when calling run_stream on agents and teams. AssistantAgent has also been updated to use run_json_stream when a tool supports streaming, so when using AgentTool or TeamTool with AssistantAgent, you can receive the inner agent's or team's events through the main agent.

To create a new streaming tool, subclass autogen_core.tools.BaseStreamTool and implement run_stream. To create a new streaming workbench, subclass autogen_core.tools.StreamWorkbench and implement call_tool_stream.

  • Introduce streaming tool and support streaming for AgentTool and TeamTool. by @ekzhu in #6712
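
A hedged sketch of surfacing an inner agent's events through AgentTool; the AgentTool constructor argument shown here is an assumption based on the description above, so verify it against the AgentTool reference.

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.tools import AgentTool
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")

    # Inner agent wrapped as a streaming-capable tool.
    translator = AssistantAgent(
        "translator", model_client=model_client, system_message="Translate input to French."
    )
    translator_tool = AgentTool(agent=translator)

    # Outer agent that calls the tool; with streaming tools, its run_stream
    # also yields the inner agent's events.
    assistant = AssistantAgent(
        "assistant", model_client=model_client, tools=[translator_tool]
    )
    async for event in assistant.run_stream(task="Say hello in French."):
        print(event)


asyncio.run(main())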

tool_choice parameter for ChatCompletionClient and subclasses

Introduces a new tool_choice parameter to the ChatCompletionClient create and create_stream methods.

This is also the first PR by @copilot-swe-agent!

  • Add tool_choice parameter to ChatCompletionClient create and create_stream methods by @copilot-swe-agent in #6697
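
A minimal sketch of the new parameter in use; the FunctionTool wrapper and the "required" value are assumptions about typical usage, not taken from the PR.

import asyncio

from autogen_core.models import UserMessage
from autogen_core.tools import FunctionTool
from autogen_ext.models.openai import OpenAIChatCompletionClient


def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"The weather in {city} is sunny."


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    weather_tool = FunctionTool(get_weather, description="Get the weather for a city.")

    # tool_choice controls whether the model may, must, or must not call a tool.
    result = await model_client.create(
        messages=[UserMessage(content="What's the weather in Paris?", source="user")],
        tools=[weather_tool],
        tool_choice="required",
    )
    print(result.content)


asyncio.run(main())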

AssistantAgent's inner tool calling loop

Now you can enable an inner tool-calling loop in AssistantAgent by setting the max_tool_iterations parameter through its constructor. The new implementation calls the model and executes tools until (1) the model stops generating tool calls, or (2) max_tool_iterations has been reached. This change simplifies the usage of AssistantAgent.
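
A short sketch of the inner loop, assuming max_tool_iterations is a constructor keyword as described; the tool and names are illustrative.

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient


def lookup_stock_price(symbol: str) -> str:
    """Return a placeholder price for the given ticker symbol."""
    return f"{symbol}: 123.45 USD"


model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")

# With max_tool_iterations > 1, the agent keeps calling the model and executing
# tools until the model stops requesting tool calls or the limit is reached.
agent = AssistantAgent(
    "analyst",
    model_client=model_client,
    tools=[lookup_stock_price],
    max_tool_iterations=5,
)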

OpenTelemetry GenAI Traces

This release adds new traces create_agent, invoke_agent, and execute_tool following the GenAI Semantic Conventions.

  • OTel GenAI Traces for Agent and Tool by @ekzhu in #6653

You can also disable agent runtime traces by setting the environment variable AUTOGEN_DISABLE_RUNTIME_TRACING=true.

output_task_messages flag for run and run_stream

You can use the new flag to control whether the input task messages are emitted as part of run_stream for agents and teams.
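
A hedged sketch of the flag, assuming it is a keyword argument of run_stream.

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    agent = AssistantAgent("assistant", model_client=model_client)

    # With output_task_messages=False, the input task messages are not echoed
    # back in the stream; only the agent's own messages and events are yielded.
    async for event in agent.run_stream(
        task="Give me one fun fact about octopuses.",
        output_task_messages=False,
    ):
        print(event)


asyncio.run(main())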

Mem0 Extension

Added Mem0 memory extension so you can use it as memory for AutoGen agents.

Improvement to GraphFlow

  • Add activation group for workflow with multiple cycles by @ZenWayne in #6711

uv update

We have removed the uv version limit so you can use the latest version to develop AutoGen.

  • Unpin uv version to use the latest version by @ekzhu in #6713

Other Python Related Changes

New Contributors

Full Changelog: python-v0.6.1...python-v0.6.2

python-v0.6.1

05 Jun 05:58
348bcb1

Bug Fixes

  • Fix bug in GraphFlow cycle check by @ekzhu in #6629
  • Fix graph validation logic and add tests by @ekzhu in #6630

Others

  • Add list of function calls and results in ToolCallSummaryMessage by @ekzhu in #6626

Full Changelog: python-v0.6.0...python-v0.6.1

python-v0.6.0

05 Jun 00:37
16e1943

What's New

Change to BaseGroupChatManager.select_speaker and support for concurrent agents in GraphFlow

We made a type hint change to the select_speaker method of BaseGroupChatManager to allow for a list of agent names as a return value. This makes it possible to support concurrent agents in GraphFlow, such as in a fan-out-fan-in pattern.

# Original signature:
async def select_speaker(self, thread: Sequence[BaseAgentEvent | BaseChatMessage]) -> str:
  ...

# New signature:
async def select_speaker(self, thread: Sequence[BaseAgentEvent | BaseChatMessage]) -> List[str] | str:
  ...

Now you can run GraphFlow with concurrent agents as follows:

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main():
    # Initialize agents with OpenAI model clients.
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    agent_a = AssistantAgent("A", model_client=model_client, system_message="You are a helpful assistant.")
    agent_b = AssistantAgent("B", model_client=model_client, system_message="Translate input to Chinese.")
    agent_c = AssistantAgent("C", model_client=model_client, system_message="Translate input to Japanese.")

    # Create a directed graph with fan-out flow A -> (B, C).
    builder = DiGraphBuilder()
    builder.add_node(agent_a).add_node(agent_b).add_node(agent_c)
    builder.add_edge(agent_a, agent_b).add_edge(agent_a, agent_c)
    graph = builder.build()

    # Create a GraphFlow team with the directed graph.
    team = GraphFlow(
        participants=[agent_a, agent_b, agent_c],
        graph=graph,
        termination_condition=MaxMessageTermination(5),
    )

    # Run the team and print the events.
    async for event in team.run_stream(task="Write a short story about a cat."):
        print(event)


asyncio.run(main())

Agents B and C will run concurrently in separate coroutines.

  • Enable concurrent execution of agents in GraphFlow by @ekzhu in #6545

Callable conditions for GraphFlow edges

Now you can use lambda functions or other callables to specify edge conditions in GraphFlow. This addresses the issue that keyword substring-based conditions cannot cover all possibilities, which could lead to a "cannot find next agent" bug.

NOTE: Callable conditions are currently experimental, and they cannot be serialized with the graph.

  • Add callable condition for GraphFlow edges by @ekzhu in #6623
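
A hedged sketch of a callable edge condition; the assumption here is that the callable receives the source agent's latest message and returns a bool, with the names and prompts being illustrative.

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
reviewer = AssistantAgent("reviewer", model_client=model_client, system_message="Reply APPROVE or REVISE.")
publisher = AssistantAgent("publisher", model_client=model_client)
editor = AssistantAgent("editor", model_client=model_client)

builder = DiGraphBuilder()
builder.add_node(reviewer).add_node(publisher).add_node(editor)
# Route on the content of the reviewer's message instead of a keyword substring.
builder.add_edge(reviewer, publisher, condition=lambda msg: "APPROVE" in msg.to_model_text())
builder.add_edge(reviewer, editor, condition=lambda msg: "APPROVE" not in msg.to_model_text())
team = GraphFlow(participants=[reviewer, publisher, editor], graph=builder.build())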

New Agent: OpenAIAgent

  • Feature: Add OpenAIAgent backed by OpenAI Response API by @jay-thakur in #6418

MCP Improvement

AssistantAgent Improvement

Code Executors Improvement

  • Add option to auto-delete temporary files in LocalCommandLineCodeExecutor by @holtvogt in #6556
  • Include all output to error output in docker jupyter code executor by @ekzhu in #6572

OpenAIChatCompletionClient Improvement

OllamaChatCompletionClient Improvement

AnthropicBedrockChatCompletionClient Improvement

MagenticOneGroupChat Improvement

  • Use structured output for m1 orchestrator by @ekzhu in #6540

Other Changes

New Contributors

Full Changelog: python-v0.5.7...python-v0.6.0

python-v0.5.7

14 May 05:02
87cf4f0

What's New

AzureAISearchTool Improvements

The Azure AI Search Tool API now features unified methods:

  • create_full_text_search() (supporting "simple", "full", and "semantic" query types)
  • create_vector_search()
  • create_hybrid_search()

We also added support for client-side embeddings, which default to service embeddings when client embeddings aren't provided.

If you have been using create_keyword_search(), update your code to use create_full_text_search() with the "simple" query type.
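
A hedged migration sketch; the factory-method arguments below (name, endpoint, index name, credential, query_type) are assumptions about the signature, so check the AzureAISearchTool reference before copying.

from autogen_ext.tools.azure import AzureAISearchTool
from azure.core.credentials import AzureKeyCredential

# Before: AzureAISearchTool.create_keyword_search(...)
# After: create_full_text_search(...) with the "simple" query type.
search_tool = AzureAISearchTool.create_full_text_search(
    name="doc_search",
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="my-index",
    credential=AzureKeyCredential("<api-key>"),
    query_type="simple",
)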

SelectorGroupChat Improvements

To support long context for the model-based selector in SelectorGroupChat, you can pass in a model context object through the new model_context parameter to customize the messages sent to the model client when selecting the next speaker.

  • Add model_context to SelectorGroupChat for enhanced speaker selection by @Ethan0456 in #6330
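
A sketch of limiting the selector's context with a buffered model context; pairing BufferedChatCompletionContext with the new model_context parameter is an assumption based on the description above.

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import SelectorGroupChat
from autogen_core.model_context import BufferedChatCompletionContext
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
planner = AssistantAgent("planner", model_client=model_client)
coder = AssistantAgent("coder", model_client=model_client)

# Only the last 10 messages are sent to the model when selecting the next speaker.
team = SelectorGroupChat(
    [planner, coder],
    model_client=model_client,
    model_context=BufferedChatCompletionContext(buffer_size=10),
)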

OTEL Tracing Improvements

We added new metadata and message content fields to the OTEL traces emitted by the SingleThreadedAgentRuntime.

Agent Runtime Improvements

Other Python Related Changes

New Contributors

Full Changelog: python-v0.5.6...python-v0.5.7

python-v0.5.6

02 May 22:55
880a225

What's New

GraphFlow: customized workflows using directed graph

Should I say finally? Yes, finally, we have workflows in AutoGen. GraphFlow is a new team class as part of the AgentChat API. One way to think of GraphFlow is as a version of SelectorGroupChat with a directed graph acting as the selector_func. It is actually more powerful, though, because the abstraction also supports concurrent agents.

Note: GraphFlow is still an experimental API. Watch out for changes in future releases.

For more details, see our newly added user guide on GraphFlow.

If you are in a hurry, here is an example of creating a fan-out-fan-in workflow:

import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    # Create an OpenAI model client
    client = OpenAIChatCompletionClient(model="gpt-4.1-nano")

    # Create the writer agent
    writer = AssistantAgent(
        "writer",
        model_client=client,
        system_message="Draft a short paragraph on climate change.",
    )

    # Create two editor agents
    editor1 = AssistantAgent(
        "editor1", model_client=client, system_message="Edit the paragraph for grammar."
    )

    editor2 = AssistantAgent(
        "editor2", model_client=client, system_message="Edit the paragraph for style."
    )

    # Create the final reviewer agent
    final_reviewer = AssistantAgent(
        "final_reviewer",
        model_client=client,
        system_message="Consolidate the grammar and style edits into a final version.",
    )

    # Build the workflow graph
    builder = DiGraphBuilder()
    builder.add_node(writer).add_node(editor1).add_node(editor2).add_node(
        final_reviewer
    )

    # Fan-out from writer to editor1 and editor2
    builder.add_edge(writer, editor1)
    builder.add_edge(writer, editor2)

    # Fan-in both editors into final reviewer
    builder.add_edge(editor1, final_reviewer)
    builder.add_edge(editor2, final_reviewer)

    # Build and validate the graph
    graph = builder.build()

    # Create the flow
    flow = GraphFlow(
        participants=builder.get_participants(),
        graph=graph,
    )

    # Run the workflow
    await Console(flow.run_stream(task="Write a short biography of Steve Jobs."))

asyncio.run(main())

Major thanks to @abhinav-aegis for the initial design and implementation of this amazing feature!

Azure AI Agent Improvement

New Sample

  • A multi-agent PostgreSQL data management example by @mehrsa in #6443

Bug Fixes

Dev Improvement

Other Python Related Changes

New Contributors

Full Changelog: python-v0.5.5...python-v0.5.6