
ToolException from MCP server not handled correctly #6449

@jgomezve

Description


Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a clear and detailed title that summarizes the issue.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I included a self-contained, minimal example that demonstrates the issue, INCLUDING all the relevant imports. The code runs AS IS to reproduce the issue.

Example Code

import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import AzureChatOpenAI
import os
from dotenv import load_dotenv
from auth.oauth_login import get_oauth_token
from langchain.agents import create_agent
from system.prompts import agent_prompt

load_dotenv()
AZURE_ENDPOINT = "https://chat-ai.acme.com"

async def main():
    client = MultiServerMCPClient({
        "github": {
            "command": "docker",
            "args": ["run", "-i", "--rm",
                        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
                        "-e", "GITHUB_HOST",
                        "ghcr.io/github/github-mcp-server"],
            "env": {
                "GITHUB_PERSONAL_ACCESS_TOKEN": os.getenv('GITHUB_PERSONAL_ACCESS_TOKEN'),
                "GITHUB_HOST": os.getenv('GITHUB_HOST'),
            },
            "transport": "stdio",
        }
    })


    tools = await client.get_tools()
    app_key = os.getenv('APP_KEY')
    model = AzureChatOpenAI(
        deployment_name="gpt-4.1",
        azure_endpoint=AZURE_ENDPOINT,
        api_key=get_oauth_token(),
        api_version="2023-08-01-preview",
        model_kwargs=dict(user=f'{{"appkey": "{app_key}"}}'),
    )
    agent = create_agent(model, tools)
    response = await agent.ainvoke({"messages": "Could you please list all the issues in the GitHub repo TheWrongRepo?"})

    for msg in response['messages']:
        print(type(msg))
        print(msg.content)

    print(response['messages'][-1].content)

asyncio.run(main())

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/test_bot.py", line 49, in <module>
    asyncio.run(main())
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/base_events.py", line 664, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/test_bot.py", line 41, in main
    response = await agent.ainvoke({"messages": "Could you please list all the issues in the GitHub repo fsdfdfs?"})
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/pregel/main.py", line 3137, in ainvoke
    async for chunk in self.astream(
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/pregel/main.py", line 2956, in astream
    async for _ in runner.atick(
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/pregel/_runner.py", line 304, in atick
    await arun_with_retry(
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/pregel/_retry.py", line 137, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/_internal/_runnable.py", line 705, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/_internal/_runnable.py", line 473, in ainvoke
    ret = await self.afunc(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 763, in _afunc
    outputs = await asyncio.gather(*coros)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 1102, in _arun_one
    return await self._execute_tool_async(tool_request, input_type, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 1051, in _execute_tool_async
    content = _handle_tool_error(e, flag=self._handle_tool_errors)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 407, in _handle_tool_error
    content = flag(e)  # type: ignore [assignment, call-arg]
              ^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 364, in _default_handle_tool_errors
    raise e
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langgraph/prebuilt/tool_node.py", line 1004, in _execute_tool_async
    response = await tool.ainvoke(call_args, config)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_core/tools/structured.py", line 63, in ainvoke
    return await super().ainvoke(input, config, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_core/tools/base.py", line 608, in ainvoke
    return await self.arun(tool_input, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_core/tools/base.py", line 1036, in arun
    raise error_to_raise
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_core/tools/base.py", line 1002, in arun
    response = await coro_with_context(coro, context)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_core/tools/structured.py", line 117, in _arun
    return await self.coroutine(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_mcp_adapters/tools.py", line 148, in call_tool
    return _convert_call_tool_result(call_tool_result)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jgomezve/Documents/Development/ai/mcp_a2a/venv/lib/python3.12/site-packages/langchain_mcp_adapters/tools.py", line 58, in _convert_call_tool_result
    raise ToolException(tool_content)
langchain_core.tools.base.ToolException: Could not resolve to a Repository with the name 'TheWrongRepo/TheWrongRepo'.

Description

When a tool raises an error inside the ToolNode, the exception is not handled properly: the ToolException raised by langchain_mcp_adapters propagates out of the agent and crashes the run instead of being converted into a tool message the model can react to. I managed to work around it by modifying _default_handle_tool_errors in langgraph/prebuilt/tool_node.py.

This is my implementation:

def _default_handle_tool_errors(e: Exception) -> str:
    """Default error handler for tool errors.

    If the error is a tool invocation error or a ToolException,
    return its message. Otherwise, re-raise it.
    """
    if isinstance(e, ToolInvocationError):
        return e.message
    if isinstance(e, ToolException):
        return str(e)
    raise e
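Rather than patching _default_handle_tool_errors inside site-packages, the same behavior can be supplied from application code. The following is a minimal, self-contained sketch: it assumes ToolNode accepts a custom error-handler callable via its handle_tool_errors parameter (this is the flag callable visible in the traceback at tool_node.py line 407), and it stubs ToolException locally as a stand-in for langchain_core.tools.base.ToolException so the sketch runs standalone.

```python
class ToolException(Exception):
    """Stand-in for langchain_core.tools.base.ToolException."""


def handle_tool_errors(e: Exception) -> str:
    """Turn tool-level failures into tool-message content instead of crashing.

    MCP tool errors surface as ToolException; returning their message lets
    the agent feed the error text back to the model as a ToolMessage.
    """
    if isinstance(e, ToolException):
        return str(e)
    raise e  # unknown errors still propagate


# Hypothetical wiring (depends on how the agent graph is constructed):
#   tool_node = ToolNode(tools, handle_tool_errors=handle_tool_errors)

print(handle_tool_errors(ToolException("Could not resolve to a Repository")))
```

With this in place, a failing MCP call such as the one in the traceback would become tool-message content the model can recover from, instead of an unhandled exception.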

Let me know if I should open this in the langchain_mcp_adapters repo instead.

System Info

(venv) $ python -m langchain_core.sys_info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 24.6.0: Mon Aug 11 21:16:05 PDT 2025; root:xnu-11417.140.69.701.11~1/RELEASE_X86_64
> Python Version:  3.12.0 (v3.12.0:0fb18b02c8, Oct  2 2023, 09:45:56) [Clang 13.0.0 (clang-1300.0.29.30)]

Package Information
-------------------
> langchain_core: 1.0.5
> langchain: 1.0.5
> langsmith: 0.4.38
> langchain_mcp_adapters: 0.1.11
> langchain_openai: 1.0.1
> langgraph_sdk: 0.2.9

Optional packages not installed
-------------------------------
> langserve

Other Dependencies
------------------
> httpx: 0.28.1
> jsonpatch: 1.33
> langgraph: 1.0.3
> mcp: 1.19.0
> openai: 2.6.1
> orjson: 3.11.4
> packaging: 25.0
> pydantic: 2.12.3
> pyyaml: 6.0.3
> requests: 2.32.5
> requests-toolbelt: 1.0.0
> rich: 14.2.0
> tenacity: 9.1.2
> tiktoken: 0.12.0
> typing-extensions: 4.15.0
> zstandard: 0.25.0

Labels

bug (Something isn't working), pending (awaiting review/confirmation by maintainer)