Open
Labels: question (Further information is requested)
Description
Hello,
I noticed my tools were being called for no reason, so I created a Node.js server to debug what OllamaSharp sends to it, and I see that everything is sent over correctly except the description of the tool. Let me show you my server, its output, and the generated class.
OllamaSharp version: 5.3.4
2025-08-02T14:34:26.739Z - POST /api/chat
Received POST to /api/chat:
Headers: {
  host: 'localhost:11434',
  'content-type': 'application/json; charset=utf-8',
  'content-length': '960'
}
Body: {
  "model": "sparklebeat:latest",
  "messages": [
    {
      "role": "user",
      "content": "Extra Contextual information:\nThe following is factual information about the current state of the game. Use this as your knowledge base to answers questions. If someone asks you a questionmake sure to double check your contextual information.\n\nDressed: You are dressed.\nTalking to: unknown\nCurrent date: 2025-08-02\nCurrent day: Saturday\nCurrent time: 16:34:26\nOnline since: 2025-08-02T16:34:26Z\nYour location: Sin Club\n"
    },
    {
      "role": "user",
      "content": "Ready?"
    }
  ],
  "stream": true,
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "WhatDoYouSee",
        "description": "",
        "parameters": {
          "type": "object",
          "properties": {
            "query": {
              "type": "string",
              "description": "",
              "enum": null
            }
          },
          "required": []
        }
      }
    },
    {
      "type": "function",
      "function": {
        "name": "GoogleSearch",
        "description": "",
        "parameters": {
          "type": "object",
          "properties": {
            "query": {
              "type": "string",
              "description": "",
              "enum": null
            }
          },
          "required": [
            "query"
          ]
        }
      }
    }
  ],
  "CustomHeaders": {}
}
This is how I added the description for the tool:
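Roughly, the method is declared like this (a minimal sketch reconstructed from the generated class below; the [OllamaTool] attribute placement and the exact XML doc wording are assumptions):

using System.Threading.Tasks;
using OllamaSharp;

namespace Core.Ai.Tools.Collection;

public static partial class VisionTools
{
    /// <summary>Use this function to find out what is happening around you.</summary>
    /// <param name="query">A user question about what is happening around you.</param>
    [OllamaTool] // OllamaSharp's source generator should turn the XML docs above into the tool/parameter descriptions
    public static async Task<string> WhatDoYouSee(string query)
    {
        // Placeholder body - the real implementation asks the game engine what is visible.
        return await Task.FromResult($"Looking around for: {query}");
    }
}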

This is the generated code produced by OllamaSharp's ToolSourceGenerator:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text.Json.Serialization;
using System.Threading.Tasks;

#nullable enable

namespace Core.Ai.Tools.Collection
{
    /// <summary>
    /// This class was auto-generated by the OllamaSharp ToolSourceGenerator.
    /// </summary>
    public class WhatDoYouSeeTool : OllamaSharp.Models.Chat.Tool, OllamaSharp.Tools.IAsyncInvokableTool
    {
        /// <summary>
        /// Initializes a new instance with metadata about the original method.
        /// </summary>
        public WhatDoYouSeeTool()
        {
            this.Function = new OllamaSharp.Models.Chat.Function
            {
                Name = "WhatDoYouSee",
                Description = "Use this function to find out what is happening around you."
            };
            this.Function.Parameters = new OllamaSharp.Models.Chat.Parameters
            {
                Properties = new Dictionary<string, OllamaSharp.Models.Chat.Property>
                {
                    { "query", new OllamaSharp.Models.Chat.Property { Type = "string", Description = "A user question about what is happening around you." } }
                },
                Required = Array.Empty<string>()
            };
        }

        /// <summary>
        /// Invokes the tool with given arguments asynchronously
        /// </summary>
        /// <param name="args">The arguments to invoke the tool with</param>
        /// <returns>The result of the invoked tool</returns>
        public async Task<object?> InvokeMethodAsync(IDictionary<string, object?>? args)
        {
            if (args == null) args = new Dictionary<string, object?>();
            string query = (string?)args["query"] ?? "What does the sign say?";
            var result = await Core.Ai.Tools.Collection.VisionTools.WhatDoYouSee(query);
            return result;
        }
    }
}
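The Description is clearly set on Function in the constructor, so a quick sanity check is to serialize the generated tool on its own and compare that with what arrives at the fake server. This is only a rough sketch using default System.Text.Json options, which may not match the serializer settings OllamaSharp uses internally:

using System;
using System.Text.Json;
using Core.Ai.Tools.Collection;

class ToolSerializationCheck
{
    static void Main()
    {
        // Serialize the generated tool directly to see whether the Description
        // set in the constructor survives JSON serialization at all.
        var tool = new WhatDoYouSeeTool();
        var json = JsonSerializer.Serialize(tool, new JsonSerializerOptions { WriteIndented = true });
        Console.WriteLine(json);
    }
}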
Here is the Node.js script I used to debug what OllamaSharp was sending to Ollama.
const express = require('express');
const app = express();
const PORT = 11434; // Ollama's default port

// Middleware to parse JSON bodies
app.use(express.json());

// Middleware to log all requests
app.use((req, res, next) => {
  console.log(`\n${new Date().toISOString()} - ${req.method} ${req.path}`);
  next();
});

// Handler for /api/create endpoint
app.post('/api/create', (req, res) => {
  console.log('Received POST to /api/create:');
  console.log('Headers:', req.headers);
  console.log('Body:', JSON.stringify(req.body, null, 2));
  res.status(200).send('OK');
});

// Handler for /api/chat endpoint
app.post('/api/chat', (req, res) => {
  console.log('Received POST to /api/chat:');
  console.log('Headers:', req.headers);
  console.log('Body:', JSON.stringify(req.body, null, 2));
  res.status(200).send('OK');
});

// Start the server
app.listen(PORT, () => {
  console.log(`Server is running on http://localhost:${PORT}`);
});
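Since the script binds to Ollama's default port (11434), the real Ollama instance has to be stopped while it runs; its only dependency is express (npm install express).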