
Issue with OllamaSharp endpoint? #327

@ch4uTR

Description


Hi!

I've been learning C# and I tried the new ASP.NET AI Chat Web Application template.

When I send a request to http://localhost:11434/api/generate using Postman, I get a valid response because the llama3.2:latest model is running.

However, when I use this template, which relies on OllamaSharp, it throws an exception:

System.Net.Http.HttpRequestException: 'Response status code does not indicate success: 404 (Not Found).'

To investigate, I wrote a small code snippet in Program.cs. When I run it, the request succeeds:

    using System.Text;
    using System.Text.Json;

    var client = new HttpClient();
    client.BaseAddress = new Uri("http://localhost:11434");

    // Same request that succeeds from Postman.
    var payload = new
    {
        model = "llama3.2:latest",
        prompt = "Hello"
    };

    var content = new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json");
    var response = await client.PostAsync("api/generate", content);
    var result = await response.Content.ReadAsStringAsync();
    Console.WriteLine(result);

I’m still learning C#, so I might be missing something on my side, but I wonder whether there could be an issue with the endpoints that OllamaSharp uses internally when sending requests.
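For comparison, here is roughly what the OllamaSharp equivalent of the raw HttpClient snippet above would look like. This is only a sketch based on OllamaSharp's documented `OllamaApiClient` API (the base URL and model name mirror the ones used above); it is what I would expect to work against the same local server:

```csharp
using OllamaSharp;

// Point the client at the same base URL the raw HttpClient used.
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    SelectedModel = "llama3.2:latest"
};

// GenerateAsync streams the response piece by piece,
// similar to what /api/generate returns over HTTP.
await foreach (var token in ollama.GenerateAsync("Hello"))
{
    Console.Write(token?.Response);
}
```

If this minimal client also returns a 404 against a server that answers Postman, that would point at the URL or endpoint the library builds rather than at my application code.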

PS: I’m a student and this is for learning purposes. I would really appreciate any guidance or advice you could offer.
