[BUG]: GenerateAsync via the IEmbeddingGenerator interface throws ObjectDisposedException on LLama.Native.SafeLLamaContextHandle #1259

@bmazzarol-bunnings

Description

When creating embeddings via the Microsoft.Extensions.AI IEmbeddingGenerator interface, the underlying LLama.Native.SafeLLamaContextHandle throws an ObjectDisposedException.

This makes sense once you look at how the LLamaEmbedder works: it creates a new context for each call to GetEmbeddingsWithTokenCount, which it then disposes before returning.

So the call here https://github.com/SciSharp/LLamaSharp/blob/master/LLama/LLamaEmbedder.EmbeddingGenerator.cs#L46 always hits a disposed context.
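The failure mode described above can be sketched in miniature (a hypothetical illustration of the disposed-before-use pattern, not LLamaSharp's actual internals; all type and member names here are invented for the sketch):

```csharp
using System;

// Stand-in for SafeLLamaContextHandle: throws once disposed.
class FakeContextHandle : IDisposable
{
    private bool _disposed;
    public void Dispose() => _disposed = true;

    public int EmbeddingSize
    {
        get
        {
            if (_disposed)
                throw new ObjectDisposedException(nameof(FakeContextHandle));
            return 4096;
        }
    }
}

class FakeEmbedder
{
    // Mirrors the pattern: a fresh context per call, disposed before returning.
    public (float[] Embedding, FakeContextHandle Context) GetEmbeddingsWithTokenCount(string text)
    {
        var context = new FakeContextHandle();
        try
        {
            return (new float[context.EmbeddingSize], context);
        }
        finally
        {
            context.Dispose(); // disposed before the caller ever sees it
        }
    }

    public float[] Generate(string text)
    {
        var (embedding, context) = GetEmbeddingsWithTokenCount(text);
        // Any later use of the returned context fails, mirroring the bug:
        _ = context.EmbeddingSize; // throws ObjectDisposedException
        return embedding;
    }
}
```

Any code path that touches the per-call context after GetEmbeddingsWithTokenCount returns will hit the same exception, which is consistent with GenerateAsync failing while the direct GetEmbeddings path works.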

Reproduction Steps

Create a LlamaEmbedder and generate an embedding through the IEmbeddingGenerator interface methods.

ModelParams model = ...
var weights = await LLamaWeights.LoadFromFileAsync(model);

var embedder = new LLamaEmbedder(weights, model);

// throws
_ = await embedder.GenerateAsync("What is 1 + 1?");
// works
_ = await embedder.GetEmbeddings("What is 1 + 1?");

Environment & Configuration

  • Operating system: Windows 11
  • .NET runtime version: dotnet 9
  • LLamaSharp version: 0.25.0
  • CUDA version (if you are using cuda backend):
  • CPU & GPU device: CPU

Known Workarounds

Downgrade to 0.24.0
