A full-stack ChatGPT-like sample that uses Azure Cosmos DB to store chat history, showcasing Aspire's ability to dynamically spin up containers, configure services, and launch a unified dashboard with full observability and integrated debugging support.

AzureCosmosDB/aspire-ai-chat-demo

Aspire AI Chat

Aspire AI Chat is a full-stack chat sample that combines modern technologies to deliver a ChatGPT-like experience, backed by Azure Cosmos DB for chat storage and Ollama for local inference. This sample showcases the capabilities of Aspire for cloud-native development, debugging, orchestration, and observability.

What is Aspire?

Aspire is a cloud-native stack that streamlines the development, orchestration, and observability of distributed applications. With Aspire, you can simply start debugging, and it will dynamically spin up containers, configure services, and launch a unified dashboard, complete with full observability and integrated debugging support. In this sample, Aspire runs the entire application architecture locally (the backend API, the Azure Cosmos DB emulator, Ollama, and the React frontend) exactly as it would run in production. The result is a seamless "F5 = full cloud-native stack + observability + debugging" workflow: a tightly integrated, cloud-native development model that makes working with microservices feel as straightforward as building a monolith.
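As a sketch of what that orchestration looks like in an Aspire app host: the resource names, the project reference, and the Ollama integration below are illustrative assumptions, not this repo's actual identifiers.

```csharp
// Illustrative AppHost Program.cs; names here are placeholders for this sketch.
var builder = DistributedApplication.CreateBuilder(args);

// Azure Cosmos DB resource; RunAsEmulator swaps in the local emulator
// container during development instead of a real Azure account.
var cosmos = builder.AddAzureCosmosDB("cosmos")
                    .RunAsEmulator();

// Local model server (AddOllama comes from the Aspire Community Toolkit).
var ollama = builder.AddOllama("ollama");

// Backend API project, with connection details flowed in via service discovery.
var api = builder.AddProject<Projects.AIChat_Web>("api")
                 .WithReference(cosmos)
                 .WithReference(ollama);

builder.Build().Run();
```

Starting this one project brings up every container and service it declares, along with the Aspire dashboard.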

High-Level Overview

  • Backend API:
The backend is built with ASP.NET Core and interacts with an LLM using Microsoft.Extensions.AI. It uses IChatClient to abstract the interaction between the API and the model. Chat responses are streamed back to the client as a streaming JSON array.

  • Data & Persistence:
Uses Entity Framework Core with Azure Cosmos DB for flexible, cloud-based NoSQL storage. This project uses the preview Azure Cosmos DB emulator for efficient local development.

  • AI & Chat Capabilities:

    • Uses Ollama (via OllamaSharp) for local inference, enabling context-aware responses.
    • In production, the application switches to OpenAI for LLM capabilities.
  • Frontend UI:
    Built with React, the user interface offers a modern and interactive chat experience. The React application is built and hosted using Caddy.
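The backend streaming flow described above can be sketched as a minimal API endpoint. The route and request shape are assumptions for this sketch; `GetStreamingResponseAsync` is the streaming entry point on `IChatClient` in recent Microsoft.Extensions.AI versions.

```csharp
// Sketch only: the route and record shape are assumptions, not the sample's code.
app.MapPost("/api/chat", (ChatRequest request, IChatClient chatClient, CancellationToken ct) =>
{
    // Forward each partial model update to the caller as it arrives.
    // Returning an IAsyncEnumerable from a minimal API endpoint causes
    // ASP.NET Core to serialize it as a streamed JSON array.
    async IAsyncEnumerable<string?> Stream()
    {
        await foreach (var update in chatClient.GetStreamingResponseAsync(request.Messages, cancellationToken: ct))
        {
            yield return update.Text;
        }
    }
    return Stream();
});

public record ChatRequest(List<ChatMessage> Messages);
```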

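For the persistence side, a minimal sketch of an EF Core context mapped to Cosmos DB might look like the following; the entity, container, and connection names are assumptions for illustration.

```csharp
// Hypothetical DbContext using EF Core's Azure Cosmos DB provider.
public class ChatDbContext(DbContextOptions<ChatDbContext> options) : DbContext(options)
{
    public DbSet<Conversation> Conversations => Set<Conversation>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Map the entity to a Cosmos container and pick a partition key.
        modelBuilder.Entity<Conversation>()
                    .ToContainer("conversations")
                    .HasPartitionKey(c => c.Id);
    }
}

// In Program.cs, the Aspire EF Core Cosmos integration binds the context
// to whatever Aspire provisioned (the emulator locally, Azure in production):
// builder.AddCosmosDbContext<ChatDbContext>("cosmos", "chatdb");
```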
Getting Started

Prerequisites

  • A recent .NET SDK with Aspire support installed.
  • A container runtime such as Docker or Podman, which Aspire uses to run the Azure Cosmos DB emulator and Ollama containers.

Running the Application

Run the AIChat.AppHost project (for example, with `dotnet run --project AIChat.AppHost` or F5 in your IDE). This project uses .NET Aspire to orchestrate the full application: it starts the required containers and services and launches the Aspire dashboard.

Configuration

  • By default, the application uses Ollama for local inference.
  • To use OpenAI, set the appropriate configuration values (e.g., API keys, endpoints).
  • The Azure Cosmos DB database will be automatically created and migrated when running with Aspire.
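For example, an OpenAI configuration block might look like the following. The section and key names are placeholders; the actual names depend on how the sample binds its configuration.

```json
{
  "OpenAI": {
    "Key": "<your-api-key>",
    "Model": "<model-name>"
  }
}
```

For local development, prefer user secrets over checking keys into appsettings.json, e.g. `dotnet user-secrets set "OpenAI:Key" "<your-api-key>"`.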
