bretuobay/model-based-interface-generation

ARIA — Adaptive Reasoning Interface Architecture

A runtime framework for LLM-driven UI generation grounded in the academic tradition of Model-Based User Interface Development (MBUID).

What is ARIA?

ARIA generates, adapts, and renders user interfaces by reasoning over a structured stack of four formal models — the same four-layer decomposition used in classical MBUID frameworks (Cameleon, UsiXML, Teallach) — with LLM-compatible representations and Zod-typed schemas replacing hand-authored rule engines.

```
ARIADomainModel
       ↓
ARIATaskModel  ←→  ARIAUserModel
       ↓                ↓
     AbstractUISpec (AUS)
             ↓
   Concrete Renderer (React / RN / Voice)
```

The LLM Planner reads the first three models and produces the Abstract UI Specification. The renderer reads the AUS only. No layer knows about a layer downstream of it.
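The layering contract can be sketched in a few lines of TypeScript. Every name below (`DomainModel`, `plan`, `render`, and so on) is illustrative shorthand, not the actual @aria/core API:

```typescript
// Illustrative sketch of the layering rule: the planner sees the three
// upstream models, the renderer sees only the AbstractUISpec.
interface DomainModel { entities: string[] }
interface TaskModel { tasks: string[] }
interface UserModel { expertise: number }
interface AbstractUISpec { nodes: { id: string; nodeType: string }[] }

// Planner: (domain, task, user) -> AUS. It never touches the renderer.
function plan(d: DomainModel, _t: TaskModel, _u: UserModel): AbstractUISpec {
  return { nodes: d.entities.map((e) => ({ id: e, nodeType: "table" })) };
}

// Renderer: AUS -> output. It never sees the upstream models.
function render(spec: AbstractUISpec): string {
  return spec.nodes.map((n) => `<${n.nodeType} id="${n.id}">`).join("");
}

const spec = plan({ entities: ["Product"] }, { tasks: [] }, { expertise: 0 });
render(spec); // the renderer consumes the AUS only
```

Because the renderer's input type is the AUS alone, swapping React for React Native or voice output cannot leak upstream model details.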

Status

All packages are fully implemented and typecheck.

| Package | Description |
| --- | --- |
| `@aria/core` | Models, full AUS Zod schema (`z.lazy` recursive), `validateTraceability()` |
| `@aria/reasoning` | Intent resolver, context assembler, LLM planner, conversational delta resolver, provider-agnostic routing |
| `@aria/renderer` | React renderer with 30+ Tailwind HTML component types, pluggable registry |
| `@aria/runtime` | Interaction scorer (expertise dimensions), diff engine (complexity delta thresholds) |
| `@aria/sdk` | `useARIA` hook — full pipeline, `fallbackSpec` (L1 no-LLM mode), adaptation loop, `refine()` + `undo()` |
| `apps/playground` | Next.js Product Catalog demo (L1–L5), mode toggle, API key input |

Quick Start

```bash
cd aria
pnpm install

# Start the playground — opens in L1 static mode, no API key needed
cd apps/playground
pnpm dev
# → http://localhost:3000
```

To enable LLM generation (L2+ mode), enter your API key in the playground's sidebar input, or pre-fill it via environment variable:

```bash
# Anthropic (default)
echo "NEXT_PUBLIC_ANTHROPIC_API_KEY=sk-ant-..." >> apps/playground/.env.local

# OpenAI
echo "NEXT_PUBLIC_OPENAI_API_KEY=sk-..." >> apps/playground/.env.local
```

Note: The playground runs LLM calls in the browser, so variables must use the NEXT_PUBLIC_ prefix. Server-side usage (API routes, Server Actions) can use ANTHROPIC_API_KEY / OPENAI_API_KEY without the prefix.
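The browser/server split can be sketched as a small helper. `apiKeyEnvVar` is not part of the SDK; it just encodes the rule above:

```typescript
// Hypothetical helper: pick the env var name for a provider depending on
// whether the code runs in the browser or on the server.
type Provider = "anthropic" | "openai";

function apiKeyEnvVar(provider: Provider, isBrowser: boolean): string {
  const base = provider === "anthropic" ? "ANTHROPIC_API_KEY" : "OPENAI_API_KEY";
  // Next.js only inlines NEXT_PUBLIC_-prefixed variables into browser bundles.
  return isBrowser ? `NEXT_PUBLIC_${base}` : base;
}

apiKeyEnvVar("anthropic", true);  // "NEXT_PUBLIC_ANTHROPIC_API_KEY"
apiKeyEnvVar("openai", false);    // "OPENAI_API_KEY"
```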

API Keys

Playground (browser)

The playground makes LLM calls directly from the browser. Browser JavaScript can only access NEXT_PUBLIC_ prefixed environment variables in Next.js.

| Provider | Browser env var | Server env var |
| --- | --- | --- |
| Anthropic | `NEXT_PUBLIC_ANTHROPIC_API_KEY` | `ANTHROPIC_API_KEY` |
| OpenAI | `NEXT_PUBLIC_OPENAI_API_KEY` | `OPENAI_API_KEY` |

You can also type the key directly into the sidebar — no rebuild needed.

SDK usage

Pass the key via config.apiKey. The SDK resolves it in this order: config.apiKey first, then the matching NEXT_PUBLIC_*_API_KEY environment variable.

```ts
useARIA({
  domain,
  intent: "Show all products",
  config: { apiKey: process.env.ANTHROPIC_API_KEY },
})
```
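That resolution order can be sketched as a small fallback chain. `resolveApiKey` is a hypothetical stand-in for the SDK's internal logic, shown here for the two default providers only:

```typescript
// Illustrative sketch of the key-resolution order: an explicit config.apiKey
// wins; otherwise fall back to the NEXT_PUBLIC_ environment variables.
function resolveApiKey(
  configKey: string | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  return (
    configKey ??
    env["NEXT_PUBLIC_ANTHROPIC_API_KEY"] ??
    env["NEXT_PUBLIC_OPENAI_API_KEY"]
  );
}

resolveApiKey("sk-ant-explicit", {});                                // explicit key wins
resolveApiKey(undefined, { NEXT_PUBLIC_ANTHROPIC_API_KEY: "sk-a" }); // env fallback
```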

Switching LLM Providers

The model string follows the "provider/model-id" format. Bare model IDs default to Anthropic for backward compatibility.

```ts
// Anthropic (default)
config: { model: "anthropic/claude-sonnet-4-6", apiKey: "sk-ant-..." }

// OpenAI
config: { model: "openai/gpt-4o", apiKey: "sk-..." }

// Backward compat — bare ID assumes anthropic/
config: { model: "claude-sonnet-4-6", apiKey: "sk-ant-..." }
```

Supported providers: anthropic, openai. Switching requires only a config change — no code changes in @aria/reasoning.
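The routing convention can be sketched as a tiny parser. `parseModel` is illustrative, not the actual @aria/reasoning export:

```typescript
// Illustrative parser for the "provider/model-id" convention; bare IDs
// default to Anthropic, mirroring the backward-compat rule above.
function parseModel(model: string): { provider: string; modelId: string } {
  const slash = model.indexOf("/");
  if (slash === -1) return { provider: "anthropic", modelId: model };
  return { provider: model.slice(0, slash), modelId: model.slice(slash + 1) };
}

parseModel("openai/gpt-4o");     // { provider: "openai", modelId: "gpt-4o" }
parseModel("claude-sonnet-4-6"); // { provider: "anthropic", modelId: "claude-sonnet-4-6" }
```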

Using the SDK

L1 — Static mode (no LLM)

Pass a hand-authored AbstractUISpec as fallbackSpec. No API key required.

```tsx
import { useARIA, ARIARenderer } from "@aria/sdk";
import type { AbstractUISpec } from "@aria/sdk";

const mySpec: AbstractUISpec = {
  id: "my-ui",
  version: "1.0.0",
  generatedAt: new Date().toISOString(),
  generatedBy: "handcrafted",
  nodes: [
    {
      id: "page", nodeType: "page", label: "My App",
      children: [
        {
          id: "table", nodeType: "table", label: "Users",
          dataBinding: { entity: "User", operation: "list", fields: ["name", "email"] },
        }
      ]
    }
  ],
};

export default function Page() {
  const { renderer } = useARIA({ fallbackSpec: mySpec });
  return renderer;
}
```

fallbackSpec is also used for graceful degradation: if LLM generation fails (e.g., invalid key), the fallback spec renders and the error is surfaced alongside it.
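The degradation rule can be sketched as a pure selection function. `pickSpec` and the minimal `Spec` type are hypothetical; the real hook manages this state internally:

```typescript
// Sketch of graceful degradation: render the generated spec when available,
// otherwise the fallback, and surface any error alongside whichever rendered.
interface Spec { id: string }

function pickSpec(
  generated: Spec | null,
  fallback: Spec | undefined,
  error: Error | null,
): { spec: Spec | null; error: Error | null } {
  return { spec: generated ?? fallback ?? null, error };
}

// Generation failed (e.g., invalid key): the fallback still renders,
// and the error is returned alongside it rather than replacing the UI.
pickSpec(null, { id: "my-ui" }, new Error("invalid key"));
```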

L2+ — LLM-driven generation

```tsx
import { useARIA } from "@aria/sdk";
import type { ARIADomainModel } from "@aria/sdk";

const domain: ARIADomainModel = {
  id: "product-catalog",
  version: "1.0.0",
  entities: [{
    id: "Product",
    label: "Product",
    fields: [
      { name: "name",    type: "string", meta: { label: "Name",  required: true } },
      { name: "price",   type: "number", meta: { label: "Price", uiHint: "currency" } },
      { name: "inStock", type: "boolean", meta: { label: "In Stock" } },
    ],
    operations: ["list", "search", "create", "delete"],
  }],
};

export default function Page() {
  const { renderer, isGenerating, error } = useARIA({
    domain,
    intent: "Show all products with search and a create form",
    config: {
      generationMode: "autonomous",
      model: "anthropic/claude-sonnet-4-6",  // or "openai/gpt-4o"
      apiKey: process.env.ANTHROPIC_API_KEY,
    },
  });

  if (isGenerating) return <p>Generating…</p>;
  if (error) return <p>Error: {error.message}</p>;
  return renderer;
}
```

Conversational mode

```ts
const { renderer, refine, undo, canUndo } = useARIA({
  domain,
  intent: "Manage contacts",
  config: { generationMode: "conversational", apiKey: "..." },
});

await refine("Add a filter by company");  // DeltaIntent → AUS patch or replan
undo();                                   // O(1) revert, no LLM call
```

Key Design Principles

  1. Models are first-class citizens — every rendered element traces to a model node; no magic components.
  2. LLMs replace rules, not models — the four-model stack is preserved; the LLM is a rule-engine replacement.
  3. Abstract before concrete — the AUS layer is mandatory; direct LLM-to-JSX is prohibited.
  4. Incrementally adoptable — adopt one layer at a time; L1 requires no LLM at all.
  5. Runtime is the primary target — generation and adaptation happen at runtime, not at build time.
  6. The user model is continuous — expertise and preference are scored dimensions updated from interaction events.
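Principle 6 can be sketched as a score update. The event kinds, the single expertise dimension, and the exponential-moving-average weighting below are all assumptions for illustration, not the actual @aria/runtime scorer:

```typescript
// Hypothetical continuous user model: nudge a 0..1 expertise score from
// interaction events with an exponential moving average.
type InteractionEvent = { kind: "shortcut" | "error" | "help-open" };

function updateExpertise(score: number, event: InteractionEvent, alpha = 0.2): number {
  // Shortcut use suggests expertise; errors and help-opens suggest a novice.
  const signal = event.kind === "shortcut" ? 1 : 0;
  const next = (1 - alpha) * score + alpha * signal;
  return Math.min(1, Math.max(0, next)); // clamp to the scored range
}

updateExpertise(0.5, { kind: "shortcut" }); // ≈ 0.6: score drifts up
updateExpertise(0.5, { kind: "error" });    // ≈ 0.4: score drifts down
```

Because the score moves a little with every event rather than flipping a novice/expert flag, the UI can adapt gradually.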

Incremental Adoption

| Level | What You Add | What You Get |
| --- | --- | --- |
| L1 | Domain model + hand-authored AUS + renderer | Model-driven rendering, no LLM required |
| L2 | `@aria/reasoning` LLM planner | Autonomous UI generation from domain model |
| L3 | ARIATaskModel | Structured workflows, forms, multi-step flows |
| L4 | ARIAUserModel + adaptation loop | Runtime personalization, continuous user model |
| L5 | Conversational mode (`refine()`) | Chat-driven UI construction and refinement |

Monorepo Structure

```
aria/
  packages/
    aria-core/         # @aria/core     — models, AUS schema, Zod types
    aria-reasoning/    # @aria/reasoning — intent resolver, context assembler, planner, provider factory
    aria-renderer/     # @aria/renderer  — React renderer, component registry
    aria-runtime/      # @aria/runtime   — scorer, diff engine, interaction events
    aria-sdk/          # @aria/sdk       — useARIA hook, public developer API
  apps/
    playground/        # Next.js dev playground
docs/
  concepts/            # Core mental models
  guides/              # How-to guides
  reference/           # TypeScript types, JSON schemas, API surface
  architecture/        # Architecture Decision Records (ADRs)
  academic/            # Research lineage, classical→ARIA mapping, bibliography
  examples/            # End-to-end worked examples
```

Academic Foundations

This project is informed by two seminal papers included at the root of this repository:

  • Generating_user_interface_code_in_a_mode.pdf — Tran & Vanderdonckt (2008): generating UI code in a model-based UI development environment.
  • Generating_user_interface_from_task_user.pdf — Silva et al. (2000): generating UIs from task, user, and domain models (the Teallach system).

These documents define the four-model pipeline that ARIA is built upon.

See docs/index.md for the full documentation index and recommended reading order.

License

MIT
