# Fix dashboard ui profile build by restoring dashboard sources #143
`@@ -0,0 +1,4 @@`

```bash
# OpenMemory Dashboard Configuration
NEXT_PUBLIC_API_URL=http://localhost:8080
# Set this if your backend has OM_API_KEY configured for authentication
NEXT_PUBLIC_API_KEY=your
```
`@@ -0,0 +1,159 @@`

# Chat Interface Setup

The chat interface is now connected to the OpenMemory backend and can query memories in real-time.

## Features

- ✅ **Memory Querying**: Searches your memory database for relevant content
- ✅ **Salience-based Results**: Shows top memories ranked by relevance
- ✅ **Memory Reinforcement**: Click the + button to boost memory importance
- ✅ **Real-time Updates**: Live connection to backend API
- ✅ **Action Buttons**: Quick actions after assistant responses
## Setup Instructions

### 1. Start the Backend

First, make sure the OpenMemory backend is running:

```bash
cd backend
npm install
npm run dev
```

The backend will start on `http://localhost:8080`.
### 2. Configure Environment (Optional)

The dashboard is pre-configured to connect to `localhost:8080`. If your backend runs on a different port, create a `.env.local` file:

```bash
# dashboard/.env.local
NEXT_PUBLIC_API_URL=http://localhost:8080
```

### 3. Start the Dashboard

```bash
cd dashboard
npm install
npm run dev
```

The dashboard will start on `http://localhost:3000`.
### 4. Add Some Memories

Before chatting, you need to add some memories to your database. You can do this via:

**Option A: API (Recommended for Testing)**

```bash
curl -X POST http://localhost:8080/memory/add \
  -H "Content-Type: application/json" \
  -d '{
    "content": "JavaScript async/await makes asynchronous code more readable",
    "tags": ["javascript", "async"],
    "metadata": {"source": "learning"}
  }'
```

**Option B: Use the SDK**

```javascript
// examples/js-sdk/basic-usage.js
import OpenMemory from '../../sdk-js/src/index.js';

const om = new OpenMemory('http://localhost:8080');

await om.addMemory({
  content: 'React hooks revolutionized state management',
  tags: ['react', 'hooks'],
});
```

**Option C: Ingest a Document**

```bash
curl -X POST http://localhost:8080/memory/ingest \
  -H "Content-Type: application/json" \
  -d '{
    "content_type": "text",
    "data": "Your document content here...",
    "metadata": {"source": "document"}
  }'
```
## How It Works

### Memory Query Flow

1. **User Input**: You ask a question in the chat
2. **Backend Query**: The dashboard sends a POST to `/memory/query` with your question
3. **Vector Search**: The backend searches the HSG memory graph
4. **Results**: The top 5 memories are returned with salience scores
5. **Response**: The chat generates an answer based on the retrieved memories
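The ranking step of the flow above can be sketched in isolation. This is an illustrative sketch only: the `salience` field name comes from this document, but the exact response shape of `/memory/query` is an assumption, not a documented contract.

```typescript
// Hypothetical memory shape, inferred from this document's description.
interface Memory {
  id: string;
  content: string;
  salience: number; // relevance score used for ranking
}

// Step 4 of the flow: keep the top-k memories by salience, highest first.
function topBySalience(memories: Memory[], k = 5): Memory[] {
  return [...memories].sort((a, b) => b.salience - a.salience).slice(0, k);
}

// Mock results standing in for a /memory/query response.
const mock: Memory[] = [
  { id: 'a', content: 'async/await basics', salience: 0.42 },
  { id: 'b', content: 'React hooks', salience: 0.91 },
  { id: 'c', content: 'CSS grid', salience: 0.13 },
];

console.log(topBySalience(mock, 2).map(m => m.id)); // → [ 'b', 'a' ]
```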
### Memory Reinforcement

Clicking the **+** button on a memory card:

- Sends a POST to `/memory/reinforce`
- Increases the memory's salience by 0.1
- Makes it more likely to appear in future queries
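The reinforcement update itself is simple arithmetic. A minimal sketch, assuming salience scores live in [0, 1]; the 0.1 step comes from this document, while clamping at the upper bound is an assumption for illustration:

```typescript
// Boost a memory's salience by a fixed step, as the + button does
// via POST /memory/reinforce. The 0.1 step comes from this document;
// clamping to 1.0 is an assumed behavior, not documented here.
function reinforce(salience: number, step = 0.1): number {
  return Math.min(1.0, salience + step);
}

console.log(reinforce(0.5));  // → 0.6
console.log(reinforce(0.97)); // → 1 (clamped)
```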
## Current Features

- ✅ Real-time memory querying
- ✅ Salience-based ranking
- ✅ Memory reinforcement (boost)
- ✅ Sector classification display
- ✅ Error handling with backend status

## Coming Soon

- 🚧 LLM integration (OpenAI, Ollama, Gemini)
- 🚧 Conversation memory persistence
- 🚧 Export chat to memories
- 🚧 WebSocket streaming responses
- 🚧 Quiz generation from memories
- 🚧 Podcast script generation
## Troubleshooting

### "Failed to query memories"

- Ensure the backend is running: `npm run dev` in `backend/`
- Check that the backend is on port 8080: `curl http://localhost:8080/health`
- Verify CORS is enabled (already configured)

### "No memories found"

- Add memories using the API or SDK (see setup above)
- Try broader search terms
- Check that memory content exists: `GET http://localhost:8080/memory/all`

### Connection refused

- Backend not started
- Wrong port in `.env.local`
- Firewall blocking the connection
## API Endpoints Used

```typescript
POST /memory/query      // Search memories
POST /memory/add        // Add a new memory
POST /memory/reinforce  // Boost memory salience
GET  /memory/all        // List all memories
GET  /memory/:id        // Get a specific memory
```
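A thin client over these endpoints might look like the following sketch. Only the routes come from this document; the class name, method names, and request bodies are illustrative assumptions:

```typescript
// Minimal illustrative wrapper over the endpoints listed above.
// Request shapes ({ query }, { id }) are assumptions for the sketch.
class MemoryClient {
  constructor(private baseUrl: string) {}

  url(path: string): string {
    return `${this.baseUrl}${path}`;
  }

  query(question: string): Promise<Response> {
    return fetch(this.url('/memory/query'), {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ query: question }),
    });
  }

  reinforce(id: string): Promise<Response> {
    return fetch(this.url('/memory/reinforce'), {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ id }),
    });
  }

  all(): Promise<Response> {
    return fetch(this.url('/memory/all'));
  }
}

const client = new MemoryClient('http://localhost:8080');
```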
## Next Steps

1. Add LLM integration for intelligent responses
2. Implement conversation memory storage
3. Add streaming response support
4. Create a memory export feature
5. Build quiz/podcast generators
`@@ -3,6 +3,12 @@ FROM node:20-alpine AS builder`

```dockerfile
WORKDIR /app

# Build-time public vars for Next.js client bundle
ARG NEXT_PUBLIC_API_URL=http://localhost:8080
ARG NEXT_PUBLIC_API_KEY=
ENV NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
ENV NEXT_PUBLIC_API_KEY=${NEXT_PUBLIC_API_KEY}

# Install dependencies
COPY package*.json ./
RUN npm install
```

A review comment (below) targets lines +6 to +10, the `ARG`/`ENV` block above.

`@@ -18,14 +24,19 @@ FROM node:20-alpine AS production`

```dockerfile
WORKDIR /app

ARG NEXT_PUBLIC_API_URL=http://localhost:8080
ARG NEXT_PUBLIC_API_KEY=
ENV NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
ENV NEXT_PUBLIC_API_KEY=${NEXT_PUBLIC_API_KEY}

# Install only production dependencies
COPY package*.json ./
RUN npm install --omit=dev

# Copy built assets from builder
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/public ./public
COPY --from=builder /app/next.config.ts ./next.config.ts
COPY --from=builder /app/next.config.js ./next.config.js

# Create a dedicated non-root user for security
RUN addgroup -g 1001 -S nodejs \
```
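Because `NEXT_PUBLIC_*` values are baked into the Next.js client bundle at build time, they must be supplied as build arguments rather than runtime environment variables. A typical invocation might look like the following (the image tag and context path are illustrative, not taken from this repo):

```bash
# Bake the public API URL into the client bundle at build time.
# Tag and context path are examples only.
docker build \
  --build-arg NEXT_PUBLIC_API_URL=http://localhost:8080 \
  -t openmemory-dashboard ./dashboard
```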
`@@ -0,0 +1,36 @@`

This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app).

## Getting Started

First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.

This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel.

## Learn More

To learn more about Next.js, take a look at the following resources:

- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.

You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!

## Deploy on Vercel

The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

Check out our [Next.js deployment documentation](https://nextjs.org/docs/app/building-your-application/deploying) for more details.
`@@ -0,0 +1,109 @@`

```typescript
import { NextResponse } from 'next/server'
import fs from 'fs'
import path from 'path'

const ENV_PATH = path.resolve(process.cwd(), '../.env')

function parseEnvFile(content: string): Record<string, string> {
  const result: Record<string, string> = {}
  const lines = content.split('\n')

  for (const line of lines) {
    const trimmed = line.trim()
    if (!trimmed || trimmed.startsWith('#')) continue

    const equalIndex = trimmed.indexOf('=')
    if (equalIndex === -1) continue

    const key = trimmed.substring(0, equalIndex).trim()
    const value = trimmed.substring(equalIndex + 1).trim()
    result[key] = value
  }

  return result
}

function serializeEnvFile(updates: Record<string, string>): string {
  const lines: string[] = []

  for (const [key, value] of Object.entries(updates)) {
    lines.push(`${key}=${value}`)
  }

  return lines.join('\n')
}

export async function GET() {
  try {
    if (!fs.existsSync(ENV_PATH)) {
      return NextResponse.json({
        exists: false,
        settings: {}
      })
    }

    const content = fs.readFileSync(ENV_PATH, 'utf-8')
    const settings = parseEnvFile(content)

    // Mask secrets before returning settings to the client
    const masked = { ...settings }
    if (masked.OPENAI_API_KEY) masked.OPENAI_API_KEY = '***'
    if (masked.GEMINI_API_KEY) masked.GEMINI_API_KEY = '***'
    if (masked.AWS_SECRET_ACCESS_KEY) masked.AWS_SECRET_ACCESS_KEY = '***'
    if (masked.OM_API_KEY) masked.OM_API_KEY = '***'

    return NextResponse.json({
      exists: true,
      settings: masked
    })
  } catch (e: any) {
    console.error('[Settings API] read error:', e)
    return NextResponse.json(
      { error: 'internal', message: e.message },
      { status: 500 }
    )
  }
}

export async function POST(request: Request) {
  try {
    const updates = await request.json()

    if (!updates || typeof updates !== 'object') {
      return NextResponse.json(
        { error: 'invalid_body' },
        { status: 400 }
      )
    }

    let content = ''
    let envExists = false

    if (fs.existsSync(ENV_PATH)) {
      content = fs.readFileSync(ENV_PATH, 'utf-8')
      envExists = true
    } else {
      // Fall back to .env.example as a template for a fresh .env
      const examplePath = path.resolve(process.cwd(), '../.env.example')
      if (fs.existsSync(examplePath)) {
        content = fs.readFileSync(examplePath, 'utf-8')
      }
    }

    const existing = content ? parseEnvFile(content) : {}
    const merged = { ...existing, ...updates }
    const newContent = serializeEnvFile(merged)

    fs.writeFileSync(ENV_PATH, newContent, 'utf-8')

    return NextResponse.json({
      ok: true,
      created: !envExists,
      message: 'Settings saved. Restart the backend to apply changes.'
    })
  } catch (e: any) {
    console.error('[Settings API] write error:', e)
    return NextResponse.json(
      { error: 'internal', message: e.message },
      { status: 500 }
    )
  }
}
```
**Review comment** (on the Dockerfile `ARG`/`ENV` block, lines +6 to +10):

Using `NEXT_PUBLIC_API_KEY` here to authenticate to a backend protected by `OM_API_KEY` will leak your server's bearer token into the browser bundle, because all `NEXT_PUBLIC_*` values are embedded in the client-side JavaScript. An attacker who can load the dashboard can simply inspect the built JS or the outgoing `x-api-key` header to recover the secret and then call protected write endpoints (e.g., `/memory/add`, `/memory/delete`) directly from scripts. To fix this, avoid putting any shared secret or `OM_API_KEY` equivalent into `NEXT_PUBLIC_API_KEY`; instead, keep authentication secrets server-only (e.g., `OM_API_KEY` plus proper user/session auth), and update the example comment to make that constraint explicit.