
Releases: Lyellr88/MARM-Systems

v2.1.0

15 Sep 07:38
61d0ef5


MARM Universal MCP Server v2.1.0 - GitHub Release

🧠 Universal Memory Intelligence for AI Agents

Transform any AI conversation into a lasting memory experience. MARM provides intelligent semantic search, contextual logging, and seamless workflow management across all major AI platforms.

⚡ Key Features

19 Production-Ready MCP Tools

  • Smart Memory: Semantic search with AI embeddings finds exactly what you need
  • Session Management: Start, refresh, and bridge conversations across different AI agents
  • Intelligent Logging: Auto-categorize and store important decisions, code snippets, and insights
  • Notebook System: Create reusable knowledge templates and active instruction sets
  • Context Bridging: Seamless workflow transitions between different AI agents and projects

Professional Architecture

  • FastAPI Backend: Modern Python server with SQLite optimization and connection pooling
  • Docker Ready: Multi-stage builds with health monitoring and configurable environments
  • Professional Testing: 5 comprehensive diagnostic suites (security, performance, integration, memory, MCP compliance)
  • Cross-Platform: Works with Claude Code, Qwen CLI, Gemini CLI, Grok CLI
  • Rate Limiting: IP-based protection with graceful degradation
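IP-based rate limiting with graceful degradation is commonly built as a per-IP token bucket. The sketch below is illustrative only; the class and parameter names are hypothetical and not taken from MARM's actual implementation:

```python
import time
from collections import defaultdict

class IPRateLimiter:
    """Minimal per-IP token-bucket limiter (illustrative sketch, not MARM's code)."""

    def __init__(self, rate: float = 5.0, burst: int = 10):
        self.rate = rate    # tokens refilled per second
        self.burst = burst  # maximum bucket size
        self.buckets = defaultdict(
            lambda: {"tokens": float(burst), "last": time.monotonic()}
        )

    def allow(self, ip: str) -> bool:
        b = self.buckets[ip]
        now = time.monotonic()
        # Refill tokens proportionally to elapsed time, capped at the burst size.
        b["tokens"] = min(self.burst, b["tokens"] + (now - b["last"]) * self.rate)
        b["last"] = now
        if b["tokens"] >= 1:
            b["tokens"] -= 1
            return True
        return False  # graceful degradation: caller answers 429 instead of failing

limiter = IPRateLimiter(rate=5.0, burst=10)
print(limiter.allow("203.0.113.7"))  # True: a fresh bucket starts full
```

The "graceful degradation" part is the `False` branch: the server keeps serving other clients and simply throttles the noisy IP.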

Complete Documentation

  • Installation Guides: Docker, Windows, Linux, and platform integration
  • Usage Handbooks: Step-by-step workflows with real-world examples
  • Professional Testing: Built-in diagnostics validate security, performance, and compliance

🎯 Perfect For

  • Developers: Maintain context across coding sessions, remember architecture decisions
  • Researchers: Build searchable knowledge bases from AI conversations
  • Content Creators: Track ideas, maintain consistent voice across projects
  • Teams: Share AI conversation insights and maintain project memory

🛠 Quick Start

Docker (Recommended):

git clone https://github.com/Lyellr88/MARM-Systems.git
cd MARM-Systems/marm-mcp-server/MARMcp-beta
docker-compose up --build

Connect to your AI:

# Claude Code
claude mcp add marm-memory --transport http --url "http://localhost:8001/mcp"

# Grok CLI
grok mcp add marm-memory --transport http --url "http://localhost:8001/mcp"

📊 What's New in v2.1.0

  • Universal MCP Protocol: Full Model Context Protocol implementation with 1MB response compliance
  • Semantic Search Engine: AI-powered memory retrieval using sentence-transformers
  • Production Hardening: Rate limiting, error isolation, health monitoring
  • Professional Test Suite: 74 comprehensive tests across security, performance, and integration
  • Cross-Platform Integration: Support for all major AI CLI tools
  • Complete Documentation: Installation guides for every platform and use case
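Semantic retrieval of the kind described above boils down to ranking stored memories by cosine similarity between embedding vectors. The sketch below only illustrates that ranking step; it uses tiny hand-made vectors in place of real sentence-transformers embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query_vec, memories):
    """Rank stored memories by similarity to the query embedding."""
    return sorted(memories,
                  key=lambda m: cosine_similarity(query_vec, m["vec"]),
                  reverse=True)

# Toy 3-dimensional "embeddings" standing in for sentence-transformers output.
memories = [
    {"text": "decided on SQLite with connection pooling", "vec": [0.9, 0.1, 0.0]},
    {"text": "favorite color is gold",                    "vec": [0.0, 0.2, 0.9]},
]
best = search([0.8, 0.2, 0.1], memories)[0]
print(best["text"])  # the database decision ranks first
```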

🔗 Learn More


MARM Discord
Join the waitlist

Built with ❤️ for the AI developer community. Transform your AI conversations from ephemeral chats into lasting intelligence.

MARM v2.0 Release - The AI That Remembers Your Conversations

18 Aug 17:24
139af1f


🚀 chatbot-dark

Memory Accurate Response Mode has evolved from experimental protocol to production-ready AI framework.

The AI That Remembers Your Conversations


What's New in v2.0

🧠 Enhanced AI Provider & Performance

  • Complete migration from Google Gemini to Llama 4 Maverick (400B parameters)
  • 95% cost reduction while significantly improving response quality and speed
  • 10M token context limit for handling extensive conversation histories
  • Advanced streaming implementation with proper polling for real-time responses

🎯 MARM Protocol Evolution

  • "MARM IS memory incarnate" - Stronger AI identity and more natural responses
  • Improved command structure: /deep dive replaces /contextual reply for clearer functionality
  • Enhanced notebook system with /notebook use:, /notebook clear:, /notebook status: commands
  • Mid-session activation - Start MARM anytime and automatically import existing conversation context
  • "💭 Thinking Trail" format replaces verbose contract responses for better user experience

🎨 Complete Interface Modernization

  • Glassmorphism Revolution - Eliminated nested windows with floating message cards directly on transparent glass background
  • Color-Coded Conversations - Clean white user message cards, warm gold AI response cards creating perfect contrast against cool blue glass
  • Single-Layer Architecture - Modern 2025 design eliminating the "stacked windows" feeling with beautiful backdrop blur effects
  • Professional Polish - Refined shadows, hover effects, and smooth transitions rivaling commercial AI applications
  • Contextual command menu - Moved from sidebar to intelligent popup next to input field
  • File upload system (📎) - Upload and analyze 15+ file types with syntax highlighting
  • MARM toggle button (🤖) - Switch instantly between structured MARM and free conversation mode
  • Mobile-Responsive Glassmorphism - Adapts beautifully to any screen size while maintaining visual integrity

🔒 Production-Grade Security

  • Comprehensive XSS protection with dedicated security module across all components
  • Centralized state management with immutable patterns and validation
  • Multi-tab synchronization matching ChatGPT/Claude standards
  • Memory leak prevention with trackable cleanup system for all resources

🧪 Professional Testing Infrastructure

  • Comprehensive Test Suite: 74 passing tests covering Voice, UI, State, Commands, and Security modules
  • Automated Testing: GitHub Actions CI/CD with Node.js 18.x & 20.x compatibility testing
  • Quality Assurance: 42.39% test coverage with browser API mocking and edge case validation
  • Developer Ready: Professional-grade testing setup ensuring code quality and contributor confidence

🐛 Critical Stability Improvements

  • Fixed memory loss bug - MARM now preserves conversation context when activated mid-session
  • Enhanced Sessions - Conversations survive page refreshes and mode switches
  • Runtime crash elimination - Resolved 15+ critical bugs affecting core functionality
  • Voice system fixes - Smooth text-to-speech without interruption errors
  • Performance optimization - 60+ lines of duplicate code eliminated, O(1) size tracking
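The O(1) size tracking mentioned above is a standard pattern: maintain a running total that is updated on every insert and removal, instead of re-summing the whole collection each time. MARM's webchat is JavaScript; this hypothetical Python version just shows the idea:

```python
class MessageStore:
    """Tracks total payload size incrementally: O(1) per update, no re-scan."""

    def __init__(self):
        self.messages = []
        self.total_bytes = 0  # running total, adjusted on every mutation

    def add(self, text: str) -> None:
        self.messages.append(text)
        self.total_bytes += len(text.encode("utf-8"))

    def pop_oldest(self) -> str:
        msg = self.messages.pop(0)
        self.total_bytes -= len(msg.encode("utf-8"))
        return msg

store = MessageStore()
store.add("hello")
store.add("wörld")
print(store.total_bytes)  # 11: 5 bytes + 6 bytes ("ö" is 2 bytes in UTF-8)
```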

Why Upgrade to v2.0?

For New Users

MARM v2.0 is your gateway to AI conversations that remember context, maintain consistency, and grow with you over time. Unlike traditional AI chats that forget everything, MARM builds an enhanced memory of your interactions.

For Existing Users (v1.5 → v2.0)

  • Better performance with 95% cost reduction and faster responses
  • Smoother experience with modern UI and eliminated crashes
  • Enhanced capabilities with file uploads and mid-session activation
  • Production stability ready for serious workflows

Quick Start

Try MARM Online

Visit the live webchat interface - no installation required.

Use MARM in Any AI Chat

Copy the protocol from PROTOCOL.md and paste into ChatGPT, Claude, or your preferred AI.

Local Installation

git clone https://github.com/yourusername/MARM-Systems-MARM.git
cd MARM-Systems-MARM/webchat
npm install
npm start

What Makes MARM Different?

Traditional AI: Forgets context every conversation
MARM: Builds enhanced memory and grows smarter over time

Traditional AI: Generic responses for everyone
MARM: Personalized responses based on your specific knowledge base

Traditional AI: No control over AI behavior
MARM: Toggle between structured memory mode and free conversation instantly


Core Features

📝 Session Management

  • /start marm - Activate memory-enhanced responses
  • /log session:name - Create structured conversation logs
  • /compile SessionName --summary - Generate intelligent summaries

🗂️ Personal Knowledge Base

  • /notebook add:key value - Store facts, preferences, and context
  • /notebook use:key - Reference your stored information
  • Steel trap memory - MARM treats your notebook as absolute truth

🔍 Advanced Analysis

  • /deep dive topic - Get comprehensive analysis with reasoning trails
  • File upload support - Analyze code, documents, and data files
  • Context preservation - Never lose conversation flow when switching modes
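The slash commands above follow a simple `/<command> <arg>` or `/<command> key:value` shape. A minimal parser for that grammar might look like the following (hypothetical sketch, not MARM's actual implementation):

```python
import re

# Matches "/notebook add:key value", "/log session:name", "/deep dive topic", etc.
COMMAND_RE = re.compile(r"^/(?P<cmd>[a-z]+(?: [a-z]+)?)(?:\s+(?P<rest>.*))?$")

def parse_command(line: str):
    """Split a MARM slash command into (command, subcommand_or_key, payload)."""
    m = COMMAND_RE.match(line.strip())
    if not m:
        return None
    cmd, rest = m.group("cmd"), m.group("rest") or ""
    if ":" in rest:
        # "add:favorite_editor vim" -> key "add", payload "favorite_editor vim"
        key, _, payload = rest.partition(":")
        return (cmd, key.strip(), payload.strip())
    return (cmd, None, rest.strip())

print(parse_command("/notebook add:favorite_editor vim"))
# → ('notebook', 'add', 'favorite_editor vim')
```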

Technical Improvements

  • 50+ files modified across frontend, backend, and configuration
  • 1000+ lines of code optimized for performance and maintainability
  • Complete HTML/JS separation following modern web development standards
  • Comprehensive testing suite with XSS protection and memory leak detection
  • Production deployment ready with proper error handling and monitoring

Migration Notes

From v1.5

  • All existing /log and /notebook data remains compatible
  • New /deep dive command replaces /contextual reply
  • Enhanced notebook commands available (see HANDBOOK.md)
  • UI will feel familiar but significantly more polished

From v1.4 and Earlier

  • Session management now more robust with automatic recovery
  • Protocol responses now more natural and user-friendly

What's Next?

MARM v2.0 establishes the foundation for our expanding AI ecosystem:

  • MCP Integration - Model Context Protocol implementation
  • N8N Workflows - Automation and workflow integration
  • Dual RAG System - Advanced retrieval augmented generation
  • Multi-AI Personalities - MoreLogic, HybridLogic, and specialized AI variants

Community & Support

  • GitHub Discussions - Share use cases and get help
  • Issue Tracker - Report bugs and request features
  • Live Demo - Try it before you download, either locally or via the hosted chatbot (see the README.md)

Acknowledgments

MARM v2.0 represents months of refinement based on community feedback, real-world testing, and continuous iteration. Thank you to everyone who provided feedback, reported bugs, and helped shape this release.

Stop settling for forgetful AI. Transform your conversations. Empower your workflows. Experience the future where AI truly remembers. The revolution starts now. Dive into MARM v2.0! Get Started Here | webchat interface