🎓 An open-source AI personal tutor that runs locally on your computer
Based on Llama Tutor by @nutlope
Study Buddy is a desktop application that provides personalized AI tutoring without requiring internet access or accounts, and without the risk of API abuse. It's a fork of the excellent Llama Tutor project, enhanced with:
- 🔌 Provider-agnostic architecture - Use Ollama (local), OpenAI, Together AI, or others
- 🖥️ Desktop application - Runs securely on your computer via Electron
- 🔒 Privacy-first - Default local mode means your data never leaves your device
- 💰 Free to use - No API costs with local Ollama mode
- 🎯 Student-focused - Simple interface designed for learners
- 📚 Generate comprehensive tutorials on any topic
- 🔍 Smart search integration for enriched content
- 💬 Interactive chat for follow-up questions
- 🎨 Clean, intuitive interface
- 📱 Works offline after initial setup
- ⚡ Fast local inference with Ollama
- Node.js 18+ installed
- (Optional) Ollama for local AI - recommended for students
- Download the latest release from the Releases page:
  - Windows: StudyBuddy-Setup-x.x.x.exe
  - macOS: StudyBuddy-x.x.x.dmg
  - Linux: StudyBuddy-x.x.x.AppImage
- Install and run - that's it! Study Buddy will use Ollama if installed (see the sketch below), or prompt for API configuration.
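As a rough illustration of that local-first behaviour (not Study Buddy's actual startup code), a check like the following could decide whether a local Ollama server is reachable before falling back to an API-based provider. It assumes Ollama's default local endpoint at http://localhost:11434; the function name is hypothetical.

```typescript
// Hypothetical startup check, assuming Ollama's default local API port (11434).
// Study Buddy's real detection logic may differ.
export async function ollamaIsAvailable(): Promise<boolean> {
  try {
    // /api/tags lists the models installed on a running Ollama server.
    const res = await fetch("http://localhost:11434/api/tags");
    return res.ok;
  } catch {
    // Connection refused or timeout: no local Ollama, so the app would
    // prompt the user to configure an API provider instead.
    return false;
  }
}
```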
# Clone the repository
git clone https://github.com/michael-borck/study-buddy.git
cd study-buddy
# Install dependencies
npm install
# Run in development mode
npm run electron-dev
# Build for production
npm run electron-pack

Study Buddy supports multiple AI providers. Configure in Settings or via environment variables:

AI_PROVIDER=ollama
# No API key needed! Just install Ollama

AI_PROVIDER=openai
OPENAI_API_KEY=your_api_key_here

AI_PROVIDER=together
TOGETHER_API_KEY=your_api_key_here

To add a custom provider (a sketch follows these steps):
- Create a new provider in utils/providers/
- Implement the LLMProvider interface
- Add to the provider factory in utils/provider-factory.ts
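The snippet below is only a rough sketch of those three steps, not the repository's actual code: the LLMProvider method names, the message shape, and the factory's AI_PROVIDER switch are assumptions made for illustration. Check utils/providers/ and utils/provider-factory.ts for the real interface.

```typescript
// Hypothetical sketch only - the real LLMProvider interface and factory in
// this repo may look different.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

export interface LLMProvider {
  name: string;
  chat(messages: ChatMessage[]): Promise<string>;
}

// 1. Create a new provider (would live in utils/providers/my-provider.ts).
export class MyProvider implements LLMProvider {
  name = "my-provider";

  constructor(private apiKey: string, private baseUrl = "https://api.example.com") {}

  // 2. Implement the interface: send the chat history, return the reply text.
  async chat(messages: ChatMessage[]): Promise<string> {
    const res = await fetch(`${this.baseUrl}/v1/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json", Authorization: `Bearer ${this.apiKey}` },
      body: JSON.stringify({ model: "my-model", messages }),
    });
    if (!res.ok) throw new Error(`Provider request failed: ${res.status}`);
    const data = await res.json();
    return data.choices?.[0]?.message?.content ?? "";
  }
}

// 3. Register it in the provider factory (utils/provider-factory.ts), keyed off
//    the same AI_PROVIDER environment variable shown in the configs above.
export function createProvider(): LLMProvider {
  switch (process.env.AI_PROVIDER) {
    case "my-provider":
      return new MyProvider(process.env.MY_PROVIDER_API_KEY ?? "");
    // ...existing cases for ollama, openai, together...
    default:
      throw new Error(`Unknown AI_PROVIDER: ${process.env.AI_PROVIDER}`);
  }
}
```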
- Launch Study Buddy from your Applications/Programs
- Enter a topic you want to learn about
- Click "Generate" to create your personalized tutorial
- Ask follow-up questions in the chat interface
- Save or export your sessions for later review
Study Buddy can be deployed institution-wide:
- Self-hosted option: Deploy the web version on your school's servers
- Managed API keys: Configure with your institution's API keys
- Custom models: Use your preferred AI models
- Usage analytics: Monitor usage with built-in observability
See our Educator's Guide for deployment instructions.
- Frontend: Next.js 14, React, TypeScript, Tailwind CSS
- Desktop: Electron
- AI Integration: Configurable providers (Ollama, OpenAI, Together AI)
- Search: Tavily API for enriched content (a rough sketch follows this list)
- Analytics: Helicone (optional)
- Database: Supabase (optional, for web deployment)
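To illustrate how the search enrichment could work, here is a minimal sketch of a Tavily query. It assumes Tavily's REST search endpoint with the API key passed in the JSON body and a TAVILY_API_KEY environment variable; the exact request and response fields may differ from what Study Buddy actually uses.

```typescript
// Hypothetical search-enrichment call; request/response field names are
// assumptions about Tavily's REST API, not Study Buddy's actual code.
interface SearchResult {
  title: string;
  url: string;
  content: string;
}

export async function searchForContext(query: string): Promise<SearchResult[]> {
  const res = await fetch("https://api.tavily.com/search", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      api_key: process.env.TAVILY_API_KEY, // assumed env var name
      query,
      max_results: 5,
    }),
  });
  if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
  const data = await res.json();
  // Assumed response shape: { results: [{ title, url, content, ... }] }
  return (data.results ?? []) as SearchResult[];
}
```

The returned snippets could then be folded into the tutorial prompt before it is sent to the configured LLM provider.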
We welcome contributions! Please see our Contributing Guide for details.
# Fork the repo, then:
git clone https://github.com/michael-borck/study-buddy.git
cd study-buddy
npm install
npm run dev

Roadmap:
- Additional AI provider support (Anthropic, Cohere, local GGUF)
- Collaborative study sessions
- PDF/Document upload and analysis
- Study progress tracking
- Flashcard generation
- Mobile app (React Native)
Study Buddy is based on Llama Tutor by Hassan El Mghari (@nutlope). The original project showcased the power of AI in education, and we're building on that foundation to make it accessible to all students.
See our ACKNOWLEDGMENTS.md for a comprehensive list of all the amazing open source projects that make Study Buddy possible.
MIT License - see LICENSE file for details.
Note: This is an independent fork and is not officially associated with the original Llama Tutor project.