PeerPrep is a real-time collaborative coding platform that pairs users to solve coding interview problems together. The system features peer matching, real-time code collaboration, AI-powered assistance, and code execution capabilities.
PeerPrep follows a microservices architecture with the following components. Links to all the currently deployed services are provided.
- User Service: Authentication, user management, and difficulty tracking
- Question Service: Question bank management with admin CRUD operations
- Matching Service: Peer matching algorithm using Redis queues
- Collaboration Service: Real-time collaboration via WebSocket, code execution, and AI assistance
- Login/Signup UI: User authentication and dashboard
- Matching UI: Topic selection and matching interface
- Collaboration UI: Real-time code editor with collaboration features
- PostgreSQL (Supabase): User data and question bank storage
- Redis: Matching queues, session state, and temporary data
Before running the project locally, ensure you have:
- Node.js v18 or higher
- npm or yarn
- Redis (can run via Docker or locally)
- PostgreSQL (via Supabase cloud or local instance)
- Docker (optional, for containerized setup)
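You can quickly confirm the core prerequisites from a terminal before continuing:

```bash
# Confirm tool versions before continuing
node -v            # should report v18.x or newer
npm -v
docker --version   # only needed for the containerized setup
```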
git clone <repository-url>
cd cs3219-ay2526s1-project-g25
git checkout github-actions
Install dependencies for all services and frontend applications:
# Backend services
cd user-service/user-service && npm install && cd ../..
cd question-service/question-service && npm install && cd ../..
cd matching-service/matching-service && npm install && cd ../..
cd collaboration-service/collaboration-service && npm install && cd ../..
# Frontend applications
cd feature-login-signup-ui/frontend && npm install && cd ../..
cd feature-matching-ui/frontend && npm install && cd ../..
cd feature-collaboration-ui/frontend/peerprep-collab && npm install && cd ../../..
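If you prefer a single command, a loop like the sketch below (run from the repository root) installs the same dependencies. It is only a convenience sketch; the per-directory commands above remain the canonical steps.

```bash
# Optional convenience: install all dependencies in one pass.
# The directory list mirrors the commands above.
for dir in \
  user-service/user-service \
  question-service/question-service \
  matching-service/matching-service \
  collaboration-service/collaboration-service \
  feature-login-signup-ui/frontend \
  feature-matching-ui/frontend \
  feature-collaboration-ui/frontend/peerprep-collab; do
  (cd "$dir" && npm install)
done
```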
Create .env files for each service by copying the env.example files:
# User Service
cp user-service/user-service/env.example user-service/user-service/.env
# Question Service
cp question-service/question-service/env.example question-service/question-service/.env
# Matching Service
cp matching-service/matching-service/env.example matching-service/matching-service/.env
# Collaboration Service
cp collaboration-service/collaboration-service/env.example collaboration-service/collaboration-service/.env
# Frontend applications
cp feature-login-signup-ui/frontend/env.example feature-login-signup-ui/frontend/.env
cp feature-matching-ui/frontend/env.example feature-matching-ui/frontend/.env
cp feature-collaboration-ui/frontend/peerprep-collab/env.example feature-collaboration-ui/frontend/peerprep-collab/.env
Important: Fill in all environment variables according to the env.example files. You'll need:
- Supabase credentials (URL and service role key)
- JWT secrets (must be identical across all services)
- Redis connection URL
- External API keys (Google AI, RapidAPI for Judge0, Cloudinary)
- Service URLs for inter-service communication
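As a rough illustration only, a service .env might look like the sketch below. The variable names here are placeholders; the authoritative names and the full list live in each service's env.example file.

```bash
# Illustrative placeholders only -- consult each env.example for the real variable names
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
JWT_SECRET=use-the-same-secret-in-every-service
REDIS_URL=redis://localhost:6379
GOOGLE_AI_API_KEY=your-google-ai-key
RAPIDAPI_KEY=your-judge0-rapidapi-key
CLOUDINARY_URL=cloudinary://api-key:api-secret@cloud-name
USER_SERVICE_URL=http://localhost:3001
QUESTION_SERVICE_URL=http://localhost:5050
```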
# Start Redis
docker run -d -p 6379:6379 redis:latest
Ensure Redis is running on localhost:6379.
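To confirm Redis is reachable, you can ping it. If you started Redis via Docker, the second command avoids needing redis-cli installed locally.

```bash
# Either of these should print PONG if Redis is up
redis-cli -h localhost -p 6379 ping
docker exec $(docker ps -q --filter ancestor=redis:latest) redis-cli ping
```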
The services use Supabase PostgreSQL. Ensure your Supabase database is set up with the required tables:
- User Service: Run migrations from user-service/user-service/migrations/init.sql
- Question Service: Run schema from question-service/question-service/db/init.sql
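One way to apply these SQL files is with psql against your Supabase connection string, as sketched below; the Supabase SQL editor works just as well. DATABASE_URL here is a placeholder for your own connection string.

```bash
# Apply the SQL files with psql (DATABASE_URL is your Supabase connection string)
psql "$DATABASE_URL" -f user-service/user-service/migrations/init.sql
psql "$DATABASE_URL" -f question-service/question-service/db/init.sql
```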
Open separate terminal windows for each service:
# Terminal 1: User Service
cd user-service/user-service
npm run dev
# Terminal 2: Question Service
cd question-service/question-service
npm run dev
# Terminal 3: Matching Service
cd matching-service/matching-service
npm run dev
# Terminal 4: Collaboration Service
cd collaboration-service/collaboration-service
npm run dev
# Terminal 5: Login/Signup UI
cd feature-login-signup-ui/frontend
npm start
# Terminal 6: Matching UI
cd feature-matching-ui/frontend
npm run dev
# Terminal 7: Collaboration UI
cd feature-collaboration-ui/frontend/peerprep-collab
npm run dev
Once all services are running, access them at:
- Login/Signup UI: http://localhost:3000
- Matching UI: http://localhost:3002
- Collaboration UI: http://localhost:4000
- User Service API: http://localhost:3001
- Question Service API: http://localhost:5050
- Matching Service API: http://localhost:4001
- Collaboration Service API: http://localhost:3004
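To sanity-check that the backend services are up, you can curl each port. The /health paths below are assumptions, not confirmed routes; adjust them to whatever endpoints the services actually expose.

```bash
# Quick reachability check -- the /health paths are assumptions; adjust to the real routes
curl -i http://localhost:3001/health   # User Service
curl -i http://localhost:5050/health   # Question Service
curl -i http://localhost:4001/health   # Matching Service
curl -i http://localhost:3004/health   # Collaboration Service
```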
For a containerized setup, use Docker Compose:
# Ensure all .env files are configured
docker-compose up -d --build
This will start all backend services in containers. Frontend applications should still be run locally for development.
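To see whether the containers came up cleanly, the usual Compose commands apply. The service name in the logs example is a placeholder; use the names defined in compose.yaml.

```bash
# Check container status and tail logs (service names come from compose.yaml)
docker-compose ps
docker-compose logs -f user-service   # placeholder service name
```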
Run tests for each service:
# User Service
cd user-service/user-service && npm test
# Question Service
cd question-service/question-service && npm test
# Matching Service
cd matching-service/matching-service && npm test
# Collaboration Service
cd collaboration-service/collaboration-service && npm test
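If you want a single command that runs every suite in sequence, a script like the sketch below works from the repository root; it stops at the first failing service.

```bash
# Run every backend test suite in sequence; exit on the first failure
for dir in \
  user-service/user-service \
  question-service/question-service \
  matching-service/matching-service \
  collaboration-service/collaboration-service; do
  (cd "$dir" && npm test) || exit 1
done
```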
cs3219-ay2526s1-project-g25/
├── user-service/ # User authentication and management
├── question-service/ # Question bank service
├── matching-service/ # Peer matching service
├── collaboration-service/ # Real-time collaboration service
├── feature-login-signup-ui/ # Authentication frontend
├── feature-matching-ui/ # Matching interface frontend
├── feature-collaboration-ui/ # Collaboration editor frontend
└── compose.yaml # Docker Compose configuration
- User Authentication: JWT-based auth with role-based access control
- Peer Matching: Redis-based queue system for matching users by topic and difficulty
- Real-time Collaboration: WebSocket-based code synchronization using Yjs
- Code Execution: Judge0 integration for running code in multiple languages (a sketch of this kind of API call follows this list)
- AI Assistance: Google Gemini integration for hints and code analysis
- Question Management: Admin interface for CRUD operations on question bank
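As an illustration of what the Judge0 integration does under the hood, the sketch below submits a snippet to the Judge0 CE API on RapidAPI and then fetches the result. This is not the collaboration service's actual code; the API key, language ID, and polling details are assumptions you would adapt (the snippet also assumes jq is installed and RAPIDAPI_KEY is exported).

```bash
# Sketch of a Judge0 CE (RapidAPI) round trip -- not the collaboration service's actual code
TOKEN=$(curl -s -X POST "https://judge0-ce.p.rapidapi.com/submissions?base64_encoded=false" \
  -H "content-type: application/json" \
  -H "X-RapidAPI-Key: $RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: judge0-ce.p.rapidapi.com" \
  -d '{"source_code": "console.log(1 + 1)", "language_id": 63}' | jq -r '.token')

# Poll for the result (language_id 63 is Node.js in Judge0 CE)
curl -s "https://judge0-ce.p.rapidapi.com/submissions/$TOKEN?base64_encoded=false" \
  -H "X-RapidAPI-Key: $RAPIDAPI_KEY" \
  -H "X-RapidAPI-Host: judge0-ce.p.rapidapi.com"
```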
- Backend: JavaScript (ES6+) with Express.js
- Frontend: React (Login UI) and Next.js (Matching & Collaboration UIs)
Tools Used: ChatGPT and GitHub Copilot.
Prohibited Phases Avoided: We confirm that AI was not used for requirements prioritization, system architecture, component design, or decision rationales. These were done solely by the team.
Uses:
- Implementation Code: Assisted in writing specific functions, classes, and unit tests after the architecture was finalized by the team.
- Debugging Assistance: Used to explain error messages and suggest fixes.
- Refactoring & Documentation: Suggested improvements to code structure and helped generate docstrings/comments.
Verification: All AI-generated outputs have been reviewed, tested, and understood by the team members. We take full responsibility for the final code.
Usage Log: A detailed log of our AI interactions is available in /ai/usage-log.md. (Note: Some initial interactions were not logged but are summarized here.)