
feat: Add multi-provider LLM support via LiteLLM integration #110

Open

joeVenner wants to merge 1 commit into VectifyAI:main from
joeVenner:feat/Add_multi_provider_LLM_support_including_local_ollama

Conversation

@joeVenner

🌟 Add Multi-Provider LLM Support - Use Any LLM with PageIndex!

Summary

This PR transforms PageIndex from an OpenAI-only tool into a universal LLM-powered document indexing solution. By integrating LiteLLM, users can now choose from 100+ LLM providers including:

Provider | Models | Setup
🟢 OpenAI | GPT-4o, GPT-4 Turbo | OPENAI_API_KEY
🟣 Anthropic | Claude 3 Opus, Sonnet, Haiku | ANTHROPIC_API_KEY
🔵 Google | Gemini Pro, Gemini 1.5 | GEMINI_API_KEY
🟦 Azure | Azure OpenAI deployments | AZURE_API_KEY + base
🟠 AWS Bedrock | Claude, Llama on Bedrock | AWS credentials
Groq | Llama 3.1, Mixtral | GROQ_API_KEY
🦙 Ollama (Local) | Llama3, Mistral, Qwen | No API key!

🚀 Key Features

  • Zero Breaking Changes: Fully backward compatible with existing setups
  • Local Model Support: Run completely offline with Ollama
  • Flexible Token Counting: Automatic tokenizer selection per provider
  • Simple Migration: Just change the model string!

📝 Usage Examples

# OpenAI (default - no changes needed)
python3 run_pageindex.py --pdf_path doc.pdf

# Claude 3 Opus
export ANTHROPIC_API_KEY=your_key
python3 run_pageindex.py --pdf_path doc.pdf --model claude-3-opus-20240229

# Local Ollama (no API key!)
ollama serve
python3 run_pageindex.py --pdf_path doc.pdf --model ollama/llama3
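
Only the --model string changes per provider; every request is routed through litellm.completion(). A quick way to confirm a provider key is wired up before indexing a whole document is a standalone LiteLLM call (illustrative snippet, not part of this PR; the model strings are just examples):

# smoke_test.py - illustrative only, not shipped with this PR
from litellm import completion

# Any provider-prefixed model string accepted by --model should work here too.
# For ollama/llama3, a local `ollama serve` must be running.
for model in ["gpt-4o-2024-11-20", "claude-3-opus-20240229", "ollama/llama3"]:
    resp = completion(
        model=model,
        messages=[{"role": "user", "content": "Reply with OK."}],
        temperature=0,
    )
    print(model, "->", resp.choices[0].message.content)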

Replace OpenAI-only implementation with LiteLLM to support 100+ LLM providers
including Anthropic Claude, Google Gemini, Azure OpenAI, AWS Bedrock, Groq,
and local Ollama models.

Changes:
- Add litellm>=1.0.0 dependency
- Refactor ChatGPT_API functions to use litellm.completion()
- Enhance count_tokens() for multi-provider token counting
- Update config.yaml with provider-specific model examples
- Update README.md with multi-provider setup instructions

Backward compatible: Existing OPENAI_API_KEY and CHATGPT_API_KEY still work.
Default model remains gpt-4o-2024-11-20.
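
For reviewers, a simplified sketch of the new call path (function names follow the existing helpers, but signatures are abbreviated and retry/async handling is omitted; this is not copied from the diff):

# Simplified sketch - signatures abbreviated, error handling omitted
from litellm import completion, token_counter

def ChatGPT_API(model, prompt, temperature=0):
    # litellm.completion() routes the request to OpenAI, Anthropic, Gemini,
    # Azure, Bedrock, Groq, or a local Ollama server based on the model string.
    response = completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,
    )
    return response.choices[0].message.content

def count_tokens(text, model="gpt-4o-2024-11-20"):
    # litellm.token_counter() selects a provider-appropriate tokenizer and
    # falls back to a default tokenizer when none is registered for the model.
    return token_counter(model=model, text=text)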
@jdiaz-elyndra

jdiaz-elyndra commented Feb 14, 2026 via email

@joeVenner
Author

@zmtomorrow Can you check, please?
