A command-line utility for querying AI models directly from your terminal. It supports multiple providers, including OpenAI, OpenRouter, and Anthropic.
- Query AI models from your terminal with simple commands
- Support for multiple AI providers:
  - OpenAI (GPT models)
  - OpenRouter (access to many models through one API)
  - Anthropic (Claude models)
- Read prompts from command line arguments, files, or standard input
- Configure default models for each provider
- List available models for each provider
- Secure API key management
To use this tool you will need:

- Python 3.6 or higher
- The `requests` Python package
- The `jq` command-line tool (required by the setup script)
To install and set up the tool:

- Clone this repository or download the script files.
- Install the required dependencies:

  ```sh
  pip install requests
  ```

- Make the scripts executable:

  ```sh
  chmod +x ai_prompt.py setup_api_keys.sh
  ```

- Run the setup script to configure your API keys and default models:

  ```sh
  ./setup_api_keys.sh
  ```

  Alternatively, you can run the Python setup directly:

  ```sh
  python ai_prompt.py --setup
  ```
The tool stores configuration in two files in your home directory:

- `~/.ai_prompt_config.json`: stores default models and provider settings
- `~/.ai_prompt_keys.json`: stores your API keys
You can edit these files manually or use the setup commands to configure them.
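For reference, both files are plain JSON. A minimal sketch of what `~/.ai_prompt_config.json` might contain is shown below; the exact field names depend on the script and are illustrative only:

```json
{
  "default_provider": "openrouter",
  "default_models": {
    "openai": "gpt-3.5-turbo",
    "openrouter": "google/gemini-2.0-pro-exp-02-05:free",
    "anthropic": "claude-3-opus-20240229"
  }
}
```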
By default, the tool uses:

- OpenRouter as the default provider
- Default models:
  - OpenAI: `gpt-3.5-turbo`
  - OpenRouter: `google/gemini-2.0-pro-exp-02-05:free`
  - Anthropic: `claude-3-opus-20240229`
Query an AI model with a prompt:

```sh
./ai_prompt.py "What is the capital of France?"
```

To query a specific provider, pass `-p`:

```sh
./ai_prompt.py -p openai "Explain quantum computing in simple terms"
```

Available providers: `openai`, `openrouter`, `anthropic`.

To use a specific model, pass `-m`:

```sh
./ai_prompt.py -m gpt-4 "Write a short story about a robot"
```

To read the prompt from a file or from standard input:

```sh
./ai_prompt.py -f prompt.txt
cat prompt.txt | ./ai_prompt.py
```
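Because the tool reads from standard input, it can be chained with other commands. For example, you could build a prompt on the fly and pipe it in (the `server.log` file here is purely illustrative):

```sh
# Prepend an instruction to some command output and send the whole thing as the prompt
{ echo "Summarize the following log output:"; tail -n 20 server.log; } | ./ai_prompt.py
```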
List the available models:

```sh
./ai_prompt.py --list-models
```

To list models for a specific provider:

```sh
./ai_prompt.py -p openai --list-models
```
To rerun the configuration setup at any time:

```sh
./ai_prompt.py --setup
```
The tool accepts the following options:

- `-m, --model MODEL`: model to use (defaults to the provider's configured default)
- `-p, --provider {openai,openrouter,anthropic}`: provider to use
- `--setup`: run the configuration setup
- `--list-models`: list available models
- `-f, --file FILE`: read the prompt from a file
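These options can typically be combined in a single invocation, for example:

```sh
# Send a prompt stored in a file to a specific provider and model
./ai_prompt.py -p openrouter -m google/gemini-2.0-pro-exp-02-05:free -f prompt.txt
```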
You'll need API keys from the providers you want to use:
- OpenAI: https://platform.openai.com/api-keys
- OpenRouter: https://openrouter.ai/keys
- Anthropic: https://console.anthropic.com/keys
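Once you have a key, run `./ai_prompt.py --setup` (or `./setup_api_keys.sh`) to store it. If you prefer to edit `~/.ai_prompt_keys.json` by hand, it is a plain JSON file; a plausible layout (the field names and key placeholders shown here are illustrative, not guaranteed) is:

```json
{
  "openai": "sk-...",
  "openrouter": "sk-or-...",
  "anthropic": "sk-ant-..."
}
```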
A few more example queries:

```sh
./ai_prompt.py -p openai -m gpt-4 "Write a Python function to calculate Fibonacci numbers"
./ai_prompt.py -p anthropic "Explain the difference between REST and GraphQL"
./ai_prompt.py -p openrouter -m anthropic/claude-3-opus-20240229 "Suggest five book recommendations"
```
To call the tool from anywhere, you can add a shell alias (replace `<path>` with the directory where you cloned the repository):

```sh
alias llm="python3 <path>/AIprompt/ai_prompt.py"
```
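After adding the alias to your shell configuration (for example `~/.bashrc` or `~/.zshrc`), you can query directly:

```sh
llm "What is the capital of France?"
llm -p anthropic -f prompt.txt
```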
If you encounter authentication errors, ensure your API keys are correctly configured:

```sh
./ai_prompt.py --setup
```

If a specified model is unavailable, use the `--list-models` option to see the models available for your provider.
If you encounter rate limit errors, wait a few minutes before trying again or switch to a different provider.
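For example, if your default provider is rate limited, you can retry the same prompt against another provider you have configured:

```sh
./ai_prompt.py -p anthropic "Explain quantum computing in simple terms"
```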
This project is open source and available under the MIT License.
We welcome contributions! Please read our Contributing Guidelines and Code of Conduct before submitting a pull request.