LLM API Cost Calculator
Compare pricing across 47+ models from 11 providers. Calculate costs and find the best model for your workload.
LLM Cost Calculator

Example calculation (at $2.50/M input and $10.00/M output tokens):

- Input cost: 1.0K tokens @ $2.50/M = $0.0025
- Output cost: 1.0K tokens @ $10.00/M = $0.0100
- Total cost per request: $0.0125
- At scale: 1K requests = $12.50 · 10K requests = $125.00 · 100K requests = $1,250.00
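The arithmetic behind the calculator is simple: multiply token counts by the per-million-token rates and sum. A minimal Python sketch (the helper name is illustrative, not part of the site):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Cost in USD for one request, given prices per million tokens."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# The example above: 1,000 input + 1,000 output tokens at $2.50/M and $10.00/M.
cost = request_cost(1_000, 1_000, 2.50, 10.00)
print(f"${cost:.4f}")           # per request: $0.0125
print(f"${cost * 10_000:.2f}")  # 10K requests: $125.00
```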
Model Comparison
| Model | Provider | Input/M | Output/M | Total (1K in + 1K out) | Context | Features |
|---|---|---|---|---|---|---|
| Llama 4 Scout (`llama-4-scout`) | Meta | $0.10 | $0.10 | $0.000200 | 10.0M | VF |
| Mistral Small 3.2 (`mistral-small-3-2`) | Mistral AI | $0.06 | $0.18 | $0.000240 | 32.0K | F |
| Gemini 2.5 Flash-Lite (`gemini-2-5-flash-lite`) | Google | $0.07 | $0.30 | $0.000375 | 1.0M | VF |
| DeepSeek Coder V2 (`deepseek-coder`) | DeepSeek | $0.14 | $0.28 | $0.000420 | 128.0K | F |
| Mixtral 8x7B (Groq) (`mixtral-8x7b-32768`) | Groq | $0.24 | $0.24 | $0.000480 | 32.8K | F |
| GPT-4.1 Nano (`gpt-4-1-nano`) | OpenAI | $0.10 | $0.40 | $0.000500 | 1.0M | VFC |
| Gemini 2.0 Flash (`gemini-2-0-flash`) | Google | $0.10 | $0.40 | $0.000500 | 1.0M | VF |
| GPT-4o Mini (`gpt-4o-mini`) | OpenAI | $0.15 | $0.60 | $0.000750 | 128.0K | VFCB |
| Gemini 3 Flash Preview (`gemini-3-flash-preview`) | Google | $0.15 | $0.60 | $0.000750 | 1.0M | VF |
| Gemini 2.5 Flash (`gemini-2-5-flash`) | Google | $0.15 | $0.60 | $0.000750 | 1.0M | VF |
| Command R (`command-r`) | Cohere | $0.15 | $0.60 | $0.000750 | 128.0K | F |
| Grok 3 Mini (`grok-3-mini`) | xAI | $0.30 | $0.50 | $0.000800 | 131.1K | F |
| Llama 4 Maverick (`llama-4-maverick`) | Meta | $0.40 | $0.40 | $0.000800 | 512.0K | VF |
| Llama 3.3 70B (`llama-3-3-70b`) | Meta | $0.60 | $0.60 | $0.001200 | 128.0K | F |
| Codestral (`codestral-2508`) | Mistral AI | $0.30 | $0.90 | $0.001200 | 32.0K | F |
| DeepSeek V3 (`deepseek-chat`) | DeepSeek | $0.28 | $1.10 | $0.001380 | 128.0K | FC |
| Llama 3.3 70B (Groq) (`llama-3-3-70b-versatile`) | Groq | $0.59 | $0.79 | $0.001380 | 128.0K | F |
| Llama 3.3 70B Turbo (`llama-3-3-70b-instruct-turbo`) | Together AI | $0.88 | $0.88 | $0.001760 | 131.1K | F |
| DeepSeek R1 Distill 70B (`deepseek-r1-distill-llama-70b`) | Together AI | $0.88 | $0.88 | $0.001760 | 131.1K | F |
| GPT-4.1 Mini (`gpt-4-1-mini`) | OpenAI | $0.40 | $1.60 | $0.002000 | 1.0M | VFC |
| Mistral Large 3 (`mistral-large-3`) | Mistral AI | $0.50 | $1.50 | $0.002000 | 128.0K | VF |
| Sonar (`sonar`) | Perplexity | $1.00 | $1.00 | $0.002000 | 127.0K | |
| DeepSeek R1 (`deepseek-reasoner`) | DeepSeek | $0.55 | $2.19 | $0.002740 | 128.0K | FC |
| Grok 3 Fast (`grok-3-fast`) | xAI | $0.60 | $3.00 | $0.003600 | 131.1K | VF |
| Claude 3.5 Haiku (`claude-3-5-haiku`) | Anthropic | $0.80 | $4.00 | $0.004800 | 200.0K | VFCB |
| o3-mini (`o3-mini`) | OpenAI | $1.10 | $4.40 | $0.005500 | 200.0K | FC |
| o4-mini (`o4-mini`) | OpenAI | $1.10 | $4.40 | $0.005500 | 200.0K | VFC |
| Claude Haiku 4.5 (`claude-haiku-4-5`) | Anthropic | $1.00 | $5.00 | $0.006000 | 200.0K | VFCB |
| Magistral Medium (`magistral-medium`) | Mistral AI | $2.00 | $5.00 | $0.007000 | 40.0K | F |
| Mistral Large 2 (`mistral-large-2`) | Mistral AI | $2.00 | $6.00 | $0.008000 | 128.0K | VF |
| Pixtral Large (`pixtral-large`) | Mistral AI | $2.00 | $6.00 | $0.008000 | 128.0K | VF |
| GPT-4.1 (`gpt-4-1`) | OpenAI | $2.00 | $8.00 | $0.0100 | 1.0M | VFC |
| Gemini 2.5 Pro (`gemini-2-5-pro`) | Google | $1.25 | $10.00 | $0.0112 | 1.0M | VF |
| GPT-4o (`gpt-4o`) | OpenAI | $2.50 | $10.00 | $0.0125 | 128.0K | VFCB |
| Command R+ (`command-r-plus`) | Cohere | $2.50 | $10.00 | $0.0125 | 128.0K | F |
| Gemini 3 Pro Preview (`gemini-3-pro-preview`) | Google | $2.50 | $15.00 | $0.0175 | 1.0M | VF |
| Claude Sonnet 4.5 (`claude-sonnet-4-5`) | Anthropic | $3.00 | $15.00 | $0.0180 | 200.0K | VFCB |
| Claude Sonnet 4 (`claude-sonnet-4`) | Anthropic | $3.00 | $15.00 | $0.0180 | 200.0K | VFCB |
| Claude 3.5 Sonnet (`claude-3-5-sonnet`) | Anthropic | $3.00 | $15.00 | $0.0180 | 200.0K | VFCB |
| Grok 4 (`grok-4`) | xAI | $3.00 | $15.00 | $0.0180 | 200.0K | VF |
| Grok 4.1 (`grok-4-1`) | xAI | $3.00 | $15.00 | $0.0180 | 200.0K | VF |
| Grok 3 (`grok-3`) | xAI | $3.00 | $15.00 | $0.0180 | 131.1K | VF |
| Sonar Pro (`sonar-pro`) | Perplexity | $3.00 | $15.00 | $0.0180 | 200.0K | |
| Claude Opus 4.6 (`claude-opus-4-6`) | Anthropic | $5.00 | $25.00 | $0.0300 | 1.0M | VFCB |
| Claude Opus 4.5 (`claude-opus-4-5`) | Anthropic | $5.00 | $25.00 | $0.0300 | 200.0K | VFCB |
| GPT-4 Turbo (`gpt-4-turbo`) | OpenAI | $10.00 | $30.00 | $0.0400 | 128.0K | VF |
| o3 (`o3`) | OpenAI | $10.00 | $40.00 | $0.0500 | 200.0K | VFC |
V = Vision · F = Function Calling · C = Caching Available · B = Batch API
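Because input and output prices differ, the cheapest model depends on your input/output token mix, not on either rate alone. A sketch of ranking a few of the models above for a concrete workload (prices copied from the table; the helper and workload numbers are illustrative):

```python
# A handful of rows from the table above: (input $/M, output $/M).
MODELS = {
    "gpt-4o-mini":      (0.15, 0.60),
    "gemini-2-5-flash": (0.15, 0.60),
    "deepseek-chat":    (0.28, 1.10),
    "claude-3-5-haiku": (0.80, 4.00),
}

def workload_cost(model: str, requests: int, in_tok: int, out_tok: int) -> float:
    """Total USD for `requests` calls of in_tok/out_tok tokens each."""
    in_price, out_price = MODELS[model]
    return requests * (in_tok * in_price + out_tok * out_price) / 1_000_000

# Cheapest first for 100K requests of 2K input / 500 output tokens each.
costs = {m: workload_cost(m, 100_000, 2_000, 500) for m in MODELS}
for model, cost in sorted(costs.items(), key=lambda kv: kv[1]):
    print(f"{model:<18} ${cost:,.2f}")
```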
About LLMPrice.dev
LLMPrice.dev helps developers and businesses compare LLM API pricing and calculate costs across all major providers. Pricing data is updated daily for accuracy.
Features
- Real-time cost calculation with batch and caching options
- Side-by-side model comparison with sorting and filtering
- Support for OpenAI, Anthropic, Google, Mistral, and more
- Daily price updates via automated scraping
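Batch and caching options change the effective price: several providers advertise roughly 50% off for batch requests, and cached input tokens are billed at a reduced multiplier that varies by provider. A sketch of how such adjustments could be layered on the base cost (the default multipliers below are illustrative assumptions, not any provider's published rates):

```python
def effective_cost(in_tok: int, out_tok: int,
                   in_price_m: float, out_price_m: float,
                   cached_frac: float = 0.0,    # share of input tokens served from cache
                   cache_read_mult: float = 0.1,  # assumed cached-read price multiplier
                   batch: bool = False) -> float:
    """Per-request USD cost with optional cache-hit and batch adjustments.

    cache_read_mult and the 50% batch discount are illustrative defaults;
    real multipliers differ by provider and model.
    """
    cached = in_tok * cached_frac
    fresh = in_tok - cached
    cost = (fresh * in_price_m
            + cached * in_price_m * cache_read_mult
            + out_tok * out_price_m) / 1_000_000
    return cost * 0.5 if batch else cost
```

With no cache hits and no batching this reduces to the plain calculation, e.g. `effective_cost(1_000, 1_000, 2.50, 10.00)` gives $0.0125 per request.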
Coming Soon
- Usage analyzer - connect your API keys to track spending
- Cost optimization recommendations
- Budget alerts and forecasting
- Weekly price change newsletter