Provider Comparison

| Provider | Models | Best For |
| --- | --- | --- |
| OpenAI | gpt-4o, gpt-4o-mini, gpt-4-turbo | General purpose, cost-effective |
| Anthropic | claude-sonnet-4, claude-3-opus | Advanced reasoning, long context |
| Gemini | gemini-1.5-pro, gemini-pro | Multimodal, Google ecosystem |

Configuration

Environment:

```bash
export OPENAI_API_KEY=sk-...
```
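
Anthropic and Gemini need their own keys. The variable names below follow each vendor's usual convention; they are an assumption here, so confirm which names echo actually reads in your setup:

```bash
# Assumed variable names (vendor conventions, not confirmed echo settings).
export ANTHROPIC_API_KEY=sk-ant-...
export GEMINI_API_KEY=...
```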
Config:

```python
from echo import LLMConfig, get_llm

llm_config = LLMConfig(
    provider="openai",
    model="gpt-4o-mini",
    temperature=0.2,
    max_tokens=2000,
)

llm = get_llm(llm_config)
```
Supported OpenAI models: gpt-4o, gpt-4o-mini, gpt-4-turbo, gpt-3.5-turbo

Switching Providers

Change one line to switch providers:
```python
# OpenAI
llm_config = LLMConfig(provider="openai", model="gpt-4o-mini")

# Anthropic
llm_config = LLMConfig(provider="anthropic", model="claude-sonnet-4-20250514")

# Gemini
llm_config = LLMConfig(provider="gemini", model="gemini-1.5-pro")
```
Agent code remains the same.
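
If you want the provider to be a runtime setting rather than a code edit, you can build the config from environment variables. The sketch below uses only LLMConfig and get_llm as documented above; the ECHO_LLM_PROVIDER and ECHO_LLM_MODEL variable names and the default model map are illustrative assumptions, not part of echo's API.

```python
import os

from echo import LLMConfig, get_llm

# Illustrative per-provider defaults; adjust to the models your accounts can access.
DEFAULT_MODELS = {
    "openai": "gpt-4o-mini",
    "anthropic": "claude-sonnet-4-20250514",
    "gemini": "gemini-1.5-pro",
}

def config_from_env() -> LLMConfig:
    # ECHO_LLM_PROVIDER and ECHO_LLM_MODEL are hypothetical names used only
    # in this sketch; echo itself does not define or read them.
    provider = os.getenv("ECHO_LLM_PROVIDER", "openai")
    model = os.getenv("ECHO_LLM_MODEL", DEFAULT_MODELS[provider])
    return LLMConfig(provider=provider, model=model, temperature=0.2)

llm = get_llm(config_from_env())
```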

LLMConfig Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `provider` | str | Required | "openai", "anthropic", or "gemini" |
| `model` | str | Required | Model identifier |
| `temperature` | float | 0.7 | Randomness (0.0-1.0) |
| `max_tokens` | int | 4096 | Max response length |
| `max_iterations` | int | 5 | Max tool-use iterations |
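
For reference, a fully specified config looks like the following. Every keyword comes from the table above; the values themselves are only an example.

```python
from echo import LLMConfig, get_llm

# All LLMConfig parameters from the table, spelled out explicitly.
llm_config = LLMConfig(
    provider="anthropic",
    model="claude-sonnet-4-20250514",
    temperature=0.3,     # lower = more deterministic output
    max_tokens=1024,     # cap on response length
    max_iterations=3,    # cap on tool-use iterations per request
)

llm = get_llm(llm_config)
```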

Next Steps