API Keys Configuration
Task Master supports multiple AI providers through environment variables. This page lists all available API keys and their configuration requirements.
Required API Keys
Note: “Required: ✅ Yes” below means “required to use that specific provider,” not “required globally.” At least one provider must be configured for Task Master to function; you do not need all of them.
ANTHROPIC_API_KEY (Recommended)
- Provider: Anthropic Claude models
- Format: `sk-ant-api03-...`
- Required: ✅ Yes
- Models: Claude 3.5 Sonnet, Claude 3 Haiku, Claude 3 Opus
- Get Key: Anthropic Console
PERPLEXITY_API_KEY (Highly Recommended for Research)
- Provider: Perplexity AI (Research features)
- Format: `pplx-...`
- Required: ✅ Yes
- Purpose: Enables research-backed task expansions and updates
- Models: Perplexity Sonar models
- Get Key: Perplexity API
OPENAI_API_KEY
- Provider: OpenAI GPT models
- Format: `sk-proj-...` or `sk-...`
- Required: ✅ Yes
- Models: GPT-4, GPT-4 Turbo, GPT-3.5 Turbo, O1 models
- Get Key: OpenAI Platform
GOOGLE_API_KEY
- Provider: Google Gemini models
- Format: Various formats
- Required: ✅ Yes
- Models: Gemini Pro, Gemini Flash, Gemini Ultra
- Get Key: Google AI Studio
- Alternative: Use `GOOGLE_APPLICATION_CREDENTIALS` for a service account (Google Vertex)
GROQ_API_KEY
- Provider: Groq (High-performance inference)
- Required: ✅ Yes
- Models: Llama models, Mixtral models (via Groq)
- Get Key: Groq Console
OPENROUTER_API_KEY
- Provider: OpenRouter (Multiple model access)
- Required: ✅ Yes
- Models: Access to various models through single API
- Get Key: OpenRouter
AZURE_OPENAI_API_KEY
- Provider: Azure OpenAI Service
- Required: ✅ Yes
- Requirements: Also requires `AZURE_OPENAI_ENDPOINT` configuration
- Models: GPT models via Azure
- Get Key: Azure Portal
XAI_API_KEY
- Provider: xAI (Grok) models
- Required: ✅ Yes
- Models: Grok models
- Get Key: xAI Console
Optional API Keys
Note: These API keys are optional - providers will work without them or use alternative authentication methods.
AWS_ACCESS_KEY_ID (Bedrock)
- Provider: AWS Bedrock
- Required: ❌ No (uses AWS credential chain)
- Models: Claude models via AWS Bedrock
- Authentication: Uses AWS credential chain (profiles, IAM roles, etc.)
- Get Key: AWS Console
CLAUDE_CODE_API_KEY
- Provider: Claude Code CLI
- Required: ❌ No (uses OAuth tokens)
- Purpose: Integration with local Claude Code CLI
- Authentication: Uses OAuth tokens, no API key needed
GEMINI_API_KEY
- Provider: Gemini CLI
- Required: ❌ No (uses OAuth authentication)
- Purpose: Integration with Gemini CLI
- Authentication: Primarily uses OAuth via CLI, API key is optional
GROK_CLI_API_KEY
- Provider: Grok CLI
- Required: ❌ No (can use CLI config)
- Purpose: Integration with Grok CLI
- Authentication: Can use Grok CLI’s own config file
OLLAMA_API_KEY
- Provider: Ollama (Local/Remote)
- Required: ❌ No (local installation doesn’t need key)
- Purpose: For remote Ollama servers that require authentication
- Requirements: Only needed for remote servers with authentication
- Note: Not needed for local Ollama installations
GITHUB_API_KEY
- Provider: GitHub (Import/Export features)
- Format: `ghp_...` or `github_pat_...`
- Required: ❌ No (for GitHub features only)
- Purpose: GitHub import/export features
- Get Key: GitHub Settings
Configuration Methods
Method 1: Environment File (.env)
Create a `.env` file in your project root:
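A minimal `.env` sketch using the key names from this page (the values shown are placeholders, not real keys; include only the providers you actually use):

```
# .env — project root; keep this file out of version control
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
PERPLEXITY_API_KEY=pplx-your-key-here
OPENAI_API_KEY=sk-proj-your-key-here
```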
Method 2: System Environment Variables
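As a sketch, keys can be exported in your shell profile so they persist across sessions (placeholder values; set only the providers you use):

```shell
# Add to ~/.bashrc or ~/.zshrc, then restart the shell or `source` the file
export ANTHROPIC_API_KEY="sk-ant-api03-your-key-here"
export PERPLEXITY_API_KEY="pplx-your-key-here"
```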
Method 3: MCP Server Configuration
For Claude Code integration, configure keys in `.mcp.json`:
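A sketch of an `.mcp.json` entry, assuming the Task Master MCP server is launched via `npx` with the package name `task-master-ai` (verify the command and package name against the installation docs; values are placeholders):

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-api03-your-key-here",
        "PERPLEXITY_API_KEY": "pplx-your-key-here"
      }
    }
  }
}
```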
Key Requirements
Minimum Requirements
- At least one AI provider key is required
- ANTHROPIC_API_KEY is recommended as the primary provider
- PERPLEXITY_API_KEY is highly recommended for research features
Provider-Specific Requirements
- Azure OpenAI: Requires both `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` configuration
- Google Vertex: Requires `VERTEX_PROJECT_ID` and `VERTEX_LOCATION` environment variables
- AWS Bedrock: Uses AWS credential chain (profiles, IAM roles, etc.) instead of API keys
- Ollama: Only needs API key for remote servers with authentication
- CLI Providers: Gemini CLI, Grok CLI, and Claude Code use OAuth/CLI config instead of API keys
Model Configuration
After setting up API keys, configure which models to use.
Security Best Practices
- Never commit API keys to version control
- Use `.env` files and add them to `.gitignore`
- Rotate keys regularly, especially if compromised
- Use minimal permissions for service accounts
- Monitor usage to detect unauthorized access
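The `.env` and `.gitignore` advice above can be applied in one step (a sketch; run from the project root before your first commit):

```shell
# Create the secrets file and make sure version control ignores it
touch .env
echo ".env" >> .gitignore
```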
Troubleshooting
Key Validation
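As a quick sketch, a bash loop (using indirect expansion, so it is bash-specific) can report which of the keys listed on this page are set, without ever printing their values:

```shell
#!/usr/bin/env bash
# Report which provider keys are present in the current environment
for key in ANTHROPIC_API_KEY PERPLEXITY_API_KEY OPENAI_API_KEY GOOGLE_API_KEY; do
  if [ -n "${!key}" ]; then
    echo "$key: set"
  else
    echo "$key: missing"
  fi
done
```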
Common Issues
- Invalid key format: Check the expected format for each provider
- Insufficient permissions: Ensure keys have necessary API access
- Rate limits: Some providers have usage limits
- Regional restrictions: Some models may not be available in all regions
Getting Help
If you encounter issues with API key configuration:
- Check the FAQ for common solutions
- Join our Discord community for support
- Report issues on GitHub