If your version of Aider doesn't include built-in support for the Amazon Bedrock Nova models or Claude 3.7 Sonnet, you can add them manually with a model metadata configuration. Also see how to use Aider with Bedrock.
Feature | Nova Micro | Nova Lite | Nova Pro |
---|---|---|---|
Context Window | 128K tokens | 300K tokens | 300K tokens |
Best For | Quick tasks, shorter interactions | General purpose, good balance | Complex tasks, highest quality |
Strengths | Fast response time, lowest cost, good for simple tasks | Larger context window, better reasoning, good price/performance ratio | Best reasoning capabilities, most precise responses, advanced coding assistance |
Use Cases | Quick code reviews, simple queries, basic documentation | General development, documentation writing, code analysis | Complex system design, advanced problem solving, detailed code generation |
- Create a file named `.aider.model.metadata.json`. According to the aider documentation, place the file in one of these locations:
  - Your home directory
  - The root of your git repository
  - The current directory where you run aider
  - Or use the `--model-metadata-file <filename>` option when launching aider
- Add the following JSON configuration to `.aider.model.metadata.json`:
```json
{
  "bedrock/us.amazon.nova-micro-v1:0": {
    "max_tokens": 128000,
    "max_input_tokens": 128000,
    "max_output_tokens": 5000,
    "input_cost_per_token": 0.000000035,
    "output_cost_per_token": 0.00000014,
    "litellm_provider": "bedrock",
    "mode": "chat"
  },
  "bedrock/us.amazon.nova-lite-v1:0": {
    "max_tokens": 300000,
    "max_input_tokens": 300000,
    "max_output_tokens": 5000,
    "input_cost_per_token": 0.00000006,
    "output_cost_per_token": 0.00000024,
    "litellm_provider": "bedrock",
    "mode": "chat"
  },
  "bedrock/us.amazon.nova-pro-v1:0": {
    "max_tokens": 300000,
    "max_input_tokens": 300000,
    "max_output_tokens": 5000,
    "input_cost_per_token": 0.0000008,
    "output_cost_per_token": 0.0000032,
    "litellm_provider": "bedrock",
    "mode": "chat"
  },
  "bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0": {
    "max_tokens": 128000,
    "max_input_tokens": 128000,
    "max_output_tokens": 5000,
    "input_cost_per_token": 0.000003,
    "output_cost_per_token": 0.000015,
    "litellm_provider": "bedrock",
    "mode": "chat"
  }
}
```
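If you keep the metadata file outside of the default locations, you can point aider at it explicitly when launching. A minimal launch sketch is shown below; the model choices are just examples, so adjust them to whatever models you have enabled in your Bedrock account.

```bash
# Load the metadata file explicitly and select Nova Pro for coding,
# with Nova Micro as the weak model for commit messages and summaries
aider \
  --model-metadata-file .aider.model.metadata.json \
  --model bedrock/us.amazon.nova-pro-v1:0 \
  --weak-model bedrock/us.amazon.nova-micro-v1:0
```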
You can test one of the following combinations, depending on your needs, by setting aider's environment variables (for example in a `.env` file that aider reads, or exported in your shell):
```bash
# model for chat and summary
AIDER_WEAK_MODEL=bedrock/us.amazon.nova-micro-v1:0
#AIDER_WEAK_MODEL=bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0

# model for coding
AIDER_MODEL=bedrock/us.amazon.nova-pro-v1:0
#AIDER_MODEL=bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
```
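Aider reaches Bedrock through boto3, so your AWS credentials must be resolvable in the shell where you run it. The sketch below assumes you already have a working AWS CLI profile with access to these models; the profile name is a placeholder for your own setup.

```bash
# Placeholder profile: replace with your own Bedrock-enabled AWS profile
# (its default region must have the selected models enabled)
export AWS_PROFILE=my-bedrock-profile

# Export the model selection from above, then launch aider
export AIDER_MODEL=bedrock/us.amazon.nova-pro-v1:0
export AIDER_WEAK_MODEL=bedrock/us.amazon.nova-micro-v1:0
aider
```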