@leonjang88
Last active February 27, 2026 21:23
Claude API Usage Monitor for Home Assistant


Near-real-time tracking of Claude Code CLI token usage and costs in Home Assistant (refreshed every 5 minutes), with a custom dashboard and icon.

Requirements

  • Claude Pro/Max subscription
  • Claude Code CLI installed (npm install -g @anthropic-ai/claude-code)
  • ccusage tool (installed separately via npm; see step 2)
  • PostgreSQL (optional, for historical data)
  • Custom Brand Icons integration (HACS)

How It Works

  1. Claude Code CLI logs your token usage locally as you work
  2. ccusage reads those logs and applies model-specific pricing:
    • Opus: $15/$75 per M tokens (input/output)
    • Sonnet: $3/$15 per M tokens
    • Different cache rates per model
  3. collect_claude_tokens.py runs every 5 min via cron:
    • Calls npx ccusage daily --json --offline
    • Stores in PostgreSQL (optional)
    • Creates /config/scripts/claude_token_cache.json
  4. Home Assistant sensors read the cache JSON
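The pipeline above can be sketched end-to-end without the PostgreSQL layer — a minimal version of what collect_claude_tokens.py does (field names follow the ccusage `daily --json` output that script parses; `build_cache` and `main` are my own names, not part of this gist):

```python
import json
import subprocess
from datetime import date
from pathlib import Path

def build_cache(ccusage_json, cache_path):
    """Summarize ccusage `daily --json` output into the cache file HA reads."""
    today = date.today().isoformat()
    tokens_today = 0
    cost_today = 0.0
    for entry in ccusage_json.get("daily", []):
        if entry.get("date") != today:
            continue
        for mb in entry.get("modelBreakdowns", []):
            tokens_today += (mb.get("inputTokens", 0) + mb.get("outputTokens", 0)
                             + mb.get("cacheReadTokens", 0)
                             + mb.get("cacheCreationTokens", 0))
            cost_today += mb.get("cost", 0)  # ccusage supplies the computed cost
    Path(cache_path).write_text(json.dumps({
        "tokens_today": tokens_today,
        "cost_today": round(cost_today, 4),
    }))

def main():
    # Called from cron; --offline keeps ccusage from fetching live pricing data.
    raw = subprocess.run(["npx", "ccusage", "daily", "--json", "--offline"],
                         capture_output=True, text=True, check=True).stdout
    build_cache(json.loads(raw), "/config/scripts/claude_token_cache.json")
```

The full script below does the same, plus the PostgreSQL upsert and the 7-day/30-day rollups.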

Setup

1. Install Claude Code CLI

npm install -g @anthropic-ai/claude-code
claude  # first run prompts you to log in with your Pro/Max account

2. Install ccusage

npm install -g ccusage

3. Copy Scripts

Copy to your Home Assistant /config/scripts/ directory:

  • collect_claude_tokens.py - Main script (runs ccusage, creates cache)
  • claude_token_usage.py - Alternative: reads Claude's stats-cache.json directly
  • claude_token_usage_db.py - Alternative: reads from PostgreSQL

4. Install Custom Brand Icons (for Claude icon)

Install via HACS: https://github.com/elax46/custom-brand-icons

The Claude icon (pezbox:clawd) is pending in PR #1114: elax46/custom-brand-icons#1114

Once merged, the icon will be available. In the meantime, you can use the included pezbox-icons.js file.

5. Set Up Cron Job

*/5 * * * * python3 /config/scripts/collect_claude_tokens.py

6. Configure PostgreSQL (Optional)

If using the database features, update DB credentials in collect_claude_tokens.py. Otherwise, comment out the DB sections and just use the cache file generation.
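None of the scripts create the table themselves, so the DB path needs a token_daily table matching the INSERT in collect_claude_tokens.py. A plausible schema (column names come from that script; the types and the primary key required by its ON CONFLICT (date, model) clause are my inference):

```sql
CREATE TABLE IF NOT EXISTS token_daily (
    date                  date          NOT NULL,
    model                 text          NOT NULL,
    input_tokens          bigint        NOT NULL DEFAULT 0,
    output_tokens         bigint        NOT NULL DEFAULT 0,
    cache_read_tokens     bigint        NOT NULL DEFAULT 0,
    cache_creation_tokens bigint        NOT NULL DEFAULT 0,
    total_tokens          bigint        NOT NULL DEFAULT 0,
    cost_usd              numeric(12,4) NOT NULL DEFAULT 0,
    updated_at            timestamptz   NOT NULL DEFAULT NOW(),
    PRIMARY KEY (date, model)  -- ON CONFLICT (date, model) needs this
);
```

Note that claude_token_usage_db.py additionally queries a model_snapshot table with the same token columns.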

7. Add HA Sensors

Add the configuration from claude_sensors.yaml to your configuration.yaml.

8. Add Dashboard

Option A: Separate Dashboard (Recommended)

  • Go to Settings → Dashboards → Add Dashboard
  • Name it "Claude"
  • Select "Configure UI" → Raw Configuration Editor
  • Paste contents of lovelace_claude_dashboard.yaml

Option B: Add to Existing Dashboard

  • Edit your existing dashboard
  • Add the cards from lovelace_claude_dashboard.yaml

Files Explained

File                             Purpose                                           Required?
collect_claude_tokens.py         Main collector - runs ccusage, creates cache      Yes
claude_sensors.yaml              HA sensor definitions                             Yes
lovelace_claude_dashboard.yaml   Dashboard configuration                           Yes
pezbox-icons.js                  Temporary Claude icon (until PR #1114 is merged)  Temporary
claude_token_usage.py            Alternative: reads stats-cache.json               No
claude_token_usage_db.py         Alternative: reads from PostgreSQL                No

Troubleshooting

  • No data showing? Check if ccusage works: npx ccusage daily --json
  • Cache file missing? Check cron logs, ensure collect_claude_tokens.py runs
  • Icon not showing? Ensure Custom Brand Icons is installed; until PR #1114 is merged, load the bundled pezbox-icons.js
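A quick way to confirm the cron job is actually refreshing the cache — this small check is my own addition (the path and the tokens_today key are the ones used throughout this gist):

```python
import json
import time
from pathlib import Path

def cache_status(path, now=None, max_age=600):
    """Return a short status string for the HA cache file.

    max_age of 600 s = two missed 5-minute cron runs.
    """
    p = Path(path)
    if not p.exists():
        return "missing - has collect_claude_tokens.py run yet?"
    now = time.time() if now is None else now
    age = now - p.stat().st_mtime
    if age > max_age:
        return f"stale ({age / 60:.0f} min old) - check cron logs"
    data = json.loads(p.read_text())
    return f"ok: {data.get('tokens_today', 0):,} tokens today"

print(cache_status("/config/scripts/claude_token_cache.json"))
```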
# Claude Token Usage Sensors for Home Assistant
# Add these to your configuration.yaml under the command_line: section
#
# Requirements:
# 1. Claude Code CLI installed and authenticated
# 2. Python scripts in /config/scripts/
# 3. Cron job running collect_claude_tokens.py every 5 minutes
# NO API KEY NEEDED - reads from cache created by collect_claude_tokens.py
command_line:
  - sensor:
      name: "Claude Tokens Today"
      unique_id: claude_tokens_today
      command: "python3 -c \"import json; v=json.load(open('/config/scripts/claude_token_cache.json'))['tokens_today']; print(f'{v/1e6:.1f}M' if v>=1e6 else f'{v/1e3:.0f}K')\""
      scan_interval: 300
      icon: mdi:hand-coin
  - sensor:
      name: "Claude Cost Today"
      unique_id: claude_cost_today
      command: "python3 -c \"import json; print('$'+str(round(json.load(open('/config/scripts/claude_token_cache.json'))['cost_today'],2)))\""
      scan_interval: 300
      icon: mdi:currency-usd
  - sensor:
      name: "Claude Tokens 7d"
      unique_id: claude_tokens_7d
      command: "python3 -c \"import json; v=json.load(open('/config/scripts/claude_token_cache.json'))['tokens_last_7_days']; print(f'{v/1e6:.1f}M' if v>=1e6 else f'{v/1e3:.0f}K')\""
      scan_interval: 300
      icon: mdi:hand-coin
  - sensor:
      name: "Claude Cost 7d"
      unique_id: claude_cost_7d
      command: "python3 -c \"import json; print('$'+str(round(json.load(open('/config/scripts/claude_token_cache.json'))['cost_last_7_days'],2)))\""
      scan_interval: 300
      icon: mdi:currency-usd
  - sensor:
      name: "Claude Tokens Month"
      unique_id: claude_tokens_month
      command: "python3 -c \"import json; v=json.load(open('/config/scripts/claude_token_cache.json'))['tokens_last_30_days']; print(f'{v/1e6:.1f}M' if v>=1e6 else f'{v/1e3:.0f}K')\""
      scan_interval: 300
      icon: mdi:hand-coin
  - sensor:
      name: "Claude Cost Month"
      unique_id: claude_cost_month
      command: "python3 -c \"import json; print('$'+str(round(json.load(open('/config/scripts/claude_token_cache.json'))['cost_last_30_days'],2)))\""
      scan_interval: 300
      icon: mdi:currency-usd
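If the escaped quoting in those inline `python3 -c` commands gets unwieldy, the same logic can live in a small helper script that takes the cache key as an argument. The script name and path below are my invention, not part of this gist; the formatting mirrors the inline commands above:

```python
#!/usr/bin/env python3
# Hypothetical helper: /config/scripts/claude_sensor.py
# Sensor usage:  command: "python3 /config/scripts/claude_sensor.py tokens_today"
import json
import sys

CACHE = "/config/scripts/claude_token_cache.json"

def render(key, value):
    # Cost keys render as dollars; token keys as 1.2M / 345K,
    # matching the inline python3 -c commands.
    if key.startswith("cost"):
        return f"${round(float(value), 2)}"
    v = float(value)
    return f"{v / 1e6:.1f}M" if v >= 1e6 else f"{v / 1e3:.0f}K"

if __name__ == "__main__":
    key = sys.argv[1] if len(sys.argv) > 1 else "tokens_today"
    try:
        with open(CACHE) as f:
            data = json.load(f)
        print(render(key, data[key]))
    except FileNotFoundError:
        print("unknown")  # cache not built yet
```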
#!/usr/bin/env python3
"""
Claude Token Usage Parser for Home Assistant
Reads ~/.claude/stats-cache.json and outputs token usage data
"""
import json
import sys
from datetime import datetime
from pathlib import Path


def get_token_usage(date=None):
    """Get token usage for a specific date or today"""
    stats_file = Path.home() / '.claude' / 'stats-cache.json'

    if not stats_file.exists():
        print(json.dumps({"error": "stats-cache.json not found"}))
        sys.exit(1)

    try:
        with open(stats_file, 'r') as f:
            data = json.load(f)
    except Exception as e:
        print(json.dumps({"error": f"Failed to read stats: {str(e)}"}))
        sys.exit(1)

    # Get today's date if not specified
    if date is None:
        date = datetime.now().strftime("%Y-%m-%d")

    # Find today's usage in dailyModelTokens
    daily_tokens = data.get('dailyModelTokens', [])
    today_usage = None
    for entry in daily_tokens:
        if entry.get('date') == date:
            today_usage = entry
            break

    # Calculate total tokens for today
    total_today = 0
    if today_usage:
        for model, tokens in today_usage.get('tokensByModel', {}).items():
            total_today += tokens

    # Get overall usage stats
    model_usage = data.get('modelUsage', {})
    total_all_time = {
        'input_tokens': 0,
        'output_tokens': 0,
        'cache_read_tokens': 0,
        'cache_creation_tokens': 0,
        'total_tokens': 0
    }
    for model, stats in model_usage.items():
        total_all_time['input_tokens'] += stats.get('inputTokens', 0)
        total_all_time['output_tokens'] += stats.get('outputTokens', 0)
        total_all_time['cache_read_tokens'] += stats.get('cacheReadInputTokens', 0)
        total_all_time['cache_creation_tokens'] += stats.get('cacheCreationInputTokens', 0)
    total_all_time['total_tokens'] = (
        total_all_time['input_tokens'] +
        total_all_time['output_tokens'] +
        total_all_time['cache_read_tokens'] +
        total_all_time['cache_creation_tokens']
    )

    # Calculate last 7 days total
    last_7_days = 0
    for entry in daily_tokens[-7:]:
        for model, tokens in entry.get('tokensByModel', {}).items():
            last_7_days += tokens

    # Output JSON for Home Assistant
    result = {
        'date': date,
        'tokens_today': total_today,
        'tokens_last_7_days': last_7_days,
        'total_input_tokens': total_all_time['input_tokens'],
        'total_output_tokens': total_all_time['output_tokens'],
        'total_cache_read_tokens': total_all_time['cache_read_tokens'],
        'total_cache_creation_tokens': total_all_time['cache_creation_tokens'],
        'total_all_time_tokens': total_all_time['total_tokens'],
        'total_sessions': data.get('totalSessions', 0),
        'total_messages': data.get('totalMessages', 0),
        'last_updated': data.get('lastComputedDate', 'unknown')
    }
    print(json.dumps(result))


if __name__ == '__main__':
    # Check if date argument provided
    date_arg = sys.argv[1] if len(sys.argv) > 1 else None
    get_token_usage(date_arg)
#!/usr/bin/env python3
"""
Claude Token Usage Reader for Home Assistant
Queries claude_analytics DB and outputs JSON for command_line sensor.
"""
import json
import sys

import psycopg2
from psycopg2.extras import RealDictCursor

DB_CONFIG = {
    'host': 'YOUR_DB_HOST',
    'port': 5432,
    'database': 'claude_analytics',
    'user': 'YOUR_DB_USER',
    'password': 'YOUR_DB_PASSWORD',
}


def get_token_usage():
    try:
        conn = psycopg2.connect(**DB_CONFIG, cursor_factory=RealDictCursor)
        cur = conn.cursor()

        cur.execute("""
            SELECT COALESCE(SUM(total_tokens), 0) AS tokens_today
            FROM token_daily WHERE date = CURRENT_DATE
        """)
        tokens_today = int(cur.fetchone()['tokens_today'])

        cur.execute("""
            SELECT COALESCE(SUM(total_tokens), 0) AS tokens_7d
            FROM token_daily WHERE date >= CURRENT_DATE - INTERVAL '6 days'
        """)
        tokens_7d = int(cur.fetchone()['tokens_7d'])

        cur.execute("""
            SELECT COALESCE(SUM(total_tokens), 0) AS tokens_30d
            FROM token_daily WHERE date >= CURRENT_DATE - INTERVAL '29 days'
        """)
        tokens_30d = int(cur.fetchone()['tokens_30d'])

        cur.execute("""
            SELECT
                COALESCE(SUM(input_tokens), 0) AS input_tokens,
                COALESCE(SUM(output_tokens), 0) AS output_tokens,
                COALESCE(SUM(cache_read_tokens), 0) AS cache_read_tokens,
                COALESCE(SUM(cache_creation_tokens), 0) AS cache_creation_tokens
            FROM model_snapshot
        """)
        totals = cur.fetchone()

        cur.close()
        conn.close()

        result = {
            'tokens_today': tokens_today,
            'tokens_last_7_days': tokens_7d,
            'tokens_last_30_days': tokens_30d,
            'total_input_tokens': int(totals['input_tokens']),
            'total_output_tokens': int(totals['output_tokens']),
            'total_cache_read_tokens': int(totals['cache_read_tokens']),
            'total_cache_creation_tokens': int(totals['cache_creation_tokens']),
            'total_all_time_tokens': int(
                totals['input_tokens'] + totals['output_tokens'] +
                totals['cache_read_tokens'] + totals['cache_creation_tokens']
            ),
        }
        print(json.dumps(result))
    except Exception as e:
        print(json.dumps({"error": str(e)}))
        sys.exit(1)


if __name__ == '__main__':
    get_token_usage()
#!/usr/bin/env python3
"""
Claude Token Usage Collector
Uses ccusage to read live token data from Claude Code conversation files.
Run via cron every 5 minutes.
"""
import json
import subprocess
from datetime import datetime
from pathlib import Path

import psycopg2

DB_CONFIG = {
    'host': 'YOUR_DB_HOST',
    'port': 5432,
    'database': 'claude_analytics',
    'user': 'YOUR_DB_USER',
    'password': 'YOUR_DB_PASSWORD',
}


def collect():
    # check=True makes a failed ccusage run exit loudly in cron instead of
    # crashing on empty stdout below
    result = subprocess.run(
        ['/usr/local/bin/npx', 'ccusage', 'daily', '--json', '--offline'],
        capture_output=True, text=True, check=True
    )
    data = json.loads(result.stdout)

    conn = psycopg2.connect(**DB_CONFIG)
    cur = conn.cursor()
    for entry in data.get('daily', []):
        date = entry['date']
        for mb in entry.get('modelBreakdowns', []):
            total = (
                mb.get('inputTokens', 0) +
                mb.get('outputTokens', 0) +
                mb.get('cacheReadTokens', 0) +
                mb.get('cacheCreationTokens', 0)
            )
            cur.execute("""
                INSERT INTO token_daily
                    (date, model, input_tokens, output_tokens, cache_read_tokens,
                     cache_creation_tokens, total_tokens, cost_usd, updated_at)
                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, NOW())
                ON CONFLICT (date, model) DO UPDATE
                SET input_tokens = EXCLUDED.input_tokens,
                    output_tokens = EXCLUDED.output_tokens,
                    cache_read_tokens = EXCLUDED.cache_read_tokens,
                    cache_creation_tokens = EXCLUDED.cache_creation_tokens,
                    total_tokens = EXCLUDED.total_tokens,
                    cost_usd = EXCLUDED.cost_usd,
                    updated_at = NOW()
            """, (
                date, mb['modelName'],
                mb.get('inputTokens', 0),
                mb.get('outputTokens', 0),
                mb.get('cacheReadTokens', 0),
                mb.get('cacheCreationTokens', 0),
                total,
                mb.get('cost', 0),  # ccusage provides the calculated cost!
            ))
    conn.commit()

    # Build cache file for HA
    def q(sql):
        cur.execute(sql)
        return cur.fetchone()

    t = q("SELECT COALESCE(SUM(total_tokens),0), COALESCE(SUM(cost_usd),0) FROM token_daily WHERE date = CURRENT_DATE")
    w = q("SELECT COALESCE(SUM(total_tokens),0), COALESCE(SUM(cost_usd),0) FROM token_daily WHERE date >= CURRENT_DATE - INTERVAL '6 days'")
    m = q("SELECT COALESCE(SUM(total_tokens),0), COALESCE(SUM(cost_usd),0) FROM token_daily WHERE date >= CURRENT_DATE - INTERVAL '29 days'")
    a = q("SELECT COALESCE(SUM(input_tokens),0), COALESCE(SUM(output_tokens),0), COALESCE(SUM(cache_read_tokens),0), COALESCE(SUM(cache_creation_tokens),0), COALESCE(SUM(total_tokens),0), COALESCE(SUM(cost_usd),0) FROM token_daily")
    cur.close()
    conn.close()

    cache = {
        'tokens_today': int(t[0]),
        'cost_today': round(float(t[1]), 4),
        'tokens_last_7_days': int(w[0]),
        'cost_last_7_days': round(float(w[1]), 4),
        'tokens_last_30_days': int(m[0]),
        'cost_last_30_days': round(float(m[1]), 4),
        'total_input_tokens': int(a[0]),
        'total_output_tokens': int(a[1]),
        'total_cache_read': int(a[2]),
        'total_cache_creation': int(a[3]),
        'total_all_time_tokens': int(a[4]),
        'total_cost_all_time': round(float(a[5]), 4),
    }
    cache_file = Path(__file__).parent / 'claude_token_cache.json'
    with open(cache_file, 'w') as f:
        json.dump(cache, f)
    print(f"[{datetime.now():%Y-%m-%d %H:%M:%S}] {int(t[0]):,} tokens today (${float(t[1]):.4f})")


if __name__ == '__main__':
    collect()
# Claude Token Usage Dashboard Card
# Copy this to your Lovelace dashboard by going to Settings > Dashboards > Edit Dashboard
# Add a new card and paste this YAML in manual mode
# Claude Token Usage Dashboard Card
# Copy this to your Lovelace dashboard by going to Settings > Dashboards > Edit Dashboard
# Add a new card and paste this YAML in manual mode
# Note: the resources: block belongs in the dashboard's raw configuration,
# not inside a card
resources:
  - url: /local/icons/pezbox-icons.js
    type: module

type: vertical-stack
cards:
  # Summary Stats Card
  - type: entities
    title: 🧠 Claude Token Usage Summary
    show_header_toggle: false
    entities:
      - entity: sensor.claude_token_usage
        name: Today's Tokens
        icon: pezbox:claude-pixel
      - entity: sensor.claude_tokens_last_7_days
        name: Last 7 Days
        icon: pezbox:claude-pixel
      - entity: sensor.claude_total_all_time
        name: All Time Total
        icon: pezbox:claude-pixel
      - type: divider
      - entity: sensor.claude_input_tokens
        name: Total Input Tokens
        icon: mdi:upload
      - entity: sensor.claude_output_tokens
        name: Total Output Tokens
        icon: mdi:download
      - entity: sensor.claude_cache_tokens
        name: Total Cache Tokens
        icon: mdi:database

  # Statistics Card
  - type: statistic
    entity: sensor.claude_token_usage
    period:
      calendar:
        period: week
    stat_type: mean
    name: Average Daily Tokens (This Week)

  # Mini Graph Card (requires custom:mini-graph-card from HACS)
  # If you don't have it installed, comment out or remove this section
  - type: custom:mini-graph-card
    name: Claude Token Usage Trend
    entities:
      - entity: sensor.claude_token_usage
        name: Daily Tokens
    hours_to_show: 168  # 7 days
    points_per_hour: 0.042  # Once per day
    line_width: 3
    animate: true
    show:
      graph: line
      fill: fade
      icon: true
      name: true
      state: true
      legend: true

  # Alternative: Standard History Graph (works without HACS)
  - type: history-graph
    title: Claude Token Usage (7 Days)
    entities:
      - entity: sensor.claude_token_usage
        name: Daily Tokens
    hours_to_show: 168

  # Gauge Card for Today
  - type: gauge
    entity: sensor.claude_token_usage
    name: Today's Token Usage
    min: 0
    max: 100000
    severity:
      green: 0
      yellow: 50000
      red: 80000
    needle: true
const pezboxIcons = {
  "claude-pixel": [0, 0, 24, 24, "M12 2L13.5 8.5L20 7L15 12L20 17L13.5 15.5L12 22L10.5 15.5L4 17L9 12L4 7L10.5 8.5Z"],
};

async function getIcon(name) {
  if (!(name in pezboxIcons)) {
    console.log(`Icon "pezbox:${name}" not available`);
    return "";
  }
  const svgDef = pezboxIcons[name];
  return {
    path: svgDef[4],
    viewBox: svgDef[0] + " " + svgDef[1] + " " + svgDef[2] + " " + svgDef[3],
  };
}

async function getIconList() {
  return Object.entries(pezboxIcons).map(([icon]) => ({ name: icon }));
}

window.customIconsets = window.customIconsets || {};
window.customIconsets["pezbox"] = getIcon;
window.customIcons = window.customIcons || {};
window.customIcons["pezbox"] = { getIcon, getIconList };