Universal AI context generator. Saves thousands of tokens per conversation in Claude Code, Cursor, Copilot, Codex, and more.
+3 stars in 24h | 0 in 7d
0 in 24h | 0 sources
0/5 channels firing
Each channel contributes 0 to 1. Per-channel tiers:
* GitHub: breakout 1.0 / hot 0.7 / rising 0.4
* HN: front-page 1.0 / ≥3 mentions 0.7 / 1-2 mentions 0.4
* Bluesky: ≥5 mentions 1.0 / 2-4 mentions 0.7 / 1 mention 0.4
* dev.to: ≥3 articles 1.0 / 2 articles 0.7 / 1 article 0.4
* Reddit: corpus-normalized 48h velocity
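The tier legend above can be sketched as a scoring function. This is an illustrative reconstruction, not the dashboard's actual code: all function and tier names are assumptions, and the Reddit term is assumed to arrive already normalized into [0, 1].

```python
# Hypothetical sketch of the per-channel 0-1 scoring described in the legend.
# Names and signatures are assumptions for illustration only.

GITHUB_TIERS = {"breakout": 1.0, "hot": 0.7, "rising": 0.4}

def hn_score(mentions: int, front_page: bool) -> float:
    """HN: front-page 1.0 / >=3 mentions 0.7 / 1-2 mentions 0.4."""
    if front_page:
        return 1.0
    if mentions >= 3:
        return 0.7
    return 0.4 if mentions >= 1 else 0.0

def bluesky_score(mentions: int) -> float:
    """Bluesky: >=5 mentions 1.0 / 2-4 mentions 0.7 / 1 mention 0.4."""
    if mentions >= 5:
        return 1.0
    if mentions >= 2:
        return 0.7
    return 0.4 if mentions == 1 else 0.0

def devto_score(articles: int) -> float:
    """dev.to: >=3 articles 1.0 / 2 articles 0.7 / 1 article 0.4."""
    if articles >= 3:
        return 1.0
    if articles == 2:
        return 0.7
    return 0.4 if articles == 1 else 0.0

def total_score(github_tier: str, hn_mentions: int, hn_front: bool,
                bsky_mentions: int, devto_articles: int,
                reddit_velocity: float) -> float:
    # reddit_velocity is assumed to be the corpus-normalized 48h value,
    # clamped here to the channel's 0-1 contribution range.
    return (GITHUB_TIERS.get(github_tier, 0.0)
            + hn_score(hn_mentions, hn_front)
            + bluesky_score(bsky_mentions)
            + devto_score(devto_articles)
            + min(max(reddit_velocity, 0.0), 1.0))
```

Under this reading, a repo that is "hot" on GitHub (0.7), has 2 HN mentions (0.4), 1 Bluesky mention (0.4), no dev.to articles, and 0.5 normalized Reddit velocity would score 2.0 out of a maximum 5.0.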
No mentions on this channel in the last 7 days.
// QUIET HERE DOESN'T MEAN THE REPO IS DEAD — CHECK OTHER TABS
Code Editor for the AI Agents Era - Run an army of Claude Code, Codex, etc. on your machine
CLI proxy that reduces LLM token consumption by 60-90% on common dev commands. Single Rust binary with zero dependencies.
no linked package yet
last commit 2d ago
* The Reddit bar shows a per-repo velocity proxy (raw score / 100). The score formula uses the corpus-normalized value instead, so a single repo's bar may not match its contribution to the corpus-wide ranking.
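The distinction in the footnote above can be made concrete with a minimal sketch. Both function names and the divisor-of-100 cap behavior are assumptions; only the "raw / 100 for display, corpus-normalized for scoring" split comes from the note itself.

```python
# Hypothetical illustration of why the displayed Reddit bar can differ
# from the repo's contribution to the corpus-wide ranking.

def reddit_bar(raw_score: float) -> float:
    # Displayed bar: raw per-repo velocity proxy, raw score / 100
    # (clamping to 1.0 is an assumption for a 0-1 bar widget).
    return min(max(raw_score / 100.0, 0.0), 1.0)

def reddit_contribution(raw_score: float, corpus_max: float) -> float:
    # Score formula: corpus-normalized velocity (assumed max-normalization).
    return raw_score / corpus_max if corpus_max > 0 else 0.0
```

For example, a raw score of 50 draws a half-full bar, but if the hottest repo in the corpus scored 200, the same repo contributes only 0.25 to the ranking.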
See where your AI coding tokens go. Interactive TUI dashboard for Claude Code, Codex, and Cursor cost observability.
// KNOWN REPO · PACKAGE · LAUNCH · SITE SURFACES