🪨 why use many token when few token do trick – Claude Code skill that cuts 65% of tokens by talking like caveman
+127 stars 24h | +1.3k 7d
0 in 24h | 0 sources
1/5 channels firing
Each channel contributes 0-1. Per-channel tiers: GitHub (breakout 1.0 / hot 0.7 / rising 0.4), HN (front-page 1.0 / ≥3 mentions 0.7 / 1-2 mentions 0.4), Bluesky (≥5 mentions 1.0 / 2-4 0.7 / 1 0.4), dev.to (≥3 articles 1.0 / 2 0.7 / 1 0.4), Reddit (corpus-normalized 48h velocity).
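The tier rules above can be sketched as threshold tables. This is a minimal illustration, not the tracker's actual code: the tier thresholds for HN, Bluesky, and dev.to come from the text, while the function names and the treatment of "front-page" as a mention count are assumptions (GitHub's breakout/hot/rising and Reddit's normalized velocity are omitted because their inputs aren't numeric counts here).

```python
def channel_tier(value, tiers):
    """Map a raw per-channel count to its 0-1 tier contribution.

    tiers: (threshold, score) pairs, checked from highest threshold down.
    """
    for threshold, score in tiers:
        if value >= threshold:
            return score
    return 0.0

# Tier tables taken from the per-channel rules above.
BLUESKY_TIERS = [(5, 1.0), (2, 0.7), (1, 0.4)]  # ≥5 / 2-4 / 1 mentions
DEVTO_TIERS = [(3, 1.0), (2, 0.7), (1, 0.4)]    # ≥3 / 2 / 1 articles
HN_TIERS = [(3, 0.7), (1, 0.4)]                  # ≥3 / 1-2 mentions (front-page tier handled separately)

def composite_score(hn_mentions, bluesky_mentions, devto_articles):
    # Each channel caps at 1.0; the sum is the "channels firing" signal.
    return (channel_tier(hn_mentions, HN_TIERS)
            + channel_tier(bluesky_mentions, BLUESKY_TIERS)
            + channel_tier(devto_articles, DEVTO_TIERS))
```

With one Bluesky mention and nothing else, only that channel fires at its lowest tier, giving a composite of 0.4, which matches the "1/5 channels firing" style of readout.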
No mentions on this channel in the last 7 days.
// quiet here doesn't mean the repo is dead – check the other tabs
The agent that grows with you
Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
ARIS (Auto-Research-In-Sleep) – Lightweight Markdown-only skills for autonomous ML research: cross-model review loops, idea discovery, and experiment automation. No framework, no lock-in – works with Claude Code, Codex, OpenClaw, or any LLM agent.
Ranked confirmation layer for repo-specific X buzz in the last 24h.
no linked package yet
last commit 6d ago
* Reddit bar shows a per-repo velocity proxy (raw score / 100); the score formula uses the corpus-normalized version, so a single repo's bar may not match its contribution to the corpus-wide ranking.
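The note above distinguishes two Reddit numbers: the displayed bar (raw score / 100) and the score-formula input (corpus-normalized). A minimal sketch of how the two can diverge; the max-based normalization is an assumption, since the exact normalization method isn't specified:

```python
def reddit_bar(raw_score):
    # Displayed per-repo bar: raw 48h velocity scaled by 100, capped at full.
    return min(raw_score / 100.0, 1.0)

def reddit_channel_contribution(raw_score, corpus_scores):
    # Assumed normalization: scale against the hottest repo in the
    # corpus for the same window (raises if corpus_scores is empty).
    return raw_score / max(corpus_scores)
```

A repo with raw score 50 shows a half-full bar, but if the hottest repo in the corpus scored 200, its contribution to the ranking is only 0.25, which is why the bar and the ranking can disagree.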
Contains the exact GitHub repo slug.
Known repo, package, launch, and site surfaces.
Issues flagged per surface: 1, 2, 1, 3, 4, 7, 1.
Technical signal board

| Signal | Status |
| --- | --- |
| Reachable | no |
| JSON-LD | 0 |
| Meta tags | set |
| AI discovery | no |
| Agent access | set |
| Agent readiness | 3/10 (scanned 1d ago) |