Single-channel signal so far. GitHub picked up +17 stars in 24h and Bluesky buzz (5 posts / 7d). Reddit, Dev.to and X are still cold — typical for a niche project at this stage.
Added 1,058 stars over the past week, climbing the C leaderboard with a steady 7-day curve.
+17 stars 24h | +1.1k 7d
0 in 24h | 0 sources
1/6 channels firing
no linked package yet
last commit 1h ago
Each channel contributes 0-1. Per-channel tiers: GitHub (breakout 1.0 / hot 0.7 / rising 0.4), HN (front-page 1.0 / ≥3 mentions 0.7 / 1-2 mentions 0.4), Bluesky (≥5 mentions 1.0 / 2-4 0.7 / 1 0.4), dev.to (≥3 articles 1.0 / 2 0.7 / 1 0.4), Reddit (corpus-normalized 48h velocity), X (≥10 mentions 24h 1.0 / 3-9 0.7 / 1-2 0.4).
* Reddit bar shows a per-repo velocity proxy (raw score / 100); the score formula uses the corpus-normalized version so a single repo's bar may not match its contribution to the corpus-wide ranking.
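The tier legend above can be sketched as a scoring function. This is an illustrative reconstruction, not the tracker's actual code: function and field names (`tier`, `mentions_7d`, `norm_velocity`, …) are hypothetical, and only the thresholds come from the legend.

```python
# Sketch of the per-channel scoring legend above.
# Field names are assumptions; tier thresholds follow the legend.

def channel_score(channel: str, stats: dict) -> float:
    """Each channel contributes 0-1 based on its tier."""
    if channel == "github":
        tier = stats.get("tier")  # assumed precomputed: breakout / hot / rising
        return {"breakout": 1.0, "hot": 0.7, "rising": 0.4}.get(tier, 0.0)
    if channel == "hn":
        if stats.get("front_page"):
            return 1.0
        m = stats.get("mentions", 0)
        return 0.7 if m >= 3 else 0.4 if m >= 1 else 0.0
    if channel == "bluesky":
        m = stats.get("mentions_7d", 0)
        return 1.0 if m >= 5 else 0.7 if m >= 2 else 0.4 if m == 1 else 0.0
    if channel == "devto":
        a = stats.get("articles", 0)
        return 1.0 if a >= 3 else 0.7 if a == 2 else 0.4 if a == 1 else 0.0
    if channel == "x":
        m = stats.get("mentions_24h", 0)
        return 1.0 if m >= 10 else 0.7 if m >= 3 else 0.4 if m >= 1 else 0.0
    if channel == "reddit":
        # corpus-normalized 48h velocity, assumed clamped to [0, 1]
        return min(max(stats.get("norm_velocity", 0.0), 0.0), 1.0)
    return 0.0
```

Summing `channel_score` across the six channels would give the 0-6 corpus-wide ranking score the footnote refers to.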
// KNOWN REPO · PACKAGE · LAUNCH · SITE SURFACES
No mentions on this channel in the last 7 days.
// QUIET HERE DOESN'T MEAN THE REPO IS DEAD — CHECK OTHER TABS
Computer programmer based in Sicily, Italy. I mostly write OSS software. Born 1977. Not a puritan.
~95% on SimpleQA (e.g. Qwen3.6-27B on a 3090). Supports all local and cloud LLMs (llama.cpp, Ollama, Google, ...). 10+ search engines - arXiv, PubMed, your private documents. Everything Local & Encrypted.
Open-source Claude Design alternative. One-click import of your Claude Code / Codex API key. Prompt → prototype / slides / PDF. Multi-model (Claude, GPT, Gemini, Kimi, GLM, Ollama). BYOK, local-first, MIT.
Hundreds of models & providers. One command to find what runs on your hardware.
The fastest local AI engine for Apple Silicon. 4.2x faster than Ollama, 0.08s cached TTFT, 100% tool calling. 17 tool parsers, prompt cache, reasoning separation, cloud routing. Drop-in OpenAI replacement. Works with Claude Code, Cursor, Aider.
Privacy-first AI meeting assistant with 4x faster Parakeet/Whisper live transcription, speaker diarization, and Ollama summarization, built in Rust. 100% local processing, no cloud required. Meetily (Meetly Ai - https://meetily.ai) is the #1 self-hosted, open-source AI meeting note taker for macOS & Windows.
A local proxy that connects the latest OpenAI Codex CLI / Codex desktop app to mainstream LLMs. Ships with Xiaomi MiMo V2.5/DeepSeek V4 Pro built in, plus a generic provider mechanism for hooking either **OpenAI Chat Completions-compatible** upstreams (Qwen / GLM / Kimi / local vLLM / Ollama / LM Studio …) or the **native Responses API** (OpenAI's own) into the new Codex. Translates Codex's Responses API to the upstream's Chat Completions API in real time, auto-routing between providers based on the `model` field sent by the client.
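The Responses-to-Chat-Completions translation and model-based routing described above could look roughly like this. A minimal sketch: the `PROVIDERS` routing table and its URLs are illustrative assumptions, and only the common request fields (`model`, `instructions`, `input` / `messages`) are handled.

```python
# Illustrative sketch of Responses API -> Chat Completions translation
# with routing on the client's `model` field. PROVIDERS is hypothetical.

PROVIDERS = {
    "qwen": "http://localhost:8000/v1",             # e.g. local vLLM
    "glm":  "https://open.bigmodel.cn/api/paas/v4",
    "kimi": "https://api.moonshot.cn/v1",
}

def to_chat_completions(responses_req: dict) -> tuple[str, dict]:
    """Map a Responses API request body to a Chat Completions body,
    choosing the upstream from the request's `model` field."""
    model = responses_req["model"]
    base_url = next(
        (url for prefix, url in PROVIDERS.items() if model.startswith(prefix)),
        "https://api.openai.com/v1",  # fallback: OpenAI's own upstream
    )
    messages = []
    if responses_req.get("instructions"):
        # Responses `instructions` becomes the system message
        messages.append({"role": "system",
                         "content": responses_req["instructions"]})
    inp = responses_req.get("input", [])
    if isinstance(inp, str):
        # Responses allows a bare string as input
        messages.append({"role": "user", "content": inp})
    else:
        # or a list of role/content message items
        for item in inp:
            messages.append({"role": item["role"],
                             "content": item["content"]})
    return base_url, {"model": model, "messages": messages}
```

A real proxy would also translate the streaming events and tool-call fields back into Responses format on the way out; this only shows the request direction.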