On-device inference engines, local model runtimes, and self-hosted LLM stacks
| Rank | Repository | Stars | 1d change | 7d change | 30d change | Trend score | Forks |
|---|---|---|---|---|---|---|---|
| #1 | Gitlawb/openclaude | 24.7k | +50 (+0.2%) | +267 (+1.1%) | +4.7k (+19.2%) | 1.3k | 8.1k |
| #2 | Michael-A-Kuykendall/shimmy | 4.6k | +72 (+1.5%) | +124 (+2.7%) | +149 (+3.2%) | 816 | 409 |
| #3 | hydropix/TranslateBooksWithLLMs | 1.2k | +45 (+3.7%) | +123 (+10.1%) | — | 810 | 159 |
| #4 | AlexsJones/llmfit | 24.8k | +12 (+0.0%) | +88 (+0.4%) | +946 (+3.8%) | 527 | 1.5k |
| #5 | nicedreamzapp/claude-code-local | 2.3k | -1 (-0.0%) | +90 (+3.9%) | +370 (+16.1%) | 522 | 448 |
| #6 | ENTERPILOT/GoModel | 705 | +3 (+0.4%) | +62 (+8.8%) | +67 (+9.5%) | 410 | 37 |
| #7 | ardanlabs/kronk | 559 | +3 (+0.5%) | +21 (+3.8%) | — | 280 | 29 |
| #8 | mercurialsolo/claudectl | 121 | +4 (+3.3%) | +12 (+9.9%) | — | 248 | 16 |
| #9 | mostlygeek/llama-swap | 3.7k | +8 (+0.2%) | +32 (+0.9%) | +139 (+3.8%) | 232 | 288 |
| #10 | ggml-org/llama.cpp | 107k | +31 (+0.0%) | +124 (+0.1%) | +1.3k (+1.2%) | — | 17.4k |
| #11 | JackChen-me/open-multi-agent | 5.9k | +5 (+0.1%) | +26 (+0.4%) | +1.2k (+19.8%) | — | 2.3k |
| #12 | sooryathejas/METATRON | 2.6k | +2 (+0.1%) | +15 (+0.6%) | +532 (+20.4%) | — | 541 |