Show HN: LocalLLM – Recipes for Running the Local LLM (Need Contributors)
Author: Igor_Wiwi
Source: Hacker News
I built LocalLLM, a small community project for running local models.

Live: https://locallllm.fly.dev

The goal is simple: if someone has a model + OS + GPU + RAM, they should get steps that actually work (ideally a one-liner).

I need help populating and validating guides. If you run local models, please submit one working recipe (or report what failed).
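To make the "model + OS + GPU + RAM → working steps" idea concrete, here is a hypothetical sketch of what a submitted recipe could look like: a quick hardware check followed by the actual one-liner. Ollama, the `llama3.2` model tag, and the 8 GB threshold are illustrative assumptions, not the project's required tooling or format.

```shell
#!/bin/sh
# Hypothetical recipe sketch (Linux): verify RAM, then run the model.
# Ollama and llama3.2 are assumptions for illustration only.

required_gb=8  # rough working-set estimate for a small quantized model

# MemAvailable is reported in kB on Linux.
avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
avail_gb=$((avail_kb / 1024 / 1024))

if [ "$avail_gb" -ge "$required_gb" ]; then
  echo "RAM check passed: ${avail_gb} GB available"
  # The one-liner the recipe would actually give (commented out here):
  # curl -fsSL https://ollama.com/install.sh | sh && ollama run llama3.2
else
  echo "Insufficient RAM: ${avail_gb} GB available, ${required_gb} GB needed"
fi
```

A recipe in this shape lets a contributor state the prerequisites (OS, GPU, RAM) as executable checks rather than prose, which is also what makes recipes easy to validate.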