I found 2 hidden Microsoft MoE models that run on 8GB RAM laptops (no GPU)… but nobody noticed?
Posted by FamousFlight7149 on r/LocalLLaMA
AI-Generated Claims
Has anyone here even heard of Microsoft's Phi-mini-MoE and Phi-tiny-MoE models?
I only discovered them a few days ago, and they may be among the very few MoE models under 8B total parameters.
I'm not kidding: these are real MoE models at that scale, and they can reportedly run on ordinary laptops with just 8GB of RAM, no GPU required.
The weird part is I can't find *anyone* on the internet talking about them or even acknowledging that they exist.
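The 8GB-RAM claim is at least plausible on paper. A back-of-envelope memory estimate makes the point; the parameter counts below are assumptions (roughly 7.6B total for Phi-mini-MoE and 3.8B for Phi-tiny-MoE, per Microsoft's model cards), so check the actual Hugging Face pages before relying on them:

```python
def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate RAM needed to hold the model weights alone."""
    return n_params * bits_per_param / 8 / 1e9

# Assumed total parameter counts (verify against the model cards):
models = {
    "Phi-mini-MoE (~7.6B total)": 7.6e9,
    "Phi-tiny-MoE (~3.8B total)": 3.8e9,
}

for name, n in models.items():
    fp16 = weight_memory_gb(n, 16)  # unquantized half precision
    q4 = weight_memory_gb(n, 4)     # 4-bit quantized (e.g. a GGUF Q4 variant)
    print(f"{name}: {fp16:.1f} GB @ fp16, {q4:.1f} GB @ 4-bit")
```

At 4 bits, ~7.6B parameters come to about 3.8 GB of weights, leaving headroom in 8GB for the KV cache and the OS. Note that in an MoE all experts must stay resident, so the *total* parameter count drives RAM use, while the smaller *active* count is what keeps per-token compute laptop-friendly.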
Related Events
Meta Unveils Mango and Avocado AI Models to Take On Google in 2026 - The420.in
Uncategorized • 3/21/2026
Anthropic and OpenAI released flagship models 27 minutes apart -- the AI pricing and capability gap is getting weird
LLMs • 3/21/2026
Exclusive | Meta Is Developing a New AI Image and Video Model Code-Named ‘Mango’ - WSJ
Computer Vision • 3/21/2026
Meta reportedly delays rollout of new AI model Avocado – here's why - Mint
LLMs • 3/21/2026