SIGNAL GRID v0.1

Trained a GPT transformer from scratch on a $300 CPU — 39 minutes, 0.82M params, no GPU needed

1 source · 1 story · First seen 3/21/2026 · Score: 28 · Mixed Progress
Single Source
- Bigness: 28
- Coverage: 13
- Recency: 92
- Engagement: 9
- Velocity: 0
- Confidence: 49
- Clipability: 50
- Polarization: 0
- Claims: 5
- Contradictions: 0
- Breakthrough: 50

Sentiment Mix

- Positive: 0%
- Neutral: 100%
- Negative: 0%

Expert Signals

- Suspicious_Gap: 1121
- author: 1 mention
- r/LocalLLaMA source: 1 mention

AI-Generated Claims

Generated from linked receipts; click sources for full context.

Trained a GPT transformer from scratch on a $300 CPU — 39 minutes, 0.82M params, no GPU needed.

Supported by 1 story

Can be trained on a $300 machine. GitHub repo: [https://github.com/Eamon2009/Transformer-language-model](https://github.com/Eamon2009/Transformer-language-model)

**What I trained:**

- Parameters: 0.82M
- Dataset: 201K characters of children's stories
- Vocab size: 28 unique characters
- Hardware: CPU only (AMD Ryzen 5)
- Train time: 39 minutes
- Best val: 1.3145, still improving at step 3000

**Full training log:**

[   0/3000] train=3.2961 val=3.2981 << best!
[ 200/3000] train=2.3038 val=2.2490 << best!
[ 400/3000] train=2.2469 val=2.1950 << best!
[ 800/3000] train=1.9742 val=1.9103 << best!
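For context on the numbers above: a "vocab size of 28 unique characters" just means the tokenizer's vocabulary is the set of distinct characters in the corpus, and a validation loss of 1.3145 is cross-entropy in nats per character (assuming the standard convention used by e.g. PyTorch's cross-entropy). A minimal sketch, using a hypothetical sample string in place of the actual 201K-character dataset, which is not included here:

```python
import math

# Hypothetical stand-in for the 201K-character children's-stories corpus;
# the real dataset lives in the linked repo and is not reproduced here.
text = "once upon a time there was a little fox.\n"

# Character-level tokenization: the vocabulary is simply the sorted set of
# unique characters (the post reports 28 for its corpus).
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
itos = {i: ch for ch, i in stoi.items()}       # integer id -> char

def encode(s):
    return [stoi[c] for c in s]

def decode(ids):
    return "".join(itos[i] for i in ids)

assert decode(encode("fox")) == "fox"

# Interpreting the reported best validation loss (1.3145 nats/char):
val_nats = 1.3145
bits_per_char = val_nats / math.log(2)  # cross-entropy in bits per character
perplexity = math.exp(val_nats)         # per-character perplexity
```

Under these assumptions the model is at roughly 1.9 bits per character (per-character perplexity just under 4), which is plausible for a 0.82M-parameter character-level model that was still improving at step 3000.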

Related Events

Timeline (1 story)

Receipts (1)

Bias Snapshot

Center
Left 0% · Center 100% · Right 0%
Social · reddit.com · 3/21/2026