SIGNAL GRID v0.1

Experiment: How far can a 28M model go in business email generation?

1 source · 1 story · First seen 3/20/2026 · Score 21 · Mixed Progress
Single Source
Bigness 21
Coverage 13
Recency 67
Engagement 8
Velocity 0
Confidence 49
Clipability 60
Polarization 0
Claims 5
Contradictions 0
Breakthrough 50

Sentiment Mix

Positive 0% · Neutral 100% · Negative 0%

Geography

North America

Expert Signals

AdhesivenessSea 95 · 11

author · 1 mention

r/LocalLLaMA

source · 1 mention

AI-Generated Claims

Generated from linked receipts; click sources for full context.

Experiment: How far can a 28M model go in business email generation?

Supported by 1 story

I've been experimenting with training a small (~28M-parameter) Transformer model on synthetic business email data.

Supported by 1 story
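The post only states the rough size ("~28M parameters"), not the architecture. As a sanity check on what such a budget implies, here is a minimal sketch of a decoder-only Transformer parameter count; the specific dimensions (vocab 18,000, d_model 512, 6 layers, FFN 2048, tied embeddings) are hypothetical values chosen to land near 28M, not the author's actual config.

```python
# Hedged sketch: estimate the parameter count of a small decoder-only
# Transformer. All dimensions below are assumptions, not the post's config.

def transformer_param_count(vocab_size, d_model, n_layers, d_ff):
    """Approximate parameter count with tied input/output embeddings."""
    embeddings = vocab_size * d_model           # shared with the LM head
    attn = 4 * d_model * d_model + 4 * d_model  # Q, K, V, O weights + biases
    ffn = 2 * d_model * d_ff + d_ff + d_model   # up/down projections + biases
    norms = 2 * 2 * d_model                     # two LayerNorms (scale + shift)
    per_layer = attn + ffn + norms
    final_norm = 2 * d_model                    # LayerNorm before the LM head
    return embeddings + n_layers * per_layer + final_norm

total = transformer_param_count(vocab_size=18_000, d_model=512,
                                n_layers=6, d_ff=2048)
print(f"{total / 1e6:.1f}M parameters")  # → 28.1M parameters
```

At this scale the embedding table alone consumes roughly a third of the budget, which is one reason such small models tend to produce locally coherent text while struggling with instruction-following.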

It's definitely not perfect and still struggles with instruction-following, but I was surprised that it can sometimes produce reasonably coherent email-like text.

Supported by 1 story

The model is very small compared to typical LLMs, so this was more of an experiment to see how far structured generation can go under tight parameter constraints.

Supported by 1 story

Some generations are messy or drift off-topic, but occasionally it produces outputs that *almost* look usable.

Supported by 1 story

Related Events

Timeline (1 story)

Receipts (1)

Bias Snapshot

Center
Left 0% · Center 100% · Right 0%
Social · reddit.com · 3/20/2026