20260222092513_fjkm · completednovels · 24.3K params · 0s elapsed · Updated 8m ago
1L / 32D / 2H · cpu_ref · bpe · adamw · Created Feb 22, 2026 9:25 AM
Step 10 / 10 (100.0%)

Loss: 3.5413
Best Loss: 3.5413 (-2.5% from start)
Val Loss: 3.6989 (best: 3.6989)
Learning Rate: 9.05e-6
Throughput: 2,617 tok/s (avg)
Speed: 30 ms/iter (avg)
Grad Norm: 1.163 (avg: 1.209)
Tokens processed: 640
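As a quick cross-check on the tiles above, the tokens-processed count is simply iterations × batch size × context length, using the values listed in the config sections below. A minimal sketch of that arithmetic in TypeScript (the constants are this run's config values; the snippet itself is illustrative only):

```ts
// Each optimizer step consumes one batch of `batchSize` sequences,
// each `blockSize` tokens long. Values taken from this run's config.
const iters = 10;
const batchSize = 2;
const blockSize = 32;

const tokensPerIter = batchSize * blockSize;    // 64
const tokensProcessed = iters * tokensPerIter;  // 640, matching the "Tokens processed" tile

console.log({ tokensPerIter, tokensProcessed });
```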
Loss Curve
Train/val loss with validation, warmup, overfit, and instability markers.
Architecture
Layers: 1
Embedding: 32
Heads: 2
Vocab: 37
Context: 32
Dropout: 0
Parameters: 24.3K
Training Config
Total iters: 10
Batch size: 2
Max LR: 0.0003
Optimizer: adamw
Backend: cpu_ref
Tokenizer: bpe
Seed: 42
Weight decay: 0.01
Grad clip: 1
Eval interval: 5
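The final learning-rate reading above (9.05e-6, down from the max LR of 3e-4) is consistent with a linear-warmup-plus-cosine-decay schedule that decays to zero, a common default in small GPT trainers. The report does not state the actual schedule, so the sketch below is an assumption; the warmup length and the 0-indexed step convention in particular are guesses:

```ts
// Hypothetical LR schedule: linear warmup, then cosine decay to 0.
// maxLr and totalIters come from this run's config; warmupIters = 1 and the
// 0-indexed step convention are assumptions, not taken from the report.
const maxLr = 3e-4;
const totalIters = 10;
const warmupIters = 1;

function lrAt(step: number): number {
  if (step < warmupIters) return (maxLr * (step + 1)) / warmupIters;
  const progress = (step - warmupIters) / (totalIters - warmupIters); // 0..1
  return maxLr * 0.5 * (1 + Math.cos(Math.PI * progress));
}

// Under these assumptions, the last 0-indexed step reproduces the displayed value:
console.log(lrAt(9).toExponential(2)); // "9.05e-6"
```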
GPU & VRAM
No GPU data
Learning Rate
Grad Norm
Step Time Breakdown
No timing data
Checkpoints (2)
Step · Filename · Size · Created
Sample Generations (5)

1 · Checkpoint: - · Generated 3d ago
Prompt: The
Output: The he k. cthe yhT.n stfconn the hsin the rTlwBTr. sn he

2 · Checkpoint: - · Generated 3d ago
Prompt: Once upon a time
Output: nce pon a timeatTfin the yin the iyaodpt. nwaec

3 · Checkpoint: - · Generated 3d ago
Prompt: He walked into
Output: e walked into spthe BwhdFoB
.i sFo in the n

4 · Checkpoint: - · Generated 3d ago
Prompt: In the beginning
Output: n the eginning acin the keTrhF n n F in the hatnFthe k. he

5 · Checkpoint: - · Generated 3d ago
Prompt: We the People of
Output: e the eople of osc.gTattdBn the . on ycn the dkp
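For context on how samples like these are typically produced: the model can only attend to the last 32 tokens (the context length above), so generation crops the running sequence each step and samples the next token from the softmax over the 37-token vocabulary. The loop below is a generic sketch of that process; the `forward` callback and the temperature handling are assumptions, not this trainer's actual API:

```ts
// Generic autoregressive sampling sketch. Only blockSize = 32 (context) and the
// 37-entry logits vector reflect this run's config; `forward`, `temperature`,
// and the sampling details are illustrative assumptions.
const blockSize = 32;

function sample(
  forward: (ctx: number[]) => number[],  // returns next-token logits over the vocab
  promptIds: number[],
  maxNewTokens: number,
  temperature = 1.0
): number[] {
  const ids = [...promptIds];
  for (let step = 0; step < maxNewTokens; step++) {
    const ctx = ids.slice(-blockSize);                    // crop to the context window
    const logits = forward(ctx).map(l => l / temperature);
    const maxLogit = Math.max(...logits);                 // subtract max for numerical stability
    const exps = logits.map(l => Math.exp(l - maxLogit));
    const total = exps.reduce((a, b) => a + b, 0);
    let r = Math.random() * total;                        // draw from the categorical distribution
    let next = 0;
    for (; next < exps.length - 1; next++) {
      r -= exps[next];
      if (r <= 0) break;
    }
    ids.push(next);
  }
  return ids;
}
```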
Model config:
{
  "vocabSize": 37,
  "blockSize": 32,
  "nLayer": 1,
  "nEmbd": 32,
  "nHead": 2,
  "dropout": 0
}

Training config:
{
  "iters": 10,
  "batchSize": 2,
  "lr": 0.0003,
  "beta1": 0.9,
  "beta2": 0.999,
  "eps": 1e-8,
  "weightDecay": 0.01,
  "gradClip": 1,
  "evalInterval": 5,
  "evalIters": 10,
  "seed": 42,
  "backend": "cpu_ref",
  "tokenizer": "bpe",
  "optimizer": "adamw",
  "logLevel": "info",
  "trace": false
}
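The optimizer block above lists standard AdamW hyperparameters (beta1/beta2/eps, weight decay 0.01) plus a global grad-norm clip of 1. As a reference for what those numbers control, here is a textbook sketch of a single clipped, decoupled-weight-decay update over a flat parameter array; it is not the cpu_ref backend's actual implementation, and in training the learning rate would come from the schedule rather than the fixed max LR used here:

```ts
// Textbook AdamW step with global-norm gradient clipping, using the
// hyperparameters listed in the config above. Generic sketch only.
const lr = 3e-4;          // max LR; in training this would be the scheduled value
const beta1 = 0.9, beta2 = 0.999, eps = 1e-8;
const weightDecay = 0.01, gradClip = 1.0;

function clipByGlobalNorm(grad: Float64Array, maxNorm: number): number {
  let sq = 0;
  for (const g of grad) sq += g * g;
  const norm = Math.sqrt(sq);            // the "Grad Norm" metric tracks this value
  if (norm > maxNorm) {
    const scale = maxNorm / (norm + 1e-12);
    for (let i = 0; i < grad.length; i++) grad[i] *= scale;
  }
  return norm;
}

function adamwStep(
  param: Float64Array, grad: Float64Array,
  m: Float64Array, v: Float64Array,
  t: number                              // 1-indexed step count, for bias correction
): void {
  clipByGlobalNorm(grad, gradClip);
  for (let i = 0; i < param.length; i++) {
    m[i] = beta1 * m[i] + (1 - beta1) * grad[i];
    v[i] = beta2 * v[i] + (1 - beta2) * grad[i] * grad[i];
    const mHat = m[i] / (1 - Math.pow(beta1, t));
    const vHat = v[i] / (1 - Math.pow(beta2, t));
    // Decoupled weight decay: applied to the parameter directly, not folded into the gradient.
    param[i] -= lr * (mHat / (Math.sqrt(vHat) + eps) + weightDecay * param[i]);
  }
}
```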