20260222043849_u15scompletedconcordance · 116.14M params · Updated 3d ago
12L / 768D / 12H · helios · bpe-64k · adamw · Created Feb 22, 2026 4:38 AM
Step 0 / 10,000 (0.0%)
Metrics (run at step 0, no data yet): Loss · Best Loss · Val Loss · Learning Rate · Throughput (tok/s, avg) · Speed (ms/iter, avg) · Grad Norm · Tokens processed
Loss Curve: no metrics data
Architecture
Layers: 12
Embedding: 768
Heads: 12
Vocab: 19,777
Context: 1024
Dropout: 0
Parameters: 116.14M
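
A back-of-envelope check on the 116.14M figure: assuming a GPT-2-style decoder with learned positional embeddings, bias-free linear layers, affine LayerNorms, and an untied output head (none of which this dump states explicitly, so treat them as assumptions), the count lands on the reported number almost exactly:

// Rough GPT-2-style parameter count from the Architecture table above.
// Assumed layout: bias-free linears, affine LayerNorms, untied lm_head.
function countParams(vocab: number, ctx: number, nLayer: number, d: number): number {
  const emb = vocab * d + ctx * d;        // token + learned positional embeddings
  const attn = 3 * d * d + d * d;         // qkv projection + output projection
  const mlp = 4 * d * d + 4 * d * d;      // 4x up-projection + down-projection
  const ln = 2 * (2 * d);                 // two LayerNorms (weight + bias) per block
  const head = vocab * d;                 // untied output head
  return emb + nLayer * (attn + mlp + ln) + 2 * d /* final LN */ + head;
}

console.log(countParams(19_777, 1_024, 12, 768)); // 116,136,960 ≈ 116.14M

Tying the output head to the token embedding would subtract vocab × d ≈ 15.19M parameters, so the reported total suggests the head is untied here.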
Training Config
Total iters: 10,000
Batch size: 4
Max LR: 0.0006
Optimizer: adamw
Backend: helios
Tokenizer: bpe-64k
Seed: 42
Weight decay: 0.01
Grad clip: 1
Eval interval: 100
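
With batch size 4 and a 1,024-token context, each iteration consumes 4 × 1,024 = 4,096 tokens, so the full 10,000-iteration run covers roughly 41M tokens (assuming no gradient accumulation, which the config does not mention); evaluation fires every 100 iterations, averaged over the 10 batches set by evalIters. The optimizer settings map onto a standard AdamW step preceded by global-norm gradient clipping; a minimal sketch with the values above (textbook AdamW, not helios internals):

// AdamW step with global-norm gradient clipping, using the run's
// hyperparameters (lr=6e-4, betas=(0.9, 0.999), eps=1e-8,
// weightDecay=0.01, gradClip=1). A sketch, not helios code.
type AdamState = { m: Float64Array; v: Float64Array; t: number };

function adamwStep(p: Float64Array, g: Float64Array, s: AdamState,
                   lr = 6e-4, b1 = 0.9, b2 = 0.999,
                   eps = 1e-8, wd = 0.01, clip = 1): void {
  // Scale the whole gradient down if its L2 norm exceeds the clip value.
  const norm = Math.sqrt(g.reduce((acc, x) => acc + x * x, 0));
  const scale = norm > clip ? clip / norm : 1;
  s.t += 1;
  for (let i = 0; i < p.length; i++) {
    const gi = g[i] * scale;
    s.m[i] = b1 * s.m[i] + (1 - b1) * gi;        // first-moment EMA
    s.v[i] = b2 * s.v[i] + (1 - b2) * gi * gi;   // second-moment EMA
    const mHat = s.m[i] / (1 - Math.pow(b1, s.t));
    const vHat = s.v[i] / (1 - Math.pow(b2, s.t));
    // Decoupled weight decay: applied to the parameter, not the gradient.
    p[i] -= lr * (mHat / (Math.sqrt(vHat) + eps) + wd * p[i]);
  }
}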
GPU & VRAM
No GPU data
Learning Rate
No data
Grad Norm
No data
Step Time Breakdown
Step Time Breakdown
No timing data
Checkpoints (0) ?
No checkpoints saved
Model config (raw JSON):
{
"vocabSize": 19777,
"blockSize": 1024,
"nLayer": 12,
"nEmbd": 768,
"nHead": 12,
"dropout": 0
}
Training config (raw JSON):
{
"iters": 10000,
"batchSize": 4,
"lr": 0.0006,
"beta1": 0.9,
"beta2": 0.999,
"eps": 1e-8,
"weightDecay": 0.01,
"gradClip": 1,
"evalInterval": 100,
"evalIters": 10,
"seed": 42,
"backend": "helios",
"tokenizer": "bpe-64k",
"optimizer": "adamw",
"logLevel": "info",
"trace": true
}
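
For consuming these dumps programmatically, the two objects map onto typed records; a minimal sketch (field names are taken verbatim from the JSON above, but the interface names and the divisibility check are hypothetical additions, not part of helios):

// Hypothetical typings for the two raw config objects above.
interface ModelConfig {
  vocabSize: number; blockSize: number;
  nLayer: number; nEmbd: number; nHead: number; dropout: number;
}
interface TrainConfig {
  iters: number; batchSize: number; lr: number;
  beta1: number; beta2: number; eps: number;
  weightDecay: number; gradClip: number;
  evalInterval: number; evalIters: number; seed: number;
  backend: string; tokenizer: string; optimizer: string;
  logLevel: string; trace: boolean;
}

// JSON.parse alone is untyped, so sanity-check the shape before use.
function parseModelConfig(json: string): ModelConfig {
  const cfg = JSON.parse(json) as ModelConfig;
  if (cfg.nEmbd % cfg.nHead !== 0) {
    throw new Error("nEmbd must be divisible by nHead");
  }
  return cfg; // here: 768 / 12 = 64-dimensional heads
}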