Alpha
20260222093255_ze7r · completed · concordance · 115.55M params · 17m 10s elapsed · Updated 1d ago
12L / 768D / 12H · helios · bpe-64k · adamw · Created Feb 22, 2026 9:35 AM
Step 712 / 2,000 (35.6%)
Loss: 13.8775
Best Loss: 6.7904 (38.1% from start)
Val Loss: 12.3467 (best: 7.7351)
Learning Rate: 7.66e-5
Throughput: 1,210 tok/s (avg)
Speed: 1,696 ms/iter (avg)
Grad Norm: 1,341,571,549.206 (avg: 36,453,097,179.277)
Tokens processed: 1.25M
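
The reported Grad Norm (~1.34e9) is many orders of magnitude above the configured grad clip of 1, which is consistent with the dashboard charting the pre-clip norm. A minimal sketch of global-norm clipping under that assumption (not this trainer's actual code):

// Global-norm gradient clipping (assumed convention; gradients modeled
// here as plain number arrays, one per parameter tensor).
function globalNorm(grads: number[][]): number {
  let sumSq = 0;
  for (const g of grads) for (const v of g) sumSq += v * v;
  return Math.sqrt(sumSq);
}

function clipGradNorm(grads: number[][], maxNorm: number): number {
  const norm = globalNorm(grads);
  if (norm > maxNorm) {
    // Rescale every gradient so the global L2 norm is at most maxNorm.
    const scale = maxNorm / (norm + 1e-6);
    for (const g of grads) for (let i = 0; i < g.length; i++) g[i] *= scale;
  }
  return norm; // pre-clip norm: the value a dashboard like this would log
}

// Example: an exploding gradient comparable to this run's is rescaled to norm 1.
const grads = [[3e8, -4e8], [1.2e9]];
console.log(clipGradNorm(grads, 1).toExponential(2)); // ≈1.30e+9 (pre-clip)
console.log(globalNorm(grads).toFixed(3));            // ≈1.000 after clipping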
Loss Curve
Train/val loss with validation, warmup, overfit, and instability markers.
Views: Validation / selection · Run diagnostics · Gradient + Loss · Search-aware view
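
Instability markers like those on the curve can be derived from the series itself. One plausible heuristic (assumed; the dashboard's actual rule isn't shown) flags steps where loss spikes above its trailing average:

function instabilitySteps(loss: number[], window = 10, factor = 2): number[] {
  const flagged: number[] = [];
  for (let i = window; i < loss.length; i++) {
    const recent = loss.slice(i - window, i);
    const avg = recent.reduce((a, b) => a + b, 0) / window;
    if (loss[i] > factor * avg) flagged.push(i); // spiked well above trend
  }
  return flagged;
}

// In a run like this one (best loss 6.79, current 13.88), steps where loss
// rebounded past ~2x its recent average would be flagged.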
Architecture
Layers: 12
Embedding: 768
Heads: 12
Vocab: 19,777
Context: 256
Dropout: 0
Parameters: 115.55M
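
The 115.55M figure can be reproduced with a back-of-envelope count. This sketch assumes a GPT-2-style stack with bias-free linear layers, LayerNorm weight+bias, learned positional embeddings, and an untied output head; none of that is confirmed by the config, but it lands exactly on the reported total:

const V = 19_777, T = 256, D = 768, L = 12;

const tokEmb = V * D;              // token embedding
const posEmb = T * D;              // learned positional embedding (assumed)
const perLayer =
  2 * (2 * D) +                    // two LayerNorms (weight + bias)
  3 * D * D + D * D +              // attention: QKV and output projections
  D * 4 * D + 4 * D * D;           // MLP: 4x up- and down-projections
const finalLn = 2 * D;
const lmHead = V * D;              // untied output head (assumed)

const total = tokEmb + posEmb + L * perLayer + finalLn + lmHead;
console.log((total / 1e6).toFixed(2) + "M"); // -> 115.55M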
Training Config
Total iters: 2,000
Batch size: 8
Max LR: 0.0001
Optimizer: adamw
Backend: helios
Tokenizer: bpe-64k
Seed: 42
Weight decay: 0.01
Grad clip: 1
Eval interval: 100
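
The Learning Rate card (7.66e-5 at step 712, max 1e-4) is consistent with a warmup-then-cosine schedule. The 100-iteration warmup below is an assumption, not read from the config, but it reproduces the observed value:

const maxLr = 1e-4, totalIters = 2_000;
const warmupIters = 100; // assumed; not in the config above

function lrAt(step: number): number {
  // Linear warmup, then cosine decay to zero over the remaining iters.
  if (step < warmupIters) return (maxLr * (step + 1)) / warmupIters;
  const progress = (step - warmupIters) / (totalIters - warmupIters);
  return 0.5 * maxLr * (1 + Math.cos(Math.PI * progress));
}

console.log(lrAt(712).toExponential(2)); // -> 7.65e-5, near the 7.66e-5 shown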
Charts: GPU & VRAM · Learning Rate · Grad Norm · Step Time Breakdown (no timing data)
Checkpoints (0)
No checkpoints saved
Model config:
{
  "vocabSize": 19777,
  "blockSize": 256,
  "nLayer": 12,
  "nEmbd": 768,
  "nHead": 12,
  "dropout": 0
}
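
For tooling around dumps like this, the model config maps naturally onto a typed shape. The interface below is inferred from the JSON (hypothetical, not this tool's published schema):

interface ModelConfig {
  vocabSize: number; // 19,777: below the 64k cap the bpe-64k tokenizer name implies
  blockSize: number; // context length in tokens (256)
  nLayer: number;
  nEmbd: number;
  nHead: number;
  dropout: number;
}

const model: ModelConfig = {
  vocabSize: 19777, blockSize: 256, nLayer: 12, nEmbd: 768, nHead: 12, dropout: 0,
};

// Sanity check: the embedding width must split evenly across heads.
console.assert(model.nEmbd % model.nHead === 0, "head dim"); // 768 / 12 = 64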
Training config:
{
  "iters": 2000,
  "batchSize": 8,
  "lr": 0.0001,
  "beta1": 0.9,
  "beta2": 0.999,
  "eps": 1e-8,
  "weightDecay": 0.01,
  "gradClip": 1,
  "evalInterval": 100,
  "evalIters": 10,
  "seed": 42,
  "backend": "helios",
  "tokenizer": "bpe-64k",
  "optimizer": "adamw",
  "logLevel": "info",
  "trace": false
}
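
The throughput cards can be cross-checked from this config alone, assuming tokens/iter = batchSize × blockSize (i.e., no gradient accumulation, which the config doesn't mention):

const batchSize = 8, blockSize = 256; // from the configs above
const msPerIter = 1_696;              // "Speed" card (avg)

const tokensPerIter = batchSize * blockSize;          // 2,048 tokens per step
const tokPerSec = tokensPerIter / (msPerIter / 1000); // ≈ 1,208
console.log(tokensPerIter, Math.round(tokPerSec));    // vs. the 1,210 tok/s shown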