novels_all_20260225162239_kx12
active · novels · 7.21M params · 15m 34s elapsed · ~14h 37m remaining
6L / 288D / 6H · helios · bpe · adamw · Created Feb 25, 2026 4:22 PM
Step 654 / 50,000 (1.3%)

- Loss: 7.5762
- Best Loss: 6.9151 (-1.0% from start)
- Val Loss: 7.6607 (best: 7.6550)
- Learning Rate: 3.00e-4
- Throughput: 4,920 tok/s (avg)
- Speed: 1,067 ms/iter (avg)
- Grad Norm: 0.916 (avg: 0.582)
- Tokens processed: 3.33M
- Forward: 125 ms (12% of step)
- Backward: 866 ms (81% of step)
- GPU Sync: 19 ms (2% of step)
- GPU Ops: 584 per step
- MFU: 0.7% (model FLOPS utilization)
- Bwd/Fwd ratio: 6.9x
Loss Curve
Symbio semantics: this chart stitches many fresh candidate evaluations onto one global step axis. Loss resets near switches are expected because candidates are re-initialized. Compare local candidate shapes and the global frontier, not a single continuous model trajectory.
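The frontier overlay described above can be sketched in a few lines. This is a minimal illustration with a hypothetical segment layout (start step plus per-step losses per candidate); the tool's internal data format is not shown in this report.

```python
# Hypothetical segment layout: (start_step, [losses]) per candidate.
# The tool's actual internal format is not shown in this report.

def stitch_segments(segments):
    """Concatenate candidate-local loss segments onto one global step axis."""
    points = []
    for start, losses in segments:
        for i, loss in enumerate(losses):
            points.append((start + i, loss))
    points.sort()
    return points

def global_frontier(points):
    """Global best-loss frontier: running minimum over the step axis."""
    frontier, best = [], float("inf")
    for step, loss in points:
        best = min(best, loss)
        frontier.append((step, best))
    return frontier

segs = [
    (1, [7.6, 7.4, 7.3]),  # candidate A
    (4, [7.7, 7.2]),       # candidate B: re-initialized, loss jumps back up
    (6, [7.5, 7.1]),       # candidate C
]
front = global_frontier(stitch_segments(segs))
# The frontier absorbs the reset at step 4: it stays at 7.3 there.
```

Note how the running minimum makes the loss reset at step 4 invisible in the frontier, which is exactly why the frontier is the trace to compare across switches.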
Views: Search semantics · Validation / selection · Run diagnostics
Search Trajectory + Frontier
Candidate-local train/val loss segments on a shared step axis, with switch events and global frontier overlays.
Search-aware view
Architecture

- Layers: 6
- Embedding: 288
- Heads: 6
- Vocab: 2,000
- Context: 256
- Dropout: 0
- Parameters: 7.21M

Training Config

- Total iters: 50,000
- Batch size: 20
- Max LR: 0.0003
- Optimizer: adamw
- Backend: helios
- Tokenizer: bpe
- Seed: 42
- Weight decay: 0.1
- Grad clip: 5
- Eval interval: 100
GPU & VRAM
Learning Rate
Grad Norm
Step Time Breakdown
Clip Telemetry
Symbiogenesis · SWIGLU
- Weight Entropy: 1.78 bits
- Effective Rank: 20.0
- Free Energy: 7.0170
- Population Entropy: 3.894 nats
- Complexity: 0.0791
- Fitness: 0.0459
- CUSUM Alerts: 638 of 650 steps
- Batch Size: 12 (adaptive)
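The weight entropy and effective rank tiles above are standard spectral/entropy diagnostics. A sketch, assuming effective rank is the exponential of the entropy of the normalized singular-value spectrum (one common definition; the dashboard may use a variant) and weight entropy is histogram-based:

```python
import math

def effective_rank(singular_values):
    """exp of the Shannon entropy of the normalized spectrum
    (Roy & Vetterli's definition; the dashboard may use a variant)."""
    total = sum(singular_values)
    h = -sum((s / total) * math.log(s / total)
             for s in singular_values if s > 0)
    return math.exp(h)

def weight_entropy_bits(weights, num_bins=30):
    """Shannon entropy in bits of a histogram over weight values."""
    lo, hi = min(weights), max(weights)
    width = (hi - lo) / num_bins or 1.0   # degenerate case: all equal
    counts = [0] * num_bins
    for w in weights:
        counts[min(int((w - lo) / width), num_bins - 1)] += 1
    n = len(weights)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

# A flat 20-value spectrum has effective rank exactly 20:
assert abs(effective_rank([1.0] * 20) - 20.0) < 1e-9
```

An effective rank of 20.0 on a 288-dimensional embedding, as shown in the tile, would indicate a highly concentrated spectrum this early in training.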
CUSUM Change-Point Monitor
Weight Entropy
Effective Rank
Free Energy
Fitness Score
Population Entropy
Adaptive Batch Size
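The CUSUM change-point monitor listed above flags shifts in a metric stream. A minimal one-sided CUSUM sketch follows; the parameter names mirror `cusumSensitivity` and `cusumBaselineWindow` from the Symbio config, but their exact semantics in this tool are assumptions:

```python
def cusum_alerts(series, sensitivity=4.0, baseline_window=5):
    """One-sided CUSUM of standardized deviations from a rolling
    baseline mean; fires when the sum exceeds `sensitivity`."""
    alerts, cusum = [], 0.0
    for i, x in enumerate(series):
        if i < baseline_window:
            continue  # need a full baseline window first
        window = series[i - baseline_window:i]
        mean = sum(window) / baseline_window
        var = sum((w - mean) ** 2 for w in window) / baseline_window
        sigma = var ** 0.5 or 1e-8
        cusum = max(0.0, cusum + (x - mean) / sigma)
        if cusum > sensitivity:
            alerts.append(i)
            cusum = 0.0  # reset after reporting a change point
    return alerts

# A flat series with a jump at index 10 alerts right at the jump:
alerts = cusum_alerts([1.0] * 10 + [5.0] * 5)
assert alerts[0] == 10
```

With candidates re-initialized every 25 steps, nearly every step deviates from its local baseline, which is consistent with the 638-of-650 alert count above.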
Phase Change / Gelation

- Current: Transitioning
- Stability: 0%
- Phase Changes: 13
- Regime Shifts: 7

Training dynamics are shifting. The model may be entering a new loss basin, or the learning rate may be hitting a critical threshold. This often happens before a breakthrough or a plateau.

Phase Timeline: Steps 1–603
Loss Oscillation (Harmonic Analysis)
Evolutionary Search

- Generations: 4
- Candidates: 27
- Activations: 12
- Best Loss: 6.9151
- Total Steps: 650
| # | Candidate | Activation | Gen | Loss | Fitness | Steps | Mutation |
|---|---|---|---|---|---|---|---|
| 1 | id-Delta.1.2 | id | 2 | 6.9151 | 0.0449 | 25 | perturb_scale |
| 2 | gelu×gelu-Gamma.1.2 | gelu×gelu | 2 | 6.9476 | 0.0449 | 25 | prune |
| 3 | relu-Beta.1.2 | relu | 2 | 6.9479 | 0.0454 | 25 | prune |
| 4 | 0.68·relu+0.32·id-Beta.1.2 | 0.68·relu+0.32·id | 2 | 6.9490 | 0.0443 | 25 | inject_residual |
| 5 | gelu-Delta.1 | gelu | 1 | 6.9561 | 0.0454 | 25 | inject_gate |
| 6 | 0.85·silu+0.15·id-Alpha.1.2.3 | 0.85·silu+0.15·id | 3 | 6.9577 | 0.0456 | 25 | clone |
| 7 | id-Delta.1.2 | id | 2 | 6.9702 | 0.0453 | 25 | perturb_scale |
| 8 | silu-Alpha.1.2 | silu | 2 | 6.9729 | 0.0453 | 25 | clone |
| 9 | 0.85·silu+0.15·id-Alpha.1.2 | 0.85·silu+0.15·id | 2 | 6.9859 | 0.0441 | 23 | inject_residual |
| 10 | relu-Beta.1.2.3 | relu | 3 | 6.9878 | 0.0459 | 23 | prune |
| 11 | gelu-Gamma.1 | gelu | 1 | 7.0034 | 0.0435 | 25 | perturb_scale |
| 12 | gelu×gelu-Gamma.1 | gelu×gelu | 1 | 7.0275 | 0.0437 | 25 | inject_gate |
| 13 | id-Delta.1 | id | 1 | 7.0682 | 0.0459 | 25 | clone |
| 14 | 0.74·silu+0.26·id-Alpha.1 | 0.74·silu+0.26·id | 1 | 7.0714 | 0.0443 | 25 | inject_residual |
| 15 | relu-Beta.1 | relu | 1 | 7.0752 | 0.0445 | 25 | swap_basis |
| 16 | relu×gelu-Beta.1 | relu×gelu | 1 | 7.0821 | 0.0447 | 25 | inject_gate |
| 17 | gelu-Theta | gelu | 0 | 7.1201 | 0.0441 | 25 | origin |
| 18 | relu-Eta | relu | 0 | 7.1458 | 0.0421 | 25 | origin |
| 19 | silu-Alpha.1 | silu | 1 | 7.1523 | 0.0436 | 25 | clone |
| 20 | gelu×gelu×silu-Gamma.1.2 | gelu×gelu×silu | 2 | 7.1545 | 0.0405 | 25 | inject_gate |
Showing top 20 of 27 candidates
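Candidate names like `0.68·relu+0.32·id` and `gelu×gelu` denote small compositions over the basis pool (`silu`, `relu`, `gelu`, `identity`, `square`). A sketch of how such composed activations could be evaluated pointwise; this is illustrative only, since the tool's actual activation-graph representation is not shown here:

```python
import math

BASIS = {
    "relu": lambda x: max(0.0, x),
    "silu": lambda x: x / (1.0 + math.exp(-x)),
    "gelu": lambda x: 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0))),
    "id":   lambda x: x,
    "sq":   lambda x: x * x,
}

def weighted_sum(terms):
    """terms: [(weight, name)] -> e.g. 0.68·relu + 0.32·id."""
    return lambda x: sum(w * BASIS[name](x) for w, name in terms)

def product(names):
    """names -> e.g. gelu×gelu, a pointwise product of bases."""
    return lambda x: math.prod(BASIS[n](x) for n in names)

blend = weighted_sum([(0.68, "relu"), (0.32, "id")])
assert abs(blend(2.0) - 2.0) < 1e-9     # both terms pass x through
assert abs(blend(-1.0) + 0.32) < 1e-9   # relu gates off; id leaks 0.32·x
assert product(["id", "id"])(3.0) == 9.0
```

Blends with an identity term (like the top-ranked `id`-lineage candidates) keep a linear leak on the negative side, which may explain why they survive selection here.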
Generation Summary

| Gen | Candidates | Best Loss |
|---|---|---|
| 0 | 8 | 7.1201 |
| 1 | 8 | 6.9561 |
| 2 | 8 | 6.9151 |
| 3 | 3 | 6.9577 |
Fitness Progression
Architecture Diversity
Convergence vs Diversity (Tug-of-War)
Current Mode: Exploration Dominant

- Diversity Pressure: 82%
- Convergence Momentum: 0%
- Convergence Progress: 100%

Phase Portrait: Diversity Pressure vs Convergence Momentum

- Low diversity / high momentum: lock-in convergence
- High diversity / high momentum: productive exploration
- Low diversity / low momentum: stalled collapse
- High diversity / low momentum: diversity stalling convergence
Tug-of-War Trace (Time Domain)
Positive tension means recent frontier improvement is outpacing diversity pressure (search is converging). Negative tension means exploration pressure is dominating recent convergence momentum (search is broadening or getting “stumped”).
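One way to realize such a tension signal is sketched below. This is an assumption about its construction, not the tool's actual formula: tension as recent normalized frontier improvement minus diversity pressure.

```python
def tension(frontier_losses, diversity_pressure, window=5):
    """Positive: frontier improvement is outpacing diversity pressure
    (converging). Negative: diversity dominates (search broadening).
    frontier_losses: recent global best losses, oldest first.
    diversity_pressure: in [0, 1]. Formula is illustrative only."""
    recent = frontier_losses[-window:]
    span = max(recent) - min(recent)
    improvement = (recent[0] - recent[-1]) / span if span > 0 else 0.0
    return improvement - diversity_pressure

# A stalled frontier under high diversity pressure reads strongly negative:
assert tension([7.0] * 5, 0.82) == -0.82
# Steady improvement under low pressure reads positive:
assert tension([7.5, 7.4, 7.3, 7.2, 7.0], 0.2) == 0.8
```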
- Strongest Convergence: step 10, tension 0.500
- Strongest Diversity Push: step 240, tension -0.820
- Best Frontier: 6.9151 (progress 100%)
Evolutionary Lineage Tree
Lineage Tree
Activation Flow (Sankey)
Activation Switch Log
| Step | From | To | Gen | Prev Steps | Best Loss | Final Loss | Fitness |
|---|---|---|---|---|---|---|---|
| 1 | - | silu | 0 | - | - | - | - |
| 26 | silu | relu | 0 | 25 | 7.4119 | 7.4119 | 0.0383 |
| 51 | relu | gelu | 0 | 25 | 7.3206 | 7.3206 | 0.0401 |
| 76 | gelu | id | 0 | 25 | 7.2793 | 7.2849 | 0.0404 |
| 101 | id | sq | 0 | 25 | 7.2933 | 7.2952 | 0.0414 |
| 126 | sq | silu | 0 | 25 | 7.2035 | 7.2035 | 0.0419 |
| 151 | silu | relu | 0 | 25 | 7.2644 | 7.2791 | 0.0416 |
| 176 | relu | gelu | 0 | 25 | 7.1458 | 7.1557 | 0.0421 |
| 201 | gelu | silu | 1 | 25 | 7.1201 | 7.1201 | 0.0441 |
| 226 | silu | relu | 1 | 25 | 7.1523 | 7.1625 | 0.0436 |
| 251 | relu | gelu×gelu | 1 | 25 | 7.0752 | 7.0752 | 0.0445 |
| 276 | gelu×gelu | id | 1 | 25 | 7.0275 | 7.0275 | 0.0437 |
| 301 | id | 0.74·silu+0.26·id | 1 | 25 | 7.0682 | 7.0682 | 0.0459 |
| 326 | 0.74·silu+0.26·id | relu×gelu | 1 | 25 | 7.0714 | 7.0714 | 0.0443 |
| 351 | relu×gelu | gelu | 1 | 25 | 7.0821 | 7.0821 | 0.0447 |
| 376 | gelu | gelu | 1 | 25 | 7.0034 | 7.0034 | 0.0435 |
| 401 | gelu | 0.85·silu+0.15·id | 2 | 25 | 6.9561 | 6.9706 | 0.0454 |
| 426 | 0.85·silu+0.15·id | relu | 2 | 23 | 6.9859 | 6.9859 | 0.0441 |
| 451 | relu | gelu×gelu | 2 | 25 | 6.9479 | 6.9479 | 0.0454 |
| 476 | gelu×gelu | id | 2 | 25 | 6.9476 | 6.9476 | 0.0449 |
| 501 | id | silu | 2 | 25 | 6.9151 | 6.9888 | 0.0449 |
| 526 | silu | 0.68·relu+0.32·id | 2 | 25 | 6.9729 | 6.9990 | 0.0453 |
| 551 | 0.68·relu+0.32·id | gelu×gelu×silu | 2 | 25 | 6.9490 | 6.9792 | 0.0443 |
| 576 | gelu×gelu×silu | id | 2 | 25 | 7.1545 | 7.1545 | 0.0405 |
| 601 | id | 0.85·silu+0.15·id | 3 | 25 | 6.9702 | 6.9702 | 0.0453 |
| 626 | 0.85·silu+0.15·id | relu | 3 | 25 | 6.9577 | 6.9577 | 0.0456 |
| 651 | relu | gelu×gelu+0.08·sq | 3 | 23 | 6.9878 | 6.9992 | 0.0459 |

Prev Steps, Best Loss, Final Loss, and Fitness describe the candidate being switched away from.
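The log shows a fixed cadence: each candidate trains for `stepsPerCandidate` (25) steps, the harness then switches to the next candidate, and after a full population a new generation is bred from the top-k survivors. A sketch of that loop, with training and breeding stubbed out (the function names and selection details are illustrative, not the tool's implementation):

```python
def run_search(population, train, breed, steps_per_candidate=25,
               generations=2, top_k=3):
    """Round-robin a fixed step budget per candidate, keep the top_k
    by best loss, and breed the next generation from the survivors.
    `train(candidate, steps)` must return that candidate's best loss."""
    switch_log, frontier, step = [], float("inf"), 1
    for _ in range(generations):
        scored = []
        for cand in population:
            switch_log.append((step, cand))
            best = train(cand, steps_per_candidate)
            frontier = min(frontier, best)
            scored.append((best, cand))
            step += steps_per_candidate
        survivors = [c for _, c in sorted(scored)[:top_k]]
        population = breed(survivors)
    return switch_log, frontier

# Toy landscape: loss grows with name length; breeding appends "+id".
log, best = run_search(
    ["silu", "relu", "gelu"],
    train=lambda cand, steps: 7.0 + 0.01 * len(cand),
    breed=lambda survivors: [s + "+id" for s in survivors],
)
assert log[0] == (1, "silu") and log[1] == (26, "relu")
```

The switch steps 1, 26, 51, … produced by this loop match the cadence visible in the log above.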
Search Candidates
| # | Name | Activation | Gen | Parent | Steps | Best Loss | Best Val | Avg Loss | Fitness | Avg tok/s | Alerts |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | id-Delta.1.2 | id | 2 | id-Delta.1 | 25 | 6.9151 | 7.6550 | 7.2388 | 0.0449 | 5,307 | 25 |
| 2 | id-Delta.1 | id | 1 | id-Delta | 25 | 7.0682 | 7.6586 | 7.3066 | 0.0459 | 4,969 | 25 |
| 3 | id-Delta.1.2 | id | 2 | id-Delta.1 | 25 | 6.9702 | 7.6607 | 7.2313 | 0.0453 | 4,999 | 25 |
| 4 | gelu-Theta | gelu | 0 | - | 25 | 7.1201 | 7.6614 | 7.3412 | 0.0441 | 5,450 | 25 |
| 5 | gelu-Delta.1 | gelu | 1 | id-Delta | 25 | 6.9561 | 7.6661 | 7.2180 | 0.0454 | 5,494 | 25 |
| 6 | id-Delta | id | 0 | - | 25 | 7.2933 | 7.6668 | 7.4461 | 0.0414 | 5,704 | 25 |
| 7 | gelu×gelu-Gamma.1.2 | gelu×gelu | 2 | gelu×gelu-Gamma.1 | 25 | 6.9476 | - | 7.1776 | 0.0449 | 4,891 | 25 |
| 8 | relu-Beta.1.2 | relu | 2 | relu-Beta.1 | 25 | 6.9479 | - | 7.1875 | 0.0454 | 5,156 | 25 |
| 9 | 0.68·relu+0.32·id-Beta.1.2 | 0.68·relu+0.32·id | 2 | relu-Beta.1 | 25 | 6.9490 | - | 7.1985 | 0.0443 | 3,788 | 25 |
| 10 | 0.85·silu+0.15·id-Alpha.1.2.3 | 0.85·silu+0.15·id | 3 | 0.85·silu+0.15·id-Alpha.1.2 | 25 | 6.9577 | - | 7.2369 | 0.0456 | 2,033 | 25 |
| 11 | silu-Alpha.1.2 | silu | 2 | silu-Alpha.1 | 25 | 6.9729 | - | 7.1752 | 0.0453 | 2,273 | 25 |
| 12 | 0.85·silu+0.15·id-Alpha.1.2 | 0.85·silu+0.15·id | 2 | silu-Alpha.1 | 23 | 6.9859 | - | 7.2633 | 0.0441 | 2,052 | 23 |
| 13 | relu-Beta.1.2.3 | relu | 3 | relu-Beta.1.2 | 23 | 6.9878 | - | 7.2304 | 0.0459 | 5,289 | 23 |
| 14 | gelu-Gamma.1 | gelu | 1 | gelu-Gamma | 25 | 7.0034 | - | 7.2311 | 0.0435 | 5,157 | 25 |
| 15 | gelu×gelu-Gamma.1 | gelu×gelu | 1 | gelu-Gamma | 25 | 7.0275 | - | 7.2887 | 0.0437 | 4,582 | 25 |
| 16 | 0.74·silu+0.26·id-Alpha.1 | 0.74·silu+0.26·id | 1 | silu-Alpha | 25 | 7.0714 | - | 7.2838 | 0.0443 | 2,120 | 25 |
| 17 | relu-Beta.1 | relu | 1 | relu-Beta | 25 | 7.0752 | - | 7.2994 | 0.0445 | 5,445 | 25 |
| 18 | relu×gelu-Beta.1 | relu×gelu | 1 | relu-Beta | 25 | 7.0821 | - | 7.3464 | 0.0447 | 4,926 | 25 |
| 19 | relu-Eta | relu | 0 | - | 25 | 7.1458 | - | 7.3464 | 0.0421 | 5,449 | 25 |
| 20 | silu-Alpha.1 | silu | 1 | silu-Alpha | 25 | 7.1523 | - | 7.3387 | 0.0436 | 2,235 | 25 |
| 21 | gelu×gelu×silu-Gamma.1.2 | gelu×gelu×silu | 2 | gelu×gelu-Gamma.1 | 25 | 7.1545 | - | 7.4248 | 0.0405 | 2,152 | 25 |
| 22 | sq-Epsilon | sq | 0 | - | 25 | 7.2035 | - | 7.3750 | 0.0419 | 5,328 | 25 |
| 23 | silu-Zeta | silu | 0 | - | 25 | 7.2644 | - | 7.4293 | 0.0416 | 2,257 | 25 |
| 24 | gelu-Gamma | gelu | 0 | - | 25 | 7.2793 | - | 7.4259 | 0.0404 | 5,608 | 25 |
| 25 | relu-Beta | relu | 0 | - | 25 | 7.3206 | - | 7.4268 | 0.0401 | 5,949 | 25 |
| 26 | silu-Alpha | silu | 0 | - | 25 | 7.4119 | - | 7.5217 | 0.0383 | 2,295 | 13 |
| 27 | gelu×gelu+0.08·sq-Gamma.1.2.3 | gelu×gelu+0.08·sq | 3 | gelu×gelu-Gamma.1.2 | 4 | 7.5762 | - | 7.6128 | - | 3,970 | 4 |
Activation Distribution

| Activation | Steps | Share |
|---|---|---|
| relu | 123 | 19% |
| silu | 100 | 15% |
| gelu | 100 | 15% |
| id | 100 | 15% |
| gelu×gelu | 50 | 8% |
| 0.85·silu+0.15·id | 48 | 7% |
| sq | 25 | 4% |
| 0.74·silu+0.26·id | 25 | 4% |
| relu×gelu | 25 | 4% |
| 0.68·relu+0.32·id | 25 | 4% |
| gelu×gelu×silu | 25 | 4% |
| gelu×gelu+0.08·sq | 4 | 1% |
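These counts are consistent with total training steps spent per activation: relu's 123, for example, matches five stints of mostly 25 steps with one cut short at 23. A sketch of deriving the tally from per-stint records (the aggregation rule is an assumption that happens to fit the numbers here):

```python
def activation_distribution(stints):
    """stints: [(activation, steps_trained)] read off the switch log.
    Returns {activation: (total_steps, percent_share)}."""
    totals = {}
    for act, steps in stints:
        totals[act] = totals.get(act, 0) + steps
    grand = sum(totals.values())
    return {a: (s, round(100 * s / grand)) for a, s in totals.items()}

# Four illustrative stints, one cut short at 23 steps:
dist = activation_distribution(
    [("relu", 25), ("relu", 25), ("silu", 25), ("relu", 23)]
)
assert dist["relu"] == (73, 74)   # 73 of 98 steps
```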
Oscillation & Heat Capacity
Activation Evolution Radial
Symbio Config
{
"cusumSensitivity": 4,
"cusumBaselineWindow": 5,
"metricsInterval": 10,
"trackWeightEntropy": true,
"trackEffectiveRank": true,
"trackFreeEnergy": true,
"trackMIProfiles": false,
"trackPopulationMetrics": true,
"freeEnergyBeta": 0.01,
"miNumBins": 30,
"adaptiveBatch": true,
"batchMin": 8,
"batchMax": 64,
"batchStep": 4,
"calmStepsBeforeRestore": 200,
"fitnessAlpha": 1,
"complexityMode": "entropy",
"diversityBonus": 0.1,
"diversityDecay": "cosine",
"searchMode": "composed-activation-search",
"activationPool": [
"gelu",
"relu",
"silu",
"swiglu",
"universal",
"kan_spline"
],
"searchStrategy": "evolutionary",
"populationSize": 8,
"generations": 250,
"selectionStrategy": "topk",
"tournamentK": 3,
"mutationRate": 0.7,
"stepsPerCandidate": 25,
"rankBy": "valLoss",
"perfWeight": 0,
"stabilityWeight": 0,
"writeReport": true,
"writeCandidates": true,
"writeSummary": true,
"basisPool": [
"silu",
"relu",
"gelu",
"identity",
"square"
],
"maxGraphDepth": 4,
"maxGraphNodes": 10
}

Checkpoints (0)
No checkpoints saved
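The `adaptiveBatch` settings in the Symbio config above range over `batchMin` 8 to `batchMax` 64 in `batchStep` 4 increments, with `calmStepsBeforeRestore` 200, and the Batch Size tile currently reads 12 (adaptive). A plausible controller for those knobs; the shrink-on-alert / restore-after-calm policy is an assumption, only the bounds and step size come from the config:

```python
class AdaptiveBatch:
    """Shrink the batch on instability alerts; step back toward the
    baseline after a calm streak. The policy itself is an assumption;
    only the bounds and step size mirror the config values."""

    def __init__(self, base=20, lo=8, hi=64, step=4, calm_steps=200):
        self.base, self.lo, self.hi, self.step = base, lo, hi, step
        self.calm_steps = calm_steps
        self.size, self.calm = base, 0

    def update(self, alert):
        if alert:
            self.calm = 0
            self.size = max(self.lo, self.size - self.step)
        elif self.size < self.base:
            self.calm += 1
            if self.calm >= self.calm_steps:
                self.calm = 0
                self.size = min(self.hi, self.size + self.step)
        return self.size

ab = AdaptiveBatch()
ab.update(alert=True)   # 20 -> 16
ab.update(alert=True)   # 16 -> 12, the tile's current value
assert ab.size == 12
```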
Sample Generations (3)
Sample 1 · 4h ago
Prompt: The
Output
The something thought contwasforERations intelligcould decreferimple people en the s to le n'trequbeforGPheiter couldyouof sembctustill ely sput . They 't with phwas alsoaxbookpresentchestembtheme t as a crebackgerlimwesu
Sample 2 · 4h ago
Prompt: Once upon a time
Output
Once upon a timebre there promptdescriTraves ativfirste. The that was ese they were pter eathot on ction- storFetchastwas slolininstESode ================================otword whiTHE , andINsomeone is s."
afterorgphrolternpiies Febru andt ite datoll
Sample 3 · 4h ago
Prompt: He walked into
Output
He walked into callpurOptionrun turnhaddro.
It anc," ing that lessum thanust ed.
anddata place ponentexact. Nboth .
It ind Cdatect for a atmod. Bator if was the him - illits? artplac whatations ownwrite ed on The ridually
Model Config

{
"vocabSize": 2000,
"blockSize": 256,
"nLayer": 6,
"nEmbd": 288,
"nHead": 6,
"dropout": 0,
"ffnActivation": "swiglu",
"ffnDim": 768
}

Training Config

{
"iters": 50000,
"batchSize": 20,
"lr": 0.0003,
"lrMin": 0,
"warmupIters": 500,
"beta1": 0.9,
"beta2": 0.95,
"eps": 1e-8,
"weightDecay": 0.1,
"gradClip": 5,
"evalInterval": 100,
"evalIters": 10,
"seed": 42,
"backend": "helios",
"tokenizer": "bpe",
"optimizer": "adamw",
"logLevel": "info",
"trace": false,
"gradAccumSteps": 1,
"sampleInterval": 100,
"spikeThreshold": 10,
"syncEvery": 1,
"gcEvery": 0,
"packed": false,
"symbio": true,
"symbioConfig": {
"cusumSensitivity": 4,
"cusumBaselineWindow": 5,
"metricsInterval": 10,
"trackWeightEntropy": true,
"trackEffectiveRank": true,
"trackFreeEnergy": true,
"trackMIProfiles": false,
"trackPopulationMetrics": true,
"freeEnergyBeta": 0.01,
"miNumBins": 30,
"adaptiveBatch": true,
"batchMin": 8,
"batchMax": 64,
"batchStep": 4,
"calmStepsBeforeRestore": 200,
"fitnessAlpha": 1,
"complexityMode": "entropy",
"diversityBonus": 0.1,
"diversityDecay": "cosine",
"searchMode": "composed-activation-search",
"activationPool": [
"gelu",
"relu",
"silu",
"swiglu",
"universal",
"kan_spline"
],
"searchStrategy": "evolutionary",
"populationSize": 8,
"generations": 250,
"selectionStrategy": "topk",
"tournamentK": 3,
"mutationRate": 0.7,
"stepsPerCandidate": 25,
"rankBy": "valLoss",
"perfWeight": 0,
"stabilityWeight": 0,
"writeReport": true,
"writeCandidates": true,
"writeSummary": true,
"basisPool": [
"silu",
"relu",
"gelu",
"identity",
"square"
],
"maxGraphDepth": 4,
"maxGraphNodes": 10
}
}