Run: novels_all_20260225161648_5t6x · active · dataset: novels · 7.21M params · 20m 6s elapsed · ~22h 38m remaining
6L / 288D / 6H · helios · bpe · adamw · Created Feb 25, 2026 4:16 PM
Step 708 / 50,000 (1.4%)
- Loss: 7.2347
- Best Loss: 6.9177 (-5.5% from start)
- Val Loss: 7.6543 (best: 7.6543)
- Learning Rate: 3.00e-4
- Throughput: 3,824 tok/s (avg)
- Speed: 1,654 ms/iter (avg)
- Grad Norm: 0.565 (avg: 0.619)
- Tokens processed: 3.61M
- Forward: 128 ms (8% of step)
- Backward: 1,449 ms (88% of step)
- GPU Sync: 20 ms (1% of step)
- GPU Ops: 568 per step
- MFU: 0.4% (model FLOPS utilization)
- Bwd/Fwd ratio: 11.3×
Loss Curve
Symbio semantics: this chart stitches many fresh candidate evaluations onto one global step axis. Loss resets near switches are expected because candidates are re-initialized. Compare local candidate shapes and the global frontier, not a single continuous model trajectory.
Search Trajectory + Frontier
Candidate-local train/val loss segments on a shared step axis, with switch events and global frontier overlays.
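The stitching described above can be sketched in a few lines: each candidate contributes a loss segment starting at its global step, and the global frontier is the running best across all segments. Names and data here are illustrative, not from the run.

```python
# Illustrative sketch: stitch candidate-local loss segments onto one
# global step axis and track the running best-loss frontier.
def global_frontier(segments):
    """segments: ordered list of (start_step, losses), one per candidate."""
    frontier, best = [], float("inf")
    for start, losses in segments:
        for i, loss in enumerate(losses):
            best = min(best, loss)          # the frontier only ever improves
            frontier.append((start + i, best))
    return frontier

# Loss resets at switch boundaries (7.6 after 7.39) are expected:
# each candidate is re-initialized, but the frontier stays monotone.
segs = [(1, [7.5, 7.4, 7.39]), (4, [7.6, 7.35])]
print(global_frontier(segs))
```

This is why the panel asks you to compare local candidate shapes and the global frontier rather than read the trace as one continuous model trajectory.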
Architecture
- Layers: 6
- Embedding: 288
- Heads: 6
- Vocab: 2,000
- Context: 256
- Dropout: 0
- Parameters: 7.21M
Training Config
- Total iters: 50,000
- Batch size: 20
- Max LR: 0.0003
- Optimizer: adamw
- Backend: helios
- Tokenizer: bpe
- Seed: 42
- Weight decay: 0.1
- Grad clip: 5
- Eval interval: 100
Charts: GPU & VRAM · Learning Rate · Grad Norm · Step Time Breakdown · Clip Telemetry
Symbiogenesis (SwiGLU)
- Wt Entropy: 1.77 bits
- Eff. Rank: 20.0
- Free Energy: 7.0108
- Pop Entropy: 3.892 nats
- Complexity: 0.0786
- Fitness: 0.0465
- CUSUM Alerts: 695 of 706 steps
- Batch Size: 12 (adaptive)
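Two of the cards above, weight entropy and effective rank, have standard definitions that can be sketched with NumPy. The dashboard's exact computation is not shown, so the binning here (30 bins, mirroring the config's `miNumBins`) is a guess.

```python
import numpy as np

def weight_entropy_bits(w, bins=30):
    # Shannon entropy of the weight-value histogram, in bits.
    # (Bin count is an assumption; the run's binning isn't shown.)
    hist, _ = np.histogram(w, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def effective_rank(mat):
    # exp(entropy of normalized singular values): roughly how many
    # directions the matrix actually uses, between 1 and min(mat.shape).
    s = np.linalg.svd(mat, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))
```

As a sanity check, an identity matrix has effective rank equal to its size, while a rank-1 matrix has effective rank 1.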
Charts: CUSUM Change-Point Monitor · Weight Entropy · Effective Rank · Free Energy · Fitness Score · Population Entropy · Adaptive Batch Size
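The Adaptive Batch Size panel pairs with the `batchMin`/`batchMax`/`batchStep`/`calmStepsBeforeRestore` knobs in the Symbio Config. The actual policy is not shown; a plausible sketch is: shrink on a change-point alert, then creep back toward the base size after a long-enough calm streak.

```python
# Hypothetical controller wired to the config's batchMin / batchMax /
# batchStep / calmStepsBeforeRestore knobs. The real policy is not shown;
# assumed here: shrink on an alert, restore after a calm streak.
class AdaptiveBatch:
    def __init__(self, base=20, bmin=8, bmax=64, step=4, calm=200):
        self.base, self.size = base, base
        self.bmin, self.bmax, self.step, self.calm = bmin, bmax, step, calm
        self.calm_steps = 0

    def update(self, alert):
        if alert:
            # Change-point alert: back off toward the minimum batch size.
            self.size = max(self.bmin, self.size - self.step)
            self.calm_steps = 0
        else:
            # Calm step: after a quiet streak, creep back toward base.
            self.calm_steps += 1
            if self.calm_steps >= self.calm and self.size < self.base:
                self.size = min(self.bmax, self.size + self.step)
                self.calm_steps = 0
        return self.size
```

With the run's base batch of 20, two consecutive alerts would land at 12, which would be consistent with the "Batch Size: 12 (adaptive)" card if this is indeed the mechanism.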
Phase Change / Gelation
Current
Transitioning
Stability
0%
Phase Changes
15
Regime Shifts
12
Training dynamics are shifting. The model may be entering a new loss basin or the learning rate is hitting a critical threshold. This often happens before a breakthrough or a plateau.
Phase Timeline
Step 1Step 703
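The regime-shift counts come from a CUSUM change-point monitor (`cusumSensitivity: 4`, `cusumBaselineWindow: 5` in the config). A minimal two-sided CUSUM sketch follows; the rolling-baseline normalization and the reset-on-alert behavior are assumptions, not the dashboard's documented algorithm.

```python
from collections import deque

def cusum_alerts(values, sensitivity=4.0, baseline_window=5):
    """Two-sided CUSUM sketch: flag steps where the cumulative deviation
    from a rolling baseline exceeds `sensitivity` (assumed semantics)."""
    base = deque(maxlen=baseline_window)
    hi = lo = 0.0
    alerts = []
    for i, v in enumerate(values):
        if len(base) == baseline_window:
            mean = sum(base) / baseline_window
            std = (sum((x - mean) ** 2 for x in base) / baseline_window) ** 0.5 or 1e-8
            hi = max(0.0, hi + (v - mean) / std - 0.5)   # upward drift
            lo = max(0.0, lo + (mean - v) / std - 0.5)   # downward drift
            if hi > sensitivity or lo > sensitivity:
                alerts.append(i)
                hi = lo = 0.0                            # reset after alert
        base.append(v)
    return alerts

# A level shift in a flat signal fires exactly one alert at the jump.
print(cusum_alerts([1.0] * 10 + [5.0] * 10))
```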
Loss Oscillation (Harmonic Analysis)
Evolutionary Search
- Generations: 4
- Candidates: 29
- Activations: 15
- Best Loss: 6.9177
- Total Steps: 706
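The candidate activations in the table below are compositions over the config's basis pool (`silu`, `relu`, `gelu`, `identity`, `square`): weighted sums like `relu+0.3·silu` and gated products like `(0.62·gelu+0.38·id)×silu`. A sketch of how such expressions can be represented and mutated; the exact `add_term` semantics (coefficient range, basis choice) are assumptions.

```python
import math, random

# Basis pool from the config: silu, relu, gelu, identity, square.
BASIS = {
    "silu": lambda x: x / (1 + math.exp(-x)),
    "relu": lambda x: max(0.0, x),
    "gelu": lambda x: 0.5 * x * (1 + math.erf(x / math.sqrt(2))),
    "id":   lambda x: x,
    "sq":   lambda x: x * x,
}

def make_sum(terms):
    """terms: [(coeff, basis_name), ...] -> callable like 'relu+0.3·silu'."""
    return lambda x: sum(c * BASIS[n](x) for c, n in terms)

def add_term(terms, rng):
    # 'add_term' mutation (assumed semantics): append one small random term.
    return terms + [(round(rng.uniform(0.05, 0.35), 2), rng.choice(list(BASIS)))]

f = make_sum([(1.0, "relu"), (0.3, "silu")])  # the run's relu+0.3·silu
```

Other mutations in the log fit the same shape: `perturb_scale` jitters an existing coefficient, `inject_gate` multiplies by a basis function, and `clone` copies the expression unchanged.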
| # | Candidate | Activation | Gen | Loss | Fitness | Steps | Mutation |
|---|---|---|---|---|---|---|---|
| 1 | id-Delta.1.2 | id | 2 | 6.9177 | 0.0448 | 25 | clone |
| 2 | relu+0.3·silu-Beta.1.2 | relu+0.3·silu | 2 | 6.9381 | 0.0443 | 25 | clone |
| 3 | 0.62·gelu+0.38·id+0.12·gelu-Gamma.1.2 | 0.62·gelu+0.38·id+0.12·gelu | 2 | 6.9444 | 0.0448 | 25 | add_term |
| 4 | (0.62·gelu+0.38·id)×silu-Gamma.1.2 | (0.62·gelu+0.38·id)×silu | 2 | 6.9602 | 0.0441 | 25 | inject_gate |
| 5 | silu+0.09·silu-Alpha.1.2 | silu+0.09·silu | 2 | 6.9624 | 0.0453 | 25 | add_term |
| 6 | silu+0.21·relu+0.22·relu-Alpha.1.2.3 | silu+0.21·relu+0.22·relu | 3 | 6.9664 | 0.0457 | 23 | add_term |
| 7 | relu+0.3·silu-Beta.1.2 | relu+0.3·silu | 2 | 6.9667 | 0.0451 | 25 | perturb_scale |
| 8 | id+0.29·id-Delta.1 | id+0.29·id | 1 | 6.9762 | 0.0451 | 25 | add_term |
| 9 | id-Delta.1.2 | id | 2 | 6.9775 | 0.0452 | 25 | perturb_scale |
| 10 | silu+0.21·relu-Alpha.1.2 | silu+0.21·relu | 2 | 6.9841 | 0.0439 | 25 | add_term |
| 11 | id-Delta.1.2.3 | id | 3 | 6.9877 | 0.0465 | 25 | clone |
| 12 | gelu×relu-Gamma.1 | gelu×relu | 1 | 6.9941 | 0.0435 | 25 | inject_gate |
| 13 | 0.62·gelu+0.38·id+0.12·gelu-Gamma.1.2.3 | 0.62·gelu+0.38·id+0.12·gelu | 3 | 7.0246 | 0.0457 | 25 | clone |
| 14 | 0.62·gelu+0.38·id-Gamma.1 | 0.62·gelu+0.38·id | 1 | 7.0413 | 0.0435 | 25 | inject_residual |
| 15 | id-Delta.1 | id | 1 | 7.0433 | 0.0462 | 25 | clone |
| 16 | relu+0.3·silu-Beta.1.2.3 | relu+0.3·silu | 3 | 7.0819 | 0.0446 | 25 | clone |
| 17 | relu-Beta.1 | relu | 1 | 7.0826 | 0.0447 | 25 | clone |
| 18 | silu×silu-Alpha.1 | silu×silu | 1 | 7.0830 | 0.0434 | 25 | inject_gate |
| 19 | gelu-Theta | gelu | 0 | 7.1012 | 0.0444 | 25 | origin |
| 20 | relu-Eta | relu | 0 | 7.1226 | 0.0425 | 25 | origin |
Showing top 20 of 29 candidates
Generation Summary

| Gen | Candidates | Best Loss |
|---|---|---|
| 0 | 8 | 7.1012 |
| 1 | 8 | 6.9762 |
| 2 | 8 | 6.9177 |
| 3 | 5 | 6.9664 |
Fitness Progression
Architecture Diversity
Convergence vs Diversity (Tug-of-War)
Current Mode: Exploration Dominant
- Diversity Pressure: 82%
- Convergence Momentum: 0%
- Convergence Progress: 100%
Phase Portrait: Diversity Pressure vs Convergence Momentum
- Low diversity / high momentum: lock-in convergence
- High diversity / high momentum: productive exploration
- Low diversity / low momentum: stalled collapse
- High diversity / low momentum: diversity stalling convergence
Tug-of-War Trace (Time Domain)
Positive tension means recent frontier improvement is outpacing diversity pressure (search is converging). Negative tension means exploration pressure is dominating recent convergence momentum (search is broadening or getting “stumped”).
- Strongest Convergence: step 8 (tension 0.500)
- Strongest Diversity Push: step 240 (tension -0.920)
- Best Frontier: 6.9177 (progress 100%)
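Given the sign convention described above, the tension signal can be read as convergence momentum minus diversity pressure. The normalization to [0, 1] and the subtraction itself are assumptions; the dashboard's exact formula is not shown.

```python
# Assumed normalization: momentum and diversity pressure both in [0, 1].
def tension(convergence_momentum, diversity_pressure):
    """Positive: frontier improvement outpaces exploration (converging).
    Negative: exploration pressure dominates (broadening)."""
    return convergence_momentum - diversity_pressure

# With momentum at 0 and diversity pressure near 0.92, the trace would
# bottom out around -0.92, in line with the strongest push recorded above.
print(tension(0.0, 0.92))
```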
Evolutionary Lineage Tree
Activation Flow (Sankey)
Activation Switch Log
| Step | From | To | Gen | Prev Steps | Best Loss | Final Loss | Fitness |
|---|---|---|---|---|---|---|---|
| 1 | - | silu | 0 | - | - | - | - |
| 26 | silu | relu | 0 | 25 | 7.3898 | 7.3898 | 0.0384 |
| 51 | relu | gelu | 0 | 25 | 7.3206 | 7.3206 | 0.0401 |
| 76 | gelu | id | 0 | 25 | 7.2801 | 7.2855 | 0.0404 |
| 101 | id | sq | 0 | 25 | 7.3221 | 7.3234 | 0.0410 |
| 126 | sq | silu | 0 | 25 | 7.2018 | 7.2018 | 0.0419 |
| 151 | silu | relu | 0 | 25 | 7.2637 | 7.2641 | 0.0418 |
| 176 | relu | gelu | 0 | 25 | 7.1226 | 7.1380 | 0.0425 |
| 201 | gelu | silu | 1 | 25 | 7.1012 | 7.1012 | 0.0444 |
| 226 | silu | relu+0.3·silu | 1 | 25 | 7.1814 | 7.1957 | 0.0425 |
| 251 | relu+0.3·silu | 0.62·gelu+0.38·id | 1 | 25 | 7.1303 | 7.1303 | 0.0436 |
| 276 | 0.62·gelu+0.38·id | id | 1 | 25 | 7.0413 | 7.0413 | 0.0435 |
| 301 | id | silu×silu | 1 | 25 | 7.0433 | 7.0433 | 0.0462 |
| 326 | silu×silu | relu | 1 | 25 | 7.0830 | 7.0830 | 0.0434 |
| 351 | relu | gelu×relu | 1 | 25 | 7.0826 | 7.0826 | 0.0447 |
| 376 | gelu×relu | id+0.29·id | 1 | 25 | 6.9941 | 6.9941 | 0.0435 |
| 401 | id+0.29·id | silu+0.21·relu | 2 | 25 | 6.9762 | 6.9870 | 0.0451 |
| 426 | silu+0.21·relu | relu+0.3·silu | 2 | 25 | 6.9841 | 6.9841 | 0.0439 |
| 451 | relu+0.3·silu | 0.62·gelu+0.38·id+0.12·gelu | 2 | 25 | 6.9667 | 6.9667 | 0.0451 |
| 476 | 0.62·gelu+0.38·id+0.12·gelu | id | 2 | 25 | 6.9444 | 6.9444 | 0.0448 |
| 501 | id | silu+0.09·silu | 2 | 25 | 6.9177 | 6.9971 | 0.0448 |
| 526 | silu+0.09·silu | relu+0.3·silu | 2 | 25 | 6.9624 | 6.9947 | 0.0453 |
| 551 | relu+0.3·silu | (0.62·gelu+0.38·id)×silu | 2 | 25 | 6.9381 | 6.9819 | 0.0443 |
| 576 | (0.62·gelu+0.38·id)×silu | id | 2 | 25 | 6.9602 | 6.9602 | 0.0441 |
| 601 | id | silu+0.21·relu+0.22·relu | 3 | 25 | 6.9775 | 6.9775 | 0.0452 |
| 626 | silu+0.21·relu+0.22·relu | relu+0.3·silu | 3 | 23 | 6.9664 | 6.9664 | 0.0457 |
| 651 | relu+0.3·silu | 0.62·gelu+0.38·id+0.12·gelu | 3 | 25 | 7.0819 | 7.0819 | 0.0446 |
| 676 | 0.62·gelu+0.38·id+0.12·gelu | id | 3 | 25 | 7.0246 | 7.0246 | 0.0457 |
| 701 | id | silu+0.21·relu | 3 | 25 | 6.9877 | 6.9931 | 0.0465 |
Search Candidates
| # | Name | Activation | Gen | Parent | Steps | Best Loss | Best Val | Avg Loss | Fitness | Avg tok/s | Alerts |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | id-Delta.1.2.3 | id | 3 | id-Delta.1.2 | 25 | 6.9877 | 7.6543 | 7.2443 | 0.0465 | 5,140 | 25 |
| 2 | id-Delta.1.2 | id | 2 | id-Delta.1 | 25 | 6.9177 | 7.6549 | 7.2424 | 0.0448 | 5,672 | 25 |
| 3 | id-Delta.1 | id | 1 | id-Delta | 25 | 7.0433 | 7.6584 | 7.2976 | 0.0462 | 5,687 | 25 |
| 4 | id-Delta.1.2 | id | 2 | id-Delta.1 | 25 | 6.9775 | 7.6610 | 7.2356 | 0.0452 | 5,392 | 25 |
| 5 | gelu-Theta | gelu | 0 | - | 25 | 7.1012 | 7.6614 | 7.3351 | 0.0444 | 5,682 | 25 |
| 6 | id+0.29·id-Delta.1 | id+0.29·id | 1 | id-Delta | 25 | 6.9762 | 7.6656 | 7.2451 | 0.0451 | 3,870 | 25 |
| 7 | id-Delta | id | 0 | - | 25 | 7.3221 | 7.6669 | 7.4581 | 0.0410 | 7,523 | 25 |
| 8 | relu+0.3·silu-Beta.1.2 | relu+0.3·silu | 2 | relu+0.3·silu-Beta.1 | 25 | 6.9381 | - | 7.1636 | 0.0443 | 2,015 | 25 |
| 9 | 0.62·gelu+0.38·id+0.12·gelu-Gamma.1.2 | 0.62·gelu+0.38·id+0.12·gelu | 2 | 0.62·gelu+0.38·id-Gamma.1 | 25 | 6.9444 | - | 7.2041 | 0.0448 | 3,455 | 25 |
| 10 | (0.62·gelu+0.38·id)×silu-Gamma.1.2 | (0.62·gelu+0.38·id)×silu | 2 | 0.62·gelu+0.38·id-Gamma.1 | 25 | 6.9602 | - | 7.2306 | 0.0441 | 1,888 | 25 |
| 11 | silu+0.09·silu-Alpha.1.2 | silu+0.09·silu | 2 | silu-Alpha.1 | 25 | 6.9624 | - | 7.1748 | 0.0453 | 1,438 | 25 |
| 12 | silu+0.21·relu+0.22·relu-Alpha.1.2.3 | silu+0.21·relu+0.22·relu | 3 | silu+0.21·relu-Alpha.1.2 | 23 | 6.9664 | - | 7.1973 | 0.0457 | 1,858 | 23 |
| 13 | relu+0.3·silu-Beta.1.2 | relu+0.3·silu | 2 | relu+0.3·silu-Beta.1 | 25 | 6.9667 | - | 7.2040 | 0.0451 | 2,013 | 25 |
| 14 | silu+0.21·relu-Alpha.1.2 | silu+0.21·relu | 2 | silu-Alpha.1 | 25 | 6.9841 | - | 7.2685 | 0.0439 | 2,079 | 25 |
| 15 | gelu×relu-Gamma.1 | gelu×relu | 1 | gelu-Gamma | 25 | 6.9941 | - | 7.2308 | 0.0435 | 4,738 | 25 |
| 16 | 0.62·gelu+0.38·id+0.12·gelu-Gamma.1.2.3 | 0.62·gelu+0.38·id+0.12·gelu | 3 | 0.62·gelu+0.38·id+0.12·gelu-Gamma.1.2 | 25 | 7.0246 | - | 7.2612 | 0.0457 | 3,227 | 25 |
| 17 | 0.62·gelu+0.38·id-Gamma.1 | 0.62·gelu+0.38·id | 1 | gelu-Gamma | 25 | 7.0413 | - | 7.3119 | 0.0435 | 3,935 | 25 |
| 18 | relu+0.3·silu-Beta.1.2.3 | relu+0.3·silu | 3 | relu+0.3·silu-Beta.1.2 | 25 | 7.0819 | - | 7.3176 | 0.0446 | 2,053 | 25 |
| 19 | relu-Beta.1 | relu | 1 | relu-Beta | 25 | 7.0826 | - | 7.3348 | 0.0447 | 5,060 | 25 |
| 20 | silu×silu-Alpha.1 | silu×silu | 1 | silu-Alpha | 25 | 7.0830 | - | 7.3051 | 0.0434 | 1,472 | 25 |
| 21 | relu-Eta | relu | 0 | - | 25 | 7.1226 | - | 7.3186 | 0.0425 | 5,253 | 25 |
| 22 | relu+0.3·silu-Beta.1 | relu+0.3·silu | 1 | relu-Beta | 25 | 7.1303 | - | 7.3518 | 0.0436 | 2,100 | 25 |
| 23 | silu-Alpha.1 | silu | 1 | silu-Alpha | 25 | 7.1814 | - | 7.3711 | 0.0425 | 2,276 | 25 |
| 24 | sq-Epsilon | sq | 0 | - | 25 | 7.2018 | - | 7.3731 | 0.0419 | 6,898 | 25 |
| 25 | silu+0.21·relu-Alpha.1.2.3 | silu+0.21·relu | 3 | silu+0.21·relu-Alpha.1.2 | 8 | 7.2347 | - | 7.3951 | - | 1,975 | 8 |
| 26 | silu-Zeta | silu | 0 | - | 25 | 7.2637 | - | 7.4198 | 0.0418 | 2,933 | 25 |
| 27 | gelu-Gamma | gelu | 0 | - | 25 | 7.2801 | - | 7.4227 | 0.0404 | 6,855 | 25 |
| 28 | relu-Beta | relu | 0 | - | 25 | 7.3206 | - | 7.4269 | 0.0401 | 7,046 | 25 |
| 29 | silu-Alpha | silu | 0 | - | 25 | 7.3898 | - | 7.5090 | 0.0384 | 3,003 | 14 |
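The fitness column clusters near 0.04-0.05 while the config exposes `fitnessAlpha: 1`, `complexityMode: "entropy"`, and `diversityBonus: 0.1`. The exact scoring formula is not shown anywhere in this report; one plausible shape, purely illustrative, is a loss-reciprocal reward with a complexity penalty and a diversity bonus:

```python
def fitness(avg_loss, complexity, diversity, alpha=1.0, bonus=0.1):
    """Purely illustrative scoring: reward low loss, penalize expression
    complexity (entropy mode), reward population diversity. Knob names
    mirror fitnessAlpha / diversityBonus; the dashboard's actual formula
    is not shown and may differ."""
    return 1.0 / avg_loss - alpha * complexity + bonus * diversity
```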
Activation Distribution

| Activation | Steps | Share |
|---|---|---|
| id | 125 | 18% |
| relu+0.3·silu | 100 | 14% |
| silu | 75 | 11% |
| relu | 75 | 11% |
| gelu | 50 | 7% |
| 0.62·gelu+0.38·id+0.12·gelu | 50 | 7% |
| silu+0.21·relu | 33 | 5% |
| sq | 25 | 4% |
| 0.62·gelu+0.38·id | 25 | 4% |
| silu×silu | 25 | 4% |
| gelu×relu | 25 | 4% |
| id+0.29·id | 25 | 4% |
| silu+0.09·silu | 25 | 4% |
| (0.62·gelu+0.38·id)×silu | 25 | 4% |
| silu+0.21·relu+0.22·relu | 23 | 3% |
Oscillation & Heat Capacity
Activation Evolution Radial
Symbio Config
{
"cusumSensitivity": 4,
"cusumBaselineWindow": 5,
"metricsInterval": 10,
"trackWeightEntropy": true,
"trackEffectiveRank": true,
"trackFreeEnergy": true,
"trackMIProfiles": false,
"trackPopulationMetrics": true,
"freeEnergyBeta": 0.01,
"miNumBins": 30,
"adaptiveBatch": true,
"batchMin": 8,
"batchMax": 64,
"batchStep": 4,
"calmStepsBeforeRestore": 200,
"fitnessAlpha": 1,
"complexityMode": "entropy",
"diversityBonus": 0.1,
"diversityDecay": "cosine",
"searchMode": "composed-activation-search",
"activationPool": [
"gelu",
"relu",
"silu",
"swiglu",
"universal",
"kan_spline"
],
"searchStrategy": "evolutionary",
"populationSize": 8,
"generations": 250,
"selectionStrategy": "topk",
"tournamentK": 3,
"mutationRate": 0.7,
"stepsPerCandidate": 25,
"rankBy": "valLoss",
"perfWeight": 0,
"stabilityWeight": 0,
"writeReport": true,
"writeCandidates": true,
"writeSummary": true,
"basisPool": [
"silu",
"relu",
"gelu",
"identity",
"square"
],
"maxGraphDepth": 4,
"maxGraphNodes": 10
}

Checkpoints (0)
No checkpoints saved
Sample Generations (3)
Sample 1 · no checkpoint · 7h ago
Prompt
The
Output
The something thought contwasforERations intelligcould decreferimple people en the s to le n'trequbeforGPheiter couldyouof sembctustill ely sput . They CH
abclaude parresumcationed in fatheme t as a betweenplac. And containight ight
Sample 2 · no checkpoint · 7h ago
Prompt
Once upon a time
Output
Once upon a timebre there promptdescriTraves ativfirste. The that was ese they were pter eathot on ction- storFetchastwas slolininstESode ================================otharnessmy, vers she end ken theint Lisa had managch particularension that would gupetos. s were CHAPTH
Sample 3 · no checkpoint · 7h ago
Prompt
He walked into
Output
He walked into callpurOptionrun turnhaddro.
It anc," ing that lessum thanust ed.
anddata place ponentexact. Nboth .
It ind Cdatptionfindponweeous because. The thwhen think fromday impl someconversationeven thering. sid cre==gerdict
{
"vocabSize": 2000,
"blockSize": 256,
"nLayer": 6,
"nEmbd": 288,
"nHead": 6,
"dropout": 0,
"ffnActivation": "swiglu",
"ffnDim": 768
}

{
"iters": 50000,
"batchSize": 20,
"lr": 0.0003,
"lrMin": 0,
"warmupIters": 500,
"beta1": 0.9,
"beta2": 0.95,
"eps": 1e-8,
"weightDecay": 0.1,
"gradClip": 5,
"evalInterval": 100,
"evalIters": 10,
"seed": 42,
"backend": "helios",
"tokenizer": "bpe",
"optimizer": "adamw",
"logLevel": "info",
"trace": false,
"gradAccumSteps": 1,
"sampleInterval": 100,
"spikeThreshold": 10,
"syncEvery": 1,
"gcEvery": 0,
"packed": false,
"symbio": true,
"symbioConfig": {
"cusumSensitivity": 4,
"cusumBaselineWindow": 5,
"metricsInterval": 10,
"trackWeightEntropy": true,
"trackEffectiveRank": true,
"trackFreeEnergy": true,
"trackMIProfiles": false,
"trackPopulationMetrics": true,
"freeEnergyBeta": 0.01,
"miNumBins": 30,
"adaptiveBatch": true,
"batchMin": 8,
"batchMax": 64,
"batchStep": 4,
"calmStepsBeforeRestore": 200,
"fitnessAlpha": 1,
"complexityMode": "entropy",
"diversityBonus": 0.1,
"diversityDecay": "cosine",
"searchMode": "composed-activation-search",
"activationPool": [
"gelu",
"relu",
"silu",
"swiglu",
"universal",
"kan_spline"
],
"searchStrategy": "evolutionary",
"populationSize": 8,
"generations": 250,
"selectionStrategy": "topk",
"tournamentK": 3,
"mutationRate": 0.7,
"stepsPerCandidate": 25,
"rankBy": "valLoss",
"perfWeight": 0,
"stabilityWeight": 0,
"writeReport": true,
"writeCandidates": true,
"writeSummary": true,
"basisPool": [
"silu",
"relu",
"gelu",
"identity",
"square"
],
"maxGraphDepth": 4,
"maxGraphNodes": 10
}
}