Darwin, But Make It Neural: When Networks Learn to Mutate Themselves
Opening — Why this matters now

Modern AI has become very good at climbing hills, provided the hill stays put and remains differentiable. But as soon as the terrain shifts, gradients stumble. Controllers break. Policies freeze. Re-training becomes ritualistic rather than intelligent. This paper asks a quietly radical question: what if adaptation itself lived inside the network? Not as a scheduler, not as a meta-optimizer bolted on top, but as part of the neural machinery that gets inherited, mutated, and selected. ...