Bayesian inference preserves the support of the prior. Any process that changes the space of admissible hypotheses therefore lies outside the Bayesian framework. The same is true of all fixed-support inference. Bayesian methods are just the canonical example.
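To make this concrete, here is a minimal sketch (the hypotheses and numbers are invented for illustration): Bayes' rule multiplies the prior by the likelihood, so a hypothesis that starts at zero stays at zero no matter how strongly the evidence favors it.

```python
import numpy as np

# Three hypotheses; the third starts with zero prior mass.
prior = np.array([0.6, 0.4, 0.0])

# Likelihood of some observed evidence under each hypothesis.
# The evidence overwhelmingly favors h2, but that cannot matter.
likelihood = np.array([0.1, 0.2, 0.99])

# Bayes' rule: posterior is proportional to likelihood times prior.
posterior = likelihood * prior
posterior /= posterior.sum()

print(posterior)  # [0.4286 0.5714 0.] -- h2 stays at exactly zero
```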
The traditional fix for this issue is to sidestep the question of admissibility by choosing a starting prior with support over all computable hypotheses (i.e., the Solomonoff prior) and narrowing it down from there.
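(For concreteness, here is the standard formulation of the Solomonoff prior for a universal prefix machine U; the notation is mine, not from the original thread.)

```latex
% The universal prior weights a string x by the total probability that
% a universal prefix machine U, fed uniformly random bits, outputs
% something that begins with x:
M(x) = \sum_{p \,:\, U(p) = x*} 2^{-|p|}
```

Every computable hypothesis corresponds to some program p, and so receives weight at least 2^{-|p|}: nothing computable is assigned measure zero.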
The problem with this approach is that the Solomonoff prior is uncomputable. Any realizable approximation must restrict itself to a finite hypothesis space, at which point the original problem returns.
Hierarchical models enable Bayesian model reduction and similar techniques, but the top level of the hierarchy still has some fixed support it cannot go beyond.
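A toy version of that limit (the family weights and numbers are invented for illustration): a hyperprior can move mass between whole model families, but the union of their supports is baked in when the hierarchy is built.

```python
import numpy as np

# Two model families under a hyperprior: a two-level hierarchy.
# Both families are distributions over the same fixed hypothesis set.
hypotheses = ["h0", "h1", "h2"]        # the top-level support, fixed forever
family_a = np.array([0.7, 0.3, 0.0])   # family A happens to ignore h2
family_b = np.array([0.2, 0.5, 0.3])
hyperprior = np.array([0.5, 0.5])      # belief over families, not hypotheses

# Bayesian model reduction can reweight or prune the families, but the
# effective prior is always a mixture over `hypotheses`; a hypothesis
# outside that list can never acquire mass.
effective_prior = hyperprior[0] * family_a + hyperprior[1] * family_b
print(dict(zip(hypotheses, effective_prior)))
# {'h0': 0.45, 'h1': 0.4, 'h2': 0.15}
```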
So if learning that expands the support cannot be fully characterized by any fixed-support method, Bayesian inference included, then what kind of inference is it?
If a perturbation is outside the support of the process, it can't be processed as evidence for updating. Instead, learning must occur when the perturbation directly and causally restructures the process itself.
Support-expanding variance arises when the process allows the environment to perturb it along dimensions to which it currently assigns measure zero. This is learning via the natural gradient: the gradient of the substrate rather than of the model.
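Here is a sketch of that distinction in code (the class and function names are mine, a toy sketch rather than a real implementation): a Bayesian update fails outright on an out-of-support observation, and the learning that follows is a rewrite of the learner's state, not an application of Bayes' rule.

```python
import numpy as np

class FixedSupportLearner:
    """Ordinary Bayesian updating over a fixed hypothesis set."""
    def __init__(self, hypotheses, prior):
        self.hypotheses = list(hypotheses)
        self.belief = np.asarray(prior, dtype=float)

    def update(self, likelihoods):
        evidence = np.array([likelihoods.get(h, 0.0) for h in self.hypotheses])
        total = float(self.belief @ evidence)
        if total == 0.0:
            # The observation has measure zero under the current support:
            # it cannot be processed as evidence at all.
            raise ValueError("observation outside support")
        self.belief = self.belief * evidence / total

def restructure(learner, new_hypothesis, seed_mass=0.05):
    """An environment-driven perturbation of the substrate: rewrite the
    learner's state so a previously inadmissible hypothesis exists at all.
    No likelihood is computed -- this is not a Bayesian update."""
    learner.hypotheses.append(new_hypothesis)
    learner.belief = np.append(learner.belief * (1 - seed_mass), seed_mass)

learner = FixedSupportLearner(["h0", "h1"], [0.5, 0.5])
try:
    learner.update({"h2": 1.0})   # evidence only h2 can explain
except ValueError:
    restructure(learner, "h2")    # the learning happens here, causally
    learner.update({"h2": 1.0})   # now ordinary updating works again

print(dict(zip(learner.hypotheses, learner.belief)))
# {'h0': 0.0, 'h1': 0.0, 'h2': 1.0}
```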
An agent's beliefs are necessarily either under its own control (what the GDI paper describes as agency) or under the environment's (what the GDI paper describes as plasticity).
Without agency, a process cannot act on itself and the world in order to maintain its own persistence. Without plasticity, a model cannot discover the unknown angles in the cracks of its beliefs, the hidden dimensions of reality, and it will die when the world changes.
(Thanks to Claude Opus for helping me workshop the language on several of these; it's much tighter than the original thread!)