The H-Bar Model of Knowledge Development
A formal framework that distinguishes parametric depth (δ) from schema coherence (σ) as independently necessary dimensions of agent capability. The model provides a phase-structured account of how both variables evolve during training, introduces the delegation gradient (𝒟*) and intersection activation (Ψ) mechanisms, and derives a two-component decay decomposition separating cognitive decay (λ_c) from frontier obsolescence (λ_f).
The central falsifiable claim is that schema coherence is formally distinct from depth — its absence explains the compositional generalization failures and out-of-distribution (OOD) brittleness that depth-maximizing training regimes cannot account for. Physics-Informed Residual Learning (PIRL) serves as the motivating case study.
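One way to read the two-component decay decomposition is as additive decay rates acting on an agent's effective capability. This is a hedged illustration, not the paper's formulation: the functional form (exponential decay) and the symbol K(t) for retained capability are assumptions introduced here for clarity; only λ_c (cognitive decay) and λ_f (frontier obsolescence) come from the abstract above.

```latex
% Hypothetical sketch: capability K(t) decaying under two independent rates.
% K_0, and the exponential form, are illustrative assumptions.
K(t) = K_0 \, e^{-(\lambda_c + \lambda_f)\, t},
\qquad
\lambda_{\text{total}} = \underbrace{\lambda_c}_{\text{cognitive decay}}
 + \underbrace{\lambda_f}_{\text{frontier obsolescence}}
```

Under this reading, the decomposition is separable: λ_c would persist even with a frozen task frontier, while λ_f would persist even with perfect retention, which is what makes the two components independently measurable.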
**Status details:**
- **Status**: Active writing · formal paper in progress
- **Target venue**: Journal of Artificial Intelligence Research (JAIR)
- **Preprint**: In preparation · arXiv cs.AI + cs.LG
**Links:**
- [Read article](/articles) →
- [View paper entry](/papers) →
**Tags:** H-Bar Model · Schema Coherence · Compositional Generalization · AI Training · Formal Methods · Curriculum Learning