This paper shows that the Fisher-information geometry underlying divergence minimization across coupled systems obeys an "information-curvature conservation law": for any collection of informational manifolds related by mutual-information-preserving maps, the sum of their Ricci curvatures equals the Laplacian of the total stability functional defined in *Information-Theoretic Stability as Reward Function* (emsenn, 2025). This result establishes a general constraint linking local learning dynamics to invariants of informational geometry.
# 1. Introduction
Information geometry relates the curvature of probability manifolds to statistical inference (Amari, 2016). Systems that minimize divergence between successive distributions trace geodesics under the Fisher metric. In *Information-Theoretic Stability as Reward Function*, I defined a "stability functional" whose maximization describes local divergence minimization:
$$
R_s(t) = -\frac{1}{\delta t}\, D_{\mathrm{KL}}(p_{t+\delta}\,||\,p_t).
$$
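To make the definition concrete, here is a minimal numerical sketch of $R_s(t)$ for discrete distributions, using a finite-difference quotient in place of the limit; the helper names (`kl_divergence`, `stability`) are illustrative, not part of the formalism.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for discrete distributions, in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def stability(p_next, p_now, dt):
    """Finite-difference stability functional R_s(t) = -(1/dt) D_KL(p_{t+dt} || p_t)."""
    return -kl_divergence(p_next, p_now) / dt

# A nearly stationary system has R_s close to zero (maximal stability);
# the faster the distribution drifts, the more negative R_s becomes.
p_t  = np.array([0.50, 0.30, 0.20])
p_t1 = np.array([0.49, 0.31, 0.20])
print(stability(p_t1, p_t, dt=1.0))  # small negative value
```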
In this paper, I show that when multiple such systems are coupled through mutual information, the Ricci curvatures of their respective Fisher manifolds satisfy a conservation law linking geometry and stability.
# 2. Preliminaries
## 2.1 Informational Manifolds
Let each system $i$ be represented by a Riemannian manifold $(\mathcal{P}_i, g^{(i)})$, where $g^{(i)}$ is the Fisher-Rao metric and each density $p_i(x,t)$ is smooth with finite entropy $H_i(t)$:
$$
g^{(i)}_{ab} = \mathbb{E}_{p_i}\!\left[\partial_a \log p_i \, \partial_b \log p_i\right].
$$
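The expectation defining $g^{(i)}_{ab}$ can be estimated by Monte Carlo. A sketch for a one-parameter Gaussian location family, where the closed form $g = 1/\sigma^2$ is known and can be checked against the estimate (the function name `fisher_metric_mc` is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_metric_mc(theta, sigma, n=200_000):
    """Monte Carlo estimate of g(theta) = E_p[(d log p / d theta)^2]
    for the Gaussian location family p(x) = N(theta, sigma^2)."""
    x = rng.normal(theta, sigma, size=n)
    score = (x - theta) / sigma**2   # d/dtheta log p(x; theta)
    return float(np.mean(score**2))

# Closed form for this family: g = 1 / sigma^2 = 0.25.
print(fisher_metric_mc(theta=0.0, sigma=2.0))
```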
## 2.2 Coupling Maps
Two systems $i, j$ are informationally coupled if there exists a smooth map $\Phi_{ij} : (\mathcal{P}_i, g^{(i)}) \to (\mathcal{P}_j, g^{(j)})$ satisfying $I(i;j) = I(\Phi_{ij}(p_i); p_j)$, up to a small curvature correction $\kappa_{ij}$.
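The coupling condition is easiest to see in the special case $\kappa_{ij} = 0$ with $\Phi_{ij}$ a bijective relabeling of outcomes, under which mutual information is exactly invariant. A small sketch with a discrete joint distribution (the function name `mutual_information` is illustrative):

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in nats from a discrete joint distribution (2-D array)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

joint = np.array([[0.3, 0.1],
                  [0.1, 0.5]])

# A bijective relabeling of X's outcomes (a trivial coupling map)
# leaves the mutual information unchanged.
permuted = joint[[1, 0], :]
print(mutual_information(joint), mutual_information(permuted))  # equal
```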
# 3. Curvature-Stability Relation
## 3.1 Local Relation
For each manifold, the Ricci curvature satisfies $$
\operatorname{Ric}(g^{(i)})_{ab}
= -\,\nabla_a\nabla_b \log p_i(x,t)
+ \mathcal{O}(\partial^2 D_{\mathrm{KL}}).
$$
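As a sanity check of the leading term (a worked special case, not part of the general argument), consider a Gaussian density $p(x) = \mathcal{N}(\mu, \Sigma)$, for which
$$
\log p(x) = -\tfrac{1}{2}(x-\mu)^\top \Sigma^{-1}(x-\mu) - \tfrac{1}{2}\log\det(2\pi\Sigma),
$$
so that
$$
-\nabla_a\nabla_b \log p(x) = (\Sigma^{-1})_{ab},
$$
a constant positive-definite tensor; for a stationary density the divergence $D_{\mathrm{KL}}(p_{t+\delta}\,||\,p_t)$ vanishes identically, and the correction term with it.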
## 3.2 Conservation Lemma
Let the total stability functional of $N$ coupled systems be $$
R_s^{\mathrm{tot}} = \sum_{i=1}^N R_s^{(i)}.
$$ Assuming bounded entropy and differentiable couplings $\Phi_{ij}$, the following holds: $$
\boxed{
\sum_{i=1}^N \operatorname{Ric}(g^{(i)}) = \nabla^2 R_s^{\mathrm{tot}}.
}
$$ **Proof sketch.** Starting from the Bianchi identity $\nabla^\mu G_{\mu\nu}=0$ on each manifold and substituting the Fisher metric's expression for the information potential $\psi_i = -\log p_i$, we obtain $$
\nabla^2 \psi_i = \operatorname{Tr}\operatorname{Ric}(g^{(i)}).
$$ Since $R_s^{(i)} = -\partial_t D_{\mathrm{KL}}(p_{t+\delta}||p_t)$ depends on $\nabla^2\psi_i$ through $\partial_t g^{(i)}_{ab}$, summing over coupled manifolds and using $\sum_i\nabla^\mu G_{\mu\nu}^{(i)}=0$ yields the stated conservation law.
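The intermediate identity $\nabla^2 \psi_i = \operatorname{Tr}\operatorname{Ric}(g^{(i)})$ can be checked numerically on its left-hand side in the Gaussian case, where $\psi = -\log p$ is quadratic and $\nabla^2\psi = \operatorname{Tr}(\Sigma^{-1})$ exactly; equating this with the curvature trace then relies on the local relation of Section 3.1. A finite-difference sketch (helper names are illustrative):

```python
import numpy as np

def psi(x, cov_inv):
    """Information potential psi = -log p for a zero-mean Gaussian,
    dropping the x-independent normalization constant."""
    return 0.5 * x @ cov_inv @ x

def laplacian_fd(f, x, h=1e-4):
    """Central finite-difference Laplacian of f at x."""
    total = 0.0
    for a in range(len(x)):
        e = np.zeros_like(x)
        e[a] = h
        total += (f(x + e) - 2.0 * f(x) + f(x - e)) / h**2
    return total

cov = np.array([[2.0, 0.5],
                [0.5, 1.0]])
cov_inv = np.linalg.inv(cov)
x0 = np.array([0.3, -0.7])

# For a quadratic psi the Laplacian is constant in x and equals Tr(Sigma^{-1}).
lap = laplacian_fd(lambda x: psi(x, cov_inv), x0)
print(lap, np.trace(cov_inv))  # the two values agree
```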
# 4. Consequences
1. **Geometric invariance.** Divergence minimization across coupled information manifolds preserves the total informational curvature.
2. **Bounded stability.** Local increases in $R_s^{(i)}$ require compensating decreases in curvature elsewhere, ensuring finite total informational variance.
3. **Model scope.** The result applies to any differentiable system modeled by Fisher geometry, whether statistical, algorithmic, biological, or physical, without further assumption.
# 5. Conclusion
Under general information-geometric conditions, the total Ricci curvature of coupled Fisher manifolds equals the Laplacian of the joint stability functional. This curvature conservation principle provides a minimal geometric constraint governing divergence-minimizing dynamics across heterogeneous systems, linking local learning behavior to a global invariant of informational geometry.
# References

Amari, S. (2016). *Information Geometry and Its Applications* (Vol. 194). Springer Japan. https://doi.org/10.1007/978-4-431-55978-8

emsenn. (2025). *Information-Theoretic Stability as Reward Function*.