# Semantic Level of Detail (SLoD)
Multi-scale knowledge representation via heat kernel diffusion on hyperbolic manifolds.
Eduard Izgorodin — arXiv:2603.08965
## The problem
AI memory systems store knowledge as flat vector collections, discarding the hierarchical structure inherent in semantic information. A software project has architecture-level concepts, module-level patterns, and line-level details — but a vector database treats them all the same.
## The idea
SLoD borrows Level of Detail from computer graphics — where 3D engines render geometry at variable resolution depending on distance — and builds an analogous operator for semantic data.
The key insight: hyperbolic space is the natural substrate. The Poincaré ball has exponential volume growth, so it embeds tree-structured hierarchies with distortion O(log n), which is provably optimal. Heat kernel diffusion on this space provides a continuous zoom parameterized by a scale σ:
- σ → 0: fine detail (individual facts, specific memories)
- σ → ∞: global summary (high-level themes, abstract concepts)

## Components
| Component | What it does | Paper reference |
|---|---|---|
| SLoD Operator | Heat kernel weights + Fréchet mean on Poincaré ball | Algorithm 1, Def. 2 |
| Hierarchical Coherence | Bounded error O(σ), distortion O(log n) | Theorems 1-2 |
| Boundary Scanner | Detects natural abstraction levels from spectral gaps | Algorithm 2, Prop. 3 |
| Multi-Center Extension | Mixture representation when a single summary is lossy | Def. 3, Corollary |
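The first row of the table can be sketched concretely. The following is a minimal, illustrative Python sketch of a heat-kernel-weighted Fréchet mean on the Poincaré ball; it approximates the heat kernel by a Gaussian in hyperbolic distance from a query point and uses Karcher iteration for the mean. The function names, the query-centered weighting, and the iteration schedule are assumptions, not the paper's implementation.

```python
import numpy as np

def mobius_add(u, v):
    """Mobius addition on the Poincare ball (curvature -1)."""
    uv, nu, nv = u @ v, u @ u, v @ v
    num = (1 + 2 * uv + nv) * u + (1 - nu) * v
    return num / (1 + 2 * uv + nu * nv)

def poincare_dist(u, v):
    """Hyperbolic distance between two points inside the unit ball."""
    sq = (u - v) @ (u - v)
    return np.arccosh(1 + 2 * sq / ((1 - u @ u) * (1 - v @ v)))

def log_map(x, y):
    """Logarithmic map at x: tangent vector pointing toward y."""
    w = mobius_add(-x, y)
    n = np.linalg.norm(w)
    if n < 1e-12:
        return np.zeros_like(x)
    lam = 2 / (1 - x @ x)
    return (2 / lam) * np.arctanh(n) * w / n

def exp_map(x, v):
    """Exponential map at x: follow tangent vector v along a geodesic."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x.copy()
    lam = 2 / (1 - x @ x)
    return mobius_add(x, np.tanh(lam * n / 2) * v / n)

def slod_summary(points, query, sigma, iters=60):
    """One 'zoom level' around `query`: a weighted Frechet mean.

    Weights use a Gaussian in hyperbolic distance as a stand-in for
    the true heat kernel (hypothetical simplification).
    """
    d = np.array([poincare_dist(query, p) for p in points])
    w = np.exp(-d**2 / (2 * sigma**2))
    w /= w.sum()
    c = w @ points  # Euclidean init; a convex combination stays in the ball
    for _ in range(iters):  # Karcher iteration toward the weighted mean
        v = sum(wi * log_map(c, p) for wi, p in zip(w, points))
        c = exp_map(c, v)
    return c

pts = np.array([[0.1, 0.0], [0.5, 0.0], [-0.4, 0.2]])
q = np.array([0.45, 0.0])
fine = slod_summary(pts, q, sigma=0.05)     # collapses onto the nearest memory
coarse = slod_summary(pts, q, sigma=100.0)  # near the unweighted Frechet mean
```

With small σ the summary is dominated by the single nearest memory; with large σ the weights flatten and the summary approaches the global Fréchet mean, which is the continuous-zoom behavior described above.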
## Boundary detection
Three complementary signals detect where the representation undergoes qualitative transitions:
- V(σ), representation velocity: how fast the summary moves in hyperbolic space as σ changes
- D_w(σ), weight divergence: Jensen-Shannon divergence between consecutive weight distributions
- C_k(σ), neighborhood churn: how much the set of nearest neighbors changes
Peaks in the composite score reveal natural abstraction boundaries — no manual tuning required.
## Theoretical guarantees
Theorem 1 (Hierarchical Coherence): For a tree with n nodes embedded in the Poincaré ball,

d_H(Φ_σ1, Φ_σ2) ≤ C · |σ2 − σ1| · log n

Nearby scales produce semantically related representations.
Theorem 2 (Approximation Error): Memories within cognitive distance R can be approximated with error O(σ).
Proposition 3 (Spectral Boundaries): The JSD signal peaks near σ* ~ 1/λ_k when the spectral gap ratio exceeds a threshold R.
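Proposition 3's recipe (find the largest spectral gap ratio, read off a boundary scale σ* ~ 1/λ_k) can be demonstrated on a toy graph. The two-clique example, the use of the unnormalized Laplacian, and the threshold-free argmax here are illustrative assumptions, not the paper's Algorithm 2.

```python
import numpy as np

# Toy graph: two 4-cliques joined by a single bridge edge (8 nodes),
# i.e. a crude two-community hierarchy.
A = np.zeros((8, 8))
for block in (range(0, 4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1.0
A[3, 4] = A[4, 3] = 1.0  # bridge
L = np.diag(A.sum(axis=1)) - A  # unnormalized graph Laplacian

eigs = np.sort(np.linalg.eigvalsh(L))
nz = eigs[1:]                # drop the zero eigenvalue
ratios = nz[1:] / nz[:-1]    # spectral gap ratios
k = int(np.argmax(ratios))   # largest relative gap
sigma_star = 1.0 / nz[k]     # candidate boundary scale, sigma* ~ 1/lambda_k
```

The largest gap sits just above the Fiedler value (the small eigenvalue encoding the two-community split), so σ* lands at the scale separating within-community from between-community structure.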
## Experimental validation
| Experiment | Data | Key result | Status |
|---|---|---|---|
| Boundary Recovery | HSBM (1024 nodes, 3 levels) | ARI macro=1.00, meso=0.91 at r=200 | Complete (8/8 pass) |
| WordNet Consistency | WordNet 3.0 (82K synsets, depth 19) | Kendall tau=0.79, Recall@2=0.75 | Complete |
## Citation
```bibtex
@article{izgorodin2026slod,
  title         = {Semantic Level of Detail: Multi-Scale Knowledge Representation
                   via Heat Kernel Diffusion on Hyperbolic Manifolds},
  author        = {Izgorodin, Edward},
  year          = {2026},
  eprint        = {2603.08965},
  archivePrefix = {arXiv}
}
```

## Paper
- arXiv: 2603.08965
- Format: 11 pages (9 body + references), 34 citations
## Related
- Tensor-Hyperbolic Graphs — the library implementing SLoD geometry
- Manifesto — why hyperbolic geometry matters for memory
- Scientific sources — bibliography underpinning the approach