
🎨 Spatial Memory Design Language

How Our 3D Interface Brings AI Memory to Life

This document explains how every visual element in our interface encodes the core architectural concepts of Mnemoverse, turning abstract mathematical principles into an intuitive, interactive experience.

🎯 Who This Is For

This guide serves three key audiences:

  • 🎨 Product & Design Teams - Understanding why the interface looks and behaves this way
  • ⚙️ Engineers & Researchers - Seeing how visuals map to underlying mechanisms
  • 🚀 New Contributors - Getting oriented through the Vision and Getting Started guides

🌟 The Big Picture

Our UI transforms particles, morphing 3D forms, scroll fading, inertia, and controlled "explosions" into a living diagram of spatial memory retrieval. Every interaction you see represents real cognitive processes happening in hyperbolic embedding space, accelerated by GPU similarity search and stabilized through continual learning.

The magic: What looks like beautiful visual effects are actually mathematical representations of how AI memory works in curved space.

πŸ—ΊοΈ Visual β†’ System Mapping ​

| Visual Element | System Concept | Why It Matters |
| --- | --- | --- |
| Particles gravitating to cursor | Attention focusing relevant embeddings | Models query-conditioned activation versus global scan |
| Particle accumulation + burst | Re-ranking / neighborhood rebuild + adaptive consolidation | Shows selective restructuring instead of catastrophic overwrite |
| Morphing sphere / torus / distorted shape | Memory state reconfiguration in hyperbolic manifold | Encodes hierarchy + similarity efficiently |
| Scroll fade out | Temporal decay / lowered activation (passive retention) | Reduces surface complexity without deleting knowledge |
| Inertia → stabilization | LOD compression & resource saving when idle | Mirrors dynamic resolution control |
| Cursor re-approach reactivating detail | Demand-driven expansion (attention-guided zoom) | Keeps latency low by delaying fine-grain decoding |
| GPU particle rendering | GPU-native k-NN & vector index operations | Communicates performance scalability |
| Glossary tooltips (planned) | Concept disambiguation | Lowers the cognitive entry barrier |

Each row pairs a deliberate interaction design with a retrieval or learning primitive grounded in our mathematical foundations.

🧠 Core Principles

1. Selective Activation

Only embeddings within an attention-defined neighborhood are animated at high fidelity, reflecting Transformer-style focus. This mirrors how your brain only activates relevant memories when thinking about a specific topic.

2. Hierarchical Compactness

Hyperbolic geometry (Poincaré ball) encodes tree/hierarchy depth exponentially near the boundary, enabling parsimonious structure visualization. Think of it as a city where important landmarks are closer together.

3. Adaptive Detail

Level of Detail (LOD) reduces polygon/interaction complexity for inactive regions while preserving potential for rapid expansion. Like how a map shows more detail as you zoom in.

4. Stability Under Change

Explosive reorganization illustrates continual learning: protected core weights (an analogy to EWC) mitigate catastrophic forgetting. The system can learn new things without forgetting what it already knows.

5. Throughput as UX

GPU-native retrieval justifies real-time morphism and high particle counts without perceptible lag. Performance isn't just technical; it's part of the user experience.

🎯 Particles as Attention Units

Particles represent candidate memory vectors. The gravitational pull toward your cursor metaphorically depicts scaled dot-product attention: only context-relevant vectors get elevated in activation probability.

Try it: Move your cursor around and watch how particles respond. You're literally seeing attention mechanisms in action.
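
To ground the metaphor, here is a minimal sketch of how a softmax-weighted pull toward the cursor could drive that motion. The `Vec2`, `softmaxWeights`, and `attentionStep` names are illustrative, not taken from the actual codebase, and the real renderer performs this kind of update on the GPU.

```ts
// Illustrative sketch: softmax-weighted attraction of particles toward a
// query point (the cursor), echoing scaled dot-product attention.

type Vec2 = { x: number; y: number };

function softmaxWeights(scores: number[], temperature = 0.1): number[] {
  const max = Math.max(...scores);
  const exps = scores.map((s) => Math.exp((s - max) / temperature));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

function attentionStep(particles: Vec2[], cursor: Vec2, dt: number): void {
  // Score each particle by negative squared distance to the query, so
  // nearby (context-relevant) particles receive the highest weight.
  const scores = particles.map(
    (p) => -((cursor.x - p.x) ** 2 + (cursor.y - p.y) ** 2),
  );
  const weights = softmaxWeights(scores);
  particles.forEach((p, i) => {
    // Only highly weighted particles move noticeably: selective activation.
    p.x += (cursor.x - p.x) * weights[i] * dt;
    p.y += (cursor.y - p.y) * weights[i] * dt;
  });
}
```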

🔄 Morphing Forms as Memory States

Canonical shapes represent different distributional regimes inside hyperbolic space:

  • Sphere = balanced global state
  • Torus = episodic cyclic patterns
  • Distorted forms = task-adapted manifolds

The curvature supports simultaneous hierarchy + similarity embedding, something impossible in flat Euclidean space.
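
One way to read the morphing is as a weighted blend between canonical vertex sets, one per regime; the sketch below shows such a blend under hypothetical names.

```ts
// Hypothetical sketch: blend canonical shapes (sphere, torus, distorted)
// as weighted morph targets. In the interface, the weights would reflect
// the current memory regime; here they are plain numbers.

type Vec3 = [number, number, number];

function blendMorphTargets(
  targets: Vec3[][], // one vertex array per canonical shape
  weights: number[], // one weight per shape, summing to 1
): Vec3[] {
  const vertexCount = targets[0].length;
  return Array.from({ length: vertexCount }, (_, v) => {
    const out: Vec3 = [0, 0, 0];
    targets.forEach((shape, s) => {
      out[0] += shape[v][0] * weights[s];
      out[1] += shape[v][1] * weights[s];
      out[2] += shape[v][2] * weights[s];
    });
    return out;
  });
}
```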

💥 Accumulation → Burst (Reorganization Cycle)

When particle density exceeds a threshold or on user click, a controlled "burst" disperses and reassembles neighbors. This visualizes:

  • Neighborhood rebuild (re-ranking)
  • Selective parameter consolidation
  • Avoidance of catastrophic forgetting

The science: This mirrors how neural networks reorganize connections during learning without losing previously learned information.
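
A hedged sketch of the trigger logic this animation implies: once local density crosses a threshold, unconsolidated particles disperse while a protected skeleton stays fixed. The `Particle` shape, `consolidated` flag, and threshold value are assumptions for illustration.

```ts
// Illustrative reorganization cycle: disperse a dense neighborhood while
// keeping a consolidated "core" intact, then let re-ranking reassemble it.

interface Particle {
  position: [number, number, number];
  consolidated: boolean; // protected core survives the burst unchanged
}

const DENSITY_THRESHOLD = 0.8; // assumed value

function maybeBurst(neighborhood: Particle[], density: number): void {
  if (density < DENSITY_THRESHOLD) return;
  for (const p of neighborhood) {
    if (p.consolidated) continue; // avoid catastrophic overwrite of the core
    for (let axis = 0; axis < 3; axis++) {
      p.position[axis] += (Math.random() - 0.5) * 2; // outward dispersal kick
    }
  }
  // ...re-ranking then rebuilds the neighborhood around updated neighbors.
}
```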

📜 Scroll-Based Temporal Attenuation

Scrolling lowers opacity and interaction affordances of the active shape, encoding a decay of immediate salience without deletion. This mirrors memory items moving from active working sets into lower-frequency storage layers.
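
As a sketch, the attenuation can be modeled as a simple exponential over scroll depth; the constant and names below are illustrative, not actual configuration values.

```ts
// Salience decays smoothly with scroll but never reaches zero:
// passive retention rather than deletion.

const DECAY_RATE = 0.002; // assumed decay per pixel of scroll

function salienceAtScroll(scrollY: number): number {
  return Math.exp(-DECAY_RATE * scrollY);
}

// e.g. mesh.material.opacity = salienceAtScroll(window.scrollY);
```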

βš–οΈ Inertia and Idle Stabilization (LOD Compression) ​

When you stop interacting, motion dampens and geometry simplifies. This corresponds to level-of-detail reduction to conserve compute while preserving the hyperbolic coordinate scaffold for instant re-expansion.
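
A minimal sketch of that damping-plus-timeout pattern, with assumed constants and state names:

```ts
// Motion damps every frame; after an idle timeout the geometry drops to a
// low LOD, but the coordinate scaffold is untouched for instant re-expansion.

const DAMPING = 0.95;
const IDLE_TIMEOUT_MS = 2000; // assumed timeout

interface SceneState {
  velocity: number;
  lastInteraction: number; // timestamp of the most recent user input
  lodLevel: "high" | "low";
}

function onFrame(state: SceneState, now: number): void {
  state.velocity *= DAMPING; // inertia fades rather than stopping abruptly
  if (now - state.lastInteraction > IDLE_TIMEOUT_MS) {
    state.lodLevel = "low"; // conserve compute while idle
  }
}
```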

πŸ” Reactivation on Proximity ​

Re-approach (cursor proximity) triggers:

  • Finer geometry
  • Denser particle shaders
  • Semantic tooltips

This is attention-guided drill-down that parallels conditional decoding or localized high-resolution retrieval.
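
The drill-down can be sketched as a proximity test that flips a cluster between fidelity tiers; the radius and tier names are illustrative assumptions.

```ts
// Demand-driven expansion: clusters near the cursor get full fidelity,
// distant ones stay as cheap shells until attention returns.

const REACTIVATION_RADIUS = 1.5; // assumed scene-space radius

function fidelityFor(
  clusterCenter: [number, number],
  cursor: [number, number],
): "full" | "shell" {
  const dx = cursor[0] - clusterCenter[0];
  const dy = cursor[1] - clusterCenter[1];
  return dx * dx + dy * dy < REACTIVATION_RADIUS ** 2 ? "full" : "shell";
}
```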

🌌 Hyperbolic Spatial Model

We visually imply negative curvature (density thinning toward center, expansion near boundary) to communicate why hyperbolic embeddings achieve parsimonious hierarchical representation compared to Euclidean alternatives.

The math: Hyperbolic space has "more room" near the edges, making it perfect for representing hierarchical relationships.
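
Concretely, the distance behind this picture is the standard Poincaré-ball metric, d(u, v) = arcosh(1 + 2‖u − v‖² / ((1 − ‖u‖²)(1 − ‖v‖²))); the sketch below computes it directly for points inside the unit ball.

```ts
// Poincaré-ball distance for points strictly inside the unit ball.
function poincareDistance(u: number[], v: number[]): number {
  const sq = (w: number[]) => w.reduce((s, x) => s + x * x, 0);
  const diff = u.map((x, i) => x - v[i]);
  return Math.acosh(1 + (2 * sq(diff)) / ((1 - sq(u)) * (1 - sq(v))));
}

// As ‖u‖ approaches 1 the denominator shrinks, so tiny Euclidean steps near
// the boundary become huge hyperbolic distances: the "more room" that makes
// deep hierarchies cheap to embed.
```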

⚡ GPU-Native Retrieval Layer

Continuous particle motion and instantaneous regrouping signal underlying GPU-optimized k-NN and clustering operations. This enables interactive scale through parallel distance computations and selection primitives.
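
As a CPU-side reference for the operation the GPU parallelizes, here is a brute-force k-NN sketch; real GPU indexes replace the per-vector loop and the sort with parallel distance kernels and selection primitives. Names are illustrative.

```ts
// Brute-force k-nearest neighbors by squared Euclidean distance.
function knn(query: number[], vectors: number[][], k: number): number[] {
  const dist2 = (a: number[], b: number[]) =>
    a.reduce((s, x, i) => s + (x - b[i]) ** 2, 0);
  return vectors
    .map((v, index) => ({ index, d: dist2(query, v) }))
    .sort((a, b) => a.d - b.d) // GPU variants use parallel top-k selection
    .slice(0, k)
    .map((entry) => entry.index);
}
```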

🎯 Attention-Guided Level of Detail

Rather than global uniform fidelity, attention scores drive which clusters receive high-poly morph targets versus decimated shells. This extends classical rendering LOD into semantic memory retrieval.
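
In sketch form, this reduces to mapping an attention score onto a fidelity tier; the cutoffs below are illustrative, not tuned values.

```ts
// Attention-guided LOD: high scores earn high-poly morph targets,
// low scores get decimated shells that can upgrade instantly.
function lodTier(attentionScore: number): "high-poly" | "medium" | "shell" {
  if (attentionScore > 0.7) return "high-poly";
  if (attentionScore > 0.3) return "medium";
  return "shell";
}
```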

🔄 Continual Learning & Stability

Explosive transitions preserve a "skeleton" subset, reflecting selective weight consolidation strategies to protect performance on prior tasks while integrating new memory vectors.
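
For reference, the consolidation idea alluded to here is commonly written as the EWC objective, which anchors weights that mattered for earlier tasks:

```latex
\mathcal{L}(\theta) = \mathcal{L}_{\text{new}}(\theta)
  + \sum_i \frac{\lambda}{2} F_i \left(\theta_i - \theta_i^{*}\right)^2
```

Here F_i is the (diagonal) Fisher information estimating each weight's importance, θ*_i the value learned on prior tasks, and λ the consolidation strength; the visual "skeleton" plays the role of the strongly anchored weights.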

πŸ› οΈ Implementation Notes (Technical Symbolism) ​

We render particles via Three.js/WebGL leveraging GPU parallelism, the same paradigm exploited in large-scale similarity search libraries. This reinforces conceptual alignment between visualization and engine design goals.
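
A minimal sketch of this rendering approach, assuming a standard Three.js setup rather than the project's actual renderer:

```ts
import * as THREE from "three";

// Positions live in a GPU buffer attribute, so per-frame motion and
// similarity-style weighting can run in shaders rather than on the CPU.
const COUNT = 100_000;
const positions = new Float32Array(COUNT * 3);
for (let i = 0; i < positions.length; i++) {
  positions[i] = (Math.random() - 0.5) * 10; // random cloud for the demo
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));

const particles = new THREE.Points(
  geometry,
  new THREE.PointsMaterial({ size: 0.02 }),
);
// scene.add(particles); // a single draw call animates all 100k points
```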

🎨 Design Benefits

This metaphor stack provides multiple advantages:

Reduced Cognitive Load

Visual cues directly map to concepts, making complex ideas accessible to newcomers.

Enhanced SEO

Domain terms are embedded organically throughout the content rather than isolated in glossaries.

Improved Navigation

Users can understand the system's behavior through interaction, not just reading.

📚 Glossary (Quick Reference)

  • Hyperbolic Embedding - Vector representation in negatively curved space (Poincaré ball) for efficient hierarchy + similarity
  • Attention - Mechanism weighting context-relevant vectors; drives selective activation
  • k-NN GPU Retrieval - Parallel nearest-neighbor search enabling low-latency high-dimensional similarity queries
  • Level of Detail (LOD) - Dynamically varying representation complexity based on importance or proximity
  • Catastrophic Forgetting / EWC - Performance degradation on earlier tasks caused by learning new ones; mitigated by Elastic Weight Consolidation
  • Re-ranking - Secondary refinement step updating candidate sets after initial approximate retrieval

Start Your Journey

Ready to explore the future of spatial AI memory? Start with our Getting Started Guide or dive into the Core Mathematical Theory.
