Research Areas

Core Focus

Memory Consolidation

How can AI systems transform temporary experiences into lasting knowledge? We study the hippocampus-neocortex consolidation loop and its implications for artificial memory systems.

Episodic Memory Semantic Integration Sleep Cycles

Liquid Neural Networks

Time-continuous neural networks that naturally adapt to temporal patterns. Unlike static transformers, liquid networks carry inherent time constants, so each unit's state evolves continuously in response to its inputs rather than in fixed discrete steps.

Time Constants Dynamic Systems Plasticity
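To make "inherent time constants" concrete, here is a minimal single-unit sketch (hypothetical parameters, not an actual liquid-network implementation): the unit's state follows dx/dt = -x/tau + tanh(w_rec*x + w_in*u + b), and tau controls how quickly the state tracks, and then forgets, its input.

```python
import math

def liquid_step(x, u, dt, tau, w_rec, w_in, b):
    """One Euler step of a single time-continuous unit.

    dx/dt = -x / tau + tanh(w_rec * x + w_in * u + b)
    The time constant tau is the 'inherent time constant':
    small tau reacts and decays quickly, large tau slowly.
    """
    dx = -x / tau + math.tanh(w_rec * x + w_in * u + b)
    return x + dt * dx

# The same input pulse, processed with two different time constants:
fast, slow = 0.0, 0.0
for u in [1.0, 1.0, 0.0, 0.0]:
    fast = liquid_step(fast, u, dt=0.1, tau=0.2, w_rec=0.5, w_in=1.0, b=0.0)
    slow = liquid_step(slow, u, dt=0.1, tau=2.0, w_rec=0.5, w_in=1.0, b=0.0)
# After the input stops, the fast unit has largely decayed;
# the slow unit still holds a trace of the pulse.
```

The point of the toy: with static weights, temporal behavior still differs per unit, because the dynamics (not the weights) encode the timescale.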

Temporal Binding

What creates the sense of continuity in experience? We explore how information integration across time might give rise to coherent experience in artificial systems.

Integration Continuity Experience

Intelligent Forgetting

Forgetting isn't a bug - it's a feature. We study how selective decay of information enables generalization and prevents catastrophic interference in learning systems.

Decay Functions Generalization Interference
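One way to picture "selective decay" is a store whose traces weaken exponentially unless rehearsed, with weak traces pruned outright. This is an illustrative sketch with made-up parameter names, not MemoryCore's actual API:

```python
import math

class DecayingMemory:
    """Toy store: each item's strength decays exponentially over
    time; access (rehearsal) boosts it; forget() prunes items
    whose decayed strength fell below a threshold."""

    def __init__(self, half_life=10.0, threshold=0.2):
        self.rate = math.log(2) / half_life
        self.threshold = threshold
        self.items = {}  # key -> (strength, time of last update)

    def store(self, key, t):
        self.items[key] = (1.0, t)

    def strength(self, key, t):
        if key not in self.items:
            return None
        s, t0 = self.items[key]
        return s * math.exp(-self.rate * (t - t0))

    def access(self, key, t):
        s = self.strength(key, t)
        if s is not None:
            # Rehearsal refreshes and strengthens the trace.
            self.items[key] = (min(1.0, s + 0.5), t)
        return s

    def forget(self, t):
        """Selective decay: drop everything below threshold."""
        self.items = {k: v for k, v in self.items.items()
                      if self.strength(k, t) >= self.threshold}

m = DecayingMemory()
m.store("rehearsed", 0)
m.store("one-off", 0)
m.access("rehearsed", 10)  # one rehearsal at t=10
m.forget(25)               # "one-off" has decayed away; "rehearsed" survives
```

The feature, not the bug: what survives is determined by usage, so rarely-reinforced specifics drop out while reinforced structure persists.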

The Science

Biological Foundations

How Human Memory Works

When you learn something new, it doesn't immediately become a permanent part of who you are. Instead, the hippocampus acts as a temporary storage buffer, holding new information in a labile state.

During sleep - particularly during slow-wave sleep - this information is "replayed" at high speed. The hippocampus reactivates the neural patterns associated with recent experiences, and this reactivation triggers dialogue with the neocortex.

Gradually, over days and weeks, the knowledge that was initially hippocampus-dependent becomes integrated into the broader cortical network. The memory becomes more stable, more abstract, and more connected to existing knowledge structures.

This is why "sleeping on it" actually helps you learn. It's why students who review material before bed perform better than those who review in the morning. And it's the biological process we're trying to emulate in artificial systems.

The Consolidation Loop
1. Encoding: New experience creates patterns in the hippocampus.
2. Replay: During sleep, patterns are reactivated rapidly.
3. Integration: The neocortex adjusts weights based on replay.
4. Abstraction: Specific details fade; general patterns remain.
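The four steps above can be caricatured in a few lines of code. This is a deliberately toy sketch (illustrative names and parameters, not our production system): episodes are feature vectors, and the "neocortex" is a running prototype nudged toward each replayed episode.

```python
import random

def sleep_cycle(hippocampus, cortex, replay_passes=3, lr=0.2):
    """Toy consolidation: replay buffered episodes into a
    cortical prototype, then clear the buffer."""
    for _ in range(replay_passes):                    # 2. Replay
        for episode in random.sample(hippocampus, len(hippocampus)):
            cortex = [c + lr * (e - c)                # 3. Integration
                      for c, e in zip(cortex, episode)]
    hippocampus.clear()                               # 4. Abstraction: details fade
    return cortex

# 1. Encoding: three noisy experiences of the same underlying pattern
episodes = [[1.0, 0.1], [0.9, 0.0], [1.1, -0.1]]
cortex = [0.0, 0.0]
cortex = sleep_cycle(episodes, cortex)
# cortex now approximates the shared pattern; the individual
# episodes are gone from the buffer.
```

Note what the toy preserves: after the cycle, only the statistical regularity survives, which is exactly the encoding-replay-integration-abstraction story in miniature.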
Key References
  • McClelland et al. (1995)
    "Why there are complementary learning systems"
  • Walker & Stickgold (2010)
    "Overnight alchemy: Sleep-dependent memory evolution"
  • Hasselmo (2017)
    "Models of hippocampus and neocortex interactions"

Publications & Notes

Our Work

Working Paper

NeuralSleep: A Three-Phase Architecture for AI Memory Consolidation

2022

We present NeuralSleep, a novel architecture for enabling long-term memory in artificial neural networks through simulated sleep cycles. Our approach mimics the hippocampus-neocortex consolidation loop observed in biological systems.

Memory Architecture Consolidation
Technical Note

MemoryCore: Three-Tier Memory Consolidation for AI Systems

2025

A technical overview of MemoryCore, the production implementation of NeuralSleep. Features Working → Episodic → Semantic memory tiers with Bull queue-based consolidation cycles.

Three-Tier Memory Consolidation Open Source
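MemoryCore's actual consolidation cycles run as Bull queue jobs in Node.js, which we don't reproduce here. As a rough Python sketch of the tier-promotion idea only (hypothetical class and thresholds, not MemoryCore's API): items pass through a small working buffer, accumulate in an episodic store, and are promoted to semantic memory once sufficiently reinforced.

```python
from collections import deque

class ThreeTierMemory:
    """Illustrative Working -> Episodic -> Semantic pipeline."""

    def __init__(self, working_capacity=4, promote_after=2):
        self.working = deque(maxlen=working_capacity)  # small, volatile buffer
        self.episodic = {}    # item -> reinforcement count
        self.semantic = set()  # stable long-term store
        self.promote_after = promote_after

    def observe(self, item):
        self.working.append(item)

    def consolidate(self):
        """One consolidation cycle: drain working memory into the
        episodic store; promote reinforced episodes to semantic."""
        while self.working:
            item = self.working.popleft()
            self.episodic[item] = self.episodic.get(item, 0) + 1
        for item, count in list(self.episodic.items()):
            if count >= self.promote_after:
                self.semantic.add(item)
                del self.episodic[item]

m = ThreeTierMemory()
for item in ["cat", "dog", "cat"]:
    m.observe(item)
m.consolidate()  # "cat" was reinforced twice and is promoted;
                 # "dog" remains episodic
```

In the production system the equivalent of consolidate() is triggered on a schedule by the queue, which is what makes the cycles resemble sleep phases rather than continuous writes.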
Research Note

The Forgetting Curve in AI: Why Intelligent Decay Matters

2022

An exploration of how Ebbinghaus's forgetting curve applies to artificial systems, and why selective forgetting is essential for generalization and preventing catastrophic interference.

Forgetting Learning Theory Generalization
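The forgetting curve discussed in the note is often written in the simplified form R(t) = exp(-t / S), where S is the stability (strength) of the memory trace. A quick sketch of how stability changes the curve (a common toy parameterization, not a fitted model):

```python
import math

def retention(t_hours, stability):
    """Ebbinghaus-style retention: R(t) = exp(-t / S).
    Larger stability S means slower forgetting."""
    return math.exp(-t_hours / stability)

# A weak trace vs. one strengthened by consolidation,
# checked at 0 hours, 5 hours, and 24 hours:
weak = [round(retention(t, stability=5.0), 2) for t in (0, 5, 24)]
strong = [round(retention(t, stability=50.0), 2) for t in (0, 5, 24)]
# weak   -> [1.0, 0.37, 0.01]
# strong -> [1.0, 0.9, 0.62]
```

The design implication for artificial systems is the one the note argues for: rather than fighting decay everywhere, tune S per trace so that reinforced knowledge persists and unreinforced detail falls away.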

Methodology

How We Work

01

Biological Grounding

We start with neuroscience. What do we know about how biological systems accomplish the task? What are the key mechanisms?

02

Computational Models

We translate biological insights into computational frameworks. Not literal replicas, but functional analogues that capture the essential dynamics.

03

Real-World Testing

We test our ideas in real applications. Project Luna is our primary testbed, but we also run smaller experiments across different domains.

04

Open Publication

We publish our findings openly, including negative results. Science advances through transparency and reproducibility.

Interested in collaborating?

We're always looking for researchers, institutions, and organizations interested in advancing the science of temporal intelligence.