Technical · November 2024

Implementing Sleep Cycles in Neural Networks

A technical deep-dive into how we're implementing the consolidation phase in NeuralSleep.

One of the most challenging aspects of building temporal AI systems is implementing the consolidation phase - the "sleep" where temporary experiences become permanent knowledge. In this post, I'll walk through our implementation in MemoryCore.

The Three-Phase Architecture

MemoryCore implements NeuralSleep's theoretical framework through three distinct memory tiers, each with its own storage backend and time constants:

// Memory tier configuration
const MEMORY_CONFIG = {
  working: {
    storage: 'redis',
    timeConstant: '100ms-1s',
    retention: 'session'
  },
  episodic: {
    storage: 'postgresql',
    timeConstant: '1s-10min',
    retention: '30 days'
  },
  semantic: {
    storage: 'postgresql',
    timeConstant: '10min-1day',
    retention: 'permanent'
  }
};

Working Memory: The Session Buffer

Working memory handles real-time interaction. It's implemented in Redis for speed, with high plasticity and rapid decay. Every interaction creates entries here first:

interface WorkingMemoryEntry {
  sessionId: string;
  timestamp: number;
  interactionType: 'user_input' | 'system_response';
  content: string;
  emotionalValence: number;  // -1 to 1
  cognitiveLoad: number;     // 0 to 1
  metadata: Record<string, any>;
}
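Concretely, each entry can be serialized into a single Redis key/value pair with a session-scoped TTL. The `wm:<sessionId>:<timestamp>` key scheme below is a hypothetical choice for illustration; the post doesn't specify MemoryCore's actual keying.

```typescript
// Minimal stand-in for the WorkingMemoryEntry interface above.
type WMEntry = {
  sessionId: string;
  timestamp: number;
  [k: string]: unknown;
};

// Build the key and serialized value for a Redis SET call.
// Key scheme (hypothetical): wm:<sessionId>:<timestamp>
function toRedisRecord(entry: WMEntry): { key: string; value: string } {
  return {
    key: `wm:${entry.sessionId}:${entry.timestamp}`,
    value: JSON.stringify(entry),
  };
}

// With a client such as ioredis (assumed, not shown in the post), the
// write with rapid decay would look like:
//   await redis.set(key, value, 'EX', SESSION_TTL_SECONDS);
```

The TTL gives working memory its "rapid decay" for free: anything not consolidated before the session expires simply disappears.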

The key insight is that we're not just storing what happened - we're capturing how it happened. Emotional valence and cognitive load metrics help determine which experiences are worth consolidating.
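One way to turn those two metrics into a consolidation decision is a simple weighted score. The 0.5/0.5 weights and 0.4 threshold below are illustrative assumptions, not values from the MemoryCore source:

```typescript
// Score how "worth consolidating" an interaction is.
// Emotion matters in either direction, so use the magnitude of valence.
function consolidationScore(emotionalValence: number, cognitiveLoad: number): number {
  return 0.5 * Math.abs(emotionalValence) + 0.5 * cognitiveLoad;
}

// Gate for promotion from working to episodic memory (threshold assumed).
function shouldConsolidate(entry: { emotionalValence: number; cognitiveLoad: number }): boolean {
  return consolidationScore(entry.emotionalValence, entry.cognitiveLoad) >= 0.4;
}
```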

The Consolidation Engine

Consolidation happens through Bull queues at three frequencies:

// Consolidation job scheduling
// Bull is backed by Redis; the default constructor connects to localhost.
const Bull = require('bull');
const consolidationQueue = new Bull('consolidation');

// Immediate consolidation when a session ends
// (userId and sessionId come from the ending session)
consolidationQueue.add('session-end', { userId, sessionId }, {
  priority: 1,
  attempts: 3
});

// Daily consolidation at 02:00
consolidationQueue.add('daily', {}, {
  repeat: { cron: '0 2 * * *' }
});

// Weekly deep consolidation, Sundays at 03:00
consolidationQueue.add('weekly', {}, {
  repeat: { cron: '0 3 * * 0' }
});

Pattern Extraction

The daily consolidation job is where the magic happens. We analyze episodic memories to extract patterns:
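A sketch of what that pass could look like, assuming a minimal shape for the `EpisodicMemory` records (the post doesn't show the actual schema, and the per-topic aggregation here is an illustrative approach, not MemoryCore's exact algorithm):

```typescript
// Minimal stand-in for the episodic schema (assumed fields).
interface EpisodicMemory {
  topic: string;
  emotionalValence: number;
  cognitiveLoad: number;
  timestamp: number;
  accessCount: number;
}

interface TopicPattern {
  topic: string;
  occurrences: number;
  avgValence: number;
  avgLoad: number;
}

// Daily pass: aggregate episodic memories per topic so recurring
// themes (and how hard or emotional they were) become visible.
function extractPatterns(memories: EpisodicMemory[]): TopicPattern[] {
  const byTopic: Record<string, EpisodicMemory[]> = {};
  for (const m of memories) {
    if (!byTopic[m.topic]) byTopic[m.topic] = [];
    byTopic[m.topic].push(m);
  }
  return Object.entries(byTopic).map(([topic, ms]) => ({
    topic,
    occurrences: ms.length,
    avgValence: ms.reduce((s, m) => s + m.emotionalValence, 0) / ms.length,
    avgLoad: ms.reduce((s, m) => s + m.cognitiveLoad, 0) / ms.length,
  }));
}
```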

These patterns become the user's "semantic model" - a structural representation of who they are and how they learn.
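One possible shape for that model, with illustrative field names and cut-offs (the post doesn't publish MemoryCore's schema):

```typescript
// Hypothetical semantic-model shape.
interface SemanticModel {
  userId: string;
  strengths: string[];   // topics handled with low cognitive load
  struggles: string[];   // topics that repeatedly show high load
  updatedAt: number;
}

// Sort a topic into the model by its average cognitive load.
// The 0.3 / 0.7 cut-offs are assumptions for illustration.
function classifyTopic(avgLoad: number): 'strength' | 'struggle' | 'neutral' {
  if (avgLoad <= 0.3) return 'strength';
  if (avgLoad >= 0.7) return 'struggle';
  return 'neutral';
}
```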

Temporal Decay Functions

Not everything should be remembered forever. We implement Ebbinghaus-inspired decay curves:

function calculateRetention(
  memory: EpisodicMemory,
  now: number
): number {
  const ageMs = now - memory.timestamp;
  const reinforcements = memory.accessCount;
  // Valence ranges from -1 to 1, so take its magnitude:
  // strong negative emotion stabilizes a memory just as much
  // as strong positive emotion.
  const stability = Math.abs(memory.emotionalValence) * 0.3 +
                    memory.cognitiveLoad * 0.7;

  // Modified Ebbinghaus curve with a one-day time constant
  const ONE_DAY_MS = 24 * 60 * 60 * 1000;
  const baseDecay = Math.exp(-ageMs / ONE_DAY_MS);
  const stabilityBonus = stability * reinforcements * 0.1;

  // Clamp to [0, 1]
  return Math.max(0, Math.min(1, baseDecay + stabilityBonus));
}

Memories with high emotional valence or that have been accessed multiple times decay more slowly. This mirrors how biological memory works - emotionally significant or frequently recalled experiences persist longer.
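The retention score is what the weekly deep-consolidation pass can use to decide what to forget. A minimal sketch, with the retention curve duplicated inline so it runs standalone, and a pruning threshold of 0.05 that is an assumption, not a value from the post:

```typescript
type Mem = {
  timestamp: number;
  accessCount: number;
  emotionalValence: number;
  cognitiveLoad: number;
};

const DAY_MS = 24 * 60 * 60 * 1000;

// Compact version of the retention curve described above
// (valence magnitude used for stability).
function retention(m: Mem, now: number): number {
  const stability = Math.abs(m.emotionalValence) * 0.3 + m.cognitiveLoad * 0.7;
  const baseDecay = Math.exp(-(now - m.timestamp) / DAY_MS);
  return Math.min(1, baseDecay + stability * m.accessCount * 0.1);
}

// Hypothetical pruning threshold; the post does not state one.
const PRUNE_BELOW = 0.05;

// Keep only memories whose retention is still above the threshold.
function prune(memories: Mem[], now: number): Mem[] {
  return memories.filter(m => retention(m, now) >= PRUNE_BELOW);
}
```

A fresh memory scores a full 1.0 and survives, while a months-old memory that was never reinforced decays to effectively zero and is dropped.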

The Results

In our testing with Project Luna, users report that the system "feels different" after a few weeks of use. It's not just remembering facts about them - it's adapting its teaching style, anticipating their struggles, and building on their strengths.

That's the difference between retrieval and genuine learning.

The full implementation is available in MemoryCore on GitHub. Questions or feedback? Get in touch.