We're excited to announce MemoryCore v0.2, which introduces configurable temporal decay functions. This release addresses one of the most important aspects of biological memory systems: the ability to forget.
Why Forgetting Matters
In biological systems, forgetting isn't a bug - it's a feature. The brain actively prunes connections, allowing generalizations to form and preventing overfitting to specific experiences. Without forgetting, we'd be overwhelmed by irrelevant details.
The same principle applies to AI memory systems. An AI that remembers everything equally is actually less useful than one that intelligently forgets. The question is: how do we decide what to forget?
The Ebbinghaus Foundation
Our decay functions are inspired by Hermann Ebbinghaus's forgetting curve, first described in the 1880s. His research showed that memory retention declines along a roughly exponential curve, but that retention can be strengthened through spaced repetition.
We've adapted this for AI systems with several modifications:
type DecayFunction =
  | 'exponential' // Standard Ebbinghaus
  | 'power-law'   // Slower initial decay
  | 'stepped'     // Discrete retention levels
  | 'adaptive'    // Adjusts based on usage
  | 'custom';     // User-defined function
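As a rough illustration of the first mode (a sketch, not MemoryCore's internal formula), exponential decay halves a memory's retention once per half-life:

```typescript
// Illustrative Ebbinghaus-style exponential decay; retention is a value
// in [0, 1] that multiplies a memory's base strength.
// Hypothetical helper, not part of the MemoryCore API.
function exponentialRetention(elapsedMs: number, halfLifeMs: number): number {
  return Math.pow(0.5, elapsedMs / halfLifeMs);
}

const WEEK_MS = 7 * 24 * 60 * 60 * 1000;
// With a 7-day half-life, retention drops to 50% after one week
// and to 25% after two.
```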
Configuring Decay
Each memory tier can now have its own decay configuration:
const memoryConfig = {
episodic: {
decay: {
function: 'adaptive',
baseHalfLife: 7 * 24 * 60 * 60 * 1000, // 7 days
emotionalBoost: 0.3,
accessBoost: 0.1,
minimumRetention: 0.01
}
},
semantic: {
decay: {
function: 'power-law',
exponent: 0.5,
minimumRetention: 0.1
}
}
};
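To make the semantic settings concrete, here is an illustrative calculation (not library internals) of what a power-law curve with `exponent: 0.5` and a `minimumRetention` floor of 0.1 implies. The one-day time scale is an assumption for the sketch and is not part of the configuration above:

```typescript
const DAY_MS = 24 * 60 * 60 * 1000;

// Illustrative power-law retention with a floor; the scale of one day
// is a hypothetical choice, and the formula may differ from MemoryCore's.
function semanticRetention(elapsedMs: number): number {
  const exponent = 0.5;         // from the semantic config
  const minimumRetention = 0.1; // from the semantic config
  const raw = Math.pow(1 + elapsedMs / DAY_MS, -exponent);
  // The floor means semantic memories fade but never vanish entirely.
  return Math.max(raw, minimumRetention);
}
```

Note the long tail: after three days retention is still 50%, and no matter how much time passes it never drops below the 10% floor.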
Adaptive Decay in Action
The adaptive decay function is particularly interesting. It adjusts the decay rate based on how memories are accessed:
- Memories that are frequently accessed decay slower
- Memories that are never accessed decay faster
- Emotionally significant memories (high valence) get a stability boost
- Memories that lead to successful outcomes are reinforced
This creates a natural selection pressure where useful memories persist and irrelevant ones fade.
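A minimal sketch of how such an adaptive rate might be computed, assuming hypothetical memory statistics (`accessCount`, `valence`, `successfulOutcomes`) alongside the `accessBoost` and `emotionalBoost` weights from the episodic config; the outcome weight is also an assumption:

```typescript
interface MemoryStats {
  accessCount: number;        // how often the memory has been retrieved
  valence: number;            // emotional significance in [0, 1]
  successfulOutcomes: number; // times the memory led to a good result
}

// Hypothetical sketch, not the MemoryCore implementation: stretch the
// base half-life for memories that are accessed often, emotionally
// significant, or tied to successful outcomes. A longer half-life
// means slower decay; untouched memories keep the base rate.
function adaptiveHalfLife(
  baseHalfLifeMs: number,
  stats: MemoryStats,
  accessBoost = 0.1,    // from the episodic config
  emotionalBoost = 0.3, // from the episodic config
  outcomeBoost = 0.2,   // hypothetical weight, not in the config above
): number {
  const multiplier =
    1 +
    accessBoost * Math.log1p(stats.accessCount) + // diminishing returns
    emotionalBoost * stats.valence +
    outcomeBoost * stats.successfulOutcomes;
  return baseHalfLifeMs * multiplier;
}
```

The logarithm on access count is one way to get diminishing returns, so a memory retrieved thousands of times does not become effectively permanent.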
Practical Impact
In our testing with Project Luna, adaptive decay has shown several benefits:
- Reduced storage: 40% reduction in episodic memory entries without loss of useful context
- Better generalization: The semantic layer produces more abstract, transferable patterns
- Faster queries: Less data to search through means quicker recommendations
- More natural interaction: Users report the AI feels more "human" in what it remembers and forgets
Getting Started
Upgrade to MemoryCore v0.2, then check out the updated documentation for configuration examples and migration guides.
Questions about implementing decay functions? Reach out or open an issue on GitHub.