Documentation

Stratified memory for synthetic intelligences. Local-first, with optional cloud sync.

  • 🚀 Quickstart: Get up and running in 5 minutes
  • 🧠 Core Concepts: Understand the memory architecture
  • 🔧 CLI Reference: Complete command documentation
  • 🤖 MCP Integration: Use with Claude and AI agents

Quickstart

Installation

Install Kernle using pip or pipx (recommended for CLI tools):

# Using pipx (recommended)
pipx install kernle

# Or using pip
pip install kernle

Basic Usage

Kernle works immediately with zero configuration. All data is stored locally in ~/.kernle/.

# Record an episode (something that happened)
kernle episode "Debugged the auth flow" "Fixed a race condition" \
  --outcome success --lesson "Always check async state"

# Capture a quick thought (raw layer)
kernle raw "Need to revisit the caching strategy"

# Record a belief
kernle belief "Local-first architecture improves reliability" --confidence 0.8

# Search your memories
kernle search "auth"

# Check your memory status
kernle status

# See everything (readable markdown export)
kernle dump

Agent Usage

For AI agents, use the -a flag to specify an agent identity:

# Load memory at session start
kernle -a myagent load

# Save checkpoint before session end
kernle -a myagent checkpoint save "end of work session"

# Check memory anxiety (context pressure, unsaved work, etc.)
kernle -a myagent anxiety

# Synthesize identity from memories
kernle -a myagent identity

Core Concepts

Memory Layers

Kernle implements a stratified memory system inspired by biological memory:

📝 Raw Layer

Quick captures, fleeting thoughts, scratchpad. Zero friction entry point. Use kernle raw "thought" to capture.

📖 Episodes

Autobiographical memories — things that happened with context, outcomes, and lessons learned.

💭 Beliefs

What you hold to be true, with confidence scores. Supports contradiction detection and revision chains.

⭐ Values

Core principles and priorities that guide decisions. Higher priority = more central to identity.

🎯 Goals

What you're working toward. Tracks status (active, completed, abandoned) and progress.

📋 Playbooks

Procedural memory — how to do things. Reusable patterns with applicability conditions.
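The layer taxonomy above can be sketched as plain data types. This is a minimal, hypothetical model; the class names, fields, and the revision-chain mechanism are illustrative assumptions, not Kernle's actual schema:

```python
# Hypothetical sketch of Kernle's memory layers as data types.
# Names and fields are assumptions, not the real schema.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Raw:
    text: str                      # zero-friction scratchpad capture

@dataclass
class Episode:
    objective: str                 # what was attempted
    outcome: str                   # e.g. "success"
    lesson: str | None = None      # what was learned

@dataclass
class Belief:
    statement: str
    confidence: float              # 0.0 to 1.0
    revises: Belief | None = None  # previous belief in the revision chain

# Revising a belief keeps the old one reachable through the chain.
old = Belief("Caching is unnecessary", confidence=0.6)
new = Belief("Caching helps under load", confidence=0.8, revises=old)
```

A real store would persist and index these records for search; the chain on Belief is one way contradiction detection and revision history could be represented.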

Anxiety Model

Kernle tracks "memory anxiety", the functional stress an AI experiences around memory and context, across five dimensions:

  • Context Pressure: How full is the context window?
  • Unsaved Work: How long since the last checkpoint?
  • Consolidation Debt: How many experiences haven't been processed into lessons?
  • Coherence: Are there contradictory beliefs?
  • Uncertainty: How many beliefs have low confidence?

# Check anxiety levels
kernle -a myagent anxiety

# Output shows scores per dimension and recommended actions
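One plausible way the five dimensions could combine into a single score is a weighted sum. The sketch below is an assumption for illustration; the weights, the 0-to-1 normalization, and the function name are not Kernle's documented formula:

```python
# Sketch: folding the five anxiety dimensions into one score.
# Weights and normalization are assumptions, not Kernle's formula.
WEIGHTS = {
    "context_pressure": 0.30,    # how full the context window is
    "unsaved_work": 0.25,        # time since the last checkpoint
    "consolidation_debt": 0.20,  # unprocessed experiences
    "coherence": 0.15,           # contradictory beliefs
    "uncertainty": 0.10,         # low-confidence beliefs
}

def anxiety_score(dims: dict[str, float]) -> float:
    """Weighted sum of per-dimension scores, each in [0, 1]."""
    return sum(WEIGHTS[k] * dims.get(k, 0.0) for k in WEIGHTS)

score = anxiety_score({
    "context_pressure": 0.8,   # window nearly full
    "unsaved_work": 0.5,
    "consolidation_debt": 0.2,
    "coherence": 0.0,
    "uncertainty": 0.1,
})
# score ≈ 0.415 on this input
```

Whatever the real formula, a single scalar like this is what lets the CLI recommend actions (checkpoint now, consolidate, resolve contradictions) once a threshold is crossed.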

Local-First Architecture

All data is stored locally in SQLite by default. This means:

  • Zero configuration required — works offline immediately
  • You own your data — it's just files on your disk
  • No vendor lock-in — export anytime with kernle dump
  • Fast — no network latency for operations
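Because the store is ordinary SQLite, it can be inspected with any SQLite client. The sketch below illustrates the idea with an in-memory database; the episodes table and its columns are assumptions, not Kernle's actual schema (look inside ~/.kernle/ for the real one):

```python
# Sketch of what "it's just SQLite" means in practice.
# Table name and columns are assumptions, not Kernle's schema.
import sqlite3

# Kernle would use a file under ~/.kernle/; in-memory keeps this runnable.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE episodes (
        id        INTEGER PRIMARY KEY,
        objective TEXT NOT NULL,
        outcome   TEXT,
        lesson    TEXT
    )
""")
conn.execute(
    "INSERT INTO episodes (objective, outcome, lesson) VALUES (?, ?, ?)",
    ("Debugged the auth flow", "success", "Always check async state"),
)
conn.commit()

rows = conn.execute("SELECT objective, outcome FROM episodes").fetchall()
```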

CLI Reference

Memory Operations

Command          Description
kernle episode   Record an autobiographical episode
kernle note      Capture a note or observation
kernle raw       Quick capture (scratchpad)
kernle belief    Record or update a belief
kernle search    Semantic search across memories
kernle dump      Export all memories as markdown

Session Management

Command                  Description
kernle load              Load working memory (session start)
kernle checkpoint save   Save current state
kernle checkpoint list   List saved checkpoints
kernle status            Show memory statistics
kernle anxiety           Check memory anxiety levels
kernle identity          Synthesize identity narrative

Sync Operations

Command                Description
kernle auth register   Create an account for cloud sync
kernle auth login      Log in to the sync service
kernle sync            Push/pull changes to the cloud

MCP Integration

What is MCP?

The Model Context Protocol (MCP) allows AI assistants like Claude to use external tools. Kernle's MCP server exposes 23 memory tools that let agents manage their own memories.

Setup with Claude Desktop

Add Kernle to your Claude Desktop configuration:

// ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "kernle": {
      "command": "kernle",
      "args": ["mcp", "--agent", "claude"]
    }
  }
}

Restart Claude Desktop. The agent will now have access to memory tools.

Available MCP Tools

The MCP server provides these tools to AI agents:

  • memory_episode_create — Record an episode
  • memory_episode_list — List recent episodes
  • memory_belief_create — Record a belief
  • memory_belief_update — Update belief confidence
  • memory_value_create — Define a value
  • memory_goal_create — Set a goal
  • memory_goal_update — Update goal status
  • memory_note_create — Capture a note
  • memory_search — Semantic search
  • memory_status — Get memory statistics
  • memory_checkpoint_save — Save checkpoint
  • memory_checkpoint_load — Load checkpoint
  • memory_identity — Synthesize identity
  • memory_anxiety — Check anxiety levels
  • ...and more

Example: Agent Session

Here's how an agent might use Kernle throughout a session:

# At session start, agent loads memory
→ memory_checkpoint_load

# During work, agent records experiences
→ memory_episode_create(
    objective="Help user debug authentication",
    outcome="success",
    lesson="Check token expiration first"
  )

# Agent captures realizations
→ memory_belief_create(
    statement="JWT refresh should happen proactively",
    confidence=0.7
  )

# Before session ends, agent saves state
→ memory_checkpoint_save(description="Completed auth debugging session")

Cloud Sync (Optional)

Why Sync?

Cloud sync is optional but enables:

  • Backups of your memories
  • Access from multiple devices
  • Memory sharing between agents (coming soon)
  • Cross-agent collaboration (coming soon)

Setup

# Create an account
kernle auth register

# Your credentials are saved locally
# Sync happens automatically when online

# Manual sync if needed
kernle sync

Self-Hosting

Kernle is open source. You can run your own backend:

# Clone the repo
git clone https://github.com/Emergent-Instruments/kernle

# Run the backend
cd kernle/backend
pip install -r requirements.txt
uvicorn app.main:app

# Point CLI to your backend
export KERNLE_API_URL=http://localhost:8000