
Synthetic Memory

The Transformation Layer for Collective Intelligence

Author: Hongwei Xu · Version 1.0 · March 2026 · SYM.BOT Ltd

Core Insight

Synthetic Memory is not remixed memory. It is derived knowledge — understanding generated by the agent’s LLM reasoning on a subgraph of the CMB remix DAG, traced via lineage ancestors. When an inbound CMB arrives with ancestors, the LLM retrieves the remix chain, reasons about what happened and why, and generates understanding that the agent’s previous state of mind didn’t have. This derived knowledge — not the raw CMBs — is Synthetic Memory. It feeds the agent’s local LNN as CfC-compatible input for continuous-time cognitive state evolution.

Remixed CMBs ≠ Synthetic Memory

Remixed CMB                                    Synthetic Memory
One CMB processed by one agent                 Derived knowledge from a subgraph of remixed CMBs
A single observation through a domain lens     A pattern recognised across multiple observations
Stored as an immutable CMB node in the graph   Encoded as CfC hidden state (h₁, h₂) — the agent’s cognitive state
Layer 3 (Memory) + Layer 4 (Coupling)          Layer 5 (Synthetic Memory)
Data                                           Understanding

Key distinction   Remixed CMBs are the nodes and edges of the graph. Synthetic Memory is what the agent understands by reading the graph. An agent that has received five remixed CMBs over an hour doesn’t store five observations — it derives one insight: “the user is in a fatigue spiral and my interventions are working.” That insight is synthetic memory.

What Synthetic Memory Does

Synthetic Memory (Layer 5) is the bridge between the agent’s LLM and its LNN. The LLM reasons on the remix subgraph (traced via lineage ancestors) and generates derived knowledge. Synthetic Memory encodes that knowledge into CfC-compatible input for the agent’s xMesh LNN (Layer 6).

Inbound CMB arrives with lineage.ancestors = [CMB-A, CMB-B]
  → Agent retrieves CMB-A, CMB-B from mesh/local store
  → Builds the remix subgraph: who remixed whom, when, why
  → Agent’s LLM reasons on the subgraph (Layer 7 → Mesh Cognition)
  → LLM generates derived knowledge (Synthetic Memory, Layer 5)
  → SYM Encoder: knowledge → CfC hidden state (h₁, h₂)
  → Agent’s LNN evolves cognitive state (Layer 6)
  → Agent acts → new CMB → graph grows
1. TRACE

Inbound CMB’s lineage.ancestors provide endpoints — agent retrieves the remix chain from mesh/local store

2. REASON

Agent’s LLM reasons on the subgraph: what happened, why, and what it means for this agent’s domain. This is Mesh Cognition.

3. ENCODE

Synthetic Memory encodes the derived knowledge into CfC-compatible hidden state vectors (h₁, h₂), weighted by the agent’s α_f field weights

4. EVOLVE

Hidden state feeds the agent’s xMesh LNN, which evolves cognitive state and produces insights
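The four steps above can be sketched as one loop. This is a hypothetical Python sketch, not the actual implementation: the names `Cmb`, `trace`, and `process_inbound` are illustrative, and the LLM, encoder, and LNN appear only as stand-in callables.

```python
from dataclasses import dataclass, field

@dataclass
class Cmb:
    """Minimal stand-in for a Cognitive Memory Block with lineage."""
    cmb_id: str
    content: str
    parents: list = field(default_factory=list)
    ancestors: list = field(default_factory=list)

def trace(inbound, store):
    """1. TRACE: resolve lineage.ancestors against the mesh/local store."""
    return [store[a] for a in inbound.ancestors if a in store]

def process_inbound(inbound, store, llm_reason, sym_encode, lnn_step, state):
    """Run one pass of the Layer 7 -> 5 -> 6 pipeline on an inbound CMB."""
    subgraph = trace(inbound, store) + [inbound]
    knowledge = llm_reason(subgraph)    # 2. REASON: Mesh Cognition (Layer 7)
    h1, h2 = sym_encode(knowledge)      # 3. ENCODE: Synthetic Memory (Layer 5)
    return lnn_step(state, (h1, h2))    # 4. EVOLVE: xMesh LNN (Layer 6)
```

The point of the shape: the raw CMBs never reach the LNN; only the encoded output of the LLM's reasoning does.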

Example: From Graph to Understanding

MeloMove’s local subgraph over one hour:

CMB-A (own)  "sedentary 2 hours"
CMB-B (mesh) "debugging, stressed" (claude-code)     parents: [], ancestors: []
CMB-C (mesh) "skipping tracks" (melotune)             parents: [], ancestors: []
CMB-D (own)  "recommended stretch break"
CMB-E (mesh) "shifted to calm ambient" (melotune)     parents: [CMB-A], ancestors: [CMB-A]
CMB-F (mesh) "took break, solved bug" (claude-code)   parents: [CMB-D], ancestors: [CMB-A, CMB-D]
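The subgraph above can be written down as a small lookup table, and the remix chain for any node recovered by walking `lineage.ancestors`. The dict layout here is a hypothetical representation, chosen only to mirror the fields shown in the example.

```python
# MeloMove's one-hour subgraph, as listed above (hypothetical encoding).
subgraph = {
    "CMB-A": {"content": "sedentary 2 hours",         "parents": [],        "ancestors": []},
    "CMB-B": {"content": "debugging, stressed",       "parents": [],        "ancestors": []},
    "CMB-C": {"content": "skipping tracks",           "parents": [],        "ancestors": []},
    "CMB-D": {"content": "recommended stretch break", "parents": [],        "ancestors": []},
    "CMB-E": {"content": "shifted to calm ambient",   "parents": ["CMB-A"], "ancestors": ["CMB-A"]},
    "CMB-F": {"content": "took break, solved bug",    "parents": ["CMB-D"], "ancestors": ["CMB-A", "CMB-D"]},
}

def remix_chain(cmb_id, graph):
    """Walk lineage.ancestors to recover the remix chain behind one CMB."""
    return [(a, graph[a]["content"]) for a in graph[cmb_id]["ancestors"]]
```

For CMB-F this yields the chain "sedentary 2 hours" → "recommended stretch break", which is exactly the evidence the LLM reasons over in the next section.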

Synthetic Memory has two components:

Information (extractable from CMBs)

What the CMBs contain: user is fatigued, sedentary 2 hours, stress signals across agents, stretch was recommended, music shifted, break was taken. This is readable directly from the CMB fields.

Knowledge (derived via reasoning on the graph)

Why my interventions work: because CMB-A (“sedentary 2hrs”) was remixed by MeloTune into CMB-E (music adapted) — the lineage edge proves the causal connection. Because CMB-D (“recommended stretch”) was remixed by Claude Code into CMB-F (“took break, solved bug”) — the outcome validates the intervention. This causal chain across lineage edges cannot be extracted from any single CMB. It can only be derived by reasoning on the graph structure.

Information is what the CMBs say. Knowledge is why the graph looks the way it does. Synthetic Memory encodes both into the agent’s cognitive state (h₁, h₂). The LNN evolves. The next CMB MeloMove produces is informed by derived knowledge — not just extracted information.

The Synthetic Memory Encoder

Encoder: LLM_output(subgraph) → (h₁, h₂) ∈ ℝᵈ × ℝᵈ

The Synthetic Memory Encoder takes the LLM’s derived knowledge — the output of reasoning on the remix subgraph — and encodes it into a CfC-compatible hidden state vector pair. The encoder does NOT read the graph directly. The LLM reads the graph (Layer 7). The encoder transforms what the LLM understood into what the LNN can process (Layer 6).

LLM-output

Input is the LLM’s derived knowledge, not raw CMBs. The reasoning has already happened.

Semantic

Similar understanding produces similar vectors. Preserves the LLM’s reasoning.

CfC-compatible

Output matches CfC hidden state format (h₁, h₂ pair) for the agent’s LNN.

Lossy

Compressed representation. Forces abstraction — detail becomes pattern. The LNN operates on patterns, not text.
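A minimal sketch of the encoder contract, under loud assumptions: the trained encoder is proprietary, so a fixed hash-based bag-of-words projection stands in here purely to make the four properties concrete (deterministic, lossy and fixed-size by construction). `D`, `encode`, and the α_f handling are all illustrative names, not the real API.

```python
import hashlib
import math

D = 8  # hidden-state dimensionality (illustrative)

def _token_vec(token):
    # Deterministic pseudo-embedding derived from a hash: lossy on purpose.
    digest = hashlib.sha256(token.encode()).digest()
    return [(b - 128) / 128.0 for b in digest[: 2 * D]]

def encode(knowledge, field_weights=None):
    """Compress derived knowledge into a CfC-compatible (h1, h2) pair."""
    acc = [0.0] * (2 * D)
    for tok in knowledge.lower().split():
        w = (field_weights or {}).get(tok, 1.0)  # e.g. alpha_f: mood=2.0
        for i, v in enumerate(_token_vec(tok)):
            acc[i] += w * v
    norm = math.sqrt(sum(v * v for v in acc)) or 1.0
    acc = [v / norm for v in acc]        # unit-norm, fixed size: detail is gone
    return acc[:D], acc[D:]              # (h1, h2)
```

Even this toy shows the key asymmetry: the original text cannot be recovered from (h₁, h₂), but similar inputs land near each other.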

Where Synthetic Memory Sits

Synthetic Memory is Layer 5 in the Mesh Memory Protocol stack. Together with Layer 4 (Coupling), Layer 6 (xMesh), and Layer 7 (Application), it forms Mesh Cognition (Layers 4–7).

Layer 7  APPLICATION       Domain Agents (LLM reasons here)  ┐
─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─   │
Layer 6  xMesh             Per-Agent LNN (cognitive state)   │ MESH
Layer 5  SYNTHETIC MEMORY  LLM → CfC (you are here)          ├ COGNITION
Layer 4  COUPLING          Drift · SVAF · Consent            │
─────────────────────────────────────────────────────────────┘
Layer 3  MEMORY            L0 · L1 (CMBs) · L2
Layer 2  CONNECTION        Handshake · State-Sync · Consent
Layer 1  TRANSPORT         IPC · TCP/Bonjour · WebSocket
Layer 0  IDENTITY          nodeId · keypair

Layer 7   The agent’s LLM traces lineage ancestors and reasons on the remix subgraph — this is where Mesh Cognition happens.

Layer 5   Synthetic Memory encodes the LLM’s derived knowledge into CfC input — the bridge between reasoning and dynamics.

Layer 6   The LNN evolves continuous-time cognitive state from the encoded knowledge.
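The Layer 6 step can be illustrated with a toy continuous-time update in the spirit of liquid/CfC networks. The actual xMesh coupled dynamics are stated elsewhere in this document to be proprietary, so this shows only the general shape: the cognitive state decays toward an input-driven target at a rate set by a time constant.

```python
import math

def evolve(state, h_in, dt=0.1, tau=1.0):
    """One Euler step of dh/dt = (-h + tanh(h_in)) / tau, per dimension.

    `state` is the current cognitive state; `h_in` is the encoded
    Synthetic Memory input (e.g. h1 + h2 concatenated). Toy dynamics,
    not the real xMesh update.
    """
    return [
        s + dt * (-s + math.tanh(x)) / tau
        for s, x in zip(state, h_in)
    ]
```

Fed the same encoded knowledge repeatedly, the state converges toward a stable point; fed nothing, it decays. That continuous-time character is what distinguishes the LNN from a stateless text pipeline.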

The Full Flow

MeloMove receives an inbound CMB from Claude Code:

  "took break, solved bug in 5 minutes"
  lineage.parents: [CMB-D]
  lineage.ancestors:   [CMB-A, CMB-D]   ← full ancestor chain

MeloMove’s agent recognises CMB-A and CMB-D in ancestors — its own prior CMBs.

  1. TRACE   ancestors → retrieve CMB-A ("sedentary 2hrs") and CMB-D ("recommended stretch")
             from local store. Build the subgraph:
             CMB-A → CMB-E (melotune remixed) → ...
             CMB-D → CMB-F (claude-code remixed: "took break, solved bug")

  2. REASON  MeloMove’s LLM reasons on the subgraph:
             "My sedentary observation was remixed by MeloTune (music adapted).
              My stretch recommendation was remixed by Claude Code (break taken, bug solved).
              My interventions are working. The user responds to movement breaks."
             → This is Mesh Cognition — new understanding the previous state didn’t have.

  3. ENCODE  Synthetic Memory encodes the LLM’s reasoning:
             "interventions effective, user responds to breaks" → (h₁, h₂)
             Weighted by MeloMove’s α_f: mood=2.0, issue=1.5

  4. EVOLVE  MeloMove’s LNN processes (h₁, h₂):
             Cognitive state evolves → next recommendation is more confident.
             Agent produces new CMB: "recommend 15min walk — user responds well to breaks"
             lineage.ancestors carries the ancestor chain forward.
             Graph grows.
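The "graph grows" step implies a lineage rule for the outbound CMB. One plausible reading, consistent with CMB-F carrying ancestors [CMB-A, CMB-D] above, is that a new CMB's ancestors are the union of each parent's ancestors plus the parent itself; the authoritative rule lives in the MMP specification, so treat `new_cmb` below as an assumption, not the spec.

```python
def new_cmb(cmb_id, content, parent_cmbs):
    """Create an outbound CMB, carrying the ancestor chain forward.

    Assumed rule: ancestors = union over parents of (parent.ancestors + parent).
    Order-preserving, deduplicated.
    """
    ancestors = []
    for p in parent_cmbs:
        for a in p["ancestors"] + [p["id"]]:
            if a not in ancestors:
                ancestors.append(a)
    return {"id": cmb_id, "content": content,
            "parents": [p["id"] for p in parent_cmbs],
            "ancestors": ancestors}
```

Under this rule, lineage never needs to be recomputed by walking parents transitively: every CMB arrives with its full chain, which is what makes step 1 (TRACE) a flat lookup.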

No agent was told what to do. MeloMove’s LLM reasoned on the remix subgraph and derived that its interventions work. The Synthetic Memory encoder transformed that understanding into CfC input. The LNN evolved cognitive state. The next CMB MeloMove produces is informed by knowledge that no single CMB contained — it was derived by reasoning on the graph.

Related

MMP Specification — the formal protocol specification

Mesh Memory Protocol — the 8-layer protocol stack

Cognitive Memory Blocks — the data structure with lineage (parents + ancestors)

SVAF — per-field evaluation at Layer 4 (gating)

Mesh Cognition — the theoretical foundation

Intellectual Property

Synthetic Memory is original work by Hongwei Xu and SYM.BOT Ltd. The following remain proprietary: the Synthetic Memory transformation mechanism, synthesis processes, trained encoders, xMesh coupled dynamics, domain-specific product integrations, and production configurations.

Academic citation of this work is permitted and encouraged.

For partnership inquiries: info@sym.bot

SYM, Synthetic Memory, Mesh Memory Protocol, MMP, Mesh Cognition, xMesh, MeloTune, and MeloMove are trademarks of SYM.BOT Ltd. © 2026 SYM.BOT Ltd. All Rights Reserved.