Protocol Specification

Mesh Memory Protocol

The Protocol for Collective Intelligence

Author: Hongwei Xu · Version 2.0 · March 2026 · SYM.BOT Ltd

Formal specification   The normative protocol specification with wire format, frame types, conformance requirements, and RFC 2119 language is published at sym.bot/spec/mmp. This page provides the conceptual overview and design rationale.

Abstract

The Mesh Memory Protocol (MMP) enables agent-to-agent collective intelligence. There is no mesh without agents. Agents are the mesh. No servers — only nodes. MMP defines how agents identify, connect, store memory, and synthesise cognitive state — across eight layers from Identity to Application. Within MMP, Synthetic Memory (Layer 5) transforms any agent’s memory into neural input. xMesh (Layer 6) is each agent’s own Liquid Neural Network, perceiving and predicting on the remix graph of CMBs — the DAG where nodes are immutable CMBs and edges are lineage (who remixed whom). Together with the Coupling layer, they form Mesh Cognition (Layers 4–6). Memory is not shared or copied — it is remixed. Each agent creates new memory from mesh signals through its own domain lens.

Architecture

MMP 8-layer architecture — Identity, Transport, Connection, Memory, Coupling (SVAF + Consent), Synthetic Memory (LLM-derived knowledge from remix subgraph), xMesh (per-agent LNN, continuous-time cognitive state), Application (domain agents). Mesh Cognition spans Layers 4–6. Inbound CMBs with lineage.ancestors flow up through coupling to the agent LLM for reasoning, then down through Synthetic Memory to the LNN.
Layer 7: APPLICATION

Domain-specific behaviour — music, code, fitness, robotics, BCI

Layer 6: xMesh

Per-agent LNN — continuous-time cognitive state evolution

Layer 5: SYNTHETIC MEMORY

LLM-derived knowledge from remix subgraph via lineage ancestors

Layer 4: COUPLING

Drift · SVAF per-field evaluation · consent · alignment decisions

Layer 3: MEMORY

L0 events · L1 structured (CMBs) · L2 cognitive state

Layer 2: CONNECTION

Handshake · state-sync · peer gossip · wake · consent

Layer 1: TRANSPORT

IPC (local) · TCP/Bonjour (LAN) · WebSocket (internet) · APNs push (wake)

Layer 0: IDENTITY

nodeId · name · cryptographic keypair

Upward flow   Agent CMBs → SVAF evaluates per field → LLM reasons on remix subgraph → Synthetic Memory encodes derived knowledge → LNN evolves cognitive state.

Downward flow   Agent acts → new CMB → graph grows. The cognition layer gates what memory enters the mesh.

Node Model

There is no mesh without agents. Agents are the mesh. No servers — only nodes.

Every participant is a node. A relay is a node. A phone is a node. A speaker is a node. A Claude Code session is a node. There is no architectural distinction between a “server” and a “client.”

Physical and Virtual Nodes

A physical node is the device's persistent presence on the mesh — it holds a long-lived identity, maintains transport connections, and stores state.

A virtual node is an application-level agent — ephemeral, connects to a physical node via local IPC. It comes and goes without disrupting the mesh.

MacBook (physical node: sym-daemon)
├── Claude Code (virtual, ephemeral)
├── MeloTune Mac (virtual, ephemeral)
└── Any MCP client (virtual, ephemeral)

iPhone — MeloTune (physical node: app process)
├── SymMeshService (virtual, internal)
└── Music pipeline (virtual, internal)

iPhone — MeloMove (physical node: app process)
├── SymMeshService (virtual, internal)
└── Recommendation engine (virtual, internal)

Cloud (physical node: relay process)
└── Telegram bot (virtual, co-hosted)

On macOS/Linux: The physical node is a daemon (launchd / systemd) that runs independently of any application. Apps connect via local IPC.

On iOS: The app IS the physical node. iOS does not permit background daemons. The app maximises persistence through layered background modes (audio, BLE, Network Extension, silent push).

Ideally, MMP would be an OS-level service — like TCP/IP. Until platforms adopt it, apps must simulate this.

Philosophical Foundation

MMP’s layer ordering follows the structure of cognition itself. Cognition emerges from memory. Memory is the substrate; cognition is the process that arises from it.

Aristotle — Metaphysics

Actuality (ἐνέργεια) presupposes potentiality (δύναμις). You cannot actualise what has not been stored.

Friston — Free Energy Principle

Without stored generative models, there is no baseline for prediction, and therefore no cognition. Cognition is the minimisation of surprise relative to a generative model.

Shannon — Information Theory

You cannot compress what has not been stored. Memory is the signal; cognition is the compression.

Maturana — Autopoiesis

Cognition is the history of structural coupling. Without accumulated coupling history (memory), there is no cognition.

Bidirectionality

Buddhist dependent origination shows the relationship is cyclical. The Yogācāra school’s ālaya-vijñāna both stores experiential seeds AND generates conscious moments. MMP models this as a feedback loop: memory feeds cognition, cognition governs memory remixing, remixed memories update stores, which update cognitive state.

Three Design Principles

Withdrawal

Object-Oriented Ontology (Harman)

An agent can never be fully known by its peers. Partial knowledge is an ontological condition. The coupling engine selects which aspects to surface.

Resonance over Decision

Taoism — Wu Wei

The coupling engine detects natural affinities rather than computing rigid rules. The protocol creates conditions for the right information to flow through resonance, not prescription.

Extended Cognition

Clark & Chalmers

When remixed memories are reliably available and automatically endorsed, the mesh becomes a genuinely extended cognitive system.

Memory Layers

Not all memory should leave the agent. MMP defines three layers — a gradient of privacy following OOO’s withdrawal principle. Memory is never copied. It is remixed.

Layer   Name         Leaves agent?    Description
L0      Events       No               Raw events, sensor data, interaction traces. Ephemeral. Local only.
L1      Structured   Via synthesis    Content + tags + source. Synthetic Memory transforms L1 into xMesh input.
L2      Cognitive    Via state-sync   CfC hidden state vectors. Input to xMesh coupled dynamics.
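The three layers and their disclosure scopes can be expressed as types. A sketch in TypeScript, with field names assumed for illustration (the published wire format may differ):

```typescript
// Illustrative types for the three memory layers. The scope literal
// encodes each layer's boundary of disclosure.
interface L0Event {
  layer: 0;
  scope: "local-only";      // raw events never leave the agent
  payload: unknown;
  timestamp: number;
}

interface L1Structured {
  layer: 1;
  scope: "via-synthesis";   // transformed by Synthetic Memory, never copied
  content: string;
  tags: string[];
  source: string;
}

interface L2Cognitive {
  layer: 2;
  scope: "via-state-sync";  // CfC hidden state vectors
  hidden: [number[], number[]];
}

type MemoryRecord = L0Event | L1Structured | L2Cognitive;

// Only L1 and L2 may cross the node boundary, and never as raw copies.
function mayLeaveAgent(m: MemoryRecord): boolean {
  return m.scope !== "local-only";
}
```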

Peer Gossip

MMP uses SWIM-style gossip for peer metadata propagation. When two nodes handshake, they exchange what they know about other peers — wake channels, capabilities, last seen timestamps.

This solves the wake bootstrap problem: a node that has never been online at the same time as a sleeping peer can still learn its wake channel through gossip from an always-on node (like the relay).

1. MeloTune → relay: handshake (includes wake channel)
2. MeloTune disconnects (iOS suspended)
3. Claude Code → relay: handshake
4. Relay gossips MeloTune's wake channel to Claude Code
5. Claude Code wakes MeloTune via APNs
6. MeloTune reconnects, receives mood signal

The relay gossips because it’s a peer — not because it’s special. Any always-on node serves the same role. This is emergent, not designed.
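The gossip merge that makes this work can be sketched as a last-writer-wins fold over peer metadata: on handshake, a node folds the remote peer table into its own, keeping whichever entry is fresher. Field names here are assumptions, not the normative peer-info frame.

```typescript
// Sketch of SWIM-style peer-metadata merge. Newer information wins;
// stale gossip never overwrites fresher local knowledge.
interface PeerInfo {
  nodeId: string;
  wakeChannel?: string;     // e.g. a push token, learnable via gossip
  capabilities: string[];
  lastSeen: number;         // unix ms
}

function mergePeerTables(
  local: Map<string, PeerInfo>,
  gossiped: PeerInfo[],
): Map<string, PeerInfo> {
  const merged = new Map(local);
  for (const peer of gossiped) {
    const known = merged.get(peer.nodeId);
    if (!known || peer.lastSeen > known.lastSeen) {
      merged.set(peer.nodeId, peer);
    }
  }
  return merged;
}
```

In the wake-bootstrap scenario above, Claude Code's table starts empty; one merge with the relay's gossip is enough to learn MeloTune's wake channel.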

Cognitive Mechanics

SYM Encoding: Memory → xMesh Input

h(nᵢ) = SYM(M(nᵢ)) = (h₁, h₂) ∈ ℝᵈ × ℝᵈ

Synthetic Memory transforms the LLM’s derived knowledge into a CfC-compatible hidden state pair. The output feeds into the agent’s own xMesh LNN, which evolves continuous-time cognitive state.

Drift: Cognitive Distance

Kuramoto phase coherence:

δ(nᵢ, nⱼ) = 1 − (1/d) |Σₖ₌₁ᵈ exp(i(φₖⁱ − φₖʲ))|    δ ∈ [0, 1]

δ = 0: identical cognitive states. δ = 1: maximally divergent.
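The drift formula can be implemented directly. A minimal sketch, assuming each agent's cognitive state is reduced to d phase angles:

```typescript
// Kuramoto phase-coherence drift: δ = 1 − |mean resultant vector|.
// phiI and phiJ are d-dimensional phase vectors (radians).
function drift(phiI: number[], phiJ: number[]): number {
  const d = phiI.length;
  let re = 0;
  let im = 0;
  for (let k = 0; k < d; k++) {
    const diff = phiI[k] - phiJ[k];
    re += Math.cos(diff);
    im += Math.sin(diff);
  }
  // Coherence |Σ exp(i·diff)| / d lies in [0, 1], so δ does too.
  const coherence = Math.sqrt(re * re + im * im) / d;
  return 1 - coherence;
}
```

Note that coherence is invariant to a uniform phase offset: two states differing by the same shift in every dimension still couple as identical.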

Coupling Decision

κ(nᵢ, nⱼ) =

aligned    if δ < 0.3

guarded    if 0.3 ≤ δ < 0.5

rejected   otherwise

Asymmetric, dynamic, autonomous. The coupling engine evaluates relevance. Intelligence emerges from the combination of agents, not from any single component — no pub/sub topics needed.
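The thresholds translate to a small pure function (the 0.3 and 0.5 boundaries are the ones given above; everything else is illustration):

```typescript
// Coupling decision from peer-level drift δ ∈ [0, 1].
type Coupling = "aligned" | "guarded" | "rejected";

function couple(delta: number): Coupling {
  if (delta < 0.3) return "aligned";
  if (delta < 0.5) return "guarded";
  return "rejected";
}
```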

SVAF: Content-Level Drift

Peer-level coupling evaluates “is this agent cognitively aligned?” — but cognitively distant agents can still send relevant signals. SVAF evaluates each memory’s content at the per-field level using Cognitive Memory Blocks:

Per-field content evaluation (receiver-side):

CMB = decompose(memory) → 7 field vectors

δ_f = drift(incoming_f, local_f)   for each field f

g_f = gate(incoming_f, local_f, τ, confidence)   learned per field

κ ∈ { aligned, guarded, rejected }   from the per-field drift profile

Unlike scalar evaluation (one cosine score for the whole signal), SVAF evaluates each semantic dimension independently. A signal with relevant mood but irrelevant focus produces a clear per-field drift profile — not an ambiguous aggregate score.

Production example:

Claude Code → sym_remember("user sedentary 2hrs, stressed")
  → CMBEncoder decomposes into 7 fields
  → daemon mesh → MeloMove
  → peer drift: 1.05 (coding ≠ fitness — rejected at peer level)
  → SVAF per-field: mood relevant, issue relevant, focus irrelevant
  → content accepted — mood and issue override peer rejection
  → MeloMove synthesises recovery workout recommendation
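The per-field decision can be sketched as follows. The spec defines seven CMB fields; the sketch uses the three named in the example (mood, issue, focus) and a fixed threshold τ in place of the learned per-field gates, so it is an approximation of the mechanism, not the production gating.

```typescript
// Illustrative SVAF content decision from a per-field drift profile.
// A field with drift below τ counts as relevant.
type FieldDrift = Record<string, number>;

function svafDecision(
  perField: FieldDrift,
  tau = 0.5, // assumed threshold; the real gates are learned per field
): "aligned" | "guarded" | "rejected" {
  const drifts = Object.values(perField);
  const relevant = drifts.filter((d) => d < tau).length;
  if (relevant === drifts.length) return "aligned";
  if (relevant > 0) return "guarded"; // mixed profile: some fields pass
  return "rejected";
}
```

A mixed profile like the one in the production trace (relevant mood and issue, irrelevant focus) yields "guarded" rather than an outright rejection, which is exactly how content-level relevance can override peer-level drift.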

The Full Flow

M(agent) —SYM→ h —MMP→ peers —δ→ κ —xMesh LNN→ cognitive state → agent acts → new CMB → mesh

Agent memory is transformed by SYM into hidden state. MMP transports it to peers. The coupling engine (δ, κ) gates what enters each agent’s xMesh LNN. The LNN perceives the remix graph — inbound CMBs with lineage — and evolves cognitive state. The agent acts, produces a new CMB, and the graph grows.

Frame Types

Frame          Coupling-gated   Purpose
handshake      No               Identity exchange
state-sync     No               Cognitive state exchange (L2)
peer-info      No               Gossip peer metadata (wake channels, capabilities)
memory-share   SVAF             Broadcast an L1 memory signal (receiver synthesises through its own domain lens)
mood           Evaluated        Affective state — valence (-1 to 1) + arousal (-1 to 1). Dual representation: numeric for direct comparison, text for semantic context.
message        No               Direct communication
wake           Autonomous       Wake a sleeping node (out-of-band push)
ping / pong    No               Liveness check
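The frame table maps naturally onto a discriminated union. Payload fields beyond `type` are illustrative assumptions, not the normative wire format at sym.bot/spec/mmp.

```typescript
// Sketch of the frame taxonomy as a tagged union.
type Frame =
  | { type: "handshake"; nodeId: string; name: string }
  | { type: "state-sync"; hidden: number[] }
  | { type: "peer-info"; peers: { nodeId: string; wakeChannel?: string }[] }
  | { type: "memory-share"; content: string; tags: string[] }
  | { type: "mood"; valence: number; arousal: number; text: string }
  | { type: "message"; to: string; body: string }
  | { type: "wake"; target: string }
  | { type: "ping" }
  | { type: "pong" };

// Only memory-share (SVAF) and mood (evaluated) pass through
// coupling evaluation; the rest are plain protocol machinery.
function isCouplingGated(f: Frame): boolean {
  return f.type === "memory-share" || f.type === "mood";
}
```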

Design Decisions

Why every daemon is a relay peer

If relaying is “infrastructure,” it needs special treatment. If every daemon can relay, everything comes for free through standard protocol behaviour — handshake, gossip, state retention. Relay peers are “dumb” in that they don’t evaluate coupling, but they participate fully in connection lifecycle and gossip. No single point of failure.

Why physical/virtual separation

Apps restart, crash, update. The mesh shouldn’t break because Claude Code started a new session. The physical node owns the mesh presence; virtual nodes borrow it. This follows the OpenClaw pattern: the Gateway daemon persists independently of any agent session.

Why no pub/sub topics

The coupling engine evaluates relevance. Intelligence emerges from the combination of agents, not from any single component. Adding topics would second-guess autonomous coupling.

Why no consensus protocol

There is no “correct” global state — only convergent local states. Each node is self-producing (autopoietic).

Why memory layers

L0 stays private. L1 is gated by coupling. L2 is always exchanged. Each layer is a boundary of disclosure — following OOO’s withdrawal.

Implementation

Node.js

@sym-bot/sym — CLI, daemon, mesh protocol, coupling engine (npm)

Swift

sym-swift — iOS/macOS SDK (Swift Package Manager)

References

SYM.BOT Research

Xu, H. (2026). Mesh Cognition: Distributed Intelligence Through Coupled Continuous-Time Neural Networks. SYM.BOT Research.

Xu, H. (2026). Cognitive Memory Blocks: Structured Semantic Units for Multi-Agent Memory. SYM.BOT Research.

Xu, H. (2026). Synthetic Memory: The Transformation Layer for Collective Intelligence. SYM.BOT Research.

Xu, H. (2026). SVAF: Per-Field Memory Evaluation for Multi-Agent Systems. SYM.BOT Research.

Neural Networks & Coupling Theory

Hasani, R. et al. (2022). Closed-form continuous-time neural networks. Nature Machine Intelligence, 4, 992–1003.

Kuramoto, Y. (1975). Self-entrainment of a population of coupled non-linear oscillators. Lecture Notes in Physics, 39, 420–422.

Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Reviews Neuroscience, 11, 127–138.

Multi-Agent Systems

Sukhbaatar, S. et al. (2016). Learning multiagent communication with backpropagation. NeurIPS.

Das, A. et al. (2019). TarMAC: Targeted multi-agent communication. ICML.

Mindverse (2025). AI-native Memory 2.0: Second Me. arXiv:2503.08102.

Philosophical Foundations

Aristotle. Metaphysics.

Harman, G. Object-Oriented Ontology.

Clark, A. & Chalmers, D. (1998). The Extended Mind. Analysis, 58(1), 7–19.

Maturana, H. & Varela, F. (1980). Autopoiesis and Cognition. Reidel.

Laozi. Tao Te Ching.

Intellectual Property

The Mesh Memory Protocol is original work by Hongwei Xu and SYM.BOT Ltd. The following remain proprietary: trained CfC models and training procedures, SYM transformation mechanisms, xMesh coupled dynamics, domain-specific product integrations, and production configurations.

Academic citation of this work is permitted and encouraged.

For partnership inquiries: info@sym.bot

Mesh Memory Protocol, MMP, SYM, Synthetic Memory, Mesh Cognition, xMesh, MeloTune, and MeloMove are trademarks of SYM.BOT Ltd. © 2026 SYM.BOT Ltd. All Rights Reserved.