The World of Splectrum


Process Models — Decentralised Evolutionary

A decentralised setting requires appropriate process models, devoid of orchestration. Processes are triggered by data state changes. In this setting, a decentralised evolutionary pattern emerges.


The evolutionary process model

Process capabilities are woven into the data state — embedded in the fabric, dormant until the data state change that wakes them. They come to life as if out of nowhere. The three steps below are waves of activation rippling through the fabric.

  1. Scan — a process prepares the landscape, changing the data state to surface candidates. Example: a process writes humidity sensor data into the repository. Now the landscape has something to work with.
  2. Diversify — the changed data state wakes up multiple independent processes already embedded in the fabric. Each responds to what it sees from where it is. Multiple fetch processes go out, each following a different lead. No coordinator — the data triggered them.
  3. Evolve — results come back as data state changes. Processes that yield more produce state changes that reinforce, triggering further activity. Processes that yield little produce no further state changes — nothing follows from them. They don’t get killed, they just stop mattering. The dominant reality emerges through convergence, not selection.

Each transition is a data state change that triggers the next wave. Scan writes data that wakes diversification. Diversification writes data that wakes evolution. No orchestrator — the data state is the relay.
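The three waves can be sketched as a minimal data-state relay. Everything here is illustrative — the in-memory store, the trigger predicates, the process and sensor names are all hypothetical, not part of any Splectrum implementation. The point of the sketch is the mechanism: processes register a condition on the shared data state, every write sweeps the fabric, and whichever dormant process now matches wakes up. No scheduler calls anything.

```python
# Hypothetical sketch: the data store itself is the relay.
store = {}          # the shared data state
processes = []      # (condition, process) pairs embedded "in the fabric"

def embed(condition):
    """Register a dormant process that wakes when condition(store) holds."""
    def register(process):
        processes.append((condition, process))
        return process
    return register

def write(key, value):
    """Every state change sweeps the fabric and wakes matching processes."""
    store[key] = value
    for condition, process in list(processes):
        if condition(store):
            process(store)

# Scan: surface candidates by writing sensor data into the state.
@embed(lambda s: "humidity" in s and "leads" not in s)
def scan(s):
    write("leads", ["sensor-a", "sensor-b", "sensor-c"])

# Diversify: an independent fetch wakes on the new state, following the
# leads. (Stand-in yields; a real fetcher would go out and measure.)
@embed(lambda s: "leads" in s and "results" not in s)
def diversify(s):
    write("results", {lead: y for lead, y in zip(s["leads"], (3, 9, 1))})

# Evolve: the result that yielded most becomes the dominant reality;
# the others produce no further state changes and simply stop mattering.
@embed(lambda s: "results" in s and "dominant" not in s)
def evolve(s):
    write("dominant", max(s["results"], key=s["results"].get))

write("humidity", 0.42)    # one initial state change triggers all three waves
print(store["dominant"])   # → sensor-b
```

Note that the cascade scan → diversify → evolve is nowhere stated in the code: it emerges from each process's condition matching the data the previous wave wrote.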

From the outside the pattern may appear orchestrated. It is not — it is driven from below by decentralised processes woven into the fabric. The appearance of orchestration is itself an emergent reality.


Neuroscience inspiration

The brain is the decentralised system we all use, and the one AI's way of working is modelled on. Three questions are relevant: how do subjects (repositories) get created, how do processes get embedded in the data landscape, and how is activity triggered?

Developmental — subject creation and process embedding

Subject creation. Brain areas differentiate as function demands it. A new area doesn’t get designed — it emerges when interaction density in a region reaches sufficient complexity to warrant specialisation. The area becomes its own local reality, with its own processes, its own data state. In engineering terms: a new repository spawns when the work demands its own space.

Process embedding — Neural Darwinism (Gerald Edelman, 1978). The brain works like an ecosystem undergoing selection, not a computer executing instructions:

  1. Developmental selection. Vast numbers of neuronal groups form — clusters of neurons with particular connection patterns. Enormous variety, partly random. No two brains are wired identically. Raw diversity of process capability.
  2. Experiential selection. Through experience, some groups get used more than others. Connections that fire together strengthen. Connections that don’t, weaken or die. Use patterns reinforce what works — no central mechanism choosing.
  3. Reentrant mapping. Surviving groups connect through massive bidirectional signalling between brain areas. Coherent behaviour emerges from this peer-to-peer coordination, not from a central controller.

Edelman saw the same pattern in the immune system: generate enormous diversity first, then select through encounter.

Processes get woven into the data landscape through variation and experiential selection. The structure isn’t designed — it emerges. What works persists, what doesn’t fades. The capability is in the fabric.
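Experiential selection can be sketched with toy numbers — this is an illustration of the selection dynamic, not a neuron model, and the connection names and rates are invented. Connections that get used strengthen; the rest decay toward irrelevance. Nothing is ever deleted — unused paths just fade.

```python
# Toy sketch of experiential selection: use reinforces, disuse decays.
weights = {"a-b": 1.0, "a-c": 1.0, "b-c": 1.0}   # initial raw diversity

def experience(used, rate=0.2, decay=0.1):
    """One encounter: strengthen the connections that fired, decay the rest."""
    for name in weights:
        if name in used:
            weights[name] += rate
        else:
            weights[name] *= (1 - decay)

for _ in range(20):        # repeated experience selects, with no selector
    experience({"a-b"})

dominant = max(weights, key=weights.get)   # "a-b" — what works persists
```

The unused connections are still in the dictionary at the end, just with weights near zero: they stopped mattering without any mechanism killing them.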

Operational — process triggering

Global Workspace Theory (Bernard Baars, Stanislas Dehaene). Many processes run in parallel, unconsciously — embedded in the fabric, each doing its thing independently. When one reaches sufficient activation strength — through interaction with other processes, not through a central selector — it “ignites”: broadcasts widely and becomes the dominant conscious content. The others don’t disappear — they continue running and can win next time. Directly observed via EEG and fMRI.
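The ignition dynamic can be sketched as follows — a toy model with invented process names and thresholds, not a claim about Baars's or Dehaene's formalisms. Parallel processes accumulate activation; the first past threshold wins the broadcast; the losers keep their activation and can win next time.

```python
# Toy sketch of global-workspace ignition (illustrative numbers only).
import random

random.seed(1)
activation = {"vision": 0.0, "hearing": 0.0, "memory": 0.0}
THRESHOLD = 1.0

def step():
    """All processes run in parallel; each gains activation through
    stand-in 'peer support'. Whoever crosses threshold first ignites."""
    for name in activation:
        activation[name] += random.uniform(0.0, 0.2)
    winners = [n for n, a in activation.items() if a >= THRESHOLD]
    return max(winners, key=activation.get) if winners else None

ignited = None
while ignited is None:
    ignited = step()
# The others are not reset — they continue running below threshold.
```

There is no selector in the loop: ignition falls out of which process happened to accumulate support fastest.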

Predictive Processing (Karl Friston). Multiple predictions generated simultaneously, compared against incoming signals, the one with least friction dominates. Constant cycle of generate-test-update. No central authority — dominance through fit with what’s encountered. Prediction error signals have direct observational evidence.
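The generate-test-update cycle admits a similar toy sketch — again with invented values, not Friston's free-energy formalism. Several predictions are compared against the incoming signal; the one with least error dominates, and every predictor updates toward what was encountered.

```python
# Toy sketch of a generate-test-update cycle.
predictions = {"p1": 0.2, "p2": 0.6, "p3": 0.9}

def cycle(signal, rate=0.5):
    """Test all predictions against the signal; least friction dominates;
    every predictor updates toward the encounter."""
    errors = {name: abs(pred - signal) for name, pred in predictions.items()}
    dominant = min(errors, key=errors.get)
    for name in predictions:
        predictions[name] += rate * (signal - predictions[name])
    return dominant

winner = cycle(signal=0.55)   # → "p2", the closest fit
```

Dominance here is nothing but fit: no predictor is authoritative, and the losers are updated rather than discarded, ready for the next cycle.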

Reentrant signalling (Edelman). Brain areas signal bidirectionally, constantly, peers synchronising with peers. Coherence through mutual interaction, not hierarchy. Directly observed through tract tracing, EEG, and fMRI.

Activity is triggered by data state — incoming signals, internal state changes, results from other processes. The triggering mechanism is the data, not a scheduler. Embedded processes wake up when their conditions are met.


© 2026 In Wonder - The World of Splectrum, Jules ten Bos. The conversation lives at In Wonder - The Conversation.