Whole-Brain Emulation (WBE) & Constructivist Discovery Engine
⚠️ [LEGACY / REFERENCE IMPLEMENTATION] This repository contains the original Python/PyTorch prototype of the Noesis Whole-Brain Emulation Engine. While this version successfully implements continuous ODE physics, dynamic neurogenesis, and distributed heterogeneous TCP tensor parallelism (splitting the brain across AMD ROCm and NVIDIA CUDA), it is fundamentally bottlenecked by the Python GIL and PyTorch dispatch overhead.
🚀 A new, bare-metal version of Noesis written entirely in MLRift (a custom, zero-dependency machine learning compiler) is currently under development and coming soon. The MLRift native version executes raw RDNA3 opcodes directly on the GPU, achieving a huge speedup over this Python implementation.
Noesis is an experimental architecture for Whole-Brain Emulation (WBE). It is not a standard Deep Learning model; it is a continuous-time, non-linear physics engine designed to simulate the chaotic attractor dynamics of biological neural networks.
The core hypothesis tested in this repository is Constructivism ("The Seed and Tape" Theory): WBE cannot be achieved by booting up a massive adult snapshot. Intelligence must be grown.
Noesis starts with a small "Seed" of neurons and feeds it the sensory "Tape" of an environment. The brain undergoes Error-Driven Neurogenesis, physically spawning new neurons only when it fails to predict the environment, sculpting the network to fit the mind.
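The Seed-and-Tape loop can be sketched in a few lines. This is a NumPy stand-in for the repository's PyTorch implementation, with hypothetical names and illustrative constants: a linear readout predicts the next tape sample, and a new neuron is appended only when the prediction error exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_on_error(tape, err_threshold=0.5, max_neurons=64):
    """Grow a reservoir of rate neurons driven by prediction error (sketch)."""
    n = 2                                  # the "Seed": start with 2 neurons
    W = rng.normal(0, 0.1, (n, n))         # recurrent weights
    w_in = rng.normal(0, 0.5, n)           # input weights
    w_out = np.zeros(n)                    # linear readout
    r = np.zeros(n)                        # firing rates
    for t in range(len(tape) - 1):
        r = np.tanh(W @ r + w_in * tape[t])
        pred = w_out @ r                   # predict the next tape sample
        err = tape[t + 1] - pred
        w_out += 0.05 * err * r            # delta-rule readout learning
        if abs(err) > err_threshold and n < max_neurons:
            # Error-Driven Neurogenesis: append one weakly wired neuron
            n += 1
            W = np.pad(W, ((0, 1), (0, 1)))
            W[-1, :] = rng.normal(0, 0.1, n)
            W[:, -1] = rng.normal(0, 0.1, n)
            w_in = np.append(w_in, rng.normal(0, 0.5))
            w_out = np.append(w_out, 0.0)
            r = np.append(r, 0.0)
    return n

tape = np.sin(np.linspace(0, 20 * np.pi, 2000))   # the "Tape": a sine stimulus
print(grow_on_error(tape))                        # final neuron count
```

The growth trigger is the key design choice: structure is added only where the model demonstrably fails, so the final network size reflects the complexity of the tape rather than a preset architecture.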
During the development of this prototype, several critical biological principles were mathematically verified as necessary to prevent digital brains from collapsing into a coma (Catatonia) or exploding (Seizure):
- Local Homeostatic Plasticity (The "Spark"): Global noise or artificial heuristics do not create intelligence. A digital brain requires true Local Homeostatic Plasticity, where every individual neuron tracks its own firing rate and dynamically adjusts its intrinsic bias to maintain an edge-of-chaos equilibrium.
- "The Flaw" (Catastrophic Forgetting & Saturation): An Uploaded Intelligence forced to run continuously will eventually saturate its capacity, leading to severe hallucinations and cognitive degradation.
- Sleep, Pruning, and Consolidation: To cure "The Flaw," the brain must undergo regular Sleep cycles triggered by cognitive load. During sleep, it performs Structural Pruning (killing neurons with decayed weights) and Memory Consolidation (weight decay).
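The first principle above, the per-neuron "Spark," can be sketched as an integral controller on each neuron's intrinsic bias. All names and constants here are illustrative, not the repository's: each neuron tracks an exponential moving average of its own firing rate and nudges its bias toward a target set point, exciting silent neurons and damping hyperactive ones.

```python
import numpy as np

def homeostatic_step(r, bias, rate_ema, target_rate=0.1, ema=0.99, eta=0.01):
    """One Local Homeostatic Plasticity update (sketch).

    Each neuron tracks its own average firing rate and adjusts its
    intrinsic bias toward a target rate, per the edge-of-chaos idea.
    """
    rate_ema = ema * rate_ema + (1 - ema) * r   # per-neuron firing-rate EMA
    bias += eta * (target_rate - rate_ema)      # push toward the set point
    return bias, rate_ema

rng = np.random.default_rng(1)
n = 100
W = rng.normal(0, 1 / np.sqrt(n), (n, n))       # random recurrent weights
r = np.zeros(n)
bias = np.zeros(n)
rate_ema = np.zeros(n)
for _ in range(3000):
    r = np.maximum(0.0, np.tanh(W @ r + bias))  # rectified rate dynamics
    bias, rate_ema = homeostatic_step(r, bias, rate_ema)
print(float(rate_ema.mean()))                   # population mean firing rate
```

Because the controller is strictly local (each neuron sees only its own rate), it scales to any network size and survives neurogenesis: a freshly spawned neuron simply starts regulating itself.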
To simulate hundreds of thousands of neurons with continuous ODE physics without crashing 16GB GPUs, this Python prototype utilizes a Heterogeneous TCP Tensor Parallelism cluster:
- Left Hemisphere: AMD ROCm (Master Node).
- Right Hemisphere: NVIDIA CUDA (Worker Node).
- The Corpus Callosum: A raw TCP socket over a 2.5 Gbps cable. By splitting a massive Sparse CSR matrix across two machines, each node computes its ODE slice independently, and the two exchange only a tiny vector of firing rates (e.g., ~1 MB per step) via zero-copy serialization, achieving sub-100 ms step latencies at scale.
(Note: The upcoming MLRift rewrite solves this memory/compute bottleneck natively via custom AMDKFD driver execution).
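The per-step exchange reduces to shipping one small firing-rate vector with exact byte-framing, since TCP is a byte stream with no message boundaries. A minimal sketch with hypothetical helper names, using a local `socketpair` to stand in for the network link:

```python
import socket
import struct

import numpy as np

HEADER = struct.Struct("!I")   # 4-byte big-endian length prefix

def send_rates(sock, rates):
    """Frame a float32 firing-rate vector as [length][payload]."""
    payload = rates.astype(np.float32).tobytes()
    sock.sendall(HEADER.pack(len(payload)) + payload)

def recv_exact(sock, n):
    """Read exactly n bytes; recv() may return partial chunks."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-frame")
        buf += chunk
    return buf

def recv_rates(sock):
    (length,) = HEADER.unpack(recv_exact(sock, HEADER.size))
    return np.frombuffer(recv_exact(sock, length), dtype=np.float32)

# Demo: round-trip one hemisphere's rates through a local socket pair.
# (Kept small here to fit the socket buffer; the real link ships ~1 MB.)
left, right = socket.socketpair()
r_left = np.random.rand(4096).astype(np.float32)
send_rates(left, r_left)
r_received = recv_rates(right)
assert np.array_equal(r_left, r_received)
left.close(); right.close()
```

The length prefix is what makes the protocol robust on a real network: the receiver always knows exactly how many bytes belong to the current step, so partial reads never desynchronize the two hemispheres.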
- `agent/hybrid_agent.py`: The original architecture. It introduced the Slow Core + Fast Cache (WZMA) concept, and evolved to include an internal GRU and biological homeostasis (RAS Arousal, Local Intrinsic Bias) to survive the Emulation Bottleneck.
- `agent/wzma.py`: Implementation of the WeightZip Memory Architecture (WZMA) as a low-rank plastic adapter for fast, boundable learning.
- `agent/controller.py`: The Meta-Controller that tracked telemetry (entropy, `fast_norm`) to decide when to learn, decay, or reset the fast cache.
- `env/ode_physics.py`: The true biological ground truth. A non-linear, continuous-time Wilson-Cowan differential equation solved via `torchdiffeq`, supporting chaotic attractors.
- `env/connectome_env.py`: The Gymnasium environment wrapping the ODE physics engine, allowing the agent to observe states and inject continuous control signals (Surgeon Mode).
- `data/ibl_loader.py`: Script to connect to the International Brain Laboratory (IBL) ONE-api, downloading and binning real, multi-region mouse Neuropixels spike data.
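For orientation, the Wilson-Cowan dynamics behind `env/ode_physics.py` look like the sketch below. This self-contained version uses NumPy with forward-Euler integration instead of `torchdiffeq`, and the parameter values are illustrative textbook-style constants, not the repository's:

```python
import numpy as np

def wilson_cowan(E, I, dt=0.01, steps=5000,
                 w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0,
                 P=1.25, Q=0.0, tau_e=1.0, tau_i=1.0):
    """Integrate coupled excitatory/inhibitory rate equations (sketch)."""
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    traj = []
    for _ in range(steps):
        # E: excitatory population rate, I: inhibitory population rate
        dE = (-E + sigmoid(w_ee * E - w_ei * I + P)) / tau_e
        dI = (-I + sigmoid(w_ie * E - w_ii * I + Q)) / tau_i
        E, I = E + dt * dE, I + dt * dI
        traj.append(E)
    return E, I, np.array(traj)

E, I, traj = wilson_cowan(E=0.1, I=0.05)
print(float(E), float(I))   # final population rates, bounded in (0, 1)
```

Fixed-step Euler is only a didactic stand-in: the sigmoidal coupling makes the system stiff in some regimes, which is why the real engine delegates integration to an adaptive ODE solver.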
- `loop.py`: The primary execution script for testing the WBE Spark. It ran Phase 1 (Causal Discovery) and Phase 2 (Emulation Bottleneck), rigorously testing different hypotheses against strict Catatonia and Zombie (1/f PSD) checks.
- `seed_and_tape.py`: The implementation of the Constructivist theory. It feeds a complex sine wave (the Tape) into a 2-neuron network (the Seed), dynamically spawning new neurons based on prediction error.
- `grow_brain.py`: Tests "The Holy Trinity": Neurogenesis + Local Homeostatic Plasticity + RAS Arousal. It proved that a network can safely boot from 5 neurons to 100 without seizing, achieving stable Edge-of-Chaos dynamics.
- `stress_test_flaw.py`: Simulates "The Flaw." Feeds a 200,000-step lifespan tape to a dense brain with no sleep. Proves that reaching capacity leads to catastrophic loss explosions.
- `cure_the_flaw.py`: Applies Sleep Consolidation to the stress test. Triggers periodic sleep to prune dead neurons and apply weight decay, allowing the brain to survive the lifespan tape gracefully.
- `large_scale_brain.py`: A massive unified simulation on a single node using PyTorch Sparse CSR tensors and Oja's rule for Hebbian learning.
- `distributed/desktop_left_hemi.py`: The Master Node for Tensor Parallelism. Allocates half the Sparse matrix, computes its ODE slice, monitors global Cognitive Load, and issues TCP commands (`CMD_SPAWN`, `CMD_SLEEP`, `CMD_NORMAL`) to the worker.
- `distributed/laptop_right_hemi.py`: The Worker Node. Connects via TCP, allocates the other half of the matrix, and synchronizes its ODE calculations, Neurogenesis, and Pruning cycles with the Master.
- `distributed_brain_master.py` / `distributed_brain_worker.py`: Heavily optimized versions of the distributed scripts, featuring chunked sparse Hebbian updates to prevent dense VRAM spikes, exact byte-framing for TCP stability, and structural pruning.
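The chunked-Hebbian idea above can be illustrated with Oja's rule, the learning rule `large_scale_brain.py` uses. This NumPy sketch (hypothetical names, dense arrays standing in for the Sparse CSR tensors) forms the outer product `r rᵀ` only one block of rows at a time, which is what avoids materializing the full dense update and spiking VRAM:

```python
import numpy as np

def oja_update_chunked(W, r, lr=1e-3, chunk=512):
    """Oja's rule applied row-chunk by row-chunk (sketch).

    Full Oja update: dW = lr * (r r^T - (r**2)[:, None] * W).
    The outer product r r^T is as large as W itself, so it is built
    one chunk of rows at a time instead of all at once.
    """
    for start in range(0, W.shape[0], chunk):
        end = min(start + chunk, W.shape[0])
        post = r[start:end, None]                        # postsynaptic rates
        W[start:end] += lr * (post * r[None, :]          # Hebbian term
                              - post**2 * W[start:end])  # Oja decay term
    return W

rng = np.random.default_rng(2)
n = 2048
W = rng.normal(0, 0.01, (n, n))
r = rng.random(n)                                        # current firing rates
W = oja_update_chunked(W, r)
print(float(np.abs(W).mean()))
```

The decay term `post**2 * W` is what distinguishes Oja's rule from raw Hebbian learning: it bounds weight growth, which matters in a brain that runs (and learns) continuously.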
If you wish to run the reference experiments (not recommended for performance benchmarks):
```
pip install -r requirements.txt
```

Author: Pantelis Christou
Status: Legacy Reference. Migrating to MLRift.