Tensor Wave Function Collapse (T-WFC) is a research prototype that tests whether a tiny neural network can be trained without gradient descent by borrowing the superposition -> observation -> collapse -> propagation loop from Wave Function Collapse.
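The loop can be sketched in a few lines of NumPy. This is an illustrative sketch, not the actual `src/t_wfc/trainer.py` code: the function names, the entropy-based observation rule, and the sampling details are assumptions, but the shape of the step (observe the most constrained weight, collapse it to one discrete value, then propagate) matches the description above.

```python
import numpy as np

# Discrete value set each weight collapses into (matches the project's 5 values).
VALUES = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

def entropy(p):
    """Shannon entropy of each row of a (n_weights, 5) probability array."""
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def collapse_step(probs, collapsed, rng):
    """One hypothetical WFC-style training step.

    probs:     (n_weights, 5) distribution over VALUES for each weight
    collapsed: boolean mask of already-committed weights
    """
    # Observation: pick the uncollapsed weight with the lowest entropy
    # (the most constrained cell, as in classic Wave Function Collapse).
    h = entropy(probs)
    h[collapsed] = np.inf
    idx = int(np.argmin(h))
    # Collapse: sample one discrete value from that weight's distribution
    # and commit it by making the distribution one-hot.
    value = rng.choice(VALUES, p=probs[idx])
    probs[idx] = (VALUES == value).astype(float)
    collapsed[idx] = True
    # Propagation would update neighboring weights' distributions here,
    # based on how the loss responds to the committed value.
    return idx, value
```

A real step would also evaluate the loss after the collapse and roll back (restore the previous distribution) if the committed value hurts, which is the backtracking behavior the stress runs below exercise.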
Clean partial collapse on make_moons: 8/32 weights committed, no rollback pressure, decision boundary changes visible step by step.
All visuals below are real artifacts generated by the CLI and committed under docs/media/ — not mock illustrations.
This is the main story of the project: not just whether a final classifier appears, but how the search behaves while the discrete weight state collapses.
Before the frontier-based forced-commit fallback existed, this stress setting could terminate at 0/32 committed weights. The visuals above make that difference easy to spot.
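A minimal sketch of what such a fallback can look like, under stated assumptions (this is not the project's actual implementation; the peakedness criterion and function name are illustrative): when every sampled collapse keeps getting rolled back, commit the most decided frontier weight unconditionally so the run cannot end at zero committed weights.

```python
import numpy as np

VALUES = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

def forced_commit(probs, collapsed):
    """Hypothetical forced-commit fallback: pick the uncollapsed weight whose
    distribution is most peaked and commit its argmax value outright,
    bypassing the rollback check entirely."""
    frontier = np.flatnonzero(~collapsed)          # weights still in superposition
    peak = probs[frontier].max(axis=1)             # how decided each one is
    idx = int(frontier[np.argmax(peak)])           # most constrained frontier weight
    value = VALUES[int(np.argmax(probs[idx]))]
    probs[idx] = (VALUES == value).astype(float)   # commit as a one-hot
    collapsed[idx] = True
    return idx, value
```

The design point is the guarantee, not the heuristic: any rule that always commits at least one weight when the normal loop stalls avoids the 0/32 termination described above.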
Seed sweep on make_moons: stable seeds, weaker seeds, and search-pressure summaries can be compared side by side.
Where the GIFs show a single run, the gallery shows how much behavior varies across seeds. The generated Markdown report includes inline storyboard and metrics previews for the best and worst seeds. Full example: docs/media/make_moons_seed_report.md.
T-WFC collapses each weight into one of five discrete values ({-1, -0.5, 0, 0.5, 1}), while SGD optimizes over continuous real-valued weights. The baseline uses SGD with momentum (0.9) and learning-rate decay — a standard optimizer, not a deliberately weakened one. All results: seed=7, deterministic NumPy CPU ops.
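For reference, the baseline update has the standard SGD-with-momentum form. The sketch below is illustrative (the exact update and decay schedule in `src/t_wfc/baseline.py` may differ; the 1/t decay form is an assumption):

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr, momentum=0.9):
    """One SGD-with-momentum update: accumulate a velocity from past
    gradients, then move the continuous weights along it."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

def decayed_lr(lr0, step, decay=0.01):
    # One common learning-rate decay schedule (assumed form: 1/t decay).
    return lr0 / (1.0 + decay * step)
```

The contrast with T-WFC is the point: this update moves every weight a little each step over a continuum, while T-WFC commits one weight at a time to one of five fixed values.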
| Dataset | Nonlinearity | T-WFC | SGD+Mom | T-WFC time | SGD time | Params |
|---|---|---|---|---|---|---|
| linear_binary | None | 0.967 | 0.967 | 0.10s | 0.10s | 32 |
| blobs_binary | None | 1.000 | 1.000 | 0.10s | 0.11s | 32 |
| make_blobs | None | 1.000 | 1.000 | 0.20s | 0.10s | 51 |
| iris | Weak | 0.972 | 0.944 | 0.27s | 0.13s | 67 |
| make_moons | Moderate | 0.933 | 1.000 | 0.11s | 0.11s | 32 |
| xor | Moderate | 0.660 | 1.000 | 0.12s | 0.15s | 32 |
| circles | Moderate | 0.620 | 1.000 | 0.12s | 0.15s | 32 |
| spiral | Strong | 0.433 | 0.987 | 21.55s | 0.82s | 747 |
Decision boundaries on linear_binary (0.967 = 0.967), blobs_binary (1.000 = 1.000), and make_blobs (1.000 = 1.000).
On linearly separable problems, T-WFC matches SGD perfectly — discrete 5-value weights can express simple hyperplane boundaries. No backtracking or rollback needed.
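A quick sanity illustration of that claim (an assumed toy example, not one of the project's artifacts): a linear classifier whose weights come from the same 5-value set separates two well-spaced blobs perfectly, because a hyperplane only needs the right direction, not finely tuned magnitudes.

```python
import numpy as np

# Two well-separated Gaussian blobs, 50 points each.
rng = np.random.default_rng(7)
a = rng.normal(loc=[-3.0, -3.0], scale=0.5, size=(50, 2))  # class-0 blob
b = rng.normal(loc=[+3.0, +3.0], scale=0.5, size=(50, 2))  # class-1 blob
X = np.vstack([a, b])
y = np.repeat([0, 1], 50)

w = np.array([0.5, 0.5])            # both weights drawn from {-1, -0.5, 0, 0.5, 1}
pred = (X @ w > 0.0).astype(int)    # sign of the projection decides the class
accuracy = (pred == y).mean()       # 1.0 on this easy split
```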
Decision boundaries on make_moons (0.933 vs 1.000, where the gap begins) and xor (0.660 vs 1.000, an effective failure).
Decision boundaries on circles (0.620 vs 1.000, an effective failure) and spiral (0.433 vs 0.987, 26× slower and fundamentally insufficient).
Once the required decision boundary is nonlinear, weights restricted to five discrete values can no longer express it at this scale. The gap grows with problem complexity; at spiral scale (747 params), T-WFC is also 26× slower and uses 117× more memory.
The PoC succeeded: WFC-style collapse genuinely trains a toy MLP without backpropagation, matching SGD on linearly separable tasks. But T-WFC does not generalize to nonlinear problems, does not scale efficiently, and offers no speed or memory advantage over SGD. Full analysis in docs/RESULT.en.md.
- Datasets: linear_binary, blobs_binary, make_blobs, iris, make_moons, xor, circles, spiral.
- Models: single-hidden-layer toy MLP and deeper configurations (e.g. 2-24-24-3).
- Training loop: observation, single-weight collapse, propagation, rollback-aware backtracking, and forced-commit fallback.
- Visualization: storyboard, GIF, metrics timeline, multi-seed gallery and report, T-WFC vs SGD comparison boards.
- Packaging: the package exposes an installable `t-wfc` CLI via `pyproject.toml`.
- See CHANGELOG.md for full feature history.
```shell
python3 -m pip install -e .
```

```shell
t-wfc --dataset make_moons --max-steps 8 --show-steps 6
t-wfc --dataset iris --hidden-layers 16,16 --max-steps 18 --compare-sgd --show-steps 6
```

Run `t-wfc --help` for visualization, seed-sweep, and stress-test options. See docs/VERIFICATION.en.md for full recipe examples.
- Concept, English: docs/CONCEPT.en.md
- Concept, Korean: docs/CONCEPT.md
- Verification, English: docs/VERIFICATION.en.md
- Verification, Korean: docs/VERIFICATION.md
- Change history: CHANGELOG.md
- src/t_wfc/data.py: dataset loading and splits
- src/t_wfc/model.py: single-layer and multi-layer MLP definition plus backprop support for the SGD baseline
- src/t_wfc/baseline.py: numpy SGD baseline training for side-by-side comparison
- src/t_wfc/state.py: discrete probability state
- src/t_wfc/trainer.py: collapse loop, rollback logic, metrics, snapshots
- src/t_wfc/batch.py: repeated experiment runs across seed lists and per-seed artifact export
- src/t_wfc/reporting.py: Markdown seed-sweep report generation with inline highlight previews and drill-down links
- src/t_wfc/visualization.py: overview, progress, metrics, storyboard, GIF, seed-gallery, and T-WFC vs SGD comparison plots
- src/t_wfc/cli.py: command-line entry point
- docs/media/: curated public showcase media used directly in this README
- pyproject.toml: package metadata, dependencies, and the t-wfc console script
- This is still a research prototype, not a polished training framework.
`numpy` is the main runtime dependency. `matplotlib` is used for visualization output. `Pillow` is used for GIF export.