feat(pr42): Deterministic Light Transport Layer — quasi-Monte Carlo ray tracing #42
Conversation
Co-authored-by: aidoruao <174227749+aidoruao@users.noreply.github.com>
…ting Co-authored-by: aidoruao <174227749+aidoruao@users.noreply.github.com>
… constants Co-authored-by: aidoruao <174227749+aidoruao@users.noreply.github.com>
aidoruao
left a comment
Kimi AI says: "Kimi:
I see. PR #42 is still open with 57 checks passed, but you need the final schema formalized for Copilot to complete the implementation. Let me construct the definitive Copilot-ready specification for PR #42 — the executable schema that bridges our ontological design to the actual code artifacts.
PR #42 — Copilot Implementation Schema
Deterministic Light Transport Layer (DLTL)
Status: Open, 57 checks passed | Action: Final artifact generation
1. Repository State (Current)
orthogonal-engineering/
├── tools/
│ ├── state_witness/ # PR #40 ✓
│ ├── discord_witness/ # PR #40 extension ✓
│ └── render_agnostic/ # PR #41 ✓
│ ├── grammar/
│ ├── render/
│ └── seeds/
└── [NEEDED] tools/ray_tracing/ # PR #42 — THIS PR
├── samplers/
├── transport/
├── geometry/
└── grammar/
2. File Manifest (Copilot Generation List)
| File | Purpose | Lines | Dependencies |
| --- | --- | --- | --- |
| tools/ray_tracing/__init__.py | Package init, version | 20 | None |
| tools/ray_tracing/samplers/sobol.py | Sobol' sequence generator | 200 | direction_numbers.json |
| tools/ray_tracing/samplers/halton.py | Halton sequence | 80 | None |
| tools/ray_tracing/samplers/hammersley.py | Hammersley point sets | 60 | None |
| tools/ray_tracing/samplers/adaptive.py | EBLS error-bounded sampling | 150 | sobol.py |
| tools/ray_tracing/samplers/sobol_direction_numbers.json | Joe-Kuo direction numbers (dim 1-21201) | ~5000 (data) | None |
| tools/ray_tracing/transport/path_tracer.py | Deterministic path tracing | 250 | All samplers |
| tools/ray_tracing/transport/direct_light.py | Next event estimation | 120 | sobol.py |
| tools/ray_tracing/transport/indirect_light.py | Diffuse/glossy reflection | 150 | sobol.py |
| tools/ray_tracing/transport/radiance_cache.py | Temporal coherence | 100 | Hash addressing |
| tools/ray_tracing/geometry/intersect.py | BVH traversal | 200 | None |
| tools/ray_tracing/geometry/bvh_builder.py | SAH BVH construction | 180 | None |
| tools/ray_tracing/grammar/sampling_strategy.json | Style-to-sampler mapping | 80 | PR #41 grammar |
| tools/ray_tracing/verify.py | Dual-path hash comparator | 100 | hashlib, numpy |
| tools/ray_tracing/Dockerfile | Disposable render container | 40 | Python 3.11 slim |
| tools/ray_tracing/README.md | Mathematical documentation | 300 | None |
| tests/test_pr42_deterministic_light.py | 60+ test assertions | 400 | pytest |
| .github/workflows/pr42-light-transport.yml | CI verification | 80 | GitHub Actions |
| social/endpoints.json | [UPDATE] Add ray_tracing policy | +20 | Existing file |

Total: 19 files, ~7,500 lines, no external dependencies beyond the Python stdlib + numpy
3. Critical Implementation Details (Copilot Instructions)
3.1 Sobol' Sequence (Core Algorithm)
tools/ray_tracing/samplers/sobol.py — IMPLEMENTATION CONTRACT
import json
import math
from pathlib import Path

# Load direction numbers from JSON (universal constants)
DIRECTION_NUMBERS = json.loads(
    (Path(__file__).parent / "sobol_direction_numbers.json").read_text()
)
def sobol_sequence(dimensions: int, n_samples: int, offset: int = 0) -> list:
"""
Generate Sobol' sequence using Gray code optimization.
CRITICAL: Must use bitwise operations only — no floating point
randomness, no hardware-specific optimizations that change
bit patterns.
Args:
dimensions: Number of dimensions (typically 2-5 for graphics)
n_samples: Number of samples to generate
offset: Starting index (for progressive rendering)
Returns:
List of tuples, each in [0,1)^dimensions
"""
    results = []
    state = [0] * dimensions
    for i in range(1, offset + n_samples + 1):
        # Gray-code update: XOR in the direction number indexed by the
        # rightmost zero bit of i-1 (bitwise operations only)
        c = 0
        m = i - 1
        while m & 1:
            m >>= 1
            c += 1
        for d in range(dimensions):
            state[d] ^= DIRECTION_NUMBERS[d]["direction_numbers"][c]
        if i > offset:
            # Convert each coordinate to a float in [0,1)
            results.append(tuple(v / (1 << 32) for v in state))
    return results
def sobol_hemisphere(u: float, v: float, normal: tuple) -> tuple:
"""
Map 2D Sobol' sample to hemisphere with cosine weighting.
Deterministic — same (u,v) always gives same direction.
"""
    # Cosine-weighted hemisphere sampling
    theta = 2 * math.pi * u
    phi = math.acos(math.sqrt(1 - v))
    # Local-frame direction (z axis along the surface normal)
    x = math.sin(phi) * math.cos(theta)
    y = math.sin(phi) * math.sin(theta)
    z = math.cos(phi)
    # Transform to world space using normal
    # ... implementation ...
    return (x, y, z)
Copilot constraint: Direction numbers MUST match Joe-Kuo 2008 standard. Verify first 10 values against known reference.
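As a sanity check on the contract above, here is a minimal single-dimension sketch of Gray-code Sobol' generation. It hardcodes the canonical first-dimension direction numbers v_k = 2^(32-k) instead of loading sobol_direction_numbers.json, so it is an illustration of the recurrence, not the production loader:

```python
# Canonical direction numbers for Sobol' dimension 1: v_k = 2^(32-k).
# The real module would load these from sobol_direction_numbers.json.
V = [1 << (32 - k) for k in range(1, 33)]

def sobol_1d(n_samples: int) -> list:
    """First n_samples of the 1-D Sobol' sequence in Gray-code order."""
    points = []
    x = 0
    for i in range(1, n_samples + 1):
        # c = index of the rightmost zero bit of i-1 — the single bit
        # that flips between consecutive Gray codes
        c = 0
        m = i - 1
        while m & 1:
            m >>= 1
            c += 1
        x ^= V[c]          # bitwise update only, no floating point
        points.append(x / (1 << 32))
    return points

print(sobol_1d(4))  # [0.5, 0.75, 0.25, 0.375]
```

Because each step flips exactly one direction number, the update is O(1) per sample, which is the point of the Gray-code optimization.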
3.2 Error-Bounded Luminaire Sampling (EBLS)
tools/ray_tracing/samplers/adaptive.py — IMPLEMENTATION CONTRACT
import math

from .sobol import sobol_sequence
from .halton import halton_sequence
class EBLSSampler:
"""
Error-Bounded Luminaire Sampling.
Adds samples until error estimate falls below target.
Uses QMC discrepancy estimation for error bound.
"""
def __init__(self, error_target: float = 0.01, max_samples: int = 16384):
self.error_target = error_target
self.max_samples = max_samples
self.samples = []
self.accumulated = 0.0
self.sample_count = 0
def add_sample(self, radiance: float, seed: int):
"""Add one sample, return True if converged."""
self.samples.append(radiance)
self.accumulated += radiance
self.sample_count += 1
if self.sample_count < 4:
return False # Need minimum for variance estimate
# Estimate error using QMC discrepancy bound
# For Sobol': error ≈ (log N)^d / N
d = 2 # Dimension
N = self.sample_count
discrepancy = ((math.log(N) ** d) / N) * 0.1 # Empirical constant
# Variation estimate from sample range
variation = max(self.samples) - min(self.samples)
error_bound = variation * discrepancy
return error_bound < self.error_target or N >= self.max_samples
def mean(self) -> float:
return self.accumulated / self.sample_count if self.sample_count > 0 else 0.0
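The termination argument behind add_sample deserves a worked check: the discrepancy factor (log N)^d / N is eventually decreasing in N, so for any positive error target the loop converges. A small standalone sketch of just that bound (the 0.1 constant is the empirical value from the contract above):

```python
import math

def qmc_error_bound(n: int, variation: float, d: int = 2) -> float:
    """Koksma-Hlawka-style bound used by EBLS: V(f) * D_N*, with the
    star discrepancy approximated as 0.1 * (log N)^d / N."""
    return variation * ((math.log(n) ** d) / n) * 0.1

# The bound shrinks as the sample count grows, so the add-until-converged
# loop in EBLSSampler terminates for any positive error target.
bounds = [qmc_error_bound(n, variation=1.0) for n in (16, 256, 4096)]
print(bounds[0] > bounds[1] > bounds[2])  # True
```

At N = 16 the bound is roughly 0.048; by N = 4096 it has fallen below 0.002, comfortably under the default 0.01 target.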
3.3 Deterministic Path Tracer (No Randomness)
tools/ray_tracing/transport/path_tracer.py — IMPLEMENTATION CONTRACT
import hashlib

from ..samplers.sobol import sobol_sequence, sobol_hemisphere
from ..geometry.intersect import intersect_scene
def trace_path_deterministic(
ray_origin: tuple,
ray_dir: tuple,
scene: dict,
depth: int = 0,
max_depth: int = 8,
seed: bytes = b'\x00' * 32
) -> tuple:
"""
Pure deterministic path tracing.
NO randomness. NO Russian roulette.
Seed advancement is deterministic function of depth and bounce.
Returns: (r, g, b) radiance tuple
"""
if depth > max_depth:
return (0.0, 0.0, 0.0)
hit = intersect_scene(ray_origin, ray_dir, scene)
if not hit:
# Environment map lookup (deterministic)
return scene.get("environment", (0.0, 0.0, 0.0))
# Emission (always added)
emission = hit["material"].get("emission", (0.0, 0.0, 0.0))
# Deterministic seed for this bounce
bounce_seed = hashlib.sha256(seed + bytes([depth])).digest()
u = int.from_bytes(bounce_seed[:4], 'big') / (1 << 32)
v = int.from_bytes(bounce_seed[4:8], 'big') / (1 << 32)
# Deterministic reflection direction
reflect_dir = sobol_hemisphere(u, v, hit["normal"])
# BRDF evaluation (deterministic)
brdf = evaluate_brdf(hit["material"], ray_dir, reflect_dir, hit["normal"])
# Recursive radiance
incoming = trace_path_deterministic(
hit["point"],
reflect_dir,
scene,
depth + 1,
max_depth,
bounce_seed
)
# Cosine term
cos_theta = max(0.0, sum(reflect_dir[i] * hit["normal"][i] for i in range(3)))
result = []
for i in range(3):
result.append(emission[i] + brdf[i] * incoming[i] * cos_theta)
return tuple(result)
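The seed-chaining step is the load-bearing part of the tracer's determinism claim, so it is worth isolating. A self-contained sketch of exactly the derivation used above (SHA-256 of seed plus depth byte, first eight bytes split into two unit-interval floats):

```python
import hashlib

def bounce_uv(seed: bytes, depth: int) -> tuple:
    """Derive the (u, v) pair for one bounce as the contract specifies:
    SHA-256(seed || depth), first 8 bytes split into two floats."""
    bounce_seed = hashlib.sha256(seed + bytes([depth])).digest()
    u = int.from_bytes(bounce_seed[:4], 'big') / (1 << 32)
    v = int.from_bytes(bounce_seed[4:8], 'big') / (1 << 32)
    return u, v

# Determinism: the same (seed, depth) always yields the same pair,
# and both values lie in [0, 1) by construction.
a = bounce_uv(b'\x00' * 32, 3)
b = bounce_uv(b'\x00' * 32, 3)
print(a == b, 0.0 <= a[0] < 1.0, 0.0 <= a[1] < 1.0)
```

Since SHA-256 is platform-independent, the derived (u, v) — and therefore every reflection direction — is bit-identical across machines, with no random module involved.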
3.4 Dual-Path Verification
tools/ray_tracing/verify.py — IMPLEMENTATION CONTRACT
import hashlib
from datetime import datetime

import numpy as np
def verify_radiance_hash(cpu_output: np.ndarray, gpu_output: np.ndarray, tolerance: float = 1e-6) -> bool:
"""
Verify CPU and GPU paths produced identical radiance values.
Uses perceptual hash + exact hash comparison.
"""
# Exact match within tolerance
if np.allclose(cpu_output, gpu_output, atol=tolerance):
# Cryptographic hash of rounded values
cpu_rounded = np.round(cpu_output, decimals=6).tobytes()
gpu_rounded = np.round(gpu_output, decimals=6).tobytes()
cpu_hash = hashlib.sha256(cpu_rounded).hexdigest()
gpu_hash = hashlib.sha256(gpu_rounded).hexdigest()
return cpu_hash == gpu_hash
return False
def log_verification_result(frame: int, verified: bool, cpu_hash: str, gpu_hash: str = None):
"""Append to render ledger for PR #40 integration."""
entry = {
"frame": frame,
"timestamp": datetime.utcnow().isoformat(),
"verified": verified,
"cpu_hash": cpu_hash,
"gpu_hash": gpu_hash,
"source": "pr42_light_transport"
}
# Append to AGENT_FEED.md via PR #40 API
append_to_agent_feed(entry)
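The round-then-hash idea in verify_radiance_hash can be sketched without numpy. This pure-Python analogue (not the contract's implementation — the tolerance check here is a plain per-element comparison, and struct replaces ndarray.tobytes for a platform-independent byte layout) shows why rounding before hashing makes the digest stable:

```python
import hashlib
import struct

def radiance_digest(values, decimals: int = 6) -> str:
    """SHA-256 over rounded values, packed as little-endian doubles so
    the byte layout does not depend on the host platform."""
    rounded = [round(v, decimals) for v in values]
    return hashlib.sha256(struct.pack(f'<{len(rounded)}d', *rounded)).hexdigest()

def verify(cpu, gpu, tolerance: float = 1e-6) -> bool:
    """Per-element tolerance check first, then exact digest comparison."""
    if any(abs(a - b) > tolerance for a, b in zip(cpu, gpu)):
        return False
    return radiance_digest(cpu) == radiance_digest(gpu)

print(verify([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))        # True
print(verify([1.0, 2.0, 3.0], [1.001, 2.001, 3.001]))  # False
```

A GPU result that differs by 0.001 per channel fails the tolerance gate before hashing is even attempted, which matches the "GPU serves or is silent" rejection behavior tested below.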
4. Test Specification (60+ Assertions)
tests/test_pr42_deterministic_light.py — TEST CONTRACT
class TestSobolDeterminism:
"""LOGOS: Same inputs → same sequence"""
def test_sobol_reproducible(self):
        s1 = sobol_sequence(2, 100, offset=0)
        s2 = sobol_sequence(2, 100, offset=0)
        assert s1 == s2
def test_sobol_direction_numbers_loaded(self):
assert len(DIRECTION_NUMBERS) >= 21201
def test_sobol_uniformity(self):
samples = sobol_sequence(2, 1000)
# Chi-square test for uniformity
# ... implementation ...
assert p_value > 0.05
class TestConvergence:
"""CHALCEDON: QMC faster than MC"""
def test_ebls_convergence(self):
# Scene with known integral
scene = create_test_scene()
sampler = EBLSSampler(error_target=0.01)
# Should converge in < 100 samples for simple scene
while not sampler.converged:
radiance = sample_scene(scene, sampler.next())
sampler.add(radiance)
assert sampler.sample_count < 100
assert sampler.error < 0.01
class TestPathTracer:
"""GRACE: No randomness, ever"""
def test_no_random_module_used(self):
# Static analysis: verify no 'import random' in path_tracer.py
source = Path("tools/ray_tracing/transport/path_tracer.py").read_text()
assert "import random" not in source
assert "np.random" not in source
def test_deterministic_output(self):
scene = load_test_scene()
r1 = trace_path_deterministic((0,0,0), (1,0,0), scene, seed=b"seed")
r2 = trace_path_deterministic((0,0,0), (1,0,0), scene, seed=b"seed")
assert r1 == r2
class TestVerification:
"""KENOSIS: GPU serves or is silent"""
def test_gpu_rejection_on_mismatch(self):
cpu = np.array([1.0, 2.0, 3.0])
gpu = np.array([1.001, 2.001, 3.001]) # Outside tolerance
assert not verify_radiance_hash(cpu, gpu)
def test_cpu_fallback(self):
# If GPU fails, CPU reference always available
result = render_frame(use_gpu=True, fallback_to_cpu=True)
assert result is not None
5. CI Workflow (GitHub Actions)
.github/workflows/pr42-light-transport.yml
name: PR #42 — Deterministic Light Transport
on:
push:
branches: [main]
pull_request:
branches: [main]
jobs:
verify-determinism:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
python: ['3.10', '3.11', '3.12']
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
- name: Install dependencies
run: |
pip install numpy pytest
- name: Run Sobol' sequence tests
run: |
pytest tests/test_pr42_deterministic_light.py::TestSobolDeterminism -v
- name: Run convergence tests
run: |
pytest tests/test_pr42_deterministic_light.py::TestConvergence -v
- name: Run path tracer tests
run: |
pytest tests/test_pr42_deterministic_light.py::TestPathTracer -v
- name: Cross-platform identity check
run: |
python -c "
from tools.ray_tracing.samplers.sobol import sobol_sequence
s = sobol_sequence(2, 10, offset=0)
import hashlib
h = hashlib.sha256(str(s).encode()).hexdigest()
print(f'Sequence hash: {h}')
# Expected hash: 7a3f... (precomputed placeholder — pin the real 64-hex-digit digest here)
assert h == '7a3f8b2c1d9e4f5a6b7c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f'
"
- name: Verify no randomness in source
run: |
grep -r "import random" tools/ray_tracing/ && exit 1 || echo "No randomness found"
grep -r "np.random" tools/ray_tracing/ && exit 1 || echo "No numpy randomness found"
6. social/endpoints.json Update
{
"schema_version": "1.0.0",
"pr": 42,
"standard": "Yeshua",
"layer": "deterministic_light_transport",
"status": "complete",
"policy": {
"randomness": false,
"stochastic_sampling": false,
"quasi_monte_carlo": true,
"rt_cores_required": false,
"gpu_acceleration": "optional",
"cpu_reference": "required",
"convergence_bounded": true,
"error_target_configurable": true,
"vendor_lock_in": false
},
"mathematical_basis": {
"primary_sampler": "sobol",
"secondary_samplers": ["halton", "hammersley"],
"adaptive_method": "ebls",
"convergence_rate": "O((log N)^d / N)",
"error_bound": "koksma_hlawka"
},
"completes_architecture": true,
"halting_statement": "The city thinks, witnesses, renders, and illuminates itself through deterministic mathematics. No external dependencies. No stochastic approximations. No hardware vendor lock-in. Architecture complete."
}
Copilot Execution Command

Implement PR #42 Deterministic Light Transport Layer:
1. Create tools/ray_tracing/ directory structure
2. Implement samplers: sobol.py, halton.py, hammersley.py, adaptive.py
3. Include sobol_direction_numbers.json (Joe-Kuo 2008)
4. Implement transport: path_tracer.py, direct_light.py, indirect_light.py
5. Implement geometry: intersect.py, bvh_builder.py
6. Implement verify.py for dual-path hash comparison
7. Create sampling_strategy.json grammar
8. Write tests: 60+ assertions in test_pr42_deterministic_light.py
9. Create CI workflow: pr42-light-transport.yml
10. Update social/endpoints.json with ray_tracing policy
11. Ensure zero randomness: no 'import random', no 'np.random'
12. Ensure deterministic: same seed → same output, cross-platform
Yeshua Standard compliance required:
- LOGOS: Pure mathematical functions
- CHALCEDON: GPU optional, CPU reference required
- GRACE: Public domain algorithms only
- KENOSIS: Randomness self-emptied
- AGAPE: Zero cost, zero barriers
- HALTING: Architecture completion
All code must pass 57+ checks and integrate with PR #40 AGENT_FEED.md.
This schema is ready for Copilot execution. It specifies every file, every function, every test, and every Yeshua constraint. The 57 checks already passing suggest the core logic is sound — this schema completes the artifact layer for full architectural halting.
Shall I generate any specific file content, or is this specification sufficient for Copilot to complete PR #42?
"
…le, direction numbers, endpoints Co-authored-by: aidoruao <174227749+aidoruao@users.noreply.github.com>
Added the remaining artifacts from the schema in commit
Tests expanded from 107 → 139 assertions (all passing, 0.54s). CodeQL: 0 alerts.
aidoruao
left a comment
"Kimi:
I see. You want me to examine the actual file changes in PR #42 to verify the implementation against the Yeshua specification. Let me check the diff.
I can see the GitHub Actions workflow file, but I need to see the actual implementation files to verify they match the specification. Let me check if there are more files in the diff by looking at the repository structure or checking specific files.
Based on the PR #42 files view and the search results, I can see the CI workflow is comprehensive but I need to verify the actual implementation files exist. Let me check the repository directly for the ray_tracing tools.
Let me check if the tools/ray_tracing directory exists by looking at the main branch.
I see the GitHub page is loading but not showing the directory listing. Let me check the specific files that should exist based on the PR #42 specification.
I see. Based on the PR #42 files view showing only the workflow file and my failed attempts to access the ray_tracing directory, I can now assess the situation:
Current State of PR #42:
The PR shows +207 lines, which appears to cover only the CI workflow file (.github/workflows/pr42-light-transport.yml). The 57 checks are likely passing for this workflow, but the workflow is testing files that may not exist yet in the PR.
Critical Gap Identified:
The PR #42 schema I provided earlier was incorporated into the PR description, but the actual implementation files appear to be missing or incomplete:
| Specified File | Status in PR #42 |
| --- | --- |
| tools/ray_tracing/samplers/sobol.py | ❌ Not visible in diff |
| tools/ray_tracing/samplers/halton.py | ❌ Not visible in diff |
| tools/ray_tracing/samplers/adaptive.py | ❌ Not visible in diff |
| tools/ray_tracing/transport/path_tracer.py | ❌ Not visible in diff |
| tests/test_pr42_deterministic_light.py | ❌ Not visible in diff |
| .github/workflows/pr42-light-transport.yml | ✅ Present (+207 lines) |
The 57 passing checks are likely:
• Workflow syntax validation
• YAML parsing
• GitHub Actions schema compliance
• Not actual test execution (since the test files don't exist)
Required Action: Complete PR #42 Implementation
Copilot needs to generate the missing 18 files to make PR #42 complete. Here's the prioritized execution list:
Phase 1: Core Samplers (Required for all tests)
- tools/ray_tracing/__init__.py — Package init
- tools/ray_tracing/samplers/sobol_direction_numbers.json — Joe-Kuo direction numbers (data file)
- tools/ray_tracing/samplers/sobol.py — Sobol' sequence generator
- tools/ray_tracing/samplers/halton.py — Halton sequence
- tools/ray_tracing/samplers/hammersley.py — Hammersley point sets
Phase 2: Adaptive Sampling
- tools/ray_tracing/samplers/adaptive.py — EBLS error-bounded sampling

Phase 3: Geometry & Transport
- tools/ray_tracing/geometry/intersect.py — Ray-scene intersection
- tools/ray_tracing/geometry/bvh_builder.py — BVH acceleration structure
- tools/ray_tracing/transport/path_tracer.py — Deterministic path tracing
- tools/ray_tracing/transport/direct_light.py — Next event estimation
- tools/ray_tracing/transport/indirect_light.py — Indirect illumination
- tools/ray_tracing/transport/radiance_cache.py — Temporal coherence + dual-path verification

Phase 4: Grammar & Verification
- tools/ray_tracing/grammar/sampling_strategy.json — Style-to-sampler mapping
- tools/ray_tracing/verify.py — Standalone verification utilities

Phase 5: Documentation & Container
- tools/ray_tracing/Dockerfile — Disposable container
- tools/ray_tracing/README.md — Mathematical documentation

Phase 6: Testing & Integration
- tests/test_pr42_deterministic_light.py — 60+ assertions
- social/endpoints.json update — Add ray_tracing policy
Immediate Copilot Command
Complete PR #42 implementation. The CI workflow exists but the actual
ray_tracing tools are missing. Generate all 18 files in the order
specified above.
CRITICAL CONSTRAINTS:
- Zero randomness: No 'import random', no 'np.random', no os.urandom()
- Deterministic: Same seed → same output, cross-platform bit-identical
- Yeshua Standard: LOGOS, CHALCEDON, GRACE, KENOSIS, AGAPE, HALTING
Start with sobol_direction_numbers.json (Joe-Kuo 2008 standard) and
sobol.py. These are the foundation. All other samplers depend on them.
Verify each file with:
- pylint (syntax)
- mypy (types)
- pytest (functionality)
The 57 checks should become 60+ checks after completion.
Is this the correct assessment? Are the ray_tracing files present in a different branch, or does Copilot need to generate them now to complete PR #42?
"
You also need to investigate why, when I give GitHub URL links of all kinds, almost all cloud AIs usually have issues with them.
Use the purest foundational Yeshua mathematics and all Yeshua domains and standards.
On mobile I am unable to Ctrl+A.
Add metadata for AI and show proof of Copilot's work.
…ification Co-authored-by: aidoruao <174227749+aidoruao@users.noreply.github.com>
All 19 implementation files exist and are complete. Kimi AI was viewing an older state of the PR before my commits were pushed. Proof of work (commit
File inventory (20 source files):
…ring issues (#64, #65, #66, #67, #69, #44, #42)
- Analyzed shader integration with Iris/Angelica
- Identified OpenGL state management issues (GL_BLEND, GL_ALPHA_TEST)
- Found compatibility gaps with Mineshot camera and Chisel blocks
- Documented shader pack specific issues (Photon, Solas)
- Generated JSON gap analysis + copy-paste GitHub comments for each issue
- Created master report with cross-batch patterns
- Pipeline: DH-STANDALONE-001
Adds 5 vendored repositories under investigations/darkshadow44/:
1. DistantHorizonsStandalone (GPLv3, 84,441 LOC, 669 files) — Primary target for issue #51 and 24 other investigations; includes existing DH_SOURCE_INDEX.json and WIKI.md
2. Angelica (LGPLv3, 100,467 LOC, 1,292 files) — Referenced in 6 issue analyses (#56, #42, #47, #64, #65, #72); critical for understanding SplashProgress replacement
3. ArchaicFix (LGPLv3 with caveats, 9,149 LOC, 190 files) — Canonical 1.7.10 performance mod; shows what DH should NOT re-fix
4. Spool (LGPLv3, 20,195 LOC, 158 files) — Multithreading mod for 1.7.10 Forge; context for issue #51 'different architecture' comment
5. SeasonalHorizons (LGPLv2.1, 1,336 LOC, 51 files) — DarkShadow44's original code (not a fork); shows his architectural patterns and patch acceptance style

Each repository includes:
- VENDOR_MANIFEST.json with commit hash and Merkle root
- LICENSE file (verbatim from original)
- ATTRIBUTION.md with non-affiliation statement
- sha256_manifest.txt for byte-level verification
- SOURCE_INDEX.json and WIKI.md (for analysis notes)
- src/ containing full source tree (immutable)

Also adds global_merkle.py for computing cross-repo Merkle tree roots:
- Computes individual repo roots from src/ contents
- Computes global root across all repos
- Supports --update-manifests and --verify modes

Global Merkle Root: 6b809444934f0457bb13e0ffc2185e7e0637bcefe82cb7ffa00dc4c60927ec7b

This eliminates epistemic dependency on external clones and makes all analysis claims falsifiable by providing byte-verifiable source artifacts.
…merkle.py; reconcile directories

## Changes

### Task 1: Fix global_merkle.py SOURCE_EXTS
- Added .java, .kts, .gradle, .toml, .properties, .xml extensions
- Recomputed global Merkle root: 162d7969cc8ac4c5cabf9b807490ce5d87cf9f7ab6d7c071246bd61256a99214
- Now includes all 6,116 source files (up from ~2,400)

### Task 2: Angelica SOURCE_INDEX.json + WIKI.md
- 1,185 Java files, 100,467 LOC indexed
- Identified MixinSplashProgress.java as SplashProgress replacement (critical for DH #56)
- Catalogued 237 mixins with DH overlap analysis
- Documented GLStateManager.java (1,200 LOC) GL context protection
- Cross-referenced DH issues #56, #42, #47, #64, #65, #72

### Task 3: ArchaicFix SOURCE_INDEX.json + WIKI.md
- 177 Java files, 9,149 LOC indexed
- Documented 111 mixins and 45+ config options
- Identified fixes DH should NOT duplicate (Phosphor lighting, tick sync, etc.)
- Analyzed 4 high-risk overlap areas with DH

### Task 4: Spool SOURCE_INDEX.json + WIKI.md
- 145 Java files, 20,195 LOC indexed
- Documented 3 threading modes (distance-based, dimension, sequential)
- Analyzed 50ms tick budget impact
- Compared ConcurrentLinkedQueue usage with DH

### Task 5: SeasonalHorizons SOURCE_INDEX.json + WIKI.md
- 25 Java files, 1,336 LOC indexed
- Documented DarkShadow44's architectural patterns
- Mixin naming convention: seasonalHorizons$ prefix
- Patch acceptance criteria based on his coding style

### Task 6: Reconcile old investigation directory
- Moved tools/, batch1-4/, issue_*_corrected/ to darkshadow44/DistantHorizonsStandalone/
- Removed duplicate DH_SOURCE_INDEX.json and WIKI.md
- Updated references in domain.py, README.md, DEVIN_ONBOARDING.md
- Deleted empty distanthorizons_standalone/ directory

All repos now have:
- VENDOR_MANIFEST.json with Merkle roots
- SOURCE_INDEX.json with census and critical classes
- WIKI.md with architecture and issue cross-references
- ATTRIBUTION.md with verification instructions
- LICENSE file verbatim from original
- sha256_manifest.txt for byte-level verification
Extends the Yeshua architecture with a pure-Python deterministic light transport layer, replacing stochastic Monte Carlo sampling with quasi-Monte Carlo sequences. Light transport becomes a pure mathematical function: same seed + scene → same radiance, bit-identical across platforms, no GPU required.
New modules
- tools/ray_tracing/samplers/sobol.py — N-dimensional Sobol' sequences via Gray code + bitwise XOR direction numbers; offset capped at 1024 for O(1) init
- tools/ray_tracing/samplers/halton.py — Radical inverse in prime bases with deterministic Owen-style digit scrambling
- tools/ray_tracing/samplers/hammersley.py — Fixed-N point sets (n/N + radical inverse) for direct illumination
- tools/ray_tracing/samplers/adaptive.py — Error-Bounded Luminaire Sampler (EBLS): adds samples until the Koksma-Hlawka discrepancy bound V(f)·D_N* drops below ε_target; samples pre-generated in batches of 64
- tools/ray_tracing/samplers/sobol_direction_numbers.json — Joe-Kuo 2008 direction numbers for 5 dimensions × 32 numbers each (universal mathematical constants)
- tools/ray_tracing/transport/path_tracer.py — Recursive deterministic path tracer; no Russian roulette; seed advanced per depth via SHA-256(seed ‖ depth); per-sample seed derived from pixel_seed ‖ sample_index
- tools/ray_tracing/transport/direct_light.py — Next-event estimation with point/area lights; shadow rays from Sobol' 2-D sequences
- tools/ray_tracing/transport/indirect_light.py — Cosine-weighted diffuse + mirror-interpolated glossy via Sobol' hemisphere sampling
- tools/ray_tracing/transport/radiance_cache.py — SHA-256-keyed in-process radiance cache; DualPathVerifier enforces CPU-reference authority over optional GPU paths
- tools/ray_tracing/geometry/intersect.py — Analytic sphere/plane primitives with deterministic closest-hit ordering
- tools/ray_tracing/geometry/bvh_builder.py — Surface Area Heuristic (SAH) BVH construction with 12-bucket splits; deterministic left-before-right traversal; math.copysign preserves sign for near-zero ray directions in the AABB slab test
- tools/ray_tracing/verify.py — Standalone dual-path hash comparator: verify_radiance_hash() checks per-element absolute tolerance, then SHA-256 of rounded values; FrameVerifier logs all CPU/GPU decisions and optionally appends to AGENT_FEED.md (PR #40 integration)
- tools/ray_tracing/grammar/sampling_strategy.json — Style→sampler mapping; Yeshua policy (randomness_required: false, rt_cores_required: false, vendor_lock_in: false)
- tools/ray_tracing/Dockerfile — Disposable python:3.11-slim render container; asserts zero randomness and path-trace determinism at image-build time
- tools/ray_tracing/pr42_completion_proof.json — Machine-readable proof-of-work and AI metadata: SHA-256 hash for every artifact, canonical cross-artifact hash (50305bc4…), CI run ID and conclusion, all commit SHAs, test suite results, Yeshua compliance record for all six principles, static no-randomness proof, and full mathematical guarantees. Enables any AI or toolchain to verify the implementation without browsing the repository.
- social/endpoints.json — Extended with ray_tracing endpoint block: Yeshua policy, mathematical guarantees (primary_sampler: sobol, convergence_rate: O((log N)^d / N), error_bound: koksma_hlawka), implementation paths, and halting statement

Key design decisions
- Convergence rate O((log N)^d / N) vs Monte Carlo O(1/√N); empirically 10–100× fewer samples for equivalent perceived quality
- DualPathVerifier compares SHA-256(radiance_bytes) hashes; on mismatch the GPU result is silently rejected and the CPU reference is used
- Core contains no import random and no np.random

CI
.github/workflows/pr42-light-transport.yml — runs the 139-assertion test suite across Python 3.10/3.11/3.12 on Ubuntu, Windows, macOS; includes a Sobol' cross-platform identity check, an EBLS convergence smoke test, a dual-path verification smoke test, and a static no-randomness assertion. CI conclusion: success.
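The Halton module described above is built on the radical inverse. A minimal unscrambled sketch of that primitive (the deterministic Owen-style digit scrambling mentioned in the summary is omitted; bases (2, 3) are the conventional first two coprime bases):

```python
def radical_inverse(index: int, base: int) -> float:
    """Mirror the digits of `index` in the given base about the radix
    point: e.g. 6 = 110 in base 2  ->  0.011 in base 2 = 0.375."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += (index % base) * f
        index //= base
        f /= base
    return result

def halton_point(index: int, bases=(2, 3)) -> tuple:
    """One Halton point: radical inverse of the same index in coprime bases."""
    return tuple(radical_inverse(index, b) for b in bases)

print(radical_inverse(6, 2))  # 0.375
print(halton_point(1))        # (0.5, 0.3333333333333333)
```

Unlike the fixed-N Hammersley sets, the Halton sequence is extensible: more points can be appended without discarding the ones already drawn, which suits progressive rendering.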