Merged
16 changes: 5 additions & 11 deletions PLAN.md
@@ -117,16 +117,10 @@ This document tracks the implementation status of each major module in CORTEX. I
| Hebbian Updates | ✅ Complete | `daydreamer/HebbianUpdater.ts` | LTP (strengthen), LTD (decay), prune below threshold; recompute σ(v) for changed nodes; run promotion/eviction sweep |
| Prototype Recomputation | ✅ Complete | `daydreamer/PrototypeRecomputer.ts` | Recalculate volume/shelf medoids and centroids; recompute salience for affected entries; run tier-quota promotion/eviction |
| Full Neighbor Graph Recalc | ✅ Complete | `daydreamer/FullNeighborRecalc.ts` | Rebuild bounded neighbor lists for dirty volumes; batch size bounded by O(√(t log t)) per idle cycle; recompute salience after recalc. |
| Idle Scheduler | ❌ Missing | `daydreamer/IdleScheduler.ts` (planned) | Cooperative background loop; interruptible; respects CPU budget |
| Hebbian Updates | ❌ Missing | `daydreamer/HebbianUpdater.ts` (planned) | LTP (strengthen), LTD (decay), prune below threshold; recompute σ(v) for changed nodes; run promotion/eviction sweep |
| Prototype Recomputation | ❌ Missing | `daydreamer/PrototypeRecomputer.ts` (planned) | Recalculate volume/shelf medoids and centroids; recompute salience for affected entries; run tier-quota promotion/eviction |
| Full Neighbor Graph Recalc | ❌ Missing | `daydreamer/FullNeighborRecalc.ts` (planned) | Rebuild bounded neighbor lists for dirty volumes; batch size bounded by O(√(t log t)) per idle cycle; recompute salience after recalc. |
| Experience Replay | ❌ Missing | `daydreamer/ExperienceReplay.ts` (planned) | Simulate queries to reinforce connections |
| Cluster Stability | ✅ Complete | `daydreamer/ClusterStability.ts` | Lightweight label propagation for community detection; stores community labels in PageActivity; detects oversized and empty communities |
| Experience Replay | ✅ Complete | `daydreamer/ExperienceReplay.ts` | Simulate queries to reinforce connections; recent-biased sampling; LTP on traversed edges |
| Cluster Stability | ✅ Complete | `daydreamer/ClusterStability.ts` | Lightweight label propagation for community detection; stores community labels in PageActivity; detects oversized and empty communities; volume split/merge with orphan deletion |

**Daydreamer Status:** 4/6 complete (66%)

**Note:** Not a v1 blocker — system can ship without background consolidation (manual recalc only). Community detection is required before graph-community quota enforcement is active.
**Daydreamer Status:** 6/6 complete (100%)

---
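The Hebbian Updates row above describes an LTP/LTD/prune cycle. A minimal sketch of that sweep, assuming an illustrative `Edge` shape and rate constants (the `weight`, `coActivated`, `LTP_RATE`, `LTD_DECAY`, and `PRUNE_THRESHOLD` names are hypothetical, not the actual `daydreamer/HebbianUpdater.ts` API):

```typescript
// Illustrative edge shape; the real store likely keys edges by node hashes.
interface Edge {
  weight: number;        // connection strength in [0, 1]
  coActivated: boolean;  // did both endpoints fire this cycle?
}

// Assumed constants for the sketch.
const LTP_RATE = 0.1;        // long-term potentiation step
const LTD_DECAY = 0.95;      // long-term depression (multiplicative decay)
const PRUNE_THRESHOLD = 0.05;

// One Hebbian sweep: strengthen co-activated edges (LTP), decay the rest
// (LTD), and drop edges below the prune threshold. Callers would then
// recompute σ(v) for nodes whose incident edges changed.
function hebbianSweep(edges: Edge[]): Edge[] {
  return edges
    .map((e) => ({
      ...e,
      weight: e.coActivated
        ? Math.min(1, e.weight + LTP_RATE * (1 - e.weight)) // LTP
        : e.weight * LTD_DECAY,                             // LTD
    }))
    .filter((e) => e.weight >= PRUNE_THRESHOLD);            // prune
}
```

The saturating LTP step (`1 - e.weight` factor) keeps weights bounded without an explicit clamp being the only guard; the actual updater may use different dynamics.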

@@ -401,9 +395,9 @@ This document tracks the implementation status of each major module in CORTEX. I
**Impact:** Queries return flat top-K results only; no epistemic balance, no knowledge gap detection, no P2P curiosity.
**Mitigation:** Phase 2 priority; depends on semantic neighbor graph (Blocker 1) and hierarchy builder.

### Blocker 3: No Privacy-Safe Sharing or Curiosity Broadcasting Pipeline
### Blocker 3: No Privacy-Safe Sharing or Curiosity Broadcasting Pipeline — RESOLVED
**Impact:** Core discovery-sharing value proposition is missing; knowledge gaps cannot be resolved via P2P.
**Mitigation:** Phase 3 required track; implement eligibility classifier + curiosity broadcaster + signed subgraph exchange as v1 scope. CuriosityProbe must include `mimeType` and `modelUrn` to prevent incommensurable graph merges.
**Resolution:** Phase 3 sharing pipeline fully implemented. `sharing/EligibilityClassifier.ts` blocks PII/credential/financial/health content. `sharing/CuriosityBroadcaster.ts` provides rate-limited probe broadcasting with fragment response handling. `sharing/SubgraphExporter.ts` and `sharing/SubgraphImporter.ts` handle eligibility-filtered export and schema-validated import with sender identity stripping. `sharing/PeerExchange.ts` orchestrates opt-in signed subgraph exchange. CuriosityProbe includes `mimeType` and `modelUrn` to prevent incommensurable graph merges.
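The eligibility gate described above can be sketched as a pattern-based filter. This is a hypothetical reduction, assuming a minimal `Page` shape and a few illustrative block patterns; the real `sharing/EligibilityClassifier.ts` also covers financial and health content:

```typescript
// Illustrative page shape for the sketch.
interface Page {
  pageId: string;
  content: string;
}

// Assumed block patterns: any match marks a page ineligible for sharing.
const BLOCK_PATTERNS: RegExp[] = [
  /[\w.+-]+@[\w-]+\.[\w.]+/,                        // email addresses (PII)
  /\b(password|api[_-]?key|secret)\s*[=:]\s*\S+/i,  // credentials
  /\b\d{3}-\d{2}-\d{4}\b/,                          // SSN-shaped numbers
];

// Keep only pages that match no block pattern.
function filterEligibleSketch(pages: Page[]): Page[] {
  return pages.filter(
    (p) => !BLOCK_PATTERNS.some((re) => re.test(p.content)),
  );
}
```

A deny-list like this fails open on novel PII formats, which is why the classifier is paired with opt-in export and sender identity stripping downstream.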

### Blocker 4: Naming Drift (P0-X) — RESOLVED
**Impact:** The term "Metroid" was used for the proximity graph in all code. MetroidBuilder cannot be introduced without a rename collision.
41 changes: 1 addition & 40 deletions daydreamer/ClusterStability.ts
@@ -675,45 +675,6 @@ export class ClusterStability {
private async collectAllShelves(
metadataStore: MetadataStore,
) {
// MetadataStore does not expose a `getAllShelves()` helper, so we iterate
// over all volumes and collect the shelves that reference them.
// We use the reverse-index helper to get shelves for each volume.
const allVolumes = await this.collectAllVolumes(metadataStore);
const shelfMap = new Map<Hash, Awaited<ReturnType<MetadataStore["getShelf"]>>>();

for (const volume of allVolumes) {
const shelves = await metadataStore.getShelvesByVolume(volume.volumeId);
for (const shelf of shelves) {
if (!shelfMap.has(shelf.shelfId)) {
shelfMap.set(shelf.shelfId, shelf);
}
}
}

return [...shelfMap.values()].filter(
(s): s is NonNullable<typeof s> => s !== undefined,
);
}

private async collectAllVolumes(
metadataStore: MetadataStore,
): Promise<Volume[]> {
const allPages = await metadataStore.getAllPages();
const volumeIds = new Set<Hash>();

for (const page of allPages) {
const books = await metadataStore.getBooksByPage(page.pageId);
for (const book of books) {
const volumes = await metadataStore.getVolumesByBook(book.bookId);
for (const volume of volumes) {
volumeIds.add(volume.volumeId);
}
}
}

const volumes = await Promise.all(
[...volumeIds].map((id) => metadataStore.getVolume(id)),
);
return volumes.filter((v): v is Volume => v !== undefined);
return metadataStore.getAllShelves();
}
}
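ClusterStability's community detection rests on label propagation over the neighbor graph. A minimal synchronous-sweep sketch, assuming an adjacency map keyed by node id (the `propagateLabels` name and `maxIters` parameter are illustrative, not the `runLabelPropagation` signature used in the tests below):

```typescript
// Label propagation: each node starts with its own label, then repeatedly
// adopts the most frequent label among its neighbors until no label changes.
function propagateLabels(
  neighbors: Map<string, string[]>,
  maxIters = 10,
): Map<string, string> {
  const labels = new Map<string, string>();
  for (const node of neighbors.keys()) labels.set(node, node);

  for (let i = 0; i < maxIters; i++) {
    let changed = false;
    for (const [node, nbrs] of neighbors) {
      if (nbrs.length === 0) continue;
      // Tally neighbor labels.
      const counts = new Map<string, number>();
      for (const n of nbrs) {
        const l = labels.get(n) ?? n;
        counts.set(l, (counts.get(l) ?? 0) + 1);
      }
      // Adopt the most frequent neighbor label.
      let best = labels.get(node)!;
      let bestCount = -1;
      for (const [l, c] of counts) {
        if (c > bestCount) { best = l; bestCount = c; }
      }
      if (best !== labels.get(node)) {
        labels.set(node, best);
        changed = true;
      }
    }
    if (changed === false) break; // converged
  }
  return labels;
}
```

In-place updates (asynchronous propagation) converge faster than a two-phase sweep at the cost of order sensitivity; a bounded `maxIters` keeps the pass within an idle-cycle budget.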
135 changes: 135 additions & 0 deletions tests/integration/Daydreamer.test.ts
import { strengthenEdges, decayAndPrune } from "../../daydreamer/HebbianUpdater"
import { runFullNeighborRecalc } from "../../daydreamer/FullNeighborRecalc";
import { recomputePrototypes } from "../../daydreamer/PrototypeRecomputer";
import { runLabelPropagation } from "../../daydreamer/ClusterStability";
import { CuriosityBroadcaster } from "../../sharing/CuriosityBroadcaster";
import type { P2PTransport } from "../../sharing/CuriosityBroadcaster";
import { filterEligible } from "../../sharing/EligibilityClassifier";
import { importFragment } from "../../sharing/SubgraphImporter";
import type { GraphFragment, PeerMessage } from "../../sharing/types";
import type { ModelProfile } from "../../core/ModelProfile";

// ---------------------------------------------------------------------------
@@ -238,6 +243,136 @@ describe("Daydreamer integration", () => {
}
});

it("curiosity broadcasting filters out PII pages from eligible content", async () => {
const metadataStore = await IndexedDbMetadataStore.open(freshDbName());
const vectorStore = new MemoryVectorStore();
const runner = makeRunner();
const profile = makeProfile();
const keyPair = await generateKeyPair();
const now = Date.now();

// Ingest eligible (public-interest) content
const eligibleRes = await ingestText(CORPUS[0], {
modelProfile: profile,
embeddingRunner: runner,
vectorStore,
metadataStore,
keyPair,
now,
});

// Ingest PII-bearing content (contains an email address and credential)
const piiRes = await ingestText(
"Please contact admin@example.com for the API key password=secret123 to access the private dashboard.",
{
modelProfile: profile,
embeddingRunner: runner,
vectorStore,
metadataStore,
keyPair,
now,
},
);

// Set up a mock P2P transport and CuriosityBroadcaster
const broadcastLog: PeerMessage[] = [];
const transport: P2PTransport = {
broadcast: async (msg) => { broadcastLog.push(msg); },
onMessage: (_handler) => {
// Intentionally not wiring inbound messages for this integration test
},
};

const broadcaster = new CuriosityBroadcaster({
transport,
nodeId: "test-node",
rateLimitMs: 0,
});

// Enqueue a curiosity probe referencing a valid page
const eligiblePageId = eligibleRes.pages[0].pageId;
broadcaster.enqueueProbe({
m1: eligiblePageId,
partialMetroid: { m1: eligiblePageId },
queryContextB64: "AAAA",
knowledgeBoundary: profile.embeddingDimension,
mimeType: "text/plain",
modelUrn: "urn:model:test:v1",
timestamp: new Date(now).toISOString(),
});

// Flush broadcasts the probe
const sent = await broadcaster.flush(now);
expect(sent).toBe(1);
expect(broadcastLog).toHaveLength(1);
expect(broadcastLog[0].kind).toBe("curiosity_probe");

// Verify that PII pages are blocked by the eligibility classifier
const piiPageIds = piiRes.pages.map((p) => p.pageId);
const eligiblePageIds = eligibleRes.pages.map((p) => p.pageId);

const allPages = await metadataStore.getAllPages();
const eligible = filterEligible(allPages);
const eligibleIds = new Set(eligible.map((p) => p.pageId));

// PII pages must be excluded from eligible set
for (const piiId of piiPageIds) {
expect(eligibleIds.has(piiId)).toBe(false);
}

// Public-interest pages must be included in eligible set
for (const id of eligiblePageIds) {
expect(eligibleIds.has(id)).toBe(true);
}
});

it("imported graph fragment pages are discoverable via MetadataStore", async () => {
const metadataStore = await IndexedDbMetadataStore.open(freshDbName());
const vectorStore = new MemoryVectorStore();
const now = Date.now();

// Simulate receiving a graph fragment from a peer
const fragment: GraphFragment = {
fragmentId: "frag-integration-1",
probeId: "probe-1",
nodes: [
{
pageId: "imported-page-1",
content: "Peer-shared knowledge about distributed consensus algorithms and their applications.",
embeddingOffset: 0,
embeddingDim: EMBEDDING_DIM,
contentHash: "hash1",
vectorHash: "vhash1",
creatorPubKey: "peer-pub-key",
signature: "peer-sig",
createdAt: new Date(now).toISOString(),
},
],
edges: [],
signatures: {},
timestamp: new Date(now).toISOString(),
};

const result = await importFragment(fragment, {
metadataStore,
vectorStore,
verifyContentHashes: false,
});

// Nodes should be imported
expect(result.nodesImported).toBe(1);
expect(result.rejected).toHaveLength(0);

// Imported page should be discoverable
const imported = await metadataStore.getPage("imported-page-1");
expect(imported).toBeDefined();
expect(imported?.content).toContain("distributed consensus");

// Sender identity must be stripped
expect(imported?.creatorPubKey).toBe("");
expect(imported?.signature).toBe("");
});

it("community labels are assigned to pages after label propagation", async () => {
const metadataStore = await IndexedDbMetadataStore.open(freshDbName());
const vectorStore = new MemoryVectorStore();