26 changes: 16 additions & 10 deletions packages/cli/skills/dkg-node/SKILL.md
@@ -69,10 +69,10 @@ curl -X POST $BASE_URL/api/shared-memory/write \
**Step 3 — Publish to Verified Memory:**

```bash
-curl -X POST $BASE_URL/api/publish \
+curl -X POST $BASE_URL/api/shared-memory/publish \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
--d '{"contextGraphId": "my-context-graph", "quads": [...]}'
+-d '{"contextGraphId": "my-context-graph"}'
```

**Step 4 — Query:**
@@ -81,7 +81,7 @@ curl -X POST $BASE_URL/api/publish \
curl -X POST $BASE_URL/api/query \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
--d '{"sparql": "SELECT * WHERE { ?s ?p ?o } LIMIT 10", "contextGraphId": "my-context-graph", "includeSharedMemory": true}'
+-d '{"sparql": "SELECT * WHERE { ?s ?p ?o } LIMIT 10", "contextGraphId": "my-context-graph", "view": "shared-working-memory"}'
🔴 Bug: Step 3 publishes from shared memory with clearAfter defaulting to true, so querying view: "shared-working-memory" in the next step will usually come back empty. If this quick start is meant to confirm the publish, query the context graph without the SWM view instead, or add "clearAfter": false to the publish example.

```
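Following the bug note above, a minimal sketch of a publish payload that keeps the quick start verifiable. The `clearAfter` field and its default are taken from the review comment, not from the documented API surface, so treat them as assumptions:

```typescript
// Sketch: `clearAfter` semantics come from the review comment above, not the
// documented API; verify against the node before relying on this.
interface PublishBody {
  contextGraphId: string;
  clearAfter?: boolean;
}

// Default the field to false so a follow-up query with
// view "shared-working-memory" still returns the written triples.
function buildPublishBody(contextGraphId: string, clearAfter = false): PublishBody {
  return { contextGraphId, clearAfter };
}
```

The resulting object would be serialized as the `-d` payload of the Step 3 curl call.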

## 4. Authentication
@@ -111,7 +111,8 @@ The token is configured in the node's config file or provided at startup.

### Querying

-- `POST /api/query` — SPARQL query with optional `view` (`working-memory`, `shared-working-memory`, `verified-memory`), `agentAddress`, `assertionName`, `verifiedGraph`, `subGraphName`, `includeSharedMemory`, `contextGraphId` parameters
+- `POST /api/query` — SPARQL query with optional `contextGraphId`, `includeSharedMemory`, `view` (`working-memory`, `shared-working-memory`, `verified-memory`), `agentAddress`, `assertionName`, `verifiedGraph` parameters
+  - **Note:** `subGraphName` is supported for legacy routing only and cannot be combined with `view`
- `POST /api/query-remote` — query a remote peer via P2P

### Working Memory (WM) — Private assertions (🚧 Planned)
Expand Down Expand Up @@ -179,14 +180,19 @@ curl -X POST $BASE_URL/api/assertion/my-assertion/import-file \
| 502 | Chain/upstream error | Retry — transient blockchain issue |
| 503 | Service unavailable | Node is starting up or shutting down |

-## 10. Workflow Recipes
+## 10. Common Workflows

-For detailed step-by-step workflow recipes and the full endpoint reference, see
-the supporting files in the skill directory:
-
-- `workflows.md` — 10 workflow recipes with curl examples
-- `api-reference.md` — full endpoint reference grouped by workflow
-- `examples/sparql-recipes.md` — SPARQL query patterns
+**Write → Share → Publish:**
+
+1. Create a context graph (`POST /api/context-graph/create`)
+2. Write triples to shared memory (`POST /api/shared-memory/write`)
+3. Publish to verified memory (`POST /api/shared-memory/publish`)
+
+**Query across layers:**
+
+- Shared memory: `{"sparql": "...", "contextGraphId": "...", "view": "shared-working-memory"}`
+- Verified memory: `{"sparql": "...", "contextGraphId": "...", "view": "verified-memory"}`
+- Working memory (planned): `{"sparql": "...", "view": "working-memory", "agentAddress": "...", "contextGraphId": "..."}`
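The layer/view pairings above can be condensed into a small request-body builder. The `View` names and the agentAddress requirement come from this PR's docs; everything else (function name, error text) is illustrative:

```typescript
type View = 'working-memory' | 'shared-working-memory' | 'verified-memory';

// Build a /api/query request body for a given memory layer.
// Per the docs above, the (planned) working-memory view also needs agentAddress.
function buildQueryBody(
  sparql: string,
  contextGraphId: string,
  view: View,
  agentAddress?: string,
): Record<string, unknown> {
  const body: Record<string, unknown> = { sparql, contextGraphId, view };
  if (view === 'working-memory') {
    if (!agentAddress) throw new Error('agentAddress is required for working-memory view');
    body.agentAddress = agentAddress;
  }
  return body;
}
```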

## Appendix: V9 → V10 Migration

7 changes: 4 additions & 3 deletions packages/cli/src/daemon.ts
@@ -107,7 +107,7 @@ function buildSkillMd(opts: {
`- **Base URL:** ${opts.baseUrl}`,
`- **Peer ID:** ${opts.peerId}`,
`- **Node role:** ${opts.nodeRole}`,
-`- **Available extraction pipelines:** ${opts.extractionPipelines.length > 0 ? opts.extractionPipelines.join(', ') : 'text/markdown'}`,
+`- **Available extraction pipelines:** ${opts.extractionPipelines.length > 0 ? opts.extractionPipelines.join(', ') : 'none (install markitdown to enable document conversion)'}`,
`- **Subscribed Context Graphs:** use \`GET /api/context-graph/list\` (requires auth)`,
].join('\n');

@@ -1227,7 +1227,7 @@ async function handleRequest(
const proto = req.headers['x-forwarded-proto'] ?? 'http';
const host = req.headers['x-forwarded-host'] ?? req.headers.host ?? `localhost:${config.listenPort ?? 9200}`;
const baseUrl = `${proto}://${host}`;
-const pipelines = ['text/markdown', ...extractionRegistry.availableContentTypes()];
+const pipelines = extractionRegistry.availableContentTypes();
const content = buildSkillMd({
version: nodeVersion,
baseUrl,
@@ -1244,6 +1244,7 @@
'Content-Type': 'text/markdown; charset=utf-8',
'ETag': etag,
'Cache-Control': 'public, max-age=300',
+'Vary': 'Host, X-Forwarded-Host, X-Forwarded-Proto',
});
res.end(content);
return;
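The added `Vary` header exists because the generated SKILL.md embeds a base URL computed from request headers, so a shared cache must key on those headers too. A condensed sketch of the derivation shown in the hunk above (function name is illustrative):

```typescript
// Derive the advertised base URL from proxy-aware headers, falling back to
// the local listen port. Cached copies of the response must therefore vary
// on x-forwarded-proto, x-forwarded-host, and host.
function baseUrlFor(
  headers: Record<string, string | undefined>,
  listenPort = 9200,
): string {
  const proto = headers['x-forwarded-proto'] ?? 'http';
  const host = headers['x-forwarded-host'] ?? headers['host'] ?? `localhost:${listenPort}`;
  return `${proto}://${host}`;
}
```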
@@ -2116,7 +2117,7 @@ async function handleRequest(
msg.startsWith('SPARQL rejected:') || msg.startsWith('Parse error') ||
/must start with (SELECT|CONSTRUCT|ASK|DESCRIBE)/i.test(msg) ||
msg.includes('was removed in V10') ||
-msg.includes('requires agentAddress') || msg.includes('requires contextGraphId') ||
+msg.includes('agentAddress is required') || msg.includes('requires a contextGraphId') ||
🟡 Issue: The 400/500 split here still depends on exact exception text, and this change makes that coupling tighter. A small wording change in @origintrail-official/dkg-query would turn these client errors back into 500s. Either add request-level regression tests covering queries with a missing agentAddress and view queries with a missing contextGraphId, or switch to typed errors/status codes.

Contributor Author:
Acknowledged — this is a known coupling risk. A typed error hierarchy (e.g. QueryValidationError with a statusCode property) in dkg-query would be the clean fix. Tracking for a follow-up.
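The typed-error follow-up the author mentions could look like the sketch below. `QueryValidationError` does not exist in dkg-query yet, so both the class and the daemon-side branch are proposals, not current code:

```typescript
// Proposed typed error carrying an HTTP status code, so the daemon can branch
// on instanceof instead of matching exception message text.
class QueryValidationError extends Error {
  constructor(message: string, readonly statusCode: number = 400) {
    super(message);
    this.name = 'QueryValidationError';
  }
}

// Daemon-side mapping becomes immune to wording changes in dkg-query:
// validation errors keep their status, everything else stays a 500.
function statusFor(err: unknown): number {
  return err instanceof QueryValidationError ? err.statusCode : 500;
}
```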

🔴 Bug: `agent.query()` still throws `Invalid sub-graph name for query: ...` for bad `subGraphName` values, but this 400 allowlist does not catch that case, so malformed client input still falls through as a 500. Since this PR now documents `subGraphName`, add that validation error here or reject it before calling `agent.query`.

msg.includes('cannot be combined with')
) {
return jsonResponse(res, 400, { error: msg });
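One way to close the fallthrough described in the bug comment above is to reject the invalid input before `agent.query` is ever called, rather than growing the message allowlist. A sketch of that pre-validation under illustrative types (the real daemon request shape may differ):

```typescript
// Illustrative request shape; the daemon's actual types may differ.
interface QueryRequest {
  sparql: string;
  view?: string;
  subGraphName?: string;
}

// Returns an error message suitable for a 400 response, or null if valid.
// Mirrors the documented rule that legacy subGraphName cannot be combined
// with the V10 view parameter.
function validateQueryRequest(req: QueryRequest): string | null {
  if (req.subGraphName !== undefined && req.view !== undefined) {
    return 'subGraphName cannot be combined with view';
  }
  return null;
}
```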
2 changes: 1 addition & 1 deletion packages/cli/test/skill-endpoint.test.ts
@@ -70,7 +70,7 @@ describe('SKILL.md file', () => {
expect(skillContent).toContain('## 7. File Ingestion');
expect(skillContent).toContain('## 8. Node Administration');
expect(skillContent).toContain('## 9. Error Reference');
-expect(skillContent).toContain('## 10. Workflow Recipes');
+expect(skillContent).toContain('## 10. Common Workflows');
});

it('contains dynamic placeholders for node info', () => {