
Conversation

Contributor

Copilot AI commented Feb 5, 2026

Webhook handler fails when removing 140 pieces from a dataset: `D1_ERROR: too many SQL variables at offset 766: SQLITE_ERROR`. The function generates a single INSERT with 280 parameters (2 per piece), exceeding Cloudflare D1's limit of 100 bound parameters per query.
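
For contrast, a rough reconstruction of the pre-fix pattern is shown below. This is a sketch inferred from the fixed implementation, not the literal original code; it binds two values per piece in one statement, so 140 pieces produce 280 bound parameters.

```js
// Pre-fix sketch (reconstructed, not the literal original code):
// one INSERT spanning every piece, binding 2 values per piece.
// 140 pieces -> 280 bound parameters, far above D1's limit of 100.
export async function removeDataSetPieces(env, dataSetId, pieceIds) {
  await env.DB.prepare(
    `INSERT INTO pieces (id, data_set_id, is_deleted)
     VALUES ${pieceIds.map(() => '(?, ?, TRUE)').join(', ')}
     ON CONFLICT DO UPDATE SET is_deleted = TRUE`
  )
    .bind(...pieceIds.flatMap((pieceId) => [pieceId, dataSetId]))
    .run()
}
```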

Changes

  • removeDataSetPieces: Process pieces in batches of 50 (100 bound parameters per statement); see the implementation below
  • withPieces test helper: Process pieces in batches of 25 (100 bound parameters per statement); a hedged sketch follows the implementation below
  • New test: Validates that removing 140 pieces succeeds; a test sketch appears at the end of this description

Implementation

```js
export async function removeDataSetPieces(env, dataSetId, pieceIds) {
  const BATCH_SIZE = 50 // 2 bound parameters per piece = 100 per statement

  for (let i = 0; i < pieceIds.length; i += BATCH_SIZE) {
    const batch = pieceIds.slice(i, i + BATCH_SIZE)
    // Upsert the batch; re-marking an already deleted piece is a no-op,
    // so retrying after a partial failure is safe.
    await env.DB.prepare(
      `INSERT INTO pieces (id, data_set_id, is_deleted)
       VALUES ${batch.map(() => '(?, ?, TRUE)').join(', ')}
       ON CONFLICT DO UPDATE SET is_deleted = TRUE`
    )
      .bind(...batch.flatMap((pieceId) => [pieceId, dataSetId]))
      .run()
  }
}
```
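
The withPieces test helper mentioned in the change list applies the same pattern with a smaller batch size. A minimal sketch follows; the piece object shape and the columns beyond id and data_set_id (cid, raw_size) are illustrative assumptions, not the repository's actual schema.

```js
// Hypothetical sketch of the test helper. Assumes 4 bound values per piece,
// so a batch size of 25 stays at the 100-parameter ceiling (25 * 4 = 100).
export async function withPieces(env, dataSetId, pieces) {
  const BATCH_SIZE = 25
  for (let i = 0; i < pieces.length; i += BATCH_SIZE) {
    const batch = pieces.slice(i, i + BATCH_SIZE)
    await env.DB.prepare(
      `INSERT INTO pieces (id, data_set_id, cid, raw_size)
       VALUES ${batch.map(() => '(?, ?, ?, ?)').join(', ')}`
    )
      .bind(...batch.flatMap((p) => [p.id, dataSetId, p.cid, p.rawSize ?? 0]))
      .run()
  }
}
```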

Sequential batch processing trades some latency for reliability. Each batch executes as its own statement, so a failure part-way through does not corrupt the rest of the operation; because the upsert is idempotent, the handler can simply be retried.
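
A regression test along the lines below covers the 140-piece case. The vitest plus cloudflare:test setup, the import path, and the assertions are assumptions based on the change list, not a copy of the test added in this PR.

```js
import { describe, it, expect } from 'vitest'
// `env` with a DB binding is assumed to come from the Workers vitest pool.
import { env } from 'cloudflare:test'
// Module path taken from the stack trace in the issue; adjust as needed.
import { removeDataSetPieces } from '../lib/pdp-verifier-handlers.js'

describe('removeDataSetPieces', () => {
  it('removes 140 pieces without exceeding the D1 parameter limit', async () => {
    const dataSetId = '6725'
    const pieceIds = Array.from({ length: 140 }, (_, i) => String(i))

    // Batches of 50 turn this into 3 statements (50 + 50 + 40 pieces).
    await removeDataSetPieces(env, dataSetId, pieceIds)

    const { results } = await env.DB.prepare(
      'SELECT COUNT(*) AS n FROM pieces WHERE data_set_id = ? AND is_deleted = TRUE'
    )
      .bind(dataSetId)
      .all()
    expect(results[0].n).toBe(140)
  })
})
```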

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • public.chainalysis.com
    • Triggering command: /home/REDACTED/work/worker/worker/node_modules/@cloudflare/workerd-linux-64/bin/workerd /home/REDACTED/work/worker/worker/node_modules/@cloudflare/workerd-linux-64/bin/workerd serve --binary --experimental --socket-addr=entry=127.0.0.1:0 --external-addr=loopback=127.0.0.1:38319 --control-fd=3 - --verbose (dns block)
    • Triggering command: /home/REDACTED/work/worker/worker/node_modules/@cloudflare/workerd-linux-64/bin/workerd /home/REDACTED/work/worker/worker/node_modules/@cloudflare/workerd-linux-64/bin/workerd serve --binary --experimental --socket-addr=entry=127.0.0.1:0 --external-addr=loopback=127.0.0.1:36351 --control-fd=3 - --verbose (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

This section details the original issue you should resolve

<issue_title>D1_ERROR: SQLITE_ERROR - Too many SQL variables when removing 140 pieces from dataset</issue_title>
<issue_description>## Description

The filcdn-indexer-calibration script fails with a SQLite error when attempting to remove a large number of pieces from a dataset in a single operation.

Observed Error

```
{"Outcome":"exception","Logs":[{"Level":"log","Message":["Request body","{\"vid\":14729448607711332,\"block\":3429467,\"id\":\"0xf4429d206a2884bc5494ab35cc3fab0f5b61dee83a9c184d06d995b4ac5f1e870d000000\",\"data_set_id\":\"6725\",\"piece_ids\":[\"8\",\"17\",\"18\",\"10\",\"9\",\"1\",\"0\",\"6\",\"12\",\"4\",\"19\",\"14\",\"15\",\"5\",\"13\",\"11\",\"16\",\"2\",\"7\",\"3\",\"27\",\"30\",\"37\",\"22\",\"24\",\"34\",\"28\",\"38\",\"36\",\"35\",\"33\",\"23\",\"26\",\"29\",\"31\",\"39\",\"32\",\"20\",\"25\",\"21\",\"52\",\"53\",\"55\",\"41\",\"58\",\"44\",\"42\",\"50\",\"45\",\"56\",\"51\",\"48\",\"43\",\"54\",\"46\",\"40\",\"49\",\"59\",\"57\",\"76\",\"62\",\"60\",\"75\",\"78\",\"69\",\"64\",\"74\",\"73\",\"67\",\"66\",\"63\",\"77\",\"71\",\"70\",\"68\",\"79\",\"61\",\"72\",\"65\",\"85\",\"94\",\"97\",\"81\",\"80\",\"99\",\"84\",\"95\",\"93\",\"88\",\"91\",\"96\",\"86\",\"82\",\"87\",\"89\",\"92\",\"90\",\"98\",\"83\",\"100\",\"105\",\"119\",\"109\",\"112\",\"103\",\"108\",\"114\",\"102\",\"101\",\"104\",\"113\",\"115\",\"107\",\"116\",\"111\",\"110\",\"117\",\"106\",\"118\",\"129\",\"125\",\"134\",\"133\",\"137\",\"120\",\"123\",\"126\",\"121\",\"127\",\"138\",\"139\",\"122\",\"128\",\"135\",\"136\",\"130\",\"131\",\"124\"],\"block_number\":3429467,\"block_timestamp\":1770210390,\"transaction_hash\":\"0xf4429d206a2884bc5494ab35cc3fab0f5b61dee83a9c184d06d995b4ac5f1e87\",\"_gs_chain\":\"filecoin-testnet\",\"_gs_gid\":\"a93f1e0f9e36f53d56421421525216fb\"}"],"TimestampMs":1770210425621},{"Level":"log","Message":["Removing pieces (piece_ids=[8, 17, 18, 10, 9, 1, 0, 6, 12, 4, 19, 14, 15, 5, 13, 11, 16, 2, 7, 3, 27, 30, 37, 22, 24, 34, 28, 38, 36, 35, 33, 23, 26, 29, 31, 39, 32, 20, 25, 21, 52, 53, 55, 41, 58, 44, 42, 50, 45, 56, 51, 48, 43, 54, 46, 40, 49, 59, 57, 76, 62, 60, 75, 78, 69, 64, 74, 73, 67, 66, 63, 77, 71, 70, 68, 79, 61, 72, 65, 85, 94, 97, 81, 80, 99, 84, 95, 93, 88, 91, 96, 86, 82, 87, 89, 92, 90, 98, 83, 100, 105, 119, 109, 112, 103, 108, 114, 102, 101, 104, 113, 115, 107, 116, 111, 110, 117, 106, 118, 129, 125, 134, 133, 137, 120, 123, 126, 121, 127, 138, 139, 122, 128, 135, 136, 130, 131, 124], data_set_id=6725)"],"TimestampMs":1770210425621}],"Exceptions":[{"Name":"Error","Message":"D1_ERROR: too many SQL variables at offset 766: SQLITE_ERROR","TimestampMs":1770210425817}],"ScriptName":"filcdn-indexer-calibration"}
```
</issue_description>

<agent_instructions>Rework the webhook handler to split the array and create a batch of SQL queries to execute.</agent_instructions>

## Comments on the Issue (you are @copilot in this section)

<comments>
<comment_new><author>@bajtos</author><body>
Here is the exception:

D1_ERROR: too many SQL variables at offset 766: SQLITE_ERROR
at D1DatabaseSessionAlwaysPrimary._sendOrThrow (cloudflare-internal:d1-api:139:19)
at async cloudflare-internal:d1-api:348:41
at async removeDataSetPieces (lib/pdp-verifier-handlers.js:41:3)
at async Object.fetch (bin/indexer.js:153:7)
</body></comment_new>

<comment_new><author>@bajtos</author><body>
See https://developers.cloudflare.com/d1/platform/limits/

> Maximum bound parameters per query: 100</body></comment_new>
</comments>


Copilot AI changed the title from "[WIP] Fix SQLITE_ERROR when removing large number of pieces" to "Batch SQL operations to respect D1's 100-parameter limit" on Feb 5, 2026
Copilot AI requested a review from bajtos February 5, 2026 14:40
