#26: Fix data loss on Submission with multiple files (#146)
Conversation
```ts
}: {
  records: Record<string, unknown>[];
  entityName: string;
  data: EntityData;
```
This submit method now accepts a data argument, a map of entity names to arrays of raw records to be processed in the submission. This ensures the submission is processed and validated as a single batch.
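A minimal sketch of what such a batch-oriented submit signature might look like. All names here (`EntityData`, `submit`, the return shape) are hypothetical illustrations, not the actual service API:

```typescript
// Hypothetical: a map of entity names to the raw records staged for each entity.
type EntityData = Record<string, Record<string, unknown>[]>;

// Sketch of a submit method that receives the whole dataset at once,
// so it can be persisted and validated as a single batch.
function submit(data: EntityData): { entityNames: string[]; totalRecords: number } {
  const entityNames = Object.keys(data);
  // Count every record across all entities in the batch.
  const totalRecords = entityNames.reduce((n, name) => n + data[name].length, 0);
  return { entityNames, totalRecords };
}
```

Passing the complete dataset in one call is what lets the service update and validate once per submission instead of once per file.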
```ts
await update(activeSubmission.id, {
  data: mergedSubmissionData,
  updatedBy: username,
});
```
When new data is added to the submission, it is merged with the existing data in the active submission and then persisted to the database. This ensures the submission contains all merged data and is complete before validation occurs.
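The merge step could look something like the sketch below. The helper name `mergeSubmissionData` and the per-entity concatenation strategy are assumptions for illustration; the PR's actual merge logic may differ:

```typescript
type EntityData = Record<string, Record<string, unknown>[]>;

// Hypothetical helper: merge incoming records into the submission's existing
// data, concatenating per-entity arrays so nothing already staged is dropped.
function mergeSubmissionData(existing: EntityData, incoming: EntityData): EntityData {
  const merged: EntityData = { ...existing };
  for (const [entityName, records] of Object.entries(incoming)) {
    merged[entityName] = [...(merged[entityName] ?? []), ...records];
  }
  return merged;
}
```

Building the merged object before the single `update` call is what keeps the persisted submission complete ahead of validation.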
Just relocated these methods from a large file, regrouping related functions into smaller files.
This PR has been split into smaller PRs:
Summary
Fixes an issue where processing a submission with multiple files hits a race condition in which some data is lost.
Issues
Description of Changes
Data Provider Package
The Submission Service now handles batch processing of multiple files or entity datasets: each submission is updated in the database only once, and the entire dataset is validated a single time per submission.
Processing the data as a single batch ensures consistency and prevents the race condition.
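To illustrate why the batch approach matters, here is a simplified (purely hypothetical) model of the race. With per-file read-modify-write updates, two files that read the same snapshot clobber each other's writes; a single batched write does not:

```typescript
// Toy in-memory stand-in for the submission row; not the real data model.
type Store = { data: string[] };

// Per-file approach: each file appends to a snapshot it read earlier and
// writes the result back. If two files share a snapshot, the last writer wins.
function perFileUpdate(store: Store, snapshot: string[], file: string): void {
  store.data = [...snapshot, file]; // earlier concurrent appends are lost
}

// Batch approach: all files are merged first, then the store is written once.
function batchUpdate(store: Store, files: string[]): void {
  store.data = [...store.data, ...files];
}
```

In the racy version, two calls that both start from an empty snapshot leave only one file in the store; the batched version keeps both.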
Extras included:
Readiness Checklist
.env.schema file and documented in the README