Copying large folders is unreliable #244

@auniverseaway

Description

We have several issues that create instability with large (> 500 items) folder copies.

  1. Fundamental content-integrity issues that we (or the S3 SDK) die on, such as "empty" path segments or `//` prefixes.
  2. The relationship with DA_JOBS seems brittle. I cannot reliably copy anything of meaningful size.
  3. We are not resilient to "Network changed" errors that may happen on the client. Ideally the client has a way to pick up where it left off. This could be as simple as the client resupplying the continuation token, and da-admin checking DA_JOBS to see if there's an existing job.
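A minimal sketch of points 1 and 3, assuming a plain key/value store standing in for DA_JOBS. The function and field names (`normalizeKey`, `resumeOrStartCopy`, `continuationToken`) are illustrative, not the actual da-admin API:

```javascript
// Point 1: collapse `//` runs and drop empty path segments before they
// reach the S3 SDK. Assumes keys have no meaningful leading slash.
function normalizeKey(key) {
  return key.split('/').filter(Boolean).join('/');
}

// Point 3: let a client resume an interrupted copy. `jobs` is a stand-in
// for DA_JOBS; if the supplied continuation token matches a recorded job,
// the copy resumes instead of restarting from scratch.
function resumeOrStartCopy(jobs, source, destination, continuationToken) {
  const jobKey = `copy:${source}:${destination}`;
  const existing = jobs.get(jobKey);
  if (continuationToken && existing && existing.continuationToken === continuationToken) {
    // Client is picking up where it left off.
    return { resumed: true, job: existing };
  }
  // No matching job: start fresh and record it so a retry can resume.
  const job = { source, destination, continuationToken: null, copied: 0 };
  jobs.set(jobKey, job);
  return { resumed: false, job };
}
```

On retry, the client sends back the last continuation token it saw; da-admin only restarts the copy when no matching job exists, so a mid-copy "Network changed" error costs one round trip rather than the whole folder.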
