feat: add step.s3_upload pipeline step for S3-compatible object storage #194
Conversation
…ge upload Co-authored-by: intel352 <77607+intel352@users.noreply.github.com>
Pull request overview
This PR adds a new step.s3_upload pipeline step that enables uploading base64-encoded binary data from the pipeline context to AWS S3 or S3-compatible storage (MinIO, LocalStack). The implementation provides flexible configuration with template-based key generation, environment variable expansion for credentials, and support for both AWS and custom endpoints.
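The template-based key generation described above can be sketched with Go's stdlib `text/template`. The `resolveKey` and `uuidV4` helpers below are hypothetical illustrations of the idea, not the PR's actual code:

```go
package main

import (
	"bytes"
	"crypto/rand"
	"fmt"
	"text/template"
)

// uuidV4 returns a random RFC 4122 version-4 UUID string.
// Hypothetical helper; the real step may use a UUID library instead.
func uuidV4() string {
	b := make([]byte, 16)
	rand.Read(b)
	b[6] = (b[6] & 0x0f) | 0x40 // set version 4
	b[8] = (b[8] & 0x3f) | 0x80 // set variant 10
	return fmt.Sprintf("%x-%x-%x-%x-%x", b[0:4], b[4:6], b[6:8], b[8:10], b[10:16])
}

// resolveKey renders an S3 key template such as
// "uploads/{{ .user_id }}/{{ uuid }}.png" against the pipeline context.
func resolveKey(tmpl string, ctx map[string]any) (string, error) {
	t, err := template.New("key").Funcs(template.FuncMap{"uuid": uuidV4}).Parse(tmpl)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, ctx); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	key, err := resolveKey("uploads/{{ .user_id }}/{{ uuid }}.png",
		map[string]any{"user_id": "42"})
	if err != nil {
		panic(err)
	}
	fmt.Println(key) // e.g. uploads/42/<random-uuid>.png
}
```

Registering `uuid` via `template.FuncMap` lets keys mix context values (`{{ .user_id }}`) with generated ones (`{{ uuid }}`) in a single template pass.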
Changes:
- New S3 upload pipeline step with comprehensive base64 decoding support (standard, URL-safe, and raw variants)
- Template-based key resolution supporting dynamic values like `{{ .user_id }}` and `{{ uuid }}`
- Content type resolution from both static config and dynamic pipeline context
- 11 comprehensive unit tests covering all major functionality and edge cases
- Plugin registration and manifest updates
- Code formatting improvements (struct field alignment) across multiple files
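The multi-variant base64 decoding in the first bullet can be illustrated with the stdlib `encoding/base64` package. `decodeBase64Any` is a hypothetical name, and the fallback order is an assumption, not necessarily what the PR implements:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// decodeBase64Any tries the standard, URL-safe, and raw (unpadded)
// base64 variants in turn, accepting input from any common encoder.
func decodeBase64Any(s string) ([]byte, error) {
	encodings := []*base64.Encoding{
		base64.StdEncoding,    // standard alphabet, padded
		base64.URLEncoding,    // URL-safe alphabet (-, _), padded
		base64.RawStdEncoding, // standard alphabet, no padding
		base64.RawURLEncoding, // URL-safe alphabet, no padding
	}
	var lastErr error
	for _, enc := range encodings {
		b, err := enc.DecodeString(s)
		if err == nil {
			return b, nil
		}
		lastErr = err
	}
	return nil, fmt.Errorf("not valid base64 in any variant: %w", lastErr)
}

func main() {
	// "aGVsbG8=" is padded standard, "aGVsbG8" is raw,
	// "_w==" and "_w" use the URL-safe alphabet.
	for _, s := range []string{"aGVsbG8=", "aGVsbG8", "_w==", "_w"} {
		b, err := decodeBase64Any(s)
		fmt.Printf("%q -> %v err=%v\n", s, b, err)
	}
}
```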
Reviewed changes
Copilot reviewed 6 out of 36 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `module/pipeline_step_s3_upload.go` | Core implementation of S3UploadStep with AWS SDK integration, template resolution, and base64 decoding |
| `module/pipeline_step_s3_upload_test.go` | Comprehensive test suite with 11 tests covering templating, content types, custom endpoints, and error cases |
| `plugins/pipelinesteps/plugin.go` | Registers the step.s3_upload factory and adds it to the plugin manifest |
| `plugins/pipelinesteps/plugin_test.go` | Updates test expectations to account for the new step (28 → 29 factories) |
| Various `module/*.go` files | Standard gofmt struct field alignment improvements (cosmetic only) |
@copilot apply changes based on the comments in this thread. Something to think about: there is already S3 storage functionality in workflow. Is your explicit addition of s3_upload an absolute necessity, or is it simply redundant with the existing functionality?
…egistry Co-authored-by: intel352 <77607+intel352@users.noreply.github.com>
The existing
Also fixed in e895ca6: added
@copilot tests are failing
Co-authored-by: intel352 <77607+intel352@users.noreply.github.com>
- `module/pipeline_step_s3_upload.go` with S3UploadStep implementation
- `module/pipeline_step_s3_upload_test.go` with tests
- Registered `step.s3_upload` in `plugins/pipelinesteps/plugin.go`
- Updated `plugins/pipelinesteps/plugin_test.go` to use a dynamic step count (avoids a hardcoded count breaking on merges)
- Added `step.s3_upload` to `coreModuleTypes` in `schema/schema.go` (sorted)
- Added `step.s3_upload` schema entry to `schema/module_schema.go`
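The "dynamic step count" change in the checklist above can be sketched as follows. `stepFactories`, `manifestSteps`, and `checkManifest` are hypothetical stand-ins for the plugin's real registry and manifest, which hold around 29 entries:

```go
package main

import "fmt"

// Hypothetical stand-in for the step-factory registry in
// plugins/pipelinesteps/plugin.go.
var stepFactories = map[string]func() any{
	"step.s3_upload": func() any { return nil },
	"step.http":      func() any { return nil },
}

// Hypothetical stand-in for the plugin manifest's step list.
var manifestSteps = []string{"step.http", "step.s3_upload"}

// checkManifest verifies the manifest and the factory registry agree,
// without hardcoding a count that breaks every time a step is added.
func checkManifest() error {
	if len(manifestSteps) != len(stepFactories) {
		return fmt.Errorf("manifest lists %d steps, registry has %d",
			len(manifestSteps), len(stepFactories))
	}
	for _, name := range manifestSteps {
		if _, ok := stepFactories[name]; !ok {
			return fmt.Errorf("manifest step %q has no factory", name)
		}
	}
	return nil
}

func main() {
	fmt.Println(checkManifest()) // nil when registry and manifest agree
}
```

Deriving the expected count from the registry itself means a merge that adds another step cannot fail the test the way a hardcoded `28 → 29` assertion would.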