Conversation
Ref: #2136
Signed-off-by: Tomas Dvorak <toomas2d@gmail.com>
Summary of Changes

Hello @Tomas2D, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request significantly expands the SDK's capabilities for S3 file storage by introducing a new extension that allows agents to perform S3 operations using dynamically injected, IAM-scoped credentials. This provides a robust and secure alternative to the existing presigned URL method, particularly for scenarios requiring on-demand uploads or fine-grained access control. The changes are accompanied by two detailed examples that showcase both approaches, making it easier for developers to integrate S3 file handling into their agent workflows.
Code Review
This pull request introduces S3 file storage extensions with two distinct approaches: presigned URLs and direct credential injection. The code is well-structured, with a clear separation of concerns between the client and server implementations for each method. The inclusion of comprehensive examples is also a great addition. My review focuses on improving maintainability by reducing code duplication and enhancing efficiency in handling S3 bucket checks.
```python
try:
    await s3.head_bucket(Bucket=self._config.bucket)
except ClientError as e:
    if e.response["Error"]["Code"] in ("404", "NoSuchBucket"):
        await s3.create_bucket(Bucket=self._config.bucket)
    else:
        raise
```
The check for bucket existence is performed on every call to `create_upload_slot`. When creating multiple upload slots concurrently (e.g., with `asyncio.gather`), this can lead to redundant `head_bucket` calls and a potential race condition where multiple coroutines attempt to create the same bucket if it doesn't exist. While `create_bucket` is generally idempotent, this approach is inefficient.
Consider performing this check only once per `S3ExtensionClient` instance. You could use an `asyncio.Lock` and a flag to ensure the check is performed only by the first coroutine that needs it.
```python
async def upload(self, filename: str, content: bytes, content_type: str = "application/octet-stream") -> str:
    """PUT content at {prefix}{filename} and return a presigned GET URL."""
    if not self.data:
        raise ExtensionError(self.spec, "S3 credentials extension metadata was not provided")

    try:
        import aioboto3
    except ImportError as e:
        raise ImportError(
            "aioboto3 is required for S3CredentialsExtensionServer. "
            "Install it with: pip install 'agentstack-sdk[s3]'"
        ) from e

    from botocore.config import Config

    creds = self.data
    key = f"{creds.prefix}{filename}"
    session = aioboto3.Session()
    async with session.client(
        "s3",
        endpoint_url=creds.endpoint,
        aws_access_key_id=creds.access_key_id,
        aws_secret_access_key=creds.secret_access_key,
        aws_session_token=creds.session_token,
        region_name=creds.region,
        use_ssl=creds.use_ssl,
        config=Config(signature_version="s3v4", s3={"addressing_style": "path"}),
    ) as s3:
        await s3.put_object(Bucket=creds.bucket, Key=key, Body=content, ContentType=content_type)
        download_url = await s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": creds.bucket, "Key": key},
            ExpiresIn=creds.ttl,
        )
        return download_url

async def download(self, key: str) -> bytes:
    """GET object at {prefix}{key} using injected credentials."""
    if not self.data:
        raise ExtensionError(self.spec, "S3 credentials extension metadata was not provided")

    try:
        import aioboto3
    except ImportError as e:
        raise ImportError(
            "aioboto3 is required for S3CredentialsExtensionServer. "
            "Install it with: pip install 'agentstack-sdk[s3]'"
        ) from e

    from botocore.config import Config

    creds = self.data
    full_key = f"{creds.prefix}{key}"
    session = aioboto3.Session()
    async with session.client(
        "s3",
        endpoint_url=creds.endpoint,
        aws_access_key_id=creds.access_key_id,
        aws_secret_access_key=creds.secret_access_key,
        aws_session_token=creds.session_token,
        region_name=creds.region,
        use_ssl=creds.use_ssl,
        config=Config(signature_version="s3v4", s3={"addressing_style": "path"}),
    ) as s3:
        resp = await s3.get_object(Bucket=creds.bucket, Key=full_key)
        return await resp["Body"].read()
```
Add S3 credentials extension and file storage examples

Introduces `S3CredentialsExtensionSpec` to the SDK and adds two end-to-end examples covering both S3 file storage approaches.

SDK: new credentials extension

Added `s3_credentials.py` (split from `s3.py` to keep modes separate):
- Credential fields are redacted on serialization via `@field_serializer` + `redact_str()`.
- Uses `aioboto3` with injected credentials to `put_object` and generate a presigned GET URL. Returns a plain HTTPS URL (no S3 SDK required by caller).
- `metadata()` for demo/dev (long-lived keys) and `from_sts(role_arn=...)` for production (AssumeRole with an inline policy scoped to `contexts/{context_id}/*`).

Example: s3-translation-presigned

Shows the presigned URL approach (`S3ExtensionSpec`). The client predeclares output "slots," generates PUT+GET URLs, and passes them in metadata. The agent writes via HTTPS without S3 credentials. Produces two files (reversed text + stats) to demonstrate multi-slot usage. Notes limitations and when to prefer credentials.

Example: s3-translation-credentials

Shows the credentials approach (`S3CredentialsExtensionSpec`). The client injects short-lived, IAM-scoped credentials; the agent uploads freely within `contexts/{context_id}/`, with isolation enforced by IAM, not app code.

Both examples include `uv run server` / `uv run client`, an `.env.example` for local MinIO, and a README with setup, flow, and tradeoffs.