@@ -12,17 +12,29 @@ description: Configure S3 cloud storage for the ReportPortal Flaky Test Detectio

ReportPortal supports cloud storage options through the Java library [JCLOUDS](https://jclouds.apache.org/).

To configure storage using Amazon S3, ReportPortal uses the following **environment variables for the services API, Jobs, and Authorization**:

```bash
RP_FEATURE_FLAGS: singleBucket # Enable single-bucket storage (recommended)
DATASTORE_TYPE: s3
DATASTORE_REGION: us-standard # Region of the bucket (JClouds alias for `us-east-1`)
DATASTORE_ACCESSKEY: <access_key>
DATASTORE_SECRETKEY: <secret_key>
DATASTORE_DEFAULTBUCKETNAME: my-bucket # Name of the bucket
```
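
As an illustration, these variables are typically shared across the three services in `docker-compose.yml` through a YAML anchor, as the IAM section later in this file does. The service names below are assumptions for the sketch, not taken from this diff:

```yaml
x-environment: &common-environment
  RP_FEATURE_FLAGS: singleBucket
  DATASTORE_TYPE: s3
  DATASTORE_REGION: us-standard
  DATASTORE_ACCESSKEY: <access_key>
  DATASTORE_SECRETKEY: <secret_key>
  DATASTORE_DEFAULTBUCKETNAME: my-bucket

services:
  api:
    environment: *common-environment
  jobs:
    environment: *common-environment
  uat: # Authorization service (name assumed)
    environment: *common-environment
```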

**Environment variables for for Analyzer:**

⚠️ Potential issue | 🟡 Minor

Fix the duplicate word in the header.

The header contains a typo: "for for Analyzer" should be "for Analyzer".

📝 Proposed fix

```diff
-**Environment variables for for Analyzer:**
+**Environment variables for Analyzer:**
```
🤖 Prompt for AI Agents

```
Verify each finding against the current code and only fix it if needed.

In `@docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md` at
line 26, fix the typo in the markdown header by removing the duplicated word:
replace the header string "**Environment variables for for Analyzer:**" with
"**Environment variables for Analyzer:**" (look for that exact header text in
S3CloudStorage.md and update it).
```

```bash
x-analyzer-environment: &common-analyzer-environment
DATASTORE_REGION: us-east-1
DATASTORE_ACCESSKEY: "<access_key>"
DATASTORE_SECRETKEY: "<secret_key>"
DATASTORE_BUCKETPREFIX: prj-
DATASTORE_BUCKETPOSTFIX: ""
DATASTORE_DEFAULTBUCKETNAME: my-rp-docker-bucket/analyzer
```
Comment on lines +28 to +36
⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Scripts executed (repository: reportportal/docs):

```bash
#!/bin/bash
# Description: Verify DATASTORE_REGION and bucket name configuration patterns

# Search for DATASTORE_REGION usage patterns
rg -n -C2 'DATASTORE_REGION.*us-(east-1|standard)' --type=md --type=yaml

# Search for bucket name patterns with slashes
rg -n 'DATASTORE.*BUCKET.*/' --type=md --type=yaml

# Search for documentation about bucket prefix/postfix usage
rg -n -C3 'BUCKETPREFIX|BUCKETPOSTFIX' --type=md

# Get broader context around the snippet to understand the configuration sections
sed -n '17,50p' docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md

# Get the full file to understand documentation structure and clarifications
wc -l docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md

# Search for explanations about bucket naming conventions in the file
rg -n -C5 'bucket.*name|bucket.*notation|slash|path' docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md -i

# Search for any warnings or notes about S3 bucket name format restrictions
rg -n -C3 'S3.*name|bucket.*format|slash|cannot|invalid' docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md -i
```

Fix bucket name format and add clarifying comment for region format in Analyzer section.

Two issues require attention:

1. **Missing region format clarification:** line 30 uses `us-east-1` without a comment explaining why it differs from the JClouds format (`us-standard`) used in line 20. Add a clarifying comment like the one on line 20 or line 360.

2. **Invalid bucket name with slash:** line 35 (and line 374) contains `my-rp-docker-bucket/analyzer` for `DATASTORE_DEFAULTBUCKETNAME`. S3 bucket names cannot contain forward slashes per AWS bucket naming rules. The bucket name should be `my-rp-docker-bucket`, and the `/analyzer` path should be handled as an object key prefix using `DATASTORE_BUCKETPREFIX` or a similar mechanism (compare to line 361, which correctly shows only `my-rp-docker-bucket` for the bucket name).

🤖 Prompt for AI Agents

```
Verify each finding against the current code and only fix it if needed.

In `@docs/installation-steps-advanced/file-storage-options/S3CloudStorage.md`
around lines 28-36, update the Analyzer environment block to clarify region
format and correct the bucket name: add a comment beside DATASTORE_REGION
explaining it uses the AWS region format (e.g., us-east-1) as opposed to
JClouds' names, and change DATASTORE_DEFAULTBUCKETNAME from
"my-rp-docker-bucket/analyzer" to a valid bucket name "my-rp-docker-bucket"
(move the "/analyzer" path into the object key prefix mechanism such as
DATASTORE_BUCKETPREFIX, or document using DATASTORE_BUCKETPREFIX for object
path prefixes) so the S3 bucket name conforms to AWS naming rules.
```
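
Taken together, the comment's two fixes would make the Analyzer block look roughly like this. This is a sketch only: whether `/analyzer` belongs in `DATASTORE_BUCKETPREFIX` or some other key-prefix setting is an assumption to verify against the storage service, not something this diff confirms:

```yaml
x-analyzer-environment: &common-analyzer-environment
  DATASTORE_REGION: us-east-1 # AWS region format; API/Jobs/Authorization use the JClouds alias `us-standard`
  DATASTORE_ACCESSKEY: "<access_key>"
  DATASTORE_SECRETKEY: "<secret_key>"
  DATASTORE_BUCKETPREFIX: prj-
  DATASTORE_BUCKETPOSTFIX: ""
  DATASTORE_DEFAULTBUCKETNAME: my-rp-docker-bucket # bucket name only; "/analyzer" would violate AWS naming rules
```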


## IAM Role-based authentication

### Amazon EKS-based
@@ -341,11 +353,25 @@

In your `docker-compose.yml`, configure ReportPortal to use IAM-based S3 access:
```yaml
x-environment: &common-environment
  # IAM Role-Based S3 Access - Leave credentials empty
  DATASTORE_ACCESSKEY: "" # Leave empty for IAM Role-based access
  DATASTORE_SECRETKEY: "" # Leave empty for IAM Role-based access
  RP_FEATURE_FLAGS: singleBucket
  DATASTORE_TYPE: s3 # Enable single-bucket storage (necessary for Amazon S3)
  DATASTORE_REGION: us-standard # JClouds alias for us-east-1
  DATASTORE_DEFAULTBUCKETNAME: my-rp-docker-bucket

x-analyzer-environment: &common-analyzer-environment
  LOGGING_LEVEL: info
  AMQP_EXCHANGE_NAME: analyzer-default
  AMQP_VIRTUAL_HOST: analyzer
  AMQP_URL: amqp://${RABBITMQ_DEFAULT_USER-rabbitmq}:${RABBITMQ_DEFAULT_PASS-rabbitmq}@rabbitmq:5672
  ES_HOSTS: http://opensearch:9200
  DATASTORE_REGION: us-east-1
  DATASTORE_ACCESSKEY: "" # Leave empty for IAM Role-based access
  DATASTORE_SECRETKEY: "" # Leave empty for IAM Role-based access
  DATASTORE_BUCKETPREFIX: prj-
  DATASTORE_BUCKETPOSTFIX: ""
  DATASTORE_DEFAULTBUCKETNAME: my-rp-docker-bucket/analyzer
```
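
Since review flagged `DATASTORE_DEFAULTBUCKETNAME` values containing a slash, a quick local check against an approximation of the AWS bucket naming rules can catch such values before deployment. The function name and regex below are illustrative, not part of ReportPortal:

```shell
#!/bin/bash
# Approximate AWS S3 bucket naming rules: 3-63 characters, lowercase letters,
# digits, dots and hyphens; must start and end with a letter or digit; no "/".
is_valid_bucket_name() {
  [[ "$1" =~ ^[a-z0-9]([a-z0-9.-]{1,61}[a-z0-9])?$ ]]
}

is_valid_bucket_name "my-rp-docker-bucket" && echo "ok: my-rp-docker-bucket"
is_valid_bucket_name "my-rp-docker-bucket/analyzer" || echo "invalid: slash not allowed"
```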

:::note