Commit 853a0bf

move "Processing the exported data" section below "Bulk data files" section

1 parent 1417f5a

1 file changed: source/includes/_bulk_data.md.erb (13 additions, 12 deletions)
@@ -38,18 +38,6 @@ Bulk data webhooks should be automatically included when adding a new webhook en
 
 Your system should listen for those webhooks to know when and where to get the exported data.
 
-## Processing the exported data
-
-It's possible to consume the Bulk Data API in its underlying format, as CSV files in an S3 bucket, or as a higher-level
-HTTPS Webhook API that is not specific to AWS or S3. Many data warehouse integration technologies like [BigQuery S3 Transfers](https://cloud.google.com/bigquery/docs/s3-transfer),
-[Airbyte](https://docs.airbyte.com/integrations/sources/s3/) or [AWS Glue](https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/build-an-etl-service-pipeline-to-load-data-incrementally-from-amazon-s3-to-amazon-redshift-using-aws-glue.html)
-can natively process files in S3 buckets. However, if you are using a different technology or want to implement a custom integration,
-you can use our webhook events to get the same data in a cloud-platform-agnostic way.
-
-We provide a [ControlShift to Redshift Pipeline](#bulk-data-controlshift-to-redshift-pipeline) as sample code demonstrating how to use the high-level webhooks to mirror your ControlShift data into Redshift.
-Similar strategies can be used to mirror your data into other data warehouses. We've designed the underlying APIs to work flexibly regardless of
-your technical architecture. Since we expose the file events as standard HTTPS webhooks, they should be compatible with any programming language.
 ## Bulk data files
 
 Each table exposed by the bulk data API is made available as a CSV file, with the URL to download each file sent via webhook.
@@ -81,6 +69,19 @@ Finally, when the compression for data exports is enabled the filename includes
 When the **Compress bulk data exports** option is enabled (available at the Webhooks integration page), incremental and nightly bulk data export files will be compressed in [`bzip2` format](https://sourceware.org/bzip2/). This will improve performance when fetching the files from S3 since they will be considerably smaller.
 
 
+## Processing the exported data
+
+It's possible to consume the Bulk Data API in its underlying format, as CSV files in an S3 bucket, or as a higher-level
+HTTPS Webhook API that is not specific to AWS or S3. Many data warehouse integration technologies like [BigQuery S3 Transfers](https://cloud.google.com/bigquery/docs/s3-transfer),
+[Airbyte](https://docs.airbyte.com/integrations/sources/s3/) or [AWS Glue](https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/build-an-etl-service-pipeline-to-load-data-incrementally-from-amazon-s3-to-amazon-redshift-using-aws-glue.html)
+can natively process files in S3 buckets. However, if you are using a different technology or want to implement a custom integration,
+you can use our webhook events to get the same data in a cloud-platform-agnostic way.
+
+We provide a [ControlShift to Redshift Pipeline](#bulk-data-controlshift-to-redshift-pipeline) as sample code demonstrating how to use the high-level webhooks to mirror your ControlShift data into Redshift.
+Similar strategies can be used to mirror your data into other data warehouses. We've designed the underlying APIs to work flexibly regardless of
+your technical architecture. Since we expose the file events as standard HTTPS webhooks, they should be compatible with any programming language.
+
+
 ## Data schemas
 
 The bulk data webhooks include exports of the following tables:
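The moved section describes listening for webhook events that point at exported CSV files, which may be bzip2-compressed when the **Compress bulk data exports** option is on. A minimal, cloud-platform-agnostic sketch of that consumer in Python — the `data`/`table`/`url` payload keys are illustrative assumptions, not the documented ControlShift webhook schema:

```python
import bz2
import csv
import io


def extract_download(payload: dict) -> tuple[str, str]:
    """Pull the table name and CSV download URL out of a bulk data
    webhook payload. The "data", "table" and "url" keys are assumed
    for illustration; check the actual webhook payload shape."""
    data = payload["data"]
    return data["table"], data["url"]


def parse_export(raw: bytes, compressed: bool = False) -> list[dict]:
    """Decode a fetched export into row dicts, decompressing bzip2
    first when the compress-exports option produced a .bz2 file."""
    if compressed:
        raw = bz2.decompress(raw)
    return list(csv.DictReader(io.StringIO(raw.decode("utf-8"))))


# Example: handle a (hypothetical) export event, then parse the file
# as if it had been fetched from the URL in the payload.
payload = {"data": {"table": "signatures",
                    "url": "https://example.com/signatures.csv.bz2"}}
table, url = extract_download(payload)
rows = parse_export(bz2.compress(b"id,email\n1,a@example.com\n"),
                    compressed=True)
```

Because the events arrive as standard HTTPS webhooks, the same two steps (read URL from payload, fetch and decode the file) work from any language or scheduler, without AWS-specific tooling.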
