File: `source/includes/_bulk_data.md.erb`
Bulk data webhooks should be automatically included when adding a new webhook endpoint.
Your system should listen for those webhooks to know when and where to get the exported data.
## Bulk data files
Each table exposed by the bulk data API is made available as a CSV file, with the URL to download each file sent via webhook.
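For example, a receiving endpoint only needs to read the table name and file URL out of each event before fetching the CSV. A minimal sketch, assuming a JSON payload with `table` and `url` fields (the field names here are illustrative, not the exact webhook schema):

```python
import json

def handle_bulk_data_webhook(body: str) -> tuple[str, str]:
    """Parse a bulk data webhook body and return (table name, download URL).

    The payload shape is an assumption for illustration; consult the
    webhook documentation for the real field names.
    """
    event = json.loads(body)
    return event["table"], event["url"]

# A hypothetical event body as it might arrive over HTTPS:
body = '{"table": "signatures", "url": "https://example.s3.amazonaws.com/signatures.csv"}'
table, url = handle_bulk_data_webhook(body)
```

Your endpoint would then download `url` with any HTTP client and load the CSV into your own storage.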
Finally, when compression for data exports is enabled, the filename includes a `.bz2` extension.
When the **Compress bulk data exports** option is enabled (available on the Webhooks integration page), incremental and nightly bulk data export files will be compressed in [`bzip2` format](https://sourceware.org/bzip2/). This improves the performance of fetching the files from S3, since they are considerably smaller.
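Because `bzip2` is a widely supported format, decompressing the fetched files is straightforward in most languages. A minimal sketch in Python's standard library, using an in-memory payload to stand in for a downloaded `.bz2` export:

```python
import bz2
import csv
import io

# Stand-in for the bytes of a downloaded, compressed export file.
compressed = bz2.compress(b"id,name\n1,Alice\n2,Bob\n")

# Decompress and parse the CSV rows.
text = bz2.decompress(compressed).decode("utf-8")
rows = list(csv.reader(io.StringIO(text)))
```

For very large files, `bz2.open()` can decompress incrementally instead of holding the whole file in memory.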
## Processing the exported data

It's possible to consume the Bulk Data API in its underlying format, as CSV files in an S3 bucket, or through a higher-level HTTPS webhook API that is not specific to AWS or S3. Many data warehouse integration technologies, like [BigQuery S3 Transfers](https://cloud.google.com/bigquery/docs/s3-transfer), [Airbyte](https://docs.airbyte.com/integrations/sources/s3/) or [AWS Glue](https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/build-an-etl-service-pipeline-to-load-data-incrementally-from-amazon-s3-to-amazon-redshift-using-aws-glue.html), can natively process files in S3 buckets. However, if you are using a different technology or want to implement a custom integration, you can use our webhook events to get the same data in a cloud-platform-agnostic way.

We provide a [ControlShift to Redshift Pipeline](#bulk-data-controlshift-to-redshift-pipeline) as sample code demonstrating how to use the high-level webhooks to mirror your ControlShift data into Redshift. Similar strategies can be used to mirror your data into other data warehouses. We've designed the underlying APIs to work flexibly regardless of your technical architecture; since we expose the file events as standard HTTPS webhooks, they should be compatible with any programming language.
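The same event-driven approach works with any warehouse: receive the webhook, download the CSV it points to, and load the rows. A minimal, cloud-agnostic sketch, with an in-memory dict standing in for the warehouse (a real pipeline would issue a `COPY`/`LOAD` statement instead; the names here are illustrative, not part of the ControlShift API):

```python
import csv
import io

def load_table(table: str, csv_text: str, warehouse: dict) -> int:
    """Append the rows of one exported CSV file to a warehouse table."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    warehouse.setdefault(table, []).extend(rows)
    return len(rows)

warehouse = {}
# csv_text would normally be downloaded from the URL in the webhook event.
loaded = load_table("petitions", "id,title\n1,Save the park\n", warehouse)
```

The dispatch-by-table-name pattern shown here is what lets one generic handler mirror every exported table without per-table code.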
## Data schemas
The bulk data webhooks include exports of the following tables: