In this section we'll extract the Sales Order Headers using an ABAP CDS View and the SAP ODP Adapter.
The ABAP CDS View can be found here. Note the annotation that defines the SQL view name under which the CDS View can be found in the SAP Data Dictionary (transaction SE11 or SE11n), and the annotations for Data Extraction and Delta Enablement. Here you can see that the field 'LastChangeDateTime' is used by the ODP adapter for delta retrievals.
@AbapCatalog.sqlViewName: 'ZBD_ISALESDOC_E1'
@Analytics.dataExtraction.enabled: true
@Analytics.dataExtraction.delta.byElement.name:'LastChangeDateTime'
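To show how these annotations sit in context, below is a minimal sketch of a CDS view definition using them. It is illustrative only: the select list and the underlying data source are assumptions, and the actual ZBD_ISALESDOC_E source linked above contains more fields and joins.
@AbapCatalog.sqlViewName: 'ZBD_ISALESDOC_E1'
@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'Sales Document Header - Extraction'
@Analytics.dataExtraction.enabled: true
@Analytics.dataExtraction.delta.byElement.name: 'LastChangeDateTime'
define view ZBD_ISALESDOC_E
  as select from I_SalesDocument
{
  key SalesDocument,
      SalesDocumentType,
      CreationDate,
      CreationTime,
      SoldToParty,
      TotalNetAmount,
      TransactionCurrency,
      LastChangeDate,
      LastChangeDateTime
}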
The extracted Sales Order Headers will be saved in a SQL table within the Synapse SQL Pool. We will begin by creating this table using a SQL script.
- In the Azure Portal, select your Synapse Workspace.
- Select 'Open Synapse Studio'
- Select 'Develop'
- Create SQL Script
Note: Make sure to change the 'Connect to' value from 'builtin' to your own SQL pool, as shown in the screenshot below; by default the script will be connected to the 'builtin' SQL pool of Synapse.
CREATE TABLE SalesOrderHeaders(
BILLINGCOMPANYCODE nvarchar(4),
BILLINGDOCUMENTDATE date,
COUNTRY nvarchar(3),
CREATIONDATE date,
CREATIONTIME time,
CREDITCONTROLAREA nvarchar(4),
CUSTOMERACCOUNTGROUP nvarchar(4),
CUSTOMERGROUP nvarchar(2),
CUSTOMERNAME nvarchar(80),
DISTRIBUTIONCHANNEL nvarchar(2),
LASTCHANGEDATE date,
LASTCHANGEDATETIME decimal(21,0),
-- MANDT int,
ORGANIZATIONDIVISION nvarchar(2),
PRICINGDATE date,
PURCHASEORDERBYCUSTOMER nvarchar(35),
SALESDISTRICT nvarchar(6),
SALESDOCUMENT nvarchar(10) NOT NULL,
SALESDOCUMENTPROCESSINGTYPE nvarchar(1),
SALESDOCUMENTTYPE nvarchar(4),
SALESGROUP nvarchar(3),
SALESOFFICE nvarchar(4),
SALESORGANIZATION nvarchar(4),
SDDOCUMENTCATEGORY nvarchar(4),
SOLDTOPARTY nvarchar(10),
TOTALNETAMOUNT decimal(15, 2),
TRANSACTIONCURRENCY nvarchar(5),
CITYNAME nvarchar(35),
POSTALCODE nvarchar(10)
)

The Sales Order headers are extracted from SAP using the SAP ODP Adapter, which uses an RFC.
The CDS View to extract from is ZBD_ISALESDOC_E.
Note: You can have a look in the SAP system to check the contents. Use the Data Dictionary, transaction SE11.
- In Synapse Studio, go to the 'Manage' view
- Select 'Linked Services'
- Create a new Linked Service of type 'SAP CDC Connector'
- Enter the connection details for the SAP system. We used the name 'S4SCLNT100ODP'. Use the username and password for the SAP system provided to you at the start of the MicroHack.
- Use the Integration Runtime which you installed in the previous steps
- Enter a Subscriber Name. This name will also be used by ODP in the SAP System.
Note: use 'Test Connection' to verify your settings
Note: SAP connection details will be handed out before the MicroHack
Create an Integration DataSet based on the previously created Linked Service.
This dataset will act as the source.
- Switch to the 'Data' view
- Create a new Integration Dataset
- Use type 'SAP CDC'
- As a name we used 'S4SalesOrderHeadersODP' and for the linked service we used the one we just created, 'S4SCLNT100ODP'
- Since we'll be extracting from a CDS View, use 'ABAP_CDS' as ODP context
- Use 'ZBD_ISALESDOC_E1$E' as ODP name; it can take some time before the list of tables is loaded
- Use 'Preview Data' to check if the data can be retrieved
- Once the information is entered successfully and the data can be retrieved, leave the tab as-is. We will publish the changes after the rest of the components of this data flow are done.
Note: the source code of the CDS View can be found here
This will represent the target/sink of the pipeline.
- Switch to the 'Manage' view
- Create a new Linked Service of type 'Azure Synapse Analytics'. As name we used 'SynMicroHackPool'.
Note: Since this linked service represents the Synapse SQL pool, it will be re-used in the SalesOrderItems and Payments pipeline.
This dataset will act as the sink in our pipeline.
- Switch to the 'Data' view
- Create a new Integration Dataset for the Synapse Sales Orders
- As a name we used 'SynSalesOrderHeaders' and for the linked service we used the one we just created, 'SynMicroHackPool'
- Select the 'SalesOrderHeaders' table
- Again leave the information on the tab as-is and move to the next step
- Create a new Pipeline; we used 'ExtractSalesOrderHeaders' as a name
- Use the 'DataFlow' action (within 'Move & transform') by dragging it onto the pipeline canvas
- In the 'General' tab, change the 'Name'. We used 'ExtractSalesOrderHeaders' as a name.
- In the 'Settings' tab, change the 'Run on Azure IR' to 'AutoResolveIntegrationRuntime'.
- Enable 'Staging' and enter the path to the staging directory of your Azure Data Lake. The staging directory, 'sap-data-adls/staging', was already created by the Terraform script.
- Press '+ New' to create a new DataFlow
- In the DataFlow, change the name. We used 'ExtractSalesOrderHeadersDF'.
- Select 'Add Source'
- In 'Source settings':
  - Change the 'Output stream name'. We used 'S4SSalesOrderHeaders'.
  - As 'Dataset', select the ODP dataset you created previously.
- Turn on 'Data Flow Debug'
- In 'Source options', set 'Key Columns' to 'SALESDOCUMENT' (use the 'Refresh' button)
- In 'Projection', select 'Import projection'.
Note: Date fields like CREATIONDATE are detected as string.
- Now we need to do some date transformations. We'll do this by adding a 'Derived Column' step. Use '+' and then select 'Derived Column'.
- In 'Derived column's settings':
  - Set the 'Output stream name' to 'S4SSalesOrderHeadersUpd'.
  - Add the derived columns using the formulas beneath.
CREATIONDATE = toDate(CREATIONDATE, "yyyyMMdd")
PRICINGDATE = toDate(PRICINGDATE, "yyyyMMdd")
BILLINGDOCUMENTDATE = toDate(BILLINGDOCUMENTDATE, "yyyyMMdd")
LASTCHANGEDATE = toDate(LASTCHANGEDATE, "yyyyMMdd")
CREATIONTIME = toTimestamp(concatWS(" ", CREATIONDATE, CREATIONTIME), "yyyyMMdd HHmmss")
Note: the expressions refer to the incoming (string) columns of the source stream, so the CREATIONTIME formula concatenates the original CREATIONDATE and CREATIONTIME values before converting them to a timestamp.
- Use the '+' button to add the next step to the data flow
- Select 'Sink'
- In the 'Sink' tab:
  - Change the 'Output stream name'; we used 'SynSalesOrderHeaders'.
  - Point the 'Dataset' to your dataset linked to the Synapse Sales Order Headers.
- In the 'Settings' tab:
  - Verify that 'Allow insert', 'Allow delete', 'Allow upsert' and 'Allow update' are checked.
  - Key Columns: select 'SALESDOCUMENT'; this is the key column of the SalesOrderHeaders table in Synapse (conceptually the upsert works like the MERGE sketch shown further below).
- In the 'Mapping' tab, disable 'Auto Mapping' to cross-check the suggested mapping
- Now 'Publish all' and, once this is successful, trigger the pipeline: use 'Add trigger' -> 'Trigger now' -> 'OK'
- Switch to the 'Monitor' view to monitor the pipeline run
- Check the result in Synapse using SQL. You can do this via the 'Develop' view by creating a new SQL script.
select count(*) from SalesOrderHeaders
select * from SalesOrderHeaders

Note: In the SAP backend you can use transaction ODQMON ('Monitor for Operational Delta Queue') to monitor the ODP extractions.
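Because 'Allow upsert' is enabled on the sink with key column SALESDOCUMENT, re-running the pipeline updates existing rows instead of inserting duplicates. Conceptually this behaves like a SQL MERGE on the key column. The sketch below is illustrative only: the staging table name is hypothetical and only a few columns are shown; the actual statements are generated by the data flow.
MERGE SalesOrderHeaders AS tgt
USING StagedSalesOrderHeaders AS src            -- hypothetical staging set written by the data flow
    ON tgt.SALESDOCUMENT = src.SALESDOCUMENT    -- the key column configured on the sink
WHEN MATCHED THEN
    UPDATE SET tgt.PURCHASEORDERBYCUSTOMER = src.PURCHASEORDERBYCUSTOMER,
               tgt.LASTCHANGEDATE          = src.LASTCHANGEDATE,
               tgt.LASTCHANGEDATETIME      = src.LASTCHANGEDATETIME
WHEN NOT MATCHED BY TARGET THEN
    INSERT (SALESDOCUMENT, PURCHASEORDERBYCUSTOMER, LASTCHANGEDATE, LASTCHANGEDATETIME)
    VALUES (src.SALESDOCUMENT, src.PURCHASEORDERBYCUSTOMER, src.LASTCHANGEDATE, src.LASTCHANGEDATETIME);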
Since our ODP connector (and CDS View) allows for delta changes, you can change a Sales Order.
- Use transaction VA02 ('Change Sales Order')
- Change the 'Cust. Reference' field in the Sales Order Header
- Rerun the extraction pipeline
- Change the SalesDocument in the SQL script beneath to the changed Sales Order

select PURCHASEORDERBYCUSTOMER from SalesOrderHeaders WHERE SalesDocument = '0000000004'

- Verify the result by running the SQL script
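If you don't remember which Sales Order you changed, you can also list the most recently changed documents; since LASTCHANGEDATETIME drives the delta, the newest changes appear on top (optional check):

select top 10 SALESDOCUMENT, PURCHASEORDERBYCUSTOMER, LASTCHANGEDATE, LASTCHANGEDATETIME
from SalesOrderHeaders
order by LASTCHANGEDATETIME desc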
You can now continue with Extracting Sales Order Line items