Summary

How the load job created by the Index Designer is used.

By default, the Index Designer adds three important jobs to the Data Services Designer:

  • create_job: Consists of one script that creates the cube in BWA.
  • delete_job: A script to delete a cube in BWA. Executed before the create_job if fundamental changes to the model have been made.
  • load_job: The job copying the data from the Data Warehouse into BWA.

The create and delete jobs each contain only a single Data Services script, which calls the create_trex_cube() or delete_trex_cube() function respectively. These functions are native to Data Services and interact with the BWA system.
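
To make this concrete, the script body of the create_job boils down to a single function call along the lines of the sketch below. The datastore and cube names used here are hypothetical placeholders, and the actual parameter list of create_trex_cube() as generated by the Index Designer may differ.

    # Data Services script inside create_job (sketch; parameters are illustrative assumptions)
    # Creates the cube structure in BWA so that the load_job can fill it with data.
    create_trex_cube('BWA_DATASTORE', 'SALES_CUBE');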

Useful Information

The delete_trex_cube() function only marks a cube in BWA as to-be-deleted and returns immediately. Do not execute the create_trex_cube() function directly afterwards; give BWA a few seconds to actually remove the cube physically.
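
If the cube has to be deleted and recreated in the same run, for example after a model change, a short wait between the two calls avoids this race. A minimal sketch, reusing the hypothetical parameters from above and the standard Data Services sleep() function (milliseconds):

    # Sketch only; the exact parameter lists of the *_trex_cube() functions are assumptions
    delete_trex_cube('BWA_DATASTORE', 'SALES_CUBE');   # only marks the cube as to-be-deleted, returns immediately
    sleep(10000);                                      # give BWA ~10 seconds to remove the cube physically
    create_trex_cube('BWA_DATASTORE', 'SALES_CUBE');   # recreate the cube from scratch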

The load_job itself consists of the following steps:

  • truncate: a script calling the truncate_table() function; this removes all data of the cube, not from the fact table or the dimension tables alone but from both.
  • begin: the begin_trex_transaction() function is called for the cube and its return value is stored in a global variable called $TREXTID. This transaction ID has to be passed to all table loaders (see the sketch after this list).
  • load_wf workflow: a regular workflow object containing all dataflows that load the BWA tables.
  • commit: calls the commit_trex_transaction() function to commit the data in BWA.
  • And finally a catch block, so that rollback_trex_transaction() is called automatically if any of the dataflows fails.
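
The begin and commit scripts around the load_wf workflow can be pictured roughly as below. $TREXTID is the global variable mentioned above; the parameter lists of the *_trex_transaction() functions are shown as hypothetical placeholders, not the documented signatures.

    # begin script: open a BWA transaction and remember its ID for all table loaders
    $TREXTID = begin_trex_transaction('BWA_DATASTORE', 'SALES_CUBE');

    # commit script, executed only after the load_wf workflow finished successfully
    commit_trex_transaction($TREXTID);

    # catch block script: undo the load if any dataflow failed
    rollback_trex_transaction($TREXTID);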


The load_wf workflow is the only part where data is actually loaded; it runs the dataflows for all BWA target tables in parallel. The dataflows themselves are again very simple, as we are working under the premise that the DWH tables should be copied to BWA as-is. Hence all dataflows look alike: the source DWH table, a query with a 1:1 mapping, and the BWA target table. So everything is as usual, except that the BWA table loader has to have the include-in-transaction flag set to yes and, as the TransactionID, the value returned by the begin_trex_transaction() function, i.e. the value we have assigned to the global variable $TREXTID.



This job can be executed right away and will load all the data into the BWA system.

(see Refining the BWA load for typical adjustments made to the generated job)


 
