
Simulating a data upload, either before or after the data has been uploaded to the target, can be useful in order to analyse the data, not only in case of an error, but also if the data in the data target is wrong.

Restrictions:

  • DTPs with processing mode 'SAP HANA Execution': a simulation is possible, but since the steps are executed on the database, no information will be provided
  • DTPs for direct access (REMT)

  

OPTIONS

Simulation from the DTP screen:

Open the DTP and select the processing mode 'Serially in the Dialog Process (for Debugging)'.
 

Here you have two possibilities:

  • Simulate the upload and set breakpoints: breakpoints can be set for the different steps of the upload and per data package, but no filter criteria can be set and no temporary storage can be selected. If a lot of data is extracted, the simulation can take a long time, and a large amount of data within a data package can make the analysis difficult.
  • Simulate the upload in expert mode by setting the corresponding flag: filter criteria can be selected in order to simulate only a few records. The temporary storage can also be useful, as it lets you analyse, for example, in which step the data becomes incorrect.

Options which can be set if you simulate the upload in expert mode:

 RESULT:

A simulation will be executed serially with one dialog process.

In general, this kind of simulation is used if you have not uploaded the data yet.

If you have already uploaded the data with the DTP and an error occurred, or the data is wrong after the upload, use the simulation from the DTP monitor.

Simulation from the DTP monitor

Call the request from the manage screen of the appropriate InfoProvider or, if you know the request number, call the monitor directly via transaction RSRQ.

Press the 'Debugging' button.

You will get the same screens as when you select expert mode as described above. The only difference is that the selection criteria, including the request number that was used for the upload, are automatically added to the filter criteria. These criteria can be changed by adding or removing fields and values.

 

Analysis of wrong data in the data target:

Useful processes/tools

Temporary storage:

If you select the temporary storage, you can analyse the data after each step:

STEPS

  1. Simulate the upload in expert mode or from the request monitor as described above.
  2. Switch on the temporary storage and check the data after each main step (After Extraction, After Error Handling, After Transformation).

 


 

Although the monitor provides a temporary storage icon for each substep, the data is the same as for the main step: the system only provides the data as selected in the selection screen.

 

Example: in the transformation, an end routine exists which sets the constant 'ENDROUTINE' for the field 'vendor'.

Even if you select the temporary storage for one of the substeps, you will see in the temporary storage the result after the complete transformation, not the data before or after the start routine.
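Such an end routine would look roughly as follows (a minimal sketch only, assuming the usual naming in generated BW transformation routine classes; the target structure type _ty_s_TG_1 and the field name VENDOR are assumptions and depend on your transformation):

```abap
METHOD end_routine.
* Sketch: overwrite the field VENDOR with the constant 'ENDROUTINE'
* for every record of the result package.
* _ty_s_TG_1 is the generated target structure type (assumed name).
  FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.

  LOOP AT result_package ASSIGNING <result_fields>.
    <result_fields>-vendor = 'ENDROUTINE'.
  ENDLOOP.
ENDMETHOD.
```

Because the end routine runs last, its result is what you see in the temporary storage after the transformation step.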

 

Depending on whether, and after which step, the data is wrong, the following checks can be useful:

Incorrect data in the temporary storage after the extraction:

Check the data in the PSA table with the same filter criteria

You can display the table directly via transaction SE16 if you know the name of the PSA table (/BIC/B*xxxxxxx).

If the data is already wrong in the PSA table, you need to analyse the data in your source system. In the case of an SAP source system, you can use transaction RSA3.

 

Incorrect data in the temporary storage after error handling:
Data can be filtered out if error handling is switched on and data with the same key is already available in the error stack.

How to identify whether there is data in the error stack:

Call the appropriate DTP and press the button 'Display Error Stack' (F7).

All requests available in the error stack will be displayed. If you want to check the data of a specific request, call the monitor for the request using transaction RSRQ and press the error stack button.

 

Incorrect data in the temporary storage after the transformation:

Check the transformation; review the start routine, end routine and rule routines.

The result of a specific rule can be checked in the transformation by using the rule test:

Open the detail view of the appropriate rule and select the button 'Test Rule'.

Maintain a value for the source fields and press 'Execute'. The system generates a test environment and determines the result of the defined rule.

 

Debugging of the DTP execution

If you use expert mode in the DTP screen or if you debug the upload via the request monitor, you have the possibility to set several breakpoints:

Alternatively, you can set a breakpoint at method PROCESS_REQUEST of class CL_RSBK_PROCESS, as this method is called for every data package and includes the call of the extractor, the transformation and the update into the data target.
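Conceptually, the processing at that point can be pictured like this (illustrative ABAP-style pseudocode only, not the actual SAP coding of CL_RSBK_PROCESS; it merely shows why one breakpoint in PROCESS_REQUEST lets you reach all three steps for every data package):

```abap
* Illustrative pseudocode - not the real CL_RSBK_PROCESS implementation.
* PROCESS_REQUEST is called per data package and triggers all steps:
LOOP AT l_t_datapackages INTO l_s_datapackage.
  extract( l_s_datapackage ).    " call of the extractor
  transform( l_s_datapackage ).  " transformation incl. routines
  update( l_s_datapackage ).     " update into the data target
ENDLOOP.
```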

 

 

DTP Extraction

Depending on the source, different classes will be called.
Some examples from the template class CL_RSBK_CMD_X:

 

Source (from where data will be extracted)    Class which will be called for extraction
DataSource                                    CL_RSDD_X_MULTIPROVIDER_CMD
Cube                                          CL_RSDD_X_INFOCUBE_CMD
Classic DSO                                   CL_RSDD_X_DS_CMD
MultiProvider                                 CL_RSDD_X_MULTIPROVIDER_CMD
Master data                                   CL_RSDMD_IOBJ_X_CMD

 

DTP Transformation

The transformation will be called twice:

  1. For the error stack, in order to filter out data in case there is already data with the same key in the error stack
  2. For the defined transformation (start routine, defined rules, end routine or expert routine)

 

The generated program for the transformation will be created when the transformation is activated. It can be displayed in the transformation screen under the menu Extras --> Display Generated Program.

 

 
