HCI-DS Best Practices, Common Mistakes and Debugging Tips
When Creating Dataflows:
• Use consistent naming for tasks, dataflows and column names (don’t forget descriptions!)
• Avoid unnecessary data extraction/movement by filtering data in the first query of the dataflow and selecting only the columns from the source that are required.
• Avoid using multiple queries when you can accomplish the same result in a single query.
• For loading S&OP: load dependent master data and key figures in one task to ensure load order is correct.
• Auto-create file format definitions based on table structures to avoid the manual work of defining columns.
For Extracting from SAP ERP or APO:
• Use the “ABAP query” and “ABAP aggregation” transforms for joining tables and extractors, and for aggregation. These push the processing down to ABAP, which gives better performance.
• Use Business Content Extractors when available for simple delta loads and consistent data.
Use Global Variables to parameterize your tasks, so that you can easily change parameters without needing to go into dataflows/mappings.
· Best Practice: Name global variables starting with “$G_”.
Example: $G_LOAD_DATE (the timestamp indicating when the task was started).
Values for global variables can be initialized in three places within the task.
· Execution Properties
These settings are part of the task and must be changed by editing the task. If not overridden elsewhere, as described below, these are the values used at task run time.
· Run Task Now
You can override the values from Execution Properties in the Run Task Now popup window. These are one-time settings used only for this specific run of the task.
· Preload Script
You can override the values from both Execution Properties and Run Task Now popup window in the Preload Script. These settings are also a part of the task and must be changed by editing the task.
· If the values of your global variables do not change, it is best to set them in the Preload Script. This ensures they will not be changed when the task executes; Preload Script values override any settings in the Execution Properties or the Run Task Now window.
· If the value of a global variable can change with each task execution, it is best to set a default value in the Execution Properties, and then set the value for the current execution in the Run Task Now window.
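The override order above can be sketched in Python. This is illustrative only; the variable name and values are assumptions, not actual HCI-DS settings:

```python
# Illustrative sketch of the precedence described above, NOT HCI-DS code:
# Execution Properties is the baseline, Run Task Now overrides it for a
# single run, and the Preload Script overrides both.

def resolve_globals(execution_properties, run_task_now, preload_script):
    """Apply the three layers in order of increasing precedence."""
    resolved = dict(execution_properties)
    resolved.update(run_task_now)
    resolved.update(preload_script)
    return resolved

# Hypothetical values for the $G_LOAD_DATE global variable:
defaults = {"$G_LOAD_DATE": "2024-01-01 00:00:00"}   # Execution Properties
one_time = {"$G_LOAD_DATE": "2024-06-15 08:30:00"}   # Run Task Now popup
preload  = {}                                        # Preload Script sets nothing

print(resolve_globals(defaults, one_time, preload))
# {'$G_LOAD_DATE': '2024-06-15 08:30:00'}  (Run Task Now wins here)
```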
The two most common mistakes when building a dataflow that loads data to S&OP occur when mapping columns from the input table(s) to the output table: the user does not ‘Map’ mandatory columns via ‘Transform Details’. The columns most often missed, which must be mapped manually (hard-coded), are:
FILENAME… Required on all S&OP jobs. The name of the target stage table is passed via the FILENAME column, so map this column to that table name; remember to enclose the value in single quotes to indicate that it is constant text.
ID… Required only on Key Figure jobs. This column must be mapped to the gen_uuid() function.
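A minimal Python sketch of what these two manual mappings accomplish. The table and column names below are hypothetical, and HCI-DS itself expresses this in the Transform Details mapping, not in Python:

```python
import uuid

TARGET_STAGE_TABLE = "SOPMD_STAG_PRODUCT"  # hypothetical target stage table name

def map_mandatory_columns(source_row):
    """Emulate the two hard-coded mappings: FILENAME gets the target stage
    table name as constant text, and ID gets a fresh UUID per row (the
    Python analogue of mapping ID to gen_uuid())."""
    out = dict(source_row)
    out["FILENAME"] = TARGET_STAGE_TABLE
    out["ID"] = str(uuid.uuid4())
    return out

row = map_mandatory_columns({"PRDID": "P-100"})
print(row["FILENAME"])  # SOPMD_STAG_PRODUCT
```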
Note: The other columns you see in all S&OP tables should never be mapped; leave them unmapped.
Test Connection: The best tip for debugging a failed job is to be proactive during file selection for the Source and Target early in building a task. This proactive step prevents many potential errors when running your job. The step is simple: test your data connections when selecting your source file and when selecting your target files. Highlight the file you want to use and click the ‘Test Connection’ button. The system will either report an error or state that the connection was successful. If an error occurs, review your datastore files and your Agent for proper access and settings.
Privilege Issue: If your ‘Error Log’ states a ‘privilege’ problem with a certain file/system, look at your “Agent” and check the settings for that file/system to see whether they were entered correctly.
Common Issues: a list of issues encountered when loading S&OP data, with place(s) to look to resolve each.
Issue: In the S&OP data load status, information such as the filename and master data type does not show up, even though the job ran successfully in HCI.
Place(s) to look: Check whether the filename in the target table is mapped correctly in HCI. (The filename is the target table name displayed above the target table in the mapping screen in HCI.)
Issue: The number of records loaded into S&OP through HCI does not match the number of records coming into HCI from the source.
Place(s) to look: Check the Monitor Log tab in the job history in HCI to see how many records HCI processed. If this does not match the source count, check the datastore configuration (especially the 'Options' tab) to verify that qualifiers, delimiters, etc. are set correctly. Then check the number of records processed for that job in the S&OP data load status.
Issue: The HCI job failed because the data in one or more columns does not match the assigned data type.
Place(s) to look: Check the datastore configuration (especially the 'Options' tab) to verify that qualifiers, delimiters, etc. are set correctly, and/or remove special characters that arrive in the feed. Special characters can be removed in HCI with a code snippet in the mapping; the snippet strips the special characters that are not allowed in S&OP.
Note: If more than one column is suspected of containing special characters, the snippet must be included in the mapping of each of those columns.
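The snippet itself lives in the HCI mapping; as a hedged illustration of the idea, here is a Python equivalent. The allowed character set below is an assumption; adjust it to what your S&OP system actually accepts:

```python
import re

# Keep only characters assumed to be safe for S&OP loads; everything
# else is stripped. The allowed set here is illustrative, not official.
ALLOWED = re.compile(r"[^A-Za-z0-9 ._\-]")

def strip_special_chars(value: str) -> str:
    """Remove disallowed special characters from one column value."""
    return ALLOWED.sub("", value)

print(strip_special_chars("Product#1 (EU)"))  # Product1 EU
```

As the note above says, this logic must be applied per column: each suspect column's mapping needs its own copy of the snippet.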
Also, set the data type of the transactional data (not the attributes) in the datastore to double instead of decimal, and cast it to decimal in the mapping. There is a high possibility that a feed declared as decimal has been converted into scientific format, which a double accepts but a decimal may reject.
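Why the double-then-cast approach works can be illustrated in Python (the value is hypothetical; in HCI-DS the cast is done in the mapping, not in Python):

```python
from decimal import Decimal

raw = "1.25E+6"  # a key figure that arrived in scientific format

# A decimal column type in the datastore may reject scientific notation,
# but a double reads it without complaint...
as_double = float(raw)

# ...and can then be cast to a decimal for the S&OP load.
as_decimal = Decimal(str(as_double))

print(as_double, as_decimal)  # 1250000.0 1250000.0
```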
Issue: Records failing in post-processing.
Place(s) to look: This is most likely an issue with the feed coming into HCI from the source. Check the source feed for the following:
A) No duplicate records.
B) For a transactional data feed, the master data and master data combinations (if applicable) referenced by the transactional records must already exist in S&OP.
C) For transactional data, the feed must come into HCI with key figure dates; each date must fall within the time profile defined/loaded in the system and be in YYYY-MM-DD format.
D) If the feed is too big, break it into chunks of 10 million records each and make sure the 'Status check duration' in HCI is set appropriately.
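Checks A through D can be sketched as a pre-validation pass over the source feed. This is Python for illustration only; the column names, key fields, and time-profile bounds are assumptions:

```python
from datetime import datetime, date

CHUNK_SIZE = 10_000_000  # item D: break very large feeds into ~10M-record chunks

def validate_feed(rows, known_products, period_start, period_end):
    """Return rows passing checks A-C; duplicates and invalid rows drop out."""
    seen, clean = set(), []
    for row in rows:
        key = (row["PRDID"], row["KEYFIGUREDATE"])       # assumed key fields
        if key in seen:                                  # A: duplicate record
            continue
        seen.add(key)
        if row["PRDID"] not in known_products:           # B: master data must exist
            continue
        try:                                             # C: YYYY-MM-DD format...
            d = datetime.strptime(row["KEYFIGUREDATE"], "%Y-%m-%d").date()
        except ValueError:
            continue
        if not (period_start <= d <= period_end):        # ...and inside time profile
            continue
        clean.append(row)
    return clean

def chunked(rows, size=CHUNK_SIZE):                      # D: load in chunks
    for i in range(0, len(rows), size):
        yield rows[i:i + size]
```

Running such a pass before the load makes post-processing failures far easier to trace back to a specific record in the source feed.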