This document covers frequently asked questions from Consultants, Partners and Customers about SAP Integrated Business Planning Integration using CPI-DS. The purpose of this document is to cover important questions that come up during IBP Implementations in application integration areas like ERP, BW, Back Integration using RFC Enabled Modules, Microsoft Azure, and APO Integration. If you have additional questions that need to be added, please contact the SAP IBP Customer Engagement Team (firstname.lastname@example.org).
Q: Is there any alternative to CPI-DS which SAP provides for integration?
A: No. CPI-DS is the integration tool for time-series-based planning, and SDI is for order-based planning (Finding the Right Integration Tool)
Q: What is difference between CPI-DS and CPI-PI?
A: CPI-DS and CPI-PI are different products and are not interchangeable: CPI-DS covers batch data integration, while CPI-PI covers message-based process integration.
Q: We currently have SAP ERP with the add-on installed on top of it, and we have developed the interfaces to IBP. When we upgrade from ERP to S/4HANA next year, will our interfaces need any changes?
A: It depends on how much the new S/4HANA implementation differs from ERP. If the extractors on the source system are identical, the CPI-DS dataflows and the datastores they use need few changes. If table names or structures change, the metadata needs to be imported again into the CPI-DS datastore, and you may have to rework the dataflows.
Q: What is this add-on?
A: See link (SAP S/4HANA, supply chain integration add-on for SAP IBP)
Q: Are these EXT tables valid for IBP Demand?
A: Yes, you can use them to get planning combinations and sales history from ECC or S/4.
Q: What ports are needed to be open for communication from S4 side?
A: An SAP application datastore connects via the ABAP layer; extraction is done through RFC reads, ABAP generation and execution, ODP (extractors), and BAPIs. It uses the standard ports of the application server; no additional ports are exposed, and the connection is established using the application server name, client, and system number.
Q: What is the best method of triggering these Data extractors on ECC and tasks on CPIDS in one shot?
A: The CPI-DS task executes the extractors as part of the task execution. There is no need to schedule an ECC job in advance.
Q: Can I have several CPI-DS agents in parallel pointing to the same IBP system?
A: The agent communicates with the CPI-DS server, and the CPI-DS server communicates with the IBP system. You can have multiple agents and group them under one agent group; when running a task, choose the group, and CPI-DS assigns the task to a specific agent in that group in a round-robin pattern.
Q: CPI-DS can connect to ECC and S/4 HANA only or more systems?
A: CPI-DS can connect to other systems as well, such as BW, databases, APO, etc. (SAP Cloud Platform Integration for data services - Product Availability Matrix (PAM))
Q: For order-based planning, can both CPI-DS and SDI be used, or only SDI?
A: SDI is mandatory for transactional (order-level) data and master data. For key-figure data such as forecast or allocation, you have to use CPI-DS.
Q: Can we use CPI-DS to pull forecast data from external system to IBP?
A: Yes. Key-figure data such as forecast or allocation is loaded with CPI-DS; SDI is mandatory only for transactional (order-level) data and master data.
Q: Do we have standard IBP API to pull forecast data from external systems?
A: Use CPI DS to connect to external systems (Finding the Right Integration Tool)
Q: Are spaces/blanks not allowed in a matnr for IBP?
A: Correct, they are not allowed; please check the IBP release restrictions note.
Q: Do we need to create one task in CPI to fetch data from S/4 and update in IBP or Should we create two tasks?
A: You need to create only one task.
Q: Is it possible to provide a delta load mechanism?
A: In general, CPI-DS needs an external record of changed data; it does not keep one on its own. Data never really lives in CPI-DS, it just passes through.
With tables, CPI-DS needs some kind of change history to work with. For instance, if there is a change-date column in the table, you can use it to filter down to just the records updated since the last job execution. For this purpose there are the save_data() and get_data() functions. In the post-load script, use save_data(systime()), which saves the end time of this job for the next execution. In a filter in the dataflow, filter on the last-changed date/time and use get_data() as the filter condition. (Change Data Capture - Delta Loads)
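As a sketch, the delta pattern described above can be put together like this; the column name CHANGED_ON is illustrative, and the exact parameters of save_data()/get_data() should be verified against the CPI-DS documentation for your release:

```
# Dataflow filter: keep only records changed since the last successful run
SOURCE.CHANGED_ON >= get_data()

# Post-load script: save this job's end time for the next execution
save_data(systime());
```

With this in place, each task execution processes only the records changed since the timestamp saved by the previous run.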
Q: We have implemented parallel processing by placing dataflows in parallel in a process chain. Is there an option to run multiple dataflows in one task in parallel?
A: No, only via a process. Note, however, that IBP post-processing is single-threaded by default.
Q: Can the agent be installed in the cloud?
A: Yes and no: the agent can run in a VM in the cloud, as long as the VM runs one of the supported operating systems. From the perspective of CPI-DS, that is still treated as "on premise".
Q: The CPI-DS task is mostly recommended with 'Insert/Update', but there may be scenarios where you need to delete old master data such as products or BOMs. How does that work with 'Insert/Update'? I believe you need to use 'Replace'?
A: You can use REPLACE, but you will have to be careful: REPLACE mode should always be used with a full load.
Q: Do we need to execute a job to update data into the planning area in IBP from the staging tables, or is it an automated step?
A: It is an automated step. Data is pushed after post-processing is finished.
Q: Does the integration consider the right sequence of integration, i.e. first the single MDTs and afterwards the compound ones?
A: Yes, it considers the right sequence.
Q: Which one is recommended by SAP to run multiple tasks process chain or application job templates in IBP?
A: It is a common practice to schedule jobs via IBP application job templates.
Q: How do we pass global variables via external job scheduling API?
A: See link for external job scheduling (Web Services Guide for SAP Cloud Platform Integration)
Q: We are not able to view logs with more than 50k records downloaded from the Data Integration tile. Are there any options for this?
A: Yes, you can extract that data from the matching _REP table via CPI-DS into whatever other system allows you to view the report data.
Q: What is the _REP table used again for?
A: The _REP table in IBP contains the results of the post-processing of the data integration task.
Q: Is there a way to send email notifications with the result of Sandbox task executions?
A: No unfortunately not. Only for Production Tasks.
Q: Is there an advantage/disadvantage to having each data flow in a separate task and running all of them in a process?
A: Using a process makes it easier to schedule them, especially if there are many.
Q: How can we join data from ECC and a flat file from SQL Server in CPI-DS?
A: A join across multiple sources is not possible in CPI-DS. One option is to have separate tasks for each source and use a process to run them in sequence or in parallel.
Q: Our CPI-DS system is very slow; it takes 2 minutes to actually transfer the data. Is it because the agent is slow?
A: Troubleshoot where the slowness comes from: check the monitor log in CPI-DS to identify the bottleneck, then check the performance of the agent, the source system, and the target system.
Q: SOPMD_STAG_XXXLOCATION_REP - what is the specific use of these tables?
A: The table contains the results of the post-processing in IBP, the same content you would see in the IBP Data Integration app when you download a complete report.
Q: Can Customer ABAP be used for BW integration with CPI-DS?
A: Yes, customer ABAP can be used for scenarios where creation of DSOs is not possible.
Q: What is the minimum BW version level required for integration?
A: Check supported version (SAP Cloud Platform Integration for data services - Product Availability Matrix (PAM))
Q: Is embedded BW supported as well as standalone BW?
A: Yes, both embedded and standalone BW can be used.
Q: Can more systems be integrated to SAP IBP via the same CPI-DS, for example a separate S/4HANA system and a BW system?
A: Yes, any supported system can be connected to CPI-DS, using the appropriate datastore types.
Q: Is embedded S/4HANA BW supported?
A: Yes Embedded BW can also be used.
Q: During retraction from IBP to BW PSA multiple info packages are created. Why?
A: Depending on the packet size maintained, the BW PSA is loaded in packets to improve the performance of the system.
Q: What exactly is not supported with BW4HANA -> Target and Source?
A: BW/4HANA is not supported as source and target in CPI-DS .
Q: Is a BW Source used to export data to IBP, and a BW Target to receive data from IBP?
A: Yes, a BW Source datastore is used to extract data from BW into IBP, and a BW Target datastore is used to load data from IBP into BW.
Q: Data is by default loaded in packages of 5000 records. How can we improve performance for data loads from IBP to the BW PSA?
A: The InfoPackage has a limit set with the parameter 'Max. size of the data packet'; adjust it there to specify the packet size.
Q: Is it possible to connect any BW objects such as Cube, Multiprovider, etc. or it is restricted to DSO?
A: DSO and tables can be used as datastore creation in CPI-DS.
Q: Is it possible to schedule CPI-DS jobs from BW or ERP systems?
A: Yes, it is possible through ABAP code and through IBP OData services (Finding the Right Integration Tool).
Q: Must the RFC connection be SAPDS?
A: SAPDS is the default, but it can be changed. Ensure that the program ID is consistent between the RFC connection and the program ID maintained in the datastore configuration.
Q: Can we update a cube or should it be a dso in BW?
A: DSO should be used which then flows to Cubes.
Q: Are there any cases where you would need to re-import the metadata if the data in BW changes?
A: If there are any changes to the source/target, then re-import is required.
Q: What is the minimum authorization required when loading the data to BW?
A: The S_RS_ADMWB authorization is used for BW loading.
Q: In IBP, we have compounded master data. How we can map in BW?
A: Your data model in BW should match IBP's, e.g. tables for Location, Product, and LocProd to load these MDTs.
Q: Is the G_PLAN_AREA global variable mandatory for master data uploads as well?
A: For master data types, it is optional; $G_PLAN_AREA can be set to the empty value ''.
Q: How do we handle the delta load?
A: IBP does not handle delta loads itself; the delta has to be handled through pre-load scripts in CPI-DS.
Q: Can we modify data while loading from BW to IBP?
A: Yes in the transforms of the dataflow, modifications can be done to the BW data loaded to IBP.
Q: Can we load on Product level instead of the base planning level?
A: Yes, that is possible.
Q: If the information is at day level and we need it in technical weeks, what aggregation logic is used via the global parameters?
A: For S/4 and ECC systems, the aggregation can be done through the add-on functionality. For other systems, the aggregation has to be performed in the dataflow if the planning level is at the technical-week level.
Q: Does CPI support loading from multiprovider?
A: No, not supported
Q: If the number of fields in the source does not equal that of the target, many source records may represent one target record. Is the data overwritten or aggregated?
A: There is a possibility that the IBP system gives a "duplicate" error during loading if the data across all fields is the same.
Q: How does CPI-DS map fields of staging tables to internal IBP tables?
A: CPI-DS writes to the staging tables in IBP and an internal process in IBP then moves that data to the appropriate IBP tables (Post-processing). This is done automatically as part of the integration job.
Q: Can we use the new feature 'RFC-enabled function calls' to load data to ECC instead of the old way using a web service?
A: Yes, RFC-enabled function modules can be used to integrate back into SAP ECC/S4 as an alternative to web services.
Q: How often is CPI-DS upgraded?
A: Typical release cycle for CPI-DS is every 3 months.
Q: Does it support pushing down to BW an 'OR' filtering condition on one field, e.g. "Region=X OR Region=Y"?
A: The filter is applied while querying the data, but it is not pushed down to BW: the BW source selects all the data first, and the filter is applied afterwards.
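To illustrate, such a condition in a dataflow filter could look like the following (REGION and the values are illustrative names):

```
SOURCE.REGION = 'X' OR SOURCE.REGION = 'Y'
```

With a BW extractor as source, this condition is evaluated only after the extractor has returned the records; with an ABAP query source, comparable conditions are pushed down into the generated SELECT statement.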
Q: Are there any plans to support connectivity to an SCP HANA database sitting in a datacenter that is different from the datacenter of CPI-DS?
A: No, this is not supported.
Q: Can you limit which users can edit which objects in CPI-DS?
A: CPI-DS users can have different roles and these roles are maintained in the administrator section by the administrator.
Q: What is the future of CPI-DS?
A: CPI-DS is here to stay, with IBP as its main use case. Please view the Roadmap
Q: Are Flat Files the only option to get PIRs into the backend system from IBP?
A: No, Webservices and RFC enabled Function Modules can be used to back integrate into SAP ECC/ S4.
Q: Can we send data from CPI-DS to APO through RFC calls?
A: Yes, it is possible in a custom way. You will have to build your own RFC-enabled function module in APO for your specific needs.
Q: IBP generates procurement proposals after a heuristics run. Can this design be used to create purchase requisitions in ECC by calling the purchase requisition function module?
A: Yes, that is an option. Please check carefully that you can provide all the required input of the function module.
Q: How can we identify the correct/proper version of the data services agent?
A: The version of the agent(s) is displayed in the CPI-DS web UI, in the Agents tab.
Q: Can we use a web service to generate PIRs in S/4, or receive the file from IBP and then process it? What is the best option?
A: The web service is the preferred option, as it is more secure than flat files. RFC-enabled function modules can also be used for this.
Q: What agent version is required for RFC function calls?
A: RFC function call feature requires agent version 1.0.11 Patch 36 or later.
Q: Can you connect from SAP CPI to SAP ECC via ABAP proxy (generated by SAP PI/PO)?
A: Right now the only options are web services and RFC function integration.
Q: What is included in IBP Add-on for ECC upcoming release?
A: See What's New.
Q: Is the RFC Function call only Applicable for PIR now?
A: It is a PIR example; however, it can serve as a template for other data or integration scenarios.
Q: Can the FUNCTION object be used both as a source and Target?
A: It is a PIR example; however, it can serve as a template for other data or integration scenarios.
Q: What is the difference between current IBP Add-on for ECC and upcoming which will be available May 2020?
A: Please view the What's New content for IBP which will have this information.
Q: Are all function models predefined by SAP? Or can the user define the function model themselves?
A: Standard functions will have their predefined interface. But for custom functions you can define it yourself.
Q: Does CPI-DS have any SAP certifications?
A: No, we do not offer any certifications for CPI-DS.
Q: Are there any benchmarks showing how many records direct RFC can handle in given timeframe?
A: For PIR back integration to ECC via RFC, we have seen 2 months of records transferred in less than 10 minutes.
Q: If two function modules are imported and both have the same output table (here BAPIRETURN1), will that cause any issue?
A: Yes, you will have to write the outputs into different files as targets in the datastores.
Q: Are there any limitations on size of files to configure in datastores ?
A: Generally, there is no limitation on data stores.
Q: Is it possible to use Datastore in different Project?
A: Datastores are not dependent on a project; they can be used across projects.
Q: It would be nice to have a "Merge" transform in CPI-DS just like how we have it in classical Data services tool.
A: Merge transform is in backlog for CPI-DS.
Q: Is there a delta load of the PIRs from IBP to ERP?
A: No, IBP does not support delta extraction; it is a full extraction.
Q: With this RFC mechanism, how does it work? Will multiple RFC calls go in parallel, or is it one by one? Want to understand whether we will have bottleneck with number of RFC connections that gets opened?
A: Currently, in the delivered template, the calls are sequential.
Q: The FM imported for this interface should be from S4 system or IBP system?
A: The function module should exist in S/4 or the other target system.
Q: In CPI-DS, can we promote the whole project in one go instead of promoting each task separately?
A: You can promote processes and tasks, but as of now promoting the whole project is not possible.
Q: Does File location object support google storage or AWS?
A: Google storage or AWS are not supported by CPI-DS.
Q: What approach should we use for Google storage connectivity?
A: A file-based approach: write the data to a file and then transfer the file to Google Cloud Storage.
Q: Is Azure cloud storage same as Azure Blob or is a different service?
A: Correct, Azure cloud storage here means Azure Blob storage.
Q: Can we configure multiple data store configurations using File Location?
A: Yes, it is possible to create multiple datastore configurations for file locations.
Q: Can a single file location be used for multiple file formats?
A: Yes, you can use one file location to define multiple file formats.
Q: What is the difference between the root directory given in the file location and the one in the file format group, and which is used inside the dataflow?
A: It is the same; the root directory configured in the file location is shown in the file formats.
Q: My company uses Google Cloud to store data that needs to be integrated to IBP through CPI-DS. Does CPI-DS provide any integration function for Google Cloud?
A: No; use a file-based approach and transfer the file to Google Cloud Storage.
Q: If I have different configurations for DEV & QA then how does that have to be managed? Should we switch the details manually every time?
A: You can switch the details manually, or you can manage the files manually; in many cases the files are written and read in the same context (Dev or QA) and might not even need to be changed.
Q: Is the local directory defined on the Agent Server?
A: Yes, root directory is local to the agent.
Q: How much load/impact this datasource would have on my CPI-DS agent?
A: The load is similar to reading from and writing to the agent's local file location.
Q: Can we read an existing structure instead of creating the file format from scratch, like with ECC?
A: Yes. CPI-DS supports several options to create file formats: create from a sample, based on a CSV file on your local system, or create from tables, based on an existing table or file in a datastore; you can choose multiple tables in a selected datastore.
Q: Is it possible to use the FTP/SFTP function to extract files from Google Cloud?
Q: Should the defined file format schemas be in line with the destination table schema in the actual Microsoft cloud storage container table?
Q: How can we read/write files if we have multiple subfolders under the container?
A: That is not possible with a single file location; you need to create separate file location objects.
Q: Why do you need to delete the local file?
A: You don’t need to delete the local file.
Q: Is it possible to send the data from IBP to Azure without going through the local CPI-DS agent (i.e. without using a local file at the CPI-DS file location)?
A: No, streaming to Azure is done through the local file.
Q: We tried sending the data to ERP system using the RFC enabled function module after the last webinar, but couldn't find a way to monitor such messages in target system.
A: For the PIR back integration, you can check the add-on transaction /IBP/ETS_PIR_MON - Planned Independent Requirement Monitor.
Q: Can you show how to get these details from Azure site?
A: Please check with your Azure admin.
Q: Are FTP and SFTP in the file format group going to be deprecated in the near future?
A: Yes, you should use the file location object for new implementations.
Q: How to get Azure specific credentials? Where to look for these details?
A: Please check with your Azure admin.
Q: From where is the file deleted? From the agent server folder?
A: Yes, once it is uploaded to Azure.
Q: When will ADLS2 (Azure Data Lake Storage Gen2) be supported by CPI-DS?
A: It is not planned.
Q: It says files are to be deleted after transfer; why is a file sometimes not deleted? Does 'deleted' mean from the CPI-DS agent?
Q: I am not 100% sure what the deletion means here.
A: Files are deleted from the temporary local folder on the agent side after the transfer.
Q: So the logic behind this is to push data from IBP into Azure, as a way of archiving the data there?
A: Please check with your Azure experts.
Q: What about Power BI Integration?
A: Power BI can be integrated via IBP OData services.
Q: Is it possible to integrate resource downtime from PP/DS into SAP IBP to reduce the available capacity?
A: Yes it's possible, but you would have to maintain downtime in ECC and keep one source of truth.
Q: Can we extract the data from Datasource in APO via CPI-DS?
A: Yes, configure Business Suite data store and/or BW data store.
Q: How can the process of loading data to APO BI be improved? Being in APO for quite some years, we have experienced performance issues.
A: Use timefrom and timeto to limit the time horizon of the extraction, to improve performance and reduce data volume.
Q: Do BW extractors and the CPI-DS ABAP query contradict each other?
A: No, they are two different methods for extracting data from APO. BW extractors extract the data first and then apply filters. The ABAP query uses the filters in a SQL SELECT statement; the filters are pushed down to the query. Performance-wise, the ABAP query is faster than a BW extractor.
Q: Is it possible to extract the production cycle from APO to IBP? Even if we are using CDP?
A: Yes, it is possible. Either use BAPI or custom ABAP to extract it.
Q: What is Sandbox? is it testing environment?
A: Sandbox is the development/test environment in CPI-DS. CPI-DS ORG comes with a sandbox and production environment.
Q: Can IBP Time series Integration CPI-DS be real time like CIF?
A: No, this is batch based, periodic Integration (e.g. daily).
Q: Does the CPI DS license come with IBP R&S?
A: Yes, IBP productive license comes with CPI DS license.
Q: Does CPI-DS licence also come with the IBP starter edition?
A: Yes, CPI-DS license comes with SAP IBP. Please refer to this page for more information.
Q: What are the available templates for CPI-DS?
A: Please refer to this help page on the available templates in CPI-DS for IBP integration.
Q: Do we have standard mapping of the APO Tlane fields to the target IBP field mapping in the CPI-DS template?
A: No, that is not available, but it is a simple task to design. Please refer to this help page on the available templates in CPI-DS for IBP integration.
Q: Can we create a custom RSA1 extractor like the ones provided on ECC add-on? If so, can you share a "how to" document for a custom database table?
Q: When you write the custom ABAP, how can you test the code without creating a new report in the ABAP workbench?
A: You can test it by running it at design time in CPI-DS. Select the ABAP execution option at the datastore level: use 'generate and execute' in the sandbox/development environment and, once satisfied with the results, use the 'execute preloaded' option. A syntax check is not available in CPI-DS; it is only possible by copying and pasting the code into a new executable ABAP report on the ABAP system.
Q: When you create the ABAP form and type the code in CPI, can you debug similarly to how it is done in an ERP system?
A: There are two prerequisites: mark the datastore to run in dialog mode, and set an external breakpoint in the ABAP source system.
Q: Is it possible to use the receiving days in Response and Supply?
A: Please check that it can be considered properly in R&S. With release 2002, the Calendar API was introduced in IBP. You will have to load the calendar in a separate dataflow for the TLanes. Please refer to this help page for more information on calendar integration.
Q: How can we prevent SAP IBP from deleting already scheduled planned orders by APO PP/DS, since it doesn't respect the horizon currently?
A: You would need to consider them as fixed in SAP IBP.