The latest information for Data Services is available in SAP Note 1817324 - SolMan 7.1: Managed Sys Setup - Data Services 4.0&4.1&4.2.
For the Data Services cluster scenario, see SAP Note 1852508 - How to support Data Services cluster in solman 7.1.
A guide on importing the DS_KPIs job is in SAP Note 1618486 - Data Services additional Workload metrics.
In Solution Manager, always use the latest config.xml file from SAP Note 1259109 - Latest version of configuration file config.xml.
In Enterprise Manager (Wily Introscope), always deploy the latest management modules from SAP Note 1579474 - Management Modules for Introscope delivered by SAP.
Please go through the notes above before starting the hands-on Data Services Managed System Setup in Solution Manager!
Architecture Overview and Setup Procedure:
Before introducing the Data Services managed system setup in Solution Manager, we need to cover some basic concepts so that everyone has an overview of the Data Services architecture and of the managed system setup procedure in Solution Manager.
In general, a Data Services system consists of three parts: a web application, an application server, and a frontend application. The web application is deployed on a web server (Tomcat by default) and is used for monitoring and administration of the Data Services system. The application server is installed on the OS (Windows or Linux/UNIX) and is the core runtime of Data Services. The frontend application, especially the Designer, must be installed on Windows and is used for Data Services job design. From the Solution Manager point of view, we only support monitoring of the application server.
Meanwhile, Data Services stores its authorization information in the CMS DB shared with the BOE/IPS system, and there are some dependencies between the Data Services web application and the BOE/IPS web application. That is why a BOE/IPS system must already be installed before you install the Data Services system.
IPS is a "mini" BOE contained in the Data Services installation file. Compared with BOE, IPS is a much lighter-weight application without any reporting functions and related services; it provides just the basic functions and services needed to run Data Services. The customer can choose to use either BOE or IPS as the CMS DB for Data Services.
In Solution Manager, Data Services and BOE/IPS are treated as different system types: Data Services uses Unspecific Cluster System, while BOE/IPS uses SAP BusinessObjects Cluster. This means that in Solution Manager the managed system setup of Data Services and BOE/IPS is processed separately.
In the pictures above you can see that the connectivity from Solution Manager to the Data Services system is done via agents. The native (C++) processes of Data Services use the embedded NCS library to send data to the SMD agent (Diagnostics Agent) located on the same host as the process. This communication uses a port that has to be configured with the same value on the SMD agent and in the ncs.conf file, by default 6391. The SMD agent then forwards the data to APM Introscope, a third-party tool delivered together with Solution Manager that acts as the data collector for all non-ABAP systems connected to Solution Manager.
Perform the adaptation of Data Services to Solution Manager in 4 steps:
- Install necessary agents
- Send landscape information to System Landscape Directory (SLD) and enable NCS instrumentation.
- Import DS_KPIs job in your Data Services repository.
- Execute Managed System Configuration (MSC) on Solution Manager
Install necessary agents
You have to manually install the SMD Agent and SAP Host Agent on each host running the Data Services application server.
Please refer to SAP Note 1833501 - Diagnostics Agent - Installer Versions .
Send landscape information to System Landscape Directory (SLD) and enable NCS instrumentation
As mentioned in the architecture overview, Data Services generates its performance metrics through NCS library instrumentation and then pushes them to Wily Introscope.
By default, the NCS library instrumentation is inactive. To enable it, activate it in the Data Services Server Manager as in the screenshot below, then restart the Data Services application to make it effective. Keep the parameters at their defaults; port 6391 here is the internal communication port between NCS and the SMD agent, and we may need this parameter later.
For detailed SLD information, please refer to SAP Note 1018839 - Registering in the System Landscape Directory using sldreg. Here are the Data Services specific parts:
Basically, we run sldreg.exe at OS level with the proper parameters to send the Data Services landscape information to the SLD via an XML file (also known as the payload file). There are some differences between the Data Services standalone and cluster scenarios.
AC Tool Usage
The AC Tool is used to prepare the Managed System Configuration of different products; it can save quite a lot of time by performing some steps on the managed system host automatically.
SAP recommends that customers use the AC Tool to prepare the OS-level configuration for Data Services; it covers the step "Send landscape information to System Landscape Directory (SLD) and enable NCS instrumentation".
The AC Tool can be downloaded from SAP Note 2137275 - AC Tool release NOTE.
Note: Before using the tool, please follow the AC Tool guide ("ACTool User Guide.pdf", found in the doc folder of the AC Tool install directory) to maintain the ds.properties file.
I will not go through every detail of the configuration here, only the mandatory main steps with screenshots.
On Windows the file ...ProgramData\SAP BusinessObjects\Data Services\conf\DSConfig.txt is typically marked as "read only"; please make sure to untick "read only" before proceeding.
1. In the ds.properties file, maintain the following information, for example:
- product.name = %DS_COMMON_DIR% from command line
- product.host.master.name = hostname
- sld.system.id = the SID under which you want to define the Data Services system in Solution Manager; by default it is DSS.
- The SLD server related information. The SMD agent and Host Agent are prerequisites for the monitoring setup.
- Any Data Services configuration change requires a Data Services restart, so set ds.restart to true with a defined ds.os.user.
- The first time a customer runs the AC Tool for Data Services, the tool activates saposcol processing, so please maintain smdagent.windows.servicename as well.
Other parameters can be maintained in the ds.properties file, such as the virtual hostname and cluster settings; please refer to the guide attached to the note for more information.
2. After maintaining the ds.properties file, run the AC Tool; we recommend using the Administrator user (Windows) or root user (UNIX/Linux). The basic steps are:
- Start the AC tool from OS command.
- Choose Data Services as the configuration product and run "Check" before "Execute".
- You can choose specific steps to "Check", separated by commas, or just press Enter to confirm all steps. In this step you will see many FAILURE entries, which is normal since the configuration has not been done yet.
- After the "Check" step, "Execute" the steps you want to perform. Just pressing Enter selects all steps.
- You will see the "Execute" result, and the system reports any errors.
If you choose not to use the AC Tool for this step, you can also configure it manually instead.
1. Locate sldreg.exe in the Host Agent 'exe' folder.
2. Generate the slddest.cfg and slddest.cfg.key files. These two files can be generated with the OS command "sldreg.exe -configure slddest.cfg -usekeyfile slddest.cfg.key"; you will be prompted for the SLD server, port, username, and password while running the command.
3. Copy the slddest.cfg and slddest.cfg.key files to the %LINK_DIR%\sldreg\ folder. In the same folder you will find an XML file called sld_data_services.xml, which is the payload file mentioned before. This payload file is only updated when:
- you change the Data Services landscape configuration, e.g. edit/add/remove a Job Server or Access Server,
- and you restart the Data Services instances.
4. Change the SID of the Data Services system in the payload. If you are using a Data Services release later than 4.2 SP6, steps 4 and 5 can be configured graphically in the Data Services Server Manager; see the New Features section below. Otherwise, change it manually:
Locate the DSConfig.txt file in the %DS_COMMON_DIR%\conf folder and add one entry, SystemIDForSLD = AAA (for example), to the [AL_JobService] section. Restart the Data Services application, and the payload file will be changed accordingly.
5. Run sld_ds.bat/sld_ds.sh in the %LINK_DIR%\sldreg folder. The payload file will be uploaded to the SLD; if you see "HTTP send successful", everything is fine. Otherwise, check the information you put into slddest.cfg and slddest.cfg.key, or the SLD server's running status. Every time the Data Services application is restarted, the system triggers sld_ds.bat/sld_ds.sh, so the payload file is sent to the SLD automatically; we call this the automatic upload mechanism.
To verify the information supplied to the SLD, log on to the SLD and go to Administration. Select Automatically Supplied Data. Under Automatically Updated Data you should see the entry from the Data Services system (in this case SID DSS, as shown in the SAP Data Services Server Manager) as below:
Note: for development systems, we do not recommend using this automatic upload mechanism, because frequent changes to the Data Services landscape information generate many outdated entries in the SLD and mix with the correct landscape information, which will confuse the application operator if they do not have a clear picture of the Data Services landscape. To turn off the automatic upload mechanism, simply move slddest.cfg and slddest.cfg.key somewhere else.
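The "move the files away" step above can be scripted. Below is a minimal sketch under assumptions: LINK_DIR defaults to a local demo directory (and the demo setup lines create dummy files) so the sketch runs end to end; on a real host, point LINK_DIR at the actual Data Services %LINK_DIR% and remove the demo setup lines.

```shell
#!/bin/sh
# Sketch: turn off the automatic SLD upload mechanism by moving the
# two destination files out of the sldreg folder, as described above.
# LINK_DIR here is a placeholder defaulting to a local demo directory.
LINK_DIR="${LINK_DIR:-./demo_link_dir}"

# Demo setup only, so the sketch is runnable without a DS installation:
mkdir -p "$LINK_DIR/sldreg"
touch "$LINK_DIR/sldreg/slddest.cfg" "$LINK_DIR/sldreg/slddest.cfg.key"

# Moving the files disables the upload; moving them back re-enables it.
DISABLED_DIR="$LINK_DIR/sldreg/disabled"
mkdir -p "$DISABLED_DIR"
mv "$LINK_DIR/sldreg/slddest.cfg" "$DISABLED_DIR/"
mv "$LINK_DIR/sldreg/slddest.cfg.key" "$DISABLED_DIR/"
echo "automatic SLD upload disabled"
```

Moving rather than deleting the files keeps re-enabling the mechanism as simple as moving them back.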
If you choose not to use the AC Tool for this step, you can also configure it manually instead.
In many cases, customers configure multiple Job Servers or Access Servers on different hosts to set up a cluster scenario. Unfortunately, Data Services does not support auto-generating a cluster payload file; by default the payload is generated as for a standalone system, so uploading the payload files of a cluster system to the SLD would create different SIDs in the SLD.
The steps for uploading a Data Services cluster payload file to the SLD are the same as for the standalone scenario. The only difference is step 4: how to change the payload file manually.
As a rule of thumb, use unique Job Server and Access Server names in your cluster.
For Job Servers, we recommend a name format like <hostname>_<jobservername>.
For Access Servers, please make sure all Access Servers in your landscape use unique service communication ports, even when they are on different hosts.
Before Data Services 14.1 SP3 Patch 2 and 14.2 SP1 Patch 5, you have to change and upload the payload file to the SLD on all related servers manually (for more details please refer to SAP Note 1852508).
As of Data Services 14.1 SP3 Patch 2 and 14.2 SP1 Patch 5, the Job Server/Access Server instance names are generated uniquely and automatically. Instead of changing the content of the DS payload file manually before uploading it to the SLD, you only need to update DSConfig.txt on both servers of the Data Services cluster, found in the folder <DS_COMMON_DIR>\conf or <DS_USER_DIR>\conf, such as "C:\ProgramData\SAP BusinessObjects\Data Services\conf". After making the change, restart Data Services to apply the modification (after the restart the payload file will be regenerated).
E.g., for a Data Services cluster whose servers' host names are A and B, with server A as the main instance: locate the DSConfig.txt file on both A and B and update them by adding the AL_JobService section at the end of the file:
On server A, insert the options:
CentralHostForSLD = A (host server name)
SystemIDForSLD = DSX (the cluster's SID in managing system)
On server B, insert the options:
CentralHostForSLD = A (host server name)
SystemIDForSLD = DSX (the cluster's SID in managing system)
Then, after the payload file is regenerated (by restarting Data Services) and uploaded to the SLD, in the SLD you should find the instances installed on different hosts but sharing the same application system host.
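The DSConfig.txt change for a cluster node can be sketched as a small script. This is a sketch under assumptions: DSCONFIG points at a local demo file here (on a real host it would be <DS_COMMON_DIR>\conf\DSConfig.txt), and host A / SID DSX are the example values from above.

```shell
#!/bin/sh
# Sketch: append the AL_JobService section (CentralHostForSLD and
# SystemIDForSLD) to DSConfig.txt, as in the cluster example above.
# DSCONFIG is a placeholder defaulting to a local demo file.
DSCONFIG="${DSCONFIG:-./DSConfig.txt}"
touch "$DSCONFIG"   # demo only; on a real host the file already exists

# Append the section; run this on every node of the cluster with the
# same CentralHostForSLD and SystemIDForSLD values.
cat >> "$DSCONFIG" <<'EOF'
[AL_JobService]
CentralHostForSLD = A
SystemIDForSLD = DSX
EOF

echo "updated $DSCONFIG; restart Data Services to regenerate the payload file"
```

Remember that the change only takes effect (and the payload file is only regenerated) after a Data Services restart.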
New Features in Data Services 4.2 SP6 and Later
For Data Services versions 4.2 SP6 and above, a new tab for SLD registration, SID customization, and cluster setup has been added to the Data Services Server Manager. Please refer to the following screenshot and enter the content for each option. After enabling SLD registration, restart Data Services; the related information will then be updated and uploaded to the SLD.
The fields marked in red are mandatory for a successful connection of the Data Services system to the SLD and the supply of information.
If Data Services is configured as a cluster, make sure to enter the unique Central host on every Data Services server.
By default, Data Services uses the physical hostname as the hostname in the payload file, which will later be stored in Solution Manager. If you want to use the virtual hostname instead, enter it in the corresponding field; the virtual hostname then replaces the physical hostname in the payload file.
Import DS_KPIs job in your Data Services repository
The DS_KPIs job is a pre-configured Data Services job. After it is imported into the Data Services repository, it generates additional Data Services performance and configuration information to enhance the monitoring functionality provided by Solution Manager.
The basic steps for importing the DS_KPIs job are:
- Download and unzip DS_KPIs from SAP Note 1618486 - Data Services additional Workload metrics.
- Locate the correct version of the ATL file according to the version of your Data Services system.
- Follow the user guide "DSKPIs User Guide.pdf" to complete the job's import. The guide can be found in the DS_KPIs.zip file attached to the note.
I will not go through all the details here, just some key steps you must not skip:
1. After you get the ATL file from the note, keep in mind that you should replace the string "DBO" with the prefix of the repository tables in the database; this could be a schema name, user name, etc. For example, if you are using an Oracle database, the prefix should be the schema name shown in the red rectangle in the screenshot below.
2. After importing the job into the Data Services repository, the following global variables should be set:
- $G_FILES_LOCATION: this value should be set to %DS_COMMON_DIR% in your OS environment, surrounded with single quotes, e.g. 'C:\ProgramData\SAP BusinessObjects\Data Services'.
- $G_TME_INTERVAL: 30 by default, meaning the job is triggered every 30 minutes.
3. After this job runs successfully, some files are generated in the customer system in the directory %DS_COMMON_DIR%: Datastore_<repo name>.xml, Job_DF_config_<repo name>.xml, and Introscope_metrices_<repo name>.txt. Don't worry about the disk usage of these files, since they are replaced each time the job finishes successfully.
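A quick way to confirm the job ran is to check for the three output files. A minimal sketch, assuming a hypothetical repository name MYREPO and a placeholder for %DS_COMMON_DIR% (the demo setup lines simulate a successful job run so the sketch is runnable; remove them on a real host):

```shell
#!/bin/sh
# Sketch: check that the DS_KPIs job produced its three output files
# in %DS_COMMON_DIR%. DS_COMMON_DIR and repository name MYREPO are
# placeholders to be replaced with your own values.
DS_COMMON_DIR="${DS_COMMON_DIR:-./demo_ds_common}"
REPO="MYREPO"

# Demo setup only (remove on a real host):
mkdir -p "$DS_COMMON_DIR"
touch "$DS_COMMON_DIR/Datastore_${REPO}.xml" \
      "$DS_COMMON_DIR/Job_DF_config_${REPO}.xml" \
      "$DS_COMMON_DIR/Introscope_metrices_${REPO}.txt"

# Report which of the expected output files are present.
for f in "Datastore_${REPO}.xml" \
         "Job_DF_config_${REPO}.xml" \
         "Introscope_metrices_${REPO}.txt"; do
  if [ -f "$DS_COMMON_DIR/$f" ]; then
    echo "OK: $f"
  else
    echo "MISSING: $f"
  fi
done
```

If a file is reported missing, check the job log in the Data Services Management Console and the $G_FILES_LOCATION value.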
Execute Managed System Configuration on Solution Manager (MSC)
Check the technical system in LMDB
Once the information about the Data Services system has been pushed to the SLD, the automatic sync job between the LMDB and SLD pushes the technical system configuration to the LMDB. In the LMDB, select the Technical System Type (the product system does not exist yet, as it has not been defined), scroll to SBOP Data Services Cluster - Unspecified Cluster System, and search for the DS SID (e.g. DEM, as shown in the SLD). You can select the Maintain System option as shown below to check and edit the LMDB information for the Data Services system.
The changes should be shown as automatically supplied for the Product Versions and for all the Technical Instances, as information in the LMDB should always be supplied automatically and not edited manually.
Then go to the Software panel, select Product Instances (Details), and check the Job Server and Access Server details. By default, the related Job Server and Access Server instances are marked as installed with supplier status "automatic".
If everything is OK, mark Diagnostics-Relevant for the Product Instances Access Server and Job Server, save, then close the page.
Note: The Product Instance marked as SAP DATA SERVICES 4.2 (SMP) is an entry that classifies the system SID and should not be marked as Diagnostics-Relevant. In general, we do not recommend changing LMDB information manually unless guided by SAP colleagues.
Managed System Configuration for Data Services
Once the LMDB check is done, go to the Managed System Configuration screen for the selected Data Services system as below.
There are 8 steps in the Managed System Configuration screen, but only steps 1 to 6 are mandatory for the Data Services Managed System Configuration.
Step 1 – Assign Product
After you enable edit mode for this screen, Step 1 is executed automatically when you click on the step.
Step 2 – Check Prerequisites
Just click "Execute All"; this activity checks the configuration of both the Data Services and the Solution Manager system.
Step 3 – Assign Diagnostic Agent
Assign the Diagnostics Agent to the related Data Services server. The Diagnostics Agent must be installed on the managed system (the host of the DS system; for cluster systems, an agent has to be installed on each host/node of the cluster).
Step 4 – Enter System Parameters
Select the correct Introscope EM from the drop-down list and select Level 3 (default) for the RCA Extractor Level. In the NCS communication panel, Solution Manager enters port number 6391 by default, which is the same as the default value set in the Data Services Server Manager. Make sure these two values are the same, otherwise the communication between the SMD agent and Wily Introscope will fail.
Step 5 – Enter Landscape Parameters
In this step, we need to enter the path information for each Job Server/Access Server instance so that extractors can be scheduled on the managed system to collect the related information.
In the screen below, we have one Data Services system with 2 Job Servers and 1 Access Server configured on the host.
First, let's look at the Job Server related parameters. Two values need to be maintained for each Job Server: the "Installation Path" and the "Instance Log Path", as the picture shows:
- Installation Path: enter the value of <DS_COMMON_DIR> or <DS_USER_DIR>, which represents the default or user-specified Data Services configuration directory.
- Instance Log Path: enter the value of <DS_COMMON_DIR>/log/<JobServerName> or <DS_USER_DIR>/log/<JobServerName>, or whichever directory you specified to store the Job Server's logs.
As both the Data Services configuration directory and the log storage directory can be specified by users, check the actual directories before setting up the parameters.
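For example, assuming the default Windows <DS_COMMON_DIR> mentioned earlier and a hypothetical Job Server named JobServer_1, the two parameters would look like this:

```text
Installation Path: C:\ProgramData\SAP BusinessObjects\Data Services
Instance Log Path: C:\ProgramData\SAP BusinessObjects\Data Services\log\JobServer_1
```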
For the Access Server, by default the instance log path is C:\Program Files (x86)\SAP BusinessObjects\Data Services\bin\AccessServer_X. If Data Services is installed in a directory other than C:\Program Files (x86)\SAP BusinessObjects\Data Services\, change the path accordingly. "AccessServer_X" is the Access Server instance name, which is defined by the customer, and one Data Services system can have more than one Access Server. Moreover, the whole log path is defined by the customer, so you need to type each Access Server's instance log path manually.
Note: to avoid missing configuration, remember to click "Save Landscape Parameters" for every instance you have just configured.
Step 6 – Finalize Configuration
Execute all the mandatory steps, including the "Extractor Setup" activity. This triggers the extractors on the Data Services system with the help of the Diagnostics Agent.
Step 7 – Check Configuration
The activities executed here check whether all the assignments done so far have been performed correctly; any inconsistencies or incorrect information are shown in the messages.
A summary of the whole configuration can be seen in Step 8, Complete.
Metrics collection configuration
There are some additional configuration steps if you want to enable BI/Job monitoring for Data Services.
1. In the Agent Administration → Application Configuration tab, choose com.sap.smd.agent.application.remoteos and select the agent in Scope. Add the following variables:
Note: every <extended SID> below is the DS extended SID.
ENV_SID_<extended SID>_BOE_DIR with the correct path of <BOE_DIR> in the agent setup; by default it should be C:\Program Files (x86)\SAP BusinessObjects.
ENV_SID_<extended SID>_BOE_AUTH_TYPE with 'secEnterprise'.
ENV_SID_<extended SID>_BOE_CMS with the BOE CMS cluster name. The name can be found in BOE/CMC/Settings/Cluster.
ENV_SID_<extended SID>_BOE_CMS_USR with the BOE CMS user.
ENV_SID_<extended SID>_BOE_CMS_PWD with the BOE CMS user password.
ENV_SID_<extended SID>_BOE_WEB_PORT with the BOE web server port.
ENV_SID_<extended SID>_BOE_WEB_HOST with the Data Services related web-app host or the load balancer.
ENV_SID_<extended SID>_DS_DIR with the parent directory of the Data Services configuration path <DS_COMMON_DIR> or <DS_USER_DIR> (the same value as $$SamplesInstall in the DS Designer; please refer to the fourth step of "Data Services Configuration" shown above). E.g. if the <DS_COMMON_DIR> directory is "C:\ProgramData\SAP BusinessObjects\Data Services", then use its parent directory "C:\ProgramData\SAP BusinessObjects\".
To find out which paths to select, see SAP Note 2520823 - How to check the definition of ENV_* parameters for metrics collection of Data Services - SAP Solution Manager 7.1 and 7.2.
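As an illustration, assuming a hypothetical extended SID DS1 and the default paths mentioned above, the resolved variable names could look like the following; every value here is a placeholder to be replaced with your own landscape data:

```text
ENV_SID_DS1_BOE_DIR       = C:\Program Files (x86)\SAP BusinessObjects
ENV_SID_DS1_BOE_AUTH_TYPE = secEnterprise
ENV_SID_DS1_BOE_CMS       = <your CMS cluster name>
ENV_SID_DS1_BOE_CMS_USR   = <your CMS user>
ENV_SID_DS1_BOE_CMS_PWD   = <your CMS password>
ENV_SID_DS1_BOE_WEB_PORT  = <your BOE web server port>
ENV_SID_DS1_BOE_WEB_HOST  = <your web-app host or load balancer>
ENV_SID_DS1_DS_DIR        = C:\ProgramData\SAP BusinessObjects\
```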
2. In the Agent Application panel, choose com.sap.smd.agent.application.wilyhost, and select SapAgentConfig.xml and the agent in Scope. Upload an XML file with the following content.
For Solution Manager releases 7.2 SP2 / 7.1 SP14 or higher, you can skip this step, since the content of SapAgentConfig.xml is auto-generated during the Managed System Configuration step.
<?xml version="1.0" encoding="UTF-8" ?>
<destination class="com.sap.smd.wily.hostagent.destination.SocketDestination" name="Port6391">
  <property name="port" value="6391"/>
</destination>
<action prefix="" name="TCP_6391" destination="Port6391" template="SapMDM" />
<action prefix="BODS|SAP_DS" name="ds_workload" class="com.sap.smd.wily.hostagent.action.RemoteOsCommandAction" period="1800000">
  <property name="commandkey" value="ds.cat" />
  <property name="longsid" value="DEM" />
</action>
Here 6391 is the port used by the NCS library; by default, Data Services uses this port to communicate with Solution Manager, although in some older Data Services releases 59818 may be the default. Choose a free port number on the Data Services server. The port number can be found in the ncs.conf file located in the Data Services installation directory, and the port number in ncs.conf and in SapAgentConfig.xml must be consistent.
DEM is the SID of your Data Services system in Solution Manager; please change it accordingly.
3. If you also want to do BI Monitoring, you should change the Data Services Application Settings in the CMC as follows: find "Settings" under BOE\CMC -> Application -> Data Services Application -> Settings, and change the value of History Retention Period to a small number, e.g. "1" or "2". The following picture shows the content:
For troubleshooting steps of a Data Services System please visit the Troubleshooting Diagnostics Configuration for Data Services 4.X Wiki page.