
 


Document is under construction!

Purpose

This page describes the monitoring objects and mechanisms used to include non-SAP systems in Business Process Monitoring with SAP Solution Manager. It does not describe the general best practices and procedures for setting up and consuming Business Process Monitoring.

Overview and General Prerequisites

The goal of the monitors described in this page is to integrate business data from non-SAP-systems or non-ABAP-systems into Business Process Monitoring.

Most monitoring objects described in this page are so-called local runners. This means that even though they are logically assigned to a managed system, they are technically executed on the SAP Solution Manager and collect data from the managed system by means other than the standard collection mechanism.

 

As a prerequisite for this mechanism under SAP Solution Manager 7.2, local RFC connections have to be maintained in Solution Manager Setup for the Solution Manager itself. In particular, the TMW connection is required.

All of these monitors require a current ST-A/PI to be installed in SAP Solution Manager.

Generic Monitoring Objects

Generic Table Entry Counter for Remote DB ("NATABCNX")

Purpose

The “Generic Table Entry Counter for remote DB” provides the possibility to run generic database queries to count the number of entries (or the number of distinct values) in any remote database table. It supports a flexible filtering based on select-options on five fields of a table. Using ADBC (ABAP Database Connectivity) it can access a remote database table on an SAP-supported DBMS (database management system), using NativeSQL statements.

This monitoring object is a substitute or workaround for areas where no specific application monitor is available. Furthermore, this monitor can help to avoid implementing the application monitoring customer-exit, as long as the collector result can be calculated by a simple SELECT statement on one single table or view.

Technical Prerequisites

All SAP Basis (NetWeaver) releases from 7.00 onwards are supported. It is recommended to run the data collector directly on SAP Solution Manager for this purpose.

To monitor an external (non-ABAP) database via ADBC and a secondary database connection, you need to install the correct shared libraries for your database on SAP Solution Manager, and the corresponding secondary database connection must be maintained in transaction DBACOCKPIT.

Please see appendix “Remote Database Connectivity” for more details. 

Setup procedure in SAP Solution Manager 

In short: For the application monitoring, choose the monitoring object NATABCNX and specify the following selection criteria:

  • Database connection name and schema name
  • Table name
  • Field name for filtering and select-options for the actual filter values
    • This is available five times and is used to generate a WHERE clause
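Conceptually, the collector turns these parameters into a single SQL statement. The following Python sketch illustrates how five field/select-option pairs could be combined into a WHERE clause; the table and field names are invented, and this is not the actual ABAP collector logic:

```python
# Illustrative sketch only: combine up to five (field, conditions) pairs
# into a WHERE clause. Field and table names below are hypothetical.
def build_where(filters):
    """filters: list of (fieldname, list of SQL condition strings)."""
    parts = []
    for field, conditions in filters:
        if not conditions:
            continue
        # multiple values on the same field are combined with OR
        ors = " OR ".join(f"{field} {cond}" for cond in conditions)
        parts.append(f"( {ors} )")
    # conditions on different fields are combined with AND
    return "WHERE " + " AND ".join(parts) if parts else ""

sql = ("SELECT COUNT(*) FROM APP_SALES_ITEM "
       + build_where([("STATUS", ["= 'OPEN'"]),
                      ("DOC_DATE", [">= '20080901'"])]))
print(sql)
```

An empty filter list simply yields a statement without a WHERE clause, i.e. a count over the whole table.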

Detailed instructions:

  1. Start Solution Documentation from the Solution Manager Launchpad.
  2. Navigate to your Solution, Operations-Branch and the relevant process step or interface.
  3. Call the context menu and select New → Operations → Alerting
  4. Select the area "Cross Applications".
  5. Choose application monitor ‘Generic Table Entry Counter for remote DB’ and click on "Create". This will start the maintenance screen for this alerting object.


The first two parameters, "Database Connection" and "Database Schema", define the target database server (using a DBCON entry name) and the database schema for the remote database connection.

The preferred way to maintain these attributes is inside the System Landscape definition (transaction SMSY/LMDB). The business process step refers to a logical component, which itself points to a system that uses a database. On the maintenance screen of this database, enter the DBCON connection name and schema name into the free attributes "Remote DB Conn. for Appl.Mon." and "DB Schema for Application Data". This provides a central place to store this technical information, which is then valid for all application monitors using a remote database connection to this database. In this case it is not necessary to maintain the two monitoring object parameters "Database Connection" and "Database Schema" in the setup session. If you do maintain the parameters in the session, they override the SMSY/LMDB attributes and the monitoring uses your entered values instead.
For more information regarding the usage of a remote DB connection, please see appendixes “Remote Database Connectivity” and “Derive Connection Entry”.

As the third parameter, enter the "Table Name". The search help loads the available table names from the remote database. For performance reasons, make sure to enter a pattern (like APP* for all tables prefixed by APP) to restrict the selection. The table name is the only mandatory parameter, whereas the following filters are optional.

The optional fourth parameter called "Display Columns" allows restricting the scope of the detail display screen. Instead of a SELECT * only the specified column names will be loaded from the remote database table, to allow hiding some technical fields. For each field name you can add a descriptive field label text that will be used as column header in the detail display. Just postfix a double-dash (SQL comment "--") and the text as shown in this example:


Otherwise you would see the technical field names only, as the remote database dictionary access does not provide multi-lingual text labels as known from the SAP ABAP dictionary.

As the optional fifth parameter "Filter 1 Fieldname", enter the field name for the first filter, which obviously must be a valid field of the selected database table. A search help dialog is also shown here, which shows all fields of the selected table, based on the remote database dictionary.

In the sixth line, enter the selection criteria "Filter 1 Sel.options" for the first filter mentioned above, which represent the actual filter field values. When you have already entered the table name and the field name for filtering, the system tries to provide an entry help from the remote database table column content. However, this may be performance-intensive due to the selection of distinct values, so please use it with care. Note that all values must be entered in the native database format; for example, the date 2008/09/24 might be entered as 20080924, and timestamps are usually entered in 24-hour format without separators. Please see the appendix on how to implement a dynamic date filtering, using relative dates instead of absolute values.
Repeat these steps for further filters if required.
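As a simple illustration of such a dynamic date filter, a relative date can be computed at collection time and rendered in the native format used in the example above. This is a hedged Python sketch assuming the YYYYMMDD format; the actual BPMon variable mechanism is described in the appendix:

```python
# Sketch: compute a relative ("dynamic") date filter value in a native
# database format without separators (assumed here: YYYYMMDD).
from datetime import date, timedelta

def relative_date_value(days_back):
    return (date.today() - timedelta(days=days_back)).strftime("%Y%m%d")

# e.g. a filter "DOC_DATE >= <yesterday>" would use:
yesterday = relative_date_value(1)
print(yesterday)
```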

When you are finished with the setup of the header parameters, press button “Validation Check” to validate the input and perform a connection test.

Choose the key figure(s) to focus on in area "Key Figure Selection". There are three key figures available:

  • "Number of Counted Entries" (with given filter criteria)
  • "Number of Distinct Values" (on a single field name and with given filter criteria)
    1. Enter the parameter "Distinction Fieldname", that is the name of the table field for which the distinct values are to be counted. Please make use of the value help, which only displays valid fields of the selected table.
  • "Value of Single Field" (with or without aggregate and with given filter criteria)
    1. As additional customizing at key figure level, you have to enter the name of the single field whose value you want to read ("Selection Fieldname"). Ideally the data type of this field is an integer. Numeric fields with decimals will be rounded. For other types the result may be undefined, or you may even get a conversion error message in the alert text. Make sure that the WHERE clause defined by the filter fields selects one single row only; otherwise the result might not be unique, which will be indicated in the alert text.
    2. In addition you can optionally enter an "Aggregate Function". If the select returns multiple rows, one single integer value is returned due to aggregation; for example, MAX(ERRORCODE) would return the highest error code from the error table column. Please use the value help to select from the possible aggregate functions:
      1. Default: None
      2. MIN: Minimum value
      3. MAX: Maximum value
      4. AVG: Average value
      5. SUM: Sum of values
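How this key figure customizing maps to a query can be sketched as follows. This is illustrative Python with invented table and field names, not the actual collector code:

```python
# Sketch: translate the "Value of Single Field" customizing (field name,
# optional aggregate, optional filter) into a SELECT statement.
AGGREGATES = {"": "{f}", "MIN": "MIN({f})", "MAX": "MAX({f})",
              "AVG": "AVG({f})", "SUM": "SUM({f})"}

def single_value_query(table, field, aggregate="", where=""):
    select = AGGREGATES[aggregate].format(f=field)
    return f"SELECT {select} FROM {table}" + (f" WHERE {where}" if where else "")

# e.g. the MAX(ERRORCODE) example from the text (ERRORS is a made-up table):
print(single_value_query("ERRORS", "ERRORCODE", "MAX"))
```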

For each key figure, you can customize an "Alternative Alert Text", using certain placeholder variables.

  1. The standard key figure alert texts return a rather technical alert message:
    - "Number of entries in table <table> = <counter>" or
    - "Number of distinct values in field <field> of table <table> = <counter>"
    This is automatically proposed if you leave the parameter "Alternative Alerttext" empty.
  2. You can use any free text together with the following placeholders to overwrite the standard alert text with your own:
    - $COUNT returns the measured value from the selection
    - $TABLE returns the database table name
    - $FIELD returns the table's field name
  3. Example for our use-case in key figure "Number of Counted Entries":
    - "$COUNT unprocessed sales items"
  • For each key figure, the parameter "Detailcall Headertext" allows you to store a screen title for the detail display, to explain to the user what kind of output is shown.
  • For each key figure you can enter a threshold for a YELLOW alert and a threshold for a RED alert as absolute values. A four-step rating with pairs of minimum and maximum thresholds is also supported (RED >= YLW >= Value >= YLW >= RED).
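The four-step rating can be illustrated with the following sketch. It assumes inclusive thresholds; the exact boundary handling in BPMon may differ:

```python
# Sketch of a four-step rating (RED >= YLW >= value >= YLW >= RED):
# values inside the yellow band are GREEN, outside the red band are RED.
def rate(value, red_min, ylw_min, ylw_max, red_max):
    if value <= red_min or value >= red_max:
        return "RED"
    if value <= ylw_min or value >= ylw_max:
        return "YELLOW"
    return "GREEN"

# e.g. with thresholds red_min=0, ylw_min=2, ylw_max=8, red_max=10:
print(rate(5, 0, 2, 8, 10))
```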

Navigate to tab strip "Data Collection": enter the data for the scheduling of the collector as for any other alerting object in Business Process Monitoring.

Once you have entered and saved all relevant information for the monitoring objects you want to monitor, generate the monitoring customizing and activate the monitoring by clicking on button "Generate and Activate".

Limitations

This data collector must not be used to query the databases of SAP ABAP-based systems. For that purpose you can use the "Generic Table Entry Counter" (technical name BOTABCNT), built especially to support database tables defined in the SAP ABAP Data Dictionary. It uses OpenSQL, which is preferred over NativeSQL.

Because the data collector works on database tables which are not defined in the SAP ABAP Data Dictionary, there is no automatic support of certain functionality usually expected from an SAP system, such as

  • multi-lingual text labels for table columns,
  • complex search helps on check tables or domain fix values,
  • input and output conversion according to certain data types or the user's locale settings.

Search helps for defining filters are fed by selecting distinct field values from the corresponding table columns, which might be a performance-intensive query. All filter criteria are stored as character fields.

The data collector accesses the remote DB data dictionary to query the table meta data (like column’s data types) in order to map the native data types into SAP dictionary data types and ABAP data types, for example to allow a detail display with the list viewer. However, it cannot be guaranteed that all native DBMS-dependent data types can be properly matched into a suitable ABAP data type. Therefore the output of certain data might look different compared to the user interface of native database tools or especially compared to the respective non-ABAP application, which may perform its own output conversion. There might be restrictions especially for number types (like maximum amount of decimals) or date types (like other than expected formats for dates and timestamps).

The data collector can query any table that is accessible by the login user. Therefore it is the responsibility of the user to avoid frequent selections on large tables. Keep in mind that the necessary SQL selection aggregates for COUNT and DISTINCT create expensive database selects, including full table scans!

Using this application monitor without care can cause severe performance degradation on the managed system.

Please consider your setup carefully and follow these golden rules:

  • Avoid using this application monitor on large database tables
  • Avoid running the data collection with a high monitoring frequency
  • Avoid complex filtering, especially with patterns, ranges and exclusions
  • Avoid filters on fields which are not supported by an index.

If you are in doubt, try to simulate the query with the “Test Environment for Remote DB Query”, to get an idea about the possible runtime.

As the appropriate shared libraries have to be installed, only systems running on a database supported by SAP can be monitored using this alerting object.

Table Entry Counter for OData ("NATABCOD")

Purpose

The "Table Entry Counter for OData" provides the possibility to run dynamic OData queries to count the number of entries (or the number of distinct values, or a single value) in any OData entity. It supports flexible filtering based on select-options on five property fields of an entity and can access any configured and compatible OData service provider.
This monitor is a substitute or workaround for areas without a specific application monitor. It can avoid the need to implement the application monitoring customer-exit, as long as the collector result can be calculated by a simple query.
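As an illustration of the kind of request such a collector could issue, the standard OData system query options $filter and $count allow entries to be counted on the server side. The service root and entity set in this sketch are invented:

```python
# Sketch: build an OData count request URL using the standard $count
# resource and $filter system query option. Service root and entity set
# below are hypothetical examples, not a real SAP service.
from urllib.parse import quote

def odata_count_url(service_root, entity_set, filter_expr=None):
    url = f"{service_root}/{entity_set}/$count"
    if filter_expr:
        url += "?$filter=" + quote(filter_expr)
    return url

print(odata_count_url("https://host/sap/opu/odata/APP_SRV", "SalesItems",
                      "Status eq 'OPEN'"))
```

The collector would then issue an HTTP GET against this URL via the SM59 destination and interpret the returned integer as the measured value.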

Technical Prerequisites

The data collector is executed directly on SAP Solution Manager, so the ST-A/PI has to be implemented on the SAP Solution Manager system. The collector additionally requires a connection in SM59 pointing to the root URL of the OData service to be called.

Setup Procedure in SAP Solution Manager

Limitations


Dependent Data Collector for Push WS ("NAWSPUSH")

Purpose

This alerting object is intended to evaluate data from non-SAP systems which has been pushed into SAP Solution Manager using Web Service DSWP_BPM_PUSH_WS. The data collector does not read data directly from the managed system, and the data is not pushed by the Web Service directly into Business Process Monitoring. Instead, the Web Service stores the pushed measured values in a staging area, from where they are picked up by the data collector.
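The push flow can be sketched conceptually as follows. This is a simplified Python model of the staging idea, not the actual implementation:

```python
# Conceptual sketch: the Web Service writes measured values into a staging
# area keyed by identifiers; the dependent collector later picks them up.
staging = {}

def push(ident1, ident2, measured_value, info=""):
    # what DSWP_BPM_PUSH_WS conceptually does with a pushed value
    staging.setdefault((ident1, ident2), []).append((measured_value, info))

def collect(ident1, ident2):
    # what the NAWSPUSH collector conceptually does at its scheduled run
    return staging.pop((ident1, ident2), [])

push("WMS", "TRANSFER_ORDERS", 4711)
print(collect("WMS", "TRANSFER_ORDERS"))   # → [(4711, '')]
```

The identifiers maintained in the BPMon setup session play the role of the dictionary key here: they link the pushed values to the right monitoring object instance.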

Technical Prerequisites

The data collector is executed directly on SAP Solution Manager, so the ST-A/PI has to be implemented on the SAP Solution Manager system. The collector evaluates data received by Web Service “DSWP_BPM_PUSH_WS”. This Web Service is delivered as part of ST-SER 20101_1. Both ST-A/PI and ST-SER in the correct versions are part of SPS23, so this stack or a higher version has to be implemented.

Setup Procedure in SAP Solution Manager

To demonstrate the setup of the push scenario, we choose a fictitious non-SAP legacy warehouse management system (WMS) where the performance and the number of transfer orders are evaluated by some in-house developments once per day. This data should be forwarded to SAP Solution Manager to utilize Business Process Monitoring for alerting and reporting.

In short: Create appropriate bindings according to your security needs for the Web service (see SAP Note 1318499)

Set up the BPMon configuration for  application monitoring object NAWSPUSH, specifying the following selection criteria:

  • Identifiers to link the received data to the monitoring object instance

Detailed instructions to set up the transport binding:

  1. Start transaction SOAMANAGER.
  2. Navigate to Service Administration and choose Single Service Configuration.
  3. Search for the service using the internal name DSWP_BPM_PUSH_WS.
  4. Mark the displayed service line and select Apply Selection.
  5. Move to Configurations and choose Create Endpoint.
  6. Enter the desired data for communication security, authentication and transport settings.

Limitations

The Web service has to identify the relevant NAWSPUSH listeners for the received data. This causes some additional load on SAP Solution Manager, especially if many listeners have been created and a large amount of data is received frequently. Therefore it is the responsibility of the user to test the impact on the overall system load.

Please consider your setup carefully and follow these golden rules:

  • Avoid pushing too many data sets into the Web Service.
  • Avoid pushing data into SAP Solution Manager with a high frequency

No data is being stored in the staging area if there is no corresponding monitoring object active. 

Data Collector for Pull WS ("NAWSPULL")


Purpose

The monitoring object is intended to pull monitoring data from non-SAP systems via a Web Service interface.

Here, the actual scheduling is done in SAP Solution Manager, and a consumer proxy implemented in the data collector calls the Web Service to request the data.

Technical Prerequisites

First you have to create a Web Service that receives the relevant identifiers from SAP Solution Manager and determines from these identifiers for which day it should query the throughput. The actual details depend on the programming environment available on the managed system (programming language, web server used, etc.) and cannot be covered here in detail. Instead, we provide a typical example where we create the Web Service using Visual Studio.

You have to create a new web site that can be invoked by SAP Solution Manager. The coding has to declare the appropriate communication structures used in the data exchange with the Solution Manager system. Mainly we need structures to receive the identifiers and to return the appropriate measured value and alert context. In VB.NET, the coding could look similar to:

 

Public Structure BPM_TRANSFER
    Dim Ident1 As String
    Dim Ident2 As String
    Dim MeasValue As Integer
    Dim Object1 As String
    Dim Object2 As String
    Dim Object3 As String
    Dim Info1 As String
    Dim Info2 As String
    Dim Info3 As String
    Dim Info4 As String
End Structure

Public Structure BPM_RETURN
    Dim Type As Char
    Dim Message As String
End Structure

Public Structure BPM_PULL
    Dim Ident1 As String
    Dim Ident2 As String
End Structure

Public Structure BPM_PULL_RESPONSE
    Dim DATA As BPM_TRANSFER
    Dim RETURN1 As BPM_RETURN
End Structure

Listing 1: Example for definition of the communication structures

 

Here, the public structure BPM_PULL_RESPONSE returns the actual alert data and potential error messages as defined by BPM_TRANSFER and BPM_RETURN, while structure BPM_PULL receives the trigger information as set up in the BPMon setup session, e.g. the two identifiers maintained there.

Note:

It is recommended to use the same field names in the structure definitions of your foreign application as in the parameter definitions for the BAdI interface in SAP Solution Manager, as this allows you to transfer the data in the BAdI implementation using MOVE-CORRESPONDING.

To be available as a Web Service, the project has to use the appropriate .NET classes and declare a public method that can be invoked by SAP Solution Manager. In VB.NET, the coding could look like Listing 2:

Imports System.Web
Imports System.Web.Services
Imports System.Web.Services.Protocols
<WebService(Namespace:="Your Namespace")> _
<WebServiceBinding(ConformsTo:=WsiProfiles.BasicProfile1_1)> _
<Global.Microsoft.VisualBasic.CompilerServices.DesignerGenerated()> _
Public Class BPM_PULL_WS
    Inherits System.Web.Services.WebService
 
    <WebMethod()> _
    Public Function Example_WS(ByVal Input As BPM_PULL, ByRef Response As BPM_PULL_RESPONSE) As String
 …
<Your code to evaluate Input (e.g. the two identifiers provided), trigger the appropriate selections and return the result in Response.>

    End Function
End Class

Listing 2: Example for a non-ABAP Web Service interface to SAP Solution Manager

Note:

This is just an example for a specific language. It is within the responsibility of your developers to create the implementation in your environment and preferred programming language. SAP does not support the implementation of the Web Service.

Once you have deployed the project it is available in your Web Server and you can implement the invocation from SAP Solution Manager via the provided WSDL-document.

Setup Procedure in SAP Solution Manager

To demonstrate the setup of the pull Web service, we choose a simple fictitious non-SAP warehouse management system where information about the transfer orders is stored in a simple table (see Picture 43).

The table logs the transfer order number together with the corresponding material movement information, such as quantities, storage location and execution date.

Now we want to measure the throughput of transfer orders using this table.
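The throughput determination the Web Service would perform can be sketched with an in-memory SQLite table standing in for the warehouse database. The table and column names are invented for illustration:

```python
# Sketch: count one day's transfer orders from a hypothetical warehouse
# table, using in-memory SQLite as a stand-in for the real database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TRANSFER_ORDERS (TO_NUMBER TEXT, EXEC_DATE TEXT)")
conn.executemany("INSERT INTO TRANSFER_ORDERS VALUES (?, ?)",
                 [("TO1", "20080924"), ("TO2", "20080924"),
                  ("TO3", "20080925")])

day = "20080924"
throughput = conn.execute(
    "SELECT COUNT(*) FROM TRANSFER_ORDERS WHERE EXEC_DATE = ?", (day,)
).fetchone()[0]
print(throughput)   # → 2
```

In the pull scenario, the Web Service would run such a selection when invoked and return the count in the MeasValue field of BPM_TRANSFER.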

Limitations

The data collector can query any information object that is accessible via the exposed Web Service in the managed system. Therefore it is the responsibility of the user to avoid frequent invocations of the Web Service performing selections on huge tables and returning large amounts of data. It is also the responsibility of the Web Service developer to ensure good performance.

Please consider your setup carefully and follow these golden rules:

  • Avoid creating web services to query large data sets without a good access path.
  • Avoid running the data collection with a high monitoring frequency

If you are in doubt, try to simulate the query in the managed system to get an idea about the possible amount of data evaluated. Furthermore, monitor the data collection runtime.

Generic Infocube/DSO Reader for Remote BW (NABWICNX)

Purpose

Technical Prerequisites

Setup Procedure in SAP Solution Manager

Limitations

Specific Monitoring Objects

Based on the ADBC-framework described you can  

Further possibilities to include non-SAP systems

Smart Data Integration / Smart Data Access

 

Appendix: Test Environment for remote DB Queries 

Purpose

The “Test Environment for Remote DB Query” provides the possibility to run generic database queries to any database table of a remote database. It supports a flexible filtering based on select-options on five fields of a table. Using ADBC (ABAP Database Connectivity) it can access a remote database table on an SAP-supported DBMS (database management system).

The “Test Environment” is part of the Business Process Monitoring framework within SAP Solution Manager. It can be used to prepare queries for the “Generic Table Entry Counter for Remote DB” (monitoring object NATABCNX) or for generic monitoring objects with free SQL queries towards Remote Databases.

Technical Prerequisites

The Test Environment is delivered with the add-on ST-A/PI starting with release 01M. It can be started by running program /SSA/ENA with transaction SA38.

All SAP Basis (NetWeaver) releases from 7.00 onwards are supported.

To monitor an external (non-ABAP) database using ADBC, the corresponding secondary database connection must be maintained with transaction DBACOCKPIT. Please see section “Remote Database Connectivity” for more details.

This Test Environment shall not be used to query the databases of SAP ABAP-based systems. For that purpose you can, for example, use transaction SE16, built especially to support database tables defined in the SAP ABAP Data Dictionary.

Limitations

Because the Test Environment queries database tables which are not defined in the SAP ABAP Data Dictionary, there is no automatic support of certain built-in functionality, such as

  • multi-lingual text labels for table columns,
  • complex search helps on check tables or domain fix values,
  • input and output conversion according to certain data types or the user's locale settings.

 

Search helps for defining filters are fed by selecting distinct field values from the corresponding table columns, which might be a performance-intensive query. On the selection screen, all filter criteria are stored as character fields, so implicit conversions may occur, e.g. for number types or dates.

 

The Test Environment accesses the remote DB data dictionary to query the table meta data like column’s data types, in order to map the native data types into SAP dictionary data types and ABAP data types, for example to allow a detail display with the list viewer. However, it cannot be guaranteed that all native DBMS-dependent data types can be properly matched into a suitable ABAP data type. Therefore the output of certain data might look different compared to the user interface of native database tools or other non-ABAP applications. There might be restrictions especially for number types (like maximum amount of decimals) or date types (like other than expected formats for dates and timestamps).

The Test Environment can query any table that is accessible by the login user. Therefore it is the responsibility of the user to avoid frequent selections on large tables, which might result in a performance degradation of the database. Keep in mind that the necessary SQL selection aggregates for COUNT and DISTINCT create expensive database selects, including full table scans!

Using the Test Environment without care can cause severe performance degradation on the monitored system.

Please consider your selection criteria carefully and follow these golden rules:

  • Avoid using queries on large database tables
  • Avoid running the queries as a background job with a high frequency
  • Avoid complex filtering, especially with patterns, ranges and exclusions
  • Avoid filters on fields which are not supported by an index.

Main Selection Screen

The main selection screen of the program consists of four areas:

  1. "Remote Database Table" to define the connection target
  2. "Processing Mode" to choose the type of database query
  3. "Filter Criteria for Where-Clause" to define selection filter fields and values
  4. "Display Parameters" to set some special output options

 

Picture 88: Main selection screen for test environment

Processing Mode

The effect and usage of the different processing modes are explained in more detail further below. The settings for "Remote Database Table" and "Filter Criteria for Where-Clause" are identical for all processing modes. However, the available "Display Parameters" may differ per processing mode.

Remote Database Table

To identify the remote database table, you need to enter three attributes:

  • The "Database Connection Name" is the name of a connection entry as defined in transaction DBACOCKPIT (stored in table DBCON). Please see appendix "Remote Database Connectivity" for more details. The search help shows the available entries. By pressing the test button, you can perform a connection test to the remote database.
  • As "Database Schema Name", enter the name of the schema the table belongs to. The search help queries the remote database for available schemas. You can also press the corresponding button to search for the default schema (an attribute in transaction DBACOCKPIT) and take it as a proposal.
  • In field "Table Name", enter the name of the remote database table. You can use tables, views and synonyms. The search help queries the remote database for available tables in the given schema. Please avoid running this query for all tables. If you know the first characters of the table name, you can enter them as a pattern to restrict the search help result.

Picture 89: Search help for table names

 

Picture 90: Search help for table name


By pressing the corresponding button, you can view the table's meta data from the remote database dictionary, such as table columns, keys, indexes, or view definitions.

Filter Criteria for Where-Clause

You can use up to five pairs of field name and select-options, which are used to create a where-clause for your select query. Please note that the selection screen is organized like a standard ABAP report, making use of multiple select-options to support ranges and patterns. However, at runtime this is converted into an ANSI SQL-conform where-clause.

 

In the first column, enter the table field name. You can use the search help to show a popup with the table’s column meta-data, including information about key fields and data types.

 

 

Picture 91: Search help for field names

 

The columns “Field type”, “DB length”, “Precision” and “Scale” refer to the native data type in the remote database. The columns “DTyp”, “Length” and “Decimals” show the corresponding data type as it would be used in an SAP ABAP Dictionary. If the “DTyp” is empty, the native data type cannot be mapped into a suitable SAP Dictionary type.

 

In the from/to columns on the right-hand side, enter the actual selection criteria. As mentioned, this can be entered using the known multiple-selection screen, with all available select-options, such as single and multiple values, patterns with wildcards, and ranges with intervals, each combined either as inclusions or exclusions. Double-click on the filter value field, or press the multiple-selection button, to access the multiple-selection screen.

All filter values are stored in a character format and should be entered according to the internal format of the native data type.

 

An example for a filter on all material types except those starting with prefix ‘Access’ is shown in Picture 92.

 

Picture 92: Example for a filter


At runtime it gets converted into the following where-clause:

WHERE ( MATERIAL_TYPE NOT LIKE 'Access%' )

 

Multiple filter values on the same field name are connected with a logical ‘OR’ and filter values on different field names are connected with a logical ‘AND’.
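The conversion of a single select-option row into an ANSI SQL condition can be sketched as follows. The mapping below uses assumed semantics and is simplified to three of the possible options:

```python
# Sketch (assumed semantics): convert one ABAP-style select-option row
# (SIGN, OPTION, LOW, HIGH) into an ANSI SQL condition.
def to_condition(field, sign, option, low, high=None):
    if option == "EQ":                        # single value
        cond = f"{field} = '{low}'"
    elif option == "BT":                      # interval
        cond = f"{field} BETWEEN '{low}' AND '{high}'"
    elif option == "CP":                      # pattern: * becomes %
        cond = f"{field} LIKE '{low.replace('*', '%')}'"
    else:
        raise ValueError(option)
    if sign == "E":                           # exclusion
        cond = (cond.replace(" LIKE ", " NOT LIKE ")
                if option == "CP" else f"NOT ( {cond} )")
    return f"( {cond} )"

print(to_condition("MATERIAL_TYPE", "E", "CP", "Access*"))
# → ( MATERIAL_TYPE NOT LIKE 'Access%' )
```

For the exclusion pattern of Picture 92, this reproduces the where-clause shown above.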

 

Please see appendix on how to define variable filters for dynamic date selections.

Display Parameters

At “Free Text as Display Header” you can enter any text that will be shown as a header line in the list output. This is helpful when storing the selection screen as a variant for a later re-use.

 

The parameter “Distinct/Single Field Name” is needed for the following use-cases:

  • For processing mode "Count number of distinct values" you have to enter the name of the table column on which you want to apply the COUNT DISTINCT aggregate.
  • For processing mode "Get single field value" you have to enter the name of the table column from which you want to select a single field value, in order to convert it to an integer number.
  • For processing mode "Display Table Content" you can optionally enter the name of a table column on which you want to apply the SELECT DISTINCT aggregate.

 

The parameter “Maximum number of Rows” is available for all selections, which do not perform a simple count, but rather load table entries for a list display, e.g. at “Display Table Content”. The default number is set to 1,000. Avoid increasing it to more than 10,000 rows for performance reasons on the remote database as well as on the SAP GUI Frontend.

 

The parameter "Display Columns" is available for all selections which do not perform a simple count, but rather load table entries for a list display. If you do not enter any values in this parameter, all columns are shown that can be mapped from the native data type into an ABAP data type. Using the multiple-selection, you can maintain a list of column names for a restricted display. Using the corresponding push button, you can easily pre-fill the multiple-selection with all available column names. Furthermore you can define column headings, if you postfix a double-dash plus the heading text to the column name as shown in Picture 93.

 

 

Picture 93: Example for display field selection


This will lead to a display as shown in Picture 94.

 

 

Picture 94: Resulting display screen

 

The parameter “Order Columns” is also available for all selections that do not perform a simple count but rather load table entries for a list display. It allows you to sort the database selection and screen output by one or more columns, e.g. ORDER BY "MATERIAL_NAME" , "SALES_UOM". If you leave this parameter empty, you get the default sorting, usually by primary key.

Processing Mode “Count number of entries”

This processing mode does a simple SELECT COUNT(*) query to the target table with the given filter criteria. The result is an integer value.

 

You can use this query to simulate the calculation of the “alert measured value” of the BPMon Data Collector, e.g. for monitoring object “Generic Table Entry Counter” (NATABCNX, 1st key figure).

 

Example query:

SELECT COUNT(*)
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )

 

 

Picture 95: Example for "Count number of entries"

Processing Mode “Count number of distinct values”

This processing mode does a query of style SELECT COUNT( DISTINCT <fieldname> ) on the target table with the given filter criteria. You need to enter the <fieldname> in parameter “Distinct/Single Field Name”. The result is an integer value.

 

You can use this query to simulate the calculation of the “alert measured value” of the BPMon Data Collector, e.g. for monitoring object “Generic Table Entry Counter” (NATABCNX, 2nd key figure).

 

Example query:

SELECT COUNT( DISTINCT MATERIAL_TYPE )
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )

 

 

Picture 96: Example for "Count number of distinct values"

Processing Mode “Get single field value”

This processing mode does a query of style SELECT SINGLE <fieldname> on the target table with the given filter criteria. You need to enter the <fieldname> in parameter “Distinct/Single Field Name”. It needs to be an integer data type, or at least some numeric data type that can be converted into an integer by rounding. The final result will always be an integer value. You should make sure that the filter criteria restrict the selection to one single row of the table; otherwise you get the first row that matches the filter criteria, and the result may not be accurate or unique.

 

Optionally you can apply one of the following aggregates:

- MIN = Minimum value

- MAX = Maximum value

- AVG = Average value

- SUM = Sum of values

This would change the query to the style SELECT <aggregate>(<fieldname>). Here too, the result is a (rounded) integer value.

 

You can use this query to simulate the calculation of the “alert measured value” of the BPMon Data Collector, e.g. for monitoring object “Generic Table Entry Counter” (NATABCNX, 3rd key figure).

 

Example query without aggregate:

SELECT MATERIAL_NUMBER
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )

 

 

Picture 97: Example for “Get single field value”

 

Example query with aggregate:

SELECT MAX(MATERIAL_NUMBER)
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )

 

 

Picture 98: Example for "Get single field value" with aggregation

Processing Mode “Display Table Content”

This processing mode does a query of style SELECT <fieldlist> or SELECT * on the target table with the given filter criteria, depending on whether or not you have maintained the parameter “Display Columns”.

 

As a variant, you can enter a <fieldname> in parameter “Distinct/Single Field Name”. This would change the query to the style SELECT DISTINCT <fieldname> , COUNT(*) ... GROUP BY <fieldname>.

 

The result of the queries is displayed as a list of found rows.

 

You can use this query to simulate the “Detail Info” of the BPMon Data Collector, e.g. for monitoring object “Generic Table Entry Counter” (NATABCNX).

 

Example query without distinct field:

SELECT "MATERIAL_NUMBER" , "MATERIAL_NAME" , "BASE_UOM" ,
       "SALES_UOM" , "MATERIAL_TYPE"
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )
 ORDER BY "MATERIAL_NAME" , "SALES_UOM"

 

 

Picture 99: Example for display of table content

 

Example query using the distinct field MATERIAL_TYPE for grouping:

SELECT DISTINCT MATERIAL_TYPE , COUNT(*)
  FROM APP_MATERIAL
 WHERE ( APP_MATERIAL.MATERIAL_TYPE NOT LIKE 'Access%' )
 GROUP BY MATERIAL_TYPE
 ORDER BY MATERIAL_TYPE

 

 

Picture 100: Example using grouping

 

Processing Mode “Compare Table with second connection”

This processing mode is an experimental feature. It allows a comparison of two tables that reside on different databases, using data fetches from two connections. The table design needs to be identical, i.e. both tables must have the same column names with the same data types. The names of the tables may differ, though.

You can enter two triples of “Database Connection Name”, “Database Schema Name” and “Table Name”, to identify the two target tables.

 

There are three additional parameters you can use for this processing mode:

  1. “Key field(s)” allows entering the number of key fields. The key is important to match identical rows from the two tables. If the tables have more than one primary key field (such as order number plus item number), enter that number; the first X columns from the metadata are then treated as a combined key.
  2. “Hide equal rows” suppresses the display of all fully identical rows.
  3. “Hide equal columns” suppresses the display of all fully identical columns.

 

In the list output you see the comparison result with the following structure:

- On the left-hand side, the table columns of the first table

- On the right-hand side, the table columns of the second table

- In the middle, a column called “Compare” that shows an icon for the comparison result:
  - the entry (with the same key(s)) exists in both tables and has identical content
  - the entry (with the same key(s)) exists in both tables but has different content
  - the entry is present in the left-hand table but is missing in the right-hand table
  - the entry is present in the right-hand table but is missing in the left-hand table
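The matching logic behind these four comparison results can be sketched as follows. This is an illustrative simplification under the assumption that rows are matched on the first X columns as a combined key; it is not the actual implementation.

```python
# Match rows of two identically structured tables by the first n_keys
# columns, then classify each key as one of the four comparison results
# described above. Table contents below are invented sample data.

def compare_tables(left, right, n_keys):
    """left/right: lists of tuples with identical column layout."""
    lmap = {row[:n_keys]: row for row in left}
    rmap = {row[:n_keys]: row for row in right}
    result = {}
    for key in lmap.keys() | rmap.keys():
        if key not in rmap:
            result[key] = "left only"      # missing in right-hand table
        elif key not in lmap:
            result[key] = "right only"     # missing in left-hand table
        elif lmap[key] == rmap[key]:
            result[key] = "identical"      # same key, same content
        else:
            result[key] = "different"      # same key, different content
    return result

left  = [("0001", "10", "5 PC"), ("0001", "20", "1 PC")]
right = [("0001", "10", "5 PC"), ("0002", "10", "2 PC")]
result = compare_tables(left, right, n_keys=2)
for key in sorted(result):
    print(key, "->", result[key])
```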

 

 

Picture 101: Sample output without options “Hide equal rows” and “Hide equal columns”

 

 

Picture 102: Sample output with option “Hide equal rows”

 

 

Picture 103: Sample output with option “Hide equal columns”

 

 

 

Picture 104: Sample output with both options “Hide equal rows” and “Hide equal columns”

 

By clicking the detail button, you get a side-by-side view to compare the single table columns. Columns with different field content are highlighted in red.

 

Picture 105: Example for side-by-side view

Processing Mode “Free Select Query”

The actual SQL Select Statement is assembled automatically for all the previously described processing modes, to provide suitable query syntax for the specific use-case. Although this is very convenient and easy to use, one important restriction is that you cannot query multiple tables within one selection (unless you enter a view as table name, which internally joins several tables). With mode “Free Select Query” you can enter your own SQL Select Statement, for example to define a JOIN of several database tables, or to allow sub-queries.

 

An example for a simple selection of the sales order items (table APP_SALES_ORDER_ITEM), using processing mode “Display Table Content”, is shown in Picture 106.

 

Picture 106: Example for table contents

 

Now we would like to see the material description instead of just the material number. So we need to join in the field MATERIAL_NAME from table APP_MATERIAL.

SELECT APP_SALES_ORDER_ITEM.ORDER_NUMBER /*Order*/,
       APP_SALES_ORDER_ITEM.ITEM_NUMBER /*Item*/,
       APP_SALES_ORDER_ITEM.QUANTITY /*Quantity*/,
       APP_SALES_ORDER_ITEM.MATERIAL /*Mat.No.*/,
       APP_MATERIAL.MATERIAL_NAME /*Material*/,
       APP_SALES_ORDER_ITEM.ITEM_STATUS /*Status*/
  FROM APP.APP_SALES_ORDER_ITEM
  JOIN APP.APP_MATERIAL
    ON APP_MATERIAL.MATERIAL_NUMBER = APP_SALES_ORDER_ITEM.MATERIAL

 

Choose processing mode “Free Select Query” and press the editor button to start an editor for the SQL query, which is shown in Picture 107.

 

 

Picture 107: Example for a free SQL entry

 

Press the save button to return to the selection screen.

 

For your statement to be interpreted correctly, you need to follow a couple of rules:

  1. All SQL keywords (such as SELECT, FROM, JOIN, WHERE) must be in uppercase.
  2. The SQL statement must start with a SELECT keyword.
  3. The SELECT-clause must either be an aggregate that returns an integer value, for example SELECT COUNT(*) or SELECT COUNT( DISTINCT... ), or must contain a comma-separated field list that is subject to a list display, such as SELECT table1.field1, table2.field2, and so on. In the latter case, please note that you have to use qualified field names (table designator plus dot plus column designator). This is important because for each field the column meta-data needs to be read per table. Aliases are not allowed. For each selection field you can add a text label, enclosed in SQL in-line comments (/*Label*/). This text label will be used as column header in the list display. Without it, you would get the technical table and field name only.
  4. In the FROM-clause you can add any standard or native SQL syntax that can be interpreted by the remote database. This allows using sub-queries, joins, and so on. There are no special checks done by the test environment. However, you should work with qualified table names (prefixed by the schema designator). You can also use the pseudo comment /*SCHEMA*/ which will be replaced by the schema name from the selection screen during runtime.
     Example:
       FROM /*SCHEMA*/APP_SALES_ORDER_ITEM
       JOIN /*SCHEMA*/APP_MATERIAL
         ON APP_MATERIAL.MATERIAL_NUMBER = APP_SALES_ORDER_ITEM.MATERIAL
  5. In the WHERE-clause you can add the fixed part of your filters. Here too, you should work with table-qualified field names. Additional filters as maintained on the selection screen will be added to the WHERE-clause as well. This is handy for defining variable filters like dynamic dates (see appendix).
  6. Last but not least, you can add further clauses like GROUP BY and ORDER BY.
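To make the /*Label*/ convention concrete, here is a sketch of how a field list with in-line comment labels could be parsed into (field, header) pairs. The regular expression and function are illustrative assumptions, not the actual framework code.

```python
import re

# Parse a comma-separated SELECT field list where each qualified field may
# carry an optional /*Label*/ in-line comment used as its column header.

def parse_select_list(select_clause):
    """Return a list of (qualified_field, label_or_None) pairs."""
    fields = []
    for part in select_clause.split(","):
        m = re.match(r"\s*([\w.]+)\s*(?:/\*(.*?)\*/)?\s*$", part)
        if m:
            fields.append((m.group(1), m.group(2)))
    return fields

print(parse_select_list(
    "APP_SALES_ORDER_ITEM.ORDER_NUMBER /*Order*/, "
    "APP_MATERIAL.MATERIAL_NAME /*Material*/"
))
# [('APP_SALES_ORDER_ITEM.ORDER_NUMBER', 'Order'), ('APP_MATERIAL.MATERIAL_NAME', 'Material')]
```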

Further Diagnostics Features

Connection Test

A connection test is useful to check whether the remote database can be reached using the technical settings of the DBCON entry. It may happen that the connection cannot be established from all application servers of the calling system, e.g. when the DBSL or DB Client is not available. Please see appendix “Remote Database Connectivity” for more details.

 

There are two options for a connection test.

 

Single Connection Test

You can press the connection-test button (next to the parameter “Database Connection Name”) to perform a single connection test to the entered remote database.

 

Picture 108: Testing a DB connection

 

This will perform a connection test from the current application server to the remote database (as defined by the DBCON entry):

 

Picture 109: Result of a connection test

 

Mark a line and press the “Details” button to see a popup with all details (Picture 110).

 

 

Picture 110: Details for a connection test

 

- “Database Connection Name” is the name of the DBCON entry

- “Remote Database System” is the type of DBMS for this connection

- “Application Server RFC Destination” is the name of the internal RFC destination

- “Application Server Host Name” is the host name of the tested application server

- “App.Server Local Operating System” is the type of the local OS on the application server

- “App.Server Local Database System” is the type of local DBMS (primary connection)

- “Available DBMS-DBSL on App.Server” shows for which DBMS a DBSL is available

- “Ping Time (seconds)” is the pure remote database connection test time

- “Total Time with RFC (seconds)” is the total test time that may include internal RFCs

- “Ping Status” is the result of the connection test

- “Error Message” would show the message text in case of failures

 

Mass Connection Test

You can press the mass-test button (on top of the selection screen) to perform a mass connection test for several DBCON entries, and even via several application servers.

 

First you have to fill out an additional selection screen (Picture 111) that will pop up.

 

Picture 111: Mass connection test

 

- The parameter “Database Connection” allows a multiple-selection of several DBCON entries, including patterns and ranges.

- If the checkbox “Test via all Application Servers” is initial, the connection attempt is made from the application server you are currently logged on to. If you activate the checkbox, internal RFC calls are made to each application server to determine from which application server the database connection can be established successfully.

- By increasing the “Number of repetitive Pings” you can force more than one connection test to the same remote database.

 

Example for a mass connection test with three DBCON entries on the current application server:

 

Picture 112: Example 1 for a mass connection test

 

Example for a mass connection test with three DBCON entries on all application servers:

 

Picture 113: Example 2 for a mass connection test

- As you can see, there are two application servers with host names lu0099 and vmw3193.

- The connections ZTN_DB6 and ZTN_DB6_2 could be tested successfully via application server lu0099, but failed to connect via application server vmw3193.

- The connection ZTN_DB6_COPY failed to connect via either application server.

 

Troubleshooting:

In the above example, although each application server has a DBSL available for the DBMS “DB6”, on the second server we always get a “DBI error”. This indicates a problem detected by SAP’s Database Interface while establishing the secondary database connection.

In brackets you see the number of the work process and the application server name. This is helpful when searching for the correct work process trace to find further information on the root cause of the problem.

An extract from the corresponding work process trace file "dev_w2" is shown in Picture 114.

 

 

Picture 114: Example for a work process trace file

To connect to the secondary database, the Database Interface needs to load a certain DB2 library, called “db2app64.dll”, which is not available on this application server. So the installation of the DB Client seems to be incomplete.

Display Work Process Trace

The standard way to display a work process trace file is to use transaction SM50, mark the work process number, and then press the “Display File” button. Furthermore, you would typically restrict the scope of the display by choosing “Display Components” and marking only “Database” and “Database (DBSL)” as trace components.

 

You can reach that easily from the Test Environment, given that you are logged on to the same application server.

Press the corresponding button to switch on the advanced options.

Then press the next button to call an additional selection screen (see Picture 115), where you choose the work process number and the application server name.

 

Picture 115: Selecting the work process to be displayed

This will load the corresponding work process trace file, restricted to the interesting trace components “Database” and “Database (DBSL)”. Scroll down to the very bottom of the file to see the most recent entries. You can also search by the name of the DBCON entry.

Remote Database Dictionary Management

For processing remote database data correctly, such as type-conform list output, the framework needs to load certain dictionary information. This includes schema names, table names, table columns, key fields, primary and secondary indexes, and view definitions. To speed up processing, this kind of metadata is buffered in the cross-transaction application buffers of the shared memory on the application server. Assuming that the remote data dictionary does not change often, the buffer is kept valid for 24 hours. After that, the next dictionary query reads the requested remote database dictionary information again to refresh the buffer.
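The 24-hour buffering behavior can be sketched as a simple time-to-live cache. This is a minimal illustration of the idea only; the class and loader below are hypothetical, and the real framework stores this metadata in shared-memory application buffers rather than in a Python object.

```python
import time

TTL_SECONDS = 24 * 60 * 60  # dictionary buffer validity of 24 hours

class DictionaryBuffer:
    def __init__(self, loader):
        self._loader = loader   # callable fetching metadata from the remote DB
        self._cache = {}        # key -> (last_update_timestamp, metadata)

    def get(self, key):
        entry = self._cache.get(key)
        if entry and time.time() - entry[0] < TTL_SECONDS:
            return entry[1]                   # still valid: serve from buffer
        meta = self._loader(key)              # expired or missing: refresh
        self._cache[key] = (time.time(), meta)
        return meta

    def clear(self):
        self._cache.clear()                   # force fresh reads on next access

    def evict_outdated(self):
        now = time.time()                     # drop only entries older than TTL
        self._cache = {k: v for k, v in self._cache.items()
                       if now - v[0] < TTL_SECONDS}

calls = []
buf = DictionaryBuffer(lambda key: calls.append(key) or {"table": key})
buf.get("APP_MATERIAL")   # first access: loads from the "remote DB"
buf.get("APP_MATERIAL")   # second access: served from the buffer
print(len(calls))         # 1 -> the loader was called only once
```

The `clear` and `evict_outdated` methods mirror the buffer maintenance functions described below for clearing the entire buffer or removing only outdated entries.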

 

Display Remote Database Dictionary for one table

To display the remote database dictionary information for one single table or view, simply press the button next to the parameter “Table Name”. The display may include more information (see Picture 116) than is already available in the buffer from previous queries, so additional dictionary accesses might be needed to load all information types into the buffer.

 

 

Picture 116: Remote dictionary information for a table

 

Double-click on column “Buffer Content” to get a popup with additional information, for example a list of all table columns, key fields, or index fields (see Picture 117).

 

 

Picture 117: List of field names for a remote table

 

In case of a view, double-click on the “Buffer Content” of type “View Definition” to display the definition of the view (see Picture 118) as stored in the remote database.

 

Picture 118: Displaying the definition of a view

 

Display the entire Remote Database Dictionary Buffer

In order to display the entire list of buffered remote database dictionary information, first press the corresponding button to switch on the advanced options.

Then press the list button to show all buffered dictionary information, as in Picture 119.

 

 

Picture 119: List of buffered entries

 

The column “Last Update Timestamp” shows when the dictionary buffer was last updated with the actual remote dictionary information.

 

Clear the entire Remote Database Dictionary Buffer

If you are aware of dictionary changes in the remote database that have not yet reached the dictionary buffer, you can press the clear-buffer button. This clears the entire Remote Database Dictionary Buffer; each first request per dictionary object then goes directly to the remote database to query the most recent dictionary information and updates the buffer for subsequent requests.

 

Remove all outdated entries from the Remote Database Dictionary Buffer

If you just want to clear the remote database buffer for all outdated objects, that is, information older than 24 hours, press the corresponding button. All newer information stays in the buffer. Removed buffer entries force a new remote dictionary access upon the next request anyway.

 

 
