
Performance Tuning and Job Optimization in BusinessObjects Data Services

1)     In the Execution Properties, go to the Execution Options tab and increase the Monitor sample rate to reduce the number of calls to the operating system to write to the log file. A sample rate of 50,000 works well for large data volumes; the default value is 10,000.

2)    Select the option Collect statistics for optimization so that statistics are collected, including the number of rows and the width of each row. This option is not selected by default.

3)    We can improve the performance of data transformations by caching as much data as possible. Caching data in memory limits the number of times the system must access the database.
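The effect of caching lookup data can be illustrated outside Data Services with a small Python sketch (the table names, columns, and values below are invented for illustration):

```python
# Sketch: cache a lookup table in memory once, so each input row does not
# trigger a separate database round trip. All data here is made up.

def load_lookup_cache(rows):
    """Build an in-memory dict from one bulk read of the lookup table."""
    return {key: value for key, value in rows}

# Pretend these rows came from a single bulk query against the lookup table.
customer_rows = [(1, "ACME"), (2, "Globex"), (3, "Initech")]
cache = load_lookup_cache(customer_rows)

def enrich(order):
    """Resolve the customer name from the cache (no per-row DB call)."""
    order_id, customer_id = order
    return (order_id, cache.get(customer_id, "UNKNOWN"))

orders = [(100, 1), (101, 3), (102, 9)]
enriched = [enrich(o) for o in orders]
# enriched == [(100, "ACME"), (101, "Initech"), (102, "UNKNOWN")]
```

With the cache in place, three input rows cost one bulk read instead of three per-row lookups; the saving grows with the row count.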

4)    For tuning the performance of the source data, check:
a)    The ordering of the joins
b)    That the extracted data is minimized (filter as close to the source as possible)
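The second point can be sketched in Python: filtering before the join means the join only processes the rows it actually needs (the data below is invented):

```python
# Sketch: push the filter down before the join ("minimize extracted data")
# instead of joining everything and filtering afterwards. Data is invented.

sales = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 75},
]
regions = [
    {"region": "EU", "manager": "Ana"},
    {"region": "US", "manager": "Bo"},
]

# Filter first, so the join only sees the rows it needs.
eu_sales = [s for s in sales if s["region"] == "EU"]

managers = {r["region"]: r["manager"] for r in regions}
joined = [{**s, "manager": managers[s["region"]]} for s in eu_sales]
# Two rows are joined instead of three.
```

In Data Services the same idea applies when filters are pushed down to the source database rather than applied after extraction.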

5)    For tuning the performance of the target data, check:
a)    The Rows per commit value is set appropriately
b)    The best loading method is used
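Why Rows per commit matters can be shown with a minimal Python sketch; the commit callback here just counts calls, where a real loader would hit the database:

```python
# Sketch: commit in batches ("rows per commit") rather than once per row.
# commit() only counts calls here; a real loader would issue a DB commit.

def load(rows, rows_per_commit, commit):
    buffered = 0
    for _ in rows:
        buffered += 1
        if buffered == rows_per_commit:
            commit()           # one commit per full batch
            buffered = 0
    if buffered:
        commit()               # flush the final partial batch

commits = []
load(range(2500), rows_per_commit=1000, commit=lambda: commits.append(1))
# 2 full batches + 1 partial batch -> 3 commits instead of 2500
```

Raising the batch size trades commit overhead against transaction size; the right value depends on the target database.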

6)    The designed job can be performance tuned by doing the below checks:
a) Loading only changed data
b) Minimizing data type conversions
c) Minimizing locale conversions
d) Using appropriate precision in operations
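Point a) — loading only changed data — amounts to a delta load. A minimal Python sketch of the idea, with invented keys and values:

```python
# Sketch of a delta load: compare the source snapshot to what the target
# already holds and load only new or changed rows. Data is invented.

target = {1: "Alice", 2: "Bob"}                  # rows already loaded
source = {1: "Alice", 2: "Robert", 3: "Carol"}   # current source snapshot

# Keep only rows that are new or whose value changed.
delta = {k: v for k, v in source.items() if target.get(k) != v}

target.update(delta)  # only the delta is written, not the full table
# delta == {2: "Robert", 3: "Carol"}
```

In Data Services this corresponds to changed-data capture or a comparison against the target, so unchanged rows are never re-written.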

7)    In a Query transform, pass only those fields that are required by the next transform. Adding unnecessary fields slows down processing.
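The cost of carrying extra fields can be illustrated with a Python sketch of column projection (the column names and sizes are invented):

```python
# Sketch: project only the required columns early, so every downstream
# step moves less data per row. Column names and sizes are invented.

rows = [
    {"id": 1, "name": "A", "notes": "x" * 1000, "audit_blob": "y" * 5000},
    {"id": 2, "name": "B", "notes": "x" * 1000, "audit_blob": "y" * 5000},
]

needed = ("id", "name")  # only what the next step actually uses
projected = [{c: r[c] for c in needed} for r in rows]
# Each projected row drops ~6000 characters of unused payload.
```

Dropping the wide unused columns at the first Query transform means none of the later transforms pay for them.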

8)    Try to use a single Query transform to cover multiple business requirements and conditions.

For example, suppose a job uses two Query transforms to load data from source to target. If one Query transform performs a lookup function on one field and the next performs an aggregation such as SUM or AVG on another field, use a single Query transform and include both functionalities in the one query, so that the same records are not fetched twice.
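The combined transform from the example above can be sketched in Python as a single pass that does both the lookup and the aggregation (all data is invented):

```python
# Sketch: one pass over the records doing both the lookup and a running
# SUM aggregation, instead of reading the same records twice. Data is
# invented for illustration.

names = {1: "ACME", 2: "Globex"}   # lookup table, cached in memory
records = [(1, 100.0), (2, 50.0), (1, 25.0)]

totals = {}
for cust_id, amount in records:                    # single pass
    name = names.get(cust_id, "UNKNOWN")           # the lookup
    totals[name] = totals.get(name, 0.0) + amount  # the SUM aggregation
# totals == {"ACME": 125.0, "Globex": 50.0}
```

Two separate passes would read the three records twice; the combined version reads them once, which is exactly the saving the single-Query-transform design gives inside Data Services.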