
Q. After starting an InfoPackage, I can't see any job running in SM37. How can this possibly be?

A. This Q&A was originally discussed in this forum thread.

In the source system, the extraction job has the name BI*<request id>. If the DataSource is in SAP ERP, you can look for this job in your ERP system and check the status of the extraction. If you're moving data within BW itself (e.g. loading data from a DSO into a datamart InfoCube), the extraction job runs in BW itself.
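As a quick illustration of the naming convention above, the sketch below filters a job list (as you might export it from SM37) down to the extraction job for a given request. This is a minimal, hypothetical example: the job names and request ID are made up, and only the `BI*<request id>` pattern comes from this page.

```python
import fnmatch

def extraction_job_names(job_names, request_id):
    """Return the jobs matching the BI*<request id> naming convention."""
    # Pattern per this page: 'BI', anything, then the request ID.
    pattern = f"BI*{request_id}"
    return [name for name in job_names if fnmatch.fnmatch(name, pattern)]

# Hypothetical SM37 job list; only the first entry belongs to request ABC123.
jobs = ["BIREQU_ABC123", "BI_PROCESS_LOADING", "RSBATCH_CLEANUP"]
print(extraction_job_names(jobs, "ABC123"))  # → ['BIREQU_ABC123']
```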

Once the extraction job has completed, the data update in the BW system is done through a dialog process, which you can only monitor in SM50. You won't see any jobs related to the update in SM37.

In the Scheduling tab of an InfoPackage, you can choose between two options:

  • Start Data Load Immediately - The request is created in the foreground using a dialog process;
  • Start Later in Batch - The request is created using a background job.

However, this only refers to the creation of the request. Even if you choose the second option, the completion of this background job doesn't mean that the load has completed as well; it only means that the request was created successfully.

Only if you check the option Request Batch Process Runs Until All Data Has Been Updated in BW will this job keep running in SM37 until the upload is complete. Otherwise, the only way to monitor your load is SM37 for the extractor job and SM50 for the ETL processing and the update into the data target.

Note also that the Start Later in Batch setting is hidden if the InfoPackage is used in a process chain. This is because the start of the request is determined by the process chain itself: the request is created by the job associated with the process chain's process (see the page Scheduling InfoPackages in the help guide).
