Former Member

SAP BI ETL - Performance Test


We have been asked to suggest approaches for conducting an SAP BI ETL performance test.

In particular, we are supposed to test the daily delta extraction and schedule our process chains accordingly.

In the production system there would also be parallel ECC batch jobs running, consuming some of the resources while our daily delta process chains are running.

The source ECC system would have a copy of the production data set, but there is no capability to create the daily delta data in the source (pre-production) system.

We request your suggestions on the best approach to simulate a near-production scenario for the ETL performance test.

I am thinking of identifying one day's worth of data and extracting exactly that data, but InfoPackage date selection is possible only in some cases, and the data set may not be representative of changes in master and transaction data. Further, how do we simulate the resources consumed by the parallel ECC batch jobs?
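The real ECC batch jobs cannot be reproduced outside the SAP system, but their resource footprint can be approximated at the OS level of the test server. The sketch below is a generic, hypothetical CPU load generator (the worker count and duration are assumptions, not values from this thread) that could be run alongside the delta process chains to occupy a comparable share of CPU:

```python
# Hypothetical stand-in for parallel ECC batch jobs: occupy N CPUs
# with busy-loop workers while the delta process chains run, so the
# chains compete for resources roughly as they would in production.
import multiprocessing
import time


def burn_cpu(seconds: float) -> int:
    """Busy-loop for `seconds`; returns the number of iterations done."""
    deadline = time.monotonic() + seconds
    iterations = 0
    while time.monotonic() < deadline:
        iterations += 1
    return iterations


def simulate_batch_load(workers: int, seconds: float) -> list:
    """Run `workers` CPU-burning processes in parallel for `seconds`."""
    with multiprocessing.Pool(workers) as pool:
        return pool.map(burn_cpu, [seconds] * workers)


if __name__ == "__main__":
    results = simulate_batch_load(workers=4, seconds=2.0)
    print(f"{len(results)} simulated jobs finished")
```

To mimic I/O-heavy batch jobs rather than CPU-heavy ones, the busy loop could be replaced with reads/writes against a scratch file; the parallel structure stays the same.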

Any inputs would be welcome.


Rajesh K. Sarin



1 Answer

    Former Member
    Posted on Oct 07, 2009 at 08:51 PM


    Since your source system is a copy of production, you can perform certain tests on the R/3 side and record the timings below.

    Fill the setup tables with the same variants you would use in production at go-live time, and check their timings. Sometimes this may result in a dump, or it may have a long runtime, in which case you may need to apply SAP Notes to fix the extractor. This gives you an idea of the setup table fill runtimes.

    Then pull that data to BI in the sequence you would use in the production system, perhaps three months or one year at a time. This tells you how much data you are able to pull from R/3 in a single run.
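    Once one slice has been timed, the full historical load can be projected with simple arithmetic. The figures below are illustrative assumptions, not measurements from this thread:

```python
# Rough projection sketch: extrapolate the measured runtime of one
# test slice (e.g. three months) to the full history, loaded slice
# by slice, with a fixed per-run scheduling overhead.
def project_full_load(slice_minutes: float, slices_total: int,
                      overhead_minutes: float = 0.0) -> float:
    """Projected wall-clock minutes to load all slices sequentially."""
    return slices_total * (slice_minutes + overhead_minutes)


# Example (assumed figures): a 3-month slice took 45 min; 5 years of
# history = 20 slices, plus ~5 min overhead per InfoPackage run.
total = project_full_load(slice_minutes=45, slices_total=20,
                          overhead_minutes=5)
print(f"Projected historical load: {total / 60:.1f} hours")  # 16.7 hours
```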

    Then check the number of entries in BI for one year, which helps you estimate the data volume of your historical data, and check how much space the tables occupy in DB02, which helps you predict the table space required for the data loads.
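    From the yearly entry count and an average row size (both taken from the test system; the numbers below are assumptions for illustration), a first-cut table-space figure can be derived:

```python
# Very rough table-space estimate: raw data size multiplied by a
# factor for indexes and DB overhead. All parameters are assumed
# inputs to be replaced with figures observed in DB02 / SE16.
def estimate_tablespace_gb(rows_per_year: int, years: int,
                           avg_row_bytes: int,
                           index_factor: float = 1.5) -> float:
    """Estimated table space in GB for `years` of history."""
    raw_bytes = rows_per_year * years * avg_row_bytes
    return raw_bytes * index_factor / (1024 ** 3)


# Example: 50 million rows/year, 7 years of history, ~200 bytes/row.
print(f"{estimate_tablespace_gb(50_000_000, 7, 200):.0f} GB")
```

    The `index_factor` of 1.5 is a placeholder; compare the estimate against what DB02 actually reports after a test load and adjust it.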

    If you have logic built on the BI side, load this large volume of data and analyze the performance of your routines; if needed, build the secondary indexes they require.

    Delta Testing:

    Initialize without data transfer, post some data in R/3, and check whether the changes are captured by the R/3 delta and updated correctly in BI.

    Wait for one or two days, then try to pull the delta to BI. If the delta extractor really has a problem, it will take a long time even for a small amount of data. You can check the related OSS Notes and apply them if needed to improve performance.

    While you are loading this data, check in SM37 and SM51 how many batch processes (background and dialog) are occupied. This helps you understand how many processes the load needs for a large data volume and what a suitable timing window for it would be.
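    The process counts observed in SM37/SM51 can feed a simple feasibility check for the nightly batch window. The sketch below assumes packets are processed in parallel across the free background processes; all figures are illustrative:

```python
# Hypothetical batch-window check: given the per-packet runtime
# measured during the test load, estimate whether a delta of a
# given size fits the available window.
import math


def load_window_ok(total_records: int, packet_size: int,
                   minutes_per_packet: float,
                   background_processes: int,
                   window_minutes: float) -> bool:
    """True if the load is projected to finish within the window."""
    packets = math.ceil(total_records / packet_size)
    waves = math.ceil(packets / background_processes)
    return waves * minutes_per_packet <= window_minutes


# Example: 12 million delta records, 50k records per packet,
# 2 min per packet, 6 free background processes, 3-hour window.
print(load_window_ok(12_000_000, 50_000, 2.0, 6, 180))  # True
```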

    Hope these points are useful for you.



    Edited by: Arunkumar Ramasamy on Oct 8, 2009 2:24 AM

