Increase the number of background processes during data load, and how to bypass a DTP in a process chain

Former Member

Hello All,

We want to improve the performance of our loads. Currently we load the data from an external database through a DB link; just to mention, we are on a BI 7 system. We bypass the PSA to load the data as quickly as possible. Unfortunately we cannot use the PSA, because load times are higher when we do, so we access views on the external database directly. The external database is also indexed as per our requirements.

Currently our DTP is set to run with 10 parallel processes (in the DTP settings for the Batch Manager, with job class A). Even though it is set to 10, we can see the loads running on only 3 or 4 parallel background processes. Does anyone know why it behaves like that and how to increase the number?

We want to split the load into three (different DTPs with different selections), with all three loading data into the same InfoProvider in parallel. We have a routine in the selection that looks up a table to get the respective selection conditions, and all three DTPs kick off in parallel as part of the process chain.

But in some cases we only get data for two of the DTPs, or just one (depending on the selection conditions). In that case, is there any way, in the routine or in the process chain, to say that if there is no selection for a DTP, that DTP is ignored or set to success, so the process chain continues?
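To illustrate, here is a rough sketch of the kind of filter routine we have in mind, including one possible way of handling the missing-selection case by forcing an empty result set. The control table ZBW_DTP_SEL, its fields, and the DTP identifier are placeholders, and the surrounding FORM (with L_T_RANGE carrying FIELDNAME/SIGN/OPTION/LOW/HIGH, and P_SUBRC) is the frame generated by the system for the filter field:

* Sketch of a DTP filter routine body for one of the three DTPs.
* ZBW_DTP_SEL is a placeholder control table holding the selection ranges.
DATA: ls_sel   TYPE zbw_dtp_sel,
      ls_range LIKE LINE OF l_t_range.

* Read the selection maintained for this particular DTP and filter field
SELECT SINGLE * FROM zbw_dtp_sel
       INTO ls_sel
       WHERE dtp_id  = 'SPLIT_1'      "which of the three parallel DTPs
         AND fieldnm = 'CALDAY'.      "field this routine filters on

CLEAR ls_range.
ls_range-fieldname = 'CALDAY'.
ls_range-sign      = 'I'.

IF sy-subrc = 0 AND ls_sel-low IS NOT INITIAL.
* Selection found: pass the range to the DTP filter
  ls_range-option = 'BT'.
  ls_range-low    = ls_sel-low.
  ls_range-high   = ls_sel-high.
ELSE.
* No selection for this DTP: restrict to a value that cannot occur, so the
* request selects zero records, finishes green, and the chain continues
  ls_range-option = 'EQ'.
  ls_range-low    = '00000000'.
ENDIF.

APPEND ls_range TO l_t_range.
p_subrc = 0.

Would forcing an empty selection like this be an acceptable approach, or is there a cleaner way, for example in the process chain itself, to set such a DTP to success?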

Really appreciate your help.

Accepted Solutions (0)

Answers (2)

Former Member

Thanks, Stephen, for the quick response.

As mentioned earlier, the source database is indexed with the required partitioning, and we can see that when we run the same SQL in TOAD, so we need to concentrate on the BW side.

Here is where we stand on each of your points:

1. As per the business requirement we need full loads, so deltas are ruled out.

2. We already drop and recreate the indexes as part of the load process.

3. A write-optimized DSO is good for loading but not suited to reporting. As we are dealing with a large volume of data and need good report performance, we have to load to cubes.

4. There are no ABAP lookups; we avoided almost all of them, so we have direct loads from source to target.

5. As far as I am aware, the BI Accelerator improves the performance of loads from SAP systems; support for third-party data sources is still being worked on.

Let me clarify what my two issues are:

1) How do we increase the number of parallel background processes during data load? (The Batch Manager is set to 10, but only 3-4 are being used.)

2) I want to run the DTPs in parallel. If there is no data available for a DTP's selection, that DTP should be set to success. How do we implement that logic?

Really appreciate your help.

Former Member

Hi

Sounds like a nice problem…

Here is a response to your questions:

Before I start, I just want to mention that I do not understand how you are bypassing the PSA if you are using a DTP. Be that as it may, I will respond regardless.

When looking at performance, you need to identify where your problem is.

First, execute your view directly on the database. Ask the DBA if you do not have access. If possible, perform a database explain on the view (this can also be done from within SAP… I think). This step is required to ensure that the view is not the cause of your performance problem. If it is, we need to implement steps to resolve that.

If the view performs well, consider the following SAP BI ETL design changes:

1. Are you loading deltas or full loads? When you have performance problems, the first thing to consider is making use of the delta queue (or changing the extraction to send only deltas to BI).

2. Drop indexes before load and re-create them after the load

3. Make use of the BI 7.0 write-optimized DSO. This allows for much faster loads.

4. Check whether you do ABAP lookups during the load. If you do, consider loading the DSO that you are selecting from into memory and change the lookup to read from the internal table instead; this saves tremendous time in DB I/O (see the sketch after this list).

5. This has cost implications, but the BI Accelerator will allow for much faster loads.
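To make point 4 concrete, here is a minimal sketch of buffering a lookup DSO in the transformation, so that the per-record SELECT is replaced by a read from an internal table; the three parts belong in the global declaration part, the start routine, and the field routine respectively. The DSO active table /BIC/AZPRICE00 and the MATERIAL/PRICE fields are only examples, not your actual objects:

* Global declaration part of the transformation (visible to all routines)
TYPES: BEGIN OF ty_price,
         material TYPE /bi0/oimaterial,
         price    TYPE /bi0/oiprice,
       END OF ty_price.
DATA: gt_price TYPE HASHED TABLE OF ty_price
               WITH UNIQUE KEY material.

* Start routine: one database access for the whole data package
IF SOURCE_PACKAGE IS NOT INITIAL.
  SELECT material price
         FROM /bic/azprice00          "active table of the lookup DSO (example)
         INTO TABLE gt_price
         FOR ALL ENTRIES IN SOURCE_PACKAGE
         WHERE material = SOURCE_PACKAGE-material.
ENDIF.

* Field routine: read from memory instead of a SELECT SINGLE per record
DATA: ls_price TYPE ty_price.
READ TABLE gt_price INTO ls_price
     WITH TABLE KEY material = SOURCE_FIELDS-material.
IF sy-subrc = 0.
  RESULT = ls_price-price.
ENDIF.

The hashed table gives constant-time reads, so the cost becomes one SELECT per data package instead of one per source record.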

Good luck!