General Ledger Accounting (New): Line Items 0FIGL_O14 performance issue

Former Member

Dear Forum,

We are facing a performance issue while loading data into 0FIGL_O14 (General Ledger Accounting (New): Line Items) from cube ZMMPRC01 -> DSO 0FIGL_O14.

Please see below the requirement for updating the data to the 0FIGL_O14 DSO.

The report displays Dry Dock and Running Repair expenses for particular purchase orders with their respective G/L accounts.

1) The G/L DSO provides the 0DEBIT_LC and 0DEB_CRE_DC (foreign currency) amounts with their signs (+/-).

2) The ZMMPRC01 cube provides 0ORDER_VALUE (purchase order value) and 0INVCD_AMNT (invoice amount).

While loading the data from cube ZMMPRC01 -> DSO 0FIGL_O14, we have created 19 InfoObject-level routines to derive the data for the fields below for MM purchase-order-related records.

0CHRT_ACCTS Chart of accounts

0ITEM_NUM Number of line item within accounting document

0AC_DOC_NO Accounting document number

0GL_ACCOUNT G/L Account

0COMP_CODE Company code

0COSTCENTER Cost Center

0CO_AREA Controlling area

0COSTELMNT Cost Element

0SEGMENT Segment for Segmental Reporting

0BUS_AREA Business area

0FUNC_AREA Functional area

0AC_DOC_NR Document Number (General Ledger View)

0AC_DOC_TYP Document type

0POST_KEY Posting key

0PSTNG_DATE Posting date in the document

0DOC_CURRCY Document currency

0LOC_CURTP2 Currency Type of Second Local Currency

0CALQUART1 Quarter

0CALYEAR Calendar year

For reference, please see the logic below (the 0AC_DOC_NO routine) used to derive the data for a PO-related record.

DATA:
  MONITOR_REC TYPE rsmonitor.

$$ begin of routine - insert your code only below this line -
... "insert your code here

* Lookup structure for the G/L line item DSO active table
TYPES: BEGIN OF ty_figl,
         chrt_accts TYPE /bi0/oichrt_accts,
         item_num   TYPE /bi0/oiitem_num,
         ac_doc_no  TYPE /bi0/oiac_doc_no,
         gl_account TYPE /bi0/oigl_account,
       END OF ty_figl.

DATA: it_figl TYPE STANDARD TABLE OF ty_figl,
      wa_figl TYPE ty_figl.

* Read the matching G/L line item for the current source record
SELECT SINGLE chrt_accts item_num ac_doc_no gl_account
  FROM /bi0/afigl_o1400
  INTO wa_figl
  WHERE doc_num       = SOURCE_FIELDS-doc_num
    AND doc_item      = SOURCE_FIELDS-doc_item
    AND /bic/z_pcode  = SOURCE_FIELDS-/bic/z_pcode
    AND /bic/z_voy_no = SOURCE_FIELDS-/bic/z_voy_no
    AND fiscyear      = SOURCE_FIELDS-fiscyear.

* Return the accounting document number (this routine fills 0AC_DOC_NO)
IF sy-subrc = 0.
  RESULT = wa_figl-ac_doc_no.
ENDIF.

CLEAR wa_figl.

Please note that the same kind of logic is applied for all of the above-mentioned fields.
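For example, the 0GL_ACCOUNT routine (reconstructed here purely for illustration; declarations as in the routine above) is the identical lookup with only the RESULT assignment changed, so each of the 19 routines issues its own database SELECT for every source record:

* 0GL_ACCOUNT routine - same SELECT as above, only the RESULT field differs
SELECT SINGLE chrt_accts item_num ac_doc_no gl_account
  FROM /bi0/afigl_o1400
  INTO wa_figl
  WHERE doc_num       = SOURCE_FIELDS-doc_num
    AND doc_item      = SOURCE_FIELDS-doc_item
    AND /bic/z_pcode  = SOURCE_FIELDS-/bic/z_pcode
    AND /bic/z_voy_no = SOURCE_FIELDS-/bic/z_voy_no
    AND fiscyear      = SOURCE_FIELDS-fiscyear.

IF sy-subrc = 0.
  RESULT = wa_figl-gl_account.
ENDIF.

CLEAR wa_figl.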

Here are my concerns and the issue.

For all of the above routines I am reading from the DSO active table /BI0/AFIGL_O1400 and finally loading into the same DSO (/BI0/AFIGL_O1400).

The worrying part is that my DSO 0FIGL_O14 currently has nearly 60 lakh (6 million) records, and the MM cube has nearly 55 requests that need to be updated to the above DSO with the PO-related order value and invoice amount.

The big issue is that when loading data from the MM cube to the DSO, if, for example, a request contains 25,000 records, only around 500-600 of them actually get updated to the DSO.

Even so, it is taking a huge amount of time (nearly 3 days per request) to update these records, and I still have to pull 50 more requests from the cube to the DSO as per the requirement.

Please note that as of now I haven't created any indexes on the DSO to improve these loads.

Please note I am facing this issue in the production environment and need your help ASAP.

Thanks & Regards,

Srinivas Padugula

Accepted Solutions (0)

Answers (2)


Former Member

Hi Srinivas,

Kindly have a look at the FAQ note below on extraction performance in the source system:

1597364 - FAQ: BW-BCT: Extraction performance in source system

Hope this helps.

Regards,

Mani

Former Member

Hi,

If selecting data from 0FIGL_O14 is taking a long time, then you can create secondary indexes on the DSO.

0FIGL_O14 would be huge, as its data volume directly corresponds to the data volume in BSEG.

But for your requirement, I think what you can do is:

1. Create a MultiProvider on top of the DSO and the cube, and create a BEx report that gives you the required fields from both InfoProviders. You can then use an Open Hub or APD approach to keep the data in a staging table or a direct-update DSO, and then load the data to the DSO.

2. Create secondary indexes on the DSO so that the fetch is faster (see the note after this list).

3. Do the enhancement at R/3 level to fetch the MM fields during the G/L load.
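On point 2: for the index to help, the secondary index on the active table /BI0/AFIGL_O1400 would need to cover the fields used in your routine's WHERE clause (DOC_NUM, DOC_ITEM, /BIC/Z_PCODE, /BIC/Z_VOY_NO, FISCYEAR). Independently of the index, the 19 per-record SELECT SINGLEs could be replaced by a single buffered read per data package, for example in an end routine. The sketch below is only a rough illustration: the lookup fields are taken from your routine above, and the generated type and field names (_ty_s_TG_1, the RESULT_PACKAGE components) may differ in your transformation.

* Rough sketch only: buffer the G/L lookup once per data package in an
* end routine instead of issuing one SELECT SINGLE per field per record.
* Field and data element names are taken from the question; adjust them
* to the generated names in your own transformation.

TYPES: BEGIN OF ty_figl,
         doc_num       TYPE /bi0/oidoc_num,
         doc_item      TYPE /bi0/oidoc_item,
         fiscyear      TYPE /bi0/oifiscyear,
         /bic/z_pcode  TYPE /bic/oiz_pcode,
         /bic/z_voy_no TYPE /bic/oiz_voy_no,
         chrt_accts    TYPE /bi0/oichrt_accts,
         item_num      TYPE /bi0/oiitem_num,
         ac_doc_no     TYPE /bi0/oiac_doc_no,
         gl_account    TYPE /bi0/oigl_account,
       END OF ty_figl.

DATA: lt_figl TYPE STANDARD TABLE OF ty_figl,
      ls_figl TYPE ty_figl.

FIELD-SYMBOLS: <result_fields> TYPE _ty_s_TG_1. "generated target structure

* One database round trip for the whole package
IF RESULT_PACKAGE IS NOT INITIAL.
  SELECT doc_num doc_item fiscyear /bic/z_pcode /bic/z_voy_no
         chrt_accts item_num ac_doc_no gl_account
    FROM /bi0/afigl_o1400
    INTO CORRESPONDING FIELDS OF TABLE lt_figl
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE doc_num  = RESULT_PACKAGE-doc_num
      AND doc_item = RESULT_PACKAGE-doc_item
      AND fiscyear = RESULT_PACKAGE-fiscyear.
  SORT lt_figl BY doc_num doc_item /bic/z_pcode /bic/z_voy_no fiscyear.
ENDIF.

* Fill the derived fields from the buffered table: one binary search
* per record instead of 19 database selects
LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  READ TABLE lt_figl INTO ls_figl
       WITH KEY doc_num       = <result_fields>-doc_num
                doc_item      = <result_fields>-doc_item
                /bic/z_pcode  = <result_fields>-/bic/z_pcode
                /bic/z_voy_no = <result_fields>-/bic/z_voy_no
                fiscyear      = <result_fields>-fiscyear
       BINARY SEARCH.
  IF sy-subrc = 0.
    <result_fields>-chrt_accts = ls_figl-chrt_accts.
    <result_fields>-item_num   = ls_figl-item_num.
    <result_fields>-ac_doc_no  = ls_figl-ac_doc_no.
    <result_fields>-gl_account = ls_figl-gl_account.
  ENDIF.
ENDLOOP.

With the package buffered like this, each record needs one in-memory read instead of repeated database round trips, which is where most of the load time is currently going.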

Regards,

Pravin Karkhanis.