
Delta 0MAT_PLANT_ATTR dumps on table bdcp2

former_member285493
Participant

Hi all,

I have a problem with this extractor and I can't solve it.

I deleted the init from BW and launched a new init without data transfer. When I try the first delta load, the R/3 job ends in error due to a dump on table BDCP2 (TSV_TNEW_PAGE_ALLOC_FAILED); in RSA7 there are no records in the queue.

I couldn't find an SAP note that helps; do you have any suggestions?

Thanks

Stefano

Accepted Solutions (0)

Answers (4)

AA3
Participant

I had the same issue recently. I opened an SAP OSS message and was directed to apply SAP Note 1037679 - Short dump in source system during master data delta update.

Also check SAP Note 170183.

The following DataSources are affected:

DataSource          comparison value to be used
0ARTICLE_ATTR       i_chabasnm_cmp = '0MATERIAL'
0ART_PLANT_ATTR     i_chabasnm_cmp = '0MAT_PLANT'
0ART_SALES_ATTR     i_chabasnm_cmp = '0MAT_SALES'
0ART_ST_LOC_ATTR    i_chabasnm_cmp = '0MAT_ST_LOC'
0ART_UNIT_ATTR      i_chabasnm_cmp = '0MAT_UNIT'
0MATERIAL_ATTR      i_chabasnm_cmp = '0MATERIAL'
0MAT_PLANT_ATTR     i_chabasnm_cmp = '0MAT_PLANT'
0RT_MAT_PLANT_ATTR  i_chabasnm_cmp = '0MAT_PLANT'
0MAT_SALES_ATTR     i_chabasnm_cmp = '0MAT_SALES'
0MAT_ST_LOC_ATTR    i_chabasnm_cmp = '0MAT_ST_LOC'
0MAT_UNIT_ATTR      i_chabasnm_cmp = '0MAT_UNIT'

Former Member

Hi all,

did you solve your problem? I actually have the same issue in my productive system.

I have 3.5 million change pointers in my BDCP2 table for 0MAT_PLANT. A delta load leads to the same out-of-memory dump you mentioned.

I do not want to change any system parameters; I just want to get rid of those 0MAT_PLANT entries in BDCP2.

Transaction BD22 only deletes entries from BDCP and BDCPS. Isn't there a way to delete these entries in BDCP2 even if they are not outdated or processed yet?

Regards

Frank

sven_mader2
Active Contributor

Hi,

what is the reason for so many changes?

I don't know this problem, but it is possible that the DataSource can't handle that volume.

Did you change the data in a periodic job?

- Can you split it into packages?

- Load the delta to BW.

- Start the next package.

And so on.

How much data is in MARC? When you have changed all the data:

=> delete the INIT.

=> load a new INIT.

Sven

Former Member

Hi Sven,

the MARC table contains 1.4 million records. I have already run several new inits without problems; table BDCP2 is not read during initialization. But even after a new initialization, all the change pointers are still in BDCP2, and the delta tries to fetch them, with the dump as the result.

I'll try different selections in the InfoPackage now, but I still think this won't work.

Frank

sven_mader2
Active Contributor

Hi,

1 million records and 3 million changes? That's not normal.

Check the change table to see which job/process/user made these changes.

=> That can't be the right way. Why do you change the table so often?

Sven

Former Member

Hi,

the reason for the huge number of change pointers is that I didn't collect the deltas for 2 months. These are now the changes from that period.

We make a lot of changes on material level, so the number is OK.

I initialized the data with a very small selection (30,000 records out of 1,500,000), but the delta still runs into an out-of-memory dump.

Frank

sven_mader2
Active Contributor

Then create a new init and load the delta every day.

Sven

Former Member

Sure, loading the delta every day is my plan.

Former Member

Hi,

This is a very common memory issue: the system is not able to allocate memory for storing more data in the internal table.

Please contact the Basis colleagues in your team and ask them to change the heap memory parameters in the system.


former_member285493
Participant

Hi all,

but when I do a new init, shouldn't the change pointers in table BDCP2 be reset?

Also, the dump shows these memory limits:

Roll area...................... 6221152
Extended memory (EM)........... 2501339256
Assigned memory (HEAP)......... 2000083328
Short area..................... " "
Paging area.................... 24576
Maximum address space.......... 4294967295

2 GB of memory seems like quite enough for an internal table; I don't think I should have all these records in BDCP2 to be extracted as delta.
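For scale, here is a rough back-of-the-envelope check of why millions of change pointers can still exhaust a ~2 GB heap. The 300 bytes/row average is an assumed value for illustration only; the real width depends on the BDCP2 row structure and on what the extractor builds in memory:

```python
# Back-of-the-envelope memory estimate for holding change pointers in an
# internal table. The 300 bytes/row average is an assumption for
# illustration; the real width depends on the BDCP2 row structure.
rows = 3_500_000               # change pointers mentioned earlier in the thread
bytes_per_row = 300            # assumed average row width

table_bytes = rows * bytes_per_row
heap_limit = 2_000_083_328     # "Assigned memory (HEAP)" from the dump

print(f"one copy of the table: {table_bytes / 1024**3:.2f} GB")
# A second in-memory copy (e.g. sorting, or appending to another table)
# already pushes past the heap limit:
print(f"two copies exceed heap: {2 * table_bytes > heap_limit}")
```

So even one copy of the table is close to 1 GB under this assumption, and any intermediate processing that duplicates the data explains the TSV_TNEW_PAGE_ALLOC_FAILED dump.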

Regards

Stefano

Former Member

Hi,

As said earlier, this is not a problem with your DataSource; the extractor is not able to get enough memory to process the data. It could be due to not enough memory in the system (or allocated to the batch process), or too many jobs requesting memory at the time you ran this extraction.

You could wait and retry; if not many jobs are running, it should go through. If you still have issues, you have to talk to the Basis team. When you do an init without data transfer, the delta pointer does not move to the latest record. Also, if you try an init with data transfer, you might get the same error.

The other option would be to give ranges of material in your InfoPackage and extract records in batches. To get the material number ranges, you could either look them up in SPRO or check with your functional team, and use the number range for the material.
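As a sketch of that batching idea, the following splits a material number interval into selection ranges, one per InfoPackage run. The zero-padded 18-character MATNR format and the interval bounds are illustrative assumptions, not values from this system:

```python
# Split a material number interval into smaller selection ranges, one per
# InfoPackage run, so each extraction stays within memory limits.
# The zero-padded 18-character MATNR format is an assumption for illustration.

def material_ranges(low, high, batch_size):
    """Yield (low, high) selection pairs covering [low, high]."""
    start = low
    while start <= high:
        end = min(start + batch_size - 1, high)
        yield (f"{start:018d}", f"{end:018d}")
        start = end + 1

for lo, hi in material_ranges(1, 1_500_000, 500_000):
    print(f"selection: MATNR from {lo} to {hi}")
```

Each resulting pair would go into the material number selection of one InfoPackage run, with one delta/full load per batch.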

Good luck,

Alex.

former_member285493
Participant

Hi Alex,

is there no way to reset the change pointer status so that the delta selects only the records modified from today on?

EDIT: I want to add that I've done some tests in the quality system. Yesterday I ran the delta several times and always pulled 2,649 records from R/3; today I loaded a new delta and the same 2,649 records were loaded again. I think there's something wrong; I didn't expect the same data in today's delta load...

Regards

Former Member

Hi,

This is not related to your DataSource; it is a memory issue.

Please ask your basis team to check that there is enough free memory.

-RMP