RV_LIKP Delivery archiving aborted due to large change documents

0 Kudos

When we implemented our manufacturing modules many years ago, we turned on auditing for many fields.

As a result, we have change documents that are very large.

So large that they cause the delivery archiving run to abort with a memory error.

When I went into VL03N for the delivery in question, it also aborted when I tried to display the changes.

Has anyone else had this issue?

One workaround I came up with is to first PURGE all delivery change documents before we archive the deliveries.

I just wanted to know what other alternatives I have and whether anyone else has had the same problem.

Mike

Accepted Solutions (0)

Answers (2)


former_member201157
Participant
0 Kudos

The archiving object CHANGEDOCU should only be used to archive the change documents of master data.
Change documents of transaction data records should still be archived together with the appropriate archiving object.

Change documents are normally only archived using archiving class CHANGEDOCU together with
application data for which they have been generated.

Archiving is usually resource intensive. Most of the time we have to deal with this by archiving data in smaller batches.

You have mentioned two issues here: 1. performance and 2. display inconsistency.

Based on the dumps, further investigation should be carried out to identify the root cause, though maybe you have already done that.

If I were in this situation, I would reach out to SAP Support.

Regards,

Satya

former_member222508
Participant
0 Kudos

Mike,

How are you purging change documents? Are you using archiving object CHANGEDOCU to do so?

I would suggest that you implement the object CHANGEDOCU in your landscape and archive all relevant change documents. Also, revisit the policy of turning on auditing for so many fields. Try to limit it to selected tables and turn it off for the rest, so that you can avoid unnecessary data being generated.

Thanks,

Bala.

0 Kudos

Change docs can be purged with standard program RSCDOK99.

There is no need to use archiving object CHANGEDOCU in this context because the delivery and all of its related change docs will be archived together.

Another alternative is to implement a user-exit rule in the archive program to flag a delivery as non-archivable if the number of CDPOS records exceeds a specific threshold.  I think there are fewer than 20 deliveries that would cause this memory issue.  This way I can prevent the short dump, but I would have to simply leave those deliveries on the system.
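A minimal sketch of such a check, assuming the standard change-document tables CDHDR/CDPOS and the delivery change-object class LIEFERUNG; the actual exit name, interface, and threshold value depend on the archiving write program's user exit and your data:

```abap
* Hypothetical check inside the archiving write program's user exit.
* Assumption: delivery change documents are stored in CDPOS under
* object class 'LIEFERUNG', keyed by the delivery number (OBJECTID).
CONSTANTS: lc_max_cdpos TYPE i VALUE 100000.   " example threshold only

DATA: lv_count TYPE i.

SELECT COUNT(*) FROM cdpos INTO lv_count
  WHERE objectclas = 'LIEFERUNG'
    AND objectid   = lv_vbeln.                 " delivery number, padded

IF lv_count > lc_max_cdpos.
* Flag the delivery as not archivable here, e.g. by setting the
* exit's "do not archive" indicator for this document.
ENDIF.
```

Counting CDPOS rows up front avoids ever loading the oversized change documents into memory, which is what causes the dump in the first place.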

Also, archiving these change docs with the delivery nearly doubles the run time compared to purging them first.  I'm on the fence right now as to which option I should recommend.

The deliveries in question are over 7 years old.

JL23
Active Contributor
0 Kudos

Contact SAP support. If they admit that there is a system limitation, then they should document it in an OSS note. It can't be a solution for customers to use exits to exclude such documents from archiving, as this would turn the whole purpose of archiving upside down.