
BAdIs/Exits for updating CDHDR in QM

Mar 23 at 11:17 AM


Hello,

I am trying to find the BAdIs/exits that are called to update the table CDHDR in the QM module for the change document object QPRUEFLOS (transactions QA11, QA12, QA01, QA03).

I want to create a local table that holds the changes made to this object (the same ones that are written to CDHDR), so that I can analyse them afterwards without accessing CDHDR directly (for performance reasons).

Has anyone done this before, or handled it in another way?

Any suggestion will be helpful!

Thank you,

Adelina


4 Answers

Best Answer
Raymond Giuseppi
Mar 23 at 12:51 PM

You could create an event from the change document (transaction SWEC), as in a workflow trigger definition, and then handle the event in a custom receiver function module (registered in transaction SWETYPV) that fills your own database table.
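A rough sketch of what such a receiver function module could look like, assuming the standard event receiver interface (OBJTYPE/OBJKEY/EVENT/RECTYPE plus EVENT_CONTAINER) and a hypothetical custom table ZQM_CDHDR_LOG whose fields mirror CDHDR:

FUNCTION z_qm_cdhdr_receiver.
*"  IMPORTING
*"     VALUE(OBJTYPE)  LIKE SWETYPECOU-OBJTYPE
*"     VALUE(OBJKEY)   LIKE SWEINSTCOU-OBJKEY
*"     VALUE(EVENT)    LIKE SWETYPECOU-EVENT
*"     VALUE(RECTYPE)  LIKE SWETYPECOU-RECTYPE
*"  TABLES
*"     EVENT_CONTAINER STRUCTURE SWCONT

  DATA: lt_cdhdr TYPE STANDARD TABLE OF cdhdr,
        ls_cdhdr TYPE cdhdr,
        ls_log   TYPE zqm_cdhdr_log.        " hypothetical custom table

* OBJKEY carries the key of the triggering object (here: the inspection lot).
* Read the change document headers written for it and keep the newest one.
  SELECT * FROM cdhdr INTO TABLE lt_cdhdr
    WHERE objectclas = 'QPRUEFLOS'
      AND objectid   = objkey.

  SORT lt_cdhdr BY udate DESCENDING utime DESCENDING changenr DESCENDING.
  READ TABLE lt_cdhdr INTO ls_cdhdr INDEX 1.

  IF sy-subrc = 0.
*   Copy the header into the custom log table (fields assumed identical).
    MOVE-CORRESPONDING ls_cdhdr TO ls_log.
    INSERT zqm_cdhdr_log FROM ls_log.
  ENDIF.

ENDFUNCTION.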

You could also look at ALE change pointers, as used in IDoc distribution.

Etc.


Thank you, Raymond, for the hint. I will need a little help with defining and handling the event.

In SWEC, the change document object will be 'QPRUEFLOS', but for which business object type?

And in SWETYPV, where should I specify the custom FM, and where can I see exactly when this FM will be called?

Adelina Suvagau Mar 29 at 01:45 PM

Thank you Raymond!

I populated my custom table with the following steps:

1. I copied the Business Object for Inspection Lot in SWO1.

2. I defined the change document object QPRUEFLOS in SWEC for the previously defined business object and for the corresponding event (QM12, event 'changed').

3. I defined the FMs that handle the event in SWETYPV. The FM where I put my code is the one entered as the check function module. It is called after the tables CDHDR and CDPOS have been populated, so the last entry in CDHDR should be the one that was just created/modified.

My custom table gets populated correctly for QM12.
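For illustration, a rough sketch of the logic that could sit inside such a check function module; the field-level changes come from CDPOS, and ZQM_CHANGE_LOG with its fields is an assumed custom table, not a standard object:

* CDHDR and CDPOS are already filled when the check FM runs, so the
* newest change number for the inspection lot is the one just written.
  DATA: lt_cdhdr TYPE STANDARD TABLE OF cdhdr,
        ls_cdhdr TYPE cdhdr,
        lt_cdpos TYPE STANDARD TABLE OF cdpos,
        ls_cdpos TYPE cdpos,
        ls_log   TYPE zqm_change_log.       " assumed custom table

  SELECT * FROM cdhdr INTO TABLE lt_cdhdr
    WHERE objectclas = 'QPRUEFLOS'
      AND objectid   = objkey.
  SORT lt_cdhdr BY changenr DESCENDING.
  READ TABLE lt_cdhdr INTO ls_cdhdr INDEX 1.

  IF sy-subrc = 0.
*   Item-level changes (old/new values per field) for that change number
    SELECT * FROM cdpos INTO TABLE lt_cdpos
      WHERE objectclas = ls_cdhdr-objectclas
        AND objectid   = ls_cdhdr-objectid
        AND changenr   = ls_cdhdr-changenr.

    LOOP AT lt_cdpos INTO ls_cdpos.
      MOVE-CORRESPONDING ls_cdpos TO ls_log.  " assumes matching fields
      ls_log-username = ls_cdhdr-username.
      ls_log-udate    = ls_cdhdr-udate.
      ls_log-utime    = ls_cdhdr-utime.
      INSERT zqm_change_log FROM ls_log.
    ENDLOOP.
  ENDIF.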

Adelina Suvagau Mar 23 at 03:46 PM

Hello everyone,

Has anyone created a custom implementation of the BAdI INSPECTIONLOT_UPDATE? Is it called for all QM transactions?

Will I be able to access the changes to QM objects with this BAdI?

Thank you in advance,

Adelina

Martin Hinderer Mar 26 at 06:43 AM

I strongly advise against messing around with CDHDR. It is the central place of change management in SAP, and it is referenced from places you might not know about. You will certainly run into issues, at the latest when applying SAP updates/support packages. Deactivating or relocating the recording of changes to a "local" table (and thereby changing the built-in recording/traceability of changes) might in addition cost you a certification of your system (if you have one, e.g. by KPMG). This can be relevant if you are dealing with sensitive products/data.

If I understand you correctly, you want to do this for performance reasons during analysis. In that case my recommendation is to take a look at a data warehouse system, which copies your change data and provides it in a form that is best suited for analysis.

Regards
MH


Hi Martin,

Thank you for your suggestion!

I don't want to make any changes to CDHDR; I just want a way to record the changes made to QM objects in my new table, i.e. to replicate the data from CDHDR (for QM objects only) into my custom table. That way my custom table will hold far fewer entries than CDHDR, and I will read the data directly from my custom table instead of CDHDR.

Regards,

Adelina


Ah OK, I understand. Nevertheless, you are then adding complexity to the system, as each QM write access to CDHDR has to be duplicated and stored in your custom table. This will decrease operational system performance, and as far as my database knowledge is still current, an additional write command costs more performance than a later read.

Is a separate data warehouse still not an option? You might also want to look at the audit trail transactions.


No, a separate data warehouse is not an option...


Well, if I were your IT department, I would rather work with you on optimizing the performance of your database accesses (how have you tried it so far?) than allow you to create shadow tables that replicate existing data...
