Need help with some questions

Former Member

What is the use of Queued delta and Direct delta in the LO Cockpit?

What is the difference between Queued delta and Direct delta in the LO Cockpit?

What are the different types of extractors available to extract data from the different modules?

What is the use of an end routine in a transformation in BW 7.0?

Accepted Solutions (1)

Former Member

Hi RR,

Check out the blog below:

/people/happy.tony/blog/2006/11/24/gist-of-lo-cockpit-delta

Hope it helps

Regards

Sajeed

Answers (2)

Former Member

Hi RR,

Check this link:

/people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods

Assign points if helpful.

Thanks,

Amith

Former Member

Hi,

Queued delta:

With this update mode, the extraction data (for the relevant application) is written to an extraction queue (instead of to the update data, as in the V3 update) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.

After activating this method, up to 10,000 document deltas/changes are cumulated into one LUW per DataSource in the BW delta queues.

Direct delta:

With this update mode, extraction data is transferred directly to the BW delta queues with each document posting.

As a consequence, each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues.

As a reminder, 'LUW' stands for Logical Unit of Work; it can be considered an inseparable sequence of database operations that ends with a database commit (or a rollback if an error occurs).
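
To make the idea concrete, here is a rough ABAP sketch of one LUW. The table ZDOC_HDR and the error flag are invented purely for illustration and are not taken from the note:

* One LUW: all database changes between two commits belong together.
DATA: ls_hdr   TYPE zdoc_hdr,      " hypothetical document header table
      lv_error TYPE c LENGTH 1.    " set to 'X' if something went wrong

INSERT zdoc_hdr FROM ls_hdr.                " first change of the LUW
UPDATE zdoc_hdr SET status = 'A'
  WHERE docnr = ls_hdr-docnr.               " further change, same LUW

IF lv_error = 'X'.
  ROLLBACK WORK.   " error: the whole LUW is rolled back
ELSE.
  COMMIT WORK.     " success: the LUW ends with a database commit
ENDIF.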

An end routine is a routine with a table in the target structure format as its input and output parameter. You can use an end routine to post-process the data, package by package, after the transformation. For example, you can delete records that are not to be updated, or perform data checks.
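
As a rough sketch of what such a routine can look like in a BW 7.0 transformation: the method signature and the type _ty_s_TG_1 are generated by the system, and the COSTCENTER field used in the checks is only a placeholder, not from your scenario:

METHOD end_routine.
  FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.

* Post-processing after the transformation, package by package:
* delete records that should not be updated ...
  DELETE result_package WHERE costcenter IS INITIAL.

* ... or run data checks / corrections on the remaining records.
  LOOP AT result_package ASSIGNING <result_fields>.
    TRANSLATE <result_fields>-costcenter TO UPPER CASE.
  ENDLOOP.
ENDMETHOD.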

Please assign points if helpful.

Former Member

Thanks for the reply,

Could you please help me with this question:

What are the different types of extractors available to extract data from the different modules?

For example, if I want to extract data from the General Ledger master tables (module FI), what are all the possible extractors I can use?

Thanks

Former Member

Hi RR,

Reason and Prerequisites: New design

Solution:

1. Why should I no longer use the "Serialized V3 update" as of PI 2002.1 or PI-A 2002.1?

The following restrictions and problems with the "serialized V3 update" led to alternative update methods being provided with the new plug-in, and to the "serialized V3 update" method no longer being provided two plug-in releases later.

a) If, in an R/3 System, several users who are logged on in different languages enter documents in an application, the V3 collective run only ever processes the update entries for one language during a process call. Subsequently, a process call is automatically started for the update entries from the documents that were entered in the next language. During the serialized V3 update, only update entries that were generated in direct chronological order and with the same logon language can therefore be processed. If the language in the sequence of the update entries changes, the V3 collective update process is terminated and then restarted with the new language.

For every restart, the VBHDR update table is read sequentially on the database. If the update tables contain a very high number of entries, it may require so much time to process the update data that more new update records are simultaneously generated than the number of records being processed.

b) The serialized V3 update can only guarantee the correct sequence of extraction data in a document if the document has not been changed twice in one second.

c) Furthermore, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if the times have been synchronized exactly on all system instances, since the time of the update record (which is determined using the local time of the application server) is used in sorting the update data.

d) In addition, the serialized V3 update can only ensure that the extraction data of a document is in the correct sequence if no error occurred beforehand in the V2 update, since the V3 update only processes update data for which the V2 update has been successfully processed.

2. New "direct delta" update method:

With this update mode, extraction data is transferred directly to the BW delta queues every time a document is posted. In this way, each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues.

If you are using this method, there is no need to schedule a job at regular intervals to transfer the data to the BW delta queues. On the other hand, the number of LUWs per DataSource increases significantly in the BW delta queues because the deltas of many documents are not summarized into one LUW in the BW delta queues as was previously the case for the V3 update.

If you are using this update mode, note that you cannot post any documents in the application during delta initialization, from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW. Otherwise, data from documents posted in the meantime is irretrievably lost.

The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.

This update method is recommended for the following general criteria:

a) A maximum of 10,000 document changes (creating, changing or deleting documents) are accrued between two delta extractions for the application in question. A (considerably) larger number of LUWs in the BW delta queue can result in terminations during extraction.

b) With a future delta initialization, you can ensure that no documents are posted from the start of the recompilation run in R/3 until all delta-init requests have been successfully posted. This applies particularly if, for example, you want to include more organizational units such as another plant or sales organization in the extraction. Stopping the posting of documents always applies to the entire client.

3. The new "queued delta" update method:

With this update mode, the extraction data for the affected application is compiled in an extraction queue (instead of in the update data) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.

In this way, up to 10,000 document delta extractions are cumulated into one LUW in the BW delta queues per DataSource, depending on the application.

If you use this method, it is also necessary to schedule a job that regularly transfers the data to the BW delta queues (the "update collective run"). Note that only the reports delivered with the logistics extract structures Customizing Cockpit should be used for this scheduling. There is no point in scheduling with the RSM13005 report for this update method, since that report only processes V3 update entries. The simplest way to perform scheduling is via the "Job control" function in the logistics extract structures Customizing Cockpit. We recommend that you schedule the job hourly during normal operation, that is, after successful delta initialization.

In the case of a delta initialization, the document postings of the affected application can be included again after successful execution of the recompilation run in the OLTP, provided that you make sure that the update collective run is not started before all delta Init requests have been successfully updated in the BW.

In the posting-free phase during the recompilation run in the OLTP, you should execute the update collective run once (as before) to make sure that there is no old delta extraction data remaining in the extraction queues when you resume posting documents.

If you want to use the functions of the logistics extract structures Customizing cockpit to make changes to the extract structures of an application (for which you selected this update method), you should make absolutely sure that there is no data in the extraction queue before executing these changes in the affected systems. This applies in particular to the transfer of changes to a production system. You can perform a check when the V3 update is already in use in the respective target system using the RMCSBWCC check report.

In the following cases, the extraction queues should never contain any data:

- Importing an R/3 Support Package

- Performing an R/3 upgrade

- Importing a plug-in Support Package

- Executing a plug-in upgrade

For an overview of the data of all extraction queues of the logistics extract structures Customizing Cockpit, use transaction LBWQ. You may also obtain this overview via the "Log queue overview" function in the logistics extract structures Customizing cockpit. Only the extraction queues that currently contain extraction data are displayed in this case.

The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.

This update method is recommended for the following general criteria:

a) More than 10,000 document changes (creating, changing or deleting documents) are performed each day for the application in question.

b) In future delta initializations, you must reduce the posting-free phase to executing the recompilation run in R/3. The document postings should be included again when the delta Init requests are posted in BW. Of course, the conditions described above for the update collective run must be taken into account.

4. The new "unserialized V3 update" update method:

With this update mode, the extraction data of the application in question continues to be written to the update tables using a V3 update module and is retained there until the data is read and processed by a collective update run.

However, unlike the current default values (serialized V3 update), the data is read in the update collective run (without taking the sequence from the update tables into account) and then transferred to the BW delta queues.

The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method since serialized data transfer is never the aim of this update method. However, you should note the following limitation of this update method:

The extraction data of a document posting, where update terminations occurred in the V2 update, can only be processed by the V3 update when the V2 update has been successfully posted.

This update method is recommended for the following general criteria:

a) Due to the design of the data targets in BW and for the particular application in question, it is irrelevant whether or not the extraction data is transferred to BW in exactly the same sequence in which the data was generated in R/3.

5. Other essential points to consider:

a) If you want to select a new update method in application 02 (Purchasing), it is IMPERATIVE that you implement note 500736. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV302 report.

b) If you want to select a new update method in application 03 (Inventory Management), it is IMPERATIVE that you implement note 486784. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV303 report.

c) If you want to select a new update method in application 04 (Production Planning and Control), it is IMPERATIVE that you implement note 491382. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV304 report.

d) If you want to select a new update method in application 45 (Agency Business), it is IMPERATIVE that you implement note 507357. Otherwise, even if you have selected another update method, the data will still be written to the V3 update. The update data can then no longer be processed using the RMBWV345 report.

e) If you want to change the update method of an application to "queued delta", we urgently recommended that you install the latest qRFC version. In this case, you must refer to note 438015.

f) If you use the new selection function in the logistics extract structures Customizing Cockpit in an application to change from the "Serialized V3" update method to the "direct delta" or "queued delta", you must make sure that there are no pending V3 updates for the applications concerned before importing the relevant correction in all affected systems. To check this, use transaction SA38 to run the RMCSBWCC report with one of the following extract structures in the relevant systems:

Application 02: MC02M_0HDR

Application 03: MC03BF0

Application 04: MC04PE0ARB

Application 05: MC05Q00ACT

Application 08: MC08TR0FKP

Application 11: MC11VA0HDR

Application 12: MC12VC0HDR

Application 13: MC13VD0HDR

Application 17: MC17I00ACT

Application 18: MC18I00ACT

Application 40: MC40RP0REV

Application 43: MC43RK0CAS

Application 44: MC44RB0REC

Application 45: MC45W_0HDR.

You can switch the update method only if this report does not return any information on open V3 updates. Of course, you must not post any documents in the affected application after checking with the report and until you import the relevant Customizing request. This procedure applies in particular to importing the relevant Customizing request into a production system.

Otherwise, the pending V3 updates are no longer processed. This processing is still feasible even after you import the Customizing request, using the RSM13005 report. However, in this case, it can no longer be ensured that the serialization of the data in the BW delta queues is preserved.

g) Early delta initialization in the logistics extraction:

As of PI 2002.1 and BW Release 3.0B, you can use the early delta initialization to perform the delta initialization for selected DataSources.

Only the DataSources of applications 11, 12 and 13 support this procedure in the logistics extraction for PI 2002.1.

The early delta initialization is used to admit document postings in the OLTP system as early as possible during the initialization procedure. If an early delta initialization InfoPackage was started in BW, data may be written immediately to the delta queue of the central delta management.

When you use the "direct delta" update method in the logistics extraction, you do not have to wait for the successful conclusion of all delta Init requests in order to readmit document postings in the OLTP if you are using an early delta initialization.

When you use the "queued delta" update method, early delta initialization offers essentially no advantage because here, as with the conventional initialization procedure, you can readmit document postings after the setup tables have been filled successfully, provided that you ensure that no data is transferred from the extraction queue to the delta queue, that is, that no update collective run is triggered until all of the delta init requests have been successfully posted.

Regardless of the initialization procedure or update method selected, it is nevertheless necessary to stop any document postings for the affected application during a recompilation run in the OLTP (filling the setup tables).

Hope it helps.

Thanks,

Amith