06-09-2006 7:30 PM
Hi all.
We are starting an SAP implementation at the company I work for, and I have been designated to prepare the data load from the legacy systems. I have already asked our consultants about data load speed, but they didn't really answer what I need.
Does anyone have statistics on data load speed (records per hour) using tools like LSMW, CATT, eCATT, etc.?
I know the speed depends on what data I'm loading and also on the CPU speed, but any information is good to me.
Thank you and best regards.
06-10-2006 8:44 PM
About ten years ago, I was given the responsibility of loading the legacy data into SAP. It's really not possible to tell you much about how fast things will go when you do your production conversion. I found that it went much more slowly in production than in any of the tests, because when I did my conversions, it turned out that a lot of other things were going on in the system as well.
For our financial data, I wrote an extract program that read our legacy system and created a number of files. We FTP'd those files to the SAP application server and then used a number of custom and SAP programs to update SAP.
For FI transaction data, the program created a number of files, and we used RFBIBL00 to create a number of batch input sessions. We submitted all of them at the same time. Because only a limited number of batch processes can run at one time, only about four could run at once; the rest queued up behind them. After about twelve or so hours they were all completed. Now, because they ran simultaneously, there were a number of rejects due to locking problems. But since they weren't really financial rejects, we simply resubmitted them. Many of the transactions could now finish because the objects that were initially locked were not locked the second (or third) time. Eventually, we were left with true financial rejects. The conversion team had to look at each of these and decide how to proceed.
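For what it's worth, the resubmission step can be automated. This is only a sketch: RSBDCSUB is SAP's standard report for processing batch input sessions in the background, but the session name prefix here is made up, and the selection-screen parameter name is from memory and should be verified in your release.

```abap
* Hedged sketch: process all queued/errored conversion sessions
* in the background. RSBDCSUB is the standard SAP report for this;
* verify the selection-screen parameter name (MAPPE = session name)
* in your system. 'FI_LOAD*' is an illustrative session name prefix.
SUBMIT rsbdcsub
  WITH mappe CP 'FI_LOAD*'
  AND RETURN.
```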
As I recall, we scheduled four or five days for this process and SAP came up on schedule. Your results will be different (in terms of time).
Because there were many rejects for whatever reason, batch input gave us an easy way to process and keep track of them.
Rob
06-09-2006 7:33 PM
Hi
Welcome to SDN.
As far as data transfer techniques are concerned, CALL TRANSACTION is faster than the other approaches since it uses asynchronous updates by default.
LSMW is restricted to one-time transfers and is not suited for frequent data transfers. The batch input session method is best suited for transferring large amounts of external data to an SAP system; this is also SAP's standard approach.
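As a minimal sketch of the CALL TRANSACTION approach (the screen and field names below are illustrative, not a complete recording of any real transaction):

```abap
* Minimal CALL TRANSACTION sketch. The BDCDATA content is
* illustrative only; real screen/field names come from a
* recording (transaction SHDB) of the target transaction.
DATA: lt_bdcdata  TYPE TABLE OF bdcdata,
      ls_bdcdata  TYPE bdcdata,
      lt_messages TYPE TABLE OF bdcmsgcoll.

ls_bdcdata-program  = 'SAPMF05A'.   "example: FI document entry
ls_bdcdata-dynpro   = '0100'.
ls_bdcdata-dynbegin = 'X'.          "marks the start of a screen
APPEND ls_bdcdata TO lt_bdcdata.

CLEAR ls_bdcdata.
ls_bdcdata-fnam = 'BKPF-BLDAT'.     "illustrative field
ls_bdcdata-fval = '01.06.2006'.
APPEND ls_bdcdata TO lt_bdcdata.

* MODE 'N' = no screen display; UPDATE 'A' = asynchronous update
* (the default, faster, but you cannot check the update result
* directly - 'S' (synchronous) is safer for conversions).
CALL TRANSACTION 'FB01' USING lt_bdcdata
     MODE   'N'
     UPDATE 'A'
     MESSAGES INTO lt_messages.     "error handling is up to you
```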
Cheers,
Abdul Hakim
Mark all useful answers..
06-09-2006 7:50 PM
hi friedel,
Here are the details of the data transfer techniques.
<b>Call Transaction:</b>
1. Synchronous processing
2. Synchronous or asynchronous database updates
3. Data is transferred for an individual transaction each time the CALL TRANSACTION statement is executed
4. No batch input log is generated
5. No automatic error handling
<b>Session Method:</b>
1. Asynchronous processing
2. Synchronous database updates
3. Data is transferred for multiple transactions
4. A batch input log is generated
5. Automatic error handling
6. SAP's standard approach
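A sketch of the session method, assuming a BDCDATA table has been filled the same way as for CALL TRANSACTION (the session name here is illustrative):

```abap
* Sketch of the session (batch input) method, assuming lt_bdcdata
* has already been filled with the screen/field records.
CALL FUNCTION 'BDC_OPEN_GROUP'
  EXPORTING
    client = sy-mandt
    group  = 'FI_LOAD'       "session name, visible later in SM35
    user   = sy-uname
    keep   = 'X'.            "keep the session after processing

CALL FUNCTION 'BDC_INSERT'
  EXPORTING
    tcode     = 'FB01'       "one BDC_INSERT call per transaction
  TABLES
    dynprotab = lt_bdcdata.

CALL FUNCTION 'BDC_CLOSE_GROUP'.
* The session is then processed - and errors are handled with the
* standard tools - in transaction SM35.
```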
<b>Direct Input Method:</b>
1. Best suited for transferring large amounts of data
2. No screens are processed
3. The database is updated directly using standard programs/function modules; see, for example, program RFBIBL00
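For RFBIBL00 specifically, the input is a sequential file of typed records: a session header (structure BGR00, record type 0), a document header per document (BBKPF, type 1), and one line-item record per item (BBSEG, type 2). A rough sketch of writing such a file on the application server, with a made-up path and only the record-type fields shown:

```abap
* Rough sketch only: writing an RFBIBL00 input file. The full
* record layouts must match structures BGR00/BBKPF/BBSEG in your
* release; the file path is made up for illustration.
DATA: ls_bgr00 TYPE bgr00,
      ls_bbkpf TYPE bbkpf.

OPEN DATASET '/tmp/fi_load.txt' FOR OUTPUT IN TEXT MODE
     ENCODING DEFAULT.

ls_bgr00-stype = '0'.            "record type 0 = session header
ls_bgr00-group = 'FI_LOAD'.      "name of the session to create
TRANSFER ls_bgr00 TO '/tmp/fi_load.txt'.

ls_bbkpf-stype = '1'.            "record type 1 = document header
ls_bbkpf-tcode = 'FB01'.
TRANSFER ls_bbkpf TO '/tmp/fi_load.txt'.
" ...one BBSEG record (type 2) per line item follows here...

CLOSE DATASET '/tmp/fi_load.txt'.
```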
<b>LSMW.</b>
1. A code-free tool that helps you transfer data into SAP
2. Suited for one-time transfers only
<b>CALL DIALOG.</b>
This approach is outdated; you should choose one of the techniques above instead.
Also check the knowledge pool for more reference
Cheers,
Abdul Hakim
06-09-2006 8:10 PM
hi friedel.
You should also pay special attention while transferring the data, in particular to the points below.
1.Check whether the values are in valid format.
2.Check whether all the mandatory fields have been filled.
<b>Frequent Data Transfer Errors</b>
The most frequent errors include:
The BDCDATA structure contains screens in incorrect sequence.
The BDCDATA structure assigns a value to a field that does not exist on the current screen.
The BDCDATA structure contains a field that exceeds the specified length.
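A common way to avoid the first two errors is to fill BDCDATA only through two small helper routines, so that every screen record always precedes its field records. These are the conventional helpers generated by SHDB recordings, not a mandatory API (gt_bdcdata is an assumed global table):

```abap
* Conventional helper routines (as generated by SHDB recordings)
* that keep the BDCDATA sequence correct: one 'dynbegin' record
* per screen, followed by its field records.
FORM bdc_dynpro USING p_program p_dynpro.
  DATA ls_bdc TYPE bdcdata.
  ls_bdc-program  = p_program.
  ls_bdc-dynpro   = p_dynpro.
  ls_bdc-dynbegin = 'X'.
  APPEND ls_bdc TO gt_bdcdata.
ENDFORM.

FORM bdc_field USING p_fnam p_fval.
  DATA ls_bdc TYPE bdcdata.
  ls_bdc-fnam = p_fnam.       "must exist on the current screen
  ls_bdc-fval = p_fval.       "must not exceed the field length
  APPEND ls_bdc TO gt_bdcdata.
ENDFORM.
```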
General guidelines
You should be aware of the following guidelines when you create sessions and call transactions or dialogs:
You must provide data for all required fields on a screen.
You can only specify the initial data for a screen. The system does not accept input as a response to a warning or an error message.
If there is more than one possible screen sequence for a transaction or dialog, your program must specify which sequence to use. You must supply records for every screen in the selected sequence that a dialog user would see, even if a screen is not used to enter data.
Cheers,
Abdul Hakim
06-09-2006 8:35 PM
Thank you for your answers.
Do you have an estimate of how many records I can load per hour?
06-09-2006 8:38 PM
Hi Friedel,
It all depends on the volume and type of records; there is no fixed rate per hour.
The availability of work processes (update/background) also matters a lot with these techniques.
Cheers,
Abdul Hakim