
Parallel Processing when using ABAP

Dec 31, 2016 at 05:37 PM

Former Member

When setting up partitioning on a source table within DS, the data flow is duplicated for each partition value. What about the case where an SAP table is the source and the ABAP data flow runs with Execute Preloaded? Can we use the partitioning feature in this scenario, given that the ABAP is static?


3 Answers

Arun Sasi Jan 05, 2017 at 10:51 AM

Hey Dani,

The partitioning option can be used with a regular data flow, as per the SAP supplement guide.

Even though you use an ABAP data flow with the Execute Preloaded option, partitioning would still help improve performance, as the query ultimately hits the SAP table. This is a kind of "fake" partition used by Data Services.

Regards

Arun Sasi


Former Member Jan 05, 2017 at 04:28 PM

With a regular data flow, aren't you forced to use OpenHub? That limits the operations allowed when extracting from SAP, such as not being able to join to another table. ABAP is preferred. I think partition support could be built into the application, so that a WHERE clause is included when the ABAP is generated and variables are used to pass the partition values. The .dat files would need to be dynamic as well. OpenHub is too limited.
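
For illustration only, here is a minimal ABAP sketch of what a partition-aware generated extract could look like. The table (VBAK), partition column (ERDAT), parameter names and report name are assumptions for the example, not what Data Services actually generates:

* Hypothetical sketch only: table, column and parameter names are
* illustrative, not Data Services-generated code.
REPORT zds_partition_sketch.

* Each duplicated data flow instance would receive its own range,
* so every instance extracts only its slice of the source table.
PARAMETERS: p_from TYPE erdat,   " lower bound of the partition range
            p_to   TYPE erdat.   " upper bound of the partition range

DATA lt_vbak TYPE STANDARD TABLE OF vbak.

SELECT vbeln erdat auart
  FROM vbak
  INTO CORRESPONDING FIELDS OF TABLE lt_vbak
  WHERE erdat >= p_from
    AND erdat <= p_to.

* The extracted rows would then be written out to a
* partition-specific .dat file for Data Services to pick up.

The .dat output path would also need the partition value in its name so that each instance writes to its own file, as noted above.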

Arun Sasi Jan 06, 2017 at 01:32 PM

Yes, ABAP is the preferred option; I fully agree. I am assuming there are indexes on the large SAP tables being used.

Let me know if you see any difference when you use the partitioning feature with ABAP data flows to extract the data.

Regards

Arun Sasi
