
How to extract historical data from BW when the data exists physically in the BW cube?

Oct 18, 2016 at 05:14 PM

Former Member

Hi Expert,

Could you kindly advise me on the best way to extract historical data from SAP BW?

I'm a beginner in BW and I need to extract the data into another BI model, but I'm not really sure about the best way to do that.

I will use Informatica PowerCenter to access BW. I think I should first map the source tables to reproduce the logic, right? In that case, should I look at the cubes or at the DSOs behind these InfoProviders? In both cases (using RSA1) I can generate SQL code that shows me the joins between the tables; can I assume that is my data source? Furthermore, if we want to copy the fact and dimension tables exactly as they were built in the cube, how can I do that?

Any suggestion on how to approach this would help me a lot. I appreciate your time; if you need more information about the issue, please let me know.

Just to show you, here is the SQL code I can generate through transaction RSA1:

SELECT "D1"."SID_0COMP_CODE" AS "S____027", "S1"."COMP_CODE" AS "K____027", "D1"."SID_0PLANT" AS "S____040", "X1"."PLANT" AS "K____040", "X1"."S__ZCOD_REG" AS "S____756", CHAR(RIGHT(DIGITS("X1"."S__ZCOD_REG"), 000003), 000003) AS "K____756", "X1"."S__ZCOD_TPN" AS "S____757", CHAR(RIGHT(DIGITS("X1"."S__ZCOD_TPN"), 000003), 000003) AS "K____757", "S2"."CURRENCY" AS "K____034", "F"."/BIC/QTDEFP" AS "K____205", "F"."/BIC/QTDEITENS" AS "K____204", "F"."/BIC/QTDETKT" AS "K____206", "F"."/BIC/VLRACRESC" AS "K____193", "F"."/BIC/VLRBRUTO" AS "K____202", "F"."/BIC/VLRDESCON" AS "K____163", "F"."/BIC/VLRDESPES" AS "K____194", "F"."/BIC/VLRFRETE" AS "K____195", "F"."/BIC/VLRLIQUID" AS "K____203" FROM "/BIC/FPOS_C101" "F" JOIN "/BIC/DPOS_C101U" "DU" ON "F" . "KEY_POS_C101U" = "DU" . "DIMID" JOIN "/BI0/SCURRENCY" "S2" ON "DU" . "SID_0LOC_CURRCY" = "S2" . "SID" JOIN "/BIC/DPOS_C101P" "DP" ON "F" . "KEY_POS_C101P" = "DP" . "DIMID" JOIN "/BIC/DPOS_C1011" "D1" ON "F" . "KEY_POS_C1011" = "D1" . "DIMID" JOIN "/BI0/SCOMP_CODE" "S1" ON "D1" . "SID_0COMP_CODE" = "S1" . "SID" JOIN "/BI0/XPLANT" "X1" ON "D1" . "SID_0PLANT" = "X1" . "SID" WHERE (((("DP"."SID_0CHNGID" = 0 )) AND (("DP"."SID_0RECORDTP" = 0 )) AND (("DP"."SID_0REQUID" BETWEEN 238832 AND 2000000250 )))) AND "X1"."OBJVERS" = 'A'


6 Answers

Former Member Oct 19, 2016 at 09:46 AM

Hello Mate,

Instead of reading data from the cube, I would suggest reading data from the underlying DSOs, since a DSO has an active table and it is easy to read that active table.

I would also suggest getting help from an ABAP developer to create an RFC on top of the DSO and then calling that RFC directly from your system.

Note: SAP-to-SAP extraction would use a different method. In your case you want to get data out of SAP BW into a non-SAP system.
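For illustration, a minimal sketch of what a direct read of a DSO's active table could look like, assuming a hypothetical standard DSO ZSD_O01 (the active table of a standard DSO is /BIC/A&lt;DSO name&gt;00; all field names here are illustrative, not taken from your system):

-- Read a standard DSO's active table directly (hypothetical DSO ZSD_O01).
-- Standard InfoObjects such as 0CALDAY appear in the table without the leading 0.
SELECT "CALDAY",
       "/BIC/VLRBRUTO"  AS GROSS_VALUE,
       "/BIC/VLRLIQUID" AS NET_VALUE
FROM "/BIC/AZSD_O0100"
WHERE "CALDAY" BETWEEN '20150101' AND '20151231'  -- restrict to the historical period you need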

Regards,

Ashish

Shanthi Bhaskar Oct 18, 2016 at 06:14 PM

Try loading the BW data into an Open Hub destination, then use the Open Hub table as the source for your BI model.
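For illustration, a minimal sketch of reading such an Open Hub table from an external tool, assuming a hypothetical Open Hub destination named ZSALES_OH whose generated database table is /BIC/OHZSALES_OH (table and field names are assumptions):

-- Read only the rows of the most recent Open Hub request; OHREQUID is the
-- technical request field of the generated Open Hub table.
SELECT *
FROM "/BIC/OHZSALES_OH"
WHERE "OHREQUID" = (SELECT MAX("OHREQUID") FROM "/BIC/OHZSALES_OH")

In PowerCenter this table can then be registered like any other relational source.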

Former Member

Hi Shanthi, thanks for your reply, I will check this possibility. With the Open Hub, can I extract only the data, or do I get the structure as well? Another doubt is about how to access the Open Hub: do I read it like a database table, with field data types and so on? If you have some more specific documentation, please let me know. Thanks again.


Hi Elison,

Open Hub is used to export the data from an InfoProvider or a BEx query into a database table or a flat file.

You can refer to the wiki for an overview of how it works: https://wiki.scn.sap.com/wiki/display/BI/Open+Hub+Service

If you only want to read the cube and can't afford another intermediate storage layer, then you can try to use MDX. You can also refer to the wiki: https://wiki.scn.sap.com/wiki/display/BI/OT-MDX

Former Member Oct 19, 2016 at 07:37 AM

Can you please tell us about the target BI model where you need to load the data? And as the source you are using a cube, right?

Former Member

Hi Sumeet, we will create a new BI system for the company. Currently we use BW as our BI application and we are building a new one; the idea is to load the historical sales data into a DB2 (in-memory) database with a model completely different from the BW modeling. For example: we are a retail company, so we have cubes structured as sales-slip header, sales items, tax, etc. We also have DSOs with the same structures: sales items, sales-slip header, tax, etc. Attached you can see a print of the data flow. My point here is to understand the best way to extract the data so it can be represented in another BI model with a totally different architecture from BW. Should I go to the cubes or to the DSOs? Does Open Hub or MDX meet this need?

Thank you very much, I appreciate your help.

Former Member Oct 20, 2016 at 02:54 AM

Elison,

An Open Hub would work fine for your scenario: you can write to an Open Hub table and then access that table from PowerCenter. The Open Hub also allows delta loads, which means that subsequent loads can be delta-enabled for change capture.

The advantage of accessing the cube directly is its indices, but once you have the data in a flat table (the Open Hub destination) that does not really matter.

For your question on DSO versus cube: the Open Hub behaves in the same way for a cube or a DSO. The question, however, is whether you want to replicate the cube data or the DSO data in your new BI environment. If there is not much logic in the data flow from DSO to cube, use the DSO to build your Open Hubs, because:

1. A DSO is overwrite-based, so every record is unique. A cube does not overwrite; compression helps, but I have seen many cases where compression does not happen due to system issues, patches, etc. Also, a cube has two fact tables (F and E), which means that your SQL becomes more complicated (see the sketch below this list).

2. Reading data from the DSO into the Open Hub might take time; budget for additional (secondary) indices on the DSO to speed up the data loads.
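To illustrate both points, a rough sketch (the cube POS_C101 is taken from the question; the DSO name ZSD_O01, the field selection and the index are assumptions for illustration only):

-- Point 1: a cube stores data in two fact tables, /BIC/F<cube> (uncompressed)
-- and /BIC/E<cube> (compressed), so a complete extraction has to read both.
SELECT "KEY_POS_C1011", "/BIC/VLRBRUTO", "/BIC/VLRLIQUID" FROM "/BIC/FPOS_C101"
UNION ALL
SELECT "KEY_POS_C1011", "/BIC/VLRBRUTO", "/BIC/VLRLIQUID" FROM "/BIC/EPOS_C101"

-- Point 2: a secondary index on the DSO active table to speed up selective reads.
-- In practice such indices are usually defined on the DSO itself in BW rather than
-- created directly on the database.
CREATE INDEX "ZSD_O01_IDX01" ON "/BIC/AZSD_O0100" ("CALDAY")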

Former Member Oct 21, 2016 at 03:58 PM

Many thanks for your help, we decided to go this way!

Former Member Nov 11, 2016 at 06:27 PM

Hello Expert,

Could you please help me understand something that happened after I created an Open Hub extraction in the BW production environment? I followed the steps to create this kind of extraction, defining the output as a file on my local machine. The extraction goes well and the file is written normally and extremely fast, but the admin team told me that when an export is done and sent to an external target (.csv), BW treats it as if a load had been made to the next level, like a load into a subsequent data mart, and this prevents the exported requests from being deleted afterwards.

I would like to know if this scenario is familiar to anyone, and what I can do to prevent it from happening again.

Please, I appreciate your help.

Thanks in advance.
