
Modifying / Enhancing the Datastream Upload in BCS

Former Member

Scenario:

Transactional data is loaded via extractor 0FI_GL_7 into the BW staging area (DSO ZBCS_O20) in delta mode.

After the delta load from the source system (AFS), the data is loaded into a second DSO (ZBCS_O21) in the consolidation area.

The data in this second DSO is deleted before the full load from the entry layer is started.

Between the deletion of the data in the second DSO and the loading and activation of new data, we can get into trouble:

If anybody starts a BCS datastream load during this time window, zero values are delivered to BCS.

In this case, the existing values in the actual period are overwritten with zero values – and this should not happen.

Needed Solution:

We need a solution that prevents data loads into the BCS cube while the source DSO (ZBCS_O21) contains no active data.

We tried to find a place for an enhancement implementation in the CL_UC_* classes, but while debugging the BCS upload functionality we could not find the right spot.

Does anybody have an idea in which class or method we can or should implement such an enhancement?

Or do you have other ideas on how we can prevent BCS datastream loads while our source object is empty?

Thanks a lot.

Regards,

Timo

Accepted Solutions (1)

dan_sullivan
Active Contributor

It might be useful to consider having a custom task or tasks in the BCS monitor to execute these DSO loads. Such tasks would be in the task hierarchy prior to the data collection for load from datastream.

Former Member

Hi Dan,

thanks for your fast response.

Do you think that I can solve my problem with a custom task?

Let me explain my problem in detail.

Pulling data via datastream load is working fine. When we block the actual period via the consolidation monitor, users are not able to load data anymore.

The data for loading into BCS is provided in a DSO.

This DSO is loaded via full update. Before data is loaded into the DSO, the existing data in this DSO is deleted.

As long as the source DSO does not provide activated data, we want to prevent loading data into the BCS cube.

1.) We thought about blocking/locking the consolidation period with an ABAP report in the process chain, but I cannot find a way to implement blocking/locking of the period in an ABAP report.

2.) Maybe it is possible to prevent the execution of the task.

When debugging, I saw task TAEX in the class method

CL_UC_MONITOR_MAIN_GUI_WD=>EXECUTE_TASK

but how can I stop or abort the execution of this task?

I hope you understand my problem.

Do you think there is any chance to solve my problem in one of the ways I described above?

Regards,

Timo

Former Member

Timo

one way of preventing users from loading data into BCS via monitor tasks whilst these data loads are taking place is to put a step in the chain that changes the real-time load behaviour of the BCS transactional cube to "Real-Time Data Target Can Be Loaded with Data; Planning Not Allowed", and then, after the loads have finished, a step in the chain that sets it back to "Real-Time Data Target Can Be Planned; Data Loading Not Allowed".

Any users who try to run a data load via the monitor whilst this flag is switched will get a short dump.
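
For reference, the two process-chain steps described above could be sketched as a small ABAP report. This is only a sketch: the function modules RSAPO_SWITCH_TRANS_TO_BATCH / RSAPO_SWITCH_BATCH_TO_TRANS are the BW modules commonly used for switching a real-time InfoCube between planning and load mode, but their availability and exact signature should be verified in SE37 for your release, and ZBCS_C11 is a placeholder for the BCS transactional cube name.

```abap
* Sketch only: function module names/signatures to be verified in
* SE37; ZBCS_C11 is a placeholder for the BCS transactional cube.
REPORT zbcs_switch_cube_mode.

* Step before the loads: allow data loading, forbid planning
CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
  EXPORTING
    i_infocube = 'ZBCS_C11'.

* ... the DSO/cube loads run here as separate chain steps ...

* Step after the loads: allow planning again, forbid data loading
CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
  EXPORTING
    i_infocube = 'ZBCS_C11'.
```

In a process chain you would put the two calls into two separate ABAP program steps, one before and one after the load steps.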

Regards

Former Member

Hi all,

thanks for your replies.

I don't know if a custom task solves my problem.

My problem is not that I want to start the datastream load via a process chain.

Loading data into BCS from the source DSO is triggered by every single company on their own via the web monitor.

And of course this causes my problem: if a daughter company starts loading via datastream while my source DSO is empty, zero records are delivered to BCS. As a result, all the values in BCS for this period are set to 0.

Or can I solve the scenario described above by using a custom task?

If yes, can you describe a little example?

We also tried the method of switching the transactional cube from planning mode into load mode.

But as Marsy wrote, users will get a dump when they start the datastream load, and this is not very nice.

Maybe we can get this problem fixed by SAP by opening an OSS message for it.

But there is another problem when setting the transactional cube into loading mode:

The guys from controlling are then also not able to make consolidation bookings in BCS. But the hint about switching the transactional cube into loading mode gave me another idea.

The data from the source DSO is loaded into version 00 in the BCS cubes. The controllers do their consolidation bookings in version 01.

If we use the planning functionality, i.e. switch the mode of the cube from loading to planning, is it then also possible to use a data slice to prevent loading data via datastream load into version 00?

This, I think, would be the best solution:

The controllers can do their consolidation bookings in version 01, and the daughter companies can only load data into BCS when version 00 is not blocked by the data slice. And the value of the data slice can be set by a report in the process chain.

Does anybody have ideas on how this can be solved? Or is it not possible to work with data slices in BCS?

Thanks for your response.

Regards,

Timo

dan_sullivan
Active Contributor

When a custom task is used, it may be set as a predecessor task to the data collection. The users should see in the BCS monitor whether or not the custom task has executed completely. Only when that is the case should they try their load-from-datastream data collection.

Sorry to say I am not aware of a way to set a data slice for BCS.

former_member209721
Active Contributor

Custom task and execution in background are different concepts.

A custom task enables you to build your own BCS task with ABAP code. It is not necessarily executed in the background...

In your situation you could have in your consolidation monitor a first (custom) task that triggers the DSO updates and then, just after it, the load from data stream task. And as Dan mentions, you can use the preceding task option to force the user to execute the DSO upload task before the load from data stream task.

http://help.sap.com/erp2005_ehp_05/helpdata/EN/62/f7e73ac6e7ec28e10000000a114084/frameset.htm

Look at the "Task Sequence" chapter.

Former Member

Hi Collet,

I do not want to trigger the update load into my source DSO. This is handled by a process chain and works fine.

My problem is the execution of the datastream load into the BCS cube.

The process in our company is defined as follows:

The data update should be done every 30 minutes to get new bookings from R/3 into BCS as fast as possible.

1.) Data is loaded from R/3 (DataSource 0FI_GL_7) into the DSO in the entry layer (ZBCS_O20) via delta load.

2.) The data in the DSO in the consolidation layer (ZBCS_O21) is deleted.

3.) The data from DSO ZBCS_O20 (entry layer) is loaded into DSO ZBCS_O21 (consolidation layer) in full mode. In this data load, the data is enriched with several pieces of master data information that cannot be provided by the R/3 extractor 0FI_GL_7.

4.) ZBCS_O21 is activated.

To get data into BCS, every company starts the data pull (datastream load) on its own (manually) via the BCS web monitor, because the holding wants every company to take care of its own data in BCS.

I hope you now understand my/our problem:

From step 2 up to the end of step 4, there is no active data present in DSO ZBCS_O21. This means that if some company starts a data load into the BCS cube via the web monitor in this time slot, zero records will be booked into BCS, and this is what we want to prevent.

So our goal is to find a solution that prevents starting the datastream load task during this time.

Thanks for your response.

former_member209721
Active Contributor

But if you do an automatic delta upload every 30 minutes, how do you guarantee that the users have manually uploaded the latest data in the monitor?

Former Member

Hi Collet,

this is the reason why holding designed process in that way that every country have to pull their company data by their own.

in ZBCS_O21 (consolidation area) is the newest data present.

Countries pull thate by their own into Version 00 into BCS. Then they have to check their data in BCS Version 00 if everthing is OK with data in version 00 countries give feedback to company controlling.

Company controlling pulls data from version 00 in version 01 and do consolidation work on it.

This process works fine. we just have the problem with pulling zero values.

Regards,

Timo

Former Member

It is not necessarily executed in the background...

In your situation you could have in your consolidation monitor a first (custom) task that triggers the DSO updates and then, just after it, the load from data stream task. And as Dan mentions, you can use the preceding task option to force the user to execute the DSO upload task before the load from data stream task.

Probably, the more suitable words here are:

A custom task invokes a BAdI which, using ABAP, reads the DSO in question. If there is data for a set of mandatory parameters (company, version, etc.), then the BAdI returns the appropriate status (code). The load from datastream will not happen if, as was already mentioned here, the custom task is the predecessor of LFDS.

Former Member

Hi,

thanks for your answers. We solved our problem by using the filter BAdI for the datastream task.

The BAdI checks the source DSO for active data. If active data does not exist, an error message is created.

This stops the booking of data to BCS.
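
For readers with the same problem, the check described here looks roughly like the following sketch. The enclosing filter BAdI method signature and the message class ZBCS are assumptions (they depend on your system); /BIC/AZBCS_O2100 is the active table of DSO ZBCS_O21 according to the standard BW naming convention (/BIC/A<DSO>00).

```abap
* Sketch only: the enclosing BAdI method and the message class ZBCS
* are assumptions; /BIC/AZBCS_O2100 is the active table of ZBCS_O21.
DATA: l_count TYPE i.

* Count the active records in the source DSO
SELECT COUNT( * ) FROM /bic/azbcs_o2100 INTO l_count.

IF l_count = 0.
* No active data: raise an error message so the datastream task
* aborts instead of booking zero records into BCS
  MESSAGE e001(zbcs) WITH 'No active data in ZBCS_O21'.
ENDIF.
```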

@Eugene

Your advice to put a custom task before the datastream task sounds interesting.

Can you explain your scenario in more detail?

Perhaps you can give more technical details on the way of implementing it.

Thanks a lot.

Regards,

Timo

dan_sullivan
Active Contributor

You will see above that I initially suggested a custom task.

SAP Help for BCS explains this as:

Custom Task

Use

You can use this function to execute tasks that contain functionality which you, as an SAP customer, define yourself.

When a custom task is executed, the system calls the active implementation of the delivered Business Add-In (BAdI) UC_TASK_CUSTOM. This BAdI is freely programmable.

As in other tasks, when a custom task is executed, the system first checks the authorization and the task status.

Then the system updates the task status and creates a task log (with messages only).

Integration

Custom tasks can reside anywhere in the task sequence. You can specify that other tasks must precede the custom task ("preceding tasks"). You might require preceding tasks if the custom task employs the results of other tasks within the same task group.

Features

The BAdI UC_TASK_CUSTOM provides you with the interface methods EXECUTE and EXECUTE_FINAL.

You implement the function of the custom task in the EXECUTE method. When the EXECUTE method is called, the system passes the following parameters (among others):

· Consolidation area

· Technical name of the custom task

· Execution parameters, such as the consolidation units or groups to be processed, the group currency, version characteristics, and time characteristics

For more information, see the documentation in the system.

The Customizing settings for custom tasks let you determine whether it should be executed for consolidation units or consolidation groups:

· If you choose processing per consolidation unit, the system executes the EXECUTE method collectively for all consolidation units to be processed, and then updates the status of each consolidation unit.

· If you choose processing per consolidation group, the system executes the EXECUTE method individually for each consolidation group to be processed, and then updates the status of each consolidation group.

The system executes the EXECUTE_FINAL method once only at the end of processing. You can use this method for cleanup activities – for example, to free up working memory.

The Customizing settings for custom tasks also let you determine whether to save the log when you run the task in update mode.

Activities

A. Customizing

To set up a custom task:

1. Create a new custom task. To do this, in the process view of the Consolidation Workbench choose Consolidation Functions → Other Tasks → Custom Tasks.

2. Implement the BAdI UC_TASK_CUSTOM.

To see the definition of the BAdI UC_TASK_CUSTOM, on the SAP Easy Access screen choose Tools → Utilities → Business Add-Ins → Definition.

3. Activate the implementation of the BAdI UC_TASK_CUSTOM.

B. Execution of the Custom Task

Execute the task in the Consolidation Monitor.

The system calls the active implementation of the BAdI UC_TASK_CUSTOM. When processing is finished, the system displays the messages of the methods EXECUTE and EXECUTE_FINAL.

Example

Extracting Transaction Data

Data collection in your business process is configured so that the data is first extracted from an SAP source system and written to SAP NetWeaver Business Intelligence (BI), where it is then processed by the consolidation component (or by another BI-based application) by means of reading the respective InfoProvider. You want this data collection to be triggered and monitored in the consolidation monitor.

To do this, you create a custom task (processing per consolidation unit) and insert it into the Consolidation Monitor.

You implement the EXECUTE method (see also the source text of the example implementation of this BAdI), and activate the implementation.

Updating Custom Database Tables

Your business process periodically writes and updates certain data in separate database tables specifically defined for this purpose. This data is then used in the consolidation process. For example, you want to load consolidation unit-specific exchange rates into a database table using a data file interface, which is then used by the BAdI to determine exchange rates during currency translation. You want this update to be triggered and monitored in the Consolidation Monitor.

To do this, you create a custom task (processing per consolidation unit) and insert it into the Consolidation Monitor.

You implement the EXECUTE method. This method reads the data, performs the update, and returns any error messages (for example, if the expected data cannot be found).
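
Based on the Help text above, a skeleton of such an implementation might look as follows. This is only a sketch: the interface name IF_EX_UC_TASK_CUSTOM follows the usual classic-BAdI naming convention, the exact method signature and message handling should be taken from the BAdI definition and its example implementation in SE18/SE19, and /BIC/AZBCS_O2100 is the active table of DSO ZBCS_O21 per the standard BW naming convention.

```abap
* Hedged skeleton of the EXECUTE method of BAdI UC_TASK_CUSTOM.
* Interface name, signature, and message handling are assumptions;
* check the BAdI definition and example implementation in the system.
METHOD if_ex_uc_task_custom~execute.

  DATA: l_count TYPE i.

* /BIC/AZBCS_O2100 = active table of DSO ZBCS_O21 (standard naming)
  SELECT COUNT( * ) FROM /bic/azbcs_o2100 INTO l_count.

  IF l_count = 0.
*   No active data: end the task with an error status so that the
*   successor load-from-datastream task cannot be executed yet
    MESSAGE e001(zbcs) WITH 'No active data in ZBCS_O21'.
  ENDIF.

ENDMETHOD.
```

With this custom task set as a preceding task of the load from data stream, users cannot run the datastream load while the check fails.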

Answers (1)

former_member209721
Active Contributor

I see two options :

1) As Dan mentions, you build a custom task in BCS to load the DSOs from the consolidation monitor. For that you need to create an implementation of the BAdI "UC_TASK_CUSTOM" in transaction SE19 and insert your code there.

2) Alternatively, you can sequence your upload tasks, i.e. the DSO upload and the BCS data collection, using a BW process chain. Indeed, you can schedule your BCS tasks using the UCBATCH01 program. With this approach, you are sure that the tasks are executed in the right sequence.