We are on DS 3.1 with a server group architecture. The requirements are:
1. Load data into an application in a loop using a web service (WS). The loop lets the WS handle the data volume in smaller chunks.
2. Make a WS call to BOE to keep alive the login session used for loading data.
I cannot combine both steps in one dataflow, because DS always throws an error if a DF contains two different types of WS calls. However, step 2 has to send the active-session indicator message every 5-10 minutes until step 1 completes the data load, which can take 1-2 hours; otherwise step 1 cannot load the data into the target application. So I took the following approach:
1. Start loop1 to load the data in chunks from a table.
2. Run two workflows (WF1 and WF2) in parallel inside loop1.
3. WF1 loads a chunk of data into the target and updates the loop1 counter in a script.
4. WF2 contains loop2, which sends the WS call indicating that a session is active for loading data. A script in WF2 checks whether WF1 has completed the load by reading the loop1 counter value; if yes, it changes the loop2 counter value to exit WF2.
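Since DS script is hard to prototype outside the Designer, here is a minimal Python threading sketch of the coordination I intended (all names are hypothetical, this is not DS code): WF1 updates a shared counter when the load finishes, and WF2's inner loop polls it to exit. Note that this only behaves as intended because both threads truly share the same variable, which is exactly the point in question with parallel workflows in DS.

```python
# Illustrative sketch only -- mirrors the WF1/WF2 design, not DS syntax.
import threading
import time

state = {"loop1_counter": 0, "keepalives": 0}  # shared between branches

def wf1():
    for chunk in range(3):
        time.sleep(0.05)              # stand-in for loading one chunk
    state["loop1_counter"] = 1        # script: mark the load complete

def wf2():
    while state["loop1_counter"] == 0:    # loop2: poll loop1's counter
        state["keepalives"] += 1          # stand-in for the keep-alive WS call
        time.sleep(0.01)

t1 = threading.Thread(target=wf1)
t2 = threading.Thread(target=wf2)
t1.start(); t2.start()
t1.join(); t2.join()
print("keepalives sent while loading:", state["keepalives"])
```

With genuinely shared memory, WF2 sends keep-alives during the load and exits as soon as WF1 flips the counter, which is the behaviour I expected from DS.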
Here is how DS actually executes the job:
1. WF1 starts loading the target application
2. WF2 starts sending the session active indicator
3. WF1 completes the load and sets the counter value to exit loop1
4. Loop2 inside WF2 cannot see the counter value updated by WF1, so WF2 keeps looping and sending the session-active WS calls
5. WF1 tries to re-load the target application multiple times, since it cannot exit from loop1
6. The whole job keeps running with no exit until I manually abort it
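The symptom in steps 4-5 looks like each parallel branch operating on its own copy of the variables, so an update made in one branch never reaches the other. The Python sketch below (again, not DS code) reproduces that behaviour using process-isolated state; whether DS 3.1 really copies local variables into parallel workflows this way is what I am asking about.

```python
# Sketch of the suspected failure mode: the update happens only in the
# child's *copy* of the state, so the watcher never sees it.
import multiprocessing as mp

counter = {"done": 0}   # plain object: copied, not shared, across processes

def wf1(counter):
    counter["done"] = 1        # updates only this process's own copy

if __name__ == "__main__":
    p = mp.Process(target=wf1, args=(counter,))
    p.start()
    p.join()
    # The parent's copy is unchanged -- the "exit" signal is lost.
    print("parent sees done =", counter["done"])   # -> 0
```

If DS behaves like this, a polling loop in WF2 can never observe WF1's counter update, no matter how long it waits.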
If I run WF1 and WF2 in series, the counters work fine, but by the time WF1 completes the data load the session has already expired, so only a portion of the data is loaded and I cannot re-activate the session.
Is this the expected behaviour of DS?
Is there any other way to handle this requirement? Even having WF1 write the value to a table and WF2 read it may not work, since by the time WF2 reads the value, WF1 might have started re-loading the target again.
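One thought on the table approach: it can still work if the persisted state, not an in-memory counter, is the single source of truth for every loop's exit condition. Below is a Python/SQLite sketch (my own suggestion, hypothetical names) of that handshake: WF1 makes one atomic RUNNING-to-DONE transition, and because loop1's own continue-condition reads the same row, WF1 can never start another pass after marking DONE, which removes the race described above. In DS this could be a control-table read in each loop's condition script (e.g. via the script sql() function).

```python
# Sketch of a control-table handshake; table and state names are made up.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE job_status (job TEXT PRIMARY KEY, state TEXT)")
db.execute("INSERT INTO job_status VALUES ('load', 'RUNNING')")
db.commit()

def get_state():
    return db.execute(
        "SELECT state FROM job_status WHERE job='load'").fetchone()[0]

def wf1():
    chunks = list(range(3))
    while get_state() == "RUNNING" and chunks:   # loop1 also reads the table
        chunks.pop(0)                            # load one chunk
        if not chunks:
            # single atomic transition: after this, loop1 can never re-enter
            db.execute("UPDATE job_status SET state='DONE' WHERE job='load'")
            db.commit()

def wf2():
    beats = 0
    while get_state() != "DONE":                 # loop2 reads the same row
        beats += 1                               # keep-alive WS call goes here
    return beats

wf1()
print("state after WF1:", get_state())     # -> DONE
print("keepalives after done:", wf2())     # -> 0, WF2 exits immediately
```

The key design point is that no loop keeps a private copy of the completion flag; every exit decision re-reads the one persisted row.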
Any help is appreciated