
Unable to find pool when using NEW_POOL_FOR_INPUT pooling strategy

Former Member
0 Kudos

We use the NEW_POOL_FOR_INPUT pooling strategy, so a new pool gets created each time we upload a data feed (Documentation for NEW_POOL_FOR_INPUT).

Unfortunately, this seems to result in our listener being unable to find the pool in the database when Data Hub is under load.

We think this is possibly because the pool is not persisted before the DataLoadingCompletedEvent is published: we get a NullPointerException when resolving the pool name for the pool ID. When we later look at the database rows, the pool has been created with the correct information (and name). The pool is created earlier by the DataFeedActor, which does appear to save the pool to the DB, but perhaps the persistence operation hasn't completed before the listener is triggered?

It seems that the example event solution provided in the solution book is incompatible with this pooling strategy?

Our DataLoadedEventListener that extends DataHubEventListener is only a slight modification of the sample solution:

 public void handleEvent(DataLoadingCompletedEvent event) {
     String poolName = getPoolNameFromId(event.getPoolId());
     // NullPointerException is thrown on the next line when poolName is null
     if (poolName.equals(poolType) || poolName.endsWith("_" + poolType)) {
         InitiateCompositionEvent composeEvent = new InitiateCompositionEvent(event.getPoolId());
         eventPublicationService.publishEvent(composeEvent);
     }
 }

 protected String getPoolNameFromId(long poolId) {
     DataHubPool pool = this.dataHubFeedService.findPoolById(Long.valueOf(poolId));
     return pool != null ? pool.getPoolName() : null;
 }

Edit: And yes, the listener executes in a transaction:

 @Override public boolean executeInTransaction() { return true; }


The stack trace:

 2016-06-08 14:13:14,750 [INFO] [c.h.d.s.f.DataFeedActor] Loading 152 items to '2090_ONBOARDING' data feed
 2016-06-08 14:13:14,753 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : DataLoadingStartedEvent{actionId=9, feedId=8, poolId=8, itemCount=152}
 2016-06-08 14:13:14,901 [DEBUG] [c.h.d.s.i.DefaultEventPublicationService] Publishing data hub event : DataLoadingCompletedEvent{actionId=9, feedId=8, itemCount=152, status='COMPLETE'}
 2016-06-08 14:13:14,902 [INFO] [c.c.d.e.DataLoadedEventListener] ****************************************************
 2016-06-08 14:13:14,903 [INFO] [c.c.d.e.DataLoadedEventListener] DataLoadingCompletedEvent For pooltype: [CSV_IMPORT_POOL], poolId  is [8], poolname is [null]
 2016-06-08 14:13:14,903 [INFO] [c.c.d.e.DataLoadedEventListener] ****************************************************
 2016-06-08 14:13:14,906 [ERROR] [o.s.a.i.SimpleAsyncUncaughtExceptionHandler] Unexpected error occurred invoking async method 'public void com.hybris.datahub.service.impl.DefaultEventPublicationService.publishEvent(com.hybris.datahub.api.event.DataHubEvent)'.
 java.lang.NullPointerException: null
     at com.ourextension.datahub.event.DataLoadedEventListener.handleEvent(DataLoadedEventListener.java:35) ~[montrose-datahub-event-5.7.14-SNAPSHOT.jar:na]

Your help would be greatly appreciated!

former_member333910
Active Participant
0 Kudos

The Data Hub team will attempt to recreate this issue. We'll update this Question when we have more information.

Former Member
0 Kudos

Thank you. If you need more information, please reach out to us and we will provide it.

former_member333910
Active Participant
0 Kudos

It's on our backlog now. I'll ping you if we need more info. I don't know when it will make it to the top of the backlog though.

In the meantime, is it possible for you to use another pooling strategy as a workaround? Or does your use case really require the NEW_POOL_FOR_INPUT strategy?

Former Member
0 Kudos

Hi Justin, I'm one of the other developers working on the solution. Feel free to loop me in on any email correspondence as well, as I'm quite intrigued by this.

For the time being, our workaround will be to use a named pool strategy where we append a timestamp to the pool name. We have used this approach in another feed, and it does not exhibit the problem. We might implement a custom "named-pool-with-timestamp" pooling strategy that wraps the named-pool strategy, depending on how much effort it takes.
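
For illustration, the naming scheme we have in mind is nothing more than appending a timestamp to the configured base pool name, roughly like the helper below (the method and its name are ours for illustration, not part of the Data Hub API):

 // Illustrative helper: derives a unique pool name by appending a timestamp
 // to a configured base name, e.g. "ONBOARDING_20160608141314".
 protected String timestampedPoolName(String basePoolName) {
     String timestamp = new java.text.SimpleDateFormat("yyyyMMddHHmmss").format(new java.util.Date());
     return basePoolName + "_" + timestamp;
 }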

We appreciate you adding this ticket to your backlog!

Accepted Solutions (0)

Answers (1)

Former Member
0 Kudos

Make sure to execute your listener in a transaction:

 @Override public boolean executeInTransaction() { return true; }

Former Member
0 Kudos

Thanks for the response, but that is unfortunately already the case. Will add it to the main question.