DataHub - target system publication items get stuck in PENDING status

Former Member

Hi all,

We have a DataHub server instance where publication was working until today. Today we noticed that the items are no longer being sent to hybris. We made a call to the /target-system-publications endpoint in DataHub and most of the items (14 out of 15) were in PENDING status. Previously we had some items (2) which generated an error during publishing.

We were trying to replicate B2B customers when we encountered this issue.

We are using DataHub version 6.0.0.

Any advice on how to overcome this issue would be helpful. Thanks in advance.

Accepted Solutions (1)


Slava
Advisor

Alin,

it's hard to guess what's going wrong. However, you're saying 14 out of 15 publications are in PENDING status, so what is the status of the remaining one? Is it in IN_PROGRESS status? If so, then you can reformulate the problem as why a publication is stuck in IN_PROGRESS status. No other publications can be kicked off in the same data pool while the previous publication is not finished, so that explains the PENDING status of the other publications.

As to the IN_PROGRESS publication, most likely the datahub-adapter on the hybris platform side failed to report completion of the publication back to DataHub. Check the platform log and see if there are exceptions indicating a problem there.

To terminate the publication that is stuck in IN_PROGRESS, you can send a PUT request to /core-publications/{publicationID} on the DataHub side with

 {
     "crashReport" : "terminated",
     "exportErrorDatas" : []
 }

in the body.
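For illustration, a one-off call could look something like the sketch below. It is only a minimal example using plain java.net.HttpURLConnection; the host, port, and publication ID are placeholders for your own environment (the /datahub-webapp/v1 context path is the one used elsewhere in this thread):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class TerminatePublication {

    // Placeholder base URL - adjust host and port to your DataHub instance.
    private static final String BASE_URL = "http://localhost:8080/datahub-webapp/v1";

    public static void terminate(String publicationId) throws Exception {
        URL url = new URL(BASE_URL + "/core-publications/" + publicationId);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("PUT");
        connection.setRequestProperty("Content-Type", "application/json");
        connection.setDoOutput(true);

        // The body suggested above: report the publication as terminated with no export errors.
        String body = "{ \"crashReport\" : \"terminated\", \"exportErrorDatas\" : [] }";
        try (OutputStream out = connection.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        System.out.println("Response code: " + connection.getResponseCode());
        connection.disconnect();
    }

    public static void main(String[] args) throws Exception {
        terminate("12345"); // placeholder publication ID
    }
}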

After that, the PENDING publications should start processing. If this problem becomes frequent, you can develop a DataHub extension which keeps track of current publications and, if a publication runs for an unacceptably long time, terminates it. This feature will be available in DataHub 6.4, but for 6.0 it's going to be a custom development.
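Just to illustrate the idea (this is not the 6.4 feature itself, only a rough sketch), such a watchdog could be structured like the skeleton below. The lookup of long-running publications is left as a stub because it depends on how you query DataHub, and the termination re-uses the TerminatePublication sketch above:

import java.util.Collections;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PublicationWatchdog {

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    public void start() {
        // Check once a minute for publications that have been running for too long.
        scheduler.scheduleAtFixedRate(this::terminateStuckPublications, 1, 1, TimeUnit.MINUTES);
    }

    private void terminateStuckPublications() {
        for (String publicationId : findLongRunningPublicationIds()) {
            try {
                // Re-use the PUT /core-publications/{id} call from the sketch above.
                TerminatePublication.terminate(publicationId);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private List<String> findLongRunningPublicationIds() {
        // Stub: query /target-system-publications (or the DataHub services directly from
        // an extension) and return the IDs of publications that exceeded your time limit.
        return Collections.emptyList();
    }
}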

Hope it helps.

Former Member

Hello. Yes, the other item is in IN_PROGRESS status.

Thank you for your response that helped a lot.

felipe_kunzler
Explorer

Would there be a similar way to terminate PENDING publications?

Slava
Advisor

No, there is not. You can update the database to change the status from PENDING to FAILED and then restart DataHub. Prior to 6.0 (or 6.1) you could simply restart DataHub to clear the queues, but often a restart is not an option, especially in production.
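If you go the database route, the JDBC sketch below shows the idea. Note that the connection details and especially the table and column names are placeholders I am assuming for illustration only, so check the actual DataHub schema in your database first:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FailPendingPublications {

    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL, credentials, and table/column names - look up the real
        // publication table in your DataHub schema before running anything like this.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/datahub", "datahub", "password");
             PreparedStatement stmt = con.prepareStatement(
                "UPDATE publication_action SET status = ? WHERE status = ?")) {
            stmt.setString(1, "FAILED");
            stmt.setString(2, "PENDING");
            int updated = stmt.executeUpdate();
            System.out.println("Publications marked FAILED: " + updated);
        }
        // As said above, DataHub still needs a restart afterwards.
    }
}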

nkjahan1388
Explorer

You mentioned: "After that, the PENDING publications should start processing. If this problem becomes frequent, you can develop a DataHub extension which keeps track of current publications and, if a publication runs for an unacceptably long time, terminates it. This feature will be available in DataHub 6.4, but for 6.0 it's going to be a custom development."

Is this feature available from 6.4 onwards? Is it a separate extension?

Slava
Advisor

Yes, that's correct. The extension distributed with DataHub is called datahub-cleanup.

nkjahan1388
Explorer

What about terminating a publication if it has been in progress for too long? Is that handled by the datahub-cleanup extension? We are facing issues where publications are stuck in progress for a long time. Sometimes there is no corresponding ImpEx import job running in Hybris either. Terminating a publication isn't working in 6.6: it says the publication is either pending or in progress and all further actions have been terminated. Neither the publication is terminated nor do the other pending publications resume until we do a restart. Any help/suggestions to handle this scenario?

Slava
Advisor

Jahan,

that is the pub-recover extension, which has been available in the hybris artifactory since the 6.4 release. It also made its way into the patch releases for 6.0 and 6.3. See if it works for you.

nkjahan1388
Explorer

Thanks. Can you please help me identify the property that will kill/terminate the publication after a specified time? What's the difference between the following two properties?

  1. datahubadapter.retry.initial.interval.millis

  2. datahubadapter.retry.max.interval.millis

Slava
Advisor

DataHub performs a progressive retry to connect to a target system during publication, so, if the target system is not available, datahub.retry.initial.interval.millis sets how soon the first re-attempt to connect will be made. Then, if the connection fails again, it doubles the interval and attempts again. It does so until datahub.retry.max.interval.millis is reached.
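With made-up example values (not defaults), the resulting retry schedule behaves roughly like this sketch:

public class RetryScheduleExample {

    public static void main(String[] args) {
        // Example values only - substitute your configured properties.
        long initialIntervalMillis = 1000;   // *.retry.initial.interval.millis
        long maxIntervalMillis = 60000;      // *.retry.max.interval.millis

        // Each failed connection attempt doubles the wait until the max interval is reached.
        long interval = initialIntervalMillis;
        int attempt = 1;
        while (interval <= maxIntervalMillis) {
            System.out.println("Retry " + attempt + " after " + interval + " ms");
            interval *= 2;
            attempt++;
        }
    }
}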

Slava
Advisor

For what you need, read the Monitoring Long Running Publications documentation.

nkjahan1388
Explorer

Thanks, that was really helpful. However, I configured the following properties to test this locally. Notice that after 1 minute the publication does time out, but its status isn't changed to 'failure'; it remains in the 'pending' state. So any other feed that belongs to this pool is still stuck, waiting for the pending publication. Any thoughts?

datahub.publicationmonitor.enabled=true

datahub.publication.monitor.interval=60000

datahub.publication.timeoutinminutes=1

Slava
Advisor

Jahan,

it's hard to say what the problem is. I would recommend the usual steps:

  1. Make sure pub-recover.jar is on the DataHub classpath

  2. Make sure it's loaded at DataHub startup (search for 'Loading extension pub-recover' in the log).

  3. Examine the log for other clues of why the feature did not work (messages, stack traces, etc). You should see "Checking for IN_PROGRESS publications" every minute in the log.

Answers (1)


mauricio_calcagno
Discoverer

I have a similar issue. I have two data pools, one for customers and the other for orders. The customer pool is running correctly but the order pool is stuck. I checked http://datahubserver/datahub-webapp/v1/target-system-publications/ and I only see the customer pool publications; if I go to http://datahubserver/datahub-webapp/v1/pools/ORDER_INBOUND_POOL/publications, all of them are in PENDING status. I checked and the pub-recover extension is loaded correctly (Data Hub version: 6.6.0.3-RC1). Any idea what it can be? Thanks

Slava
Advisor

If they're all in PENDING status, most likely auto-publication is not configured properly, and for that reason the publication does not start after composition. See if this document helps you: https://help.hybris.com/1808/hcd/7cb1b38932bd4da0a4f02c5ccdaad0ce.html Specifically, look at specifying target systems for the ORDER_INBOUND_POOL.
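As a quick check, you can also try to start a publication for that pool manually and see whether it runs at all. In DataHub 6.x this is done with a POST to /pools/{poolName}/publications; the target system name below is just an example (use the one configured for your order items), and the exact body format may differ between versions, so verify it against the REST API documentation:

POST http://datahubserver/datahub-webapp/v1/pools/ORDER_INBOUND_POOL/publications

 {
     "targetSystemPublications" : [
         { "targetSystemName" : "HybrisCore" }
     ]
 }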