Former Member
Mar 01, 2016 at 04:38 PM

Datahub 5.7.0.7: Publication is broken (stuck in endless loop)


Steps to reproduce

  1. Set the property datahub.composition.batch.size=1 (the bug shows itself sooner this way)

  2. Load 2 canonical items

  3. Publish to DevNull

  4. Publish again

Result -> endless loop

 2016-03-01 16:36:16,036 [TRACE] [c.h.d.r.j.i.DefaultCanonicalItemJpaRepository] Retrieved composed canonical 1 items of type CanonicalType
 2016-03-01 16:36:16,043 [TRACE] [c.h.d.r.j.i.DefaultCanonicalItemJpaRepository] Filtered publishable canonical 0 items of type CanonicalType



Analysis

com.hybris.datahub.repository.jpa.impl.DefaultCanonicalItemJpaRepository#findItemsToBePublished

nextPageRequest is always built from pageRequest.lastProcessedId (not from nextPageRequest.lastProcessedId) -> nextPageRequest.lastProcessedId == 1 && nextPageRequest.pageSize == 1 forever

Why not use the max id of the elements in composedItems? Then at least the search would continue from where the last non-publishable items were found.
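
To illustrate the suspected behaviour, here is a minimal, self-contained Java sketch. All names in it (Item, PageRequest, findComposedItems, the in-memory STORE) are simplified stand-ins and not the actual Data Hub classes; only the loop structure follows the analysis above. With the cursor taken from the original pageRequest, the same non-publishable page is fetched over and over, matching the repeating trace lines; the commented-out alternative shows the max-id suggestion.

    import java.util.List;
    import java.util.stream.Collectors;

    /**
     * Minimal sketch of the suspected paging bug in findItemsToBePublished.
     * All types here are simplified stand-ins, not the real Data Hub classes.
     */
    public class FindItemsToBePublishedSketch
    {
        record Item(long id, boolean publishable) {}

        record PageRequest(long lastProcessedId, int pageSize) {}

        // Two composed items that were already published in the first run,
        // so neither is publishable anymore ("Filtered publishable 0 items").
        static final List<Item> STORE = List.of(new Item(1, false), new Item(2, false));

        static List<Item> findComposedItems(final PageRequest request)
        {
            return STORE.stream()
                    .filter(item -> item.id() > request.lastProcessedId())
                    .limit(request.pageSize())
                    .collect(Collectors.toList());
        }

        public static void main(final String[] args)
        {
            final PageRequest pageRequest = new PageRequest(0, 1); // batch size 1, as in step 1
            PageRequest nextPageRequest = pageRequest;
            List<Item> composedItems = findComposedItems(nextPageRequest);
            int guard = 0; // only so this demo terminates; the real loop has no such guard

            while (!composedItems.isEmpty() && guard++ < 5)
            {
                final long publishable = composedItems.stream().filter(Item::publishable).count();
                System.out.printf("Retrieved %d composed, %d publishable, lastProcessedId=%d%n",
                        composedItems.size(), publishable, nextPageRequest.lastProcessedId());

                // Suspected bug: the cursor comes from the original pageRequest, so it
                // is always 0 + 1 = 1 and the very same page is fetched forever.
                nextPageRequest = new PageRequest(
                        pageRequest.lastProcessedId() + pageRequest.pageSize(),
                        pageRequest.pageSize());

                // Suggested direction instead: advance to the max id of the page just
                // processed, so every iteration makes progress and the loop ends once
                // the store is exhausted.
                // nextPageRequest = new PageRequest(
                //         composedItems.stream().mapToLong(Item::id).max().orElseThrow(),
                //         pageRequest.pageSize());

                composedItems = findComposedItems(nextPageRequest);
            }
        }
    }

The guard variable exists only so the sketch terminates; the point is that once the cursor stops advancing, the real loop's exit condition (an empty page of composed items) can never be reached while the retrieved items are all non-publishable.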

EDIT

Here is the Data Hub log (log level: trace) from my test environment: two "CategoryCanonical" items are loaded, the first publish succeeds, and the second publish gets stuck in an endless loop.

EDIT 2

The proposed workaround unfortunately doesn't help:

  1. Set the following properties to the same value:

    datahub.composition.batch.size=1
    datahub.max.composition.action.size=1
    datahub.max.publication.action.size=1

  2. Load 2 Canonical Items

  3. Publish

  4. Restart Server (kernel.autoInitMode=update)

  5. Publish

  6. Same endless loop as before

The while loop in DefaultCanonicalItemJpaRepository#findItemsToBePublished is faulty.

Attachments

5048-datahub.zip (95.3 kB)