
How add new pool to Data hub to import price and invoice data

Former Member

Hi, when I look at the hybris wiki for loading pricing and invoice data from ERP into hybris using Data Hub, the pool settings are as follows:

 sapcoreconfiguration.pool=SAPCONFIGURATION_POOL
 sapcoreconfiguration.autocompose.pools=GLOBAL,SAPCONFIGURATION_POOL

Does this mean the price data and the invoice data will be loaded into the same pool? If so, will there be any performance issue? And if I have to load them into different pools, what steps do I need to follow, and where do I have to make the corresponding changes?

Accepted Solutions (1)

mpern
Employee

I had a similar requirement regarding stock levels.

First, create a new datahub extension that depends on sapidocintegration (part of the SAP integrations).
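A minimal sketch of what that dependency might look like in the custom extension's pom.xml (the groupId, artifactId, and version property here are assumptions; check the coordinates shipped with your SAP integration package):

```xml
<!-- hypothetical pom.xml fragment for the custom datahub extension;
     coordinates are assumptions, verify against your installation -->
<dependency>
    <groupId>com.hybris.datahub</groupId>
    <artifactId>sapidocintegration</artifactId>
    <version>${datahub.version}</version>
</dependency>
```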

To change where the items end up, override the relevant IDOCMappingService bean in the spring.xml file of your extension. Here is the example for stock levels:

     <bean id="sapproductLOISTDMappingService" class="com.hybris.datahub.sapidocintegration.IDOCMappingService">
         <property name="rawFragmentDataExtensionSource" value="sapproduct" />
         <property name="rawFragmentDataType" value="RawLOISTD" />
         <property name="rawFragmentDataFeedName" value="STOCK_FEED" />
     </bean>

As you can see, you can configure the rawFragmentDataFeedName to a different feed. The feed defines in which pool the raw items end up.
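For the price and invoice data from the original question, the same pattern should apply: override the corresponding mapping-service beans and point each at its own feed. A hedged sketch only (the bean id, source extension, and raw type below are made up for illustration; look up the actual bean definitions in the spring.xml of the SAP integration extensions and override those):

```xml
<!-- hypothetical: route price raw items to a dedicated feed;
     id, extension source, and raw type must match the real
     bean definitions in your SAP integration extension -->
<bean id="examplePriceMappingService" class="com.hybris.datahub.sapidocintegration.IDOCMappingService">
    <property name="rawFragmentDataExtensionSource" value="exampleExtension" />
    <property name="rawFragmentDataType" value="RawExamplePrice" />
    <property name="rawFragmentDataFeedName" value="PRICE_FEED" />
</bean>
```

The PRICE_FEED would then need its own feed/pool entries in essential-data, analogous to the stock example below.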

To generate a new pool on startup with create-drop, we added some CSV files to META-INF/essential-data of the custom extension (I am not sure if this still works in datahub version 6+!):

 ├── src
 │   └── main
 │       └── resources
 │           └── META-INF
 │               ├── essential-data
 │               │   ├── essential-data.mcsv
 │               │   ├── stock-feed.csv
 │               │   └── stock-pool.csv

essential-data.mcsv

 # imports default feeds and pools
 DataHubPool,stock-pool.csv
 DataHubFeed,stock-feed.csv

stock-pool.csv

 poolId,poolName
 20000,"STOCK_POOL"

stock-feed.csv

 feedId,name,description,poolingStrategy,poolingCondition
 20000,"STOCK_FEED","Feed for stock levels","NAMED_POOL","STOCK_POOL"

To enable autocompose and autopublish for the new pool, you have to adapt sapcoreconfiguration.autocompose.pools and sapcoreconfiguration.autopublish.targetsystemsbypools:

 sapcoreconfiguration.autocompose.pools=GLOBAL,STOCK_POOL,...
 sapcoreconfiguration.autopublish.targetsystemsbypools=STOCK_POOL.HybrisCore,...
former_member224482
Active Contributor

Great answer.

mpern
Employee

Thank you, that means a lot coming from you

Former Member

Hi, thanks for the update. I need one more piece of help: at my POC stage I don't have an SAP connection. Is there any way I can place a sample IDoc somewhere, read it from there, and export the data into hybris? If yes, what do I need to do for that? Can you help me with the changes I'd have to make?

mpern
Employee

Can you please post a new question with the details of your problem?

rahulverma94
Active Participant

Will this work with datahub 6.7? If not, could you please suggest an alternative? Also, I don't see an essential-data folder under the META-INF folder in my datahub 6.7 setup. Does that mean we need to create it manually?

Answers (0)