Dear experts, we are using Data Hub 6.1.0.1 on Apache Tomcat 7.0.69.
I need to know how to handle high volumes in Data Hub. Daily, half a million to one million IDocs are expected to be pushed from SAP to Data Hub. How can we handle this scenario without running into performance or memory issues? Our Data Hub runs on a system with 16 GB of memory, and we have already set the following memory- and GC-related options:
export CATALINA_OPTS="-Xms8G -Xmx8G -XX:+UseConcMarkSweepGC -XX:+UseParNewGC -XX:+AlwaysPreTouch -XX:+DisableExplicitGC -XX:+PrintGCTimeStamps -XX:+PrintGCDetails -Xloggc:'/app/tomcat/7.0.69/logs/java_gc.log'"
I also see a few more properties with their default values below. How can we tweak these values to handle the high-volume scenario?
datahub.import.batch.size=1000
datahub.config.mode=STANDARD
datahub.composition.batch.size=1000
datahub.max.composition.action.size=100000
datahub.publication.page.size=1000
datahub.max.publication.action.size=100000
datahub.jdbc.statement.batch.size=1000
datahub.sql.maxUpdateBatchSize=1000
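For example, would raising the batch-related properties along these lines be a reasonable starting point? (These values are only a guess on our side, not recommendations from any SAP documentation; we would measure throughput and GC behavior after each change.)

# Hypothetical experiment only -- larger batches reduce per-batch overhead
# but increase heap pressure per batch, so each step should be load-tested.
datahub.import.batch.size=2000
datahub.composition.batch.size=2000
datahub.publication.page.size=2000
datahub.jdbc.statement.batch.size=2000
datahub.sql.maxUpdateBatchSize=2000

Or is it better to keep the batch sizes at their defaults and scale out instead?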
We need your expert advice on this.