
Servers for S/4HANA on Google Cloud

Background:

We are in the process of evaluating systems for hosting S/4HANA on Google Cloud (a conversion from ECC).

(1) The sizing report run against ECC has estimated the initial memory requirement for SAP HANA to be ~16 TB

(2) The largest HANA-certified GCP VM today is the m2-ultramem-416 with 12 TiB of memory. Since our requirement (~16 TB) exceeds this, S/4HANA would need to be deployed in a scale-out architecture, which requires GCP VMs with clustering support enabled.

(3) The largest HANA-certified GCP VM today with clustering support enabled is a 4 TiB system (n1-ultramem-160).

(4) According to SAP note 2408419 (SAP S/4HANA - Multi-Node Support), S/4HANA scale-out on Intel hardware is only supported on nodes with at least 8 CPUs and 6 TB of RAM each, up to a maximum of 4 nodes.

Given the above, the question is:

(a) What are the recommended VMs for hosting S/4HANA on GCP given the 16 TB memory requirement and the constraints above? I suppose we cannot choose 4 x n1-ultramem-160 nodes, since per note 2408419 a node with < 6 TB of memory is not supported for hosting S/4HANA.
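To sanity-check the arithmetic, here is a quick Python sketch. The node sizes and limits are taken from the points above as of today; treat them as assumptions and verify against the current SAP note 2408419 and the GCP certified-hardware directory.

    import math

    REQUIRED_TB = 16   # sizing-report estimate for SAP HANA memory
    MIN_NODE_TB = 6    # note 2408419: >= 6 TB RAM per node (Intel)
    MAX_NODES = 4      # note 2408419: at most 4 nodes

    # Candidate VM shapes from the points above: memory per node in TB,
    # and whether clustering (scale-out) support is enabled on GCP.
    candidates = {
        "n1-ultramem-160": (4, True),
        "m2-ultramem-416": (12, False),
    }

    for vm, (node_tb, clustering) in candidates.items():
        nodes = math.ceil(REQUIRED_TB / node_tb)
        reasons = []
        if nodes > 1 and not clustering:
            reasons.append("no clustering support")
        if nodes > 1 and node_tb < MIN_NODE_TB:
            reasons.append(f"node size {node_tb} TB < {MIN_NODE_TB} TB minimum")
        if nodes > MAX_NODES:
            reasons.append(f"{nodes} nodes > {MAX_NODES} maximum")
        verdict = "OK" if not reasons else "NOT supported (" + ", ".join(reasons) + ")"
        print(f"{vm}: {nodes} node(s) of {node_tb} TB -> {verdict}")

As the output shows, neither shape satisfies all three constraints at 16 TB, which is exactly the dilemma described above.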

Thanks,

Rakesh



3 Answers

  • Best Answer
    Posted on Sep 25, 2019 at 01:11 PM

    Rakesh

    Experience shows that sizing reports should not be fully trusted. I would try to run your environment on the 12 TB system: HANA has mechanisms to bring the system up even when memory is tight, such as unloading less-recently-used column tables.
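    Once the system is up, you can check how close it actually runs to its limit. A minimal sketch using SAP's hdbcli Python driver (host, port, and credentials below are placeholders):

        from hdbcli import dbapi  # SAP HANA Python client

        # Placeholder connection details - replace with your system's values.
        conn = dbapi.connect(address="hana-host", port=30015,
                             user="MONITOR_USER", password="...")
        cur = conn.cursor()

        # Compare actual memory usage per service against its effective allocation limit.
        cur.execute("""
            SELECT HOST, SERVICE_NAME,
                   ROUND(TOTAL_MEMORY_USED_SIZE / 1024/1024/1024, 1),
                   ROUND(EFFECTIVE_ALLOCATION_LIMIT / 1024/1024/1024, 1)
            FROM M_SERVICE_MEMORY
        """)
        for host, service, used_gb, limit_gb in cur.fetchall():
            print(f"{host}/{service}: {used_gb} GB used of {limit_gb} GB limit")
        conn.close()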

    You are preparing a brownfield conversion, right? And you are NOT coming from a HANA database.

    Would it be possible for you, in parallel to the S/4 conversion project, to do a Suite on HANA test using DMO with System Move? Get your ECC on HANA on GCP, see how big the system is without the index tables, and even see how much ECC data you can move to cold storage before the migration to HANA. A sketch of the footprint check follows below.
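    Once the test system is on HANA, a rough footprint check is straightforward. This sketch (hdbcli again, placeholder credentials) sums the estimated maximum column-store size, which no longer includes the classic secondary indexes that are typically dropped during the migration:

        from hdbcli import dbapi  # SAP HANA Python client

        # Placeholder connection details - replace with your system's values.
        conn = dbapi.connect(address="hana-host", port=30015,
                             user="MONITOR_USER", password="...")
        cur = conn.cursor()

        # Estimated maximum memory footprint of all column-store tables when fully loaded.
        cur.execute("""
            SELECT ROUND(SUM(ESTIMATED_MAX_MEMORY_SIZE_IN_TOTAL) / 1024/1024/1024, 1)
            FROM M_CS_TABLES
        """)
        print(f"Estimated column-store footprint: {cur.fetchone()[0]} GB")
        conn.close()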

    In summary: don't leave everything to the sizing report. There is still a lot to be done in your conversion to HANA to get your system to fit in 12 TB:

    • Determine which tables are uncompressed (see the sketch after this list)
    • Consider data archiving once you are on HANA
    • Consider hot/warm data tiering
    • Apply DVM (Data Volume Management) from Solution Manager against your system once you run on HANA
    • Keep in mind that the S/4HANA compression factor might not be as straightforward as it seems
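
    For the first item, here is a hedged sketch of how one might find candidate columns (hdbcli, placeholder credentials); the COMPRESSION_TYPE = 'DEFAULT' filter flags columns with only basic dictionary compression, i.e. no advanced compression applied yet:

        from hdbcli import dbapi  # SAP HANA Python client

        # Placeholder connection details - replace with your system's values.
        conn = dbapi.connect(address="hana-host", port=30015,
                             user="MONITOR_USER", password="...")
        cur = conn.cursor()

        # Largest column-store columns still on plain dictionary compression;
        # candidates for re-running compression optimization.
        cur.execute("""
            SELECT SCHEMA_NAME, TABLE_NAME, COLUMN_NAME,
                   ROUND(MEMORY_SIZE_IN_TOTAL / 1024/1024, 1) AS SIZE_MB
            FROM M_CS_COLUMNS
            WHERE COMPRESSION_TYPE = 'DEFAULT'
            ORDER BY MEMORY_SIZE_IN_TOTAL DESC
            LIMIT 20
        """)
        for schema, table, column, size_mb in cur.fetchall():
            print(f"{schema}.{table}.{column}: {size_mb} MB")
        conn.close()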

    Since you can't apply OLTP scale-out on GCP at this stage, the single 12 TB system is your only option.

    Regards.

    Mario


  • Posted on Sep 26, 2019 at 12:22 PM

    Mario,

    Thanks for your thoughts and tips. We're currently working on the tests.

    Regards,

    Rakesh


  • Posted on Jul 16, 2020 at 05:50 PM

    Hi Rakesh,

    I am wondering how the migration went. Would you be able to share your experience?

