
Normalization for HANA Database

Former Member

Hi All,

I've been reading different articles about what's best for HANA database design with regard to NORMALIZATION. Some opinions state that HANA tables should be flattened (does this mean putting everything into one single table?), while others say it's standard practice to normalize your database tables regardless of whether they are in HANA.

My question is: what is best for HANA? Should we still follow the normal forms, or loosen those concepts for HANA?

Thanks in advance!

Accepted Solutions (1)

lbreddemann
Active Contributor

Please review the HANA guidelines for developers documentation for the general approach to database modelling in SAP HANA. You will find that it closely resembles classic entity-relationship modelling, but doesn't implement the relations via constraints. Instead, associations/joins are declared in the information models and CDS views.
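
As a rough illustration of that approach (hypothetical table and column names, with plain SQL on SQLite standing in for HANA's modelling tools), the tables carry no foreign-key constraint; the join is declared once, in a view, and consumers query the view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Entities are modelled as plain tables; note there is no FOREIGN KEY
# constraint -- the relationship is not enforced at the table level.
cur.execute("CREATE TABLE invoice_header (id INTEGER PRIMARY KEY, customer TEXT)")
cur.execute("CREATE TABLE invoice_item (id INTEGER PRIMARY KEY, "
            "header_id INTEGER, product TEXT, amount REAL)")

# Instead, the association is declared once, in a view (standing in for
# a HANA information model or CDS view), and consumers query the view.
cur.execute("""
    CREATE VIEW invoice_items_with_header AS
    SELECT h.id AS header_id, h.customer, i.id AS item_id, i.product, i.amount
    FROM invoice_header h
    JOIN invoice_item i ON i.header_id = h.id
""")

cur.execute("INSERT INTO invoice_header VALUES (1, 'ACME')")
cur.executemany("INSERT INTO invoice_item VALUES (?, ?, ?, ?)",
                [(10, 1, 'bolt', 2.5), (11, 1, 'nut', 1.2)])

rows = cur.execute("SELECT customer, product, amount "
                   "FROM invoice_items_with_header ORDER BY item_id").fetchall()
print(rows)  # each item row carries the customer from its header
```

The base tables stay normalized; only the view knows about the relationship.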

"Flattening" tables basically means pre-joining two tables, which means that the data in those tables is no longer independent and that data duplication is likely and expected. Think of a 1:n join of two tables, say "invoice header":"invoice line item". By pre-joining the two tables, the header information is repeated for every line item. When dealing with such a table, one has to be aware of the group of repeating columns and program accordingly. While this approach saves the join execution when the data is queried, it adds complexity to the data model.
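
A minimal sketch of that duplication (hypothetical names, SQLite standing in for HANA):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The pre-joined ("flattened") table: header columns live next to the
# line-item columns in a single table.
cur.execute("""
    CREATE TABLE invoice_flat (
        header_id INTEGER, customer TEXT, invoice_date TEXT,  -- header part
        item_id INTEGER, product TEXT, amount REAL            -- item part
    )
""")

# One header with three line items: the header values must be repeated
# on every item row -- that duplication is the price of skipping the join.
cur.executemany("INSERT INTO invoice_flat VALUES (?, ?, ?, ?, ?, ?)",
                [(1, 'ACME', '2016-01-15', 10, 'bolt', 2.5),
                 (1, 'ACME', '2016-01-15', 11, 'nut', 1.2),
                 (1, 'ACME', '2016-01-15', 12, 'washer', 0.3)])

# Reading header attributes now requires de-duplication, e.g. DISTINCT...
headers = cur.execute("SELECT DISTINCT header_id, customer "
                      "FROM invoice_flat").fetchall()
print(headers)  # [(1, 'ACME')]

# ...and changing a header value means touching every item row.
cur.execute("UPDATE invoice_flat SET customer = 'ACME Ltd' WHERE header_id = 1")
print(cur.rowcount)  # 3 rows changed for one logical header update
```

These are exactly the repeating-column surprises the application code has to be programmed around.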

Typically, one wants to reduce complexity and keep the model as simple as possible, which also means keeping it as free from surprises as possible. This is, of course, true for all SQL databases and not just for SAP HANA; but it applies to SAP HANA as well, and it has served me well in my design decisions.

If during your performance testing you realise (by measuring) that a specific join takes too much time to meet your application's goals, de-normalisation is one of several techniques to consider. Personally, it would be one of my last choices, as it is rather heavy-weight and requires additional effort to deal with the added complexities mentioned above.

Frank mentioned one example from the S/4HANA simplification re-modelling where combining multiple tables into a single one was chosen. It is important to understand that the application developers there could draw on decades of experience with the existing data model and how it was used by the different applications. This insight, combined with the processing capabilities of SAP HANA and extensive testing of the design, allowed them to take the informed decision to create the combined table. Such a design also requires additional changes so that the applications work correctly with the new table (i.e. the compatibility views). I mention this to underline that de-normalisation is not a light-weight decision and cannot be made without major impact on the rest of the application.

Answers (1)

architectSAP
Active Contributor

Hello Marvin,

This depends on your requirements. SAP, for example, combined multiple tables into one Universal Journal table in SAP S/4HANA to allow for fast in-memory access and columnar aggregation. However, if you were designing a classical row-based data model, then normalization would continue to be considered best practice to avoid redundancies.
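
To make the aggregation point concrete, here is a toy sketch with made-up names (SQLite is row-based, so this only illustrates the query shape, not HANA's columnar speed): because all the relevant columns sit in one wide table, an aggregate needs no joins at all, just one scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A toy "journal" table: dimensions and the measure live side by side
# in one wide table, so aggregation is a single join-free GROUP BY.
cur.execute("CREATE TABLE journal (company TEXT, account TEXT, amount REAL)")
cur.executemany("INSERT INTO journal VALUES (?, ?, ?)",
                [('DE01', '4000', 100.0), ('DE01', '4000', 50.0),
                 ('DE01', '5000', 75.0), ('US01', '4000', 20.0)])

totals = cur.execute("""
    SELECT company, account, SUM(amount)
    FROM journal
    GROUP BY company, account
    ORDER BY company, account
""").fetchall()
print(totals)
```

In a columnar store, such a query only needs to touch the few columns it references, which is what makes the combined-table design attractive there.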

Best regards

Frank