Former Member
Mar 11, 2015 at 07:54 AM

Naming conventions - why the Hungarian Notation?



I constantly stumble over Hungarian notation in UI5 code, and I keep asking myself: why does SAP recommend it?

I found the source for it on this page:


We strongly recommend to use the Hungarian notation where name prefixes indicate the type for variables and object field names

Well, I feel strongly aligned with the book Clean Code by Robert C. Martin. Here is an excerpt:

Avoid Encodings

We have enough encodings to deal with without adding more to our burden. Encoding type or scope information into names simply adds an extra burden of deciphering. It hardly seems reasonable to require each new employee to learn yet another encoding “language” in addition to learning the (usually considerable) body of code that they’ll be working in. It is an unnecessary mental burden when trying to solve a problem. Encoded names are seldom pronounceable and are easy to mis-type.

Hungarian Notation

In days of old, when we worked in name-length-challenged languages, we violated this rule out of necessity, and with regret. Fortran forced encodings by making the first letter a code for the type. Early versions of BASIC allowed only a letter plus one digit. Hungarian Notation (HN) took this to a whole new level.

HN was considered to be pretty important back in the Windows C API, when everything was an integer handle or a long pointer or a void pointer, or one of several implementations of “string” (with different uses and attributes). The compiler did not check types in those days, so the programmers needed a crutch to help them remember the types.

In modern languages we have much richer type systems, and the compilers remember and enforce the types. What’s more, there is a trend toward smaller classes and shorter functions so that people can usually see the point of declaration of each variable they’re using.

Java programmers don’t need type encoding. Objects are strongly typed, and editing environments have advanced such that they detect a type error long before you can run a compile! So nowadays HN and other forms of type encoding are simply impediments. They make it harder to change the name or type of a variable, function, or class. They make it harder to read the code. And they create the possibility that the encoding system will mislead the reader.

I find that argument very convincing.

When I read UI5 code from other developers who use Hungarian notation, I don't feel comfortable because the code is very hard to read. Abbreviations that carry little meaning without their context are also very common, and the code is often not broken down into small enough functions.

Examples for unnecessary hungarian notation:

var oDateFormat vs var dateFormat

var oBtnCan vs var cancelButton

var oEntry vs var employee

var oHeaders vs var headerData

var oView vs var mainPageView

var sServiceUrl vs var approvalServiceUrl
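To make the contrast concrete, here is a small, hypothetical snippet (the names and the URL pattern are made up for illustration, not taken from any real UI5 app) showing the same logic written both ways:

```typescript
// Hypothetical example: Hungarian prefixes repeat what the code already
// shows, while the abbreviations ("fn", "s") hide the domain meaning:
function fnBuildUrl(sBase: string, sId: string): string {
  return sBase + "/Approvals('" + sId + "')";
}

// Without the prefixes, the names carry the domain meaning instead:
function buildApprovalUrl(approvalServiceUrl: string, employeeId: string): string {
  return approvalServiceUrl + "/Approvals('" + employeeId + "')";
}

// Both produce the same result; only readability differs.
console.log(buildApprovalUrl("/sap/opu/odata/ApprovalService", "0042"));
```

The logic is identical in both versions; the question is purely which set of names a maintainer can read and refactor more easily.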

So what was the reasoning among the SAP developers for establishing that notation in UI5, given the general perception in the wider software development community that it's not good practice?

And when we actually want to enforce types, shouldn't we rely on something like TypeScript or Flow?
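A minimal sketch of that idea (all names here are invented for illustration): with TypeScript, the compiler tracks and enforces the type, so a manually maintained "s" or "o" prefix adds nothing and can silently become wrong after a refactoring.

```typescript
// The annotation is checked by the compiler; no prefix needed.
const serviceUrl: string = "/sap/opu/odata/MyService";

// A structured type instead of a generic "oEntry":
interface Employee {
  name: string;
  id: number;
}

const employee: Employee = { name: "Jane", id: 7 };

// A mismatch is caught at compile time, long before the code runs:
// const broken: number = serviceUrl; // error: string is not assignable to number

console.log(employee.name + " -> " + serviceUrl);
```

Unlike a naming convention, this check cannot drift out of sync with the code: renaming a variable or changing its type forces every inconsistent use to be fixed.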