Hmm... this one looks tricky and could be a bug.
Tracing the actual output of the query on different levels (JDBC, server-side SQL trace) shows that the result decimal is always rounded to 34 digits of precision (i.e. ending in ...12350 instead of ...12345). Internally, though, processing is not affected by this "rounding": filtering/matching values works with the full precision, and feeding the data into functions also works with the full precision.
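The rounding effect on the client side can be illustrated with Python's decimal module (this is just a sketch of the arithmetic, not HANA's actual client code; the choice of ROUND_HALF_UP merely reproduces the observed "...12350" tail):

```python
from decimal import Decimal, Context, ROUND_HALF_UP

# The 35-digit value stored in column d1
full = Decimal("12345678901234567890123456789012345")

# Simulate the output-side rounding to 34 significant digits
ctx = Context(prec=34, rounding=ROUND_HALF_UP)
rounded = ctx.plus(full)  # plus() applies the context's precision

print(int(full))     # 12345678901234567890123456789012345 (full precision)
print(int(rounded))  # 12345678901234567890123456789012350 (tail rounded off)
```

The server keeps working with `full`, which is why comparisons and conversions still see all 35 digits; only the streamed result resembles `rounded`.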
Example 1:
select pk, to_bigint(d1) from t1 where d1 = 12345678901234567890123456789012345;

Started: 2017-09-18 14:34:43
Could not execute 'select pk, to_bigint(d1) from t1 where d1 = 12345678901234567890123456789012345' in 5 ms 283 µs.
[314]: numeric overflow: cannot convert to BigInt type: 12345678901234567890123456789012345 at function to_bigint() (at pos 23)

>>> FULL PRECISION IN ERROR MESSAGE
Example 2:
select pk, to_char(d1) from t1;

PK   TO_CHAR(D1)
101  12345678901234567890123456789012345

>>> FULL PRECISION AFTER CONVERSION TO CHAR
I just wanted to finish this comment off by writing that this should be checked by HANA support, when I found SAP Knowledge Base Article https://launchpad.support.sap.com/#/notes/2410312 (2410312 - Tail bits are set to zero for big decimal in SAP HANA). This KBA basically describes the same phenomenon and declares it "not a bug".
So, while HANA (both HANA 1 and HANA 2) handles decimals internally with the full precision of 38 digits, the direct output is limited to 34 digits.
cheers,
Lars
As Lars mentioned, internally all data is treated as Decimal(38,0). The issue occurs when streaming the data to the client, where it is rounded to 34 digits of precision. This is planned to be resolved in the next HANA release, HANA 2.0 SPS03.