
Hitting the 2 billion record limit on HANA on a TDMS read - why? Table is partitioned

Mar 08 at 03:17 PM




Seems we've hit the HANA '2 billion records' issue!

During the "Preparation of Tables for Data Transfer" step of a TDMS time slice transfer, we get a short dump in the HANA Sender system:


error "SQL code: 129" occurred while accessing table "BSEG".
Database error text: "SQL message: transaction rolled back by an internal error: Search result size limit exceeded: 4776242334"

I've read SAP Note 2154870 ("How-To: Understanding and defining SAP HANA Limitations") and a few related forum threads. However, the table on which the error occurs in the sender system (a HANA system) is BSEG, which is already partitioned (6 partitions of approx. 680 million rows each, totalling 4.7 billion rows).

So why do we still hit the limit - is the TDMS code somehow not taking partitions into account?

The TDMS system itself is a Sybase ASE one but that shouldn't matter.
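Worth noting: per the limitations note, partitioning lifts the 2-billion-row cap per *partition*, but a single search result (or intermediate result) is still capped at roughly 2^31 rows - and the 4,776,242,334 figure in the dump matches the full table size, suggesting the TDMS selection reads the whole table as one result set. To double-check the partition layout and per-partition row counts on the sender side, a query along these lines against HANA's column-store monitoring view should work (a sketch - verify the column names on your HANA revision):

```sql
-- Per-partition record counts for BSEG in the column store.
-- M_CS_TABLES is HANA's standard column-store monitoring view.
SELECT SCHEMA_NAME, TABLE_NAME, PART_ID, RECORD_COUNT
FROM M_CS_TABLES
WHERE TABLE_NAME = 'BSEG'
ORDER BY PART_ID;
```

If each partition stays well under 2 billion rows but the sum exceeds it, the table itself is fine and the limit being hit is on the query result, not on storage.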



1 Answer

Ross Armstrong Mar 19 at 10:14 AM

Fix provided by SAP - it's in a SAP Note but it's as clear as mud.

"You can use the view V_CNVMBTPARAMS to maintain an entry in the CNVMBTPARAMS table, setting NODATA_SET_NEW_SELECTION to X. However, this is not required, as the Z-report mentioned in the OSS note (2295463) already makes the customizing required to prevent this issue for the current package as well as future packages."
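For reference, the manual workaround amounts to adding one parameter row. A hypothetical sketch of what that entry could look like - the CNVMBTPARAMS field names below are assumptions, not taken from the note, and in practice the entry is maintained through view V_CNVMBTPARAMS (e.g. via SM30) rather than by direct SQL:

```sql
-- HYPOTHETICAL sketch only: actual CNVMBTPARAMS field names may differ.
-- Maintain via view V_CNVMBTPARAMS in the TDMS control system instead of
-- writing to the table directly.
INSERT INTO CNVMBTPARAMS (PACKID, PARAM_NAME, PARAM_VALUE)
VALUES ('<your package ID>', 'NODATA_SET_NEW_SELECTION', 'X');
```

As the note says, running the Z-report from SAP Note 2295463 should make this manual step unnecessary.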


Did this solution work for you?

This is interesting. We also hit the same 2 billion record issue for BSEG, and SAP came back saying that at present there is no solution and this table has to be transferred separately, outside TDMS. So we ended up 'exporting' only this table and then 'importing' it into the target system.


Interesting indeed... yes, the solution suggested worked for us.