Jun 14, 2018 at 03:38 AM

Data Services: loading a table into a HANA database causes database inconsistency

978 Views · Last edit Jun 19, 2018 at 05:00 AM (4 revisions)

Hi

We are using Data Services to move the table CDPOS from one system (MaxDB) to another (HANA). We are getting this error (among others), which leaves the database inconsistent:

ODBC data source <xxx> error message for operation <SQLExecute>: <[SAP AG][LIBODBCHDB SO][HDBODBC] General error;129 transaction rolled back by an internal error: AttributeEngine: not enough memory>.

Loading this table, which has about 33 million rows, causes consistency problems in the HANA target database. We have tried various table sizes, but it seems to be a data issue.

What we have found is that an extra character "^?" (octal 177, the same character DS uses as the field separator in the intermediate FTP file) appears inside the data of table CDPOS. When the table gets exported to the DS intermediate file, those rows contain an extra field delimiter, which DS neither detects nor reports. You only find out that the HANA DB is inconsistent after the load has failed.
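Rows like this can be spotted before the load by counting the 0x7F delimiters per line in the intermediate file. A minimal sketch (the file name `cdpos.dat` and the column count are assumptions; adjust them to your actual extract):

```python
# Scan a DS intermediate file for rows whose 0x7F (octal 177) delimiter
# count does not match the expected number of columns. Rows with stray
# ^? characters in the data will show one or more extra delimiters.

DELIM = b"\x7f"          # octal 177, the DEL character DS uses as separator
EXPECTED_COLS = 14       # hypothetical column count for the CDPOS extract

def bad_rows(path, expected_cols):
    """Yield (line_number, delimiter_count) for rows with a wrong delimiter count."""
    with open(path, "rb") as f:
        for lineno, line in enumerate(f, start=1):
            count = line.rstrip(b"\r\n").count(DELIM)
            if count != expected_cols - 1:
                yield lineno, count

# Usage (hypothetical file name):
#   for lineno, count in bad_rows("cdpos.dat", EXPECTED_COLS):
#       print(f"line {lineno}: {count} delimiters, expected {EXPECTED_COLS - 1}")
```

Reading in binary mode matters here: decoding the file first could mangle or hide the raw 0x7F bytes you are looking for.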

UPDATE: we have removed the rows containing the invalid ^? field delimiters, and are now getting a pure memory issue:

ODBC data source <xxxx> error message for operation <SQLExecute>: <[SAP AG][LIBODBCHDB SO][HDBODBC] General 129 transaction rolled back by an internal error: Allocation failed ; $size$=116432; $name$=libhdbcsstore.so;$type$=pool; $inuse_count$=807; $allocated_size$=338320>.
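Since the allocation failure happens inside a single large transaction, one workaround is to commit in smaller batches so HANA never has to hold the whole 33-million-row load in one transaction. A sketch of the batching side (the batch size is an assumption to be tuned; the actual load call depends on your ODBC driver):

```python
# Split the intermediate file into smaller commit batches so that each
# HANA transaction allocates a bounded amount of memory, instead of one
# 33-million-row transaction. batch_size=100_000 is an assumption; tune
# it against the allocation failures you observe.

def batches(path, batch_size=100_000):
    """Yield lists of raw lines, at most batch_size lines per list."""
    batch = []
    with open(path, "rb") as f:
        for line in f:
            batch.append(line)
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:                 # flush the final partial batch
        yield batch

# Each yielded batch would then be inserted and committed separately
# (e.g. via your ODBC driver's executemany), keeping per-transaction
# memory bounded.
```

Within Data Services itself, the equivalent knob is reducing the commit/batch size on the HANA target table loader rather than loading everything in one go.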

Any ideas on how to work around this?