
Rows per commit for LONG datatype


Hi, when I try to migrate data I get the warning below, and the migration is taking a long time.

Warning: Rows per commit for table loader has been reset to 1 because table contains a LONG or BLOB column.

I did some searching and didn't find a solution.

This is causing delays. I've already loaded the data to DEV; since it's a DEV environment the time doesn't matter there, but my concern is the move to PROD.

Database: MySQL on AWS (RDS).

SAP DS: 14.2.4

We have a huge amount of data to migrate.

Please help.


As I expected, BODS is giving me the timeout error below. It's definitely not a MySQL RDS issue, as the timeouts are set to large values (8 hours).

38207520 DBS-070401 8/17/2018 1:02:26 PM |Data flow DF_abc|Reader Q_xyz ODBC data source <QA_DS1> error message for operation <SQLFetchScroll>: <[MySQL][ODBC 8.0(w) Driver][mysqld-5.7.19-log]Lost connection to MySQL server during query>.

MySQL timeouts: SHOW VARIABLES LIKE '%_timeout';

connect_timeout = 10
delayed_insert_timeout = 300
have_statement_timeout = YES
innodb_flush_log_at_timeout = 1
innodb_lock_wait_timeout = 50
innodb_rollback_on_timeout = OFF
interactive_timeout = 28800
lock_wait_timeout = 31536000
net_read_timeout = 30
net_write_timeout = 60
rpl_stop_slave_timeout = 31536000
slave_net_timeout = 60
wait_timeout = 28800
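The two values that stand out for a long fetch are net_read_timeout (30 s) and net_write_timeout (60 s); they, rather than the 8-hour wait_timeout, are the usual suspects behind "Lost connection to MySQL server during query". A minimal sketch of what to try (the 600-second values are illustrative only):

-- Raise the network timeouts well above the longest expected fetch.
SET SESSION net_read_timeout = 600;
SET SESSION net_write_timeout = 600;
-- The DS reader opens its own session, so in practice these would need
-- to be raised globally; on RDS that means the DB parameter group.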

dirk.venken

aasavari.bhave

former_member187605

MySQL 5.7 is not supported in DS yet.


Thank you, Dirk.

I know DS does not support MySQL 5.7 yet, but the issue is not something new to 5.7, correct?

Accepted Solutions (0)

Answers (2)


select max(length(col1)) from table1;

18,726,618 -- XML data

As suggested in this article, based on the max length I cannot even convert the column to varchar in a Query transform.
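If only a few rows carry the giant XML values, it may be worth measuring that before ruling varchar out entirely. A quick sketch, assuming a cutoff of 65,535 characters (substitute the limit your DS version actually imposes):

-- Count how many rows would overflow a varchar conversion.
select count(*) as total_rows,
       sum(length(col1) > 65535) as rows_too_long
from table1;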

jessie.wu01 ravikiran.pagidi

Any help is appreciated.

Both my source and target are MySQL 5.7 on AWS.

former_member208402

Hi Lakshman,

Is your source an XML file?


No, the source is MySQL, but one field contains XML data, almost 32 MB in size.
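Note that a 32 MB value is also larger than the MySQL 5.7 default max_allowed_packet of 4 MB, so that variable is worth checking on both source and target alongside the timeouts:

SHOW VARIABLES LIKE 'max_allowed_packet';
-- 5.7 defaults to 4 MB; a 32 MB XML value cannot fit in one packet.
-- On RDS this is raised via the DB parameter group; on a self-managed
-- server: SET GLOBAL max_allowed_packet = 64*1024*1024;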

former_member442248

Hi Lakshman,

You will not find a solution because this is not really a problem; it is how DS works. The reason I can think of is that every row is stored in a buffer, and when the rows-per-commit limit is reached the buffer is committed to the database. LONG and LOB rows are already large, and letting them accumulate in the buffer could impact the system, which in PROD could be running thousands of jobs; hence the default value of 1.
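In SQL terms, the difference between rows per commit = 1 and a batched commit looks roughly like the sketch below (table and values are hypothetical):

-- Rows per commit = 1: every row pays the full cost of a transaction.
insert into target1 (id, payload) values (1, '...');
commit;
insert into target1 (id, payload) values (2, '...');
commit;

-- Rows per commit = N: one commit amortized over N rows, but all N
-- LONG/BLOB rows must be buffered until the commit.
start transaction;
insert into target1 (id, payload) values (1, '...');
insert into target1 (id, payload) values (2, '...');
commit;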

I have never used MySQL, but can you please check whether it offers a bulk-loading option? I read somewhere in the past that the loading performance issues with this data type can be handled by using a bulk loader.

Regards, S
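For reference, MySQL's native bulk path outside of DS is LOAD DATA. A minimal sketch, assuming the data has first been exported to a client-side CSV (file and table names are hypothetical); on RDS the server filesystem is not reachable, so the LOCAL variant, which reads from the client machine, is the usual route:

load data local infile '/tmp/table1.csv'
into table table1
fields terminated by ',' optionally enclosed by '"'
lines terminated by '\n';

Multi-megabyte XML values with embedded newlines and quotes make delimiter handling tricky, so test the round trip on a small sample first.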


Thanks for the response, Shazin.

After doing a lot of research on Google, I came to the same conclusion: there is no solution; this is how BODS works.

Yes, I also read posts suggesting the bulk loader, but I don't see that option, and I'm not sure why.