Former Member

Flat file space issue in Data Services

Hi Experts,

I am facing an issue with a flat file: we are getting data like "NUMBER " (with trailing spaces). When the job runs it fails and the data is not loaded into the target. If I remove the spaces in the source file, it runs fine.

The error log shows the following error message.

12/23/2018 7:37:13 AM  DBS-070401: ODBC data source <dhxhdb.wdf.sap.corp> error message for operation <SQLExecute>: <[SAP AG][LIBODBCHDB DLL][HDBODBC] General error;274 inserted value too large for column: Failed in "PLANNING_GRP_DESC" column with the value 'KR GB NUMBER   '>
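
For illustration, HANA counts trailing blanks toward the string length, so a value that fits the column without them can exceed it with them. A minimal sketch of the same failure (the table name and column length are assumptions, not the real target definition):

    -- 'KR GB NUMBER' is 12 characters, so the trimmed value just fits
    CREATE COLUMN TABLE demo_tgt (PLANNING_GRP_DESC NVARCHAR(12));
    -- the trailing blanks push the value past 12 characters, so HANA rejects it
    -- with SQL error 274: inserted value too large for column
    INSERT INTO demo_tgt VALUES ('KR GB NUMBER   ');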

Could you please help with this?

Regards,
Sravan


2 Answers

  • Feb 26 at 01:00 PM

    Use the ltrim_blanks and rtrim_blanks functions on the column mapping to remove leading and trailing blanks; see the sketch below.
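
    For example, the mapping for the column in a Query transform could wrap both functions (the source schema name SRC_FF is just a placeholder):

        rtrim_blanks(ltrim_blanks(SRC_FF.PLANNING_GRP_DESC))

    If only trailing blanks are the problem, rtrim_blanks alone is enough.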


  • Feb 27 at 05:53 AM

    Hello Sravan,

    As suggested by Ravi, you can use ltrim_blanks and rtrim_blanks, and you can also enable the overflow file option on your target table, so that if the job fails again the rejected rows are written to a file and you can review the erroneous data.
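
    If it is unclear whether trimming is enough or the column itself needs to be widened, you can also check the column's defined length in HANA; a sketch, assuming the target table is called MY_TARGET:

        SELECT COLUMN_NAME, DATA_TYPE_NAME, LENGTH
        FROM SYS.TABLE_COLUMNS
        WHERE TABLE_NAME = 'MY_TARGET'
          AND COLUMN_NAME = 'PLANNING_GRP_DESC';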

    Thanks!
