
Special characters not loading after upgrade from Data Services 3.1 to 4.2 SP6

Feb 23, 2017 at 11:06 PM



When we were using Data Services 3.1, our load of the KNA1 table from SAP into SQL Server 2005 handled special characters such as "é", which appeared as "?" in the NAME1 attribute of the SQL table.

Since we upgraded Data Services to 4.2 SP6, the KNA1 load keeps failing with the error message below, and data containing special characters never gets loaded into the SQL table (SQL Server 2014).

The R3_DS datastore configuration is the same in 3.1 and 4.2, except that 4.2 uses Execute Preloaded as the ABAP execution option while 3.1 uses Generate and Execute. Both locale settings are English, UTF-8. For the RM_DS locale settings in both 3.1 and 4.2, the language is set to <default>, the code page is set to UTF-8, and the server code page is set to UTF-8.

For SQL Server, the only change we made is the server collation, from SQL_Latin1_General_CP1_CS_AS to SQL_Latin1_General_CP1_CI_AS.


Error Message:

8996 10776 FIL-080105 2/23/2017 2:15:13 PM |Data flow DF_KNA1|Reader KNA1
8996 10776 FIL-080105 2/23/2017 2:15:13 PM A row delimiter was seen for row number <169290> while processing column number <218> in file <E:/DSData/ZKNA1.dat>. The row delimiter should be seen after <219> columns. Check the file for bad data, or redefine the input schema for the file by editing the file format in the UI.


3 Answers

Arun Sasi Feb 24, 2017 at 01:49 PM

Hi

Did it previously work fine in 3.1 before the upgrade? The output of an ABAP data flow is a transport file, which is nothing but a .DAT file. It should be generated under the Generated ABAP Directory.

Try opening the .DAT file in Notepad++ and check for the Latin characters in question.
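If you prefer to script that check, here is a small sketch (my own, outside Data Services) that lists the lines containing non-ASCII bytes and reports whether each one decodes cleanly as UTF-8; the file path is taken from your error message:

```python
# Small sketch: list lines in the transport file that contain non-ASCII
# bytes and report whether each one decodes cleanly as UTF-8.
with open(r"E:/DSData/ZKNA1.dat", "rb") as f:
    for line_number, raw in enumerate(f, start=1):
        if any(b > 127 for b in raw):
            try:
                raw.decode("utf-8")
                status = "valid UTF-8"
            except UnicodeDecodeError as exc:
                status = f"not UTF-8 ({exc})"
            print(line_number, status, raw[:80])
```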

Refer to SAP Note:

2002275 - Error running ABAP dataflow with carriage returns in fields similar to: "A row delimiter was seen for row number <x> while processing column number <y> in file <z>. The row delimiter should be seen after <x> columns. Check the file for bad data, or redefine the input schema for the file by editing the file"


Hi Arun,

Yes, our 3.1 was working fine before the upgrade. We set up 4.2 on a new server, so we still have PRD data flowing through 3.1; the 4.2 upgrade has not passed testing yet, so it is not up and running. I used Notepad++ to investigate: reading the file as UTF-8 shows no issue, but if I convert it to ANSI, the field with the special character gets merged with the next attribute. Our current 3.1 handles the special character without issue, but 4.2 does not. I compared the 3.1 .dat file with the 4.2 .dat file; the size, format, and data look the same, it is just that 4.2 cannot handle the special character.
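To illustrate the code-page difference (just a small Python illustration, not part of the job): "é" is one byte in ANSI/Latin-1 but two bytes in UTF-8, and UTF-8 bytes viewed as ANSI show up as "Ã©", which is why the same file can look and parse differently depending on the code page used to read it:

```python
# 'é' is a single byte in ANSI/Latin-1 but two bytes in UTF-8; UTF-8 bytes
# viewed under Latin-1 show up as 'Ã©'.
text = "é"
print(text.encode("latin-1"))                  # b'\xe9'        (1 byte)
print(text.encode("utf-8"))                    # b'\xc3\xa9'    (2 bytes)
print(text.encode("utf-8").decode("latin-1"))  # 'Ã©'
```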

Note 2002275 seems to suggest changing from FTP to RFC calls, which would take a lot of effort. Furthermore, the note mentions section 4.7.1, The Transport_Format. Where shall I find the object library to make these changes?

I also found an article noting that we might be able to change the SAP Data Services locale setting on the server to set the code page to UTF-8 (https://blogs.sap.com/2016/06/14/sap-ds-code-page-settings-steps/), but it still did not work after I made the changes on the server.


Hi Ling,

Recently I also encountered the same issue with Latin characters. I am using DS 4.0 SP2. In my case I had to manually correct the Latin characters (é to e). If we don't change it to e, the character is misinterpreted and the data moves one column ahead. This was tedious and took a lot of time. I am sure our experts on this forum know of a workaround.

Changing the Transport_Format would not work, as the issue is not related to the transport format.

@Dirk, @Brandon, any inputs from you? What would be the setting to allow Latin characters in the CSV file?

Regards

Arun Sasi


We believe it is not related to the SQL Server setup. We did a manual insert test of the special character into the SQL table without any issue.

There is an alternative way of handling special characters, which we have used for other jobs: we wrote an SAP function to overwrite any special characters or symbols on that field when the job is called. But this method requires extra time for loading data from SAP to FTP.
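The replacement logic is roughly like the sketch below (shown here in Python purely for illustration; our actual implementation is an SAP function):

```python
# Rough sketch of the replacement logic, shown in Python rather than as an
# SAP function: decompose accented characters and drop the combining marks,
# so 'é' becomes 'e'.
import unicodedata

def strip_accents(value: str) -> str:
    decomposed = unicodedata.normalize("NFKD", value)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(strip_accents("Café Münster"))   # Cafe Munster
```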


Hi Ling,

I found a solution for this. The .DAT file should store Latin or special characters correctly; you can check the settings on the SAP side.

I loaded a CSV file with Latin characters into an Oracle staging table and it was loaded successfully.

1) Right-click the SQL Server datastore

2) Click Advanced

3) Check the option "Import unsupported data types as VARCHAR of size"

4) Check the data at the database level and in Data Services in the target table (a quick check is sketched below). I hope you have set the correct collation on the SQL Server database.
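For step 4, a hypothetical way to inspect the loaded data from outside Data Services (the connection string, schema, and table names below are placeholders, not taken from this thread):

```python
# Hypothetical check of the loaded data with pyodbc; the connection string,
# schema, and table names are assumptions -- adjust to your environment.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=staging;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Collation of the NAME1 column in the target table.
cur.execute("""
    SELECT c.name, c.collation_name
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID('dbo.KNA1') AND c.name = 'NAME1'
""")
print(cur.fetchall())

# Inspect a few loaded values; repr() makes any mangled characters obvious.
cur.execute("SELECT TOP 20 NAME1 FROM dbo.KNA1")
for (name1,) in cur.fetchall():
    print(repr(name1))
```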

Hope this is useful for you and all others who are facing this issue.

Regards

Arun Sasi

mw79x.png (2.2 kB)
3uxf5.png (2.6 kB)
Rakesh Yadav Feb 28, 2017 at 04:46 AM

Check the properties of the .DAT file. The code page of the .dat file should be UTF-8.

Ling Chang Mar 17, 2017 at 09:01 PM

We found that the issue is on the SAP ABAP side. You must be careful with this setting in the ABAP code if you are using Execute Preloaded:

this DEFAULT should be UTF-8 (see the attached screenshots):


4yqdl.jpeg (10.4 kB)
fzxwv.jpeg (10.2 kB)