
Smart Data Streaming - File/Hadoop JSON Input - Array Issue

Jan 24, 2017 at 03:53 PM

Former Member

My question concerns the File/Hadoop JSON Input adapter, and in particular the possibility of importing information stored in arrays into HANA. I have built a very simple smart data streaming project with an input adapter (File/Hadoop JSON Input) followed by an input stream. As input I am using the JSON file given in the SAP HANA Smart Data Streaming: Adapters Guide on page 168. Here is what it looks like:

{
    "firstName": "John",
    "lastName": "Smith",
    "phoneNumbers": [
        { "type": "home", "number": "212 555-1234" },
        { "type": "fax", "number": "646 555-4567" }
    ],
    "friends": [
        ["female1", "female2", "female3"],
        ["male1", "male2", "male3"]
    ]
}

What I am trying to achieve is to get all the information stored in one of the friends arrays. Following the instructions given in the Adapters Guide, I set jsonRootPath to friends[1] and jsonColsMappingList to * in the properties of the File/Hadoop JSON Input adapter. Yet I am not receiving any data in my input stream. Did I misunderstand or misconfigure something? I would appreciate any hint or solution.
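For reference, the configuration described above would look roughly like the following CCL. This is a sketch only: the stream name, column name, and directory are placeholders, and the exact array-index syntax that jsonRootPath accepts is precisely the open question here.

CREATE INPUT STREAM FriendsStream SCHEMA (
	friend string ) ;

ATTACH INPUT ADAPTER File_Hadoop_JSON_Input1 TYPE toolkit_file_json_input
TO FriendsStream
PROPERTIES
	jsonRootPath = 'friends[1]' ,   -- intended to select one of the nested friends arrays
	jsonColsMappingList = '*' ,     -- intended to map every value in the selected array
	dir = '<sandbox-relative directory>' ,
	file = 'person.json' ;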

Thanks


What error do you see in the log files? For example, is the file being found?


1 Answer

Former Member May 25, 2017 at 04:05 PM

Hi, I have the same problem. Can anyone provide an example of the adapter usage?

Thanks


Same question for you as for the original poster: what error(s) do you see in the log file? For example, is the file with the JSON content being found?

Testing this today using the following CCL:

CREATE INPUT WINDOW InputWindow1 SCHEMA (
	firstname string ) PRIMARY KEY ( firstname ) ;


ATTACH INPUT ADAPTER File_Hadoop_JSON_Input1 TYPE toolkit_file_json_input
TO InputWindow1
PROPERTIES
	jsonColsMappingList = 'firstname' ,
	dir = '/file_hadoop_json_input_test/streaming_data_files/' ,
	file = 'person.json' ,
	dynamicMode = 'dynamicFile' ,
	removeAfterProcess = FALSE ,
	pollingPeriod = 10 ;
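Once this top-level read works, a next step toward the original question would be pointing the adapter at the nested phoneNumbers array. A hedged sketch follows: the property names match the Adapters Guide, but the window name, schema, and the exact path and column-mapping syntax are assumptions I have not verified against a running server.

CREATE INPUT WINDOW PhoneWindow SCHEMA (
	type string, number string ) PRIMARY KEY ( number ) ;

ATTACH INPUT ADAPTER File_Hadoop_JSON_Input2 TYPE toolkit_file_json_input
TO PhoneWindow
PROPERTIES
	jsonRootPath = 'phoneNumbers' ,         -- descend into the array of objects
	jsonColsMappingList = 'type,number' ,   -- map each element's fields to columns
	dir = '/file_hadoop_json_input_test/streaming_data_files/' ,
	file = 'person.json' ,
	dynamicMode = 'dynamicFile' ,
	removeAfterProcess = FALSE ,
	pollingPeriod = 10 ;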

and with the "person.json" file placed in the "/hana/data_streaming/PM1/adapters/default/file_hadoop_json_input_test/streaming_data_files" directory

(Note: "/hana/data_streaming/<SID>/adapters/<workspace>" is the sandbox directory for SDS)

Then I see this error in the project .out file:

06-01-2017 15:36:17.928 INFO [main] (FileInputTransporter.init) /hana/data_streaming/PM1/pm1/adapters/default/file_hadoop_json_input_test/streaming_data_files
06-01-2017 15:36:17.937 ERROR [main] (TransporterWrapper.init) Exception is thrown
java.lang.Exception: Error code:401001, Severity : 3 (Error)
Error message:File person.json doesnt exist.
Error description:File person.json doesnt exist.
	at com.sybase.esp.adapter.transporters.file.FileInputTransporter.init(FileInputTransporter.java:206)
	at com.sybase.esp.adapter.framework.wrappers.TransporterWrapper.init(TransporterWrapper.java:61)
	at com.sybase.esp.adapter.framework.internal.Adapter.init(Adapter.java:216)
	at com.sybase.esp.adapter.framework.internal.AdapterController.executeStart(AdapterController.java:257)
	at com.sybase.esp.adapter.framework.internal.AdapterController.execute(AdapterController.java:156)
	at com.sybase.esp.adapter.framework.Framework.main(Framework.java:62)
06-01-2017 15:36:17.945 INFO [main] (ContextHandler.doStop) Stopped o.e.j.s.ServletContextHandler@6ee477fc{/,null,UNAVAILABLE}
06-01-2017 15:36:17.949 INFO [main] (AbstractConnector.doStop) Stopped ServerConnector@5756920f{HTTP/1.1}{10.173.72.77:19082}

In this case, the server appears to be inserting an extra "pm1" directory into the file path.

I'll look into the error that I'm seeing, but it would be good to know if you are seeing a similar issue.

Thanks
