
JSON String to Table Data Type

rajeshps
Participant
0 Kudos

Hello Team

I need your inputs on converting a JSON string to a table data type. I'm building a graph with Gen 1 operators in SDI on-premise 3.2.

A node base operator is used to decode the message, and a Data Transform operator is used to perform the data operations within the graph.

Error

The source and target operators are incompatible for connection. The Data Transform operator supports port of type table.

Group: group1; Messages: Graph failure: failed to deserialize data coming from port 'output' of operator 'python3operator1': failed to deserialize bytes representing type 'string' into type 'vtype.basetypeTable'. Make sure you are sending data that is compatible with the port type 'table'

Process(es) terminated with error(s). restartOnFailure==false
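For reference, the error indicates that the Python3 operator is emitting a plain string into a port the Data Transform operator expects to be table-typed, so the JSON has to be parsed inside the operator and sent in a table-compatible shape. Below is a minimal sketch of such an operator script; the port names "input"/"output", the column list and the attribute layout are assumptions, and the exact table envelope the Data Transform operator accepts should be verified against the SAP Data Intelligence documentation.

```python
import json

# Minimal sketch for a Gen 1 Python3 Operator script. The "api" object is
# injected by the DI runtime, so it is not imported here.
# Assumptions: ports are named "input" and "output", the incoming JSON is an
# array of objects, and the target columns below are placeholders.

COLUMNS = ["ID", "NAME", "AMOUNT"]  # hypothetical target columns

def on_input(data):
    # The payload may arrive as bytes, str, or a message with a body,
    # depending on the upstream operator.
    payload = data.body if hasattr(data, "body") else data
    if isinstance(payload, (bytes, bytearray)):
        payload = payload.decode("utf-8")

    records = json.loads(payload)                      # list of dicts
    rows = [[rec.get(col) for col in COLUMNS] for rec in records]

    # Assumed envelope: rows as the message body plus column metadata in the
    # attributes. Check the documentation of the Data Transform operator for
    # the attribute names its table port actually requires.
    attributes = {"table": {"columns": COLUMNS}}
    api.send("output", api.Message(rows, attributes))

api.set_port_callback("input", on_input)
```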

Thanks and Regards,

Rajesh PS

rajeshps
Participant
0 Kudos
Please share your thoughts and insights on this question.

kai-michael.roesner, amish1980_95, indu.khurana18, andreas forster, frank schuler, agus jamaludin, indu khurana, leena gopinath, michael eaton, shakti kumar, yuliya reich, wolfgang.albert.epting, minhee.sung, eduardo.haussen, veselm, gvglupita, piotr_radzki
rajeshps
Participant
0 Kudos

thorsten.hapke

Please check this question and kindly respond.
rajeshps
Participant
0 Kudos

thorsten.hapke chris.gruber

Could you please provide your valuable inputs on this query? Thanks!

Accepted Solutions (0)

Answers (1)

werner_daehn
Active Contributor
0 Kudos

The first thing that comes to mind is the standard Kafka approach of using Kafka Connect for SAP HANA.

https://github.com/SAP/kafka-connect-sap

I assume you know about that, so the next question would be what this approach is lacking for your use case.
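For illustration, here is a hedged sketch of registering the HANA sink connector from that repository through the Kafka Connect REST API. The connector class and property names follow the repository README, but the endpoint, host names, credentials, topic and target table below are placeholders to be verified against your own setup.

```python
import json
import requests

# Illustrative sink configuration for kafka-connect-sap; verify the property
# names against the repository README before using them.
connector_config = {
    "name": "hana-json-sink",                      # placeholder connector name
    "config": {
        "connector.class": "com.sap.kafka.connect.sink.hana.HANASinkConnector",
        "topics": "my_topic",
        "connection.url": "jdbc:sap://hana-host:39015/",   # placeholder host/port
        "connection.user": "KAFKA_USER",
        "connection.password": "********",
        "auto.create": "true",
        # per-topic target table: <topic>.table.name
        "my_topic.table.name": '"MYSCHEMA"."MY_TARGET_TABLE"',
    },
}

# Default Kafka Connect REST endpoint; adjust to your Connect worker.
resp = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector_config),
)
resp.raise_for_status()
print(resp.json())
```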

werner_daehn
Active Contributor
0 Kudos

Kafka Connect and SAP DI are two totally different approaches. I have never used DI in projects.

Kafka Connect does use the Schema Registry. To decode/encode Avro messages from/to Kafka, the Schema Registry is a must, because the payload in Kafka starts with a magic byte and the schema id as an integer. With this id the Schema Registry is queried, and only then do you know the schema and can read the payload. All of that happens fully automatically in every system I have ever seen.
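To make that framing concrete, here is a small illustrative sketch (plain Python, independent of DI or Kafka Connect) of the Confluent wire format: one magic byte, a 4-byte big-endian schema id, then the Avro body. The serializers/deserializers that ship with the Schema Registry client resolve the id against the registry automatically; the helper name below is purely hypothetical.

```python
import struct

def split_confluent_frame(raw: bytes):
    """Return (schema_id, avro_payload) from a Confluent-framed record value."""
    if len(raw) < 5 or raw[0] != 0:
        raise ValueError("not a Confluent-framed Avro message")
    _magic, schema_id = struct.unpack(">bI", raw[:5])  # magic byte + 4-byte id
    return schema_id, raw[5:]

# Example frame: schema id 42 followed by a stand-in for the binary Avro body.
frame = b"\x00" + struct.pack(">I", 42) + b"...avro bytes..."
schema_id, body = split_confluent_frame(frame)
print(schema_id)  # 42 -> the Schema Registry is queried with this id
```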