on 02-06-2023 8:48 AM
Hello Team,
I need your input on converting a JSON string to the table data type. I'm building a graph with gen 1 operators on SAP DI on-premise 3.2.
A node-base operator is used to decode the message, and a Data Transform operator performs the data operations within the graph.
Errors:
- The source and target operators are incompatible for connection. The Data Transform operator supports port of type table.
- Group: group1; Messages: Graph failure: failed to deserialize data coming from port 'output' of operator 'python3operator1': failed to deserialize bytes representing type 'string' into type 'vtype.basetypeTable'. Make sure you are sending data that is compatible with the port type 'table'
- Process(es) terminated with error(s). restartOnFailure==false
Thanks and Regards,
Rajesh PS
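One way to avoid the "failed to deserialize bytes representing type 'string' into type 'vtype.basetypeTable'" error is to have the Python operator emit a table-shaped value (columns plus a list of rows) rather than the raw JSON string. A minimal sketch of the conversion step, assuming the decoded messages are flat JSON objects; the `api.send` call shown in the comment is SDI-specific, and the exact wrapper a table-typed port expects depends on your DI version, so check the Python3 Operator documentation:

```python
import json

def json_to_table(json_str):
    """Convert a JSON array of flat objects (or a single object)
    into (columns, rows), the list-of-lists shape a table expects."""
    records = json.loads(json_str)
    if isinstance(records, dict):   # single object -> one-row table
        records = [records]
    # Union of all keys, sorted for a stable column order
    columns = sorted({key for rec in records for key in rec})
    # Missing keys become None so every row has the same width
    rows = [[rec.get(col) for col in columns] for rec in records]
    return columns, rows

# Inside the SDI Python3 operator you would then send the rows on a
# table-typed output port, e.g. (SDI-specific, illustrative only):
# api.send("output", rows)
```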
The first thing that comes to mind is the standard Kafka method of using Kafka Connect for SAP HANA.
https://github.com/SAP/kafka-connect-sap
I assume you know about that, so the next question would be what this approach is lacking for your use case.
Kafka Connect and SAP DI are two totally different approaches; I have never used DI in projects.
Kafka Connect does use the schema registry. To decode/encode Avro messages from/to Kafka, the Schema Registry is a must, because the payload in Kafka starts with a magic byte and the schema id as a 4-byte integer. With that id the schema registry is queried, and only then do you know the schema and can read the payload. All of that happens fully automatically in every system I have ever seen.
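The framing described above can be split off with a few lines of code. A minimal sketch of reading the Confluent wire format (one magic byte `0x00`, then a 4-byte big-endian schema id, then the Avro-encoded body); the actual registry lookup and Avro decoding are left out:

```python
import struct

def parse_confluent_header(payload: bytes):
    """Split a Confluent-framed Kafka message into (schema_id, avro_bytes).

    Wire format: magic byte 0x00, 4-byte big-endian schema id,
    followed by the Avro-encoded payload.
    """
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("not a Confluent-framed message")
    schema_id = struct.unpack(">I", payload[1:5])[0]
    return schema_id, payload[5:]
```

With the returned schema id you would query the schema registry for the writer schema and only then deserialize the remaining bytes with an Avro library.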