CAP Java: Reading LargeBinary from HANA DB as InputStream bigger than 1024 bytes

christoffer_fuss
Participant

Hi community,

we have a custom handler in CAP Java where we want to read a CDS entity with a property of type LargeBinary.

This LargeBinary property is mapped in Java to the type InputStream.

When we use the getContent method to read the InputStream, we get an exception and the InputStream is null if the file is bigger than 1024 bytes.
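
For illustration, this is roughly how we read the entity in the handler (a simplified sketch; Books, content and the ID filter are placeholder names):

Books book = persistenceService.run(Select.from(Books_.class).where(b -> b.ID().eq(id)))
        .single(Books.class);

InputStream content = book.getContent(); // null on HANA as soon as the file exceeds 1024 bytes
byte[] data = content.readAllBytes();    // works with SQLite, fails with HANA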

We are getting this error message:

This only happens with HANA DB; with SQLite it works fine. If we check the content in SAP HANA Database Explorer, the content is fully available.

So what is the correct way to read a LargeBinary (BLOB in HANA) with CAP Java?

Best regards and many thanks,

Chris

marcbecker
Contributor

How do you execute the CqnSelect that returns the book entity? You need to ensure that you retrieve the content InputStream within the active transaction. If there is no explicit transaction boundary opened around your code (e.g. in a background thread or a custom RestController), you might need to open an explicit ChangeSetContext. This would look like this:

runtime.changeSetContext().run(changeSet -> {
    Books book = service.run(Select.from(...)).single(Books.class);
    InputStream content = book.getContent();
    // make sure to consume the InputStream within the active transaction,
    // otherwise the transaction is committed in HANA and the connection is
    // returned to the pool, which frees the resources not yet streamed
});

If you execute that code as part of an OData request coming from CAP's OData adapters, the ChangeSetContext and the transactional boundary should already be there by default.
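
If the content needs to be used after the transaction ends, one option is to fully materialize the stream before the ChangeSetContext is closed. A rough sketch, assuming a generated Books entity and an ID filter (placeholder names):

byte[] data = runtime.changeSetContext().run(changeSet -> {
    Books book = service.run(Select.from(Books_.class).where(b -> b.ID().eq(id)))
            .single(Books.class);
    try (InputStream content = book.getContent()) {
        // read the full stream while the HANA connection is still open
        return content.readAllBytes();
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
});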

christoffer_fuss
Participant

Hi Marc, thanks for the fast answer. Yes, the query is executed in a background job via the PersistenceService, without an explicit ChangeSetContext. We will give it a try and let you know if it works 😉
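
This is roughly what we plan to try in the job (a sketch only; runtime, persistenceService and the Books names are placeholders on our side). Since the job runs outside of any request, we would probably also open a RequestContext:

// background job: open a RequestContext and a ChangeSetContext explicitly
runtime.requestContext().run(requestContext -> {
    runtime.changeSetContext().run(changeSet -> {
        Books book = persistenceService.run(Select.from(Books_.class).where(b -> b.ID().eq(id)))
                .single(Books.class);
        try (InputStream content = book.getContent()) {
            // copy the stream to a file before the changeset (and HANA transaction) is closed
            Files.copy(content, Path.of("/tmp/content.bin"), StandardCopyOption.REPLACE_EXISTING);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    });
});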

Best regards,

Chris