JDBC in HANA Cloud

mayank_gupta01
Employee

Hello,

I have a question: will this code work for SAP HANA Cloud instances?

I tried connecting to a HANA Cloud instance from Databricks and I am getting a handshake_failure error:

JDBCDriverException: SAP DBTech JDBC: Cannot connect to jdbc:sap://XXXXXXXX:443 [Cannot connect to host XXXXXXX:443 [Received fatal alert: handshake_failure], -813.]. Caused by: JDBCDriverException: SAP DBTech JDBC: SSL handshake failed : Received fatal alert: handshake_failure. Caused by: SSLHandshakeException: Received fatal alert: handshake_failure

My configuration is:

%scala
import java.util.Properties

// Set connection parameters
val jdbcHostname = "XXXX.hana.canary-eu10.hanacloud.ondemand.com"
val jdbcPort = "443"
val jdbcDB = "XXXX"
val jdbcUser = "user"
val jdbcPassword = "<password>"
val driverClass = "com.sap.db.jdbc.Driver"
val jdbcUrl = s"jdbc:sap://${jdbcHostname}:${jdbcPort}"

// Check availability of the JDBC library to access SAP HANA
Class.forName(driverClass)

// Set connection properties
val connectionProperties = new Properties()
connectionProperties.put("user", jdbcUser)
connectionProperties.put("password", jdbcPassword)
connectionProperties.setProperty("driver", driverClass)

// Read and display data
val sflight = spark.read.jdbc(jdbcUrl, "<schema.table>", connectionProperties)
//sflight.show()
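(Editorial note: SAP HANA Cloud only accepts encrypted connections. A hedged sketch of the TLS-related connection properties the SAP HANA JDBC driver supports, which may be worth setting explicitly when debugging handshake errors; the user, password, and host values here are placeholders, not the poster's configuration:)

```scala
import java.util.Properties

val props = new Properties()
props.setProperty("user", "<user>")
props.setProperty("password", "<password>")
props.setProperty("driver", "com.sap.db.jdbc.Driver")
// HANA Cloud requires TLS; newer driver versions default to
// encrypt=true, but older ones may not, so set it explicitly.
props.setProperty("encrypt", "true")
// Validate the server certificate against the JVM trust store.
props.setProperty("validateCertificate", "true")
```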

Vitaliy-R
Developer Advocate

Hi Mayank,

I am not sure what you mean by "hana instance in databricks".

Is it possible for you to run the following on the client (where you run your Scala code):

openssl s_client -tlsextdebug -connect XXXX.hana.canary-eu10.hanacloud.ondemand.com:443

and see what error message you are getting there?

Regards,
-Witalij

mayank_gupta01
Employee

Hi vitaliy.rudnytskiy,

Thanks for the answer. I meant using a HANA Cloud instance to connect from Databricks,

and

openssl s_client -tlsextdebug -connect XXXX.hana.canary-eu10.hanacloud.ondemand.com:443

Output:

CONNECTED(00000003)

TLS server extension "EC point formats" (id=11), len=3
0000 - 02 00 01 ...
TLS server extension "renegotiation info" (id=65281), len=1
0000 - 00 .
TLS server extension "extended master secret" (id=23), len=0
depth=2 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert Global Root CA
verify return:1

Vitaliy-R
Developer Advocate

Mayank, is that the complete output from that `openssl` command?

I expected it to be longer and include more details about the handshake step, like in my (successful) case:

...
---
SSL handshake has read 3685 bytes and written 535 bytes
Verification: OK
---
...

Regards,
-Witalij

mayank_gupta01
Employee

Hi Witalij,

Sorry, I masked the output, but I do have the above output when I ran the command.

---
No client certificate CA names sent
Peer signing digest: SHA512
Peer signature type: RSA
Server Temp Key: ECDH, P-384, 384 bits
---
SSL handshake has read 3691 bytes and written 526 bytes
Verification: OK

Accepted Solutions (1)


mayank_gupta01
Employee

I fixed this issue by updating my Spark version to 3.2.*.
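(Editorial note: for anyone checking whether their cluster already meets this, a minimal sketch of a version check in a notebook; it assumes the `spark` SparkSession that Databricks provides, so it is not runnable outside a cluster.)

```scala
// Verify the runtime meets the 3.2 minimum mentioned above.
val Array(major, minor) = spark.version.split("\\.").take(2).map(_.toInt)
assert(major > 3 || (major == 3 && minor >= 2),
  s"Spark ${spark.version} is older than 3.2")
```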

akshay_jaintmdc
Discoverer

Hi!

Did you try writing data using any method through Databricks Scala?

mayank_gupta01
Employee

Hi Akshay,

Yes, I am able to read/write data to HANA using Databricks Scala, and also using the hdbcli module.
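(Editorial note: the thread does not include the poster's write code. A hedged sketch of what a Spark-to-HANA round trip can look like with the standard DataFrameReader/DataFrameWriter JDBC API; every name, host, and table here is a placeholder:)

```scala
import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate()  // provided by Databricks

val props = new Properties()
props.setProperty("user", "<user>")
props.setProperty("password", "<password>")
props.setProperty("driver", "com.sap.db.jdbc.Driver")

val jdbcUrl = "jdbc:sap://<host>:443"

// Read a table into a DataFrame, then append its rows elsewhere.
val df = spark.read.jdbc(jdbcUrl, "<SCHEMA.SOURCE_TABLE>", props)
df.write.mode("append").jdbc(jdbcUrl, "<SCHEMA.TARGET_TABLE>", props)
```

With `mode("append")` Spark inserts into an existing table; if the target table does not exist, Spark's JDBC writer creates it from the DataFrame schema.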

akshay_jaintmdc
Discoverer

Could you please share your findings and code block for reference? I am able to read data, but during execution I am getting an error:

`Can't get JDBC type for struct<customer_index:bigint,email:string,phone:string,type:string>`

How are you providing the schema for your data while writing data to HANA when the table does not exist?
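(Editorial note: Spark's JDBC writer has no SQL type mapping for struct columns, which is what the error above reports. A hedged sketch of one common workaround, flattening the struct into plain columns before writing; the field names come from the error message, but the struct column name `customer`, the DataFrame `df`, and the connection values are hypothetical:)

```scala
import org.apache.spark.sql.functions.col

// Replace the struct column with its flattened fields so every
// column has a plain SQL type the JDBC writer can map.
val flat = df.select(
  col("customer.customer_index"),
  col("customer.email"),
  col("customer.phone"),
  col("customer.type")
)

flat.write.mode("overwrite").jdbc(jdbcUrl, "<SCHEMA.TABLE>", connectionProperties)
```

When the target table does not exist, the writer derives its DDL from the (now flat) DataFrame schema, so no separate schema needs to be provided.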
