CAP deploy to CF failing

kammaje_cis
Active Contributor

Hi,

I am deploying my CAP application to CF (trial) and it is failing. I looked into the log file, but the error is very cryptic. Can someone make sense of these errors, please?

I am using VS Code for development and deployment. I can see that the service instance has already been created. The error occurs after the mtar file is uploaded, while the db module "cisfiorianalytics-db" is being deployed.

I have pasted part of the log from the MAIN_LOG file here.

===================

#2.0#2020 01 14 07:20:52.573#Z#DEBUG#com.sap.cloud.lm.sl.xs2.e9bf6055-369d-11ea-8aff-eeee0a8b6839.MAIN_LOG.executeTaskTask# ######com.sap.cloud.lm.sl.cf.persistence.services.ProcessLogger########flowable-async-job-executor-thread-15### [ExecuteTaskStep] Task execution status: failed#
#2.0#2020 01 14 07:20:53.212#Z#ERROR#com.sap.cloud.lm.sl.xs2.e9bf6055-369d-11ea-8aff-eeee0a8b6839.MAIN_LOG.executeTaskTask# ######com.sap.cloud.lm.sl.cf.persistence.services.ProcessLogger########flowable-async-job-executor-thread-15### [ExecuteTaskStep] Execution of task "deploy" failed. Download the application logs "cisfiorianalytics-db", via the dmol command, and check them for more information.#
#2.0#2020 01 14 07:20:53.215#Z#ERROR#com.sap.cloud.lm.sl.xs2.e9bf6055-369d-11ea-8aff-eeee0a8b6839.MAIN_LOG.executeTaskTask# ######com.sap.cloud.lm.sl.cf.persistence.services.ProcessLogger########flowable-async-job-executor-thread-15### Exception caught# com.sap.cloud.lm.sl.common.SLException: A step of the process has failed. Retrying it may solve the issue.

===================

Accepted Solutions (1)

kammaje_cis
Active Contributor
0 Kudos

The issue was with the UUID-typed element in the entity. I did not provide any data for it in the CSV, and the local deploy and run went smoothly; I even got the generated UUIDs.

But when you deploy, HANA seems to have stricter rules: I had to provide the UUIDs explicitly, and they also have to be in a specific format. Ex: c71122df-18e4-4a78-a446-fbf7b8f2969b
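If you need to fill a CSV with valid values, one option (a sketch using Python's standard library; the file and column names below are illustrative, not from this project) is to generate canonical UUIDs up front:

```python
import csv
import uuid

# Generate canonical RFC 4122 UUID strings (lowercase, 8-4-4-4-12 hex groups)
# for CSV seed data. "my_entity.csv" and the columns are placeholders.
rows = [{"ID": str(uuid.uuid4()), "name": f"Item {i}"} for i in range(3)]

with open("my_entity.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["ID", "name"])
    writer.writeheader()
    writer.writerows(rows)

print(rows[0]["ID"])  # same shape as c71122df-18e4-4a78-a446-fbf7b8f2969b
```

str(uuid.uuid4()) always produces the hyphenated lowercase format shown in the accepted answer, so values generated this way should satisfy the stricter check on HANA.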

Answers (1)


mariusobert
Developer Advocate

Hi Krishna,

I feel your pain, I've run into this issue before as well...

The important line in this error log is:

Execution of task "deploy" failed. Download the application logs "cisfiorianalytics-db", via the dmol command, and check them for more information.

The "deploy" task is the part of the HDB module where a standard Node.js app uses the HDB client to connect to the HDI container and set up the schema etc. There are numerous reasons why this could fail, such as an invalid .hdiconfig file or an encoding problem in the .csv files (if you want to import data).

Please run the following command to see the detailed log (it will dump a large log, so it might take a while to find the exact problem in there):

cf logs cisfiorianalytics-db --recent

kammaje_cis
Active Contributor
0 Kudos

Thanks, Marius, for the reply. Once I run this command, where should I look for the log? The output of the command does not say where it is downloading the log.

I have a DateTime column in my schema, and the date is entered as 2019-09-09T16:07:40Z in the csv file. This worked fine with SQLite (for local deploy). Could this be an issue for HANA?
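As a quick sanity check (a sketch, not HANA-specific), the value in question does parse as an ISO 8601 UTC timestamp:

```python
from datetime import datetime, timezone

# Parse the exact value from the CSV. Using strptime with an explicit
# format string avoids the trailing-"Z" limitation of fromisoformat()
# in Python versions before 3.11.
value = "2019-09-09T16:07:40Z"
parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
print(parsed.isoformat())  # 2019-09-09T16:07:40+00:00
```

So the format itself is well-formed ISO 8601; as the thread later confirms, the deploy failure turned out to be a different data problem in the CSV.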

mariusobert
Developer Advocate
0 Kudos

The log will be printed to stdout only. You could pipe it from there to a file if needed, but usually it's more helpful to read it straight from stdout.

I don't think it is an issue with HANA itself. SQLite just takes your input csv and doesn't involve the "deploy" task at all, which is why the problem only shows up in the cloud. My guess would be that the issue is related to a missing or incorrect configuration in the .hdiconfig file.
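For reference, .hdiconfig maps file suffixes to HDI build plugins. A minimal version covering tables, views, and CSV-based data imports typically looks like the fragment below (the plugin names are the standard SAP HANA DI plugins, but verify against the .hdiconfig generated in your own project):

```json
{
  "file_suffixes": {
    "hdbtable": { "plugin_name": "com.sap.hana.di.table" },
    "hdbview": { "plugin_name": "com.sap.hana.di.view" },
    "hdbtabledata": { "plugin_name": "com.sap.hana.di.tabledata" },
    "csv": { "plugin_name": "com.sap.hana.di.tabledata.source" }
  }
}
```

If a suffix used in the db module is missing from this mapping, or a plugin name is misspelled, the HDI deployer will fail during the "deploy" task.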

kammaje_cis
Active Contributor

Unfortunately, it did not print the log. Very strange.

I found the problem, though. I went to the cloud cockpit, selected the failing application (cisfiorianalytics-db in this case), and clicked 'Logs' in the left-hand sidebar. That gave me meaningful logs: the error was about the data I had in the CSV.