
SAC: Import data from Database SQL Server 2019

former_member596364
Participant
0 Kudos

Hi Experts,

Are there any limitations on importing SQL Server 2019 data? I'm following the steps in this link: https://help.sap.com/viewer/00f68c2e08b941f081002fd3691d86a7/release/en-US/0d5ffbb6951b45778c1a90ffd...
former_member596364
Participant
0 Kudos

Hi Debjit,

My question is: are there any limitations on using SQL Server 2019 as a database with SAP Analytics Cloud?

Thanks...

DebjitSingha
Active Contributor
0 Kudos

Hello bveronesi,

As mentioned before, we are on the latest level of SQL Server 2019 and the Cloud Agent. The setup is live in a production environment.

Daily load is around 100K rows * 58 columns.
Tomcat 8.0

Hope this helps.
Thanks,

Debjit

DebjitSingha
Active Contributor
0 Kudos

Additionally, SAC can support delta extraction, provided the DB layer takes care of identifying new records based on a date field (a sketch of such a DB-side view follows below). There is no time-based granularity when it comes to delta in SAC; this limitation applies to SAP source systems as well. The workaround is to set up multiple extraction jobs at different time intervals for the same data source.
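For illustration, here is a minimal sketch of what such a DB-side object could look like, assuming a hypothetical orders table with a last_modified timestamp column (none of these names come from this thread): a view that exposes only the records whose date field falls inside the window an extraction job should pick up.

-- Hypothetical delta view: only rows touched since the start of the previous day.
CREATE VIEW dbo.v_orders_delta AS
SELECT order_id, customer_id, amount, last_modified
FROM dbo.orders
WHERE last_modified >= DATEADD(day, -1, CAST(GETDATE() AS date));

Each SAC import job can then query this view (or a variant with a different window) on its own schedule.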

Thanks,

Debjit

former_member596364
Participant
0 Kudos

See the image: the MS SQL Server 2019 option does not appear.

01.png

I already configured:

1) Cloud Connector

2) SAP Analytics Cloud Agent

3) JDBC Drivers

Accepted Solutions (1)

DebjitSingha
Active Contributor
0 Kudos

Hello bveronesi,

Thanks for your patience.

In "driverconfig.property" file, the name of he Database name should match one of the below entries. Otherwise Connector ignore that entry completely.

MS SQL Server 2019 is not officially supported. Reason why your Database name entry do not show up from SAC

Try using MS SQL Server 2016 to connected back end 2019 server. Any JDBC driver can be added as a Generic JDBC datasource= as long as it uses queries that are supported by MySQL. If required you can use 2019 server under this database name.

Database entries accepted by connector:

#Amazon EMR 5.6 (Hive 2.1)=
#Amazon EMR Hive 0.11=
#Amazon EMR Hive 0.13=
#Amazon Redshift=
#Apache Hadoop HIVE=
#Apache Hadoop Hive 0.10=
#Apache Hadoop Hive 0.12=
#Apache Hadoop Hive 0.13 HiveServer2=
#Apache Hadoop Hive 0.14 HiveServer2=
#Apache Hadoop Hive 0.7=
#Apache Hadoop Hive 0.8=
#Apache Hadoop Hive 0.9=
#Apache Hadoop Hive 0.x HiveServer1=
#Apache Hadoop Hive 0.x HiveServer2=
#Apache Hadoop Hive 1.0 HiveServer2=
#Apache Hadoop Hive 1.x HiveServer2=
#Apache Hadoop Hive 2.x HiveServer2=
#Apache Spark 1.0=
#Apache Spark 2.0=
#BusinessObjects Data Federator Server XI R3=
#BusinessObjects Data Federator Server XI R4=
#Cloudera Impala 1.0=
#Cloudera Impala 2.0=
#DB2 10 for LUW=C:\path_to_DB2_v10_driver\db2jcc4.jar
#DB2 10 for z/OS=
#DB2 10.5 for LUW=C:\path_to_DB2_v10.5_driver\db2jcc4.jar
#DB2 11 for LUW=
#DB2 UDB v5=
#DB2 UDB v6=
#DB2 UDB v7=
#DB2 UDB v8=
#DB2 for z/OS v11=
#DB2 for z/OS v12=
#DB2 v9=
#Data Federator Server=
#Data Federator Server XI R3=
#Data Federator Server XI R4=
#Generic JDBC datasource=
#GreenPlum 3=
#GreenPlum 4=
#HP Vertica 6.1=
#HP Vertica 7.1=
#HP Vertica 8=
#Hortonworks Data Platform 2.3=
#IBM Puredata (Netezza)=
#IBM Puredata (Netezza) Server 7=
#Informix Dynamic Server 10=
#Informix Dynamic Server 11=
#Informix Dynamic Server 12=
#Ingres Database 10=
#Ingres Database 9=
#MS Parallel Data Warehouse=
#MS SQL Server=
#MS SQL Server 2000=
#MS SQL Server 2005=
MS SQL Server 2008=C:\jdbcDrivers\sqljdbc.jar
#MS SQL Server 2012=
#MS SQL Server 2014=
#MS SQL Server 2016=
#MS SQL Server 6.5=
#MS SQL Server 7.x=
#MaxDB 7.7=
#MaxDB 7.9=
#MySQL=
#MySQL 5=
#Netezza Server=
#Netezza Server 4=
#Netezza Server 5=
#Netezza Server 6=
#Netezza Server 7=
#Oracle 10=
#Oracle 11=
#Oracle 12=
#Oracle 12c Release 2=
#Oracle 7.3=
#Oracle 8=
#Oracle 8.0=
#Oracle 8.1=
#Oracle 9=
#Oracle Exadata=
#Oracle Exadata 11=
#Oracle Exadata 12=
#PostgreSQL 8=
#PostgreSQL 9=
#Progress OpenEdge 10=
#Progress OpenEdge 11=C:\path_to_driver\openedgewp.jar
#Sybase ASIQ 12=
#Sybase Adaptive Server 11=
#Sybase Adaptive Server 12=
#Sybase Adaptive Server 15=
#Sybase Adaptive Server Enterprise 15=
#Sybase Adaptive Server Enterprise 15.5=
#Sybase Adaptive Server Enterprise 15.7=
#Sybase Adaptive Server Enterprise 16.0=
#Sybase IQ 15=
#Sybase IQ 16=
#Sybase SQL Anywhere 10=
#Sybase SQL Anywhere 11=
#Sybase SQL Anywhere 12=
#Sybase SQL Anywhere 16=
#Sybase SQL Anywhere 17=
#Sybase SQLServer 11=
#Teradata 12=
#Teradata 13=
#Teradata 14=
#Teradata 15=
#Teradata 16=
#Teradata V2 R=
#Teradata V2 R6=
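For example, to use the 2016 entry against a 2019 back end, you would uncomment that line and point it at your JDBC driver jar. The path and jar name below are placeholders only; substitute the location and file name of the driver you actually installed:

MS SQL Server 2016=C:\jdbcDrivers\mssql-jdbc-7.4.1.jre8.jar

The same pattern applies to the Generic JDBC datasource= entry if you go that route.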

Hope this helps.

Viewers: if you find the above information helpful, feel free to up-vote.

Thanks,

Debjit

former_member596364
Participant
0 Kudos

Dear Debjit,

Thank you for helping me.

I created the connection and the 2019 database views and tables now appear, but when I select a view to query, no columns come back, even though the database itself is fine.
02.png
PS: I created it using the MS SQL Server 2016 entry.

DebjitSingha
Active Contributor
0 Kudos
Now it seems like a driver issue.
  • What version of the driver are you using? If you are using the latest one offered for 2019, I suggest going one or two versions down (maybe the 2016 or 2014 one).
  • Can you try with a generic driver and the DB name Generic JDBC datasource?


Note: For my part, I am able to consume 2019 data using the MS SQL Server 2016 entry.
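One quick way to rule out the database side before suspecting the driver (just a sketch; the schema and view names are placeholders for your own) is to confirm on SQL Server itself that the view exposes columns:

SELECT COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'your_view_name';

If columns are listed here but none show up in SAC, the problem most likely sits with the JDBC driver or the connector configuration.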

former_member596364
Participant
0 Kudos

I'm using Microsoft JDBC Driver 7.4 for SQL Server

DebjitSingha
Active Contributor

I suggest trying another version (maybe the latest) or the generic one.

I am not near my system to confirm the driver version we are using.

Answers (3)

anegoescu
Explorer

Hello Bruno

For the best experience in terms of data volume, use SAP Analytics Cloud Agent version 1.0.303 (in order to achieve the maximum number of cells/rows for your data import); older versions have some limitations.

Also, for the best experience, ensure that your Apache Tomcat has at least 4 GB of memory (our setup uses 16 GB because we have seen limitations when many queries run at the same time with large data volumes).
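On a standard Apache Tomcat installation this heap size is usually set through the JVM options, for example via something like the following in Tomcat's setenv script (the exact file and syntax depend on your OS and Tomcat setup; this is only an illustrative line, not taken from this thread):

set CATALINA_OPTS=-Xms4g -Xmx4g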

If all these conditions are satisfied, you still have to keep in mind the following data import limitations for SQL databases:

Data acquisition row, column, and cell limits for models and stories:

  • For SAP BW, SAP Universe, SAP HANA, Google BigQuery, and SQL data sources only: 100,000,000 cells; 100 columns.
  • For all other data sources: 800,000 rows; 100 columns

More details here - System Sizing, Tuning, and Limits

I hope it helps.

Alex

former_member596364
Participant
0 Kudos

Hi Alexandru,

My question is: are there any limitations on using SQL Server 2019 as a database with SAP Analytics Cloud?

Thanks...

former_member596364
Participant
0 Kudos

Thank you and Happy holidays!

former_member596364
Participant
0 Kudos

I installed the latest 8.4 driver and apparently it is working correctly. I appreciate your help!

DebjitSingha
Active Contributor

You are welcome. Happy Holidays.

Just FYI for everyone else who is seeking a similar solution: MS SQL Server 2019 is not yet officially supported. As a result, the connector filters out / ignores the 2019 DB name from the .property file.

Hence this workaround.