
This is the second blog post about the SAP Datasphere and Confluent integration. Here is the link to the first one, in which I show how to connect the two tools.

I have been working at SAP as a Customer Advisor since 2015. Previously, I was a consultant for SAP Data Services, so I am familiar with the SAP integration tools. Since March 8th, 2023, with the announcement of SAP Datasphere, there is a new aspect in data integration: the Business Data Fabric approach. The benefits of a Business Data Fabric are described in this blog.

So the idea is not to have lots of ETL jobs in between, but rather to leave the data inside the source applications and, only where necessary, store it in SAP Datasphere and use the capabilities there. Data products modeled and created inside SAP Datasphere can be accessed by external tools. In some cases this is not enough, and customers also need to push their data to external systems. One way is to use the "Replication Flow" in SAP Datasphere, which enables you to replicate data from SAP Datasphere to specific targets, or directly from SAP source systems to several targets.

Here you can find the corresponding information on how to create a Replication Flow on SAP Help.
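Once a Replication Flow writes into Confluent, each replicated source object typically arrives as messages in a Kafka topic in your Confluent cluster. The following is a minimal sketch (Python, using the confluent_kafka client) of how such a topic could be read for a quick verification. The topic name, bootstrap server, and API key/secret shown here are placeholders, not values produced by SAP Datasphere itself.

```python
from confluent_kafka import Consumer

# Connection settings for a Confluent Cloud cluster (placeholder values).
conf = {
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CLUSTER_API_KEY>",
    "sasl.password": "<CLUSTER_API_SECRET>",
    "group.id": "datasphere-verification",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
# Hypothetical topic written by the Replication Flow for one source object.
consumer.subscribe(["sap.datasphere.sales_orders"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # The replicated record payload is delivered as the message value.
        print(msg.key(), msg.value().decode("utf-8"))
finally:
    consumer.close()
```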

The list of available Replication Flow sources is shown on SAP Help.

 

Replication Flow sources

And this is the overview of the currently available Replication Flow targets (SAP Help)

Replication Flow targets

This is the current status and can and will change. If you want to see what comes next, please have a look at the SAP Datasphere Roadmap Explorer:

Roadmap

There you can see that Confluent is planned to be available as a source in Q2 2024! This means you can even get your streams FROM Confluent INTO SAP Datasphere, which opens up many more possible sources for ingesting data into SAP Datasphere!

If you want to know the difference between Confluent Cloud and Apache Kafka, just have a look here.

So back to Confluent as a target. On their website you can see all the targets they offer themselves:

Confluent connectors

As you can see, there are lots of targets available, including several that I often hear about from customers who want to connect them with SAP data. Mainly, they want to get the SAP data into these targets:

  • Amazon (S3, DynamoDB, Redshift)
  • Azure (Blob Storage, Data Lake Storage)
  • Google (BigQuery, Cloud Storage)
  • InfluxDB
  • Salesforce

The following targets were requested by some customers as well:

  • JDBC
  • OData (v2 and v4)
  • HTTP / HTTPS

As you can see, the hyperscalers are the most common targets for SAP data. With SAP Datasphere, customers are already able to connect to the three hyperscalers within the Replication Flow. But there are still some targets which are not available in SAP Datasphere. Luckily, in combination with Confluent, we already have the possibility to replicate the data in real time and with CDC functionality to all the missing targets. So there is no need for an additional tool in between anymore; SAP Datasphere together with Confluent can directly feed all relevant systems with SAP data.
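To make the last mile concrete: once the SAP data sits in a Confluent topic, one of Confluent's preconfigured sink connectors forwards it to the desired target. The snippet below is only a sketch that registers an Amazon S3 sink connector via the Kafka Connect REST API of a self-managed Connect cluster; the connector name, topic, bucket, and endpoint are assumptions, and a fully managed connector in Confluent Cloud would be configured through its own UI/API instead.

```python
import json
import requests

# Hypothetical Kafka Connect REST endpoint of a self-managed Connect cluster.
CONNECT_URL = "http://localhost:8083/connectors"

# Sketch of an Amazon S3 sink connector configuration. Topic and bucket are
# placeholders; AWS credentials are expected to come from the Connect worker's
# credential provider, not from this config.
connector = {
    "name": "s3-sink-sap-sales-orders",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "tasks.max": "1",
        "topics": "sap.datasphere.sales_orders",
        "s3.bucket.name": "my-sap-data-lake",
        "s3.region": "eu-central-1",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1000",
    },
}

# Register the connector; Connect starts sinking the topic to S3 once created.
response = requests.post(
    CONNECT_URL,
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
response.raise_for_status()
print("Connector created:", response.json()["name"])
```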

So why use SAP Datasphere in combination with Confluent?

  • 120+ preconfigured targets available, so more than 115 additional targets compared to SAP Datasphere alone!
  • On-premise and cloud targets, or hybrid landscapes, can be connected
  • The SAP data can be moved either into SAP Datasphere or to any other landscape via Confluent
  • Perfect if a customer already has Confluent in place, because it can be easily adapted
  • Confluent Cloud is available via the SAP Store
  • Security, compliance, and governance standards are ensured with enterprise-level security features for data streaming and an industry-wide, fully managed governance suite for Kafka!
  • Flexible real-time SAP data usage
  • In Q2 2024 (current plan), Confluent can be used as a source to push data INTO SAP Datasphere