CRM and CX Blogs by SAP
Stay up-to-date on the latest developments and product news about intelligent customer experience and CRM technologies through blog posts from SAP experts.
kishorekumar_p
Product and Topic Expert

SAP Sales and Service Cloud V2 provides multiple options for exchanging data with non-SAP systems: CSV files for data import/export, AutoFlow for triggering events when data changes in the V2 system, and REST APIs. For further information on these options, please consult the help guides below:

Data Import and Export

AutoFlow

REST API

Mechanisms for Data Replication:

Pull Mechanism: In the pull mechanism, the receiver regularly polls for modified data. This approach is advantageous in the following scenarios:

  • To regulate the flow of data, enabling the receiver to manage incoming data based on resource availability.
  • When dealing with large data volumes, replication activities can be scheduled outside of standard business hours to alleviate strain on the receiver.

However, the pull mechanism may not be ideal in the following cases:

  • When instantaneous data dissemination to the receiver is necessary. Increasing polling frequency to achieve near real-time replication could lead to network overhead and additional strain on the sender system.
  • If duplicate data replications could result in adverse side effects.

Push Mechanism: In the push mechanism, the sender delivers data either instantaneously or at scheduled intervals. This approach is advantageous in the following scenarios:

  • When the receiver requires immediate access to changes.
  • If the receiver possesses the necessary resources to handle substantial data volumes from the sender, regardless of the sender's ability to regulate data flow.
  • In cases where duplicate data replications are unacceptable, assuming the sender does not transmit duplicates since the changes originated within the same system.

However, the push mechanism may not be suitable if the sender lacks control over the data flow rate, or if the middleware layer or receiver lacks sufficient resources to process incoming data volumes.
This blog delves into the integration of a non-SAP system with Sales and Service Cloud V2 using REST APIs, with SAP BTP Integration Suite serving as the middleware layer.

  • The scenario outlined below focuses on replicating opportunities from Sales Cloud V2 to the non-SAP system using a pull mechanism.
  • Pagination is employed to retrieve 1000 opportunities at a time.

What's not accounted for:

  • Initial and delta replication: If the receiver requires all existing opportunities as well as those created or modified since the last replication run, this must be incorporated as a filter condition when reading opportunities. The datetime of the last replication can be used to identify opportunities modified after that point (see the sketch after this list).
  • Error handling: If opportunities are only partially replicated in an iteration, the failed opportunities are not re-pulled unless they are modified again in the source system. This can be addressed by setting the last replication datetime to the datetime of the last fully successful replication; in that case, duplicate replications must be handled properly, particularly if the receiver does not accept duplicates.
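
For illustration, a minimal sketch of such a delta filter is shown below. The property name lastRunDateTime, the field adminData.updatedOn, and the exact filter syntax are assumptions and must be verified against the opportunity API reference.

	import com.sap.gateway.ip.core.customdev.util.Message;
	
	// Minimal sketch of a delta filter, assuming the timestamp of the last successful
	// run is kept in the exchange property "lastRunDateTime" (e.g. persisted via a
	// variable) and that the opportunity exposes a last-changed field named
	// "adminData.updatedOn" - both names are assumptions, not taken from the API.
	def Message buildDeltaFilter(Message message) {
	    def lastRun = message.getProperty("lastRunDateTime");
	    if (lastRun != null) {
	        // Read only opportunities changed after the last successful replication
	        message.setProperty("deltaFilter", "adminData.updatedOn gt '" + lastRun + "'");
	    }
	    return message;
	}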

IFlow Steps:

1. Authentication with Sales Cloud V2: Authentication is performed against the OAuth token URL (https://<HOSTNAME>/auth/token) using the provided credentials. OAuth authentication requires the content type "application/x-www-form-urlencoded"; however, if the request body is empty, the IFlow OAuth adapter does not apply the selected content type. To work around this, a content modifier is used to set the content type and the HTTP adapter is used for the authentication request, as sketched below.
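
As an alternative illustration of this workaround, the same preparation could be done in a short script before the HTTP adapter call. This is only a sketch: the function name is hypothetical and the credentials are assumed to be maintained on the HTTP receiver channel, not in the script.

	import com.sap.gateway.ip.core.customdev.util.Message;
	
	// Sketch of the workaround described above: force the form-encoded content type
	// on an (empty) request body before the HTTP adapter posts to /auth/token.
	// Credentials are assumed to be configured on the HTTP receiver channel.
	def Message prepareTokenRequest(Message message) {
	    message.setHeader("Content-Type", "application/x-www-form-urlencoded");
	    message.setBody("");   // body stays empty; only the content type is relevant
	    return message;
	}

The access token returned by this call is then extracted by the extractToken function in the sample script further below.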

(Screenshots: content modifier and HTTP adapter configuration for the authentication request)

2. Determining the number of modified opportunities: To implement pagination and determine how many pages need to be read, retrieve the count of new and modified opportunities using the query parameter $count=true, as sketched below.
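
One possible way to set the query string for this request dynamically is sketched below. The function name is hypothetical, CamelHttpQuery is the standard Camel header for a dynamic query on the HTTP adapter, and the optional deltaFilter property refers back to the delta-replication sketch above.

	import com.sap.gateway.ip.core.customdev.util.Message;
	
	// Sketch: build the query string for the count request. $top=1 keeps the payload
	// small because only the count is of interest here. The optional "deltaFilter"
	// property is an assumption carried over from the delta-replication sketch.
	def Message buildCountQuery(Message message) {
	    def query = new StringBuilder('$count=true&$top=1');
	    def deltaFilter = message.getProperty("deltaFilter");
	    if (deltaFilter != null) {
	        query.append('&$filter=').append(URLEncoder.encode(deltaFilter.toString(), "UTF-8"));
	    }
	    message.setHeader("CamelHttpQuery", query.toString());
	    return message;
	}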

(Screenshots: request configuration for reading the opportunity count with $count=true)

3. Retrieving modified opportunities: Retrieve the modified opportunities one page at a time using a looping process call. The loop ends either when the index reaches the count read earlier or when the number of passes reaches the maximum iterations configured on the looping process call. Use the query parameter $top to define the page size and $skip to skip the records already read in previous passes, and sort the opportunities by a unique field so that the pagination remains stable (see the sketch below).

API Endpoint for Reading Opportunities: https://api.sap.com/api/SalesSvcCloudV2_opportunity/path/queryopportunityservice_opportunity
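
A sketch of how the paging query could be assembled in each loop pass is shown below. It relies on the pageSize and index properties maintained by the calculateIndex function in the sample script further below; the sort parameter ($orderby=id) is an assumption and should be checked against the API reference.

	import com.sap.gateway.ip.core.customdev.util.Message;
	
	// Sketch of the per-pass read query: the page size comes from the "pageSize"
	// property, the offset from the "index" property maintained by calculateIndex.
	// Sorting by a unique field keeps the pages stable; "$orderby=id" is an assumption.
	def Message buildPageQuery(Message message) {
	    def pageSize = message.getProperty("pageSize") as Integer;   // e.g. 1000
	    def index = message.getProperty("index") as Integer;         // records already read
	    def query = '$top=' + pageSize + '&$skip=' + index + '&$orderby=id';
	    message.setHeader("CamelHttpQuery", query);
	    return message;
	}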

 

(Screenshots: looping process call and opportunity read request with $top and $skip)

4. Publishing to the target: Map the opportunities to the target structure and publish them to the target system (a hypothetical mapping sketch follows).
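
Purely for illustration, a script-based mapping is sketched below; in the IFlow a mapping step would typically be used instead. The target field names and the assumption that the opportunity collection is returned under "value" are hypothetical and must be aligned with the actual payloads.

	import com.sap.gateway.ip.core.customdev.util.Message;
	import groovy.json.JsonSlurper;
	import groovy.json.JsonOutput;
	
	// Illustrative alternative to a mapping step. The target field names
	// (externalId, title) and the "value" collection are assumptions - align
	// both sides with the real source and target payloads.
	def Message mapOpportunities(Message message) {
	    def body = new JsonSlurper().parseText(message.getBody(String));
	    def mapped = (body.value ?: []).collect { opp ->
	        [externalId: opp.id, title: opp.name]
	    };
	    message.setBody(JsonOutput.toJson([opportunities: mapped]));
	    return message;
	}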

(Screenshots: mapping and publishing steps of the IFlow)

Sample script used for extracting the authentication token and opportunity count, calculating the paging index, and logging:

	import com.sap.gateway.ip.core.customdev.util.Message;
	import java.util.HashMap;
	import groovy.json.JsonSlurper;
	
	// Writes the BEGIN marker into the "logMsg" property when logging is enabled
	def Message initializeLog(Message message) {
	    if (message.getProperty("enableLog") == "true")
	    {
	        def logMsg = new StringBuilder();
	        logMsg.append("************* B E G I N   M A S S   L O A D *************").append(System.lineSeparator());
	    
	        message.setProperty("logMsg", logMsg.toString());
	    }
	    return message;
	}
	
	// Reads the opportunity count from the $count response and initializes the paging properties
	def Message extractCount(Message message) {
	    def jsonSlurper = new JsonSlurper();
	    def body = jsonSlurper.parseText(message.getBody(String));
	    final String COUNT = 'count';
	
	    if (body.get(COUNT) != null)
	    {
	        message.setProperty("count", body.get(COUNT));
	        message.setProperty("page", 0);
	        message.setProperty("index", 0);
	        
	        if (message.getProperty("enableLog") == "true")
	        {
	            def logMsg = new StringBuilder();
	            def log = message.getProperty("logMsg");
	            if (log != null)
	            {
	                logMsg.append(log);
	            }
	    
	            logMsg.append("*************** B E F O R E   R E A D   P A G E ***************").append(System.lineSeparator());
	            logMsg.append("Count :: ").append(message.getProperty("count")).append(System.lineSeparator());
	            logMsg.append("Page Size :: ").append(message.getProperty("pageSize")).append(System.lineSeparator());
	    
	            message.setProperty("logMsg", logMsg.toString());
	        }
	    }
	    return message;
	}
	
	// Extracts the access token from the /auth/token response and sets the Authorization header
	def Message extractToken(Message message) {
	    def jsonSlurper = new JsonSlurper();
	    def body = jsonSlurper.parseText(message.getBody(String));
	    final String VALUE = 'value';
	    final String TOKEN = 'access_token';
	
	    if (body.get(VALUE) != null && body.get(VALUE).get(TOKEN) != null)
	    {
	        message.setHeader("Authorization", "Bearer " + body.get(VALUE).get(TOKEN));
	    }
	    
	    if (message.getProperty("enableLog") == "true")
	    {
	        def logMsg = new StringBuilder();
	        def log = message.getProperty("logMsg");
	        if (log != null)
	        {
	            logMsg.append(log);
	        }
	        
	        logMsg.append(System.lineSeparator()).append("*************** A U T H   T O K E N   A D D E D **********").append(System.lineSeparator());
	        message.setProperty("logMsg", logMsg.toString());    
	    }
	    return message;
	}
	
	// Calculates the $skip index for the current loop pass and advances the page counter
	def Message calculateIndex(Message message) {
	    def properties = message.getProperties();
	    def page = properties.get("page");
	    def index = properties.get("index");
	    def pageSize = properties.get("pageSize") as Integer;
	
	    if (page != null && index != null && pageSize != null)
	    {
	        message.setProperty("index", page * pageSize);
	        message.setProperty("page", page + 1);
	
	        if (message.getProperty("enableLog") == "true")
	        {
	            def logMsg = new StringBuilder();
	            def log = message.getProperty("logMsg");
	            if (log != null)
	            {
	                logMsg.append(log);
	            }
	    
	            logMsg.append("*************** I T E R A T I O N :: ").append(properties.get("page") + 1).append(" ***************").append(System.lineSeparator());
	            logMsg.append("Page :: ").append(properties.get("page")).append(System.lineSeparator());
	            logMsg.append("Current Index :: ").append(properties.get("index") + 1).append(System.lineSeparator());
	    
	            message.setProperty("logMsg",logMsg.toString());
	        }
	    }
	
	    return message;
	}
	
	// Appends the END marker and attaches the collected log to the message processing log
	def Message saveLogMsg(Message message) {
	
	    if (message.getProperty("enableLog") == "true")
	    {
	        def log = message.getProperty("logMsg");
	
	        // The BEGIN marker was already written by initializeLog, so only the
	        // collected log and the END marker are appended here.
	        def logMsg = new StringBuilder();
	        if (log != null)
	        {
	            logMsg.append(log);
	        }
	
	        logMsg.append("************* E N D   M A S S   L O A D *************").append(System.lineSeparator());
	
	        // Attach the complete log, including the END marker, as plain text
	        def messageLog = messageLogFactory.getMessageLog(message);
	        if (messageLog != null) {
	            messageLog.addAttachmentAsString("Message Log", logMsg.toString(), "text/plain");
	        }
	    }
	    return message;
	}

Disclaimer: The objective of this blog post is to illustrate the process of authenticating with SAP Sales Cloud V2 and extracting data at regular intervals. It does not serve as a recommendation for implementing data extraction from SAP Sales Cloud V2. As stated earlier, the example provided above does not address scenarios involving initial data loads, error handling, and other considerations. Additionally, the sample code provided here is intended solely for demonstration purposes and is not suitable for production use.
