
How to Integrate SalesForce BULK API

May 02, 2017 at 11:07 AM

Hi All,

I have a requirement to integrate the Salesforce Bulk API using the SOAP adapter.

The PI version I am currently using is PI 7.4 single stack.

I went through this blog:

https://blogs.sap.com/2014/07/14/sap-pi-salesforce-bulk-api/ - which is based on ccBPM

For the NW BPM creation I went through:

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/90308598-4b8d-2f10-4a9a-b78973859665?QuickLink=index&…; but was not able to succeed with it.

Please let me know how to create a job, add batches to the job and close the job in NW BPM.

Can we achieve the Bulk API without NW BPM?

Thanks

Sai


3 Answers

Sugata Bagchi Majumder May 04, 2017 at 03:10 AM

Hello Sai,

The Bulk API is not supported via the SOAP adapter. You can opt for the Advantco SFDC adapter, which supports the Bulk API and can even switch between the Bulk and SOAP APIs based on the number of records in the request payload.

You can also connect using Kate's REST adapter, but Advantco is the better option for seamless integration.

The Bulk API is an asynchronous data transfer: it requires a job, with multiple batches assigned to that job.

Other than middleware, any Java-based application or client can be used to transfer the data to SFDC using the Bulk API.

Thanks

Sugata Bagchi Majumder


Hi Sugata,

Sorry for the delay.

Due to licensing costs and other constraints, the client does not want to go with the Advantco SFDC adapter.

The client wants to use SAP PI as the middleware to transfer data to SFDC via the Bulk API. The scenario is IDoc to SFDC.

The blog https://blogs.sap.com/2014/07/14/sap-pi-salesforce-bulk-api/ mentions that the Bulk API can be achieved using the SOAP adapter.

As of now the number of records in the payload would be around 1,000, and it may increase in the future.

It would be helpful if you could describe the process of creating a job, adding batches to the job and closing the job in NW BPM.

Thanks

Sai

Sai D May 25, 2017 at 07:43 AM

Hi All,

Sorry for the delay.

We do not have the SFDC adapter or the REST adapter installed, and BPM is not installed in our PI 7.4 (AEX) system either.

Given the licensing costs involved and the project timelines, what other alternative approaches could we suggest to the client?

I am thinking of two approaches to achieve the Bulk API. Let me know which approach is more feasible and lower maintenance:

1) Achieving this using a Java mapping?

- Below is the sample Bulk API code from the Salesforce documentation.

2) Can we do it using ABAP proxies?

a) Call an outbound proxy to create a session - the response (server URL and session ID) is captured in the proxy.

b) Call a proxy (passing the server URL and session ID) to create the job - the response (job ID) from Salesforce is captured in the proxy.

c) Call a proxy (server URL, session ID and job ID) to create the batches, along with the actual data that needs to be passed to Salesforce.

d) Call a proxy (server URL, session ID and job ID) to close the job.
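If the proxy route is taken, each proxy call ultimately issues a plain HTTP request against the Bulk API REST endpoints. As a rough sketch (the URL pattern follows the Bulk API convention /services/async/&lt;version&gt;/job; the helper names here are my own, not from any SAP or Salesforce library), the target URL and XML payloads for steps b) and d) could be built like this:

```java
public class BulkJobRequests {

    // Target URL for creating a job (step b). instanceUrl and apiVersion
    // would come from the login response captured in step a.
    static String createJobUrl(String instanceUrl, String apiVersion) {
        return instanceUrl + "/services/async/" + apiVersion + "/job";
    }

    // XML body for creating a CSV insert job on the given object (step b).
    static String createJobXml(String sobjectType) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">"
            + "<operation>insert</operation>"
            + "<object>" + sobjectType + "</object>"
            + "<contentType>CSV</contentType>"
            + "</jobInfo>";
    }

    // XML body for closing a job (step d); it is POSTed to .../job/<jobId>.
    static String closeJobXml() {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
            + "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">"
            + "<state>Closed</state>"
            + "</jobInfo>";
    }

    public static void main(String[] args) {
        // Hypothetical instance URL, for illustration only.
        System.out.println(createJobUrl("https://na30.salesforce.com", "40.0"));
        System.out.println(createJobXml("Account"));
        System.out.println(closeJobXml());
    }
}
```

The session ID from step a) is passed with each of these requests in the X-SFDC-Session HTTP header.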

Java code for the Bulk API (I guess this will help others as well):

import java.io.*;
import java.util.*;
import com.sforce.async.*;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;

Set Up the main() Method

This code sets up the main() method for the class. It calls the runSample() method, which encompasses the processing logic for the sample. We'll look at the methods called in runSample() in the subsequent sections.

Main Method:

public static void main(String[] args)
        throws AsyncApiException, ConnectionException, IOException {
    BulkExample example = new BulkExample();
    // Replace the arguments below with your credentials and test file name.
    // The first parameter indicates that we are loading Account records.
    example.runSample("Account", "myUser@myOrg.com", "myPassword", "mySampleData.csv");
}

/**
 * Creates a Bulk API job and uploads batches for a CSV file.
 */
public void runSample(String sobjectType, String userName, String password,
        String sampleFileName)
        throws AsyncApiException, ConnectionException, IOException {
    BulkConnection connection = getBulkConnection(userName, password);
    JobInfo job = createJob(sobjectType, connection);
    List<BatchInfo> batchInfoList = createBatchesFromCSVFile(connection, job, sampleFileName);
    closeJob(connection, job.getId());
    awaitCompletion(connection, job, batchInfoList);
    checkResults(connection, job, batchInfoList);
}


Login and Configure BulkConnection:

The following code logs in using a partner connection (PartnerConnection) and then reuses the session to create a Bulk API connection (BulkConnection).

/**
 * Create the BulkConnection used to call Bulk API operations.
 */
private BulkConnection getBulkConnection(String userName, String password)
        throws ConnectionException, AsyncApiException {
    ConnectorConfig partnerConfig = new ConnectorConfig();
    partnerConfig.setUsername(userName);
    partnerConfig.setPassword(password);
    partnerConfig.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/40.0");
    // Creating the connection automatically handles login and stores
    // the session in partnerConfig.
    new PartnerConnection(partnerConfig);
    // When PartnerConnection is instantiated, a login is implicitly
    // executed and, if successful, a valid session is stored in the
    // ConnectorConfig instance. Use this session to initialize a BulkConnection:
    ConnectorConfig config = new ConnectorConfig();
    config.setSessionId(partnerConfig.getSessionId());
    // The endpoint for the Bulk API service is the same as for the normal
    // SOAP URI until the /Soap/ part. From there it is '/async/versionNumber'.
    String soapEndpoint = partnerConfig.getServiceEndpoint();
    String apiVersion = "40.0";
    String restEndpoint = soapEndpoint.substring(0, soapEndpoint.indexOf("Soap/"))
        + "async/" + apiVersion;
    config.setRestEndpoint(restEndpoint);
    // This should only be false when debugging.
    config.setCompression(true);
    // Set this to true to see HTTP requests and responses on stdout.
    config.setTraceMessage(false);
    BulkConnection connection = new BulkConnection(config);
    return connection;
}

This BulkConnection instance is the base for using the Bulk API. The instance can be reused for the rest of the application's lifespan.
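The endpoint rewrite inside getBulkConnection() can be illustrated in isolation. This small sketch (using a made-up instance URL) mirrors the same substring logic to show how the SOAP service endpoint is turned into the Bulk API (async) endpoint:

```java
public class EndpointDemo {

    // Derive the Bulk API endpoint from the SOAP service endpoint,
    // mirroring the substring logic used in getBulkConnection().
    static String toBulkEndpoint(String soapEndpoint, String apiVersion) {
        return soapEndpoint.substring(0, soapEndpoint.indexOf("Soap/"))
            + "async/" + apiVersion;
    }

    public static void main(String[] args) {
        // Hypothetical post-login service endpoint, for illustration only.
        String soap = "https://na30.salesforce.com/services/Soap/u/40.0/00Dx0000000abcd";
        // The Soap/u/... suffix is replaced by async/40.0
        System.out.println(toBulkEndpoint(soap, "40.0"));
    }
}
```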

Create a Job:

After creating the connection, create a job. Data is always processed in the context of a job. The job specifies the details of the data being processed: which operation is being executed (insert, update, upsert, or delete) and the object type. The following code creates a new insert job on the Account object.

/**
 * Create a new job using the Bulk API.
 *
 * @param sobjectType
 *     The object type being loaded, such as "Account"
 * @param connection
 *     BulkConnection used to create the new job.
 * @return The JobInfo for the new job.
 * @throws AsyncApiException
 */
private JobInfo createJob(String sobjectType, BulkConnection connection)
        throws AsyncApiException {
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(OperationEnum.insert);
    job.setContentType(ContentType.CSV);
    job = connection.createJob(job);
    System.out.println(job);
    return job;
}

When a job is created, it’s in the Open state. In this state, new batches can be added to the job. When a job is Closed, batches can no longer be added.

Add Batches to the Job:

Data is processed in a series of batch requests. Each request is an HTTP POST containing the batch data (CSV in this sample) in the body. Your client application determines how many batches are used to process the whole data set, as long as the batch size and the total number of batches per day stay within the Bulk API limits.

The processing of each batch comes with an overhead. Batch sizes should be large enough to minimize the overhead processing cost and small enough to be easily handled and transferred. Batch sizes between 1,000 and 10,000 records are considered reasonable.
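To make the sizing trade-off concrete, here is a small helper (my own addition, not part of the Salesforce sample) that computes how many batches a data set needs for a given batch size:

```java
public class BatchMath {

    // Number of batches needed to cover recordCount records,
    // rounding up so the final partial batch is counted.
    static int batchesNeeded(int recordCount, int maxRowsPerBatch) {
        return (recordCount + maxRowsPerBatch - 1) / maxRowsPerBatch;
    }

    public static void main(String[] args) {
        // The 1,000 records mentioned earlier in this thread fit
        // into a single 10,000-row batch.
        System.out.println(batchesNeeded(1000, 10000));   // 1
        // 25,000 records at 10,000 rows per batch need 3 batches.
        System.out.println(batchesNeeded(25000, 10000));  // 3
    }
}
```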

The following code splits a CSV file into smaller batch files and uploads them to Salesforce.

/**
 * Create and upload batches using a CSV file.
 * The file is split into appropriately sized batch files.
 *
 * @param connection
 *     Connection to use for creating batches
 * @param jobInfo
 *     Job associated with the new batches
 * @param csvFileName
 *     The source file for batch data
 */

private List<BatchInfo> createBatchesFromCSVFile(BulkConnection connection,
        JobInfo jobInfo, String csvFileName)
        throws IOException, AsyncApiException {
    List<BatchInfo> batchInfos = new ArrayList<BatchInfo>();
    BufferedReader rdr = new BufferedReader(
        new InputStreamReader(new FileInputStream(csvFileName)));
    // Read the CSV header row.
    byte[] headerBytes = (rdr.readLine() + "\n").getBytes("UTF-8");
    int headerBytesLength = headerBytes.length;
    File tmpFile = File.createTempFile("bulkAPIInsert", ".csv");

    // Split the CSV file into multiple batches.
    try {
        FileOutputStream tmpOut = new FileOutputStream(tmpFile);
        int maxBytesPerBatch = 10000000; // 10 million bytes per batch
        int maxRowsPerBatch = 10000;     // 10 thousand rows per batch
        int currentBytes = 0;
        int currentLines = 0;
        String nextLine;
        while ((nextLine = rdr.readLine()) != null) {
            byte[] bytes = (nextLine + "\n").getBytes("UTF-8");
            // Create a new batch when the batch size limit is reached.
            if (currentBytes + bytes.length > maxBytesPerBatch
                    || currentLines > maxRowsPerBatch) {
                createBatch(tmpOut, tmpFile, batchInfos, connection, jobInfo);
                currentBytes = 0;
                currentLines = 0;
            }
            if (currentBytes == 0) {
                tmpOut = new FileOutputStream(tmpFile);
                tmpOut.write(headerBytes);
                currentBytes = headerBytesLength;
                currentLines = 1;
            }
            tmpOut.write(bytes);
            currentBytes += bytes.length;
            currentLines++;
        }
        // Finished processing all rows.
        // Create a final batch for any remaining data.
        if (currentLines > 1) {
            createBatch(tmpOut, tmpFile, batchInfos, connection, jobInfo);
        }
    } finally {
        tmpFile.delete();
    }
    return batchInfos;
}

/**
 * Create a batch by uploading the contents of the file.
 * This closes the output stream.
 *
 * @param tmpOut
 *     The output stream used to write the CSV data for a single batch.
 * @param tmpFile
 *     The file associated with the above stream.
 * @param batchInfos
 *     The batch info for the newly created batch is added to this list.
 * @param connection
 *     The BulkConnection used to create the new batch.
 * @param jobInfo
 *     The JobInfo associated with the new batch.
 */

private void createBatch(FileOutputStream tmpOut, File tmpFile,
        List<BatchInfo> batchInfos, BulkConnection connection, JobInfo jobInfo)
        throws IOException, AsyncApiException {
    tmpOut.flush();
    tmpOut.close();
    FileInputStream tmpInputStream = new FileInputStream(tmpFile);
    try {
        BatchInfo batchInfo = connection.createBatchFromStream(jobInfo, tmpInputStream);
        System.out.println(batchInfo);
        batchInfos.add(batchInfo);
    } finally {
        tmpInputStream.close();
    }
}


Close the Job

After all batches have been added to a job, close the job. Closing the job ensures that processing of all batches can finish.

private void closeJob(BulkConnection connection, String jobId)
        throws AsyncApiException {
    JobInfo job = new JobInfo();
    job.setId(jobId);
    job.setState(JobStateEnum.Closed);
    connection.updateJob(job);
}

Thanks

Sai

KARUNAKAR ADAPA 6 days ago

Hi All,

I need your help. My scenario is SAP PO 7.5 to Salesforce using the REST adapter with OAuth 2.0.

I am able to send data from SAP PO 7.5 to Salesforce using the REST adapter with OAuth 2.0, including the PATCH method.

But how can I do Bulk API transmission using the REST adapter? Please guide me on how to achieve Bulk API data transmission.

Thanks in advance for your help.

Best Regards,

Karunakar A
