
File Location: Azure Blob Storage Connection via HTTPS

Mar 13, 2017 at 05:07 PM

Former Member

Hi,

I am extracting data from SAP systems into Azure Blob Storage using SAP Data Services. From what I could find out about the protocol Data Services uses to upload the data into the blob storage, it establishes a connection via Microsoft-HTTPAPI/2.0.

But we require an HTTPS connection.

Is there any way to change the connection type of the Azure Cloud Storage Protocol?

Thanks in advance!

Best regards,

Scott


3 Answers

Best Answer
Vamsi Krishna Sep 12, 2017 at 07:24 PM

Per SAP Note 2448174, Data Services uses HTTPS by default, even though the job trace shows it as HTTP only.

Former Member

I opened a ticket with SAP, and afterwards they created the note.

It was just displayed incorrectly in the traces.

But thanks for the reply.

Former Member Sep 21, 2017 at 08:12 AM

Hi Scott,

Could you please tell me how you are extracting the data and updating the blob storage?

I need to do it in a secure way, using a Shared Access Signature.

Please let me know.

Regards,

Praveen

Former Member

Hi Praveen,

In BODS you can create a file location that uses the Azure Cloud Storage protocol.

You need:

Account Name: can be found in Azure.

Account Shared Key: can be found in Azure.

Local Directory: a folder on your Data Services server or on a network drive. BODS transfers the file to the local directory and uploads it to the blob storage afterwards.

Container: the container in your blob storage. It can be created in Azure beforehand, or BODS will create it during the batch job.

This file location can be associated with a flat file format, and the flat file format can be used in any batch job.

First BODS copies the file to the local directory; when this is done, it uploads the file to the blob storage. So BODS does it sequentially.

You should consider a script that deletes the file after a successful upload; otherwise your local storage will fill up sooner or later.
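A minimal cleanup sketch in Data Services script, for a script object placed after the data flow that produces the file. The staging path D:\bods_staging and the Windows shell are illustrative assumptions, not from this thread; use your file location's Local Directory and the file name variable shown further below:

# Hedged sketch: remove the staged copy once the upload has finished.
# Assumes a Windows job server and a hypothetical staging path D:\bods_staging.
# $filename is the timestamped file name variable shown further below.
# exec() with flag 8 waits for the command and raises an error if it fails.
print('Deleting staged file ' || $filename);
exec('cmd.exe', '/C del "D:\bods_staging\' || $filename || '"', 8);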

Since BODS is not able to write into append blobs, you are not able to update the data in the blob, except by overwriting the file using the same name. If you do a full load every time, that would be an option.

Otherwise I recommend using a variable with a timestamp for the file name. This is how I do it:

$filename = to_char(sysdate(), 'YY') || '_' || to_char(sysdate(), 'MM') || '_' || to_char(sysdate(), 'DD') || '_' || to_char(systime(), 'hh24miss') || '_<TABLENAME>.dat';
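With this format the name comes out like 17_03_13_170700_<TABLENAME>.dat, so every run writes a new blob. If you declare $filename as a global variable of the job and set it in a script before the data flow, you should be able to reference it in the flat file format's file name field.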

Hope this helps!

Best regards

Scott

Former Member Feb 02 at 02:54 PM

Hi Scott,

We have the same requirement, and I am able to copy data from a local drive to an Azure blob container using the Azure Cloud Storage protocol.

I would like to know whether you have been able to create a path/folder dynamically within a container.
