
File Location Azure Blob Storage Connection via HTTPS


I am extracting data from SAP systems into Azure Blob Storage using SAP Data Services. From what I could find out about the protocol Data Services uses to upload the data into the blob storage, it establishes a connection via Microsoft-HTTPAPI/2.0.

But we require an HTTPS connection.

Is there any way to change the connection type of the Azure Cloud Storage Protocol?

Thanks in advance!

Best regards,




3 Answers

  • Best Answer
    Sep 12, 2017 at 07:24 PM

Per SAP Note 2448174, the connection uses HTTPS by default, even though the job trace shows it as HTTP only.


  •
    Former Member
    Sep 21, 2017 at 08:12 AM

    Hi Scot,

    Could you please tell me how you are extracting data and updating the blob storage?

    I need to do it in a secure way using a Shared Access Signature.

    Please let me know.




    • Hi Praveen,

      In BODS you can create a file location that uses the Azure Cloud Storage protocol.

      You need:

      Account Name: can be found in Azure.

      Account Shared Key: can be found in Azure.

      Local Directory: a folder on your Data Services server or on a network drive. BODS first transfers the file to the local directory and afterwards uploads it to the blob storage.

      Container: the container in your blob storage. It can be created in Azure, or BODS will create it during the batch job.

      This file location can be associated with a flat file format.

      The flat file format can be used in every batch job.

      First, BODS copies the file to the local directory; once that is done, the file is uploaded. So BODS does this sequentially.

      You should consider a script that deletes the file after a successful upload. Otherwise your local storage will fill up sooner or later.

      Since BODS is not able to write into Append Blobs, you cannot update the data in the blob, unless you overwrite the file by using the same name. If you do a full load every time, that would be an option.

      Otherwise I recommend using a variable for the filename with a timestamp (you can see it in the screenshot above).

      This is how I do it:

      $filename = to_char(sysdate(), 'YY') || '_' || to_char(sysdate(), 'MM') || '_' || to_char(sysdate(), 'DD') || '_' || to_char(systime(), 'hh24miss') || '_<TABLENAME>.dat';
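
      For the cleanup mentioned above, a delete step could look like the following. This is only a sketch: it assumes a Windows job server and a hypothetical global variable $G_LocalDir that holds the same path as the file location's Local Directory setting — adjust both for your environment.

      ```
      # Remove the local copy once the batch job has uploaded it.
      # $G_LocalDir is a hypothetical global variable ending with a backslash,
      # e.g. 'D:\DS_Staging\'; $filename is the variable set above.
      if (file_exists('[$G_LocalDir][$filename]') = 1)
      begin
          # exec() with flag 8 waits for the OS command to finish
          exec('cmd.exe', '/c del "[$G_LocalDir][$filename]"', 8);
      end
      ```

      Placing this in a script object after the data flow keeps the staging folder from growing with every run.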

      Hope I could help you!

      Best regards


  •
    Former Member
    Feb 02, 2018 at 02:54 PM

    Hi Scott,

    We have the same requirement, and I am able to copy data from the local drive to an Azure blob container using the Azure Cloud Storage protocol.

    I would like to know if you have been able to create a path/folder dynamically within a container.
