
Loading data into Hadoop via HTTPFS in SAP Data Services (BODS)

Dear Team,

I am trying to load source table/flat-file data into Hadoop via HTTPFS using the SAP BODS tool, and the load into the Hadoop layer now succeeds. However, can I use a global variable ($G_Location_Path) to set the location, so that the data can be written in .csv format to different target locations? I have more than 40 target locations to load via FILE LOCATION, and I would rather not create a separate FILE LOCATION connection for each of them. A screenshot is attached for reference.

Please let me know how to parameterise the location/path in the FILE LOCATION feature (highlighted with an arrow in the screenshot). It looks like the FILE LOCATION behaves like a datastore connection, which cannot be parameterised.

Note: this is not a normal .csv load. A target flat file is easy to parameterise, but this is a FILE LOCATION; I have tried different options and none of them worked. Please check the attached screenshot.

Source: table or flat file

Target: .csv flat file

File location: a File Location created with protocol type Hadoop.
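For context, what I am effectively trying to drive from $G_Location_Path is just the remote path of a single HttpFS REST call. A rough sketch of that call (hostname, port, user name, and paths here are placeholders of mine, not real values):

```python
import requests

# Sketch only -- host, port, user and paths below are placeholders.
HTTPFS = "http://httpfs-host:14000/webhdfs/v1"

def upload_csv(local_file, hdfs_path):
    # One-step HttpFS upload: with data=true and an octet-stream
    # Content-Type, the gateway accepts the file body directly.
    with open(local_file, "rb") as f:
        resp = requests.put(
            HTTPFS + hdfs_path,
            params={"op": "CREATE", "overwrite": "true",
                    "user.name": "bods_user", "data": "true"},
            headers={"Content-Type": "application/octet-stream"},
            data=f,
        )
    resp.raise_for_status()

# The path argument is the part I want to drive from $G_Location_Path:
upload_csv("/staging/output.csv", "/data/region_01/output.csv")
```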

Attachment: sap-query.png


2 Answers

  • Best Answer
    Jan 03 at 07:59 AM

    You cannot use a Global Variable in a File Location definition.
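    A File Location behaves like a datastore connection, as you noted: its attributes are fixed at design time. If one job has to reach 40+ directories, a possible workaround (a sketch only; the gateway host, port, user, and paths below are placeholder assumptions, not values from your system) is to take the final copy out of the File Location's hands: write the .csv once with an ordinary flat-file target, whose path does accept a global variable, and then push it to each HDFS directory through the HttpFS REST API, where the target path is just a string:

    ```python
    import requests

    # Sketch only -- gateway, user, staging path and directory list
    # are placeholder assumptions.
    HTTPFS = "http://httpfs-host:14000/webhdfs/v1"
    USER = "bods_user"
    LOCAL_CSV = "/staging/output.csv"

    # Keep the 40-60 target directories as data (a control table or
    # config file) instead of 40 File Location objects.
    locations = ["/data/region_%02d" % i for i in range(1, 41)]

    with open(LOCAL_CSV, "rb") as f:
        payload = f.read()

    for loc in locations:
        # data=true lets HttpFS take the body in the CREATE request
        # itself, so the target path is an ordinary string per call.
        resp = requests.put(
            "%s%s/output.csv" % (HTTPFS, loc),
            params={"op": "CREATE", "overwrite": "true",
                    "user.name": USER, "data": "true"},
            headers={"Content-Type": "application/octet-stream"},
            data=payload,
        )
        resp.raise_for_status()
        print("uploaded to", loc)
    ```

    This keeps the per-location variability in data rather than in 40 repository objects; such a script could also be invoked from a Data Services script step via exec().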


  • Jan 03 at 08:15 AM

Thanks, Dirk, for your response. But is there any way to parameterise the remote directory path in the File Location? I even tried substitution parameters, and they did not work either.

Is there any workaround for this? We have to place .csv files in many locations (around 40-60).

    Best regards,

    Jayanth
