Loading the data into Hadoop via HTTPFS in SAP Data Services (BODS)

former_member700508
Discoverer

Dear Team,

I am loading source table/flat file data into Hadoop via HTTPFS using the SAP Data Services (BODS) tool, and the load into the Hadoop layer succeeds. However, can I use a global variable ($G_Location_Path) to set the location, so that the same job can load data in .csv format into different target locations? I have more than 40 target locations to be loaded via FILE LOCATION, and I would rather not create a separate FILE LOCATION connection for each of them. A screenshot is attached for your reference.

Please let me know how to parameterise the location/path in the FILE LOCATION feature (highlighted with an arrow in the screenshot). It appears that FILE LOCATION behaves like a datastore connection, which cannot be parameterised.

Note: this is not a normal .csv load. A target flat file can easily be parameterised, but this is a FILE LOCATION. I have tried different options without success; please check the attached screenshot and respond as soon as possible.

Source: table/flat file

Target: .csv flat file

File location: a File Location created with protocol type Hadoop.

sap-query.png
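
For reference outside the Designer: HTTPFS exposes the standard WebHDFS REST API, so the write this job performs can be reproduced in plain code with the target path as an ordinary variable. Below is a minimal Python sketch of that idea, assuming a gateway on the default HttpFS port 14000; the host, user, and paths are hypothetical placeholders, not values from this thread.

    # Hypothetical sketch: PUT one CSV to several HDFS directories through an
    # HttpFS gateway, parameterising the target path the way $G_Location_Path would.
    import requests

    HTTPFS = "http://httpfs-host:14000"  # placeholder host; 14000 is the HttpFS default port
    USER = "bods_etl"                    # placeholder Hadoop user

    def put_csv(local_file: str, hdfs_path: str) -> None:
        """Write local_file to hdfs_path via the WebHDFS CREATE operation, sending the data inline."""
        url = (f"{HTTPFS}/webhdfs/v1{hdfs_path}"
               f"?op=CREATE&data=true&overwrite=true&user.name={USER}")
        with open(local_file, "rb") as f:
            resp = requests.put(url, data=f,
                                headers={"Content-Type": "application/octet-stream"})
        resp.raise_for_status()

    # One call per target location instead of one FILE LOCATION object each.
    for path in ["/data/region_a/out.csv", "/data/region_b/out.csv"]:  # ... 40+ targets
        put_csv("out.csv", path)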

Accepted Solutions (1)

former_member187605
Active Contributor

You cannot use a Global Variable in a File Location definition.

Answers (1)

former_member700508
Discoverer

Thanks, Dirk, for your response. But is there any way to parameterise the remote directory path in a File Location? I even tried substitution parameters; that did not work either.

Is there any workaround for this? We have to place .csv files in many locations (around 40-60).

Best regards,

Jayanth

former_member187605
Active Contributor

Use an HDFS File Format, where you can parameterise both the Root directory and the File name.
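
One practical companion to a parameterised root directory is pre-creating the 40-60 target directories before the job writes to them; the WebHDFS MKDIRS operation can do this. A small sketch under the same hypothetical gateway and user as the earlier example:

    # Hypothetical companion sketch: pre-create each target root directory via the
    # WebHDFS MKDIRS operation so a parameterised root directory never hits a
    # missing path at run time.
    import requests

    HTTPFS = "http://httpfs-host:14000"  # same placeholder gateway as above
    USER = "bods_etl"                    # same placeholder user

    def mkdirs(hdfs_dir: str) -> bool:
        """Return True if hdfs_dir exists or was created."""
        url = f"{HTTPFS}/webhdfs/v1{hdfs_dir}?op=MKDIRS&user.name={USER}"
        resp = requests.put(url)
        resp.raise_for_status()
        return resp.json().get("boolean", False)

    for d in ["/data/region_a", "/data/region_b"]:  # ... all 40-60 roots
        print(d, "->", "created/exists" if mkdirs(d) else "failed")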