on 01-02-2019 5:01 PM
Dear Team,
I am loading source flat-file/table data into Hadoop via HTTPFS using the SAP BODS tool, and the load into the Hadoop layer succeeds. Can I use a global parameter ($G_Location_Path) in the FILE LOCATION object to set the path, so the same .csv output can be written to different locations? I have more than 40 target locations to load, and I would rather not create a separate FILE LOCATION connection for each one. Screenshot attached for reference.
Please let me know how to parameterise the location/path in the FILE LOCATION feature (highlighted with the arrow). It seems the FILE LOCATION behaves like a datastore connection, which cannot be parameterised.
Note: this is not a normal .csv load. A flat-file target path can easily be parameterised, but this is a FILE LOCATION. I have tried different options without success. Please check the attached screenshot and respond as soon as you can.
Source: table/flat file
Target: .csv flat file
File Location: created with protocol type Hadoop
You cannot use a Global Variable in a File Location definition.
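Since the File Location path itself cannot be parameterised, one common workaround is to stage the .csv to a single location and fan it out to the many HDFS directories with a small script that calls the WebHDFS REST API exposed by HTTPFS. The sketch below assumes HTTPFS on its default port 14000; the host name, user, and target directories are placeholders, not values from this thread.

```python
# Sketch: fan out one staged .csv to many HDFS directories via HTTPFS.
# Host, port, user, and target paths below are illustrative placeholders.
import urllib.request

HTTPFS_BASE = "http://httpfs-host:14000/webhdfs/v1"  # placeholder endpoint

def create_url(target_dir: str, filename: str, user: str = "bods") -> str:
    """Build the WebHDFS CREATE URL for one target directory.
    data=true tells HttpFS to accept the file body in this request directly."""
    return (f"{HTTPFS_BASE}{target_dir}/{filename}"
            f"?op=CREATE&overwrite=true&data=true&user.name={user}")

def upload_to_all(local_path: str, target_dirs: list, filename: str) -> None:
    """PUT one staged .csv into every target directory."""
    with open(local_path, "rb") as f:
        payload = f.read()
    for d in target_dirs:
        req = urllib.request.Request(
            create_url(d, filename),
            data=payload,
            method="PUT",
            headers={"Content-Type": "application/octet-stream"},
        )
        urllib.request.urlopen(req)  # raises urllib.error.HTTPError on failure

# The 40-60 directories could be kept in a config table or file and read here:
targets = ["/data/region_a", "/data/region_b"]  # placeholder paths
```

The script could be triggered from the BODS job after the staged file is written (for example via an exec call in a script step), so only one File Location is needed for the staging directory.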
Thanks, Dirk, for your response. Is there any way to parameterise the remote directory path in a File Location? I also tried substitution parameters; that did not work either.
Is there any workaround for this, since we have to place .csv files in many locations (around 40 to 60)?
Best regards,
Jayanth