Former Member

Dataflow - Automatically read from multiple text files

I have multiple text files generated by different systems.
I have currently created multiple dataflows in BODS to read those files and load them into a single table. All the dataflows are set to run in parallel.
So whenever a new file is placed, I replicate a dataflow and change the file location and name (all files and paths follow a standard naming convention).
I'm trying to automate this setup by maintaining certain parameters in a table so that the job automatically picks up and reads the text files.
We can achieve this with a while loop, but then the dataflows cannot run in parallel. The reason I want parallel execution here is that there are a lot of dataflows; running them sequentially takes a long time.
How can I handle this situation?
Any help would be greatly appreciated.


1 Answer

  • Aug 01, 2017 at 06:52 PM

    Hi Parsuwanath AnanthaVijayan,

    If you're loading all these files into a single target table and all the files are placed in a single location, then give the filename as *.txt in the file format editor. It will load all the files that exist in that path. You can build this in one dataflow only.

    If all your files have the same naming convention, for example sales_india, sales_us, sales_eur, then give the filename as sales*.txt.
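    The wildcard behaviour described above can be illustrated outside Data Services. This Python sketch (the file names are hypothetical, taken from the example above) shows how a pattern like sales*.txt picks up only the matching files in a directory, which mirrors what the file format editor does when the filename is set to a wildcard:

```python
import glob
import os
import tempfile

# Create a temporary directory with hypothetical files following the
# naming convention from the example (plus one non-matching file).
workdir = tempfile.mkdtemp()
for name in ["sales_india.txt", "sales_us.txt", "sales_eur.txt", "inventory.txt"]:
    open(os.path.join(workdir, name), "w").close()

# The pattern sales*.txt matches only the sales files; inventory.txt
# is skipped, just as it would be by the file format editor.
matched = sorted(
    os.path.basename(p)
    for p in glob.glob(os.path.join(workdir, "sales*.txt"))
)
print(matched)  # ['sales_eur.txt', 'sales_india.txt', 'sales_us.txt']
```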

    You can use a substitution parameter in DS to parameterize the file location, and call the parameter in place of the file location.

    Please let me know if your requirement is different.

    Thanks,

    Ravi kiran


    • Hi Parsuwanath AnanthaVijayan,

      In this case, try to move all files to a single location daily and process them in a single dataflow. Note that this still processes the files one by one. You can maintain a control table that stores the file path and file name for each file; whenever you get a new file, just add its path and name to the table. Then build a script with a while loop that automatically moves all the files from their different locations to the single location using that table.
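      The control-table approach above can be sketched as follows. This Python example is only an illustration of the logic the DS while-loop script would implement; the table rows, directory names, and file names are all hypothetical:

```python
import os
import shutil
import tempfile

# Hypothetical control table: each row registers where a file arrives.
# In DS this would be a database table read with sql() inside the script.
control_table = [
    {"filepath": "system_a", "filename": "sales_india.txt"},
    {"filepath": "system_b", "filename": "sales_us.txt"},
]

root = tempfile.mkdtemp()
staging = os.path.join(root, "staging")
os.makedirs(staging)

# Create the source files so the sketch is self-contained.
for row in control_table:
    src_dir = os.path.join(root, row["filepath"])
    os.makedirs(src_dir, exist_ok=True)
    open(os.path.join(src_dir, row["filename"]), "w").close()

# Loop over the control table (the while loop in the DS script) and move
# each registered file into the single staging location that the one
# dataflow reads from with a wildcard filename.
i = 0
while i < len(control_table):
    row = control_table[i]
    shutil.move(os.path.join(root, row["filepath"], row["filename"]), staging)
    i += 1

print(sorted(os.listdir(staging)))  # ['sales_india.txt', 'sales_us.txt']
```

      Adding a new source file then only requires inserting a row into the control table, with no change to the job itself.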

      Otherwise, whenever you get a new file, replicate the dataflow as you are doing now.

      Thanks,

      Ravi kiran.