
Reading flat files and moving them to a different location using a script in Data Services

Hello Experts,

I want to read flat files from an FTP folder location within Data Services and, after reading them, move the files to a different folder on the same FTP server, so that only new files are picked up by DS on the next run. I would like to do this using a script.

I have checked a couple of threads, but they are not straightforward; either something is missing or the script code simply does not work. Can someone please suggest the script a little more clearly and describe it well? I am also continuing my own search on how to achieve this.

Regards,

Neil


4 Answers

  • Sep 27, 2018 at 05:07 PM

    Thanks, moderator, for suggesting the similar thread BODS - Move file from SFTP to local, but unfortunately that is not what I am looking for: the dependency on FILE_LOCATION means it needs those custom functions to read the input and output file variables. Instead I want to use FILE_EXISTS and a CMD MOVE command inside a SCRIPT.

    Kindly suggest, experts, if anyone has implemented this within your project.

    Neil.


  • Sep 30, 2018 at 03:11 AM

    Hello Experts,

    Any suggestions on reading a file and moving it to an archive folder using a BODS script?

    Regards,

    Neil


  • Oct 02, 2018 at 05:04 PM

    Hello Neil

    You can configure that functionality in a script. The following is sample code:
    Use substitution parameters for your file locations.

    # Build the source file name and the date-stamped archive file name.
    # $$FilePathConverted and $$ArchiveFolder are substitution parameters.
    $GV_FileName = '[$$FilePathConverted]\ABC.csv';
    $GV_FileArchive = 'ABC'||'_'||to_char(sysdate(),'dd-mm-yyyy')||'.csv';
    
    # If the source file exists, move it to the archive folder with the Windows move command.
    IF ( file_exists( $GV_FileName ) = 1 )
    begin
    Exec('cmd','move "[$$FilePathConverted]\ABC.csv" "[$$ArchiveFolder]\[$GV_FileArchive]"',8);
    print('{$GV_FileName}');
    print('Above file archived to the folder : [$$ArchiveFolder]');
    end
    ELSE
    begin
    print('{$GV_FileName}');
    print('For above file - MATCH NOT FOUND. Archive not possible.');
    end
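
    If several files matching a pattern have to be archived in one go, a wildcard variant of the same move command could be used. This is only a sketch, assuming a Windows job server; the $GV_MoveOutput variable, the *.csv pattern, and the substitution parameters are placeholders, and the files keep their original names in the archive folder:
    
    # Hypothetical wildcard variant: move every CSV left in the landing folder to the archive folder.
    # The exec() flag 8 is kept as in the script above.
    $GV_MoveOutput = exec('cmd','move /Y "[$$FilePathConverted]\*.csv" "[$$ArchiveFolder]"',8);
    print('Archive step returned: [$GV_MoveOutput]');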
    

    Regards,
    Rishabh


    • Hi Rishabh,

      Thanks for your script; it works pretty well, as expected, and I have made some further additions to it for my requirement. The challenge I am facing now is this: if I place the script before all the data flows, it checks for the file and moves it to the archive folder before BODS reads the file, and the job terminates with "Cannot open file <//File_path/File_name.txt>. Check its path and permission." I suspect this error pops up because the script has already moved the file to the archive folder; the source file folder has complete rights for the BO user. And if I keep the script at the end instead, a different error message comes up.

      Kindly suggest where I should place the script.

      Regards,

      Neil

  • Nov 22, 2018 at 09:15 PM

    Hello Neil

    The script is just for archiving the files. Check for the file before you start the load in the data flows, and once the load is complete, archive the files.

    Use a script to capture the value of file_exists( $GV_FileName ) in a variable and execute the data flows only if the file is available. Once the load is completed, run the archive step in another script, as sketched below.
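
    As a rough sketch of that layout (the file name, variable names, and substitution parameters are placeholders, not from a specific project), the pre-load check script could look like this, with the archive script from the earlier answer reused after the data flows:
    
    # Script 1 - before the data flows: record whether the source file is there.
    # $GV_FileName and $GV_FileExists are assumed global variables of the job.
    $GV_FileName = '[$$FilePathConverted]\ABC.csv';
    $GV_FileExists = file_exists( $GV_FileName );
    print('File check for [$GV_FileName] returned [$GV_FileExists]');
    
    # Conditional - in the job, branch on ( $GV_FileExists = 1 ) and place the data flows
    # in the Then block.
    
    # Script 2 - after the data flows in the Then block: run the archive script from the
    # earlier answer (the Exec('cmd','move ...') block).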
