Currency conversion using Routine with Unit (R_W_UNIT)

Dear Gurus,

I have tried to use this rule type to derive the local currency from the document currency.

I have a routine that checks whether the local currency and the document currency are both available. If they differ, the amount is converted to local currency using the exchange rate (from the source fields).

More details are below; a few other fields use a similar conversion routine.

I am using the function module 'CONVERT_TO_LOCAL_CURRENCY'.

When I load the data to the InfoCube I find duplicate records in it, as below (test data).

One record has the local currency blank, and the other has the local currency filled and the corresponding key figure updated.

Please help me correct this if there is a mistake in how I am using the rule type.

The routine is coded as below:

data: lc_rate_dec type p decimals 5.

lc_rate_dec = source_fields-exchg_rate.

if source_fields-doc_currcy = source_fields-loc_currcy
*  no conversion necessary -> Main case 1
   and not ( source_fields-doc_currcy is initial
          or source_fields-loc_currcy is initial ).

  result = hlp_value.

elseif not ( source_fields-doc_currcy is initial
          or source_fields-loc_currcy is initial
          or source_fields-trans_date is initial ).
* conversion necessary with SOURCE_FIELDS-TRANS_DATE -> normally not possible

  call function 'CONVERT_TO_LOCAL_CURRENCY'
    exporting
      date             = source_fields-trans_date
      foreign_amount   = hlp_value
      foreign_currency = source_fields-doc_currcy
      local_currency   = source_fields-loc_currcy
      rate             = lc_rate_dec
    importing
*     exchange_rate    =
      local_amount     = result
    exceptions
      no_rate_found    = 1
      overflow         = 2
      no_factors_found = 3
      no_spread_found  = 4
      derived_2_times  = 5.

  if sy-subrc ne 0.
*   message a802 with SOURCE_FIELDS-TRANS_DATE SOURCE_FIELDS-DOC_CURRCY
*                     SOURCE_FIELDS-LOC_CURRCY sy-subrc.
  endif.

else.
* if conversion not possible -> assign target values
  result   = hlp_value.
  currency = source_fields-doc_currcy.
endif.

currency = source_fields-loc_currcy.
* RETURNCODE = 0.

Attachments: Rule type 1.png (50.9 kB), Rule type 2.png (32.4 kB)
2 Answers

  • Best Answer
    Jan 08, 2015 at 01:53 AM

    Dear All,

The issue is with the field routine, not the start routine.

I realized that the unit (CURRENCY) was not being passed back from the field routine along with RESULT.

I do not know why it loops back to the field routine and adds a duplicate row when the currency is not passed.

I appreciate your help.
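For anyone who hits the same symptom, this is roughly the pattern that was missing (a sketch only, with the conversion call left out; names as in the routine above):

* Sketch only: every path of the field routine fills CURRENCY together
* with RESULT (the CONVERT_TO_LOCAL_CURRENCY call is omitted here).
if source_fields-doc_currcy = source_fields-loc_currcy.
  result   = hlp_value.
  currency = source_fields-loc_currcy.
else.
  result   = hlp_value.
  currency = source_fields-doc_currcy.
endif.
returncode = 0.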

    Regards,

    Sudhir.


  • Dec 29, 2014 at 06:28 AM

    A field routine cannot normally create duplicate records. Please compare with the source data - is it really doubling the records?

Also, every time you assign a value to the RESULT and CURRENCY fields, please add the following two lines:

    RETURNCODE = 0.

    RETURN.

    Without these two lines, the logic is bound to be wrong.
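Applied to the first branch of your routine, it would look roughly like this (a sketch, not your exact code):

if source_fields-doc_currcy = source_fields-loc_currcy
   and not ( source_fields-doc_currcy is initial
          or source_fields-loc_currcy is initial ).
* same-currency case: pass the amount through together with its unit
  result     = hlp_value.
  currency   = source_fields-loc_currcy.
  returncode = 0.  " 0 = the record is passed on to the target
  return.          " leave so later statements cannot overwrite RESULT/CURRENCY
endif.
* the ELSEIF and ELSE branches would get the same two lines at their end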


    • Suhas Karnik (in reply to Chanda Janardhan sudhir Kumar)

      That's a valid point. At least for the example you have given (the one for which you ran the debug DTP) there are two records in the source. For this example the records in target = records in source.

      It might be that some other set of records is getting doubled. There might be some code in the Start/End routine that appends to the Source/Result package. To test that you'll need to identify an example record that is getting doubled, by comparing the source and target data.
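For instance, a hypothetical end routine fragment like this (not taken from your transformation) would double every record and is the kind of code to look for:

* Hypothetical example only: an end routine that appends a copy of the
* result package back onto itself, doubling every record.
data lt_copy like result_package.

lt_copy = result_package.
append lines of lt_copy to result_package.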