I'm creating a characteristic relationship exit class based on the standard class CL_RSPLS_CR_EXIT_BASE.
In the DERIVE method there is some example code showing how to use the buffer.
It looks like this:
* begin of example code:
* use the buffer?
* o_use_buffer is switched on by default in the constructor
*  IF o_use_buffer = rs_c_true.
**   yes:
*    ASSIGN o_r_th_buf->_d* TO <l_th_buf>.
*    ASSIGN o_r_s_buf->* TO <l_s_buf>.
*    <l_s_buf> = c_s_chas.
*    READ TABLE <l_th_buf> INTO <l_s_buf> FROM <l_s_buf>.
*    IF sy-subrc = 0.
*      IF o_r_is_valid->* = rs_c_true.
*        c_s_chas = <l_s_buf>.
*        RETURN.
*      ELSE.
...
1. First of all, I don't understand this line:
ASSIGN o_r_th_buf->_d* TO <l_th_buf>.
and it doesn't compile either.
Does anybody know what it is supposed to do?
Is it an error in the example code?
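For what it's worth, `->*` is the regular ABAP syntax for dereferencing a data reference, so the stray `_d` looks like a typo in the shipped example. A minimal sketch of what the two ASSIGN statements were presumably meant to do (the attribute names come from the example; the generic typing of the field symbols is my assumption):

    FIELD-SYMBOLS: <l_th_buf> TYPE HASHED TABLE,
                   <l_s_buf>  TYPE any.

    * dereference the data references held by the base class:
    * o_r_th_buf points to the buffer table, o_r_s_buf to a work area
    ASSIGN o_r_th_buf->* TO <l_th_buf>.
    ASSIGN o_r_s_buf->*  TO <l_s_buf>.

With the `_d` removed, the line compiles and simply makes the buffered table addressable through the field symbol.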
2. Second, the READ statement seems off:
READ TABLE <l_th_buf> INTO <l_s_buf> FROM <l_s_buf>.
It reads with the input structure as the key and, if the row is found, returns that same data.
From my point of view, that means either no derivation had been applied when the buffer entry was saved, or we do not get the derived data back now.
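One possible reading: if the buffer is a hashed table whose key consists only of the source characteristics, the statement would make sense after all, because READ TABLE ... FROM uses only the key components of the supplied structure. The non-key (derived) fields of <l_s_buf> are then ignored for the lookup and come back filled from the buffered row. A small self-contained sketch of that behavior, with made-up field names (SRC/TGT) that are not from the exit class:

    TYPES: BEGIN OF ty_buf,
             src TYPE string,   " source characteristic (key)
             tgt TYPE string,   " derived characteristic (non-key)
           END OF ty_buf.

    DATA lt_buf TYPE HASHED TABLE OF ty_buf WITH UNIQUE KEY src.
    DATA ls_rec TYPE ty_buf.

    * fill the buffer with an already derived record
    ls_rec-src = 'A'.
    ls_rec-tgt = 'DERIVED_FOR_A'.
    INSERT ls_rec INTO TABLE lt_buf.

    * look up with a record whose derived field is still empty;
    * READ ... FROM evaluates only the key component SRC
    CLEAR ls_rec-tgt.
    READ TABLE lt_buf INTO ls_rec FROM ls_rec.
    * if found, ls_rec-tgt again holds the buffered derived value

Whether the buffer table in CL_RSPLS_CR_EXIT_BASE is actually keyed that way is exactly what I'd like someone to confirm.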
Has anybody implemented buffering of derivation successfully?