I've been logging a number of issues with Data Services, and they rarely come back from SAP with anything helpful. For example:
Q: I get an error reading table abc
A: Use an ABAP data flow
Q: I have corrupt data in field x in table abc. How can I work around this?
A: Use an ABAP data flow
Q: Extractor xyz gives me a storage space error. How can I determine how much temporary space is needed?
A: Use an ABAP data flow
While I'm sure ABAP data flows are wonderful, they don't suit our environment very well. We chose Data Services so we could use the built-in extractors and avoid having to make changes on the SAP side; our change control around ECC means we would rather not use ABAP data flows if possible.
Firstly, is this a typical support experience?
Secondly, I know that not all extractors are officially supported, but they were the main new feature sold to us in DS4 - surely more effort should be going into helping customers. Most of the issues seem to be ones that are not specific to Data Services, but would apply to any use of the extractors.
Finally, why is table support so flaky? In every case where we have had trouble reading a table, we have been able to put together a custom extractor to read it in a couple of hours. This implies that the SAP RFC_READ_TABLE function itself is the main issue here. Is there a 'fixed' version of this function for Data Services without the 512-byte limit and storage issues?
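For what it's worth, my understanding is that the 512 limit exists because RFC_READ_TABLE returns each row in a single CHAR(512) DATA field, so any combination of requested fields wider than that fails. A workaround we've seen discussed is to split the field list across several calls and join the results on the key. A minimal sketch of that chunking logic (the field names and lengths below are made up for illustration, and `chunk_fields` is my own helper, not an SAP or DS API):

```python
def chunk_fields(fields, row_limit=512):
    """Split (fieldname, length) pairs into groups whose combined
    length fits within one RFC_READ_TABLE row (row_limit bytes)."""
    chunks, current, used = [], [], 0
    for name, length in fields:
        if length > row_limit:
            # A single field wider than the row can't be read this way at all
            raise ValueError(f"{name} ({length} bytes) cannot fit in one row")
        if used + length > row_limit:
            chunks.append(current)
            current, used = [], 0
        current.append(name)
        used += length
    if current:
        chunks.append(current)
    return chunks

# Hypothetical wide table whose fields total more than 512 bytes:
wide_table = [("MATNR", 18), ("MAKTX", 40), ("LTEXT", 400), ("NOTES", 200)]
print(chunk_fields(wide_table))
# → [['MATNR', 'MAKTX', 'LTEXT'], ['NOTES']]
```

Each chunk would then be fetched with a separate RFC_READ_TABLE call (in practice you'd also repeat the key field in every chunk so the partial rows can be joined back together). It works, but it's exactly the kind of plumbing I'd expect the tool to handle for us.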