
ORA-01114 : Runtime Errors: DBIF_REPO_SQL_ERROR

Former Member
0 Kudos

When I run a report in RSRT, it throws the following error:

Messages:

ORA-01114: IO error writing block to file (block # )

SQL Error: 1114

Error while reading data; navigation is possible

>> Row: 71 Inc: NEXT_PACKAGE Prog: CL_SQL_RESULT_SET

Since this is an SQL-related error, I have contacted our Basis team; they are still not able to resolve it.

We have also checked whether it is caused by a tablespace issue, but it is not.

Please help me with this if any of you have faced a similar error before.

I also see that some queries execute fine, while others throw this error.

One more case we observed: when we display data in an InfoCube with a selection on, say, profit center, and restrict the display to 200 rows, it works fine.

But when we display the same data with the same selection on profit center without a row limit, it throws this error.

Please suggest a solution for this as soon as possible.

Regards

Krishna.

Accepted Solutions (0)

Answers (1)


former_member188080
Active Contributor
0 Kudos

Hi,

The point to note is the file number in the error (file 256 in the case I saw): a file number greater than 255 indicates a temp datafile, which points to an issue with the PSAPTEMP tablespace. When we checked PSAPTEMP, one of the filesystems holding a temp datafile was 100% full, which prevented the temp datafile from growing and shrinking as required.

So, adding more space to the filesystem solved the problem.
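One way to cross-check this diagnosis (a sketch only, assuming SQL*Plus access with DBA privileges; note the exact cutoff for temp file numbers depends on the DB_FILES parameter, it is not always 255):

```sql
-- Show the DB_FILES limit; file numbers reported above it refer to temp files
SHOW PARAMETER db_files;

-- List the temp datafiles with their sizes and locations, then check the
-- filesystems they live on at OS level (e.g. with "df -h")
SELECT file#, name, bytes/1024/1024 AS size_mb, status
  FROM v$tempfile;
```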

http://forums.sdn.sap.com/thread.jspa?threadID=695305&tstart=0

Also check whether a server restart resolves the error, and follow up with Basis as well.

Thanks and regards

Kiran

Former Member
0 Kudos

Hi,

I agree with Kiran. We too faced the same problem.

We worked with the Basis team, who extended the temp tablespace, which resolved the issue.

A temporary workaround if these are full loads:

Try deleting the older PSA requests (keep only the last 3-4 days of requests) for this data flow and repeat the load.

Regards,

Sunil

Former Member
0 Kudos

I asked our Basis team to check for tablespace issues; they say there is no problem with the tablespace.

When it comes to PSAPTEMP, it shows as 0% used.
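For what it is worth, 0% used in the tablespace overview can be misleading: temp space is only consumed while sorts are actually running, and ORA-01114 can come from the OS filesystem underneath a temp datafile being full even when the tablespace itself looks empty. A couple of queries that may help cross-check (a sketch, assuming DBA access to the v$ views; also run df -h on the filesystems holding the temp files):

```sql
-- Allocated vs free space inside each temp file header
SELECT tablespace_name,
       bytes_used/1024/1024 AS used_mb,
       bytes_free/1024/1024 AS free_mb
  FROM v$temp_space_header;

-- Sessions currently holding temp segments, if any
SELECT tablespace, username, SUM(blocks) AS blocks_in_use
  FROM v$sort_usage
 GROUP BY tablespace, username;
```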

Regards

Krishna

anindya_bose
Active Contributor
0 Kudos

Hi

Somehow Oracle is not able to access your temp tablespace.

Ask your Basis team to drop and recreate the temp tablespace.

You can find a similar issue described in the link below:

http://oracleplz.blogspot.com/2006/01/ora-01114-io-error-writing-block-to.html

Regards

Anindya

Former Member
0 Kudos

Thanks all for the replies.

Our Basis admin says that dropping and recreating temp tablespaces is a problem (since it has something to do with the network, on UNIX?).

Is it really such a tough (huge) job to recreate temp tablespaces?

regards

Krishna

anindya_bose
Active Contributor
0 Kudos

Not at all. In case they do not want to drop the existing one, they can just add a new temp file and point the tablespace to that.

I do not remember exactly what we did, but it was only two SQL commands, roughly like the below (add the new file first, so the tablespace is never left without a temp file):

SQL> ALTER TABLESPACE temp ADD TEMPFILE '/oradata/temp03.dbf' SIZE 100M;

SQL> ALTER DATABASE TEMPFILE '/oradata/temp02.dbf' DROP INCLUDING DATAFILES;
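If the whole temp tablespace is suspect, an alternative (a sketch only; the tablespace name, file path, and size here are placeholders that your Basis team would need to adapt) is to create a fresh temp tablespace, make it the database default, and drop the old one once no session is using it:

```sql
-- Create a replacement temp tablespace (name, path, and size are placeholders)
CREATE TEMPORARY TABLESPACE psaptemp2
  TEMPFILE '/oradata/psaptemp2_01.dbf' SIZE 2000M AUTOEXTEND ON;

-- Point new sorts at the replacement
ALTER DATABASE DEFAULT TEMPORARY TABLESPACE psaptemp2;

-- When v$sort_usage shows no sessions in the old tablespace, drop it
DROP TABLESPACE psaptemp INCLUDING CONTENTS AND DATAFILES;
```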

Regards

Anindya

Edited by: Anindya Bose on Feb 6, 2012 9:33 PM

Former Member
0 Kudos

We have tried all the options: we increased the size of the temp files, but the issue is still not resolved.

One thing we have observed (not sure if this is the only cause):

We see this error specifically when we run a query on our InfoCube 0wbs_C11, which has a MANDATORY VARIABLE ON 0PROJECT. Does this variable have anything to do with the error? We have around five reports on the same InfoCube, and we do not see this error in the other two cases, where there is no mandatory variable on project.

NOTE: We see this issue in our development client, whereas the same queries and InfoCubes exist in the production client and we do not see any issues there.

We also see this error in several other cases, for example when compressing InfoCube zic_c03.

We have absolutely no clue why, or when, this is happening.

Please suggest a solution as soon as possible; several developments are pending because of this.

Regards

Krishna