on 07-27-2017 4:31 PM
Hi all,
I am using a stored procedure to extract data from a calculation view and store the result in a column table.
I move very large quantities of data: the table can hold 10 to 15 million records, and each extraction can move 1 to 2 million records.
This is my SQL statement:
INSERT INTO "SCHEMA"."TABLE"
( SELECT "F1", "F2", ... FROM Calculation_View )
During this loading process I can see that the RAM consumption is heavy. Is it possible to release the allocated memory once the procedure finishes? How can I reduce RAM consumption after the procedure's execution?
This sounds a lot like your approach of copying data is not working well for you.
Why do you do this in the first place?
With HANA there is usually no need to manually manage memory - if you load more data than your HANA system can handle, you will run into issues no matter what. So look instead into why you use up that much memory in the first place.
The result is displayed in BOC, and the performance is not acceptable for business users.
I asked the Basis team about the frequency of the delta merge process and whether it is possible to increase it. This could solve my problem.
That's likely just a band-aid here.
You might as well trigger the delta merge for this table specifically once you have loaded your data. To hold off the automatic merge while the insert runs, you could even disable the AUTOMERGE option of the table.
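A sketch of that sequence in SAP HANA SQL, reusing the "SCHEMA"."TABLE" names from the question (the column list is abbreviated as in the original statement):

```sql
-- Hold off automatic merges while the bulk insert runs
ALTER TABLE "SCHEMA"."TABLE" DISABLE AUTOMERGE;

INSERT INTO "SCHEMA"."TABLE"
  ( SELECT "F1", "F2", ... FROM Calculation_View );

-- Trigger the delta merge for this table explicitly,
-- then hand merge control back to the system
MERGE DELTA OF "SCHEMA"."TABLE";
ALTER TABLE "SCHEMA"."TABLE" ENABLE AUTOMERGE;
```

Disabling AUTOMERGE avoids repeated intermediate merges during the load; just make sure ENABLE AUTOMERGE runs even if the insert fails, or the delta store of the table will keep growing.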
However, the 1 million resulting records are probably not the issue. It's more likely that computing those is the problem.