Hello,
Which has better performance for large volumes?

while
..
..
endwhile

or

forentity
..
..
endfor


------------------------------
Edson Gomems
Analyst
Coamo -Coop. Agropecuaria Mouraoense
------------------------------

I doubt there would be much difference between the two sets of ProcScript commands per se. If you're processing all occurrences and not making database updates, forentity would probably be better than a while loop with additional setocc commands. However, and more importantly, you don't want to use discard inside a forentity loop: it makes the next occurrence active, which can result in occurrences being skipped. Also, when making updates to a large number of records, it is usually more performant to store the updates in smaller batches.
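
As a rough sketch of the batched-update idea (the entity name ORDERS and the batch size of 1000 are made up for illustration - adapt to your own model):

forentity "ORDERS"
   ; ... modify fields of the current occurrence here ...
   if ($curocc("ORDERS") % 1000 = 0)
      store/e "ORDERS"   ; write the batch to the DBMS
      commit             ; commit so locks and rollback space stay small
   endif
endfor
store/e "ORDERS"   ; store and commit whatever is left over
commit

Check the $status after store/e and commit in real code, of course.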



------------------------------
David Akerman
Principal Solution & Enablement Architect
Rocket Internal - All Brands
------------------------------

Hi Edson,

It depends on what you'd like to do...

BUT

If you need to process a large number of records, there are a number of factors you need to consider:

Do you need to keep the records in memory? If so, and the dataset is large enough, you'll run out of memory (at least on a 32-bit Uniface version).

You can resolve this by using discard - in which case the "while" construct is probably your best bet.

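A minimal sketch of that while-plus-discard pattern (the entity name TRANS is made up; double-check the $status handling against your Uniface version):

retrieve/e "TRANS"
setocc "TRANS", 1
while ($status > 0)
   ; ... process the current occurrence here ...
   discard "TRANS"      ; drop it from memory; the next occurrence becomes current
   setocc "TRANS", 1    ; reposition on the first remaining occurrence;
                        ; $status <= 0 when none are left, ending the loop
endwhile
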
The second thing you might want to consider is using a raw sql/data statement and doing a forlist on the data returned in $result. There doesn't seem to be a limit on the size of the data returned in $result (according to the manual) - I guess the best way is to try and see... ;-)
I have done this latter part a number of times...
For Oracle "select 'transaction_id='||transaction_id,'transaction_amount='||transaction_amount etc etc"
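The retrieval step might look something like this (the table name transactions and the DBMS path "DEF" are assumptions for illustration; verify the sql/data call and $result layout against your own environment):

; assumed: an Oracle path labelled "DEF"; each returned row becomes one list item
sql/data "select 'transaction_id='||transaction_id||',transaction_amount='||transaction_amount from transactions", "DEF"
if ($status < 0)
   return -1      ; the sql statement failed
endif
list = $result    ; hand the rows to the forlist loop below
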
When I get the dataset back I've used
forlist record in list
  creocc "DUMMY", -1
  record = $replace(record, 1, ",", "<gold sep>", -1)   ; "<gold sep>" stands for the Uniface GOLD list separator
  getlistitems/occ record, "DUMMY"
endfor

Of course you can go any which direction with this as well...

HTH,
Knut



------------------------------
Knut Dybendahl
------------------------------

As said above, forentity is going to wind you up with all the occurrences in memory by the end, which for large volumes may mean excessive memory use and a slowdown. So the answer to your question depends on how much a 'large volume' is, and on how much data per record versus available memory for the thread.

However, there's a mix you should be able to try, which is something like:

forentity "ENTITY"
   ; ...
   ; ...
   if ($curocc("ENTITY") % 1000 = 0)
      discard "ENTITY", 1, $curocc("ENTITY") - 1
      setocc "ENTITY", 1
   endif
endfor

This should drop processed records from memory, reset to the last processed occurrence, and then 'endfor' on to the next one. (This is why it's important to discard only up to the current occurrence minus one, so you still have the current occurrence to 'stand on' before the endfor.)

Regards,
Iain