Maybe there is no simple answer to this question, but I'm hoping someone has, if not a simple answer, at least some insight.
I have had several cases where I write a loop that goes through many records in a database table, performing some update on each, and where I could legitimately either make one big commit at the end or commit each record as it is processed, one at a time, without creating data integrity problems.
Is there a clear case for which approach is better?
What brought this to mind is a program of mine that I recently switched from one big commit to a bunch of small commits. It was a rather long-running program (about 80 minutes), and it once died halfway through because of bad data. I fixed the problem and restarted it, but it then had to start all over again from the very beginning, when it could have just processed the previously unprocessed records.
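For concreteness, here is a minimal sketch of what I mean by the per-record commit approach. The table name `records` and the `processed` flag column are made up for illustration; the point is that a restarted run can skip rows an earlier, interrupted run already committed:

    import sqlite3

    conn = sqlite3.connect("example.db")
    cur = conn.cursor()

    # Only pick up rows that an earlier run has not already committed.
    cur.execute("SELECT id FROM records WHERE processed = 0")
    ids = [row[0] for row in cur.fetchall()]

    for record_id in ids:
        # ... perform the actual per-record update here ...
        cur.execute(
            "UPDATE records SET processed = 1 WHERE id = ?",
            (record_id,),
        )
        conn.commit()  # one commit per record: a crash loses at most one row

    conn.close()

With one big commit at the end, a crash rolls back everything and the whole 80 minutes has to be repeated.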
I noticed that when I made this change, the runtime was about the same.