I noticed the following behavior.
I have a file about 3 MB in size, containing several thousand lines. I split each line and add it to a prepared statement (about 250,000 statements in total).
What I am doing:

prepare the statement
for every row {
    addBatch()
    every 200 rows {
        executeBatch()
        clearBatch()
    }
}
at the end:
commit()
Memory usage grows to approximately 70 MB, although no out-of-memory error occurs. Is it possible to reduce the memory usage and still keep transactional behavior (roll everything back if one of the inserts fails)? I was able to reduce memory by committing after each executeBatch and clearBatch, but that would leave a partial insertion of the whole set if a later batch failed.
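For reference, here is a minimal sketch of the pattern described above: one transaction for the whole file, flushing every 200 rows with executeBatch/clearBatch, and a single commit at the end (rollback on failure). The table and column names are hypothetical placeholders, not from the original question.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertDemo {
    static final int BATCH_SIZE = 200;

    public static void insertRows(Connection conn, Iterable<String> lines) throws SQLException {
        conn.setAutoCommit(false); // one transaction for the whole file
        // Hypothetical table/column names for illustration only.
        String sql = "INSERT INTO report (line) VALUES (?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String line : lines) {
                ps.setString(1, line);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch(); // send the batch to the server
                    ps.clearBatch();   // drop the client-side buffered statements
                }
            }
            ps.executeBatch(); // flush any remaining rows
            conn.commit();     // all-or-nothing: nothing is visible until here
        } catch (SQLException e) {
            conn.rollback();   // undo the entire insert on failure
            throw e;
        }
    }
}
```

Note that clearBatch only frees the driver's client-side buffer; the rows already sent stay pending inside the open transaction on the server, so memory held there until commit is the server's concern, not the JVM's.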