I have a timed script that is getting progressively slower. It's pretty simple: all it does is read a line, check it, add it to the database, and then move on to the next line.
Here is its output, gradually deteriorating:
Record: #1,001 Memory: 1,355,360kb taking 1.84s
Record: #2,002 Memory: 1,355,192kb taking 2.12s
Record: #3,003 Memory: 1,355,192kb taking 2.39s
Record: #4,004 Memory: 1,355,192kb taking 2.65s
Record: #5,005 Memory: 1,355,200kb taking 2.94s
Record: #6,006 Memory: 1,355,376kb taking 3.28s
Record: #7,007 Memory: 1,355,176kb taking 3.56s
Record: #8,008 Memory: 1,355,408kb taking 3.81s
Record: #9,009 Memory: 1,355,464kb taking 4.07s
Record: #10,010 Memory: 1,355,392kb taking 4.32s
Record: #11,011 Memory: 1,355,352kb taking 4.63s
Record: #12,012 Memory: 1,355,376kb taking 4.90s
Record: #13,013 Memory: 1,355,200kb taking 5.14s
Record: #14,014 Memory: 1,355,184kb taking 5.43s
Record: #15,015 Memory: 1,355,344kb taking 5.72s
Unfortunately, the file is about 20 GB, so at this rate of increase I'll probably be dead by the time the whole thing is read. The code is (mostly) below, but I suspect it has something to do with fgets(), although I'm not sure what.
$handle = fopen($import_file, 'r');
while (($line = fgets($handle)) !== false)  // strict check: a line containing "0" is falsy
{
    $data = json_decode($line);
    save_record($data, $line);
}
fclose($handle);
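One thing worth ruling out first: are the reported times per batch, or the cumulative elapsed time since the script started? The numbers above grow by a roughly constant ~0.27s per 1,001 records, which is what cumulative timing of a constant-speed loop looks like. Below is a minimal, self-contained sketch of timing each batch separately with microtime(); the sample file and the no-op save_record() are stand-ins for the real 20 GB import, not part of the original script.

```php
<?php
// Stand-in for the real insert so the sketch runs on its own.
function save_record($data, $line) { /* no-op */ }

// Build a small sample file of JSON lines in place of the 20 GB import.
$import_file = tempnam(sys_get_temp_dir(), 'import');
$fh = fopen($import_file, 'w');
for ($i = 0; $i < 5000; $i++) {
    fwrite($fh, json_encode(['id' => $i]) . "\n");
}
fclose($fh);

$start = microtime(true);
$batch_start = $start;
$count = 0;

$handle = fopen($import_file, 'r');
while (($line = fgets($handle)) !== false) {
    $data = json_decode($line);
    save_record($data, $line);

    if (++$count % 1000 === 0) {
        $now = microtime(true);
        printf("Record #%d  batch: %.3fs  total: %.3fs\n",
               $count, $now - $batch_start, $now - $start);
        $batch_start = $now;  // reset so each batch is timed on its own
    }
}
fclose($handle);
unlink($import_file);
```

If the "batch" column stays flat while "total" keeps climbing, the loop itself isn't slowing down at all, and the numbers in the log are just accumulated wall-clock time.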
Thanks in advance!
EDIT:
Commenting out save_record($data, $line); seems to make no difference.