You can use this function to iterate over a huge number of objects while keeping memory usage bounded: it first fetches only the primary keys, then loads the objects in batches:
```python
import gc

def queryset_iterator(qs, batchsize=500, gc_collect=True):
    # Fetch only the primary keys, ordered, so we can page through them.
    iterator = qs.values_list('pk', flat=True).order_by('pk').distinct().iterator()
    eof = False
    while not eof:
        primary_key_buffer = []
        try:
            # Collect up to `batchsize` primary keys.
            while len(primary_key_buffer) < batchsize:
                primary_key_buffer.append(next(iterator))
        except StopIteration:
            eof = True
        # Load and yield only this batch of full objects.
        for obj in qs.filter(pk__in=primary_key_buffer).order_by('pk').iterator():
            yield obj
        if gc_collect:
            gc.collect()
```
Then you can use the function to iterate over the objects and delete them one by one:
```python
for obj in queryset_iterator(HugeQueryset.objects.all()):
    obj.delete()
```
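The same batching pattern can be sketched without Django, which makes the core idea easier to see and test: collect keys into fixed-size chunks, then materialize only one chunk of objects at a time. Here `fetch_batch` is a hypothetical stand-in for the `qs.filter(pk__in=...)` query.

```python
def batched(iterable, batchsize=500):
    """Yield lists of up to `batchsize` items from `iterable`."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == batchsize:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, batch
        yield batch

def queryset_iterator_sketch(pks, fetch_batch, batchsize=500):
    """Generic version of the pattern above: `pks` is any iterable of keys,
    `fetch_batch` loads the full objects for one batch of keys (it stands in
    for the Django pk__in query and is an assumption of this sketch)."""
    for pk_batch in batched(pks, batchsize):
        for obj in fetch_batch(pk_batch):
            yield obj
```

Only one batch of full objects is ever alive at a time, which is what keeps the memory footprint flat regardless of the total number of rows.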
For more information, you can check out this blog post.