Django ORM Limit QuerySet By Using Start Limit

I am currently building a Django project that deals with a result set of about 20k rows.

All of this data is returned as JSON, which I parse for use in a template.

Currently, I am using objects.all() from the Django ORM.

I would like to know whether I can fetch the complete result set in parts. Say, if the result is 10k rows, split it into chunks of 2k rows each.

My approach would be to lazy-load the data, using a limit variable incremented by 2k at a time.

I would like to know whether this approach is feasible; any help in this regard is appreciated.
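The offset/limit idea described above can be sketched in plain Python. With a Django QuerySet, slicing like `qs[offset:offset + chunk_size]` compiles to SQL `LIMIT`/`OFFSET`; here `fetch` is a hypothetical stand-in for that slice so the loop logic is visible on its own:

```python
def fetch_in_chunks(fetch, chunk_size=2000):
    """Yield all rows by repeatedly fetching fixed-size offset/limit slices.

    `fetch(start, stop)` is an illustrative callable standing in for a
    QuerySet slice such as qs[start:stop].
    """
    offset = 0
    while True:
        chunk = fetch(offset, offset + chunk_size)
        if not chunk:  # empty slice means we ran past the last row
            break
        yield from chunk
        offset += chunk_size
```

This is only a sketch of the questioner's approach; note that each slice issues a separate query, and `OFFSET` gets slower as it grows on most databases.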

1 answer

  • answered 2020-09-30 20:23 Willem Van Onsem

    Yes, you can make use of .iterator(…) [Django-doc]. For example:

    for obj in MyModel.objects.all().iterator(chunk_size=2000):
        # … do something with obj

    This will fetch the records in chunks of 2,000 in this case. If you set the chunk_size higher, it will fetch more records per query, but you will also need more memory to hold all those records at a given moment. Setting the chunk_size lower results in less memory usage, but more queries to the database.

    You might, however, be interested in pagination [Django-doc] instead. In that case the request contains a page number, and you return only a limited number of records. This is often better, since not all clients necessarily need all the data; furthermore, the client often needs to process the data itself, and if the chunks are too large, the client can get flooded as well.
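    The core arithmetic behind pagination is just slicing by page number. Django ships this as django.core.paginator.Paginator (which adds validation, page counts, etc.); this plain-Python sketch, with illustrative names, shows the slice it performs:

    ```python
    def get_page(items, page_number, per_page=2000):
        """Return the slice of `items` for a 1-based page number.

        Illustrative stand-in for Paginator(items, per_page).page(page_number);
        with a QuerySet, the slice becomes a LIMIT/OFFSET query.
        """
        start = (page_number - 1) * per_page
        return items[start:start + per_page]
    ```

    In a view you would read the page number from the request (e.g. a `?page=` query parameter) and return only that page's records.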