When you query a very large number of database records, say billions of rows, you will quickly run into application performance issues while processing them. Fortunately, Laravel ships with a couple of features you can use to keep memory usage under control and speed up processing.
Chunking Results
The chunk method will retrieve a "chunk" of Eloquent models, feeding them to a given Closure for processing. Using the chunk method will conserve memory when working with large result sets:
Product::chunk(200, function ($products) {
    foreach ($products as $product) {
        //
    }
});
The first argument passed to the method is the number of records you wish to receive per "chunk". The Closure passed as the second argument will be called for each chunk that is retrieved from the database. A database query will be executed to retrieve each chunk of records passed to the Closure.
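For example, here is a minimal sketch of chunking combined with a query constraint (the App\Product namespace and the active and reviewed columns are assumptions, purely for illustration):

use App\Product;

// Process only active products, 200 at a time. Each chunk runs its
// own query, so memory stays bounded by the chunk size.
Product::where('active', 1)->chunk(200, function ($products) {
    foreach ($products as $product) {
        // Hypothetical per-record work.
        $product->update(['reviewed' => true]);
    }
});

One caveat worth knowing: if the Closure updates the very column you are filtering the chunks by, records can be silently skipped between chunks; Laravel's chunkById method is the usual remedy in that situation.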
Using Cursors
The cursor method allows you to iterate through your database records using a cursor, which will only execute a single query. When processing large amounts of data, the cursor method may be used to greatly reduce your memory usage:
foreach (Product::where('foo', 'bar')->cursor() as $product) {
    //
}
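To make the memory saving concrete, here is a minimal sketch that streams the same query into a CSV file (the App\Product namespace and the exported columns are assumptions):

use App\Product;

$handle = fopen('products.csv', 'w');

// One query is executed; Eloquent hydrates a single Product model at
// a time instead of loading the whole result set into a Collection.
foreach (Product::where('foo', 'bar')->cursor() as $product) {
    fputcsv($handle, [$product->id, $product->name]);
}

fclose($handle);

The trade-off versus chunk is that cursor keeps one query's result set open for the entire iteration, while chunk issues a fresh, short query per batch, which can be friendlier to long-running jobs.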
If you have any other questions, experiences, or insights on "Working with Large amounts of Database rows with Laravel", please feel free to leave your thoughts in the comments below; they might be helpful to someone!