@Braunson
Created August 7, 2024 01:32
Technical Challenge Investigation Breakdown - Optimizing a complex query in Laravel


The Issue

In one of my recent projects, I encountered a significant performance issue in an existing Laravel application, which some digging traced to a slow-running query. The application relied heavily on a query that fetched data from multiple related tables, causing delays and affecting user experience.

The Challenge

The challenge was to find the offending queries and optimize them to improve performance without affecting the accuracy or integrity of the data.

Approach

Analysis

Prior light digging had already narrowed the performance issue down to slow queries; the next step was to identify which specific queries were responsible.

Since I was able to run this project locally, I leveraged the Clockwork package (with its Chrome extension) to display which queries were running on a given page and how long each took to complete on the server side.

With the offending queries identified, the next step was to look at why they were slow and, from there, identify potential solutions.

We used MySQL's EXPLAIN statement to see the number of rows scanned and checked the Extra column for any specific information (such as "Using filesort" or "Using temporary").
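As a sketch of that step, EXPLAIN can be run directly through Laravel's DB facade. The query and table names below are illustrative, not the project's actual schema:

```php
use Illuminate\Support\Facades\DB;

// Run EXPLAIN against the suspect query (hypothetical tables/columns).
$plan = DB::select(
    'EXPLAIN SELECT users.*, profiles.bio
     FROM users
     JOIN profiles ON profiles.user_id = users.id
     WHERE users.active = 1'
);

foreach ($plan as $row) {
    // "rows" estimates how many rows MySQL expects to examine per table;
    // "Extra" flags costly operations like "Using filesort" or
    // "Using temporary". A full table scan shows type = ALL.
    logger()->info('EXPLAIN row', (array) $row);
}
```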

Optimization Strategies

Optimizing the table

Looking over the table, we noticed there were missing indexes. Adding them wouldn't solve the slowdown on its own, but it would certainly be part of the solution.
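A minimal sketch of a migration adding such indexes, assuming hypothetical `users`/`profiles` tables with an `active` flag and `user_id` foreign key (index whatever EXPLAIN showed was being scanned without one):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('users', function (Blueprint $table) {
            // Illustrative: index the column used in the WHERE clause.
            $table->index('active');
        });

        Schema::table('profiles', function (Blueprint $table) {
            // Illustrative: index the join key.
            $table->index('user_id');
        });
    }

    public function down(): void
    {
        Schema::table('users', fn (Blueprint $table) => $table->dropIndex(['active']));
        Schema::table('profiles', fn (Blueprint $table) => $table->dropIndex(['user_id']));
    }
};
```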

Optimizing the query

We now needed to refine the query to be more performant. Asking myself the questions:

  • Are there any unnecessary joins?
  • Are there any improper joins?
  • Are we selecting only the columns we need?
  • Can we cache this data?
  • Can we leverage eager loading?

The answer was YES to many of the above. We could remove an unnecessary join, select only the columns we needed instead of every one (`*`), and cache the result of the query for a specific interval because the data only changed once daily. We were also able to add eager loading.
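A sketch of what the refined query might look like after those changes. The model, column, and relationship names are illustrative, not the project's actual code:

```php
use App\Models\User; // hypothetical model

// Select only the needed columns and eager load the relationship,
// avoiding both SELECT * and N+1 lazy-loading queries. Note the
// foreign key (user_id) must be included when constraining the
// eager-loaded columns, or Laravel can't match the models.
$users = User::query()
    ->select(['id', 'name', 'email'])
    ->where('active', 1)
    ->with('profile:id,user_id,bio')
    ->get();
```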

After making the optimizations to both the table and the query, in the form of a Laravel migration to add indexes and an update to the affected query, a quick re-test with Clockwork showed we had cut the query's response time to a quarter of what it was.

In addition to the speed boost, the query now runs once and is cached early each morning (Redis + Scheduler). The user is no longer affected by this slow query, and the system uses fewer resources on every page load.
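A minimal sketch of that caching setup, assuming a Redis-backed cache store and hypothetical key, TTL, and query:

```php
use Illuminate\Support\Facades\Cache;
use App\Models\User; // hypothetical model

// Wherever the data is read: serve from cache, falling back to the
// query on a miss. Key name and 24-hour TTL are illustrative.
$users = Cache::remember('active-users-with-profiles', now()->addDay(), function () {
    return User::query()
        ->select(['id', 'name', 'email'])
        ->where('active', 1)
        ->with('profile:id,user_id,bio')
        ->get();
});

// In app/Console/Kernel.php: re-warm the cache before users arrive,
// since the underlying data only changes once a day.
// $schedule->call(function () {
//     Cache::forget('active-users-with-profiles');
//     // ...invoke the same cached read to repopulate it
// })->dailyAt('05:00');
```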

Testing

We could even go as far as to write a performance test to confirm the execution time against a large dataset stays under a given threshold. This depends on which DB connection the test suite uses, but let's assume the connection is the same as production (MySQL). While we will want to test both small and large datasets, here's an example of a performance test for a larger dataset.

public function testOptimizedQueryPerformanceInALargeDataset()
{
    // Arrange: seed a large dataset; give each user a profile
    // (hypothetical Profile factory) so the join returns rows
    User::factory()
        ->count(10000)
        ->has(Profile::factory())
        ->create();

    // Act: measure the execution time of the optimized query
    $startTime = microtime(true);

    $results = \DB::table('users')
                  ->join('profiles', 'users.id', '=', 'profiles.user_id')
                  ->where('users.active', 1)
                  ->select('users.*', 'profiles.bio')
                  ->get();

    $executionTime = microtime(true) - $startTime;

    // Assert: check that execution time meets performance criteria
    // Example threshold: 1 second (adjust as needed)
    $this->assertLessThan(1, $executionTime, 'The optimized query is too slow.');
}

Result

We've achieved a significant reduction in query execution time which has improved the overall application performance and user satisfaction.

There are other ways we could further optimize this page, but the focus of this example was a slow, complex MySQL query.
