How to Resolve Performance Issues Caused by Pagination on Large Joomla Websites

Warning: The solution presented in this post consists of a core modification, which means a future Joomla update may wipe it out. You have been warned!

If you check the MySQL slow query log of a large Joomla website, you will probably see something like the below query:

SELECT a.id, a.title, a.alias, a.introtext, a.fulltext, a.checked_out, a.checked_out_time, a.catid, a.created, a.created_by, a.created_by_alias, CASE WHEN a.modified = '0000-00-00 00:00:00' THEN a.created ELSE a.modified END as modified, a.modified_by, uam.name as modified_by_name,CASE WHEN a.publish_up = '0000-00-00 00:00:00' THEN a.created ELSE a.publish_up END as publish_up,a.publish_down, a.images, a.urls, a.attribs, a.metadata, a.metakey, a.metadesc, a.access, a.hits, a.xreference, a.featured, a.language, LENGTH(a.fulltext) AS readmore,CASE WHEN badcats.id is not null THEN 0 ELSE a.state END AS state,c.title AS category_title, c.path AS category_route, c.access AS category_access, c.alias AS category_alias,CASE WHEN a.created_by_alias > ' ' THEN a.created_by_alias ELSE ua.name END AS author,ua.email AS author_email,parent.title as parent_title, parent.id as parent_id, parent.path as parent_route, parent.alias as parent_alias,ROUND(v.rating_sum / v.rating_count, 0) AS rating, v.rating_count as rating_count,c.published, CASE WHEN badcats.id is null THEN c.published ELSE 0 END AS parents_published
FROM #__content AS a
LEFT JOIN #__content_frontpage AS fp ON fp.content_id = a.id
LEFT JOIN #__categories AS c ON c.id = a.catid
LEFT JOIN #__users AS ua ON ua.id = a.created_by
LEFT JOIN #__users AS uam ON uam.id = a.modified_by
LEFT JOIN #__categories as parent ON parent.id = c.parent_id
LEFT JOIN #__content_rating AS v ON a.id = v.content_id
LEFT OUTER JOIN (SELECT cat.id as id FROM #__categories AS cat JOIN #__categories AS parent ON cat.lft BETWEEN parent.lft AND parent.rgt WHERE parent.extension = 'com_content' AND parent.published != 1 GROUP BY cat.id ) AS badcats ON badcats.id = c.id
WHERE a.access IN (1,1) AND c.access IN (1,1) AND CASE WHEN badcats.id is null THEN a.state ELSE 0 END = 1 AND (a.publish_up = '0000-00-00 00:00:00' OR a.publish_up <= '2016-03-13 13:01:11') AND (a.publish_down = '0000-00-00 00:00:00' OR a.publish_down >= '2016-03-13 13:01:11')
ORDER BY c.lft, a.featured DESC, fp.ordering, CASE WHEN a.publish_up = '0000-00-00 00:00:00' THEN a.created ELSE a.publish_up END DESC , a.created LIMIT 34680, 20;

We explained how to optimize most of this query a long time ago, but there is still one part of it that we didn't address: the LIMIT 34680, 20 part, which tells MySQL to skip the first 34,680 rows of the result set and return the next 20 (rows 34,681 through 34,700) for pagination purposes.

The problem with LIMIT 34680, 20 is that it is extremely slow: MySQL still has to generate and sort the 34,680 skipped rows before it can discard them and return the 20 rows you actually asked for, which by itself can cause a delay of 5 to 6 seconds.

So, what’s the solution?

Well, LIMIT 34680, 20 means that we are trying to fetch the data for page 1,735 (since we are listing 20 results per page, that's 34680/20 + 1), which is simply excessive. Even Google, with all its processing power, doesn't do that. If you don't believe it, try visiting the following link: https://www.google.ca/?gws_rd=ssl#q=test&start=3000, and you will see the below message:

Sorry, Google does not serve more than 1000 results for any query. (You asked for results starting from 3000.)

So, Google, the most important website in the world, restricts its search results to 1,000 (or 100 pages, since it displays 10 results per page), yet we are trying to paginate through more than 34,000 results on the Joomla website, which doesn't make much sense.
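For reference, here is a minimal PHP sketch of the offset-to-page arithmetic used above, assuming 20 results per page (the numbers are simply taken from the slow query):

    $perPage = 20;
    $offset  = 34680; // the offset from the slow query above

    $page = (int) ($offset / $perPage) + 1;   // 1735
    $backToOffset = ($page - 1) * $perPage;   // 34680 again

    echo "LIMIT $offset, $perPage corresponds to page $page";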

So, we discussed the issue with the client and convinced him that it's a much better idea (for performance reasons) to restrict pagination to 100 pages (2,000 results, which is 1,000 more than what Google displays). The client immediately agreed, so we implemented the fix the following way:

  • We opened the file index.php located under the main directory of the website.
  • We added the following code at the very beginning of the file:

    // Reject any pagination request past 100 pages (100 pages x 20 results = offset 2,000).
    $myStart = isset($_GET['start']) ? (int) $_GET['start'] : 0;
    if ($myStart > 2000)
    	die('Pagination Limit Exceeded');

  • That’s it. The problem was solved! No more slow queries caused by excessive pagination! (A slightly more defensive variant of the code above is sketched right after this list.)
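If you want to be a bit more defensive, the sketch below also covers Joomla's limitstart parameter (which is how the offset arrives on non-SEF URLs) and returns a 404 status code so that search engines don't index the error message. Treat it as a sketch built on those assumptions rather than as part of the original fix:

    // The pagination offset arrives as "start" on SEF URLs and as "limitstart"
    // on non-SEF URLs (assumption: your site may serve either kind of URL).
    $myStart = 0;
    if (isset($_GET['start'])) {
        $myStart = (int) $_GET['start'];
    } elseif (isset($_GET['limitstart'])) {
        $myStart = (int) $_GET['limitstart'];
    }

    if ($myStart > 2000) {
        header('HTTP/1.1 404 Not Found'); // keep crawlers from indexing this response
        die('Pagination Limit Exceeded');
    }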

But wouldn’t the above cause a problem when indexing older articles?

Usually, humans (unless they are really, really bored) never browse past page 10; only bots do, and they do it to index older articles. So if you disallow pagination past page 100, you might run into indexing issues. To avoid that, make sure you have a valid and up-to-date XML sitemap of your website, so that search engines can discover older articles without paging through your listings.

Isn’t there a better solution?

Well, if you have contiguous IDs on your Joomla website, then the answer is yes: there is a better solution that allows you to have all the pagination you want. It consists of replacing the large offset with a condition such as WHERE id > n LIMIT 0, 20 (where n is the id of the last article on the previous page, playing the role of the 34680 offset in the query above). However, on large Joomla websites it's almost impossible to have contiguous IDs, since deleted articles (among other things) leave gaps in the id sequence.
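To make that concrete, here is a minimal sketch of such "seek" (keyset) pagination using Joomla's database API. The $lastId variable is hypothetical (it stands for the id of the last article shown on the previous page), and the column list and WHERE clause are trimmed down for brevity:

    // Seek/keyset pagination: instead of a growing OFFSET, start from the last
    // id we have already shown. $lastId is a hypothetical variable holding the
    // id of the last article on the previous page.
    $db    = JFactory::getDbo();
    $query = $db->getQuery(true)
        ->select('a.id, a.title, a.alias, a.introtext')
        ->from('#__content AS a')
        ->where('a.state = 1')
        ->where('a.id > ' . (int) $lastId)
        ->order('a.id ASC');

    $db->setQuery($query, 0, 20); // LIMIT 0, 20 -- the offset never grows
    $articles = $db->loadObjectList();

Note that this only works cleanly when the listing is ordered by the column you are seeking on (here a.id), which is not the ordering used by the query above, so it is not a drop-in replacement.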

If you have a large website, we really recommend implementing the above solution. It will definitely lessen the load on your server and will make your Joomla website work a lot faster. If you’re afraid of the implementation, then please contact us. We will do it for you quickly, professionally, and for a very reasonable fee!
