Make your website lightning fast using caching
Thijs Feryn explains why caching can make your websites faster, how it works and which tools are available.
Which factors influence the speed of your website?
Fast websites are a necessity: Internet users are increasingly impatient and abandon a website at the slightest delay, which plays straight into the hands of your competitors. Google also takes this factor into account by rewarding fast websites with a higher ranking in search results.
However, creating a fast website is not as easy as it might seem, because there is no one-size-fits-all solution. Caching, though, can provide a solution and ensure optimal performance of your website. Caching stores static data so that it does not have to be recalculated each time someone visits your website. A typical example is a product page: pricing information and product specifications do not have to be retrieved from the database on every visit, as they can be stored in the cache instead.
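The underlying idea is the cache-aside pattern: look in the cache first, and only compute (or query) when the item is missing. A minimal sketch in Python, with a hypothetical product lookup standing in for the real database query:

```python
import time

# A simple in-memory cache. In production this would typically be
# Redis or Memcached rather than a local dict.
cache = {}

def fetch_product_from_database(product_id):
    """Hypothetical expensive database lookup."""
    time.sleep(0.05)  # simulate query latency
    return {"id": product_id, "price": 19.99, "specs": "500 g, organic"}

def get_product(product_id):
    """Cache-aside: return the cached copy if present; otherwise
    compute it once and store it for subsequent visitors."""
    key = f"product:{product_id}"
    if key not in cache:
        cache[key] = fetch_product_from_database(product_id)
    return cache[key]
```

The first call pays the full database cost; every later call for the same product is served straight from memory.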
What causes this delay?
The main cause of delay, however, is the combination of the code and the infrastructure it runs on. That makes sense: this is where most calculations are made and most processes are carried out.
For each visit, an HTTP request must be accepted and passed on to the code. The code often needs to query a database, and the result of that query must then be processed. This eventually produces an HTTP response that the server sends back to the browser.
This sequence of actions can be very time-consuming and resource intensive. It may even lead to a cascade effect, especially when the number of simultaneous visitors is high. That is where caching becomes a very effective tool, as it reduces the frequency of these calculations.
Making and maintaining your website faster
Optimising your code remains important, but even optimised code cannot always guarantee a high-performance website. Caching can still help in many processes and calculations.
In reality, optimisation is often carried out until a critical point is reached where the cost/benefit ratio is satisfactory: the necessary investments no longer correspond to the desired results in terms of time, budget and expertise. Caching is therefore not just a quick temporary fix, but a long-term vision that is an essential part of both strategy and architecture.
“After all, why would you want to constantly recalculate the output when the data is static?”
Caching can make slow websites fast, and we are then talking about performance. But it can also keep fast websites fast when under increasing pressure from a large number of visitors. In that case, we are talking about scalability.
How does caching actually work?
The answer to this question depends on what exactly you are trying to cache. Ideally, that is the final result of all the calculations, i.e. the entire page, which reduces the load on the entire server. Taking the product page example again, that means the complete output of the page. If this is not possible, you can still cache the output of the data processes.
That second approach happens at the code level and is intended to reduce the load on the database. Going back to the product page example: this would be the database output containing pricing information and product specifications.
You can use different technologies depending on the choice you make. Varnish is one such technology: it stores entire pages and is designed to reduce the load on the entire server. Cached items are stored in RAM, which makes lookups lightning fast, and items are identified by their URL. Varnish is a standalone system that sits in front of the web server and communicates via HTTP.
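Varnish itself is configured in its own VCL language and speaks HTTP, but the core idea of URL-based identification can be illustrated language-agnostically. A sketch in Python, with hypothetical function names, of serving a whole page from a URL-keyed cache:

```python
# Full-page cache keyed by URL, mimicking how a proxy cache like
# Varnish identifies cached responses.
page_cache = {}

def render_page(url):
    """Hypothetical, expensive full-page rendering on the backend."""
    return f"<html><body>Content for {url}</body></html>"

def handle_request(url):
    """On a hit, the whole response comes from cache and the
    backend is not touched at all; on a miss, render and store."""
    if url in page_cache:
        return page_cache[url], "HIT"
    body = render_page(url)
    page_cache[url] = body
    return body, "MISS"
```

Only the first visitor to a given URL triggers the backend; everyone else gets the stored copy.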
Technologies such as Redis and Memcached are better suited to caching the output of data processes and operate from the backend. Both are distributed caches: simply put, remote RAM that you can access over the network. They do not perform complex calculations and store data in key-value format.
Redis supports multiple data types; Memcached does not, but despite its simplicity it can still be very efficient and very stable. To choose between the two, first decide whether you need the data types Redis offers, but above all consider compatibility with your code.
Data can be stored in a cache, but also needs to be removed from it!
Regardless of the chosen technology, you will have to decide how long you want to keep items in the cache and how you will remove them when the underlying data changes. Caching too many items for too long is worse than caching too few.
The duration of caching is determined using what is known as the Time To Live (TTL) value. Redis and Memcached let you specify this value through the operations they provide for adding and managing data. With Varnish, this is done by means of the Cache-Control header, which your application code sends along over HTTP. Items can be removed from the cache in Redis, Varnish and Memcached using the methods they offer, but fortunately there is a wide variety of CMS and framework plug-ins that can facilitate this. It may sound ironic, but storing items in a cache is easier than removing them, so think carefully about this.
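Both mechanisms, TTL expiry and explicit removal, can be sketched together. A minimal illustration in Python, assuming a simple local store rather than a real Redis or Memcached client:

```python
import time

class TTLCache:
    """Minimal cache with a Time To Live per item and explicit removal."""

    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value, ttl_seconds):
        """Store an item that expires ttl_seconds from now."""
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        """Return the value, or None if missing or expired."""
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: behave like a miss
            return None
        return value

    def delete(self, key):
        """Explicit invalidation, e.g. when a product price changes."""
        self._store.pop(key, None)
```

In Redis the equivalent operations are `SET key value EX seconds`, `GET` and `DEL`; Memcached offers set/get/delete with an expiry time in the same spirit.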
Many web hosting packages allow you to activate caching via your control panel. If you have a managed hosting environment, your account manager will be happy to advise you on the most suitable caching option.