A fast-loading site can make or break your success, especially if you build sites for clients or run an online business. Hence, the importance of getting cached up. Find out just how important web caching is to WordPress in this post.
Keeping your website running at blazing-fast speeds requires a superheroic effort.
That’s why we employ superheroes like Hummingbird and Smush to keep your site and your images optimized.
It’s also why we publish loads of content on ways to improve WordPress performance and boost your site’s speed.
This post explains how to improve your site performance and get that super boost of speed through the use of web caching.
By the time you’re done reading this, you’ll know:
- What a cache is
- Different types of web caching available
- Downsides to web caching
- How WPMU DEV can maximize your web cache with minimum cash
- …and even how to say ‘cache’ properly (it’s pronounced “cash,” not cayche, cash-ay, or quiche).
- What is a Cache?
- Why Use Caching?
- Different Cache Types and When to Use Caching
- Client-side Caching vs. Server-side Caching
- WordPress Caching
- Persistent Object Caching
- Are There Any Downsides To Caching?
- Get Maximum Cache For Minimum Cash
- Time To Get Cached Up
What is a Cache?
A cache is a temporary way to store frequently accessed data. It’s a way of storing reusable responses to speed up subsequent requests.
When users make subsequent requests for cached content, these requests are then fulfilled from a cache instead of going all the way back to the server, where the content is originally stored.
Put simply, when someone visits your site, a cache takes a static snapshot of your page, then stores and serves that snapshot to your visitor.
This allows your information to be delivered quickly, instead of making your visitors wait for various requests to be made, which would slow everything down.
Effectively, caching provides an easier and faster way to deliver stored data to users than fetching it from wherever the original data lives.
So, to put this in context…
Here’s how a web page loads:
1. An online user clicks on a link to your website (from a search result, another website, social media post, email, etc.)
2. The user’s browser sends a request to your server (called an HTTP request).
3. Your server compiles and delivers all the files required to display the website in the user’s browser (every image, script, stylesheet, etc. that has to be compiled adds time to this request).
4. The user is eventually served a complete, fully loaded website via their browser.
Now, let’s compare this to what happens when caching is enabled for your website:
1. An online user clicks on a link to your website (from a search result, another website, social media post, email, etc.)
2. The user’s browser sends an HTTP/HTTPS request to your server.
3. The server detects that your content has not changed since someone last visited your website.
4. The server grabs a static copy of your website that’s stored in its cache and delivers it quickly to the user’s web browser.
5. The server repeats step 4 for all subsequent visits until the content on a page changes or the cache expires and is automatically purged.
Compiling every element of a requested page takes longer than just serving a stored copy of that page. Hence, caching can reduce processing time and deliver pages faster.
Why Use Caching?
The main purpose of caching is speed.
Caching is one of the few optimizations that directly improves your page speed score.
Caching = faster loading pages = better page speed score.
There is a trade-off, however: to save time and speed things up, caching requires storage.
Less computing time to deliver pages faster means more server space to store the cached page data.
Although storage keeps getting cheaper, there is no free cache!
Caching sounds like a simple concept (storing data for later reuse) but, in fact, it’s quite complex: how it behaves depends on variables like server configuration, available memory, and the scale of the operation.
For example, large multinational telcos may employ caching to reduce infrastructure costs like bandwidth, while small business websites will use caching to improve their page-loading speed.
Another aspect that complicates caching is knowing when stored results are still valid for subsequent use.
As software engineer Yihui Xie states in his blog, “when things become different, you have to invalidate the cache, and do the (presumably time-consuming) computing again.”
Hence, this often-quoted quip by Phil Karlton…
“There are only two hard things in Computer Science: cache invalidation and naming things.”
Given the complex nature of caching, this post will focus on the methods used in web caching only.
Web caching is a core design feature of the HTTP protocol. With web caching, rules and policies govern how HTTP responses to requests are stored using different cache types.
Cache-control directives within these policies control who can cache the response, under which conditions, and for how long.
If you need help understanding basic web caching terminology, see this article.
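To make those directives concrete, here’s a minimal sketch in Python of how a Cache-Control header breaks down into individual directives (the header value is just an illustrative example):

```python
def parse_cache_control(header: str) -> dict:
    """Split a Cache-Control header into a dictionary of directives."""
    directives = {}
    for part in header.split(","):
        part = part.strip()
        if "=" in part:
            # Directives like max-age=3600 carry a value.
            name, _, value = part.partition("=")
            directives[name.lower()] = value.strip('"')
        else:
            # Bare directives like public or must-revalidate are flags.
            directives[part.lower()] = True
    return directives

# Public caches may store this response and reuse it for one hour:
policy = parse_cache_control("public, max-age=3600, must-revalidate")
```

Here, `max-age` tells caches how long (in seconds) the response stays fresh, while `public` says any cache, not just the user’s browser, may store it.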
Different Cache Types and When to Use Caching
As content travels from the origin server (i.e. where your content is originally located) through to the web browsers of your visitors, different types of caches can be used to improve and speed up the journey.
Effective caching results in benefits to both content providers and content consumers. Some of the benefits include:
- Decreased network costs.
- Improved responsiveness.
- More performance squeezed from the same hardware.
- Content availability for end-users during periods of interrupted network service.
Here are some of the best types of data that work well with web caching:
- Data that can be reused throughout the session or reused across all users and requests.
- Static data (not constantly or rapidly changing).
- Data that can be expensive to compute or retrieve.
Types of data you DON’T want to cache include security information like bank account details, logins, passwords, etc.
Before we get into ways to use caching to speed up your site, let’s look at different types of caching and their unique characteristics:
Client-side Caching vs. Server-side Caching
At its most basic form, the World Wide Web works like this: you store content on a server and allow users (i.e. clients) to access it from their web browsers.
This means that caching can occur either at the server’s end (called ‘server-side’ caching) or the client’s end (called ‘client-side’ caching).
Client-side caching takes place on your visitors’ browsers. When someone visits your site, their browser creates and stores the cache.
The next time they visit your site, things will load faster because it’s reading information from their browser cache.
Client-side caching is efficient because it allows browsers to access files without reloading these from the server.
While this may be fine for small websites, if you have many concurrent users, you’ll want to look at using server-side caching.
Since client-side caching takes place on your visitors’ browsers, you have no control over it. But you can control what happens on your server.
Something you can control, for example, is whether to enable caching on your server or not.
Let’s say that a visitor arrives on your site and caching is not enabled. Their browser will send a request for the page to your server.
Your server must then process this request, compile the page, and then send it back to the browser. All of these processes take time and use up server resources.
While this may work just fine for small sites with low traffic, it can spell disaster for larger sites with higher traffic volumes that have to continually process loads of requests and compile loads of pages every second.
This is where server-side caching comes in useful.
The server stores a copy of the response to every request it serves. The next time your server receives the same request, it checks the cache; if a copy is stored there, it serves it straight from the cache.
If there’s no copy, the request is processed and compiled as usual, and as the response is sent back to the visitor’s browser, a copy is stored in the cache for subsequent requests.
This reduces server load, allows the server to handle more traffic than it normally would, delivers your content faster to users, and reduces their wait time.
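The whole server-side flow boils down to a check-then-store pattern. Here’s a minimal Python sketch (the page renderer and the five-minute TTL are assumptions for illustration, not a real server implementation):

```python
import time

page_cache = {}  # URL -> (rendered_html, stored_at)
CACHE_TTL = 300  # seconds a cached page stays fresh (assumed value)

def render_page(url: str) -> str:
    """Stand-in for the expensive work of compiling a page."""
    return f"<html>content for {url}</html>"

def serve(url: str) -> str:
    cached = page_cache.get(url)
    if cached and time.time() - cached[1] < CACHE_TTL:
        return cached[0]                   # cache hit: skip the rebuild
    html = render_page(url)                # cache miss: compile the page
    page_cache[url] = (html, time.time())  # store a copy for next time
    return html
```

Only the first visitor pays the rendering cost; everyone else gets the stored copy until the TTL runs out.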
Server-side caching is a cache that you (or, most likely, your web host) sets up.
The kind of cache you should choose will depend on your needs. There are different caching options and types to consider.
- Mobile caching
- User caching
- OpCode caching
- Microcaching
- Edge caching (e.g. CDNs)
- Object caching
Let’s go briefly through each of these different caching types:
Mobile caching works just like regular caching, but it’s caching for mobile applications and devices.
The mobile application asks a server for something and records the answer in a dedicated cache file for mobile users, along with the question and some metadata (like when the question was asked and how long to keep the answer).
The next time the mobile application wants to ask the server a question, it checks to see if the server already knows the answer, and if it does and the answer is still fresh enough, then it serves users the answer from its cache.
Otherwise, it just asks the server again. If the mobile application can’t get an answer (e.g. there’s no network connection) it will deliver users stale answers from its cache.
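The ask-check-fetch cycle above can be sketched in a few lines of Python (a toy model, not any real mobile framework’s cache):

```python
import time

class MobileCache:
    """Toy sketch of an answer cache with freshness and stale fallback."""
    def __init__(self, ttl: float):
        self.ttl = ttl
        self.store = {}  # question -> (answer, asked_at)

    def ask(self, question: str, fetch, network_up: bool = True):
        entry = self.store.get(question)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]  # answer is still fresh: serve it from cache
        if not network_up:
            # No connection: serve a stale answer if we have one at all.
            return entry[0] if entry else None
        answer = fetch(question)  # ask the server again
        self.store[question] = (answer, time.time())
        return answer
```

The key detail is the offline branch: a stale answer from the cache is usually better than no answer at all.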
A user cache creates a dedicated set of cache files for every logged-in user.
User caches are useful when you have user-specific content on your website, such as if you’re providing membership functionality or allowing different users to access different content or features.
With user caching, your site creates a separate cache for each logged-in user and another set of cache files for non-logged-in visitors, so visitors won’t have access to your members-only content.
If you plan to use user caching, be aware that some services, like Cloudflare, have the option to “cache everything.” This can cause issues, as the cache won’t distinguish between your logged-in and logged-out users.
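One simple way to keep those caches separate is to build the user into the cache key itself. A hypothetical sketch in Python (real caching plugins handle this for you):

```python
def cache_key(path: str, user_id=None) -> str:
    """Build a cache key that keeps logged-in users apart from visitors.
    (Hypothetical helper for illustration only.)"""
    if user_id is None:
        return f"anon:{path}"        # one shared cache for all visitors
    return f"user:{user_id}:{path}"  # a dedicated cache per logged-in user
```

Because each logged-in user gets their own key, one member’s personalized page can never be served to another member or to an anonymous visitor.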
OpCode caches are a performance-enhancing extension for PHP. OpCode caching stores compiled PHP bytecode between requests.
Each time a PHP script is requested, the server checks whether a compiled version already exists in the cache. If it doesn’t, the script is compiled and the result is stored in the cache.
It’s saved for the next user(s) who request the content. If it’s already cached, the precompiled code runs straight from memory.
OpCode caching can improve the performance of medium to large sites and should always be used in production environments.
Microcaching is a variation of full-page caching but only caches a static copy of dynamically-generated content for a very short period of time–between 1-10 seconds.
The only practical setting where microcaching should be considered is for highly-trafficked sites that feature rapidly-changing public content like real-time stock prices, breaking news, sports scores, etc.
Microcaching is not worth using if your site doesn’t have enough users hitting your servers with the same requests within a very short timeframe.
Edge caching (e.g. Content Delivery Networks or CDN) refers to the use of caching servers to store content closer to the end-users.
Let’s say your site is hosted on a server in Los Angeles. If a user in Johannesburg, South Africa, visits your site, their page request will have to travel over 10,000 miles (16,000 km) to reach your web server, and then travel back the same distance to deliver the page to their browser.
That’s not quite like going to the moon and back, but it’s pretty far and can take a long time.
The delay before a transfer of data begins following an instruction for its transfer is called ‘latency’.
Whereas most caching stores files on the origin server, CDN caching copies a website’s files to distributed data centers around the world.
When a user next visits your site from thousands of miles away, latency is minimized by giving them access to your site’s files from a CDN server located nearer to them.
Object caching stores database queries so that when data is needed, it is delivered from the cache without having to query the database.
Since the server doesn’t need to generate a new result each time, enabling object caching on your WordPress site can improve your PHP execution times, reduce the load on your database, and deliver content to your visitors faster.
As the WordPress CMS is heavily dependent on its database, it’s very important to keep your WordPress database optimized and running as efficiently as possible.
When enabled on your site, object caching helps prevent your server from being overwhelmed by easing the load on your database and delivering queries faster.
WordPress has utilized a built-in object cache (called WP_Object_Cache) since introducing version 2.0 in 2005.
WordPress object caching automatically stores data from the database in PHP memory to prevent repeated queries from overloading your database.
Where this feature falls a little short, however, is that the WP object cache only stores data from the database for a single page load.
At the end of each request, objects are discarded and have to be built again from scratch the next time a user requests the page.
While this is useful as it ensures that the WordPress database isn’t being queried multiple times during a single page load, object caching is more efficient and powerful if it can be used to cache similar query requests persistently through multiple page loads.
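To see why persistence matters, here’s a simplified Python analogy (not WordPress’s actual WP_Object_Cache code) contrasting a per-request cache with one backed by a shared store:

```python
class ObjectCache:
    """Simplified analogy of request-scoped vs persistent object caching."""
    def __init__(self, persistent_store=None):
        self.local = {}                     # lives for one request only
        self.persistent = persistent_store  # shared store (e.g. Memcached)

    def get(self, key, compute):
        if key in self.local:
            return self.local[key]          # repeat query within one page load
        if self.persistent is not None and key in self.persistent:
            value = self.persistent[key]    # hit carried over from a past request
        else:
            value = compute()               # really query the database
            if self.persistent is not None:
                self.persistent[key] = value
        self.local[key] = value
        return value

calls = []
def expensive_query():
    calls.append(1)  # count how often the database is actually hit
    return "post rows"

# Two separate requests WITHOUT a persistent backend: two real queries.
ObjectCache().get("posts", expensive_query)
ObjectCache().get("posts", expensive_query)
hits_without = len(calls)

# Two separate requests SHARING a persistent store: only one real query.
calls.clear()
shared = {}
ObjectCache(shared).get("posts", expensive_query)
ObjectCache(shared).get("posts", expensive_query)
hits_with = len(calls)
```

Without the shared store, every new request rebuilds the cache from scratch; with it, the second request never touches the database.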
This is where persistent object caching solutions can help.
Persistent Object Caching
Persistent object caching helps to speed up the delivery of database queries and ease the workload of your server by allowing the object cache to persist between requests.
Popular persistent object caching tools include Memcached, Redis, and Varnish.
Memcached and Redis are not caches themselves; they are caching servers, or caching engines, that store cached items for you.
Essentially, they are data stores that serve content fast because the data lives in RAM, unlike a traditional database server such as MySQL, which reads from disk.
Each time a user makes a request, the cache is consulted first. If there’s a match, the cache serves the content.
As stated on the Memcached website…
“Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering.”
First developed in 2003, Memcached is a free, open-source distributed memory object caching system intended for use in speeding up dynamic web applications by alleviating database load.
Essentially, this allows you to redistribute and reallocate memory as you need.
If you think of all areas of your memory as one combined entity, then as you increase your servers and memory, your memory pool also increases, allowing for greater scaling and increased traffic handling.
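That “combined memory pool” idea works by mapping each cache key to one server in the pool. Here’s a toy Python sketch (simple modulo hashing for illustration only; real Memcached clients typically use consistent hashing, and the host names are made up):

```python
import hashlib

def pick_server(key: str, servers: list) -> str:
    """Map a cache key to one server in the pool so every client agrees
    on where a given key lives. (Modulo hashing for illustration; real
    clients prefer consistent hashing so that adding servers reshuffles
    fewer keys.)"""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return servers[int(digest, 16) % len(servers)]

# Three servers' RAM acting as one combined cache pool (hypothetical hosts):
pool = ["cache1:11211", "cache2:11211", "cache3:11211"]
```

Adding a fourth server to `pool` grows the total cache memory; the hash function then spreads keys across all four.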
Redis began in 2009. It’s open-source like Memcached and does everything that Memcached can do, plus a bit more.
According to its website, Redis supports:
“strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes with radius queries and streams.”
Stack Overflow has an interesting discussion on using Memcached vs Redis.
Also, a number of caching plugins support Memcached or Redis for cache storage, either for the full page cache or the existing WP object cache.
Varnish works differently than Memcached and Redis. Varnish Cache is a web application accelerator, also known as a caching HTTP reverse proxy.
It is designed for content-heavy dynamic web sites as well as APIs.
Varnish allows caching and accelerates web pages without the need to modify any of your code or backend.
You can install it “in front” of any server that speaks HTTP. When a user makes a request to your site, it hits Varnish’s server first.
If the result of the request exists, Varnish will serve the request. If it doesn’t exist, Varnish will allow the request to pass to the site server.
It’ll store the result for the next time a user requests the content.
Since requests pass through Varnish’s server first, it can speed up your load times. Varnish can also store separate caches for desktop and mobile users, even for the same URL.
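The Varnish flow described above, including the separate desktop and mobile caches, can be sketched like this in Python (a toy stand-in, not actual Varnish or VCL):

```python
class ReverseProxyCache:
    """Toy sketch of a caching HTTP reverse proxy: check the cache,
    else pass the request to the backend and store the result."""
    def __init__(self, backend):
        self.backend = backend  # callable standing in for the origin server
        self.cache = {}

    def handle(self, url: str, user_agent: str) -> str:
        device = "mobile" if "Mobile" in user_agent else "desktop"
        key = (url, device)            # same URL, separate entry per device
        if key in self.cache:
            return self.cache[key]     # served without touching the backend
        response = self.backend(url, device)  # pass through to the origin
        self.cache[key] = response     # stored for the next request
        return response
```

Because the cache key includes the device type, a mobile visitor never receives the cached desktop page, even though both requests hit the same URL.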
Are There Any Downsides To Caching?
As we’ve seen, caching is a complex topic, and you can cache your website or web application in a multitude of ways.
Your caching strategy and the cache type you choose can impact your load times, but you need to be careful.
If caching is not set up correctly, browsers will not be able to validate cached content and pages may load outdated content, which can affect user experience.
As caching entities often include third-party proxy servers shared by multiple users, if caching server security is compromised, this can impact all users connected to the server.
Sometimes, you can layer different types of caches and it will work out well.
Other times, the complexity of caching can cause unwanted effects on your site, such as serving sensitive data to users (ouch!) or exposing your backend to visitors (what a bummer!).
Running benchmarks with different kinds of caching can help you make an educated choice about which caching you should use and avoid problems.
For example, check out this article where we compare the performance of top WordPress caching plugins.
Get Maximum Cache For Minimum Cash
Caching can give your site benefits like faster page loading and reduced content delivery time between client and server requests.
But the complexity of implementing effective caching strategies can be a real drawback.
WPMU DEV can help you avoid going through a steep and costly learning curve.
We have built web caching into many of our website performance-enhancing solutions – from our award-winning plugins to our new blazing-fast hosting service.
Smash A Stash Of Cache With Hummingbird
Hummingbird, our site optimization plugin, includes full caching features like page, browser, RSS, and Gravatar caching, plus full Cloudflare CDN integration.
With Hummingbird installed, you have complete control of your web caching settings, including the ability to clear page cache from your dashboard.
With a WPMU DEV membership, you automatically get upgraded to Hummingbird Pro with even more speed-boosting features.
Membership also gives you access to our entire suite of plugins, including the Pro version of Smush, our award-winning image compression plugin that uses CDN caching to serve your images crazy fast from 45 locations around the world.
To learn more about Hummingbird’s caching abilities, check out the plugin’s information page, visit the plugin’s documentation section, or read the articles below:
Get Hosting On Steroids With a Built-In Cache Cow
As a WPMU DEV member, you get blazing-fast hosting for three sites. There’s nothing you need to do to get cached up.
Caching is instantly activated, fully configured, and expertly managed for you. You don’t need to install any object caching plugins.
As Aaron Edwards, our Chief Technical Officer, states:
“Our hosting has built-in Memcached object caching sized to your plan. Nothing needs to be installed or configured.”
WPMU DEV hosting is optimized for WordPress and designed to make your and your clients’ sites fly with object and page caching, CDN, IPv6 support, and fully independent, dedicated, managed hosting for all your hosted sites.
Here are some additional useful notes about WPMU DEV hosting and caching:
- With each site you host with us, we turn off object cache in the staging environment, so you can build and make changes to your site with cache turned off to avoid any problems.
- As a member, you also get the opportunity to resell our world-class state-of-the-art hosting to your clients (coming soon!), so… you get more cache and make more money!
Time To Get Cached Up
If you want to experience the next-level caching benefits of our hosting service for yourself, now is the perfect time to do it.
We’d love for you to try our hosting service for free with a WPMU DEV membership trial.
Finally, if you’re already a WPMU DEV member and you don’t currently host any sites with us, be sure to migrate a site over, or whip up a test site and let’s get you all cached up.