What You Need to Know About Caching – Part 1

by Mario Korf

In the first video of their “unofficial” video series, Tedd Smith and CJ Arnesen explain the basics of caching: what it is, why it’s important, and how to get the most out of it.

Tedd and CJ are Solutions Engineers at Akamai, and regularly onboard new customers. What they learned was that while customers found the Akamai Quick Starts and User Guides useful, they needed something different for level-setting new team members and introducing other departments within the company. This video, and those that follow, are a great introduction to Akamai, caching, and CDNs in general.

A transcript of the video follows, but I encourage you to watch the video, as the whiteboard illustrations are fun and informative.

Transcript

Hi, I’m Tedd Smith, and I’m a Solutions Engineer at Akamai. That means I have the privilege of explaining how Akamai’s technology works with a broad range of customers. I thought it might be nice to capture some of that knowledge in a series of videos that help to explain some of the concepts we work on.

I wanted to start this video series off with a quick introduction to caching. The concept of caching is not new. Shoot… just think of your backyard. Your neighborhood squirrels are always working to cache their acorns in empty tree stumps for the impending El Niño.

The idea with caching is basic: Store needed resources close to you so that you don’t need to spend time or energy to retrieve them. While this is great for those cold winters when you don’t want to forage for food, it also works great when dealing with the internet. Caching is key to any Content Delivery Network (CDN).

Think of your CDN partner like your own personal acorn vending machine. The takeaway is, by caching you: increase delivery performance, decrease the need for compute and bandwidth resources, and ultimately save money. It’s really a great deal!

We’re going to dive a bit deeper into the tools available for caching, the various places you can cache, and explore in greater detail some of the basic options Akamai offers for caching.

Controlling Your Cache

There are some basic methods for controlling your cache that are built into the HTTP standard. These are things like the Expires and Cache-Control headers. So let’s explain those first.

Expires: is a standard HTTP header and was the “old” way of doing things in HTTP. It provides a date and time after which a file should “expire”. If you still use this header, make sure you set the date no more than one year into the future.
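
For instance, an Expires header looks like this (the date is just an illustration):

  Expires: Thu, 01 Oct 2026 07:00:00 GMT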

Cache-Control: is the default SUPER header for controlling cache behavior in newer versions of HTTP. It offers various directives for caching, such as the following (sample header lines follow the list):

  • Max-Age: which indicates how long a file can stay in cache. It only takes values in seconds and looks like max-age=3600 (one hour) or max-age=86400 (one day).
  • No-Store: which does exactly what you think. It tells the cache not to store any of the responses with this header value. This is great for unique or personalized content that changes ALL the time or CAN’T be served to different users.
  • No-Cache: which tells the cache server to re-validate every request (using something like an If-Modified-Since header; see the sketch after this list). This only incurs a small round-trip instead of fetching the full payload.
  • Public: which indicates the file can be cached and served for all subsequent requests. If you already have max-age set in your Cache-Control header, this option is usually redundant!
  • Private: which dictates that only the browser is allowed to cache the content, and not any intermediary cache server (such as a CDN).
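
To make these concrete, here is roughly what some of these directives look like on an HTTP response (the values are just illustrations):

  Cache-Control: max-age=3600
  Cache-Control: no-store
  Cache-Control: no-cache
  Cache-Control: public, max-age=86400
  Cache-Control: private, max-age=600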

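And here is a rough sketch of the re-validation round-trip that No-Cache triggers (simplified; the file name and host are hypothetical, and a real exchange carries more headers):

  GET /logo.png HTTP/1.1
  Host: www.example.com
  If-Modified-Since: Mon, 02 Mar 2015 15:00:00 GMT

  HTTP/1.1 304 Not Modified
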
There are more headers and values that you can read about in one of our attached links. We’ll also cover them in future videos.

So now that we know the basic methods of setting or controlling the cache, let’s dive deeper into all the places you can cache.

Places You Can Cache

A lot of you are already familiar with some level of caching that happens at your web server or load balancer. Think Varnish or Squid. These are great places to start, but they still require the request to make it all the way back to your web server, which limits the performance gains and still consumes resources like CPU and bandwidth.

The next place you can cache is with your CDN partner. This is where a lot of the performance gains and resource offload can come from. By using a CDN partner, your content is now stored on their servers and, if configured correctly, stored near your customers so that performance is improved. By having customers connect to the CDN, they don’t have to inundate your web server with the same requests over and over again.

Your content can also get cached elsewhere, such as ISPs and transparent proxies around the globe. If two people on the same street are downloading the same content or browsing to the same web page, the ISP might cache that request and serve the response to both users. This would save them money by not having to connect to the CDN for the content over and over.

The last place to get in on the action is the browser or device where the content is viewed. The device stores files in its local cache so that no further bandwidth is used and assets load immediately. This improves performance even more, and saves even more resources.

Understanding these four layers of caching will help greatly, as you explore your needs and requirements as well as debug issues.

Configuring Akamai with the Luna Control Center

When configuring caching settings in Akamai’s Luna Control Center, you use Property Manager to set up various rules. The Caching behavior gives you five options in the drop-down list:

  • honor origin cache control + expires – indicates that Akamai servers should respect both the Cache-Control and Expires headers received from your servers.
  • honor origin cache control – indicates that Akamai servers should only respect the Cache-Control header received from your backend servers.
  • honor origin expires – tells the Akamai servers to only respect the Expires header received from your backend servers.
    Note: when selecting any of these three options, you need to define a default Max-Age, which will be used when your servers do NOT return any of the specified headers.
  • cache – lets you define the Time-To-Live (TTL) of all assets served through Akamai, regardless of what your servers indicate in the Cache-Control or Expires headers. We recommend this for all your static assets.
  • no store – prevents the Akamai cache from storing a copy of the file, and always requests a new copy from your backend servers. We recommend this for your dynamic HTML or API calls.
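
As a quick illustration of how the default Max-Age and the honor options interact (the values here are made up): suppose you select “honor origin cache control” with a default Max-Age of 1 day. If your origin responds with:

  Cache-Control: max-age=600

Akamai caches that object for 10 minutes. If the origin returns no Cache-Control header at all, the 1-day default applies instead.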

Conclusion

As you can see, there are lots of ways and places to control your content. Proper management of cached assets can help increase performance by ensuring your CDN partner is delivering as much of your content from the edge of the Internet as possible.

It can also help you offload your infrastructure by allowing more of your compute resources to focus on what they’re meant to do, instead of serving bits during peak load.

Lastly, all of this equates to cost savings for your organization by reducing your egress bandwidth costs, eliminating the need to build and maintain data centers in various geographies, and reducing the time spent by precious engineering, operations, and support staff working to make your web presence faster and more reliable.

Please check out the provided links for more details, and ask your Akamai account team for help in optimizing your cache setup. We’re here to help you get the most out of your Akamai services!

Additional Resources
