An Introduction to Caching with Akamai

June 11, 2020 · by David Kolbo

Many online businesses that Akamai supports are focused on Internet security best practices — this is especially true for customers in the financial services industry, which is a frequent target of cyberattacks and fraud.

In an effort to provide the safest online experience, these customers sometimes overlook opportunities to improve performance and availability. By deploying basic caching strategies, you can offload your origin servers and enhance user experiences.

Just as you protect your websites, you should also preserve their functionality. With caching, Akamai offloads standard requests from your origin servers, so you can continue to deliver rich and responsive online experiences and avoid service interruptions when demand surges.

Understanding the Benefits of Caching

To understand the benefits of caching, consider the distance your content travels and the potential obstacles on that journey. The first mile of the Internet is the connection into your web servers, the middle mile consists of backbone providers with high-capacity trunks for transmitting large amounts of data, and the last mile is the online connection into your users’ homes or offices. 

It’s the middle mile that doesn’t always behave. There, you can experience packet loss, congestion, and outages while the request is traveling over a sub-optimal path to your web servers.

Why? Because of the way the Border Gateway Protocol (BGP) works: Tier 1 Internet service providers (ISPs) and giant carriers connect and exchange traffic, and those handoffs can introduce slowdowns and errors along the way.

To avoid problems on the middle mile, caching serves content closer to customers by keeping objects in a temporary storage area for quick delivery. Akamai serves that content from the last mile, so every request doesn't have to traverse the middle mile to reach your web servers.

Using Akamai Content Delivery 

Akamai operates the world's largest and most trusted cloud delivery platform: a highly distributed content delivery network (CDN) with approximately 288,000 servers in 136 countries and nearly 1,500 networks around the world.

Akamai accelerates performance by serving content closer to the end user. With caching, your origin servers are offloaded, reducing CPU and front-end network interface controller (NIC) utilization, two key performance indicators (KPIs) that spike when servers are overwhelmed with traffic and start returning loading errors to users.

In addition, browser caching stores the object locally in the browser. Not only does this help you avoid middle-mile problems, it also puts the content as close to your users as possible, on their own devices, for better performance and lower server utilization.

Getting Started

To deploy caching, you must determine which content can be safely stored, set the duration that the objects should remain there, and ensure the readiness of your origin servers.

Determining Content to Cache

There is plenty of content that can be offloaded from the origin to improve page load speed. Basic caching keys on paths and filenames that don't include personally identifiable information (PII); advanced caching also takes query strings, cookies, and request headers into account. To learn more about Akamai cache keys and the Cache ID Modification behavior, please see: https://developer.akamai.com/blog/2017/04/14/what-you-need-know-about-caching-part-3
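
To make the distinction concrete, here is a minimal sketch in Python of how a cache key might be composed. The function and parameter names are illustrative only, not Akamai's implementation; on the Akamai platform the cache key is controlled through the Cache ID Modification behavior referenced above.

    from urllib.parse import urlsplit, parse_qsl

    def build_cache_key(url, significant_params=()):
        # Illustrative cache key: the path plus only the query parameters that
        # actually change the response. Tracking parameters and the like are
        # dropped so that identical content is cached under a single key.
        parts = urlsplit(url)
        kept = sorted((k, v) for k, v in parse_qsl(parts.query) if k in significant_params)
        query = "&".join(f"{k}={v}" for k, v in kept)
        return parts.path + ("?" + query if query else "")

    # Basic caching: key on the path and filename only.
    print(build_cache_key("https://www.example.com/css/site.css"))        # /css/site.css

    # Advanced caching: also include a query parameter that changes the content.
    print(build_cache_key("https://www.example.com/rates?lang=es&utm_source=ad", ("lang",)))
    # -> /rates?lang=es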

Setting Time-to-Live

Time-to-live (TTL) is the amount of time that an object can be served from the cache. Specifically for Akamai, TTL is the time period an Akamai Edge server may serve an object from the cache without revalidating its freshness with the origin server.

As a guideline for Akamai caching: if the TTL is too short, you're not achieving maximum offload of your origin servers; if the TTL is too long, you have to purge web content whenever it changes, unless you follow file-versioning best practices in your code. Furthermore, the browser caching TTL should be less than the Akamai caching TTL. Be careful with these settings, because you can't clear your customers' browser caches.
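
To make the TTL mechanics and the edge-versus-browser relationship concrete, here is a minimal sketch in Python. The TTL values are illustrative examples, not recommendations.

    import time

    # A minimal model of the TTL check an edge cache performs: an object may be
    # served from cache while its age is below the TTL; once the TTL expires,
    # the edge must revalidate with (or refetch from) the origin.
    def can_serve_from_cache(cached_at, ttl_seconds, now=None):
        now = time.time() if now is None else now
        return (now - cached_at) < ttl_seconds

    # Keep browser TTLs shorter than edge TTLs: an edge cache can be purged,
    # but a customer's browser cache cannot. Values below are examples only.
    EDGE_TTL = 7 * 24 * 3600     # 7 days on the Akamai edge
    BROWSER_TTL = 24 * 3600      # 1 day in the browser
    assert BROWSER_TTL < EDGE_TTL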

There are several TTL settings available on Akamai, depending on the sensitivity of your content. Following is a list of the typical settings available and conditions for selecting each.

  • No Store: The object is never cached or written to disk. Select this option for PII, such as account information, Social Security numbers, driver's license details, or any other personal data. To avoid accidentally caching PII, select the No Store caching rule in your Akamai configuration, and then set up specific caching rules for content that is not sensitive.

  • Bypass Cache: Causes the request and origin response to be passed through without removing the underlying object from the cache if it's already there. This setting is similar to No Store, except that it doesn't remove an existing cache entry. Use it when the URL remains the same but the user experience differs, as in A/B testing.

  • 0s TTL: Appropriate for large, time-sensitive objects. Examples include news that changes quickly, sports scores, or stock tickers. If the object is large, the advantage of serving it from the Edge of the Internet may easily outweigh the cost of revalidating it for every client request. Setting the TTL to zero allows Akamai servers to cache the object, but requires them to revalidate it with an if-modified-since (IMS) request to the origin each time the object is requested (see the revalidation sketch after this list). This approach uses less bandwidth than requesting the entire object again from the origin.

  • Fixed Value: A fixed time period, such as a 30-day TTL that’s appropriate for font, JavaScript, cascading style sheet (CSS), and image files.

  • Negative TTL: This option helps provide origin relief for response codes 204, 305, 404, and 405 with a TTL of 30 seconds.
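
As promised above, here is a rough sketch in Python of the revalidation flow for a 0s TTL object: the cache already holds a copy, so it asks the origin whether the object changed instead of re-downloading it. The hostname, path, and date are placeholders, and real edge logic is far more involved; this only illustrates the If-Modified-Since / 304 exchange.

    import http.client

    def revalidate(host, path, cached_body, cached_last_modified):
        # Ask the origin whether the cached object is still current.
        conn = http.client.HTTPSConnection(host)
        conn.request("GET", path,
                     headers={"If-Modified-Since": cached_last_modified})
        resp = conn.getresponse()
        if resp.status == 304:               # Not Modified: serve the cached copy
            return cached_body, cached_last_modified
        body = resp.read()                   # changed: refresh the cached copy
        return body, resp.getheader("Last-Modified", cached_last_modified)

    # Example call (hostname and path are placeholders):
    # body, last_modified = revalidate("origin.example.com", "/quotes/latest.json",
    #                                  b"<cached body>", "Thu, 11 Jun 2020 00:00:00 GMT")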

Preparing Servers to Support Caching

Prior to implementing caching settings in your Akamai configurations, be sure that your origin servers are ready, because there are some common obstacles to successful caching. For example, make sure your servers support Last-Modified and IMS headers as well as HTTP 304 (Not Modified) responses. Also, an Edge server may not cache an object that carries an HTTP Vary header (some applications apply a Vary header even when the content does not vary), so make sure to remove the header from content you want cached.
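
As an illustration, here is a minimal sketch in Python of a cache-friendly origin handler. It is not a production server: the date and body are placeholders, and real origins typically compare If-Modified-Since as a date rather than as an exact string.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    LAST_MODIFIED = "Thu, 11 Jun 2020 00:00:00 GMT"
    BODY = b"body { font-family: sans-serif; }"

    class CacheFriendlyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Support IMS revalidation: answer 304 when the object is unchanged.
            if self.headers.get("If-Modified-Since") == LAST_MODIFIED:
                self.send_response(304)
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/css")
            self.send_header("Last-Modified", LAST_MODIFIED)
            self.send_header("Cache-Control", "public, max-age=86400")
            # Note: no Vary header; an unnecessary Vary header can prevent caching.
            self.end_headers()
            self.wfile.write(BODY)

    if __name__ == "__main__":
        HTTPServer(("", 8080), CacheFriendlyHandler).serve_forever()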

Implementing Delivery Configurations

As mentioned above, to avoid caching PII, please make sure your Akamai delivery configuration contains a No Store caching behavior in the Default Rule. Rules are processed top-down in Akamai delivery configurations, so be careful that the caching rules towards the bottom of the delivery configuration don't unintentionally override those towards the top. For example, if you have a rule towards the top for images with a TTL of 1 hour, and another rule below it of 7 days for the same images, the 7-day rule would take precedence.
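
The sketch below models that top-down, last-match-wins evaluation in Python. The rule and behavior names are illustrative, loosely inspired by Akamai delivery configurations; check Property Manager for the actual behaviors and options available to you.

    # Simplified model of rule evaluation: rules are processed top-down, and for
    # the same setting the last matching rule wins.
    rules = [
        {"name": "Default Rule",    "match": lambda path: True,
         "caching": {"behavior": "NO_STORE"}},
        {"name": "Images (1 hour)", "match": lambda path: path.endswith((".png", ".jpg")),
         "caching": {"behavior": "MAX_AGE", "ttl": "1h"}},
        {"name": "Images (7 days)", "match": lambda path: path.endswith((".png", ".jpg")),
         "caching": {"behavior": "MAX_AGE", "ttl": "7d"}},
    ]

    def effective_caching(path):
        result = None
        for rule in rules:                  # top-down evaluation
            if rule["match"](path):
                result = rule["caching"]    # a later match overrides an earlier one
        return result

    print(effective_caching("/img/logo.png"))     # {'behavior': 'MAX_AGE', 'ttl': '7d'}
    print(effective_caching("/account/profile"))  # {'behavior': 'NO_STORE'} from the Default Rule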

In your Akamai delivery configuration, click the Add Rule button to set up caching rules.

Here's an example of a Static Content Caching rule on Akamai with a TTL/Max-Age of 60 days:

[Image: Static Content Caching rule with a 60-day Max-Age]

For advanced caching, use the actions available in the Cache ID Modification behavior:

[Image: Cache ID Modification behavior]

Finally, here’s an example of browser caching called Downstream Cacheability:

[Image: Downstream Cacheability behavior]

Measuring Offload

You can measure the effectiveness of your caching strategy with the cache hit ratio. The cache hit ratio is the number of requests delivered by the Akamai cache servers divided by the total number of requests. For example, if the Akamai cache servers delivered nine requests out of a total of 10, your cache hit ratio would be 90%: nine of the requests were cache hits served by the Akamai cache servers, and one request was a cache miss, where the Akamai cache server had to go forward to the origin to retrieve the content. Cache hit ratios are important to understand because they help quantify how much Akamai is offloading your servers.
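
The arithmetic is straightforward; here is the example above expressed as a short Python calculation (the request counts come from the example, not real traffic):

    edge_hits = 9          # requests served from the Akamai cache (cache hits)
    total_requests = 10    # all requests from end users
    cache_hit_ratio = edge_hits / total_requests
    print(f"Cache hit ratio: {cache_hit_ratio:.0%}")   # Cache hit ratio: 90%
    # The remaining request was a cache miss that went forward to the origin.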

Prioritizing Performance

In addition to making sure your online content is secure, it's important to improve performance and availability by implementing content caching. Caching not only improves the user experience, but also offloads your origin and might just save your infrastructure when the next peak event occurs. For example, during the coronavirus health crisis, financial services sites were so overwhelmed with elevated traffic from customers seeking loan information that some of their online services experienced outages.

Please reach out to your Akamai account team for the best guidance on using our caching features. We can make recommendations based on your specific use case and help with offload analysis reports.