Manage Akamai Configs Across the Globe

March 26, 2020 · by Gokul Sengottuvelu and Josh Johnson

This blog is part of the Akamai March Release, where we’re giving you all the details about what Akamai has added and improved for developers! You can view all of our updates here.  

Large enterprises have unique needs when managing Akamai properties at scale. When you’re managing dozens or hundreds of websites across multiple business units and geographies, a single Akamai configuration change often needs to be replicated across many online properties. Performing these updates manually is both time consuming and prone to human errors.

A major automobile manufacturer encountered this challenge while managing 20 European marketing sites. The manufacturer operates a separate website (and Akamai property) for each country where it sells vehicles. Each Akamai property change took 4 to 8 hours of effort to apply to all of the sites and validate that the changes were identical across them.

To improve efficiency and accuracy, the manufacturer applied a DevOps philosophy to the Akamai implementation, managing configurations as code. Initial setup took less than 30 minutes and now the effort to implement a change across all 20 Akamai properties is less than 5 minutes. Furthermore, a common template guarantees that the changes are applied consistently to each site. 

Before diving into this use case further, let’s quickly revisit Akamai Pipeline.

Akamai Pipeline

Akamai Pipeline is a command-line interface (CLI) that allows you to manage Akamai properties as code. Akamai Pipeline’s traditional workflow is designed to streamline management of a single site with multiple environments — Dev, QA, Prod. Managing an Akamai Pipeline consists of three unique operations:

Breaking properties into code snippets

When creating the Pipeline, the Akamai Pipeline CLI breaks the property into two distinct sets of code snippets.

  • Generic templates — Functional aspects of the properties like rules and behaviors that are common across the properties

  • Environment-specific settings — Configuration aspects of the properties, like CP code and origin, that are different across each environment

[Image: test pipeline]
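For orientation, the CLI lays these snippets out on disk in a structure roughly like the one below. This is a simplified sketch; the exact template file names depend on the property being imported.

test_pipeline/
├── projectInfo.json               # pipeline metadata
├── templates/                     # generic templates shared by every environment
│   ├── main.json
│   └── defaultBehaviors.json
└── environments/
    ├── variableDefinitions.json   # declares the tokens used in the templates
    ├── dev/
    │   └── variables.json         # environment-specific values (CP code, origin, ...)
    └── prod/
        └── variables.json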

Tokenizing the values

Pipeline tokenizes the configuration settings in the template files, so each property can be customized with environment-specific values.

{
    "name": "cpCode",
    "options": {
        "value": {
            "id": "${env.cpCode}"
        }
    }
}
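Each environment then supplies its own value for the token. Below is a minimal sketch of a per-environment values file, following the typical Pipeline layout (the values themselves are illustrative):

environments/dev/variables.json:

{
    "cpCode": 12345,
    "originHostname": "origin-dev.example.com"
}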

Building and deploying properties

Pipeline merges the template files with the environment settings for deployable artifacts, updates the Akamai properties, and activates the new version of the Akamai properties on Akamai staging and production networks.
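Under the traditional workflow, these steps are driven by the Pipeline CLI. A hedged sketch, with illustrative pipeline, environment, and email values (exact options can be confirmed with akamai pipeline help):

# Merge templates and environment settings into a deployable rule tree
akamai pipeline merge -p test_pipeline dev

# Save the merged rule tree as a new version of the dev property
akamai pipeline save -p test_pipeline dev

# Activate the new version on the Akamai staging network
akamai pipeline promote -p test_pipeline -n STAGING -e notify@example.com dev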

"Environment Name" │ "Property Name" │ "Latest Version" │ "Production Version" │ "Staging Version" │ "Rule Format"
"dev"              │ "dev.gsengo"    │ 1                │ "N/A"                │ "N/A"             │ "v2020-02-20"
"prod"             │ "prod.gsengo"   │ 1                │ "N/A"                │ "N/A"             │ "v2020-02-20"

The pipeline approach streamlines change management and reduces the risk of human errors while propagating changes across multiple environments.

Get started with Akamai Pipeline at developer.akamai.com.

Managing Multiple Properties with CLI for Akamai Pipeline and Property Manager

The traditional workflow of Akamai Pipeline automates the creation and activation of a series of related properties, one per environment, promoting and propagating changes seamlessly through the CI/CD process. However, many larger companies, like the automobile manufacturer, run several web properties that are nearly identical to one another.

Imagine a conglomerate with localized versions of its website in 50 different languages, delivered by 50 instances of identical property configurations that share the same functional behavior. Even a small configuration change is cumbersome to reapply to 50 different properties. Let's review the workflow the automobile manufacturer leveraged to improve this process.

An easier way to manage changes across multiple properties would be to set up a single pipeline with the localized versions defined as individual environments. Unfortunately, the philosophy behind the Pipeline design is to support changes that cascade within a single property and are then deployed to multiple environments; Pipeline enforces activation of those environments in a predefined sequence, which may not suit all requirements. Hence, the automobile manufacturer took a different approach.

Splitting, Merging, and Building Properties

If you look at the Pipeline design, it breaks up the configuration code in a way that keeps the rule tree structure consistent while abstracting out the configuration settings that are unique to each environment. It also provides the mechanism to merge these distinct parts back into a complete property configuration JSON file.

The automobile manufacturer leveraged these individual components in the pipeline to build a solution that could automate changes across numerous properties. Let’s go step-by-step to understand the workflow.

Define Pipeline

The first step is to set up the property configuration of each web property as an individual pipeline.

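A hedged sketch of this step, assuming each country site is imported into its own pipeline from its existing property (names are illustrative, and the exact options for cloning an existing property can be confirmed with akamai pipeline help new-pipeline):

# Create a pipeline named "french" from an existing property, with a single "prod" environment
akamai pipeline new-pipeline -p french -e www.example.fr prod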

Identify common rules and behaviors

Move all common rules and behaviors into a folder accessible to individual pipelines.


Reference rules and behaviors in main.json

Configure each property within the pipeline to have one "main.json" file that references the default rules in the common folder (via an "#include" of the default rules file).

[Image: main.json and defaultRules.json]
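A minimal sketch of what such a main.json might look like, assuming the shared rules are pulled in from a defaultRules.json file (the include path and ruleFormat value are illustrative):

{
    "rules": {
        "name": "default",
        "children": [
            "#include:defaultRules.json"
        ],
        "behaviors": "#include:defaultBehaviors.json"
    },
    "ruleFormat": "v2020-02-20"
}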

Leverage Environment Variables

Similar to the traditional pipeline workflow, the settings that differ between the properties are managed with environment variables. The environment variables are referenced with the syntax "${env.variableName}", and values are set for each environment variable in the "Definitions.json" files.

[Image: defaultBehaviors.json]
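For example, the shared defaultBehaviors.json could tokenize the origin hostname (the variable name here is illustrative):

{
    "name": "origin",
    "options": {
        "originType": "CUSTOMER",
        "hostname": "${env.originHostname}"
    }
}

Each property's definitions file then supplies its own hostname, CP code, and other site-specific values.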

Merge, update, and activate the properties from the command line:

Once the pipeline setup is configured, a quick three-step process activates a change for a specific property.

Merge the JSON files and environment variables:

akamai pipeline merge -np <pipeline_name> <environment_name>

Update the Akamai configuration with the merged JSON:

akamai property update <property_name> \
--file <pipeline_name>/dist/<environment_name>.<pipeline_name>.papi.json \
--notes "<version_notes>" 

Activate the Akamai configuration to the staging network:

akamai property activate <property_name> \
--network staging 

The ability of Pipeline to merge the JSON files combined with the Property Manager update functionality allows management of multiple properties in a CI/CD workflow.
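Because every site follows the same pattern, these three steps lend themselves to scripting. Below is a hedged sketch of a loop over several country pipelines (pipeline names, property names, and the environment name are all illustrative):

# Apply the same change to every country site
ENVIRONMENT=prod
for p in french german spanish; do
    akamai pipeline merge -np "$p" "$ENVIRONMENT"
    akamai property update "$ENVIRONMENT.$p" \
        --file "$p/dist/$ENVIRONMENT.$p.papi.json" \
        --notes "Common template update"
    akamai property activate "$ENVIRONMENT.$p" --network staging
done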

Automation with Jenkins

The automobile manufacturer also uses Jenkins for its DevOps automation. Jenkins is an open-source automation server that can be used for building, testing, and deploying code. A vast plugin library is available to extend the capabilities of Jenkins. The typical Jenkins workflow is as follows:

  1. Commit the source code to a central source control repository, such as Git

  2. Configure the Jenkins job to trigger manually or automatically when code commits are detected

  3. Configure steps to automatically build the code, execute test cases, deploy the code to a server, or perform other steps necessary

  4. Upon completion, configure follow-up actions based on success or failure notifications sent through email, IM, webhooks, or other configured plugins

  5. Archive any build artifacts and retain a report and audit history of the build

Let’s examine how the automobile manufacturer leveraged Jenkins to automate change management across multiple properties with Pipeline and Property Manager CLIs. 

Source Control

The Pipeline source files have to be committed to a central repository.
 

[Image: GitHub repository]

Jenkins job

A declarative Jenkins Pipeline job can be created to automate the build, letting users define the whole lifecycle in a Jenkinsfile. The Jenkinsfile also ensures that the Jenkins Pipeline itself is maintained as code and can easily be migrated to any Jenkins server.

Managing multiple properties involves two important operations: merging the JSON files, and updating/activating the properties. The setup can be defined as two stages in the Jenkins Pipeline as follows:

  • Stage 1: Merge the JSON files and environment variables

  • Stage 2: Update and activate the properties (deploy)
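A minimal declarative Jenkinsfile sketch of this two-stage setup is shown below. The pipeline and property names are illustrative, and Akamai CLI credential setup (the .edgerc file) is omitted for brevity:

pipeline {
    agent any
    stages {
        stage('Merge') {
            steps {
                // Merge the JSON snippets and environment variables
                sh 'akamai pipeline merge -np french prod'
            }
        }
        stage('Deploy') {
            steps {
                // Update the property with the merged rule tree and activate it on staging
                sh '''
                    akamai property update prod.french \
                        --file french/dist/prod.french.papi.json \
                        --notes "Automated update from Jenkins"
                    akamai property activate prod.french --network staging
                '''
            }
        }
    }
}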

Take a look at the flow in the Jenkins interface:
 

Dealing with Failure

By default, if a stage fails, then the build is halted, and subsequent stages will not be executed. For example, if the Akamai CLI merge fails for any property in the first stage, then the stage that updates and activates the properties will not be attempted. While putting this process in place, the auto manufacturer even unearthed a few inconsistencies that earlier manual changes had introduced.

Summary

With the Akamai Property Manager and Pipeline CLIs, a large international enterprise automated change management across multiple nearly identical properties, reducing the turnaround time for a global change from as much as a day to just a few minutes.