
Set Up an Akamai Pipeline From Scratch

June 26, 2020 · by Gokul Ethilkandy

DevOps is everywhere—but how do you apply it to your workflow? In this blog post, we’ll show you how to simulate an entire continuous integration and continuous delivery (CI/CD) pipeline with Akamai. Let’s walk through the complete Akamai pipeline workflow from scratch, so you can see how the blocks work together, and apply a similar approach to your own code.

But first, what are we simulating? The sample pipeline scenario below uses Akamai command-line interface (CLI) commands, Git, Jenkins, and a text editor to automate Akamai configuration changes in the cloud. Let’s get started.

[Image: Akamai pipeline CI/CD workflow]

Prerequisites

To be successful here, you need the right set of tools, an Akamai Control Center account, and the required CLI packages.

Following is a list of tools used in this example:

  • Any text editor (I use Atom)
  • A Git/Bitbucket instance (I used Akamai's internal Git instance, but this can be a non-Akamai instance managed by your organization)
  • A Jenkins instance (again, I used Akamai's internal instance, but this can also be a non-Akamai instance)

You’ll also need an account and test hostname:

  • This demo uses an internal Akamai account — if you are a customer, you can create a temporary group within your Akamai portal for testing and experimentation
  • Use of a test hostname is recommended for this demo, preferably one that does not have any production traffic or impact

You must install a couple of Akamai CLI packages, namely the ones that back the commands used in this post: akamai pipeline (and its alias akamai pl), akamai property, and, optionally, akamai sandbox.
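A minimal install sketch is below; the exact package names are my assumption, so confirm them against the Akamai CLI documentation before running:

    # install the CLI packages behind the commands used in this post
    # (package names are assumptions; confirm before running)
    akamai install property-manager   # provides "akamai pipeline" / "akamai pl"
    akamai install property           # provides "akamai property update"
    akamai install sandbox            # provides "akamai sandbox" (optional sandbox step)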

Lastly, you’ll need the following: 

  • API credentials (learn how to generate API credentials to interact with Akamai; a sketch of the credentials file follows this list)
  • SSH keys for Jenkins — Your Jenkins administrator can help with this step
  • A recommended add-on is GitHub connectivity to the Atom text editor
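Once generated, API credentials are typically stored in an .edgerc file in your home directory, which the Akamai CLI reads. Here is a sketch of that file with placeholder values:

    [default]
    client_secret = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    host = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx.luna.akamaiapis.net
    access_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx
    client_token = akab-xxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxx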

CLI Commands

Once you verify that you have the above prerequisites, you can begin creating the Akamai pipeline. Open up the terminal and start executing the CLI commands against your test hostname. I am using a MacBook in this demo; however, these commands run on Windows and other operating systems in a similar manner.

Command-1

akamai pipeline new-pipeline -p gokulpipe1 -e gokul-sandbox dev qa prod

Here, we are creating three environments (development, QA, and production) based on a source configuration called gokul-sandbox. You can use a source configuration of your choice, and akamai help <command> is available for additional information.

[Screenshot: Output 1]

There is a lot happening in the background when you run Command-1. As shown in Output 1, it creates the following:

1. The folder structure below is auto-created on your local machine, as shown in the set of screenshots that follow:

  • devops.log
  • gokulpipe1 (your new pipeline name)
    • cache
    • dist
    • environments
    • templates
    • projectInfo.json

[Screenshots: devops.log, the pipeline folder structure, and the dev and templates folders]

2. Also, upon running Command-1, three new configurations based on the source configuration “gokul-sandbox” are created within the Akamai portal:

  • dev.gokulpipe1
  • qa.gokulpipe1
  • prod.gokulpipe1

[Screenshots: Property Groups, Property Details, and the resulting web page in Akamai Control Center]
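If you want to confirm the local side before moving on, a quick check like the sketch below lists what Command-1 generated (run it from the folder where you created the pipeline):

    ls gokulpipe1                     # cache, dist, environments, templates, projectInfo.json
    ls gokulpipe1/environments        # typically one subfolder per environment: dev, qa, prod
    cat gokulpipe1/projectInfo.json   # pipeline metadata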

Command-2

  • akamai pipeline -p gokulpipe1 lstat

This command shows the current status of the created pipeline, as seen in Output 2:

[Screenshot: Output 2]

Optional Command: Stage Environment 

(Continue on to Command-3 to proceed with the original flow)

Though this deviates from the flow of commands above, it is worth noting that a new “stage” environment can be created by copying an already existing environment (“dev” in the example below).

Create Pipeline Target Environments

  • cp -R dev/ stage

This optional command manually copies the structure of an environment called “dev” to create a new “stage” environment.
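For context, here is a sketch of where that copy happens; the path is my assumption based on the folder structure shown earlier, and the new environment also needs to be added to the pipeline's environment list in projectInfo.json (see the screenshots below):

    # assumption: per-environment folders live under the pipeline's environments/ directory
    cd gokulpipe1/environments
    cp -R dev/ stage
    # then add "stage" to the environment list in projectInfo.json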

[Screenshots: updated folder structure, projectInfo.json, and command output]

Command-3

  • akamai pipeline -p gokulpipe1 lstat

This command shows the current status of the created pipeline, now with the new environment called “stage,” as seen in Output 3 below:

[Screenshot: Output 3]

Command-4

  • akamai pipeline search dev.gokulpipe1
  • akamai pipeline search qa.gokulpipe1
  • akamai pipeline search prod.gokulpipe1
  • akamai pipeline search stage.gokulpipe1

The above command set returns more information by searching for the properties by name. The output of Command-4 is shown below:

[Screenshot: Output 4]
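If you prefer, the four searches collapse into a small shell loop; this sketch assumes the same pipeline and environment names used above:

    for env in dev qa prod stage; do
      akamai pipeline search "$env.gokulpipe1"
    done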

Command-5

  • akamai pl -p gokulpipe1 merge dev

The dev merge command populates /Users/gethilka/Desktop/AkamaiPipeline/gokulpipe1/dist/dev.gokulpipe1.papi.json.

This command merges the template JSON and variable values into a PM/PAPI rule tree JSON document, which is stored in the “dist” folder under the current pipeline folder. The output is shown below:

[Screenshot: Output 5]
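To sanity-check the merge locally, you can pretty-print the generated rule tree; this sketch assumes you run it from inside the gokulpipe1 folder:

    # peek at the first lines of the merged PM/PAPI rule tree produced by Command-5
    python3 -m json.tool dist/dev.gokulpipe1.papi.json | head -n 20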

Command-6

  • akamai pl -p gokulpipe1 save dev

The locally merged changes are saved to the Akamai portal configurations as shown below:

[Screenshot: Output 6]

Some tips on how to save and promote changes through the pipeline with Akamai follow:

[Screenshot: tips for saving and promoting changes through the pipeline]

Command-7

  • akamai pl -p gokulpipe1 promote -n staging -e gethilka@akamai.com -m DevPromoted dev

This command promotes the prepared configuration to Akamai staging along with the activation note “DevPromoted,” as shown below in Output 7:

[Screenshots: Output 7 and Property Details]

Command-8

  • akamai pl -p gokulpipe1 check-promotion-status dev

  • akamai pl -p gokulpipe1 lstat

These commands help you monitor the status of the properties as shown below:

[Screenshot: Output 8]
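Activations take a few minutes, so a simple polling loop saves you from re-typing Command-8; a sketch:

    # re-check the promotion status every 60 seconds; stop with Ctrl-C once it reports active
    while true; do
      akamai pl -p gokulpipe1 check-promotion-status dev
      sleep 60
    done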

So there you have it. Your environments are now set up in fewer than 10 CLI commands.
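For reference, the per-environment sequence from Commands 5 through 8 strings together as the sketch below (pipeline name, email, and activation note are the ones from this demo; substitute your own):

    # merge, save, and promote the dev environment to the Akamai staging network,
    # then check on the activation (same commands as above, run in sequence)
    akamai pl -p gokulpipe1 merge dev
    akamai pl -p gokulpipe1 save dev
    akamai pl -p gokulpipe1 promote -n staging -e gethilka@akamai.com -m DevPromoted dev
    akamai pl -p gokulpipe1 check-promotion-status dev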

As an alternate path, if you want to include Akamai Sandbox in the pipeline workflow, you can run the following (one way to order them is sketched after this list):

  • akamai sandbox update -r /path/to/pipeline/dist/…..json

  • akamai pipeline merge -p <pipeline name> <env>
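One way to order these, assuming you merge first so that the JSON under dist/ is current, is sketched below (the file name follows the dist/ naming shown in Command-5):

    # regenerate the rule tree for the dev environment, then push it into your sandbox
    akamai pipeline merge -p gokulpipe1 dev
    akamai sandbox update -r /path/to/pipeline/dist/dev.gokulpipe1.papi.json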

The Akamai “property update” command saves the changes to the Property Manager (PM) UI; you can then activate the configuration via the property CLI module.

GitHub

Now let’s set up GitHub. Below is the initial set of commands to configure Git access and sync the folder structure you created locally on your system to the Git repo in the cloud.

[Screenshot: configuring Git for the first time]
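The exact commands are in the screenshot above; a typical first-time setup looks like the sketch below, where the name, email, and remote URL are placeholders for your own:

    # identify yourself to Git (first-time setup)
    git config --global user.name "Your Name"
    git config --global user.email "you@example.com"
    # turn the local pipeline folder into a repo and push it to your remote
    cd /Users/gethilka/Desktop/AkamaiPipeline/gokulpipe1
    git init
    git remote add origin git@your-git-server:your-org/gokulpipe1.git
    git add --all
    git commit -m "Initial pipeline structure"
    git push -u origin master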

Once you run the above commands to sync your local files to your Git repo, you should see the output below:

[Screenshot: repository source view]

You can also view the entire file structure (shown below) after you sync your files to Git:

[Screenshot: repository commits and file structure]

  • git pull

This command pulls changes from the server repo into your local repo; the repository metadata is stored locally in a .git/ folder (/Users/gethilka/Desktop/AkamaiPipeline/gokulpipe1/.git).

[Screenshot: git pull output]

Here is how you can add GitHub connectivity to the Atom text editor:

[Screenshot: GitHub connectivity in Atom]

Automating the Process

Finally, we can automate all of these commands as well as the configuration build via Jenkins by setting up our Jenkins project as shown below:

[Screenshot: Jenkins list view]

Create a new Jenkins project folder:

[Screenshot: new Jenkins project folder]

Under that, create a workspace. I called it “gokulpipe1,” the same name as the new pipeline we created.

[Screenshot: Jenkins workspace]

Once the workspace is created, you need to sync the Jenkins build to the Git repo as shown below (use your Git repo link at this step):

[Screenshot: Jenkins Source Code Management settings]

Note: We are using the “pipeline” and “property” CLI commands together to provide a more flexible, non-linear workflow; this also helps with parallel development.

[Screenshot: Jenkins build step configuration]
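The build step itself is essentially the CLI sequence from earlier wrapped in a Jenkins “Execute shell” step; a sketch, with a placeholder email address and Jenkins' built-in BUILD_NUMBER used in the activation note:

    # Execute shell build step: merge, save, and promote the dev environment
    akamai pl -p gokulpipe1 merge dev
    akamai pl -p gokulpipe1 save dev
    akamai pl -p gokulpipe1 promote -n staging -e you@example.com -m "Promoted by Jenkins build $BUILD_NUMBER" dev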

Lastly, you can set up Jenkins to send out an email notification to the user once the configuration activation is complete.

[Screenshot: Jenkins Post-build Actions]

Optionally, once the Git and Jenkins setup is complete, you can add local files to Git using the commands below to ensure that all local files are synced to the Git repo:

git add --all

git status

[Screenshot: git add and git status output]
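The commands above only stage and check the files; to actually publish them to the remote repo, you would follow up with a commit and push, for example:

    git commit -m "Sync pipeline files"
    git push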

Another option is to enter the command below, which updates the configuration to Version 2 by pulling data from Version 1, as shown in the UI screen that follows:

akamai property update dev.gokulpipe1 --file /Users/gethilka/Desktop/AkamaiPipeline/gokulpipe1/dist/dev.gokulpipe1.papi.json 

[Screenshot: Property Manager UI showing the updated version]

The automation is complete! Here is an activation run from the Jenkins console via a build, and the corresponding UI result (Version 3 is highlighted to show the result of the Jenkins build):

[Screenshots: Jenkins build console and Property Manager UI with Version 3 highlighted]

If you’re interested in setting up a Jenkins Bot, visit the following link: https://apphub.webex.com/teams/applications/jenkins-8938 

Here’s the automated email notification you will receive upon successful activation:

[Screenshot: automated activation email notification]

Congratulations! In this example, we have successfully moved Akamai CLI commands from running on a local user's laptop/desktop to automatically executing in the cloud with the help of GitHub/BitBucket and Jenkins.

For additional information on this topic, visit:

https://developer.akamai.com/blog/2020/03/26/manage-akamai-configs-across-globe

https://developer.akamai.com/blog/washington-post-akamai-as-code

About the Author


Gokul Ethilkandy is a Technical Project Manager at Akamai, where he helps the largest companies on the internet run fast and secure apps by leveraging web performance, security, and DevOps best practices. Gokul has been an advocate for DevOps since its inception and has shared his deep knowledge of DevOps practices with teammates, customers, and audiences. In 2018, he gave a talk at the SAP UI5 Conference called ‘Introduction to Content Delivery Networks.’ His life’s motto is: stay calm and make life simple. In his free time, he enjoys trekking and volunteering for non-profits in the areas of social service and green initiatives.