DataStream 2 Configuration API v1

Create and configure streams to collect transaction data from the edge and send logs to destinations of your choice.

Overview

Akamai constantly gathers log entries from thousands of edge servers around the world. You can use the DataStream 2 Configuration API to capture these logs and deliver them to a destination of your choice at low latency.

The DataStream 2 API lets you create and manage stream configurations. You can choose data set fields to monitor in all request-response cycles at the edge. Configure your streams to collect data and send log files to custom destinations for analytics.

This API offers a programmatic alternative to the features of DataStream 2 available in Akamai Control Center.

The API limits the scope of the returned data to the user's account, its permissions, and access to business objects associated with the user and account.

DataStream 2 is optimized for high-volume raw data delivery. We recommend using raw log data for basic traffic analysis and monitoring CDN health. In the event of issues related to third-party destinations, such as latency or connection problems, data is lost after three failed retries to connect. Take these limitations into account before using data served on your stream for audit, compliance, and billing purposes. See the recommended Use cases in the DataStream 2 user guide.

You can leverage the log collection that DataStream 2 provides with supported delivery products, such as Adaptive Media Delivery and Download Delivery.

Get started

To configure this API for the first time:

  • Contact your Akamai representative to enable the DataStream module for your account.

  • Review Get Started with APIs for details on how to set up client tokens to access any Akamai API. These tokens appear as custom hostnames that look like this: https://akzz-XXXXXXXXXXXXXXXX-XXXXXXXXXXXXXXXX.luna.akamaiapis.net.

  • To enable this API, choose the API service called Datastream and set the access level to READ-WRITE.

  • To start collecting and streaming logs using this API, enable the DataStream behavior in Property Manager.

  • To successfully run this API, you may need to access other APIs, such as the Property Manager API, Identity Management: User Administration API, and Contracts API.

API concepts

This section provides a road map of all the conceptual objects you deal with when interacting with the DataStream 2 Configuration API. You can follow the links to learn more about related objects.

  • Groups. Each account features a hierarchy of groups, which control access to properties. Using either Control Center or the Identity Management: User Administration API, account administrators can assign properties to specific groups, each with its own set of users and accompanying roles. Your access to any given property depends on the role set for you in its group. Typically, you need a group identifier to create a stream configuration. See the Group object.

  • Contracts. Each account features one or more contracts, each of which has a fixed term of service during which specified Akamai products and modules are active. Typically, you need a contract identifier to create a stream configuration. Get and store the contract ID from your account—for example, using Contracts API.

  • Products. Each contract enables one or more products, each of which allows you to deploy web properties on the Akamai edge network and receive support from Akamai Professional Services. Products allow you to create new properties, CP codes, and edge hostnames. They also determine the baseline set of a property’s rule behaviors. Only some products enable log collection. Typically, you need to know a product name to create a stream configuration. Get and store the name from your account—for example, using Contracts API. See the Product object.

  • Properties. A property, also referred to as a configuration, lets you control how edge servers respond to various kinds of requests to your assets. You can manage them from Property Manager or the Property Manager API. Properties apply rules to a set of hostnames, and you can only apply one property at a time to any given hostname. Each property is assigned to a product, which determines which rule behaviors you can use. Streams let you monitor the traffic served by a property. Also, you can associate a property with a single stream. See the Property object.

  • Connectors. A connector, also known as a destination, represents a third-party configuration where you can send your stream’s log files. DataStream 2 supports these connectors: Amazon S3, Datadog, Azure Storage, and Splunk.

  • Streams. A stream collects, bundles, and delivers raw request log records to a chosen destination at selected time windows. It lets you control the data set fields you monitor in your logs, the order of these fields’ log lines, and the delivery time frames. You can update a stream through versioning any time you want to change the properties it monitors or the data set fields it collects. A stream can monitor up to three properties that aren’t part of any other streams. Note that RAW_LOGS is the only stream type currently available. See the StreamConfiguration and StreamVersion objects.

  • Templates. Each product provides a number of pre-defined sets of data called templates that you can monitor in a stream. You can configure data set fields for your stream to collect data in all request-response cycles at the edge and send logs to a destination. If needed, you can also add custom data to the log records. Note that EDGE_LOGS is the only template currently available. See the Dataset object.

  • Activation history. You can activate and deactivate the latest version of a stream at any time. Activation history lets you track changes in activation status across all versions of a stream. See the ActivationHistory object.

Workflow

In DataStream 2, you can activate a stream upon creating it, or save the configuration and activate it at any time. Both approaches require getting the details necessary to configure a stream, including product and group IDs, the destination for streaming logs, and the data set fields you want to monitor.

Get stream details

Before creating a new stream or updating an existing one, use this workflow to get the required details:

  1. Run the List delivery products and associated groups operation to determine the group and the product that you want to create a stream for.

  2. Run the List properties operation to determine the properties within the selected product that you want to monitor in the stream. Note that running this request returns only active properties. To list a property, activate it on the production network in Property Manager first. See Activate a property.

  3. Run the List connector types operation to determine the destination where you want the stream to deliver log files. Keep the destination details for configuration.

  4. Run the List data set fields operation to determine the data set fields that you want the stream to log.
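Assuming a client that already handles Akamai EdgeGrid authentication, the four discovery steps above resolve to these endpoint paths (a sketch; the helper function names are hypothetical, while the paths come from the API summary below):

```python
# Endpoint paths for the "Get stream details" workflow. The base path is
# from this API's summary table; productId, groupId, and templateName are
# values you obtain from earlier calls.

BASE = "/datastream-config-api/v1/log"

def products_path():
    # Step 1: list delivery products and associated groups
    return f"{BASE}/products"

def properties_path(product_id, group_id):
    # Step 2: list active properties for a product within a group
    return f"{BASE}/properties/product/{product_id}/group/{group_id}"

def connectors_path():
    # Step 3: list available connector types (destinations)
    return f"{BASE}/connectors"

def datasets_path(template_name="EDGE_LOGS"):
    # Step 4: list data set fields; EDGE_LOGS is the only template available
    return f"{BASE}/datasets/template/{template_name}"

print(properties_path("Download_Delivery", 12345))
# /datastream-config-api/v1/log/properties/product/Download_Delivery/group/12345
```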

Create and activate a stream

If you want to activate your stream on making the POST or PUT request, use this basic workflow:

  1. Run the Create a stream operation to create and configure a stream or the Edit a stream operation to update a stream. Make sure you set activateNow to true.

  2. Enable the DataStream behavior in Property Manager to start collecting and streaming logs.

  3. Optionally, set the sample rate in the DataStream behavior to control the percentage of data you want to collect and stream.

Create and save a stream configuration

If you want to save your stream configuration and activate it later, use this basic workflow:

  1. Run the Create a stream operation to create and configure a stream or the Edit a stream operation to update a stream. Make sure you set activateNow to false.

  2. When you decide the configuration is ready for activation, run the Activate a stream operation.

  3. Enable the DataStream behavior in Property Manager to start collecting and streaming logs.

  4. Optionally, set the sample rate in the DataStream behavior to control the percentage of data you want to collect and stream.
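For the save-and-activate-later workflow, the request body sets activateNow to false. Here's a minimal sketch of such a payload; apart from activateNow, datasetFieldIds, the RAW_LOGS stream type, and the EDGE_LOGS template, which this document describes, the member names are assumptions — see the StreamConfiguration object for the actual schema:

```python
import json

# Sketch of a "Create a stream" request body that saves the configuration
# without activating it. Member names marked "hypothetical" are assumptions,
# not confirmed schema.
stream_config = {
    "streamName": "example-stream",   # hypothetical member name
    "groupId": 67377,
    "contractId": "2-FGHIJ",
    "propertyIds": [382631],          # hypothetical member name
    "streamType": "RAW_LOGS",         # the only stream type available
    "templateName": "EDGE_LOGS",      # the only template available
    "datasetFieldIds": [1000, 1002, 1005],
    "activateNow": False,             # save now, activate later
}

body = json.dumps(stream_config)
```

Once the configuration is ready, running the Activate a stream operation activates its latest version.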

Next, you can configure these optional behaviors and criteria to have better control over your stream:

  • Custom log fields. Enable to log any of these data set fields: User-Agent, Accept-Language, Cookie, Referer, X-Forwarded-For or Custom field. See Custom log field.

  • Criteria. In each property that’s part of the stream, you can configure the DataStream behavior in a custom rule to log requests matching your criteria. This lets you filter by hostnames, paths, IP addresses, or other criteria. See Criteria in the Property Manager API.

Version management

Every time you edit a stream, you create a version. This lets you quickly adapt your existing streams to collect logs for different properties, modify data set parameters they monitor, or change destinations where they send log files.

Each version that you activate becomes a default version for a stream with the same activation status as its base version. For example, when editing version 1 of an active stream, you create an active version 2 that replaces the previous version on the production network. Conversely, by editing version 1 of an inactive stream, you create an inactive version 2 of this stream that overwrites the previous version.

You can view and compare all configuration versions within a stream, but you can only manage the activation status of the latest version. You also can’t revert a stream to any previous version. See View a stream history.

Data set parameters

You can configure a data stream to collect these parameters. To get a list of available parameters, you can run the List data set fields operation.

Data set field ID Field name Data type Description
1000 CP code string The Content Provider code associated with the request.
1002 Request ID string The identifier of the request.
1100 Request time number The time when the edge server accepted the request from the client.
1005 Bytes number The content bytes served in the response body. For HTTP/2, this includes overhead bytes.
1006 Client IP string The IPv4 or IPv6 address of the requesting client.
1008 HTTP status code number The HTTP status code of the response.
1009 Scheme string The scheme of the request-response cycle, either HTTP or HTTPS.
1011 Request host string The value of the Host header in the request.
1012 Request method string The HTTP method of the request.
1013 Request path string The path to a resource in the request, excluding query parameters.
1014 Request port number The client TCP port number of the requested service.
1015 Response Content-Length number The size of the entity-body in bytes returned to the client.
1016 Response Content-Type string The type of the content returned to the client.
1017 User-Agent string The URI-encoded user agent making the request.
1019 Accept-Language string The list of languages acceptable in the response.
1023 Cookie string A list of HTTP cookies previously sent by the server with the Set-Cookie header.
1031 Range string The part of an entity that the server is requested to return.
1032 Referer string The address of the resource from which the requested URL was followed.
1037 X-Forwarded-For string The originating IP address of a client connecting to a web server through an HTTP proxy or load balancer.
1033 Request end time number The time it takes the edge server to fully read the request.
1068 Error code string A string describing the problem with serving the request.
1101 Total bytes number The total bytes served in the response, including content and HTTP overhead.
1102 Turn around time number The time in milliseconds from when the edge server receives the last byte of the request to when it sends the first bytes of the response.
1103 Transfer time number The time in milliseconds from when the edge server is ready to send the first byte of the response to when the last byte reaches the kernel.
1066 Edge IP string The IP address of the edge server that handled the response to the client. This is useful when resolving issues with your account representative.
2001 TLS overhead time number The time in milliseconds between when the edge server accepts the connection and the completion of the SSL handshake.
2002 TLS version number The protocol of the TLS handshake, either TLSv1.2 or TLSv1.3.
2003 Object size number The size of the object, excluding HTTP response headers.
2004 Uncompressed size number The size of the uncompressed object, if zipped before sending to the client.
2005 Max-Age number The time in seconds that the object is valid for positive cache responses.
2006 Overhead bytes number TCP overhead in bytes for the request and response.
2007 DNS lookup time number The time in seconds between the start of the request and the completion of the DNS lookup, if one was required. For cached IP addresses, this value is zero.
2009 Query string string The query string in the incoming URL from the client. To monitor this parameter in your logs, you need to update your property configuration to set the Cache Key Query Parameters behavior to include all parameters.
2010 Cache status number The request was served from the edge cache or peer (1), or forwarded to parent or origin (0).
1082 Custom field N/A The data specified in the Custom Log Field of the Log Requests Details that you want to receive in the stream. See Custom field.

Custom log field

DataStream 2 lets you append additional data fields such as headers, cookies, or any performance data to each log line. The value of the data can be up to 40 bytes and is typically based on a dynamically generated built-in system variable. To include a custom field in your stream, you need to define it in the Log Request Details behavior in Property Manager or the Property Manager API.

Follow these steps to add a custom field to your stream:

  1. Add or edit the Log Request Details behavior in the property configurations for which you want to collect a custom log field. Configure the customLogField to log a custom field.

  2. Configure your stream’s datasetFieldIds array to include the identifier of a custom data field. See Create a stream or Edit a stream.

Dynamic time variables

DataStream 2 lets you use dynamic time variables in folder paths where you store logs and names of log files that you upload to your Amazon S3 and Azure Storage destinations.

Use {} to enter a dynamic variable. These are supported dynamic variables:

  • %Y for a year. For example, 2021.
  • %m for a month (01–12). For example, 03.
  • %d for a day (01–31). For example, 31.
  • %H for an hour (00–23). For example, 15.

You can combine static values and dynamic variables in a string of up to 255 characters in the path member to point to the folder path where you want to store logs. On sending a log file to this path, the system resolves dynamic variables into the current date, time, and hour in the UTC standard. Multiple dynamic variables separated by / within one {} create separate folders. For example, { %Y/%m/%d/ } creates the nested folders 2020/10/05. Multiple variables joined without a separator create one folder. For example, { %Y }{ %m }{ %d } creates a 20201005 folder.

Here are examples of valid paths in connector configurations and the folder paths created in destinations:

Path Folder path
logs/{ %Y / %m / %d } logs/2022/10/27
{ %m }-logs/diagnostics 05-logs/diagnostics
diagnostics/{ %Y }/{ %m }/{ %d }{ %H }/ diagnostics/2022/11/0516
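The resolution described above can be sketched with strftime. This is an approximation of the documented behavior, not Akamai's implementation; in particular, stripping whitespace inside the braces is an assumption based on the examples in the table:

```python
import re
from datetime import datetime, timezone

def resolve_path(path, now=None):
    """Resolve {} regions in a folder path into UTC date/time values.

    Inside each {} region, %Y, %m, %d, and %H are expanded via strftime.
    Whitespace inside the braces is dropped (an assumption inferred from
    the examples, where "{ %Y / %m / %d }" yields "2022/10/27").
    """
    now = now or datetime.now(timezone.utc)
    return re.sub(r"\{(.*?)\}",
                  lambda m: now.strftime(m.group(1).replace(" ", "")),
                  path)

ts = datetime(2022, 10, 27, 15, tzinfo=timezone.utc)
print(resolve_path("logs/{ %Y/%m/%d }", ts))   # logs/2022/10/27
print(resolve_path("{ %Y }{ %m }{ %d }", ts))  # 20221027
```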

Log files that the system uploads to your destination follow this naming pattern: uploadFilePrefix-{random-string}-{epoch-timestamp}-{random-string}-uploadFileSuffix. You can customize the uploadFilePrefix and uploadFileSuffix values of these files.

  • You can use static values and dynamic variables in a string of up to 200 characters in uploadFilePrefix names of log files that you upload to destinations. On sending a log file, the system resolves dynamic variables into the current date, time, and hour. You can use multiple dynamic values separated by -, _, or no separator inside or outside the {} regions. Filename prefixes don’t allow . characters, as using them may result in errors and data loss. If unspecified, the uploadFilePrefix value defaults to ak.

  • You can use static values in a string of up to 10 characters in uploadFileSuffix names of log files that you upload to destinations. Filename suffixes don’t allow dynamic values, and ., /, %, ? characters, as using them may result in errors and data loss. If unspecified, the uploadFileSuffix value defaults to ds.

Here are examples of valid prefix and suffix names in log files:

Prefix Suffix Filename
{ %Y }-{ %m }-{ %d } akam 2022-10-27-rps79rkvx-1666884947-dkmzsi6z8-akam
diagnostics{ %Y-%m-%d } logs diagnostics2022-12-01-8ds3lufkh-1669908947-m1onxoa16-logs
{ %m }-diagnostics delivery 12-diagnostics-dk4j0sh3m-1669856400-9kv08v9oy-delivery
upload-{ %m_%d_%H }-file data upload-04_23_18-file-gao0pip6y-1650736800-981bz2ipd-data
Unspecified Unspecified ak-ae47rr5a8-1650736800-ae47rg6hu-ds
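The naming pattern and character rules above can be sketched as follows. The random-string tokens and the epoch timestamp are stand-ins for values the system generates; the validation mirrors the documented limits:

```python
def upload_filename(prefix="ak", suffix="ds",
                    token_a="rps79rkvx", epoch=1666884947,
                    token_b="dkmzsi6z8"):
    """Assemble the documented upload filename pattern:
    prefix-{random-string}-{epoch-timestamp}-{random-string}-suffix.

    token_a, epoch, and token_b stand in for system-generated values.
    Defaults and character rules follow this document.
    """
    if len(prefix) > 200 or "." in prefix:
        raise ValueError("prefix: up to 200 chars, '.' not allowed")
    if len(suffix) > 10 or any(c in suffix for c in "./%?"):
        raise ValueError("suffix: up to 10 chars; '.', '/', '%', '?' not allowed")
    return f"{prefix}-{token_a}-{epoch}-{token_b}-{suffix}"

print(upload_filename())  # ak-rps79rkvx-1666884947-dkmzsi6z8-ds
```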

Rate limiting

DataStream 2 limits each client to 20 requests per minute. Exceeding this limit results in a 429 error response. Consider this when calling successive operations as part of a loop.

All responses specify these rate limit headers:

  • X-RateLimit-Limit: The maximum number of tokens allowed.

  • X-RateLimit-Remaining: The number of tokens remaining. Unless further requests reduce it, this value gradually increments until it reaches the X-RateLimit-Limit.

  • X-RateLimit-Next: If the X-RateLimit-Remaining has reached 0, this ISO 8601 timestamp indicates when you can next make an additional request.
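A client looping over operations can honor these headers by waiting until the X-RateLimit-Next timestamp after a 429. A minimal sketch, assuming the timestamp is a standard ISO 8601 string:

```python
from datetime import datetime, timezone

def seconds_until_next(headers, now=None):
    """Return how long to wait before retrying after a 429, based on the
    ISO 8601 X-RateLimit-Next header. A sketch; a production client should
    also check X-RateLimit-Remaining proactively to avoid 429s entirely."""
    now = now or datetime.now(timezone.utc)
    nxt = headers.get("X-RateLimit-Next")
    if not nxt:
        return 0.0
    # fromisoformat() before Python 3.11 doesn't accept a trailing 'Z'
    when = datetime.fromisoformat(nxt.replace("Z", "+00:00"))
    return max(0.0, (when - now).total_seconds())

hdrs = {"X-RateLimit-Limit": "20", "X-RateLimit-Remaining": "0",
        "X-RateLimit-Next": "2022-10-27T15:00:30+00:00"}
now = datetime(2022, 10, 27, 15, 0, 0, tzinfo=timezone.utc)
print(seconds_until_next(hdrs, now))  # 30.0
```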

Resources

This section provides details on the DataStream 2 API’s various operations and parameters.

API summary

Download the RAML descriptors for this API.

Operation Method Endpoint
List properties GET /datastream-config-api/v1/log/properties/product/{productId}/group/{groupId}
List connector types GET /datastream-config-api/v1/log/connectors
List data set fields GET /datastream-config-api/v1/log/datasets/template/{templateName}
List groups GET /datastream-config-api/v1/log/groups
List delivery products and associated groups GET /datastream-config-api/v1/log/products
List stream types GET /datastream-config-api/v1/log/streamTypes
List streams GET /datastream-config-api/v1/log/streams{?groupId}
Create a stream POST /datastream-config-api/v1/log/streams
Get a stream GET /datastream-config-api/v1/log/streams/{streamId}{?version}
Edit a stream PUT /datastream-config-api/v1/log/streams/{streamId}
Delete a stream DELETE /datastream-config-api/v1/log/streams/{streamId}
View activation history GET /datastream-config-api/v1/log/streams/{streamId}/activationHistory
Activate a stream PUT /datastream-config-api/v1/log/streams/{streamId}/activate
Deactivate a stream PUT /datastream-config-api/v1/log/streams/{streamId}/deactivate
View history GET /datastream-config-api/v1/log/streams/{streamId}/history

List properties

Returns properties within a group that are active on the production network for a specific product. Run this operation to get and store the propertyId and propertyName values for the Create a stream and Edit a stream operations.

GET /datastream-config-api/v1/log/properties/product/{productId}/group/{groupId}

Sample: /datastream-config-api/v1/log/properties/product/Download_Delivery/group/12345

Parameter Type Sample Description
URL path parameters
productId String Download_Delivery Identifies the product.
groupId Integer 12345 Uniquely identifies the group that can access the product.

Status 200 application/json

Object type: Property

Download schema: Properties.json

Response body:

[
    {
        "propertyId": 382631,
        "propertyName": "customp.akamai.com"
    },
    {
        "propertyId": 347459,
        "propertyName": "example.com"
    },
    {
        "propertyId": 389150,
        "propertyName": "eip.yjeong.com"
    },
    {
        "propertyId": 389178,
        "propertyName": "eip.yjeong3.com"
    },
    {
        "propertyId": 398541,
        "propertyName": "gomez-eip"
    },
    {
        "propertyId": 349773,
        "propertyName": "eip1.com"
    }
]
  1. Run the List delivery products operation to find the relevant productId and its associated groupId.

  2. Make a GET request to /datastream-config-api/v1/log/properties/product/{productId}/group/{groupId}.

  3. From the result list, you can choose properties that you can enable log collection for in a stream configuration. See the Create a stream or Edit a stream operations.
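Step 3 amounts to picking propertyId values out of the returned array. For example, using a shortened version of the sample response above:

```python
import json

# Parse a List properties response and collect the property IDs to enable
# log collection for (sample values from the response body above).
response_body = json.loads("""
[
    {"propertyId": 382631, "propertyName": "customp.akamai.com"},
    {"propertyId": 347459, "propertyName": "example.com"}
]
""")

property_ids = [p["propertyId"] for p in response_body]
print(property_ids)  # [382631, 347459]
```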

List connector types

Returns available connector types. You can use one of the connector types as a destination for log delivery in a stream configuration. See the Create a stream or Edit a stream operations.

GET /datastream-config-api/v1/log/connectors

Status 200 application/json

Object type: ConnectorType

Download schema: ConnectorTypes.json

Response body:

[
    {
        "connectorType": "AZURE",
        "connectorTypeName": "Azure Storage"
    },
    {
        "connectorType": "S3",
        "connectorTypeName": "S3"
    },
    {
        "connectorType": "DATADOG",
        "connectorTypeName": "Datadog"
    },
    {
        "connectorType": "SPLUNK",
        "connectorTypeName": "Splunk"
    }
]

List data set fields

Returns groups of data set fields available in the template. You can choose the data set fields that you want to monitor in your logs in a stream configuration. See the Create a stream or Edit a stream operations. Note that EDGE_LOGS is the only templateName value currently available.

GET /datastream-config-api/v1/log/datasets/template/{templateName}

Sample: /datastream-config-api/v1/log/datasets/template/EDGE_LOGS

Parameter Type Sample Description
URL path parameters
templateName Enumeration EDGE_LOGS The name of the data set template that you want to use in your stream configuration. Currently, EDGE_LOGS is the only data set template available.

Status 200 application/json

Object type: Dataset

Download schema: Datasets.json

Response body:

[
    {
        "datasetGroupName": "Log information",
        "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
        "datasetFields": [
            {
                "datasetFieldId": 1000,
                "datasetFieldName": "CP Code",
                "datasetFieldDescription": "Content Provider Code associated with Request"
            },
            {
                "datasetFieldId": 1002,
                "datasetFieldName": "Request ID",
                "datasetFieldDescription": "The request identifier associated with request"
            },
            {
                "datasetFieldId": 1100,
                "datasetFieldName": "Request Time",
                "datasetFieldDescription": "Start time of the request"
            }
        ]
    },
    {
        "datasetGroupName": "Message exchange data",
        "datasetGroupDescription": "Contains fields representing the exchange of data between Akamai & end user",
        "datasetFields": [
            {
                "datasetFieldId": 1005,
                "datasetFieldName": "Bytes",
                "datasetFieldDescription": "The content bytes served in the client response"
            },
            {
                "datasetFieldId": 1006,
                "datasetFieldName": "Client IP",
                "datasetFieldDescription": "The IP address of the requesting client"
            },
            {
                "datasetFieldId": 1008,
                "datasetFieldName": "HTTP Status Codes",
                "datasetFieldDescription": "The HTTP Response status sent to the client"
            },
            {
                "datasetFieldId": 1009,
                "datasetFieldName": "Protocol Type",
                "datasetFieldDescription": "The protocol of the transaction being monitored. Currently HTTP or HTTPS."
            },
            {
                "datasetFieldId": 1011,
                "datasetFieldName": "Request Host",
                "datasetFieldDescription": "The value of the Host header of the incoming client request"
            },
            {
                "datasetFieldId": 1012,
                "datasetFieldName": "Request Method",
                "datasetFieldDescription": "The method of the incoming request - assuming an HTTP request. For example: GET, POST, PUT, and HEAD"
            },
            {
                "datasetFieldId": 1013,
                "datasetFieldName": "Request Path",
                "datasetFieldDescription": "The path used in the incoming URL from the client, not including query strings"
            },
            {
                "datasetFieldId": 1014,
                "datasetFieldName": "Request Port",
                "datasetFieldDescription": "The port number of the incoming client request"
            },
            {
                "datasetFieldId": 1015,
                "datasetFieldName": "Response Content Length",
                "datasetFieldDescription": "The value of the Content-Length header in the client response"
            },
            {
                "datasetFieldId": 1016,
                "datasetFieldName": "Response Content Type",
                "datasetFieldDescription": "The value of the Content-Type header in the client request"
            },
            {
                "datasetFieldId": 1017,
                "datasetFieldName": "User-Agent",
                "datasetFieldDescription": "The value of the User-Agent header in the client request"
            },
            {
                "datasetFieldId": 1101,
                "datasetFieldName": "Total Bytes",
                "datasetFieldDescription": "The total bytes served in client response including content & HTTP overhead"
            }
        ]
    },
    {
        "datasetGroupName": "Request header data",
        "datasetGroupDescription": "Contains fields representing the information sent as part of request headers to Akamai",
        "datasetFields": [
            {
                "datasetFieldId": 1019,
                "datasetFieldName": "Accept-Language",
                "datasetFieldDescription": "Provides a list of acceptable human languages for response. For example, American English is en-US"
            },
            {
                "datasetFieldId": 1023,
                "datasetFieldName": "Cookie",
                "datasetFieldDescription": "Lists the HTTP cookie previously sent by the server in the Set-Cookie "
            },
            {
                "datasetFieldId": 1031,
                "datasetFieldName": "Range",
                "datasetFieldDescription": "Requests a specific part of an entity by providing a single byte range or a set of byte ranges. Bytes are numbered from 0. "
            },
            {
                "datasetFieldId": 1032,
                "datasetFieldName": "Referer",
                "datasetFieldDescription": "Lists the resource from which the requested URL was obtained"
            },
            {
                "datasetFieldId": 1037,
                "datasetFieldName": "X-Forwarded-For",
                "datasetFieldDescription": "Identifies the originating IP address of a client connecting to a web server through an HTTP proxy or load balancer"
            }
        ]
    },
    {
        "datasetGroupName": "Network performance data",
        "datasetGroupDescription": "Contains fields representing the performance/troubleshooting metrics",
        "datasetFields": [
            {
                "datasetFieldId": 1033,
                "datasetFieldName": "Request Time",
                "datasetFieldDescription": "Provides the time of the request"
            },
            {
                "datasetFieldId": 1066,
                "datasetFieldName": "Edge IP",
                "datasetFieldDescription": "The IP address of the edge server that served the response to the client. This is useful when resolving issues with your account representative"
            },
            {
                "datasetFieldId": 1068,
                "datasetFieldName": "Error Code R14",
                "datasetFieldDescription": "If there's an error serving the request a string indicating the problem is logged here."
            },
            {
                "datasetFieldId": 1102,
                "datasetFieldName": "Turn Around Time",
                "datasetFieldDescription": "The time in milliseconds between receipt of the end of the request headers and when the first byte of the reply is written to the client socket"
            },
            {
                "datasetFieldId": 1103,
                "datasetFieldName": "Transfer Time",
                "datasetFieldDescription": "The time in milliseconds it took to send the response to the client, measured from the time Akamai Edge was ready to send the first byte to when it sent the last byte."
            }
        ]
    },
    {
        "datasetGroupName": "Other",
        "datasetFields": [
            {
                "datasetFieldId": 1082,
                "datasetFieldName": "Custom Field",
                "datasetFieldDescription": "contains the value specified by the metadata tag reporting:lds.custom-field. Note that, the tag can (and generally would) take an extracted variable as its content, so the value of this field isn't fixed."
            }
        ]
    }
]
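Since a stream configuration's datasetFieldIds member takes a flat list of IDs, a common step after this call is flattening the grouped response. A sketch, using abbreviated sample data from the response above:

```python
# Flatten a List data set fields response into the flat list of field IDs
# that a stream configuration's datasetFieldIds member expects.
datasets = [
    {"datasetGroupName": "Log information",
     "datasetFields": [{"datasetFieldId": 1000}, {"datasetFieldId": 1002}]},
    {"datasetGroupName": "Message exchange data",
     "datasetFields": [{"datasetFieldId": 1005}, {"datasetFieldId": 1006}]},
]

dataset_field_ids = [f["datasetFieldId"]
                     for group in datasets
                     for f in group["datasetFields"]]
print(dataset_field_ids)  # [1000, 1002, 1005, 1006]
```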

List groups

Returns groups within the context of your account. You can view and create stream configurations in properties that these groups have access to.

GET /datastream-config-api/v1/log/groups

Status 200 application/json

Object type: Group

Download schema: Groups.json

Response body:

[
    {
        "parentGroupId": 21484,
        "groupId": 67377,
        "groupName": "Default Group",
        "description": null,
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "2-FGHIJ"
        ],
        "childGroupIds": [],
        "childGroups": []
    },
    {
        "parentGroupId": null,
        "groupId": 143553,
        "groupName": "Group-4",
        "description": null,
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "2-FGHIJ"
        ],
        "childGroupIds": [],
        "childGroups": []
    },
    {
        "parentGroupId": null,
        "groupId": 21483,
        "groupName": "Group-1",
        "description": "Group-1",
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "2-FGHIJ"
        ],
        "childGroupIds": [
            67377,
            21484
        ],
        "childGroups": [
            67377,
            21484
        ]
    },
    {
        "parentGroupId": 21483,
        "groupId": 21484,
        "groupName": "Group 2",
        "description": "Group 2",
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "2-FGHIJ"
        ],
        "childGroupIds": [
            67377
        ],
        "childGroups": [
            67377
        ]
    },
    {
        "parentGroupId": null,
        "groupId": 143215,
        "groupName": "Group 3",
        "description": null,
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "3-KLMNO"
        ],
        "childGroupIds": [],
        "childGroups": []
    },
    {
        "parentGroupId": null,
        "groupId": 21485,
        "groupName": "Backup Group",
        "description": "Backup Group",
        "accountId": "1-ABCDE",
        "enabled": true,
        "contractIds": [
            "3-KLMNO"
        ],
        "childGroupIds": [],
        "childGroups": []
    }
]

List delivery products and associated groups

Returns products that you can collect logs for and the associated groups that have access to these products. It also provides data set templates that you can use with a product. You can use the relevant productId and its associated groupId to find the properties that you can monitor in your stream. See List properties.

GET /datastream-config-api/v1/log/products

Status 200 application/json

Download schema: Products.json

Response body:

[
    {
        "productId": "Adaptive_Media_Delivery",
        "productName": "Adaptive Media Delivery",
        "groups": [
            {
                "groupId": 21483,
                "groupName": "DefaultGroup",
                "parentGroupId": null,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            },
            {
                "groupId": 21484,
                "groupName": "Group-1",
                "parentGroupId": 21483,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            },
            {
                "groupId": 67377,
                "groupName": "Group-2",
                "parentGroupId": 21484,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            }
        ],
        "templates": [
            {
                "templateName": "EDGE_LOGS"
            }
        ]
    },
    {
        "productId": "Download_Delivery",
        "productName": "Download Delivery",
        "groups": [
            {
                "groupId": 21483,
                "groupName": "DefaultGroup",
                "parentGroupId": null,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            },
            {
                "groupId": 21484,
                "groupName": "Group-1",
                "parentGroupId": 21483,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            },
            {
                "groupId": 67377,
                "groupName": "Group-2",
                "parentGroupId": 21484,
                "contractIds": [
                    "2-FGHIJ"
                ],
                "childGroups": []
            }
        ],
        "templates": [
            {
                "templateName": "EDGE_LOGS"
            }
        ]
    }
]
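When picking a product and group to use with List properties, it can help to invert this response into a group-to-products mapping. A minimal sketch in Python, using a hypothetical helper over the parsed response (field names as in Products.json above):

```python
def products_by_group(products):
    """Map each groupId to the productIds its group can access."""
    mapping = {}
    for product in products:
        for group in product["groups"]:
            mapping.setdefault(group["groupId"], []).append(product["productId"])
    return mapping

# Trimmed-down version of the sample response above
sample = [
    {"productId": "Adaptive_Media_Delivery",
     "groups": [{"groupId": 21483}, {"groupId": 21484}]},
    {"productId": "Download_Delivery",
     "groups": [{"groupId": 21483}]},
]

mapping = products_by_group(sample)
# mapping[21483] lists both products; mapping[21484] only Adaptive_Media_Delivery
```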

List stream types

Returns available stream types that you can configure to monitor traffic of your properties. RAW_LOGS is the only stream type currently available.

GET /datastream-config-api/v1/log/streamTypes

Status 200 application/json

Download schema: StreamTypes.json

Response body:

[
    {
        "streamTypeName": "Logs - Raw",
        "streamType": "RAW_LOGS",
        "isRaw": true
    }
]

List streams

Returns the latest versions of the stream configurations for all groups within your account. You can use the groupId parameter to view the latest versions of all configurations in a specific group.

GET /datastream-config-api/v1/log/streams{?groupId}

Sample: /datastream-config-api/v1/log/streams?groupId=21483

Parameter Type Sample Description
Optional query parameters
groupId Integer 21483 Uniquely identifies the group that can access the product.

Status 200 application/json

Object type: StreamVersion

Download schema: Streams.json

Response body:

[
    {
        "streamId": 4608,
        "streamName": "Stream1",
        "streamVersionId": 2,
        "createdBy": "johndoe",
        "createdDate": "14-07-2020 07:07:40 GMT",
        "currentVersionId": 2,
        "archived": false,
        "activationStatus": "DEACTIVATED",
        "groupId": 21483,
        "groupName": "Default Group 1-ABCDE",
        "contractId": "1-ABCDE",
        "connectors": "S3-S1",
        "streamTypeName": "Logs - Raw",
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 7050,
        "streamName": "Datasets Change",
        "streamVersionId": 1,
        "createdBy": "janesmith",
        "createdDate": "10-07-2020 12:19:02 GMT",
        "currentVersionId": 1,
        "archived": false,
        "activationStatus": "ACTIVATED",
        "groupId": 21483,
        "groupName": "DefaultGroup",
        "contractId": "2-FGHIJ",
        "connectors": "Azure Storage-DatasetsChange",
        "streamTypeName": "Logs - Raw",
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 4602,
        "streamName": "Stream2",
        "streamVersionId": 2,
        "createdBy": "johndoe",
        "createdDate": "10-07-2020 06:57:29 GMT",
        "currentVersionId": 2,
        "archived": false,
        "activationStatus": "ACTIVATED",
        "groupId": 21483,
        "groupName": "DefaultGroup",
        "contractId": "2-FGHIJ",
        "connectors": "Azure Storage-S2",
        "streamTypeName": "Logs - Raw",
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 5258,
        "streamName": "Stream3",
        "streamVersionId": 2,
        "createdBy": "janesmith",
        "createdDate": "08-07-2020 12:15:34 GMT",
        "currentVersionId": 2,
        "archived": false,
        "activationStatus": "ACTIVATED",
        "groupId": 21483,
        "groupName": "DefaultGroup",
        "contractId": "2-FGHIJ",
        "connectors": "Azure Storage-Stream3",
        "streamTypeName": "Logs - Raw",
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 5252,
        "streamName": "Stream4",
        "streamVersionId": 1,
        "createdBy": "johndoe",
        "createdDate": "08-07-2020 05:01:03 GMT",
        "currentVersionId": 1,
        "archived": false,
        "activationStatus": "ACTIVATED",
        "groupId": 21483,
        "groupName": "DefaultGroup",
        "contractId": "2-FGHIJ",
        "connectors": "S3-Stream4",
        "streamTypeName": "Logs - Raw",
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 46054,
        "streamName": "stream5",
        "streamVersionId": 1,
        "createdBy": "johndoe",
        "createdDate": "21-01-2021 07:53:11 GMT",
        "currentVersionId": 1,
        "archived": false,
        "activationStatus": "DEACTIVATED",
        "groupId": 18544,
        "groupName": "DefaultGroup",
        "contractId": "1-5C1778",
        "properties": [
            {
                "propertyId": 249753,
                "propertyName": "property.name"
            }
        ],
        "connectors": "S3-amazon",
        "streamTypeName": "Logs - Raw",
        "errors": [
            {
                "type": "UNEXPECTED_SYSTEM_ERROR",
                "title": "Unexpected System Error",
                "detail": "Version 1 of stream5 has failed to publish after an unexpected system error. For assistance, contact technical support."
            }
        ]
    },
    {
        "streamId": 50646,
        "streamName": "stream6",
        "streamVersionId": 2,
        "createdBy": "johndoe",
        "createdDate": "02-02-2021 08:08:47 GMT",
        "currentVersionId": 2,
        "archived": false,
        "activationStatus": "DEACTIVATED",
        "groupId": 18544,
        "groupName": "DefaultGroup",
        "contractId": "1-5C13O8",
        "properties": [
            {
                "propertyId": 439885,
                "propertyName": "multicdn"
            },
            {
                "propertyId": 493475,
                "propertyName": "akamaized.net"
            }
        ],
        "connectors": "Https-Https",
        "streamTypeName": "Logs - Raw",
        "errors": [
            {
                "type": "ACTIVATION_ERROR",
                "title": "Activation/Deactivation Error",
                "detail": "Version 2 of stream6 has failed to activate. For assistance, contact technical support."
            }
        ]
    }
]
  1. If you don’t already have the groupId, run the List groups operation and store the relevant value.

  2. Make a GET request to /datastream-config-api/v1/log/streams{?groupId}.
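Besides filtering server-side with the groupId query parameter, the parsed response can be filtered client-side, for example to find the active streams in a group. A minimal sketch using a hypothetical helper over the fields shown in the sample response:

```python
def streams_in_group(streams, group_id, status=None):
    """Return streamIds in a group, optionally filtered by activationStatus."""
    return [s["streamId"] for s in streams
            if s["groupId"] == group_id
            and (status is None or s["activationStatus"] == status)]

# Trimmed-down entries from the sample response above
streams = [
    {"streamId": 4608, "groupId": 21483, "activationStatus": "DEACTIVATED"},
    {"streamId": 7050, "groupId": 21483, "activationStatus": "ACTIVATED"},
    {"streamId": 46054, "groupId": 18544, "activationStatus": "DEACTIVATED"},
]

active = streams_in_group(streams, 21483, "ACTIVATED")
# active -> [7050]
```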

Create a stream

Creates a stream configuration. Within a stream configuration, you can select properties to associate with the stream, data set fields to monitor in logs, and a destination to send these logs to. Besides the log format and delivery frequency settings, you can decide whether to activate the stream when making the request or later. Note that only active streams collect and send logs to their destinations. See Activate a stream.

POST /datastream-config-api/v1/log/streams

Content-Type: application/json

Object type: StreamConfiguration

Download schema: SaveStreamRequest.json

Request body:

{
    "streamName": "DD Stream",
    "activateNow": false,
    "streamType": "RAW_LOGS",
    "productId": "Download_Delivery",
    "templateName": "EDGE_LOGS",
    "groupId": 21484,
    "contractId": "2-FGHIJ",
    "emailIds": "useremail@akamai.com",
    "propertyIds": [
        382631,
        347459
    ],
    "datasetFieldIds": [
        1002,
        1005,
        1006,
        1008,
        1009,
        1011,
        1012,
        1013,
        1014,
        1015,
        1016,
        1017,
        1101
    ],
    "config": {
        "uploadFilePrefix": "logs",
        "uploadFileSuffix": "ak",
        "delimiter": "SPACE",
        "frequency": {
            "timeInSec": 30
        }
    },
    "connectors": [
        {
            "path": "log/edgelogs",
            "connectorName": "S3Destination",
            "bucket": "datastream.akamai.com",
            "region": "ap-south-1",
            "accessKey": "AKIA6DK7TDQLVGZ3TYP1",
            "secretAccessKey": "1T2ll1H4dXWx5itGhpc7FlSbvvOvky1098nTtEMg",
            "connectorType": "S3"
        }
    ]
}

Status 202 application/json

Download schema: StreamUpdate.json

Response body:

{
    "streamVersionKey": {
        "streamId": 7050,
        "streamVersionId": 1
    }
}
  1. Get and store these values:

  2. Use the stored values to build a stream object and POST it to /datastream-config-api/v1/log/streams.

  3. If you want to activate the stream right away, set activateNow to true. Otherwise you can follow up later and run Activate a stream to launch the first version on the production network and receive logs.

  4. Get and store the streamId and streamVersionId values from the response and run the Get a stream operation to GET the object.
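The steps above can be sketched as a small payload builder. This is a hypothetical helper that assembles a StreamConfiguration body matching the sample request; the member names come from SaveStreamRequest.json, and the connector details are whatever your destination requires:

```python
def build_stream_config(stream_name, group_id, contract_id, product_id,
                        property_ids, dataset_field_ids, connectors,
                        activate_now=False, email_ids=None):
    """Assemble a StreamConfiguration request body as shown above."""
    body = {
        "streamName": stream_name,
        "activateNow": activate_now,
        "streamType": "RAW_LOGS",     # only stream type currently available
        "templateName": "EDGE_LOGS",  # only data set template currently available
        "productId": product_id,
        "groupId": group_id,
        "contractId": contract_id,
        "propertyIds": property_ids,
        "datasetFieldIds": dataset_field_ids,
        "config": {"delimiter": "SPACE",
                   "frequency": {"timeInSec": 30}},
        "connectors": connectors,
    }
    if email_ids:  # optional; omit to skip activation notifications
        body["emailIds"] = email_ids
    return body

body = build_stream_config("DD Stream", 21484, "2-FGHIJ", "Download_Delivery",
                           [382631, 347459], [1002, 1005, 1006],
                           [{"connectorType": "S3", "connectorName": "S3Destination"}])
```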

Get a stream

Returns information about any version of a stream, including details about the monitored properties, logged data set fields, and log delivery destination. If you omit the version query parameter, this operation returns the last version of the stream.

GET /datastream-config-api/v1/log/streams/{streamId}{?version}

Sample: /datastream-config-api/v1/log/streams/7050?version=1

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.
Optional query parameters
version Integer 1 Identifies the version of the stream. If omitted, the operation returns the last version of the stream.

Status 200 application/json

Object type: DetailedStreamVersion

Download schema: StreamDetail.json

Response body:

{
    "streamId": 7050,
    "streamVersionId": 2,
    "streamName": "Datasets Change",
    "productId": "Download_Delivery",
    "productName": "Download Delivery",
    "templateName": "EDGE_LOGS",
    "groupId": 21483,
    "groupName": "Default Group-1-ABCDE",
    "contractId": "1-ABCDE",
    "streamType": "RAW_LOGS",
    "activationStatus": "ACTIVATING",
    "createdBy": "johndoe",
    "createdDate": "10-07-2020 12:19:02 GMT",
    "modifiedBy": "janesmith",
    "modifiedDate": "15-07-2020 05:51:52 GMT",
    "emailIds": "useremail@akamai.com",
    "config": {
        "delimiter": "SPACE",
        "uploadFilePrefix": "logs",
        "uploadFileSuffix": "ds",
        "frequency": {
            "timeInSec": 30
        }
    },
    "datasets": [
        {
            "datasetGroupName": "Log information",
            "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
            "datasetFields": [
                {
                    "datasetFieldId": 1000,
                    "datasetFieldName": "CP Code",
                    "order": 0,
                    "datasetFieldDescription": "Content Provider Code associated with Request"
                },
                {
                    "datasetFieldId": 1002,
                    "datasetFieldName": "Request ID",
                    "order": 1,
                    "datasetFieldDescription": "The request identifier associated with request "
                },
                {
                    "datasetFieldId": 1100,
                    "datasetFieldName": "Request Time",
                    "order": 2,
                    "datasetFieldDescription": "Start time of the request"
                }
            ]
        }
    ],
    "connectors": [
        {
            "connectorId": 8600,
            "connectorType": "AZURE",
            "connectorName": "Azure",
            "path": "storage/akamai/log",
            "compressLogs": true,
            "accountName": "blobstorage",
            "containerName": "logs"
        }
    ],
    "properties": [
        {
            "propertyId": 349772,
            "propertyName": "example.com"
        }
    ]
}
  1. If you don’t already have the streamId value, run the List streams operation and store the relevant value.

  2. Make a GET request to /datastream-config-api/v1/log/streams/{streamId}.
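Because the order member of each data set field controls its position in the log line, the response can be used to reconstruct the log line layout. A minimal sketch with a hypothetical helper over the datasets structure shown above:

```python
def log_line_layout(stream_detail):
    """Return dataset field names in the order they appear in each log line."""
    fields = [field
              for group in stream_detail["datasets"]
              for field in group["datasetFields"]]
    return [f["datasetFieldName"] for f in sorted(fields, key=lambda f: f["order"])]

# Trimmed-down datasets from the sample response above
detail = {"datasets": [{"datasetFields": [
    {"datasetFieldName": "Request ID", "order": 1},
    {"datasetFieldName": "CP Code", "order": 0},
    {"datasetFieldName": "Request Time", "order": 2},
]}]}

layout = log_line_layout(detail)
# layout -> ['CP Code', 'Request ID', 'Request Time']
```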

Edit a stream

Updates the latest version of a stream. Running this operation creates a version of the stream that replaces the existing one. Note that only active streams collect and send logs to their destinations. See Version management and Activate a stream.

PUT /datastream-config-api/v1/log/streams/{streamId}

Sample: /datastream-config-api/v1/log/streams/7050

Content-Type: application/json

Object type: StreamConfiguration

Download schema: EditStreamRequest.json

Request body:

{
    "streamName": "Stream1",
    "streamType": "RAW_LOGS",
    "templateName": "EDGE_LOGS",
    "contractId": "1-ABCDE",
    "emailIds": "useremail@akamai.com",
    "propertyIds": [
        349772
    ],
    "datasetFieldIds": [
        1002,
        1100
    ],
    "config": {
        "delimiter": "SPACE",
        "uploadFilePrefix": "ak",
        "uploadFileSuffix": "ds",
        "frequency": {
            "timeInSec": 30
        }
    }
}
Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 202 application/json

Download schema: StreamUpdate.json

Response body:

{
    "streamVersionKey": {
        "streamId": 7050,
        "streamVersionId": 1
    }
}
  1. Get the stream that you want to edit. Run the Get a stream operation.

  2. If you don’t already have these values, run these operations:

  3. Optionally, run the Deactivate a stream operation. If you deactivate the stream, you stop receiving logs in your destination, and the API creates an inactive version for this stream in response to the edit request. See Version management.

  4. Build a stream object and make a PUT request to /datastream-config-api/v1/log/streams/{streamId}.

  5. Optionally, if you edited an inactive stream, run the Activate a stream operation to activate the new version on the production network.

Delete a stream

Deletes a stream. Deleting a stream means that you can’t activate this stream again, and that you stop receiving logs for the properties that this stream monitors.

DELETE /datastream-config-api/v1/log/streams/{streamId}

Sample: /datastream-config-api/v1/log/streams/7050

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 200 application/json

Download schema: DeleteStream.json

Response body:

{
    "message": "Success"
}
  1. If you don’t already have the streamId, run the List streams operation and store the relevant value.

  2. Run the Deactivate a stream operation.

  3. Make a DELETE request to /datastream-config-api/v1/log/streams/{streamId}.
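The deactivate-then-delete sequence can be wrapped in one call. This is a hypothetical helper, assuming `session` is an authenticated `requests.Session` (for example with EdgeGridAuth) and `base_url` is your account's API hostname:

```python
def delete_stream(session, base_url, stream_id):
    """Deactivate a stream, then delete it; True on a Success response."""
    # Deactivation takes effect asynchronously (about 90 minutes)
    session.put(f"{base_url}/datastream-config-api/v1/log/streams/{stream_id}/deactivate")
    resp = session.delete(f"{base_url}/datastream-config-api/v1/log/streams/{stream_id}")
    return resp.json().get("message") == "Success"
```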

View activation history

Returns a history of activation status changes for all versions of a stream.

GET /datastream-config-api/v1/log/streams/{streamId}/activationHistory

Sample: /datastream-config-api/v1/log/streams/7050/activationHistory

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 200 application/json

Object type: ActivationHistory

Download schema: ActivationHistories.json

Response body:

[
    {
        "streamId": 7050,
        "streamVersionId": 2,
        "createdBy": "user1",
        "createdDate": "16-01-2020 11:07:12 GMT",
        "isActive": false
    },
    {
        "streamId": 7050,
        "streamVersionId": 2,
        "createdBy": "user2",
        "createdDate": "16-01-2020 09:31:02 GMT",
        "isActive": true
    },
    {
        "streamId": 7050,
        "streamVersionId": 1,
        "createdBy": "user3",
        "createdDate": "16-01-2020 09:18:35 GMT",
        "isActive": false
    }
]
  1. If you don’t already have the streamId value, run the List streams operation and store the relevant value.

  2. Make a GET request to /datastream-config-api/v1/log/streams/{streamId}/activationHistory.
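The history entries aren't guaranteed to be pre-sorted by the client, so the current activation state is the isActive flag of the most recent entry. A minimal sketch, parsing the createdDate format used in the sample response:

```python
from datetime import datetime

DATE_FMT = "%d-%m-%Y %H:%M:%S GMT"  # format of createdDate values above

def current_state(history):
    """Return the isActive flag of the most recent activation history entry."""
    latest = max(history, key=lambda e: datetime.strptime(e["createdDate"], DATE_FMT))
    return latest["isActive"]

# Trimmed-down entries from the sample response above
history = [
    {"createdDate": "16-01-2020 11:07:12 GMT", "isActive": False},
    {"createdDate": "16-01-2020 09:31:02 GMT", "isActive": True},
    {"createdDate": "16-01-2020 09:18:35 GMT", "isActive": False},
]

state = current_state(history)
# state -> False (the 11:07:12 entry is the most recent)
```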

Activate a stream

Activates the latest version of a stream. Activating a stream takes approximately 90 minutes. Once a stream is active and the DataStream behavior is enabled in Property Manager, it starts collecting and sending logs to a destination. If you want to stop receiving these logs, you can deactivate a stream at any time. See the Deactivate a stream operation.

PUT /datastream-config-api/v1/log/streams/{streamId}/activate

Sample: /datastream-config-api/v1/log/streams/7050/activate

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 202 application/json

Download schema: StreamUpdate.json

Response body:

{
    "streamVersionKey": {
        "streamId": 7050,
        "streamVersionId": 1
    }
}
  1. If you don’t already have the streamId, run the List streams operation and store the relevant value.

  2. Make a PUT request to /datastream-config-api/v1/log/streams/{streamId}/activate.
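Since activation takes roughly 90 minutes, a client typically polls Get a stream until activationStatus reaches ACTIVATED. A minimal sketch with a hypothetical helper; fetch_stream stands in for the authenticated GET request and returns the parsed response:

```python
import time

def wait_until_active(fetch_stream, stream_id, poll_seconds=300, max_polls=30):
    """Poll Get a stream until activationStatus is ACTIVATED.

    fetch_stream is any callable taking a streamId and returning the
    parsed Get a stream response (in production, the HTTP call itself).
    """
    for _ in range(max_polls):
        if fetch_stream(stream_id)["activationStatus"] == "ACTIVATED":
            return True
        time.sleep(poll_seconds)
    return False
```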

Deactivate a stream

Deactivates the latest version of a stream. Deactivating a stream means that you stop receiving logs for the properties that this stream monitors. Deactivating a stream takes approximately 90 minutes. If you want to start receiving these logs again, you can activate this stream at any time. See the Activate a stream operation.

PUT /datastream-config-api/v1/log/streams/{streamId}/deactivate

Sample: /datastream-config-api/v1/log/streams/7050/deactivate

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 202 application/json

Download schema: StreamUpdate.json

Response body:

{
    "streamVersionKey": {
        "streamId": 7050,
        "streamVersionId": 1
    }
}
  1. If you don’t already have the streamId, run the List streams operation and store the relevant value.

  2. Make a PUT request to /datastream-config-api/v1/log/streams/{streamId}/deactivate.

View history

Returns detailed information about all versions of a stream. It lets you track changes across all versions of a stream, including monitored properties, logged data set fields, log delivery destinations, and activation statuses.

GET /datastream-config-api/v1/log/streams/{streamId}/history

Sample: /datastream-config-api/v1/log/streams/7050/history

Parameter Type Sample Description
URL path parameters
streamId Integer 7050 Uniquely identifies the stream.

Status 200 application/json

Object type: DetailedStreamVersion

Download schema: StreamHistories.json

Response body:

[
    {
        "streamId": 7050,
        "streamVersionId": 2,
        "streamName": "Datasets Change",
        "productId": "Download_Delivery",
        "productName": "Download Delivery",
        "templateName": "EDGE_LOGS",
        "groupId": 21483,
        "groupName": "Default Group-1-ABCDE",
        "contractId": "1-ABCDE",
        "streamType": "RAW_LOGS",
        "activationStatus": "ACTIVATING",
        "createdBy": "johndoe",
        "createdDate": "10-07-2020 12:19:02 GMT",
        "modifiedBy": "janesmith",
        "modifiedDate": "15-07-2020 05:51:52 GMT",
        "emailIds": "useremail@akamai.com",
        "config": {
            "delimiter": "SPACE",
            "uploadFilePrefix": "ak",
            "uploadFileSuffix": "ds",
            "frequency": {
                "timeInSec": 30
            }
        },
        "datasets": [
            {
                "datasetGroupName": "Log information",
                "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
                "datasetFields": [
                    {
                        "datasetFieldId": 1000,
                        "datasetFieldName": "CP Code",
                        "order": 0,
                        "datasetFieldDescription": "Content Provider Code associated with Request"
                    },
                    {
                        "datasetFieldId": 1002,
                        "datasetFieldName": "Request ID",
                        "order": 1,
                        "datasetFieldDescription": "The request identifier associated with request "
                    },
                    {
                        "datasetFieldId": 1100,
                        "datasetFieldName": "Request Time",
                        "order": 2,
                        "datasetFieldDescription": "Start time of the request"
                    }
                ]
            }
        ],
        "connectors": [
            {
                "connectorId": 8600,
                "connectorType": "AZURE",
                "connectorName": "Azure",
                "path": "storage/akamai/log",
                "compressLogs": true,
                "accountName": "blobstorage",
                "containerName": "logs"
            }
        ],
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    },
    {
        "streamId": 7050,
        "streamVersionId": 1,
        "streamName": "Datasets Change",
        "productId": "Download_Delivery",
        "productName": "Download Delivery",
        "templateName": "EDGE_LOGS",
        "groupId": 21483,
        "groupName": "Default Group-1-ABCDE",
        "contractId": "1-ABCDE",
        "streamType": "RAW_LOGS",
        "activationStatus": "ACTIVATING",
        "createdBy": "johndoe",
        "createdDate": "10-07-2020 12:19:02 GMT",
        "modifiedBy": "janesmith",
        "modifiedDate": "10-07-2020 12:19:02 GMT",
        "config": {
            "delimiter": "SPACE",
            "uploadFilePrefix": "ak",
            "uploadFileSuffix": "ds",
            "frequency": {
                "timeInSec": 30
            }
        },
        "datasets": [
            {
                "datasetGroupName": "Log information",
                "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
                "datasetFields": [
                    {
                        "datasetFieldId": 1002,
                        "datasetFieldName": "Request ID",
                        "order": 1,
                        "datasetFieldDescription": "The request identifier"
                    },
                    {
                        "datasetFieldId": 1000,
                        "datasetFieldName": "CP Code",
                        "order": 3,
                        "datasetFieldDescription": "Content Provider Code associated with the request"
                    },
                    {
                        "datasetFieldId": 1100,
                        "datasetFieldName": "Request Time",
                        "order": 4,
                        "datasetFieldDescription": "Start time of the request"
                    }
                ]
            },
            {
                "datasetGroupName": "Message exchange data",
                "datasetGroupDescription": "Contains fields representing the exchange of data between Akamai & end user",
                "datasetFields": [
                    {
                        "datasetFieldId": 1012,
                        "datasetFieldName": "Request Method",
                        "order": 0,
                        "datasetFieldDescription": "The method of the incoming request - assuming an HTTP request. For example: GET, POST, PUT, and HEAD"
                    },
                    {
                        "datasetFieldId": 1006,
                        "datasetFieldName": "Client IP",
                        "order": 2,
                        "datasetFieldDescription": "The IP address of the requesting client"
                    },
                    {
                        "datasetFieldId": 1013,
                        "datasetFieldName": "Request Path",
                        "order": 5,
                        "datasetFieldDescription": "The path used in the incoming URL from the client, not including query strings"
                    },
                    {
                        "datasetFieldId": 1014,
                        "datasetFieldName": "Request Port",
                        "order": 6,
                        "datasetFieldDescription": "The port number of the incoming client request"
                    }
                ]
            }
        ],
        "connectors": [
            {
                "connectorId": 7362,
                "connectorType": "AZURE",
                "connectorName": "Azure",
                "path": "storage/akamai/log",
                "compressLogs": true,
                "accountName": "blobstorage",
                "containerName": "logs"
            }
        ],
        "properties": [
            {
                "propertyId": 349772,
                "propertyName": "eib1.com"
            }
        ]
    }
]
  1. If you don’t already have the streamId, run the List streams operation and store the relevant value.

  2. Make a GET request to /datastream-config-api/v1/log/streams/{streamId}/history.
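To track data set changes across versions, the datasetFieldIds of any two versions in the history can be compared. A minimal sketch with a hypothetical helper over the datasets structure shown above:

```python
def dataset_field_diff(older, newer):
    """Compare datasetFieldIds between two stream versions from View history."""
    def ids(version):
        return {field["datasetFieldId"]
                for group in version["datasets"]
                for field in group["datasetFields"]}
    old_ids, new_ids = ids(older), ids(newer)
    return {"added": sorted(new_ids - old_ids),
            "removed": sorted(old_ids - new_ids)}

# Trimmed-down versions, modeled on the sample response above
v1 = {"datasets": [{"datasetFields": [
    {"datasetFieldId": 1000}, {"datasetFieldId": 1002}, {"datasetFieldId": 1012}]}]}
v2 = {"datasets": [{"datasetFields": [
    {"datasetFieldId": 1000}, {"datasetFieldId": 1002}, {"datasetFieldId": 1100}]}]}

diff = dataset_field_diff(v1, v2)
# diff -> {'added': [1100], 'removed': [1012]}
```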

Data

This section provides you with the data model for the DataStream 2 API.

Download the JSON schemas for this API.

This section’s data schema tables list membership requirements as follows:

Member is required in requests, or always present in responses, even if its value is empty or null.
Member is optional, and may be omitted in some cases.
Member is out of scope, and irrelevant to the specified interaction context. If you include the member in that context, it either triggers an error, or is ignored.

StreamConfiguration

Provides information that you need to specify when creating a stream configuration.

Download schema: SaveStreamRequest.json, EditStreamRequest.json

Sample POST request:

{
    "streamName": "DD Stream",
    "activateNow": false,
    "streamType": "RAW_LOGS",
    "productId": "Download_Delivery",
    "templateName": "EDGE_LOGS",
    "groupId": 21484,
    "contractId": "2-FGHIJ",
    "emailIds": "useremail@akamai.com",
    "propertyIds": [
        382631,
        347459
    ],
    "datasetFieldIds": [
        1002,
        1005,
        1006,
        1008,
        1009,
        1011,
        1012,
        1013,
        1014,
        1015,
        1016,
        1017,
        1101
    ],
    "config": {
        "uploadFilePrefix": "logs",
        "uploadFileSuffix": "ak",
        "delimiter": "SPACE",
        "frequency": {
            "timeInSec": 30
        }
    },
    "connectors": [
        {
            "path": "log/edgelogs",
            "connectorName": "S3Destination",
            "bucket": "datastream.akamai.com",
            "region": "ap-south-1",
            "accessKey": "AKIA6DK7TDQLVGZ3TYP1",
            "secretAccessKey": "1T2ll1H4dXWx5itGhpc7FlSbvvOvky1098nTtEMg",
            "connectorType": "S3"
        }
    ]
}

StreamConfiguration members

Member Type POST PUT Description
StreamConfiguration: Provides information that you need to specify when creating a stream configuration.
activateNow Boolean Whether to activate the stream after publishing, either true to start activation or false to leave the stream inactive after publishing. You can activate or deactivate the stream at your convenience by running the Activate a stream or Deactivate a stream operations.
config StreamConfiguration.config The configuration of log lines, names of the log files sent to a destination, and delivery frequency for these files.
connectors Either Amazon S3, Azure Storage, Datadog, or Splunk The connector configuration in the stream.
contractId String Identifies the contract that has access to the product.
datasetFieldIds Array Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines.
emailIds String A comma-delimited list of email addresses where you want to send notifications about activations and deactivations of the stream. You can omit this member and activate or deactivate the stream without notifications.
groupId Integer Identifies the group that has access to the product and this stream configuration.
productId String Identifies the product that you want to enable log collection for.
propertyIds Array Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties. You can activate a property in Property Manager.
streamName String The name of the stream.
streamType String The type of stream that you want to create. RAW_LOGS is the only possible stream type at this time.
templateName String The name of the data set template available for the product that you want to use in the stream. EDGE_LOGS is the only data set template available at this time.
StreamConfiguration.config: The configuration of log lines, names of the log files sent to a destination, and delivery frequency for these files.
delimiter Enumeration A delimiter that you want to use to separate data set fields in the log lines. SPACE is the only available delimiter at this time.
frequency StreamConfiguration.config.frequency The frequency of collecting logs from each uploader and sending these logs to a destination.
uploadFilePrefix String The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, it defaults to ak. This member supports Dynamic time variables.
uploadFileSuffix String The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, it defaults to ds.
StreamConfiguration.config.frequency: The frequency of collecting logs from each uploader and sending these logs to a destination.
timeInSec Enumeration The time in seconds after which the system bundles log lines into a file and sends it to a destination. Possible values are 30 and 60.
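The constraints in the config rows above lend themselves to a client-side pre-check before the POST or PUT request. A minimal sketch with a hypothetical validator; the limits (SPACE delimiter, 30 or 60 seconds, 200- and 10-character name limits, ak/ds defaults) come from the table:

```python
def validate_config(config):
    """Check StreamConfiguration.config against the documented constraints."""
    errors = []
    if config.get("delimiter") != "SPACE":
        errors.append("delimiter: SPACE is the only available delimiter")
    if config.get("frequency", {}).get("timeInSec") not in (30, 60):
        errors.append("frequency.timeInSec: must be 30 or 60")
    if len(config.get("uploadFilePrefix", "ak")) > 200:  # defaults to ak
        errors.append("uploadFilePrefix: at most 200 characters")
    if len(config.get("uploadFileSuffix", "ds")) > 10:   # defaults to ds
        errors.append("uploadFileSuffix: at most 10 characters")
    return errors

problems = validate_config({"delimiter": "SPACE", "frequency": {"timeInSec": 30}})
# problems -> [] (valid configuration)
```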

DetailedStreamVersion

Provides detailed information about the latest configuration version of a stream. It includes descriptions of the data set fields selected in the version, identifiers of the properties the stream collects logs for, the configuration of the destination it delivers logs to, and the structure of the log lines. It also indicates whether this version is active.

Download schema: StreamDetail.json

Sample GET response:

{
    "streamId": 7050,
    "streamVersionId": 2,
    "streamName": "Datasets Change",
    "productId": "Download_Delivery",
    "productName": "Download Delivery",
    "templateName": "EDGE_LOGS",
    "groupId": 21483,
    "groupName": "Default Group-1-ABCDE",
    "contractId": "1-ABCDE",
    "streamType": "RAW_LOGS",
    "activationStatus": "ACTIVATING",
    "createdBy": "johndoe",
    "createdDate": "10-07-2020 12:19:02 GMT",
    "modifiedBy": "janesmith",
    "modifiedDate": "15-07-2020 05:51:52 GMT",
    "emailIds": "useremail@akamai.com",
    "config": {
        "delimiter": "SPACE",
        "uploadFilePrefix": "logs",
        "uploadFileSuffix": "ds",
        "frequency": {
            "timeInSec": 30
        }
    },
    "datasets": [
        {
            "datasetGroupName": "Log information",
            "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
            "datasetFields": [
                {
                    "datasetFieldId": 1000,
                    "datasetFieldName": "CP Code",
                    "order": 0,
                    "datasetFieldDescription": "Content Provider Code associated with Request"
                },
                {
                    "datasetFieldId": 1002,
                    "datasetFieldName": "Request ID",
                    "order": 1,
                    "datasetFieldDescription": "The request identifier associated with request "
                },
                {
                    "datasetFieldId": 1100,
                    "datasetFieldName": "Request Time",
                    "order": 2,
                    "datasetFieldDescription": "Start time of the request"
                }
            ]
        }
    ],
    "connectors": [
        {
            "connectorId": 8600,
            "connectorType": "AZURE",
            "connectorName": "Azure",
            "path": "storage/akamai/log",
            "compressLogs": true,
            "accountName": "blobstorage",
            "containerName": "logs"
        }
    ],
    "properties": [
        {
            "propertyId": 349772,
            "propertyName": "example.com"
        }
    ]
}


DetailedStreamVersion members

Member Type Required Description
DetailedStreamVersion: Provides detailed information about the latest configuration version of a data stream. It provides descriptions of the data set fields selected in the version, identifiers of properties they collect logs for, configuration of the destination where they deliver logs, and structure of these log lines. It also informs whether this version is active.
activationStatus Enumeration The activation status of the data stream configuration version. These are possible values: ACTIVATED, DEACTIVATED, ACTIVATING, or DEACTIVATING. See the Activate a stream and Deactivate a stream operations.
config DetailedStreamVersion.config The configuration of log lines, names of the files sent to a destination, and delivery frequency for these files.
connectors Either AmazonS3, AzureStorage, Datadog, or Splunk Provides detailed information about the connector’s configuration in the stream.
contractId String Identifies the contract that you created the stream for.
createdBy String The username who created the stream.
createdDate String The date and time when the stream was created.
datasets DetailedStreamVersion.datasets[] A list of data set fields selected from the associated template that the stream monitors in logs.
emailIds String A comma-delimited list of email addresses where the system sends notifications about activations and deactivations of the stream.
errors DetailedStreamVersion.errors[] Errors associated with the stream.
groupId Integer Identifies the group that has access to the product and that you created the stream configuration for.
groupName String The name of the user group that you created the stream for.
modifiedBy String The username who modified the stream.
modifiedDate String The date and time when the stream was modified.
productId String The ID of the product that you created the stream for.
productName String The name of the product that you created this stream for.
properties DetailedStreamVersion.properties[] Identifies the properties that you monitor in the stream.
streamId Integer Identifies the stream.
streamName String The name of the stream.
streamType Enumeration Specifies the type of the data stream. RAW_LOGS is the only stream type currently available.
streamVersionId Integer Identifies the configuration version of the stream.
templateName Enumeration The name of the template that you associated with the stream. EDGE_LOGS is the only template currently available.
DetailedStreamVersion.config: The configuration of log lines, names of the files sent to a destination, and delivery frequency for these files.
delimiter Enumeration A delimiter that you use to separate data set fields in log lines. SPACE is the only delimiter currently available.
frequency DetailedStreamVersion.config.frequency The frequency of collecting logs from each uploader and sending these logs to a destination.
uploadFilePrefix String The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, it defaults to ak. This member supports Dynamic time variables, but doesn’t support the . character. See S3 naming conventions and Azure blob naming conventions.
uploadFileSuffix String The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, it defaults to ds. This member doesn’t support Dynamic time variables, and the ., /, %, ? characters. See S3 naming conventions and Azure blob naming conventions.
DetailedStreamVersion.config.frequency: The frequency of collecting logs from each uploader and sending these logs to a destination.
timeInSec Integer The time in seconds after which the system bundles log lines into a file and sends it to a destination. These are possible values: 30 or 60.
DetailedStreamVersion.datasets[]: A list of data set fields selected from the associated template that the stream monitors in logs.
datasetFields DetailedStreamVersion.datasets[].datasetFields[] A list of data set fields in the group that the stream monitors.
datasetGroupDescription String A descriptive label for the dataset group.
datasetGroupName String The name of the dataset group.
DetailedStreamVersion.datasets[].datasetFields[]: A list of data set fields in the group that the stream monitors.
datasetFieldDescription String Describes the data set field.
datasetFieldId Integer Identifies the field.
datasetFieldName String A name of the data set field.
order Integer Specifies the order of the field in a log line, starting at 0. You can rearrange this order in the datasetFieldIds member. You can get this value only when you run the Get a stream operation.
DetailedStreamVersion.errors[]: Errors associated with the stream.
detail String A message informing about the status of the failed stream.
title String A descriptive label for the type of error.
type Enumeration Identifies the error type, either ACTIVATION_ERROR or UNEXPECTED_SYSTEM_ERROR. In case of these errors, contact support for assistance before continuing. See Errors for more details.
DetailedStreamVersion.properties[]: Identifies the properties that you monitor in the stream.
propertyId Integer The identifier of the property.
propertyName String The descriptive label for the property.
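When polling a stream's configuration, the activationStatus and errors members together indicate whether the version is healthy. An illustrative helper for inspecting a parsed DetailedStreamVersion response (the function name is an assumption, not part of the API):

```python
# Summarize the state of a DetailedStreamVersion object parsed from JSON.
# If the errors array is present, the stream has failed and, per the
# documentation, requires contacting support before any further action.

def stream_health(version: dict) -> str:
    """Return the activation status, or a FAILED summary built from errors[]."""
    errors = version.get("errors") or []
    if errors:
        details = "; ".join(e.get("detail", e.get("title", "")) for e in errors)
        return f"FAILED: {details}"
    return version.get("activationStatus", "UNKNOWN")

assert stream_health({"activationStatus": "ACTIVATING"}) == "ACTIVATING"
assert stream_health({
    "activationStatus": "DEACTIVATED",
    "errors": [{"type": "ACTIVATION_ERROR", "title": "Activation Error",
                "detail": "Contact technical support."}],
}) == "FAILED: Contact technical support."
```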

StreamVersion

Provides basic information about the latest version of a stream. Apart from contextual information and the activation status, it provides names of the properties the stream monitors and the destinations where it sends logs.

Download schema: Stream.json

Sample GET response:

{
    "streamId": 4523,
    "streamName": "editAuto8943708",
    "streamVersionId": 5,
    "createdBy": "johndoe",
    "createdDate": "26-12-2019 08:43:31 GMT",
    "currentVersionId": 5,
    "archived": false,
    "activationStatus": "ACTIVATED",
    "groupId": 21483,
    "groupName": "Default Group-1-ABCDE",
    "contractId": "1-ABCDE",
    "properties": "eib1.com",
    "connectors": "S3-eA8943708",
    "streamTypeName": "Logs - Raw"
}

StreamVersion members

Member Type Description
StreamVersion: Provides basic information about the latest version of a stream. Apart from contextual information and the activation status, it provides names of the properties the stream monitors and the destinations where it sends logs.
activationStatus Enumeration The activation status of the stream. These are possible values: ACTIVATED, DEACTIVATED, ACTIVATING, or DEACTIVATING.
archived Boolean Whether the stream is archived, either true or false.
connectors String The connector where the stream sends logs.
contractId String Identifies the contract that the stream is associated with.
createdBy String The username who created the stream.
createdDate String The date and time when the stream was created.
currentVersionId Integer Identifies the current version of the stream.
errors StreamVersion.errors[] Errors associated with the stream.
groupId Integer Identifies the group where the stream is created.
groupName String The group name where the stream is created.
properties StreamVersion.properties[] Identifies the properties that you monitor in the stream.
streamId Integer Identifies the stream.
streamName String The name of the stream.
streamTypeName String Specifies the type of the data stream. Logs - Raw is the only value currently available for this field.
streamVersionId Integer Identifies the version of the stream.
StreamVersion.errors[]: Errors associated with the stream.
detail String A message informing about the status of the failed stream.
title String A descriptive label for the type of error.
type Enumeration Identifies the error type, either ACTIVATION_ERROR or UNEXPECTED_SYSTEM_ERROR. In case of these errors, contact support for assistance before continuing. See Errors for more details.
StreamVersion.properties[]: Identifies the properties that you monitor in the stream.
propertyId Integer The identifier of the property.
propertyName String The descriptive label for the property.

AmazonS3

Provides details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided accessKey and secretAccessKey values and tries to save an akamai_write_test_2147483647.txt file in your S3 folder. You can only see this file if the validation process is successful, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to.

Download schema: S3.json, S3SavedConnector.json

Sample POST request:

{
    "path": "log/edgelogs/{ %Y/%m/%d }",
    "connectorName": "S3Destination",
    "bucket": "media-datastream.akamai.com",
    "region": "ap-south-1",
    "accessKey": "AKIA6DK7TDQLVGZ3TYP1",
    "secretAccessKey": "1T2ll1H4dXWx5itGhpc7FlSbvvOvky1098nTtEMg",
    "connectorType": "S3"
}

AmazonS3 members

Member Type POST/PUT GET Description
AmazonS3: Provides details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided accessKey and secretAccessKey values and tries to save an akamai_write_test_2147483647.txt file in your S3 folder. You can only see this file if the validation process is successful, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to.
accessKey String The access key identifier that you use to authenticate requests to your S3 account. See Managing access keys AWS API.
bucket String The name of the S3 bucket. See Working with Amazon S3 Buckets in AWS.
compressLogs Boolean Enables GZIP compression for a log file sent to a destination. This value is always true for this connector type.
connectorId Integer Identifies the connector associated with the stream.
connectorName String The name of the connector.
connectorType Enumeration The name of the connector type. Set it to S3 for this connector type.
path String The path to the folder within your S3 bucket where you want to store your logs. This member supports Dynamic time variables. See S3 naming conventions.
region String The AWS region where your S3 bucket resides. See Regions, Availability Zones, and Local Zones in AWS.
secretAccessKey String The secret access key identifier that you use to authenticate requests to your S3 account.
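Assembling the connector object programmatically keeps the fixed members (such as connectorType) out of caller code. A sketch using placeholder credentials and bucket names, not real values:

```python
import json

# Build the AmazonS3 connector object for a POST request. The helper name
# and all argument values here are placeholders for illustration.

def s3_connector(bucket: str, region: str, access_key: str,
                 secret_access_key: str, path: str,
                 connector_name: str = "S3Destination") -> dict:
    return {
        "connectorType": "S3",          # fixed for this connector type
        "connectorName": connector_name,
        "bucket": bucket,
        "region": region,
        "accessKey": access_key,
        "secretAccessKey": secret_access_key,
        # path supports Dynamic time variables such as { %Y/%m/%d }
        "path": path,
    }

payload = s3_connector("example-bucket", "ap-south-1",
                       "PLACEHOLDER_KEY", "PLACEHOLDER_SECRET",
                       "log/edgelogs/{ %Y/%m/%d }")
print(json.dumps(payload, indent=2))
```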

AzureStorage

Provides details about the Azure Storage connector configuration in a data stream. Note that DataStream 2 supports only streaming data to block objects at this time. See Block objects.

Download schema: Azure.json, AzureSavedConnector.json

Sample POST request:

{
    "accountName": "ccdnsaccount",
    "accessKey": "X17x2f2+2MxT1Eh/KaBApKrAEUnetiD0LBbQybczlOwpJVJHPYlDwV99nFSvhDd6z0YTdhHGYz6inRHjQPLlsA==",
    "connectorName": "azure_connector",
    "containerName": "rawslogscontainer",
    "path": "logs/edgelogs/{ %Y/%m/%d }",
    "connectorType": "AZURE"
}

AzureStorage members

Member Type POST/PUT GET Description
AzureStorage: Provides details about the Azure Storage connector configuration in a data stream. Note that DataStream 2 supports only streaming data to block objects at this time. See Block objects.
accessKey String Either of the access keys associated with your Azure Storage account. See View account access keys in Azure.
accountName String Specifies the Azure Storage account name.
compressLogs Boolean Enables GZIP compression for a log file sent to a destination. This value is always true for this connector type.
connectorId Integer Identifies the connector associated with the stream.
connectorName String The name of the connector.
connectorType Enumeration The name of the connector type. Set this value to AZURE for this connector type.
containerName String Specifies the Azure Storage container name.
path String The path to the folder within the Azure Storage container where you want to store your logs. This member supports Dynamic time variables. See Azure blob naming conventions.

Datadog

Provides detailed information about the Datadog connector that you can use in your stream.

Download schema: Datadog.json, DatadogSavedConnector.json

Sample POST request:

{
    "service": "datastream",
    "authToken": "6fe69pp3788877bd7b3bv18oo2c68fe",
    "connectorName": "Datadog_connector",
    "url": "https://http-intake.logs.datadoghq.com/v1/input/",
    "source": "java",
    "tags": "env:sqa,user:insomnia",
    "connectorType": "DATADOG",
    "compressLogs": true
}

Datadog members

Member Type POST/PUT GET Description
Datadog: Provides detailed information about the Datadog connector that you can use in your stream.
authToken String The API key associated with your Datadog account. See View API keys in Datadog.
compressLogs Boolean Enables GZIP compression for a log file sent to a destination. If not set, this member’s value is false by default.
connectorId Integer Identifies the connector associated with the data stream.
connectorName String The name of the connector.
connectorType Enumeration The name of the connector type. Set this value to DATADOG for this connector type.
service String The service of the Datadog connector. See View Datadog reserved attribute list.
source String The source of the Datadog connector. See View Datadog reserved attribute list.
tags String The tags of the Datadog connector. See View Datadog tags.
url String The Datadog endpoint where you want to store your logs. See View Datadog logs endpoint.

Splunk

Provides detailed information about the Splunk connector that you can use in your stream. Note that DataStream 2 supports only endpoint URLs ending with collector/raw at this time.

Download schema: Splunk.json, SplunkSavedConnector.json

Sample POST request:

{
    "connectorName": "Splunk_connector",
    "url": "https://http-inputs-customer.splunkcloud.com/services/collector/raw",
    "eventCollectorToken": "894-51c5-4b2e-888y-54fb-hh62",
    "connectorType": "SPLUNK"
}

Splunk members

Member Type POST/PUT GET Description
Splunk: Provides detailed information about the Splunk connector that you can use in your stream. Note that DataStream 2 supports only endpoint URLs ending with collector/raw at this time.
compressLogs Boolean Enables GZIP compression for a log file sent to a destination. If not set, this member’s value is true by default.
connectorId Integer Identifies the connector associated with the data stream.
connectorName String The name of the connector.
connectorType Enumeration The name of the connector type. Set this value to SPLUNK for this connector type.
eventCollectorToken String The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
url String The raw event Splunk URL where you want to store your logs.
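Because only endpoint URLs ending with collector/raw are supported, a client-side guard can catch an unsupported Splunk URL before the API rejects it. An illustrative check, not part of the API:

```python
# Verify that a Splunk endpoint URL points at the raw Event Collector
# endpoint, the only form DataStream 2 currently supports.

def is_supported_splunk_url(url: str) -> bool:
    return url.rstrip("/").endswith("collector/raw")

assert is_supported_splunk_url(
    "https://http-inputs-customer.splunkcloud.com/services/collector/raw")
assert not is_supported_splunk_url(
    "https://http-inputs-customer.splunkcloud.com/services/collector/event")
```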

Product

Provides information about a product that you can enable log collection for and the groups that have permission to access this product. It also specifies a list of data set templates available for this product.

Download schema: Product.json

Sample GET response:

{
    "productId": "Adaptive_Media_Delivery",
    "productName": "Adaptive Media Delivery",
    "groups": [
        {
            "groupId": 21483,
            "groupName": "DefaultGroup",
            "parentGroupId": null,
            "contractIds": [
                "2-FGHIJ"
            ],
            "childGroups": []
        },
        {
            "groupId": 21484,
            "groupName": "Group11",
            "parentGroupId": 21483,
            "contractIds": [
                "2-FGHIJ"
            ],
            "childGroups": []
        },
        {
            "groupId": 67377,
            "groupName": "Group-2",
            "parentGroupId": 21484,
            "contractIds": [
                "2-FGHIJ"
            ],
            "childGroups": []
        }
    ],
    "templates": [
        {
            "templateName": "EDGE_LOGS"
        }
    ]
}

Product members

Member Type Description
Product: Provides information about a product that you can enable log collection for and the groups that have permission to access this product. It also specifies a list of data set templates available for this product.
groups Product.groups[] A list of groups that have access to the product.
productId String Identifies the product.
productName String The name of the product.
templates Product.templates[] A list of data set templates that you can use with this product in a stream.
Product.groups[]: A list of groups that have access to the product.
accountId String Identifies the account that the group is part of.
childGroupIds Array Identifies the child groups associated with the group.
childGroups Array Identifies child groups associated with the group.
contractIds Array Identifies the contracts associated with the group.
description String, Null Describes the group.
enabled Boolean Whether this group allows you to create and view stream configurations.
groupId Integer Identifies the group.
parentGroupId Integer, Null Identifies the parent group.
Product.templates[]: A list of data set templates that you can use with this product in a stream.
templateName String The name of the template.
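Since each group carries contractIds and may nest further groups under childGroups, a small traversal can pick out the groups tied to a given contract. An illustrative sketch (the helper name is an assumption):

```python
# Walk Product.groups[] (recursing into childGroups) and collect the
# groupIds associated with a given contract.

def groups_for_contract(groups: list, contract_id: str) -> list:
    ids = []
    for g in groups:
        if contract_id in g.get("contractIds", []):
            ids.append(g["groupId"])
        ids.extend(groups_for_contract(g.get("childGroups", []), contract_id))
    return ids

groups = [
    {"groupId": 21483, "contractIds": ["2-FGHIJ"], "childGroups": [
        {"groupId": 21484, "contractIds": ["2-FGHIJ"], "childGroups": []},
    ]},
]
assert groups_for_contract(groups, "2-FGHIJ") == [21483, 21484]
```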

Group

A list of groups that have access to the product.

Download schema: Group.json

Sample GET response:

{
    "parentGroupId": 21484,
    "groupId": 67377,
    "description": null,
    "accountId": "1-FE6JH",
    "enabled": true,
    "contractIds": [
        "1-ABCDE"
    ],
    "childGroupIds": [],
    "childGroups": []
}

Group members

Member Type Description
Group: A list of groups that have access to the product.
accountId String Identifies the account that the group is part of.
childGroupIds Array Identifies the child groups associated with the group.
childGroups Array Identifies child groups associated with the group.
contractIds Array Identifies the contracts associated with the group.
description String, Null Describes the group.
enabled Boolean Whether this group allows you to create and view stream configurations.
groupId Integer Identifies the group.
parentGroupId Integer, Null Identifies the parent group.

ConnectorType

Provides information about a connector type that you can use as a destination for log delivery in a stream.

Download schema: ConnectorType.json

Sample GET response:

{
    "connectorType": "AZURE",
    "connectorTypeName": "Azure Storage"
}

ConnectorType members

Member Type Description
ConnectorType: Provides information about a connector type that you can use as a destination for log delivery in a stream.
connectorType Enumeration Specifies the connector type, either AZURE, S3, DATADOG, or SPLUNK.
connectorTypeName String A name of the connector type.

Dataset

Provides information about a group of data set fields available in a template.

Download schema: Dataset.json

Sample GET response:

{
    "datasetGroupName": "Log information",
    "datasetGroupDescription": "Contains fields that can be used to identify/tag a log line",
    "datasetFields": [
        {
            "datasetFieldId": 1000,
            "datasetFieldName": "CP Code",
            "datasetFieldDescription": "Content Provider Code associated with Request"
        },
        {
            "datasetFieldId": 1002,
            "datasetFieldName": "Request ID",
            "datasetFieldDescription": "The request identifier associated with request"
        },
        {
            "datasetFieldId": 1100,
            "datasetFieldName": "Request Time",
            "datasetFieldDescription": "Start time of the request"
        }
    ]
}

Dataset members

Member Type Description
Dataset: Provides information about a group of data set fields available in a template.
datasetFields Dataset.datasetFields[] A list of data set fields available within the data set group.
datasetGroupDescription String Describes the dataset group.
datasetGroupName String A name of the dataset group.
Dataset.datasetFields[]: A list of data set fields available within the data set group.
datasetFieldDescription String Describes the data set field.
datasetFieldId Integer Identifies the field.
datasetFieldName String A name of the data set field.
order Integer Specifies the order of the field in a log line, starting at 0. You can rearrange this order in the datasetFieldIds member. You can get this value only when you run the Get a stream operation.
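Because each field's order member fixes its position in a log line and the delimiter is currently SPACE, a delivered line can be mapped back to field names. A simplified sketch: it assumes no field value itself contains a space, and the sample values are invented:

```python
# Map a SPACE-delimited log line onto dataset field names using the
# `order` member of each field.

def parse_log_line(line: str, dataset_fields: list) -> dict:
    fields = sorted(dataset_fields, key=lambda f: f["order"])
    values = line.split(" ")
    return {f["datasetFieldName"]: v for f, v in zip(fields, values)}

fields = [
    {"datasetFieldName": "Request Time", "order": 2},
    {"datasetFieldName": "CP Code", "order": 0},
    {"datasetFieldName": "Request ID", "order": 1},
]
parsed = parse_log_line("12345 abc-123 1595938213", fields)
assert parsed["CP Code"] == "12345"
assert parsed["Request Time"] == "1595938213"
```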

ActivationHistory

Provides detailed information about an activation status change for a version of a stream.

Download schema: ActivationHistory.json

Sample GET response:

{
    "streamId": 7050,
    "streamVersionId": 2,
    "createdBy": "johndoe",
    "createdDate": "16-01-2020 11:07:12 GMT",
    "isActive": false
}

ActivationHistory members

Member Type Description
ActivationHistory: Provides detailed information about an activation status change for a version of a stream.
createdBy String The username who activated or deactivated the stream.
createdDate String The date and time when activation status was modified.
isActive Boolean Whether the version of the stream is active.
streamId Integer Identifies the stream.
streamVersionId Integer Identifies the version of the stream.

Property

Identifies the properties that you monitor in the stream.

Download schema: Property.json

Sample GET response:

{
    "propertyId": 382631,
    "propertyName": "example.com"
}

Property members

Member Type Description
Property: Identifies the properties that you monitor in the stream.
propertyId Integer The identifier of the property.
propertyName String The descriptive label for the property.

Errors

This section provides details on the data object that reflects the API’s common response to error cases, and lists the API’s range of response status codes for both error and success cases.

Error responses

When the API encounters a problem, it responds with a JSON object that follows the HTTP Problem Details standard. This sample shows a bad request error, where the title is a descriptive label for the overall problem, and the instance may be useful if you need to communicate the problem to your Akamai support representative. It also includes an optional errors array that lists potentially more than one problem detected in the request.

{
  "type": "bad-request",
  "title": "Erroneous data input",
  "instance": "8ed959ae-bc22-43f4-893c-2f293518f258",
  "status": 400,
  "errors": [
    {
      "type": "bad-request",
      "title": "Bad Request",
      "instance": "1664e1a7-d916-4cf9-944f-2d9f6d176f8f",
      "detail": "Expiry Date of Previous Key is required"
    }
  ]
}

The List streams and Get a stream operations return objects that may indicate stream failure errors, such as ACTIVATION_ERROR or UNEXPECTED_SYSTEM_ERROR. To perform any action on a failed stream, contact technical support.

{
  "type": "UNEXPECTED_SYSTEM_ERROR",
  "title": "Unexpected System Error",
  "detail": "Version 1 of stream5 has failed to publish after an unexpected system error. For assistance, contact technical support."
}
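Since the nested errors array can list more than one problem, collecting every detail message makes error reports easier to log or display. An illustrative parser for the Problem Details object shown above (the function name is an assumption):

```python
import json

# Parse a Problem Details error body and return the top-level title plus
# every nested error's detail message.

def error_details(body: str) -> list:
    problem = json.loads(body)
    top = f'{problem.get("title", "")} ({problem.get("type", "")})'
    nested = [e.get("detail", e.get("title", ""))
              for e in problem.get("errors", [])]
    return [top] + nested

body = """{
  "type": "bad-request",
  "title": "Erroneous data input",
  "instance": "8ed959ae-bc22-43f4-893c-2f293518f258",
  "status": 400,
  "errors": [{"type": "bad-request", "title": "Bad Request",
              "detail": "Expiry Date of Previous Key is required"}]
}"""
assert error_details(body) == [
    "Erroneous data input (bad-request)",
    "Expiry Date of Previous Key is required",
]
```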

HTTP status codes

The API produces this set of HTTP status codes for both success and failure scenarios:

Code Description
200 The operation was successful.
201 Resource created.
202 Activation or deactivation request received for processing. Errors may include failed configuration push due to connectivity issues or system failures. Run the Get a stream operation to check the activation status.
400 Bad request. Invalid parameters or data, including JSON parse errors, invalid connector credentials, unresponsive connector or undefined request errors.
401 Unauthorized request.
402 Failed request.
403 Access is forbidden. The user doesn’t have access to the requested resource.
404 Resource not found.
405 Method not allowed.
415 Unsupported media type.
429 Too many requests. See Rate limiting.
500 Internal server error. Unexpected error.
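A client can use these codes to decide between retrying and surfacing an error immediately. The grouping below is an assumption for illustration (for example, treating only 429 and 500 as retryable), not behavior mandated by the API:

```python
# Rough client-side classification of DataStream 2 response codes:
# success, retryable (with backoff), or a hard failure to surface.

RETRYABLE = {429, 500}        # rate limiting and unexpected server errors
SUCCESS = {200, 201, 202}     # 202 means the request was accepted for processing

def classify(status: int) -> str:
    if status in SUCCESS:
        return "ok"
    if status in RETRYABLE:
        return "retry"
    return "fail"

assert classify(202) == "ok"      # activation accepted; poll Get a stream
assert classify(429) == "retry"   # back off per Rate limiting
assert classify(403) == "fail"    # fix permissions instead of retrying
```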