Overview

Connecting Sym reporting logs to Datadog is similar to the New Relic setup: the datadog-connector module provisions a Kinesis Firehose configured to pipe logs to Datadog.

For more information about streaming logs from Kinesis Firehose to Datadog, check out the Datadog Docs.

πŸ“˜

Make sure you've enabled the aws/kinesis-firehose add-on in your Runtime Connector, as described on the main AWS Kinesis Firehose page.
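If you haven't enabled the add-on yet, it is a one-line change to your existing runtime_connector module. The sketch below assumes the module name, version constraint, and environment shown; substitute your own values, and see the main AWS Kinesis Firehose page for the authoritative configuration.

```hcl
module "runtime_connector" {
  source  = "symopsio/runtime-connector/aws"
  version = ">= 1.0.0"

  environment = "main"

  # Enables the aws/kinesis-firehose add-on, which grants the Runtime
  # permission to put records into Firehose streams in this environment
  addons = ["aws/kinesis-firehose"]
}
```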

Create a Datadog API Key

On the Datadog Configuration page, create an API Key. Save the key so that you can use it to configure the Datadog Connector in the next section.

Configure the Datadog Connector

Declare a datadog_access_key variable

We will use a sensitive Terraform variable to pass your Datadog access key to the datadog-connector module.

🚧

Security Note

The datadog_access_key variable is sensitive. Be sure to manage this value in an environment variable or a tfvars file that is excluded from version control. For example, export TF_VAR_datadog_access_key="my-access-key".

The Access Key will also still be stored in Terraform state, so ensure that your Terraform state file is securely stored. See Terraform's recommendations for storing sensitive state.
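For example, a minimal S3 backend with encryption at rest might look like the following. This is a sketch: the bucket, key, region, and DynamoDB table names are placeholders, not values from this guide.

```hcl
terraform {
  backend "s3" {
    bucket  = "my-terraform-state"   # placeholder bucket name
    key     = "sym/terraform.tfstate"
    region  = "us-east-1"
    encrypt = true                   # encrypt state at rest, since it contains the access key

    # Optional: DynamoDB table for state locking
    dynamodb_table = "my-terraform-locks"
  }
}
```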

Declare a variable datadog_access_key in a variables.tf file in the same directory as your main.tf file.

variable "datadog_access_key" {
  description = "Secret used by the Firehose to send logs to Datadog. DO NOT check this into version control."
  type        = string
  sensitive   = true
}

Add the Datadog Connector Module

The Datadog Connector is an extended version of the Kinesis Firehose Connector. Given an API key, it creates a Kinesis Firehose Delivery Stream that pipes reporting logs to Datadog. The output of this module will be used to configure a Sym Log Destination next.

🚧

Check your environments!

Make sure that the environment in your datadog_connector module matches the environment in your runtime_connector module!

module "datadog_connector" {
  source  = "symopsio/datadog-connector/aws"
  version = ">= 2.0.0"

  # This environment MUST match the `environment` in your `runtime_connector` module
  # because the aws/kinesis-firehose add-on only grants access to Firehoses with the same environment name
  environment = "main"
  
  # This variable should NOT be checked into tfvars!
  datadog_access_key = var.datadog_access_key

  # If you are in the EU, you will need to specify the EU intake URL!
  # datadog_intake_url = "https://aws-kinesis-http-intake.logs.datadoghq.eu/v1/input"
}

Add a Log Destination

Define a sym_log_destination resource with type = kinesis_firehose.

  • integration_id: The integration containing the permissions to push to Kinesis Firehose. This should be set to your Runtime Permission Context Integration, which has the permissions created by the aws/kinesis-firehose add-on.
  • stream_name: The name of the Kinesis Firehose Delivery Stream
resource "sym_log_destination" "datadog" {
  type = "kinesis_firehose"

  # The Runtime Permission Context has Kinesis Firehose permissions from the aws/kinesis-firehose add-on
  integration_id = sym_integration.runtime_context.id

  settings = {
    # The Firehose stream name is output by the datadog_connector module
    stream_name = module.datadog_connector.firehose_name
  }
}

Add the Log Destination to your Environment

Each sym_environment accepts a list of Log Destinations to send reporting logs to. Add the ID of the Log Destination you just defined to the log_destination_ids list.

resource "sym_environment" "this" {
  name            = "main"
  runtime_id      = sym_runtime.this.id
  error_logger_id = sym_error_logger.slack.id
  
  # Add your log destinations here
  log_destination_ids = [sym_log_destination.datadog.id]

  integrations = {
    slack_id = sym_integration.slack.id
  }
}

Full Example

You can find the complete code for this example in our Datadog Log Destination Example.