Datadog via Kinesis Firehose

Overview

Use the datadog-connector module to provision a Kinesis Firehose that pipes logs to Datadog. For more information about streaming logs from Kinesis Firehose to Datadog, check out the Datadog Docs.

📘

Prerequisites

  1. An environment.tf file generated by symflow init
    a. If you have not run symflow init, please follow the instructions in Installing Sym
  2. A runtime_connector module defined in connectors.tf
    a. If you do not have a connectors.tf, please follow the instructions in AWS Runtime Setup
  3. A kinesis_firehose_access module defined in connectors.tf, as set up in the main AWS Kinesis Firehose tutorial (a sketch of how these modules fit together follows this list)
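
For reference, a connectors.tf that satisfies prerequisites 2 and 3 might look roughly like the sketch below. The module sources shown are assumptions and version pins are omitted; use the exact sources and versions from the AWS Runtime Setup and AWS Kinesis Firehose tutorials.

module "runtime_connector" {
  # Source is an assumption; use the source from the AWS Runtime Setup tutorial.
  source      = "symopsio/runtime-connector/aws"

  environment = local.environment_name
}

module "kinesis_firehose_access" {
  # Source path is an assumption; use the source from the AWS Kinesis Firehose tutorial.
  source      = "symopsio/runtime-connector/aws//modules/kinesis-firehose-access"

  environment = local.environment_name
}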

Create a Datadog API Key

On the Datadog Configuration page, create an API key. Save the key so that you can use it to configure the Datadog Connector in the next section.
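
If you already manage Datadog with Terraform, you could instead create the key with the Datadog provider's datadog_api_key resource. This is only a sketch: it assumes the Datadog provider is already configured, and note that the key will be stored in Terraform state.

resource "datadog_api_key" "sym_firehose" {
  # The key name is a placeholder; any descriptive name works.
  name = "sym-kinesis-firehose"
}

# datadog_api_key.sym_firehose.key can then be passed to the connector's
# datadog_access_key input in place of a variable.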

Configure the Datadog Connector

Declare a datadog_access_key variable

We will use a sensitive Terraform variable to pass your Datadog API key to the datadog-connector module.

🚧

Security Note

The datadog_access_key variable is sensitive. Be sure to manage this value either:

  • In AWS Secrets Manager, where the value can be retrieved using aws_secretsmanager_secret_version (a minimal sketch follows the variable declaration below)
  • In an environment variable or a tfvars file that is excluded from version control. For example, export TF_VAR_datadog_access_key="my-access-key".

In either case, the access key will still be stored in Terraform state, so ensure that your Terraform state file is stored securely. See Terraform's recommendations for storing sensitive state.

Declare a variable datadog_access_key in a variables.tf file.

variable "datadog_access_key" {
  description = "Secret used by the Firehose to send logs to Datadog. DO NOT check this into version control."
  type        = string
  sensitive   = true
}
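
If you keep the key in AWS Secrets Manager, a minimal sketch of reading it with the aws_secretsmanager_secret_version data source looks like this. The secret name datadog/api-key is a placeholder; use whatever name you stored the key under.

data "aws_secretsmanager_secret_version" "datadog_api_key" {
  secret_id = "datadog/api-key" # placeholder secret name
}

# Pass data.aws_secretsmanager_secret_version.datadog_api_key.secret_string to the
# datadog_access_key input of the datadog_connector module below, in place of
# var.datadog_access_key.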

Add the Datadog Connector Module

The Datadog Connector is an extended version of the Kinesis Firehose Connector. Given an API key, it creates a Kinesis Firehose Delivery Stream that pipes reporting logs to Datadog. The output of this module will be used to configure a Sym Log Destination in the next section.

🚧

Check your environments!

Make sure that the environment in your datadog_connector module matches the environment input of your kinesis_firehose_access module.

module "datadog_connector" {
  source  = "symopsio/datadog-connector/aws"

  # Note: To use this module with AWS Provider 4.x, use version "~> 3.0"
  version = "~> 4.0"

  # This environment value MUST match the environment value passed into the kinesis_firehose_access module
  environment = local.environment_name
  
  # This variable should NOT be checked into source control!
  datadog_access_key = var.datadog_access_key

  # If you are in the EU, you will need to specify the EU intake URL!
  # datadog_intake_url = "https://aws-kinesis-http-intake.logs.datadoghq.eu/v1/input"

  # Any tags passed into the connector will be used to tag the Kinesis Firehose Delivery Stream,
  # as well as each log entry sent to Datadog. The SymEnv = environment_name tag will always
  # be included by default.
  tags = {
    TagName = "TagValue"
  }
}

Add a Log Destination

Define a sym_log_destination resource with type = kinesis_firehose.

  • integration_id: The integration containing the permissions to push to Kinesis Firehose. This should be set to module.runtime_connector.sym_integration.id, which has the permissions created by the kinesis_firehose_access module.
  • stream_name: The name of the Kinesis Firehose Delivery Stream.

resource "sym_log_destination" "datadog" {
  type = "kinesis_firehose"

  # The Runtime Connector sym_integration has Kinesis Firehose permissions defined by the kinesis_firehose_access module
  integration_id = module.runtime_connector.sym_integration.id

  settings = {
    # The Firehose stream name is output by the datadog_connector module
    stream_name = module.datadog_connector.firehose_name
  }
}

Add the Log Destination to your Environment

Each sym_environment accepts a list of Log Destinations to send reporting logs to. Add the ID of the Log Destination you just defined to the log_destination_ids list.

resource "sym_environment" "this" {
  name            = "main"
  runtime_id      = sym_runtime.this.id
  error_logger_id = sym_error_logger.slack.id
  
  # Add your log destinations here
  log_destination_ids = [sym_log_destination.datadog.id]

  integrations = {
    slack_id = sym_integration.slack.id
  }
}

Full Example

You can find the complete code for this example in our Datadog Log Destination Example.