Terraform Scanning: CI/CD Integration

This article describes how to start a Terraform file scan using Data Theorem's API.

This can be used, for example, as part of a CI/CD pipeline to verify that a Terraform file will not create any resource with an urgent policy violation, preventing such a resource from being deployed to production.

See Overview - Configuration Language | Terraform | HashiCorp Developer for an explanation of Terraform configuration files.

Running a Terraform scan

Data Theorem's API can be used to run a Terraform scan against a specific Terraform configuration file.

  1. Retrieve an API key that has the "API Security Results API" permission enabled; API keys are available in the Data Theorem portal on the API Key page.

  2. A Terraform file scan can then be run using the following curl command:

    curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
      --header 'Content-Type: multipart/form-data' \
      --header 'Authorization: APIKey ABCACBA=' \
      --form 'file=@"terraform_example_configuration:file.tf"' \
      --form 'scan_type="TERRAFORM"'

  3. Check the output for any issues in the file.

    response = {
      // If the value is not "COMPLETED", Data Theorem was not able to scan the received file.
      "status": "COMPLETED",
      // If the value is > 0, the file is not safe to run in production.
      "issues_count": "1",
      "result_as_json": {
        "issues": [
          {
            "resource_name": "aws_s3_bucket.public_read_acp_from_grant",
            "linked_policy_rule_type_id": 16
          }
        ]
      },
      // An explanation of the issues encountered, in a printable Markdown format.
      "result_as_markdown": "##terraform_example_file.tf contains the following issues:\n###In aws_s3_bucket.public_read_acp_from_grant###\n**AWS S3 Bucket has Publicly Accessible ACLs** \nThe S3 bucket's Access Control List (ACL) permissions allow anyone to read (but not modify) the bucket's\nACL permissions. This includes anyone on the Internet who knows the bucket's URL.\n\nWhile this permission does not allow accessing\nthe content of the bucket, there is usually no business need for exposing the list of users and their level of access.\n\nThe ACL group that has the `READ_ACP` permission is the `http://acs.amazonaws.com/groups/global/AllUsers` group.\n\nFor more information see the relevant AWS documentation\n[ACL Overview](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html) and\n[How to restrict Amazon S3 Bucket Access to an\nIAM Role](https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/).\n\n"
    }

  4. To check that the file is safe to deploy, run:

    export TERRAFORM_ISSUES_COUNT=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
      --header 'Content-Type: multipart/form-data' \
      --header 'Authorization: APIKey ABCACBA=' \
      --form 'file=@"terraform_example_configuration:file.tf"' \
      --form 'scan_type="TERRAFORM"' | jq -r ".issues_count")

    # Then deploy your file only if TERRAFORM_ISSUES_COUNT is equal to 0.
    if [ "$TERRAFORM_ISSUES_COUNT" -eq 0 ]; then
      echo "Deploying file: terraform_example_configuration"
    else
      exit 1
    fi

Integrating into a CI/CD pipeline

GitHub Actions

For Terraform files hosted on GitHub, a GitHub Actions workflow can be configured. The workflow will perform Terraform scans every time the repository is tagged with a new version.

To set up this workflow:

  1. Create a new repository secret containing your API key (named DATATHEOREM_API_RESULT_API_KEY in the example below).

  2. Create a new workflow by creating a file at .github/workflows/datatheorem.yaml; a sketch of its content is shown below.
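
A minimal sketch of such a workflow follows, assuming the secret is named DATATHEOREM_API_RESULT_API_KEY, the Terraform file to scan is main.tf at the repository root, and version tags match the pattern v*; adjust the file name, tag pattern, and failure handling to your repository:

    # Sketch only: the tag pattern, file name (main.tf), and runner are assumptions.
    name: Data Theorem Terraform scan

    on:
      push:
        tags:
          - 'v*'

    jobs:
      terraform-scan:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v3
          - name: Scan Terraform file with Data Theorem
            run: |
              ISSUES_COUNT=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
                --header 'Content-Type: multipart/form-data' \
                --header "Authorization: APIKey ${{ secrets.DATATHEOREM_API_RESULT_API_KEY }}" \
                --form 'file=@"main.tf"' \
                --form 'scan_type="TERRAFORM"' | jq -r ".issues_count")
              if [ "$ISSUES_COUNT" -eq 0 ]; then
                echo "No urgent policy violations found."
              else
                echo "Found $ISSUES_COUNT issue(s); failing the build."
                exit 1
              fi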

Bitbucket Pipelines

For Terraform files hosted on Bitbucket, a similar workflow can be configured in Bitbucket Pipelines:

  1. Create a new secured repository variable containing your API key (under "Repository variables" in the Pipelines section of the repository settings).

  2. Create a pipeline by creating a bitbucket-pipelines.yml file within your repository; a sketch of its content is shown below.
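
A minimal sketch of such a pipeline follows, assuming the secured repository variable is named DATATHEOREM_API_RESULT_API_KEY, the Terraform file to scan is main.tf, and the scan should run when a new tag is pushed; the build image and the jq installation step are also assumptions to adapt to your setup:

    # Sketch only: the image, file name (main.tf), and jq installation are assumptions.
    image: atlassian/default-image:3

    pipelines:
      tags:
        '*':
          - step:
              name: Data Theorem Terraform scan
              script:
                - apt-get update && apt-get install -y jq
                - |
                  ISSUES_COUNT=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
                    --header 'Content-Type: multipart/form-data' \
                    --header "Authorization: APIKey ${DATATHEOREM_API_RESULT_API_KEY}" \
                    --form 'file=@"main.tf"' \
                    --form 'scan_type="TERRAFORM"' | jq -r ".issues_count")
                  if [ "$ISSUES_COUNT" -ne 0 ]; then
                    echo "Found $ISSUES_COUNT issue(s); failing the build."
                    exit 1
                  fi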