
This article describes how to start a Terraform file scan using Data Theorem’s API.

This can be used, for example, as part of a CI/CD pipeline to verify that a Terraform file will not create any resource with an urgent policy violation, preventing such a cloud resource from being deployed to production.

See https://www.terraform.io/docs/language/index.html for an explanation of Terraform files.
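If you do not already have a configuration to scan, the snippet below writes a minimal Terraform file that can be submitted to the scan endpoint. This is only an illustrative sketch: the file name and the aws_s3_bucket resource are placeholder assumptions, and any valid .tf file from your repository can be used instead.

# Illustrative sketch: create a minimal Terraform configuration file to submit for scanning.
# The file name and the resource below are placeholders, not part of Data Theorem's API.
cat > terraform_example_file.tf <<'EOF'
resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket"
}
EOF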

Running a Terraform scan

Data Theorem’s API can be used to run a Terraform scan against a specific Terraform configuration file:

  1. Retrieve an API key that has the “API Security Results API” permission enabled; API keys are available in the Data Theorem portal on the API Key page.

  2. A Terraform file scan can then be run using the following curl command:

    curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
    --header 'Content-Type: multipart/form-data' \
    --header 'Authorization: APIKey ABCACBA=' \
    --form 'file=@"terraform_example_configuration:file.tf"' \
    --form 'scan_type="TERRAFORM"'

  3. Check the output for any issues in the file:

response = {
    "status": "COMPLETED", // if the value is not COMPLETED, Data Theorem was not able to scan the received file
    "issues_count": "1", // if the value is > 0, the file is not safe to deploy to production
    "result_as_json": {
        "issues": [
            {
                "resource_name": "aws_s3_bucket.public_read_acp_from_grant",
                "linked_policy_rule_type_id": 16
            }
        ]
    },
    // An explanation of the issues encountered, in a printable Markdown format.
    "result_as_markdown": "##terraform_example_file.tf contains the following issues:\n###In aws_s3_bucket.public_read_acp_from_grant###\n**AWS S3 Bucket has Publicly Accessible ACLs**  \nThe S3 bucket's Access Control List (ACL) permissions allow anyone to read (but not modify) the bucket's\nACL permissions. This includes anyone on the Internet who knows the bucket's URL.\n\nWhile this permission does not allow accessing\nthe content of the bucket, there is usually no business need for exposing the list of users and their level of access.\n\nThe ACL group that has the `READ_ACP` permission is the `http://acs.amazonaws.com/groups/global/AllUsers` group.\n\nFor more information see the relevant AWS documentation\n[ACL Overview](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html) and\n[How to restrict Amazon S3 Bucket Access to an\nIAM Role](https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/).\n\n"
}
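
The offending resources can also be listed directly from result_as_json with jq, which is used again in step 4 and in the pipeline examples below. A minimal sketch, assuming the API response was saved to a file named response.json (a placeholder name):

# Sketch: print the resource name of each issue reported in the scan response.
# response.json is a placeholder for a file containing the JSON returned by the iac_scans endpoint.
jq -r '.result_as_json.issues[].resource_name' response.json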

  4. To check that the file is safe to deploy, run the following:

export TERRAFORM_ISSUES_COUNT=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
--header 'Content-Type: multipart/form-data' \
--header 'Authorization: APIKey ABCACBA=' \
--form 'file=@"terraform_example_configuration:file.tf"' \
--form 'scan_type="TERRAFORM"' | jq -r ".issues_count")

# Then deploy your file only if TERRAFORM_ISSUES_COUNT is equal to 0
if [ "$TERRAFORM_ISSUES_COUNT" -eq 0 ]; then
  echo "Deploying file: terraform_example_configuration"
else
  exit 1
fi

Integrating into a CI/CD pipeline

GitHub Actions

For Terraform files hosted on GitHub, a GitHub Actions workflow can be configured. The workflow will run a Terraform scan every time the repository is tagged with a new version.

To set up this workflow:

  1. Create a new repository secret containing your API key (called DATATHEOREM_API_RESULT_API_KEY in the example below).

  2. Create a new workflow by creating a file at .github/workflows/datatheorem.yaml with the following content:

name: Data Theorem Terraform Scans
on:
  push:
    tags:
       - '*'
jobs:
  datatheorem-terraform-scan:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository so the Terraform file is available to the scan step
      - uses: actions/checkout@v2
      - env:
          DATATHEOREM_API_RESULT_API_KEY: ${{ secrets.DATATHEOREM_API_RESULT_API_KEY }}
        run: |
          curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
              --header 'Content-Type: multipart/form-data' \
              --header "Authorization: APIKey $DATATHEOREM_API_RESULT_API_KEY" \
              --form 'file=@"terraform_example_configuration:file.tf"' \
              --form 'scan_type="TERRAFORM"'
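
The workflow above only submits the scan and does not act on the result. To make the job fail when issues are found, the run step can capture the response and check issues_count with jq, similarly to the Bitbucket Pipelines example below. A minimal sketch of such a run script, assuming jq is available on the runner (the RESPONSE and ISSUES_COUNT variable names are arbitrary):

# Sketch: capture the scan response and fail the job if any issues are reported.
# Assumes jq is available on the runner; the uploaded file path is the same placeholder as above.
RESPONSE=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
    --header 'Content-Type: multipart/form-data' \
    --header "Authorization: APIKey $DATATHEOREM_API_RESULT_API_KEY" \
    --form 'file=@"terraform_example_configuration:file.tf"' \
    --form 'scan_type="TERRAFORM"')
ISSUES_COUNT=$(echo "$RESPONSE" | jq -r ".issues_count")
if [ "$ISSUES_COUNT" -ne 0 ]; then
  echo "Terraform file contains $ISSUES_COUNT issues, aborting deployment"
  exit 1
fi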

Bitbucket Pipelines

For Terraform files hosted on Bitbucket, a similar workflow can be configured in Bitbucket Pipelines:

  1. Create a new secured repository variable containing your API key (called DATATHEOREM_TERRAFORM_API_KEY in the example below) under “Repository variables” in the Pipelines configuration (from the repository settings).

  2. Create a pipeline by creating a bitbucket-pipelines.yml file at the root of your repository with the following content:

pipelines:
  tags:
    '*':
      - step:
          script:
            - apt-get update
            - apt-get install -y jq
            - export FILEPATH="$BITBUCKET_CLONE_DIR/my_terraform_file.tf"
            - if [ -f "$FILEPATH" ]; then echo "File exists"; else exit 1; fi
            - |
                export TERRAFORM_DATA_THEOREM_RESPONSE=$(curl -X POST 'https://api.securetheorem.com/apis/devops/v1/iac_scans' \
                                --header 'Content-Type: multipart/form-data' \
                                --header "Authorization: APIKey $DATATHEOREM_TERRAFORM_API_KEY" \
                                --form 'file=@"'"$FILEPATH"'"' \
                                --form 'scan_type="TERRAFORM"')
                export TERRAFORM_ISSUES_COUNT=$(echo "$TERRAFORM_DATA_THEOREM_RESPONSE" | tr '\r\n' ' ' | jq -r ".issues_count")
                export MARKDOWN_RESULT=$(echo "$TERRAFORM_DATA_THEOREM_RESPONSE" | tr '\r\n' ' ' | jq -r ".result_as_markdown")
                if [ "$TERRAFORM_ISSUES_COUNT" -eq 0 ]; then
                  echo "Deploying file: $FILEPATH"
                else
                  echo "Terraform file contains $TERRAFORM_ISSUES_COUNT issues, aborting deployment"
                  echo "Terraform file issues report: $MARKDOWN_RESULT"
                  exit 1
                fi
