...
```yaml
name: Data Theorem SAST

# Controls when the workflow will run, adapt to your own needs
on:
  # Triggers the workflow on push or pull request events but only for the "main" branch
  # Adapt triggers to your own needs
  push:
    branches: [ "main" ]
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:
  scan:
    continue-on-error: true
    timeout-minutes: 30
    runs-on: ubuntu-latest
    container:
      image: us-central1-docker.pkg.dev/prod-scandal-us/datatheorem-sast/datatheorem-sast:latest
    env:
      DT_SAST_API_KEY: ${{ secrets.DT_SAST_API_KEY }}
      DT_SAST_REPOSITORY_NAME: ${{ github.event.repository.full_name }}
      DT_SAST_REPOSITORY_PLATFORM: GITHUB
      DT_SAST_REPOSITORY_ID: ${{ github.event.repository.id }}
      DT_SAST_REPOSITORY_HTML_URL: ${{ github.event.repository.html_url }}
      DT_SAST_REPOSITORY_DEFAULT_BRANCH_NAME: ${{ github.event.repository.default_branch }}
      DT_SAST_OUTPUT_DIR: ./
    steps:
      - uses: actions/checkout@v4
      - name: Start Data Theorem SAST Scan
        run: data_theorem_sast_analyzer scan ./
      - uses: actions/upload-artifact@v4
        with:
          name: dt-sast-scan-result
          path: ./scan-results-sarif.json
```
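All of the workflows on this page read the API key from a repository secret named DT_SAST_API_KEY, referenced as `secrets.DT_SAST_API_KEY`. As a minimal sketch (assuming you already have a Data Theorem SAST API key and an authenticated GitHub CLI), the secret can be created from the command line; you can also add it through the repository's Actions secrets settings in the GitHub UI. The OWNER/REPO and key values below are placeholders.

```sh
# Store the API key as an encrypted repository secret; the workflows reference it
# as ${{ secrets.DT_SAST_API_KEY }}. Replace OWNER/REPO and the key value with your own.
gh secret set DT_SAST_API_KEY --repo OWNER/REPO --body "your-data-theorem-sast-api-key"
```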
Scans on pull requests
```yaml
name: Data Theorem SAST

# Controls when the workflow will run, adapt to your own needs
on:
  # Triggers the workflow on pull request events
  # Adapt triggers to your own needs
  pull_request:

jobs:
  scan:
    continue-on-error: true
    timeout-minutes: 30
    runs-on: ubuntu-latest
    container:
      image: us-central1-docker.pkg.dev/prod-scandal-us/datatheorem-sast/datatheorem-sast:latest
    env:
      DT_SAST_API_KEY: ${{ secrets.DT_SAST_API_KEY }}
      DT_SAST_REPOSITORY_NAME: ${{ github.event.repository.full_name }}
      DT_SAST_REPOSITORY_PLATFORM: GITHUB
      DT_SAST_REPOSITORY_ID: ${{ github.event.repository.id }}
      DT_SAST_REPOSITORY_HTML_URL: ${{ github.event.repository.html_url }}
      DT_SAST_REPOSITORY_DEFAULT_BRANCH_NAME: ${{ github.event.repository.default_branch }}
      DT_SAST_SCAN_HEAD_REF: "refs/remotes/origin/${{ github.head_ref }}"
      DT_SAST_SCAN_TARGET_REF: "refs/remotes/origin/${{ github.base_ref }}"
      DT_SAST_FAIL_MODE: true
    steps:
      - uses: actions/checkout@v4
        with:
          # IMPORTANT: Needed because by default, actions/checkout@v4 doesn't load the full git history/refs
          fetch-depth: 0
      - name: Start Data Theorem SAST Scan
        run: data_theorem_sast_analyzer scan ./
```
Example with GitHub Code Scanning integration
You can send the scan results to GitHub Code Scanning by uploading the SARIF output of Data Theorem SAST with the github/codeql-action/upload-sarif action.
Here is a sample pipeline that does this:
```yaml
name: CI

# Controls when the workflow will run, adapt to your own needs
on:
  # Triggers the workflow on push or pull request events but only for the "main" branch
  push:
    branches: [ "main" ]
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:
  scan:
    runs-on: ubuntu-latest
    # Permissions needed if using the optional github/codeql-action/upload-sarif step
    permissions:
      # required for all workflows
      security-events: write
      # only required for workflows in private repositories
      actions: read
      contents: read
    container:
      image: us-central1-docker.pkg.dev/prod-scandal-us/datatheorem-sast/datatheorem-sast:latest
    env:
      DT_SAST_API_KEY: ${{ secrets.DT_SAST_API_KEY }}
      DT_SAST_REPOSITORY_NAME: ${{ github.event.repository.full_name }}
      DT_SAST_REPOSITORY_PLATFORM: GITHUB
      DT_SAST_REPOSITORY_ID: ${{ github.event.repository.id }}
      DT_SAST_REPOSITORY_HTML_URL: ${{ github.event.repository.html_url }}
      DT_SAST_REPOSITORY_DEFAULT_BRANCH_NAME: ${{ github.event.repository.default_branch }}
      DT_SAST_OUTPUT_DIR: ./
    steps:
      - uses: actions/checkout@v4
      - name: Start Data Theorem SAST Scan
        run: data_theorem_sast_analyzer scan ./
      # Optional: output sarif as a Github artifact
      - uses: actions/upload-artifact@v4
        with:
          name: dt-sast-scan-result
          path: ./scan-results-sarif.json
      # Optional: Upload output sarif to Github Code Scanning
      - name: Upload SARIF to GitHub Code Scanning
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: ./scan-results-sarif.json
```
Bitbucket pipeline example
...