
CI/CD Integration#

APHIDS integrates into any CI/CD pipeline for automated security scanning. No config.yaml is needed — use environment variables for authentication and Docker volume mounts for file access.


Quick Start#

The simplest way to run APHIDS in a pipeline is to run the container directly with environment variables:

docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended

No Python, no config.yaml, no SDK. Just the container, an options file, and an API key.


Two Approaches#

Container direct (recommended for CI/CD): no Python needed. Pass everything via environment variables and volume mounts:

docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended

Python CLI: install the CLI when you need runbooks, attack trees, or advanced Hive integration:

pip install git+https://github.com/darksidesecurity/aphids.git
# or: uv pip install git+https://github.com/darksidesecurity/aphids.git

export APHIDS_API_KEY="$APHIDS_API_KEY"
aphids-cli -o options.yaml --unattended

Volume Mounts#

Understanding container paths is critical for CI/CD:

Host Path         Container Path        Mode        Purpose
--------------------------------------------------------------------------
options.yaml      /output/options.yaml  :ro         Scan configuration
./output/         /output/              read-write  Results, checkpoints, tool output
./src/ or $(pwd)  /workspace/           :ro         Source code for SAST/SCA tools

Mount individual files, not directories

Mount options.yaml directly rather than the whole directory. This avoids accidentally exposing other files to the container.


Environment Variables#

These replace config.yaml entirely in CI/CD:

Variable             Required           Purpose
--------------------------------------------------------------------------
APHIDS_API_KEY       Yes (online mode)  Hive API authentication
APHIDS_API_URL       No                 Custom API URL (default: https://api.hive.darksidesecurity.io/)
APHIDS_WS_URL        No                 Custom WebSocket URL (auto-derived from API URL)
APHIDS_DEBUG         No                 Enable debug logging (true/false)
APHIDS_TOOL_TIMEOUT  No                 Per-tool timeout in seconds (default: 1800)

Pipeline Examples#

Network Scans (DAST)#

GitHub Actions:

name: Security Scan
on: [push, pull_request]

jobs:
  security-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run APHIDS scan
        run: |
          docker run --rm \
            -e APHIDS_API_KEY=${{ secrets.APHIDS_API_KEY }} \
            -v ${{ github.workspace }}/options.yaml:/output/options.yaml:ro \
            -v ${{ github.workspace }}/output:/output \
            ghcr.io/darksidesecurity/aphids:latest \
            -o options.yaml --unattended \
            --fail-on-severity high \
            --sarif /output/results.sarif

      - name: Upload SARIF to GitHub Security
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: output/results.sarif

      - name: Upload results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: aphids-results
          path: output/

GitLab CI:

security-scan:
  stage: test
  image: ghcr.io/darksidesecurity/aphids:latest
  variables:
    APHIDS_API_KEY: $APHIDS_API_KEY
  script:
    - python3 aphids.py
        -o $CI_PROJECT_DIR/options.yaml
        --unattended
        --fail-on-severity high
        --sarif $CI_PROJECT_DIR/output/results.sarif
  artifacts:
    paths:
      - output/  # artifact paths must be relative to $CI_PROJECT_DIR

Jenkins:

pipeline {
    agent any
    environment {
        APHIDS_API_KEY = credentials('aphids-api-key')
    }
    stages {
        stage('Security Scan') {
            steps {
                sh '''
                    docker run --rm \
                      -e APHIDS_API_KEY=${APHIDS_API_KEY} \
                      -v ${WORKSPACE}/options.yaml:/output/options.yaml:ro \
                      -v ${WORKSPACE}/output:/output \
                      ghcr.io/darksidesecurity/aphids:latest \
                      -o options.yaml --unattended \
                      --fail-on-severity high \
                      --sarif /output/results.sarif
                '''
            }
            post {
                always {
                    archiveArtifacts artifacts: 'output/results.sarif', allowEmptyArchive: true
                }
            }
        }
    }
}

Azure Pipelines:

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: |
      docker run --rm \
        -e APHIDS_API_KEY=$(APHIDS_API_KEY) \
        -v $(Build.SourcesDirectory)/options.yaml:/output/options.yaml:ro \
        -v $(Build.SourcesDirectory)/output:/output \
        ghcr.io/darksidesecurity/aphids:latest \
        -o options.yaml --unattended \
        --fail-on-severity high \
        --sarif /output/results.sarif
    displayName: 'Run APHIDS scan'

  - task: PublishBuildArtifacts@1
    condition: always()
    inputs:
      pathToPublish: $(Build.SourcesDirectory)/output/results.sarif
      artifactName: sarif-results

SAST / Source Code Scanning#

Mount your source code to /workspace/:ro for static analysis:

GitHub Actions:

name: SAST Scan
on: [push, pull_request]

jobs:
  sast:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run SAST scan
        run: |
          docker run --rm \
            -e APHIDS_API_KEY=${{ secrets.APHIDS_API_KEY }} \
            -v ${{ github.workspace }}/sast-options.yaml:/output/sast-options.yaml:ro \
            -v ${{ github.workspace }}/output:/output \
            -v ${{ github.workspace }}:/workspace:ro \
            --network none \
            ghcr.io/darksidesecurity/aphids:latest \
            -o sast-options.yaml --unattended \
            --fail-on-severity high --sarif /output/results.sarif

      - name: Upload SARIF to GitHub
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: output/results.sarif

GitLab CI:

sast-scan:
  stage: test
  image: ghcr.io/darksidesecurity/aphids:latest
  variables:
    APHIDS_API_KEY: $APHIDS_API_KEY
  script:
    - python3 aphids.py
        -o $CI_PROJECT_DIR/sast-options.yaml
        --unattended
        --fail-on-severity high
        --sarif $CI_PROJECT_DIR/output/results.sarif
  artifacts:
    paths:
      - output/results.sarif  # artifact paths must be relative to $CI_PROJECT_DIR
    reports:
      sast: output/results.sarif
  allow_failure:
    exit_codes:
      - 3  # Threshold exceeded: visible but non-blocking

Jenkins:

pipeline {
    agent any
    environment {
        APHIDS_API_KEY = credentials('aphids-api-key')
    }
    stages {
        stage('SAST Scan') {
            steps {
                sh '''
                    docker run --rm \
                      -e APHIDS_API_KEY=${APHIDS_API_KEY} \
                      -v ${WORKSPACE}/sast-options.yaml:/output/sast-options.yaml:ro \
                      -v ${WORKSPACE}/output:/output \
                      -v ${WORKSPACE}:/workspace:ro \
                      --network none \
                      ghcr.io/darksidesecurity/aphids:latest \
                      -o sast-options.yaml --unattended \
                      --fail-on-severity high --sarif /output/results.sarif
                '''
            }
            post {
                always {
                    archiveArtifacts artifacts: 'output/results.sarif', allowEmptyArchive: true
                }
            }
        }
    }
}

Azure Pipelines:

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self

  - script: |
      docker run --rm \
        -e APHIDS_API_KEY=$(APHIDS_API_KEY) \
        -v $(Build.SourcesDirectory)/sast-options.yaml:/output/sast-options.yaml:ro \
        -v $(Build.SourcesDirectory)/output:/output \
        -v $(Build.SourcesDirectory):/workspace:ro \
        --network none \
        ghcr.io/darksidesecurity/aphids:latest \
        -o sast-options.yaml --unattended \
        --fail-on-severity high --sarif /output/results.sarif
    displayName: 'Run APHIDS SAST'

  - task: PublishBuildArtifacts@1
    condition: always()
    inputs:
      pathToPublish: $(Build.SourcesDirectory)/output/results.sarif
      artifactName: sarif-results

The options file for SAST:

# sast-options.yaml
configuration:
  online: enabled
  network: public

modules:
  semgrep-scan:
    module: semgrep
    target_dir: '/workspace'
    args: ['--config', 'auto']

  gitleaks-scan:
    module: gitleaks
    target_dir: '/workspace'

  trufflehog-scan:
    module: trufflehog
    target_dir: '/workspace'

  bandit-scan:
    module: bandit
    target_dir: '/workspace'
    args: ['-r', '.']

  safety-check:
    module: safety
    target_dir: '/workspace'

Network isolation for SAST

Use --network none for static analysis tools. They don't need network access and this prevents accidental data exfiltration.

Fully Inline (No Files at All)#

For simple scans, skip the options file entirely with inline JSON:

docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -jo '{"configuration":{"online":"enabled"},"modules":{"nmap-scan":{"module":"nmap","target":"10.0.0.1","args":["-sV","-T4"]}}}' \
  --unattended
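
When the inline JSON grows past a one-liner, it can be easier to keep it readable in a shell variable first. A minimal sketch of the same payload shown above:

```shell
# Same options payload as the one-liner above, kept readable in a variable.
# Pass it to the container with: -jo "$OPTIONS"
OPTIONS='{
  "configuration": {"online": "enabled"},
  "modules": {
    "nmap-scan": {"module": "nmap", "target": "10.0.0.1", "args": ["-sV", "-T4"]}
  }
}'
echo "$OPTIONS"
```

Single quotes around the variable keep the JSON literal; only expand shell variables inside it deliberately.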

Unattended Mode#

The --unattended flag auto-approves all interactive prompts — required for pipeline execution:

# CLI
aphids-cli -o options.yaml --unattended

# Docker
docker run --rm \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended

Warning

In unattended mode, APHIDS executes all configured modules without confirmation. Verify your targets before running.


Agent Mode in CI/CD#

Deploy a temporary agent that registers with Hive, runs queued scans, and exits:

docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  --agent --agent-name "ci-${CI_JOB_ID}" --exit-on-idle 300

This is useful for on-demand scanning capacity that scales with your pipeline.
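
As a sketch, a scheduled GitLab job that provisions a throwaway agent could look like the following (the job name and docker-in-docker setup here are illustrative assumptions, not part of the APHIDS docs):

```yaml
# Hypothetical GitLab job: throwaway agent that drains the queue, then exits
aphids-agent:
  stage: test
  image: docker:24
  services:
    - docker:24-dind   # assumes a runner that permits docker-in-docker
  script:
    - docker run --rm
        -e APHIDS_API_KEY=$APHIDS_API_KEY
        -v $CI_PROJECT_DIR/output:/output
        ghcr.io/darksidesecurity/aphids:latest
        --agent --agent-name "ci-${CI_JOB_ID}" --exit-on-idle 300
```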


Findings Summary & Pipeline Gates#

After every scan, APHIDS prints a findings summary table:

================================================================================
  SCAN RESULTS SUMMARY
================================================================================
  Tool                      Total   Crit   High    Med    Low   Info
  ----------------------------------------------------------------------------
  semgrep                      12      0      3      5      4      0
  gitleaks                      2      2      0      0      0      0
  bandit                        8      0      1      4      3      0
  ----------------------------------------------------------------------------
  TOTAL                        22      2      4      9      7      0
================================================================================

Fail on Severity#

Fail the pipeline if findings at or above a severity level are found:

# Fail on critical or high findings
docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd):/workspace:ro \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended --fail-on-severity high

Valid severity values: critical, high, medium, low, info

Fail on Count#

Fail the pipeline if total findings exceed a threshold:

# Fail if more than 10 findings total
docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd)/output:/output \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended --fail-on-count 10

Combine Both#

# Fail on any critical OR more than 20 findings total
--fail-on-severity critical --fail-on-count 20
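
Threshold violations surface as exit code 3 (see Exit Codes below), so on platforms without a native soft-fail mechanism a small wrapper can downgrade them to warnings. A sketch, with the real docker run invocation stubbed out as run_scan:

```shell
# run_scan stands in for the real "docker run ... aphids" command;
# substitute one of the invocations shown above.
run_scan() { return 3; }   # simulated here: threshold exceeded

status=0
run_scan || status=$?      # capture the exit code without tripping set -e

case "$status" in
  0) echo "scan passed" ;;
  3) echo "WARNING: findings exceeded threshold" ;;   # report, but do not fail
  *) echo "scan failed with code $status"; exit "$status" ;;
esac
```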

SARIF Output#

Generate a SARIF v2.1.0 file for integration with GitHub Security, VS Code, and other SARIF-compatible tools:

docker run --rm \
  -e APHIDS_API_KEY="$APHIDS_API_KEY" \
  -v $(pwd)/options.yaml:/output/options.yaml:ro \
  -v $(pwd)/output:/output \
  -v $(pwd):/workspace:ro \
  ghcr.io/darksidesecurity/aphids:latest \
  -o options.yaml --unattended --sarif /output/results.sarif

The SARIF file collects findings from all tool outputs — both JSON (gitleaks, nikto, etc.) and native SARIF (semgrep) — into a single unified report with per-tool runs, severity mappings, CWE tags, and fingerprints for deduplication.
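
To sanity-check that report in a pipeline step, a short script can summarize results per run straight from the SARIF JSON. A sketch (the sample file below is fabricated so the snippet is self-contained; in a real pipeline, point the script at output/results.sarif instead):

```shell
# Write a tiny stand-in SARIF file; a real pipeline would instead read
# the output/results.sarif produced by the scan.
cat > sample.sarif <<'SARIF'
{"version": "2.1.0",
 "runs": [{"tool": {"driver": {"name": "semgrep"}},
           "results": [{"ruleId": "a"}, {"ruleId": "b"}]}]}
SARIF

# Print one line per run: tool name and result count
python3 - sample.sarif <<'PY'
import json, sys

doc = json.load(open(sys.argv[1]))
for run in doc["runs"]:
    print(run["tool"]["driver"]["name"], len(run.get("results", [])))
PY
```

For the fabricated sample above, this prints "semgrep 2".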

SARIF + Platform Integration#

Upload SARIF to create inline PR annotations and populate Security → Code scanning alerts:

- name: Upload SARIF to GitHub Security
  uses: github/codeql-action/upload-sarif@v3
  if: always()  # Upload even when threshold fails the step
  with:
    sarif_file: output/results.sarif

Branch protection

Enable Require code scanning results in branch protection rules to block merges when SARIF findings are present.

GitLab supports SARIF as a SAST report artifact:

artifacts:
  reports:
    sast: output/results.sarif  # artifact paths must be relative to $CI_PROJECT_DIR

Findings appear in the Security dashboard and as MR widgets. Use allow_failure: exit_codes: [3] to show threshold warnings without blocking the pipeline.

VS Code: install the SARIF Viewer extension, then open results.sarif. Findings appear as inline editor annotations with severity, description, and CWE references.

SARIF v2.1.0 is supported by DefectDojo, SonarQube, Jira (via import plugins), Azure DevOps, and any tool conforming to the OASIS SARIF specification.


How It All Fits Together#

┌─────────────────────────────────────────────────────┐
│  CI/CD Pipeline (GitHub Actions, GitLab, Jenkins)   │
├─────────────────────────────────────────────────────┤
│                                                     │
│  1. Checkout code                                   │
│  2. docker run aphids ... \                         │
│       --fail-on-severity high \                     │
│       --sarif /output/results.sarif                 │
│                                                     │
│  ┌───────────────────────────────────────────┐      │
│  │  APHIDS Container                         │      │
│  │  ├── Run tools (semgrep, gitleaks, etc.)  │      │
│  │  ├── Print findings summary table         │      │
│  │  ├── Write results.sarif (all findings)   │      │
│  │  └── Check thresholds → exit 0 or 3       │      │
│  └───────────────────────────────────────────┘      │
│                                                     │
│  3. Upload SARIF → Security alerts (PR annotations) │
│  4. Exit code 3 → step fails → PR blocked           │
│  5. Results also sent to Hive (online mode)         │
│                                                     │
└─────────────────────────────────────────────────────┘
  • SARIF = the detailed report (what, where, severity)
  • Exit code 3 = the gate (pass/fail decision)
  • Hive = centralized tracking across all repos and scans

Exit Codes#

Code  Meaning                                                               Pipeline Action
-------------------------------------------------------------------------------------------
0     All modules completed, no thresholds exceeded                         Pass
1     Execution error                                                       Fail
2     Configuration error                                                   Fail
3     Findings exceeded threshold (--fail-on-severity or --fail-on-count)   Fail

Using exit code 3 in GitLab

GitLab supports allow_failure: exit_codes: [3] to mark the job as a warning instead of a hard failure. This lets you surface threshold violations without blocking the pipeline while you tune thresholds.


API Key Management#

Never hardcode API keys. Use your platform's secret management:

Platform         Secret Setup                           Usage in Pipeline
---------------------------------------------------------------------------------
GitHub Actions   Settings → Secrets → Actions           ${{ secrets.APHIDS_API_KEY }}
GitLab CI        Settings → CI/CD → Variables (masked)  $APHIDS_API_KEY
Jenkins          Credentials → Add → Secret text        credentials('aphids-api-key')
Azure Pipelines  Pipelines → Library → Variable groups  $(APHIDS_API_KEY)
AWS              Secrets Manager / Parameter Store      SDK or CLI retrieval
GCP              Secret Manager                         gcloud secrets versions access
HashiCorp Vault  vault kv put secret/aphids             vault kv get -field=api_key

Results upload to Hive automatically for centralized vulnerability tracking across all repositories.