Compare commits

...

51 Commits

Author SHA1 Message Date
dependabot[bot] 124d1b9c65 chore(deps): bump actions/github-script from 8.0.0 to 9.0.0
Bumps [actions/github-script](https://github.com/actions/github-script) from 8.0.0 to 9.0.0.
- [Release notes](https://github.com/actions/github-script/releases)
- [Commits](https://github.com/actions/github-script/compare/ed597411d8f924073f98dfc5c65a23a2325f34cd...3a2844b7e9c422d3c10d287c895573f7108da1b3)

---
updated-dependencies:
- dependency-name: actions/github-script
  dependency-version: 9.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-05-14 12:37:45 +00:00
Pepe Fagoaga 3410fc927a chore(security): replace safety with osv-scanner (#11167) 2026-05-14 14:35:09 +02:00
Alejandro Bailo dcf91ef252 feat(ui): add health check endpoint (#11145) 2026-05-14 13:47:48 +02:00
lydiavilchez bf4fd8fabd fix(googleworkspace): use per-service resources for Directory (#11176) 2026-05-14 13:07:06 +02:00
Alejandro Bailo 3d65208fd3 fix(ui): update vulnerable npm packages (#11173) 2026-05-14 12:55:29 +02:00
Adrián Peña 7d3ed62e90 chore(sdk): migrate from poetry to uv (#11162) 2026-05-14 12:51:57 +02:00
lydiavilchez 5f92989492 fix(googleworkspace): use per-service resources for Calendar and Drive (#11161) 2026-05-14 12:43:29 +02:00
Hugo Pereira Brito 6befa78978 fix(cloudflare): plan-aware WAF FAIL hints for zones (#9896) 2026-05-14 12:27:47 +02:00
lydiavilchez 78af0c24fe fix(googleworkspace): use per-service resources for Gmail (#11169) 2026-05-14 12:01:07 +02:00
Andoni Alonso 1bb547e5e1 docs(cloudflare): add pre-configured token creation links (#11156) 2026-05-14 11:58:00 +02:00
June 1f39b01fb2 feat(sagemaker): add sagemaker_domain_sso_configured check (#11094)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-05-14 11:42:30 +02:00
AOrps fb0ef391f2 ci(api): replace poetry with uv (api) (#10775)
Signed-off-by: AOrps <aorbeandrews@gmail.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-05-14 11:17:17 +02:00
Pablo Fernandez Guerra (PFE) f2e6a3264d chore(ui): scope prek pre-commit to staged files, drop legacy husky (#11118)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-14 11:12:25 +02:00
Hugo Pereira Brito 9bd4e4b65c fix(ui): enforce 100-char limit on mute rule name input (#11158) 2026-05-14 09:13:36 +01:00
Hugo Pereira Brito 68ffb2b219 docs(sdk): update Scanning Unused Services tutorial (#11151) 2026-05-14 08:26:29 +01:00
Hugo Pereira Brito 739be07077 chore(aws): skip unattached IAM policies unless --scan-unused-services (#11150) 2026-05-14 08:10:20 +01:00
Alejandro Bailo 0abbb7fc59 feat(mcp): add finding groups tools (#11140) 2026-05-13 18:11:32 +02:00
Alan Buscaglia 0b4393776c chore: harden npm supply chain controls (#11157) 2026-05-13 17:30:25 +02:00
Daniel Barranquero 4dd5baadf6 feat(okta): add provider to the SDK with 1 security check (#11079) 2026-05-13 15:57:57 +02:00
Pablo Fernandez Guerra (PFE) 934d995661 test(ui): fix flaky attack paths test (#11154)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-05-13 15:05:18 +02:00
Hugo Pereira Brito ccdc01ed7b fix(ui): render inline code without literal backticks in finding drawer (#11142) 2026-05-13 10:31:48 +01:00
Andoni Alonso d84099e87a feat(aws): add external resource link to AWS Console (#9172)
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-05-13 10:16:28 +01:00
Hugo Pereira Brito cf55f7eb43 style(sdk): apply black formatting to contrib/inventory-graph (#11153) 2026-05-13 09:52:46 +01:00
Rubén De la Torre Vico 9293c7b58d fix(api): correct service principal for Bedrock AgentCore attack paths (#11141) 2026-05-13 10:14:59 +02:00
Pepe Fagoaga a883bb30d4 chore: SAML ACS URL is only shown if the email domain is configured (#11144)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-05-13 09:38:19 +02:00
Sandiyo Christan e476bbde2d feat(outputs): add AWS inventory connectivity graph output format (#10382)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-05-13 09:22:37 +02:00
abdou 7f3dcdf02f fix(m365): surface AuditLog.Read.All permission errors instead of false positives (#10907)
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-05-12 18:22:19 +01:00
Alejandro Bailo 132e79df89 chore(skills): update Next.js guidance to version 16 (#11143) 2026-05-12 19:05:13 +02:00
Alejandro Bailo b2ed9ee221 refactor: clean tests and improve selector (#11139) 2026-05-12 17:21:50 +02:00
Hugo Pereira Brito def2d3d188 chore(skills): forbid /issues/ links in changelog entries (#11121) 2026-05-12 16:08:01 +01:00
Pablo Fernandez Guerra (PFE) 1090ed59b7 feat(ui): replace D3+Dagre attack path graph with React Flow (#10686)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-05-12 16:33:29 +02:00
Alan Buscaglia 67e4b1a082 docs(skills): clarify changelog release preflight (#11137) 2026-05-12 16:06:19 +02:00
Prowler Bot 7478ec9420 chore(docs): Bump version to v5.26.1 (#11132)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-12 15:07:19 +02:00
Josema Camacho a30b6623ed fix(api): make findings GIN index migration idempotent (#11129) 2026-05-12 13:47:08 +02:00
Daniel Barranquero 15bc317ec4 chore(changelog): prepare changelog for v5.26.1 (#11127) 2026-05-12 13:14:41 +02:00
Alejandro Bailo 1536102784 fix(ui): fix role cancel and select dropdown scroll (#11125)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-05-12 13:10:56 +02:00
Hugo Pereira Brito 1b99550572 feat(m365): add entra_service_principal_no_secrets_for_permanent_tier0_roles security check (#10788)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-05-12 10:45:32 +01:00
Josema Camacho 6dfa135755 perf(api): add multi-column GIN index on findings array fields (#11001) 2026-05-12 11:45:16 +02:00
Hugo Pereira Brito 80482da1cb refactor(m365): scope entra_emergency_access_exclusion to Block-grant policies (#10849) 2026-05-12 10:40:46 +01:00
Adrián Peña 9cedbd3582 fix(api): defer scan broker publish until transaction commits (#11122) 2026-05-12 11:04:39 +02:00
Pablo Fernandez Guerra (PFE) c3d1c5c5f7 chore(ui): remove unused npm dependencies flagged by knip (#11115)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-12 10:07:15 +02:00
Pablo Fernandez Guerra (PFE) 1fd6c51af6 chore(precommit): scope zizmor hook to workflows, actions and dependabot (#10997)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-12 10:06:47 +02:00
Hugo Pereira Brito adbfc0bcd1 docs(compliance): expand developer guide for new compliance frameworks (#10870)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2026-05-12 09:04:35 +01:00
Hugo Pereira Brito 8f041f6f52 docs(changelog): link entra_users_mfa_capable fix to PR #11002 (#11120) 2026-05-12 08:52:50 +01:00
Hugo Pereira Brito 1b0e12ec51 fix(m365): exclude disabled guest users from entra_users_mfa_capable (#11002) 2026-05-12 08:35:24 +01:00
Daniel Barranquero 759f7b84d6 feat(aws): add cloudtrail_bedrock_logging_enabled security check (#10858) 2026-05-11 17:11:49 +02:00
Hugo Pereira Brito 0b26c1a39c feat(aws): add iam_user_access_not_stale_to_sagemaker security check (#11000)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-05-11 16:34:18 +02:00
Prowler Bot fc7fbddfe7 chore(docs): Bump version to v5.26.0 (#11108)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-11 15:38:23 +02:00
Prowler Bot 500b395125 chore(api): Bump version to v1.28.0 (#11112)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-11 15:36:36 +02:00
Prowler Bot a1961d6d5f chore(sdk): Bump version to v5.27.0 (#11109)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-11 15:35:59 +02:00
Prowler Bot a7e988c361 chore(ui): Bump version to v5.27.0 (#11110)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-11 15:35:45 +02:00
421 changed files with 35739 additions and 27487 deletions
+5 -6
@@ -2,20 +2,19 @@
 # Runs automatically on `wt switch --create`.

 # Block 1: setup + copy gitignored env files (.envrc, ui/.env.local)
-# from the primary worktree patterns selected via .worktreeinclude.
+# from the primary worktree - patterns selected via .worktreeinclude.
 [[pre-start]]
 skills = "./skills/setup.sh --claude"
-python = "poetry env use python3.12"
 envs = "wt step copy-ignored"

-# Block 2: install Python deps (requires `poetry env use` from block 1).
+# Block 2: install Python deps (uv manages the venv on `uv sync`).
 [[pre-start]]
-deps = "poetry install --with dev"
+deps = "uv sync"

-# Block 3: reminder last visible output before `wt switch` returns.
+# Block 3: reminder - last visible output before `wt switch` returns.
 # Hooks can't mutate the parent shell, so venv activation is manual.
 [[pre-start]]
-reminder = "echo '>> Reminder: activate the venv in this shell with: eval $(poetry env activate)'"
+reminder = "echo '>> Reminder: activate the venv in this shell with: source .venv/bin/activate'"

 # Background: pnpm install runs while you start working.
 # Tail logs via `wt config state logs`.
+1 -1
@@ -145,7 +145,7 @@ SENTRY_RELEASE=local
 NEXT_PUBLIC_SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT}

 #### Prowler release version ####
-NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.26.0
+NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.27.0

 # Social login credentials
 SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
+169
@@ -0,0 +1,169 @@
name: 'OSV-Scanner'
description: 'Install osv-scanner and scan a lockfile, failing on HIGH/CRITICAL/UNKNOWN severity findings. Posts/updates a PR comment with findings on pull_request events (requires pull-requests: write).'
author: 'Prowler'
inputs:
lockfile:
description: 'Path to the lockfile to scan, relative to the repository root (e.g. uv.lock, api/uv.lock, ui/pnpm-lock.yaml).'
required: true
severity-levels:
description: 'Comma-separated severity levels that fail the scan. Default: HIGH,CRITICAL,UNKNOWN.'
required: false
default: 'HIGH,CRITICAL,UNKNOWN'
version:
description: 'osv-scanner release tag to install. When overriding, you MUST also override binary-sha256.'
required: false
default: 'v2.3.8'
binary-sha256:
description: 'Expected SHA256 of osv-scanner_linux_amd64 for the given version. Default tracks v2.3.8. See https://github.com/google/osv-scanner/releases/download/<version>/osv-scanner_SHA256SUMS.'
required: false
default: 'bc98e15319ed0d515e3f9235287ba53cdc5535d576d24fd573978ecfe9ab92dc'
post-pr-comment:
description: 'Post or update a PR comment with the scan report. Only effective on pull_request events. Requires pull-requests: write permission on the caller job.'
required: false
default: 'true'
runs:
using: 'composite'
steps:
- name: Install osv-scanner
shell: bash
env:
OSV_SCANNER_VERSION: ${{ inputs.version }}
# Download the binary AND the published SHA256SUMS file, then verify the
# binary checksum against the upstream-signed manifest. Aborts on mismatch.
run: |
set -euo pipefail
if command -v osv-scanner >/dev/null 2>&1; then
INSTALLED="$(osv-scanner --version 2>&1 | awk '/scanner version/ {print $NF; exit}')"
if [ "v${INSTALLED}" = "${OSV_SCANNER_VERSION}" ]; then
echo "osv-scanner ${OSV_SCANNER_VERSION} already installed."
exit 0
fi
fi
BASE="https://github.com/google/osv-scanner/releases/download/${OSV_SCANNER_VERSION}"
BIN_NAME="osv-scanner_linux_amd64"
curl -fSL --retry 3 "${BASE}/${BIN_NAME}" -o "${RUNNER_TEMP}/${BIN_NAME}"
curl -fSL --retry 3 "${BASE}/osv-scanner_SHA256SUMS" -o "${RUNNER_TEMP}/osv-scanner_SHA256SUMS"
(cd "${RUNNER_TEMP}" && sha256sum --check --ignore-missing osv-scanner_SHA256SUMS)
chmod +x "${RUNNER_TEMP}/${BIN_NAME}"
sudo mv "${RUNNER_TEMP}/${BIN_NAME}" /usr/local/bin/osv-scanner
rm -f "${RUNNER_TEMP}/osv-scanner_SHA256SUMS"
osv-scanner --version
- name: Run osv-scanner
id: scan
shell: bash
working-directory: ${{ github.workspace }}
env:
OSV_LOCKFILE: ${{ inputs.lockfile }}
OSV_SEVERITY_LEVELS: ${{ inputs.severity-levels }}
OSV_REPORT_FILE: ${{ runner.temp }}/osv-scanner-findings.json
# Per-vulnerability ignores (reason + expiry) live in osv-scanner.toml at the repo root, if present.
# Severity filter is enforced in the wrapper via OSV_SEVERITY_LEVELS.
# `continue-on-error: true` lets the PR-comment step run even when findings exist;
# the gate step below re-fails the job from the wrapper exit code.
continue-on-error: true
run: ./.github/scripts/osv-scan.sh --lockfile="${OSV_LOCKFILE}"
- name: Post osv-scanner report on PR
if: >-
always()
&& inputs.post-pr-comment == 'true'
&& github.event_name == 'pull_request'
&& github.event.pull_request.head.repo.full_name == github.repository
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
OSV_REPORT_FILE: ${{ runner.temp }}/osv-scanner-findings.json
OSV_LOCKFILE: ${{ inputs.lockfile }}
OSV_SEVERITY_LEVELS: ${{ inputs.severity-levels }}
with:
script: |
const fs = require('fs');
const lockfile = process.env.OSV_LOCKFILE;
const severityLevels = process.env.OSV_SEVERITY_LEVELS;
const reportFile = process.env.OSV_REPORT_FILE;
const marker = `<!-- osv-scanner-report:${lockfile} -->`;
const runUrl = `${context.serverUrl}/${context.repo.owner}/${context.repo.repo}/actions/runs/${context.runId}`;
let findings = [];
if (fs.existsSync(reportFile)) {
try {
findings = JSON.parse(fs.readFileSync(reportFile, 'utf8'));
} catch (err) {
core.warning(`Could not parse ${reportFile}: ${err.message}`);
return;
}
}
const { data: comments } = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const existing = comments.find(c => c.body?.includes(marker));
if (findings.length === 0) {
if (existing) {
await github.rest.issues.deleteComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: existing.id,
});
core.info(`Deleted stale osv-scanner comment for ${lockfile}.`);
} else {
core.info(`No findings and no stale comment for ${lockfile}.`);
}
return;
}
const sevIcon = (s) => ({
CRITICAL: '🔴', HIGH: '🟠', MEDIUM: '🟡', LOW: '🟢', UNKNOWN: '⚪',
}[s] || '⚪');
const escape = (s) => String(s ?? '').replace(/\|/g, '\\|').replace(/\n/g, ' ');
const rows = findings.map(f =>
`| ${sevIcon(f.severity)} ${f.severity}${f.score ? ` (${f.score})` : ''} | \`${escape(f.id)}\` | \`${escape(f.ecosystem)}/${escape(f.package)}\` | \`${escape(f.version)}\` | ${escape(f.summary || '(no summary)')} |`
);
const body = [
marker,
`## 🔒 osv-scanner: ${findings.length} finding(s) in \`${lockfile}\``,
'',
`Severity gate: \`${severityLevels}\``,
'',
'| Severity | ID | Package | Version | Summary |',
'|----------|----|---------|---------|---------|',
...rows,
'',
`To accept a finding, add an \`[[IgnoredVulns]]\` entry to \`osv-scanner.toml\` at the repo root with a reason and \`ignoreUntil\`.`,
'',
`<sub>[View run](${runUrl})</sub>`,
].join('\n');
if (existing) {
await github.rest.issues.updateComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: existing.id,
body,
});
core.info(`Updated osv-scanner comment for ${lockfile}.`);
} else {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body,
});
core.info(`Posted new osv-scanner comment for ${lockfile}.`);
}
- name: Enforce osv-scanner severity gate
shell: bash
env:
SCAN_OUTCOME: ${{ steps.scan.outcome }}
run: |
if [ "${SCAN_OUTCOME}" != "success" ]; then
echo "osv-scanner gate: scan reported findings (outcome=${SCAN_OUTCOME})" >&2
exit 1
fi
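For reference, a caller job would invoke this composite action roughly as follows. This is a sketch: the job name and the `./.github/actions/osv-scanner` path are assumptions inferred from the file layout implied by this diff, not taken verbatim from a workflow in the changeset.

```yaml
jobs:
  dependency-scan:            # illustrative job name
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write    # required by the PR-comment step, per the action description
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - name: Scan lockfile with osv-scanner
        uses: ./.github/actions/osv-scanner   # assumed path of this composite action
        with:
          lockfile: api/uv.lock
          severity-levels: HIGH,CRITICAL,UNKNOWN
```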
@@ -1,5 +1,5 @@
-name: 'Setup Python with Poetry'
-description: 'Setup Python environment with Poetry and install dependencies'
+name: 'Setup Python with uv'
+description: 'Setup Python environment with uv and install dependencies'
 author: 'Prowler'
 inputs:
@@ -7,23 +7,15 @@ inputs:
     description: 'Python version to use'
     required: true
   working-directory:
-    description: 'Working directory for Poetry'
+    description: 'Working directory for uv'
     required: false
     default: '.'
-  poetry-version:
-    description: 'Poetry version to install'
+  uv-version:
+    description: 'uv version to install'
     required: false
-    default: '2.3.4'
+    default: '0.11.14'
   install-dependencies:
-    description: 'Install Python dependencies with Poetry'
-    required: false
-    default: 'true'
-  update-lock:
-    description: 'Run `poetry lock` during setup. Only enable when a prior step mutates pyproject.toml (e.g. API `@master` VCS rewrite). Default: false.'
-    required: false
-    default: 'false'
-  enable-cache:
-    description: 'Whether to enable Poetry dependency caching via actions/setup-python'
+    description: 'Install Python dependencies with uv'
     required: false
     default: 'true'
@@ -47,54 +39,52 @@ runs:
         sed -i "s|\(git+https://github.com/prowler-cloud/prowler[^@]*\)@master|\1@$BRANCH_NAME|g" pyproject.toml
       fi

-  - name: Install poetry
-    shell: bash
-    run: |
-      python -m pip install --upgrade pip
-      pipx install poetry==${INPUTS_POETRY_VERSION}
-    env:
-      INPUTS_POETRY_VERSION: ${{ inputs.poetry-version }}
-
-  - name: Update poetry.lock with latest Prowler commit
+  - name: Update uv.lock with latest Prowler commit
     if: github.repository_owner == 'prowler-cloud' && github.repository != 'prowler-cloud/prowler'
     shell: bash
     working-directory: ${{ inputs.working-directory }}
     run: |
       LATEST_COMMIT=$(curl -s "https://api.github.com/repos/prowler-cloud/prowler/commits/master" | jq -r '.sha')
       echo "Latest commit hash: $LATEST_COMMIT"
-      sed -i '/url = "https:\/\/github\.com\/prowler-cloud\/prowler\.git"/,/resolved_reference = / {
-        s/resolved_reference = "[a-f0-9]\{40\}"/resolved_reference = "'"$LATEST_COMMIT"'"/
-      }' poetry.lock
-      echo "Updated resolved_reference:"
-      grep -A2 -B2 "resolved_reference" poetry.lock
+      sed -i "s|\(git = \"https://github\.com/prowler-cloud/prowler\.git?rev=master\)#[a-f0-9]\{40\}\"|\1#${LATEST_COMMIT}\"|g" uv.lock
+      echo "Updated uv.lock entry:"
+      grep "prowler-cloud/prowler" uv.lock

-  - name: Update poetry.lock (prowler repo only)
-    if: github.repository == 'prowler-cloud/prowler' && inputs.update-lock == 'true'
+  - name: Update uv.lock SDK commit (prowler repo on push)
+    if: github.event_name == 'push' && github.ref == 'refs/heads/master' && github.repository == 'prowler-cloud/prowler'
     shell: bash
     working-directory: ${{ inputs.working-directory }}
-    run: poetry lock
+    run: |
+      LATEST_COMMIT=$(curl -s "https://api.github.com/repos/prowler-cloud/prowler/commits/master" | jq -r '.sha')
+      echo "Latest commit hash: $LATEST_COMMIT"
+      sed -i "s|\(git = \"https://github\.com/prowler-cloud/prowler\.git?rev=master\)#[a-f0-9]\{40\}\"|\1#${LATEST_COMMIT}\"|g" uv.lock
+      echo "Updated uv.lock entry:"
+      grep "prowler-cloud/prowler" uv.lock
+
+  - name: Install uv
+    shell: bash
+    env:
+      UV_VERSION: ${{ inputs.uv-version }}
+    run: pip install --no-cache-dir --upgrade pip && pip install --no-cache-dir "uv==${UV_VERSION}"

   - name: Set up Python ${{ inputs.python-version }}
     uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
     with:
       python-version: ${{ inputs.python-version }}
-      # Disable cache when callers skip dependency install: Poetry 2.3.4 creates
-      # the venv in a path setup-python can't hash, breaking the post-step save-cache.
-      cache: ${{ inputs.enable-cache == 'true' && 'poetry' || '' }}
-      cache-dependency-path: ${{ inputs.enable-cache == 'true' && format('{0}/poetry.lock', inputs.working-directory) || '' }}
+      cache: 'pip'

   - name: Install Python dependencies
     if: inputs.install-dependencies == 'true'
     shell: bash
     working-directory: ${{ inputs.working-directory }}
     run: |
-      poetry install --no-root
-      poetry run pip list
+      uv sync --no-install-project
+      uv run pip list

   - name: Update Prowler Cloud API Client
     if: github.repository_owner == 'prowler-cloud' && github.repository != 'prowler-cloud/prowler'
     shell: bash
     working-directory: ${{ inputs.working-directory }}
     run: |
-      poetry remove prowler-cloud-api-client
-      poetry add ./prowler-cloud-api-client
+      uv remove prowler-cloud-api-client
+      uv add ./prowler-cloud-api-client
+7
@@ -72,6 +72,11 @@ provider/vercel:
       - any-glob-to-any-file: "prowler/providers/vercel/**"
       - any-glob-to-any-file: "tests/providers/vercel/**"

+provider/okta:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/providers/okta/**"
+      - any-glob-to-any-file: "tests/providers/okta/**"
+
 github_actions:
   - changed-files:
       - any-glob-to-any-file: ".github/workflows/*"
@@ -109,6 +114,8 @@ mutelist:
       - any-glob-to-any-file: "tests/providers/googleworkspace/lib/mutelist/**"
       - any-glob-to-any-file: "prowler/providers/vercel/lib/mutelist/**"
       - any-glob-to-any-file: "tests/providers/vercel/lib/mutelist/**"
+      - any-glob-to-any-file: "prowler/providers/okta/lib/mutelist/**"
+      - any-glob-to-any-file: "tests/providers/okta/lib/mutelist/**"

 integration/s3:
   - changed-files:
+2 -1
@@ -36,6 +36,7 @@ Please add a detailed description of how to review this PR.

 #### UI
 - [ ] All issue/task requirements work as expected on the UI
+- [ ] If this PR adds or updates npm dependencies, include package-health evidence (maintenance, popularity, known vulnerabilities, license, release age) and explain why existing/native alternatives are insufficient.
 - [ ] Screenshots/Video of the functionality flow (if applicable) - Mobile (X < 640px)
 - [ ] Screenshots/Video of the functionality flow (if applicable) - Table (640px > X < 1024px)
 - [ ] Screenshots/Video of the functionality flow (if applicable) - Desktop (X > 1024px)
@@ -48,7 +49,7 @@ Please add a detailed description of how to review this PR.
 - [ ] Performance test results (if applicable)
 - [ ] Any other relevant evidence of the implementation (if applicable)
 - [ ] Verify if API specs need to be regenerated.
-- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
+- [ ] Check if version updates are required (e.g., specs, uv, etc.).
 - [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.

 ### License
+122
@@ -0,0 +1,122 @@
#!/usr/bin/env bash
# Run osv-scanner and fail when findings match the configured severity levels.
#
# Replaces `safety check --policy-file .safety-policy.yml`. Used by:
# - .github/actions/osv-scanner/action.yml (composite action)
# - .github/workflows/api-security.yml, sdk-security.yml, ui-security.yml
#
# Severity levels (comma-separated) are read from OSV_SEVERITY_LEVELS.
# Default: HIGH,CRITICAL,UNKNOWN — preserves prior .safety-policy.yml policy
# (ignore-cvss-severity-below: 7 + ignore-cvss-unknown-severity: False).
# osv-scanner has no native CVSS threshold (google/osv-scanner#1400, closed
# not-planned). Severity is derived from $group.max_severity (numeric CVSS
# score string) which osv-scanner emits per group.
#
# CVSS v3 score → categorical label:
# CRITICAL >= 9.0
# HIGH >= 7.0
# MEDIUM >= 4.0
# LOW > 0.0
# UNKNOWN no score available
#
# Per-vulnerability ignores (with reason + expiry) live in osv-scanner.toml at
# the repo root, if it exists. Without that file, osv-scanner uses defaults.
#
# Usage:
# osv-scan.sh [osv-scanner pass-through args...]
# Examples:
# osv-scan.sh --lockfile=uv.lock
# osv-scan.sh --recursive .
# OSV_SEVERITY_LEVELS=CRITICAL osv-scan.sh --lockfile=uv.lock
set -euo pipefail
ROOT="$(git rev-parse --show-toplevel)"
CONFIG="${ROOT}/osv-scanner.toml"
SEVERITY_LEVELS="${OSV_SEVERITY_LEVELS:-HIGH,CRITICAL,UNKNOWN}"
for bin in osv-scanner jq; do
if ! command -v "${bin}" >/dev/null 2>&1; then
echo "error: ${bin} not found in PATH" >&2
exit 2
fi
done
SCAN_ARGS=()
if [ -f "${CONFIG}" ]; then
SCAN_ARGS+=(--config="${CONFIG}")
fi
# Exit codes: 0=clean, 1=findings, 127=no supported files, 128=internal error.
STDERR="$(mktemp)"
trap 'rm -f "${STDERR}"' EXIT
set +e
OUTPUT="$(osv-scanner scan source "${SCAN_ARGS[@]}" --format=json "$@" 2>"${STDERR}")"
RC=$?
set -e
case "${RC}" in
0|1) ;;
127) echo "osv-scanner: no supported lockfiles in scan target"; exit 0 ;;
*)
echo "osv-scanner: exited with code ${RC}" >&2
[ -s "${STDERR}" ] && cat "${STDERR}" >&2
exit "${RC}"
;;
esac
# Build a JSON array of normalized severity levels for jq.
SEVERITY_JSON="$(printf '%s' "${SEVERITY_LEVELS}" | jq -Rc '
split(",") | map(ascii_upcase | sub("^\\s+"; "") | sub("\\s+$"; ""))
')"
# Walk each vulnerability, look up its group's max_severity (numeric CVSS),
# map to a categorical label, then filter by OSV_SEVERITY_LEVELS.
FINDINGS="$(printf '%s' "${OUTPUT}" | jq --argjson sevs "${SEVERITY_JSON}" '
[ .results[]?.packages[]?
| . as $pkg
| ($pkg.groups // []) as $groups
| ($pkg.vulnerabilities // [])[]
| . as $vuln
| ([ $groups[] | select((.ids // []) | index($vuln.id)) ][0] // {}) as $group
| (($group.max_severity // "") | tonumber? // null) as $score
| (if $score == null then "UNKNOWN"
elif $score >= 9.0 then "CRITICAL"
elif $score >= 7.0 then "HIGH"
elif $score >= 4.0 then "MEDIUM"
elif $score > 0 then "LOW"
else "UNKNOWN"
end) as $label
| {
id: $vuln.id,
severity: $label,
score: $score,
summary: ($vuln.summary // null),
package: $pkg.package.name,
version: $pkg.package.version,
ecosystem: $pkg.package.ecosystem
}
| select(.severity as $s | $sevs | any(. == $s))
]
')"
COUNT="$(printf '%s' "${FINDINGS}" | jq 'length')"
# Write the findings JSON to OSV_REPORT_FILE so callers (e.g. the composite
# action's PR-comment step) can consume the same data the gate decision uses.
if [ -n "${OSV_REPORT_FILE:-}" ]; then
printf '%s' "${FINDINGS}" > "${OSV_REPORT_FILE}"
fi
if [ "${COUNT}" -gt 0 ]; then
echo "osv-scanner: ${COUNT} finding(s) at severity ${SEVERITY_LEVELS}"
printf '%s' "${FINDINGS}" | jq -r '
.[] | " [\(.severity)\(if .score then " \(.score)" else "" end)] \(.id) \(.ecosystem)/\(.package)@\(.version) — \(.summary // "(no summary)")"
'
echo
echo "To accept a finding, create osv-scanner.toml at the repo root with a reason and ignoreUntil."
exit 1
fi
echo "osv-scanner: no findings at severity levels: ${SEVERITY_LEVELS}"
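The score-to-label mapping and severity gate that the jq program above implements can be sketched in Python. This is an illustrative re-expression of the same logic, not code from the changeset; the function names are made up.

```python
import os


def cvss_label(score):
    """Map a numeric CVSS v3 score (or None) to the script's categorical label."""
    if score is None:
        return "UNKNOWN"
    if score >= 9.0:
        return "CRITICAL"
    if score >= 7.0:
        return "HIGH"
    if score >= 4.0:
        return "MEDIUM"
    if score > 0:
        return "LOW"
    return "UNKNOWN"


def gated(findings, levels=None):
    """Keep findings whose label is in OSV_SEVERITY_LEVELS (default HIGH,CRITICAL,UNKNOWN)."""
    raw = levels or os.environ.get("OSV_SEVERITY_LEVELS", "HIGH,CRITICAL,UNKNOWN")
    wanted = {s.strip().upper() for s in raw.split(",")}
    return [f for f in findings if cvss_label(f.get("score")) in wanted]


# Example: only the 9.8 (CRITICAL) and the unscored (UNKNOWN) findings pass
# the default gate; the 5.3 maps to MEDIUM and is filtered out.
findings = [
    {"id": "GHSA-aaaa", "score": 9.8},
    {"id": "GHSA-bbbb", "score": 5.3},
    {"id": "GHSA-cccc", "score": None},
]
print([f["id"] for f in gated(findings)])  # prints ['GHSA-aaaa', 'GHSA-cccc']
```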
+8 -8
@@ -43,6 +43,7 @@ jobs:
             pypi.org:443
             files.pythonhosted.org:443
             api.github.com:443
+            raw.githubusercontent.com:443

       - name: Checkout repository
         uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -63,26 +64,25 @@
             api/CHANGELOG.md
             api/AGENTS.md

-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ matrix.python-version }}
           working-directory: ./api
-          update-lock: 'true'

-      - name: Poetry check
+      - name: uv lock check
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry check --lock
+        run: uv lock --check

       - name: Ruff lint
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run ruff check . --exclude contrib
+        run: uv run ruff check . --exclude contrib

       - name: Ruff format
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run ruff format --check . --exclude contrib
+        run: uv run ruff format --check . --exclude contrib

       - name: Pylint
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run pylint --disable=W,C,R,E -j 0 -rn -sn src/
+        run: uv run pylint --disable=W,C,R,E -j 0 -rn -sn src/
+26 -15
@@ -9,7 +9,9 @@ on:
       - 'api/**'
       - '.github/workflows/api-tests.yml'
       - '.github/workflows/api-security.yml'
-      - '.github/actions/setup-python-poetry/**'
+      - '.github/actions/setup-python-uv/**'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
   pull_request:
     branches:
       - "master"
@@ -18,7 +20,9 @@ on:
       - 'api/**'
       - '.github/workflows/api-tests.yml'
       - '.github/workflows/api-security.yml'
-      - '.github/actions/setup-python-poetry/**'
+      - '.github/actions/setup-python-uv/**'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
 
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
@@ -35,6 +39,7 @@ jobs:
     timeout-minutes: 15
     permissions:
       contents: read
+      pull-requests: write # osv-scanner action posts/updates a PR comment with findings
     strategy:
       matrix:
         python-version:
@@ -52,10 +57,12 @@ jobs:
             pypi.org:443
             files.pythonhosted.org:443
             github.com:443
-            auth.safetycli.com:443
-            pyup.io:443
-            data.safetycli.com:443
             api.github.com:443
+            objects.githubusercontent.com:443
+            release-assets.githubusercontent.com:443
+            api.osv.dev:443
+            api.deps.dev:443
+            osv-vulnerabilities.storage.googleapis.com:443
       - name: Checkout repository
         uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -70,30 +77,34 @@ jobs:
           files: |
             api/**
             .github/workflows/api-security.yml
-            .safety-policy.yml
+            .github/actions/osv-scanner/**
+            .github/scripts/osv-scan.sh
           files_ignore: |
             api/docs/**
             api/README.md
             api/CHANGELOG.md
             api/AGENTS.md
-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ matrix.python-version }}
           working-directory: ./api
+          update-lock: 'true'
       - name: Bandit
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run bandit -q -lll -x '*_test.py,./contrib/' -r .
+        # Exclude .venv because uv places the project venv inside ./api; otherwise
+        # bandit would recurse into installed third-party packages.
+        run: uv run bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .
-      - name: Safety
+      - name: Dependency vulnerability scan with osv-scanner
         if: steps.check-changes.outputs.any_changed == 'true'
-        # Accepted CVEs, severity threshold, and ignore expirations live in ../.safety-policy.yml
-        run: poetry run safety check --policy-file ../.safety-policy.yml
+        uses: ./.github/actions/osv-scanner
+        with:
+          lockfile: api/uv.lock
       - name: Vulture
-        if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run vulture --exclude "contrib,tests,conftest.py" --min-confidence 100 .
+        # Run even when osv-scanner reports findings so dead-code signal isn't masked by SCA failures.
+        if: ${{ !cancelled() && steps.check-changes.outputs.any_changed == 'true' }}
+        run: uv run vulture --exclude "contrib,tests,conftest.py,.venv" --min-confidence 100 .
+4 -4
@@ -87,6 +87,7 @@ jobs:
             files.pythonhosted.org:443
             cli.codecov.io:443
             keybase.io:443
+            raw.githubusercontent.com:443
             ingest.codecov.io:443
             storage.googleapis.com:443
             o26192.ingest.us.sentry.io:443
@@ -112,17 +113,16 @@ jobs:
             api/CHANGELOG.md
             api/AGENTS.md
-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ matrix.python-version }}
           working-directory: ./api
+          update-lock: 'true'
       - name: Run tests with pytest
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run pytest --cov=./src/backend --cov-report=xml src/backend
+        run: uv run pytest --cov=./src/backend --cov-report=xml src/backend
       - name: Upload coverage reports to Codecov
         if: steps.check-changes.outputs.any_changed == 'true'
+1 -1
@@ -29,7 +29,7 @@ jobs:
             api.github.com:443
       - name: Comment and lock issue
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { owner, repo } = context.repo;
+24 -24
@@ -75,7 +75,7 @@ jobs:
         with:
           destination: /opt/gh-aw/actions
       - name: Check workflow file timestamps
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_WORKFLOW_FILE: "issue-triage.lock.yml"
         with:
@@ -86,7 +86,7 @@ jobs:
             await main();
       - name: Compute current body text
         id: compute-text
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { setupGlobals } = require('/opt/gh-aw/actions/setup_globals.cjs');
@@ -96,7 +96,7 @@ jobs:
       - name: Add comment with workflow run link
         id: add-comment
         if: github.event_name == 'issues' || github.event_name == 'issue_comment' || github.event_name == 'pull_request_review_comment' || github.event_name == 'discussion' || github.event_name == 'discussion_comment' || (github.event_name == 'pull_request') && (github.event.pull_request.head.repo.id == github.repository_id)
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_WORKFLOW_NAME: "Issue Triage"
           GH_AW_SAFE_OUTPUT_MESSAGES: "{\"footer\":\"\\u003e 🤖 Generated by [Prowler Issue Triage]({run_url}) [Experimental]\"}"
@@ -148,7 +148,7 @@ jobs:
         with:
           persist-credentials: false
       - name: Merge remote .github folder
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_FILE: ".github/agents/issue-triage.md"
           GH_AW_AGENT_IMPORT_SPEC: "../agents/issue-triage.md"
@@ -175,7 +175,7 @@ jobs:
         id: checkout-pr
         if: |
           github.event.pull_request
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_TOKEN: ${{ secrets.GH_AW_GITHUB_MCP_SERVER_TOKEN || secrets.GH_AW_GITHUB_TOKEN || secrets.GITHUB_TOKEN }}
         with:
@@ -187,7 +187,7 @@ jobs:
             await main();
       - name: Generate agentic run info
         id: generate_aw_info
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const fs = require('fs');
@@ -511,7 +511,7 @@ jobs:
           }
           GH_AW_MCP_CONFIG_EOF
       - name: Generate workflow overview
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { generateWorkflowOverview } = require('/opt/gh-aw/actions/generate_workflow_overview.cjs');
@@ -606,7 +606,7 @@ jobs:
           {{#runtime-import .github/workflows/issue-triage.md}}
           GH_AW_PROMPT_EOF
       - name: Substitute placeholders
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_PROMPT: /tmp/gh-aw/aw-prompts/prompt.txt
           GH_AW_GITHUB_ACTOR: ${{ github.actor }}
@@ -640,7 +640,7 @@ jobs:
             }
           });
       - name: Interpolate variables and render templates
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_PROMPT: /tmp/gh-aw/aw-prompts/prompt.txt
           GH_AW_GITHUB_EVENT_ISSUE_NUMBER: ${{ github.event.issue.number }}
@@ -757,7 +757,7 @@ jobs:
           bash /opt/gh-aw/actions/stop_mcp_gateway.sh "$GATEWAY_PID"
       - name: Redact secrets in logs
         if: always()
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { setupGlobals } = require('/opt/gh-aw/actions/setup_globals.cjs');
@@ -779,7 +779,7 @@ jobs:
           if-no-files-found: warn
       - name: Ingest agent output
         id: collect_output
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_SAFE_OUTPUTS: ${{ env.GH_AW_SAFE_OUTPUTS }}
           GH_AW_ALLOWED_DOMAINS: "*.pythonhosted.org,anaconda.org,api.business.githubcopilot.com,api.enterprise.githubcopilot.com,api.github.com,api.githubcopilot.com,api.individual.githubcopilot.com,api.snapcraft.io,archive.ubuntu.com,azure.archive.ubuntu.com,binstar.org,bootstrap.pypa.io,conda.anaconda.org,conda.binstar.org,crl.geotrust.com,crl.globalsign.com,crl.identrust.com,crl.sectigo.com,crl.thawte.com,crl.usertrust.com,crl.verisign.com,crl3.digicert.com,crl4.digicert.com,crls.ssl.com,files.pythonhosted.org,github.com,host.docker.internal,json-schema.org,json.schemastore.org,keyserver.ubuntu.com,mcp.context7.com,mcp.prowler.com,ocsp.digicert.com,ocsp.geotrust.com,ocsp.globalsign.com,ocsp.identrust.com,ocsp.sectigo.com,ocsp.ssl.com,ocsp.thawte.com,ocsp.usertrust.com,ocsp.verisign.com,packagecloud.io,packages.cloud.google.com,packages.microsoft.com,pip.pypa.io,ppa.launchpad.net,pypi.org,pypi.python.org,raw.githubusercontent.com,registry.npmjs.org,repo.anaconda.com,repo.continuum.io,s.symcb.com,s.symcd.com,security.ubuntu.com,telemetry.enterprise.githubcopilot.com,ts-crl.ws.symantec.com,ts-ocsp.ws.symantec.com"
@@ -808,7 +808,7 @@ jobs:
           if-no-files-found: ignore
       - name: Parse agent logs for step summary
         if: always()
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: /tmp/gh-aw/sandbox/agent/logs/
         with:
@@ -819,7 +819,7 @@ jobs:
             await main();
       - name: Parse MCP gateway logs for step summary
         if: always()
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { setupGlobals } = require('/opt/gh-aw/actions/setup_globals.cjs');
@@ -891,7 +891,7 @@ jobs:
           echo "GH_AW_AGENT_OUTPUT=/tmp/gh-aw/safeoutputs/agent_output.json" >> "$GITHUB_ENV"
       - name: Process No-Op Messages
         id: noop
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_NOOP_MAX: 1
@@ -905,7 +905,7 @@ jobs:
             await main();
       - name: Record Missing Tool
         id: missing_tool
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_WORKFLOW_NAME: "Issue Triage"
@@ -918,7 +918,7 @@ jobs:
             await main();
       - name: Handle Agent Failure
         id: handle_agent_failure
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_WORKFLOW_NAME: "Issue Triage"
@@ -937,7 +937,7 @@ jobs:
             await main();
       - name: Handle No-Op Message
         id: handle_noop_message
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_WORKFLOW_NAME: "Issue Triage"
@@ -954,7 +954,7 @@ jobs:
             await main();
       - name: Update reaction comment with completion status
         id: conclusion
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_COMMENT_ID: ${{ needs.activation.outputs.comment_id }}
@@ -1008,7 +1008,7 @@ jobs:
         run: |
           echo "Agent output-types: $AGENT_OUTPUT_TYPES"
       - name: Setup threat detection
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           WORKFLOW_NAME: "Issue Triage"
           WORKFLOW_DESCRIPTION: "[Experimental] AI-powered issue triage for Prowler - produces coding-agent-ready fix plans"
@@ -1062,7 +1062,7 @@ jobs:
           XDG_CONFIG_HOME: /home/runner
       - name: Parse threat detection results
         id: parse_results
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         with:
           script: |
             const { setupGlobals } = require('/opt/gh-aw/actions/setup_globals.cjs');
@@ -1102,7 +1102,7 @@ jobs:
       - name: Add eyes reaction for immediate feedback
         id: react
         if: github.event_name == 'issues' || github.event_name == 'issue_comment' || github.event_name == 'pull_request_review_comment' || github.event_name == 'discussion' || github.event_name == 'discussion_comment' || (github.event_name == 'pull_request') && (github.event.pull_request.head.repo.id == github.repository_id)
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_REACTION: "eyes"
         with:
@@ -1114,7 +1114,7 @@ jobs:
             await main();
       - name: Check team membership for workflow
         id: check_membership
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_REQUIRED_ROLES: admin,maintainer,write
         with:
@@ -1126,7 +1126,7 @@ jobs:
            await main();
       - name: Check user rate limit
         id: check_rate_limit
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_RATE_LIMIT_MAX: "5"
           GH_AW_RATE_LIMIT_WINDOW: "60"
@@ -1185,7 +1185,7 @@ jobs:
           echo "GH_AW_AGENT_OUTPUT=/tmp/gh-aw/safeoutputs/agent_output.json" >> "$GITHUB_ENV"
       - name: Process Safe Outputs
         id: process_safe_outputs
-        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
+        uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9.0.0
         env:
           GH_AW_AGENT_OUTPUT: ${{ env.GH_AW_AGENT_OUTPUT }}
           GH_AW_SAFE_OUTPUTS_HANDLER_CONFIG: "{\"add_comment\":{\"hide_older_comments\":true,\"max\":1},\"missing_data\":{},\"missing_tool\":{}}"
+3 -3
@@ -59,7 +59,7 @@ jobs:
             ui/**
             prowler/**
             mcp_server/**
-            poetry.lock
+            uv.lock
             pyproject.toml
       - name: Check for folder changes and changelog presence
@@ -84,9 +84,9 @@ jobs:
             fi
           done
-          # Check root-level dependency files (poetry.lock, pyproject.toml)
+          # Check root-level dependency files (uv.lock, pyproject.toml)
           # These are associated with the prowler folder changelog
-          root_deps_changed=$(echo "${STEPS_CHANGED_FILES_OUTPUTS_ALL_CHANGED_FILES}" | tr ' ' '\n' | grep -E "^(poetry\.lock|pyproject\.toml)$" || true)
+          root_deps_changed=$(echo "${STEPS_CHANGED_FILES_OUTPUTS_ALL_CHANGED_FILES}" | tr ' ' '\n' | grep -E "^(uv\.lock|pyproject\.toml)$" || true)
          if [ -n "$root_deps_changed" ]; then
            echo "Detected changes in root dependency files: $root_deps_changed"
            # Check if prowler/CHANGELOG.md was already updated (might have been caught above)
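The root-dependency detection in that run step can be exercised on its own; the sample changed-files list below is invented for illustration, standing in for the changed-files action output:

```shell
#!/bin/sh
# Minimal sketch of the root-dependency detection above.
ALL_CHANGED_FILES="prowler/providers/aws/ec2.py uv.lock docs/index.md"

# Split the space-separated list into lines and keep only root-level
# dependency files; `|| true` keeps `set -e` scripts alive when nothing matches.
root_deps_changed=$(echo "${ALL_CHANGED_FILES}" | tr ' ' '\n' | grep -E "^(uv\.lock|pyproject\.toml)$" || true)

if [ -n "$root_deps_changed" ]; then
  echo "Detected changes in root dependency files: $root_deps_changed"
fi
```

The anchored regex is what keeps `api/uv.lock` or `ui/pyproject.toml` from triggering the root-level branch.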
+8 -8
@@ -40,12 +40,11 @@ jobs:
           token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
           persist-credentials: false
-      - name: Setup Python with Poetry
-        uses: ./.github/actions/setup-python-poetry
+      - name: Setup Python with uv
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: '3.12'
           install-dependencies: 'false'
-          enable-cache: 'false'
       - name: Configure Git
         run: |
@@ -339,10 +338,11 @@ jobs:
             exit 1
           fi
-          # Update poetry lock file
-          echo "Updating poetry.lock file..."
+          # Update uv lock file
+          echo "Updating uv.lock file..."
+          pip install --no-cache-dir uv==0.11.14
           cd api
-          poetry lock
+          uv lock
           cd ..
           echo "✓ Prepared prowler dependency update to: $UPDATED_PROWLER_REF"
@@ -357,7 +357,7 @@ jobs:
           base: ${{ env.BRANCH_NAME }}
           add-paths: |
             api/pyproject.toml
-            api/poetry.lock
+            api/uv.lock
           title: "chore(api): Update prowler dependency to ${{ env.BRANCH_NAME }} for release ${{ env.PROWLER_VERSION }}"
           body: |
             ### Description
@@ -366,7 +366,7 @@ jobs:
             **Changes:**
             - Updates `api/pyproject.toml` prowler dependency from `@master` to `@${{ env.BRANCH_NAME }}`
-            - Updates `api/poetry.lock` file with resolved dependencies
+            - Updates `api/uv.lock` file with resolved dependencies
             This PR should be merged into the `${{ env.BRANCH_NAME }}` release branch.
+9 -7
@@ -71,24 +71,26 @@ jobs:
             contrib/**
             **/AGENTS.md
-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ matrix.python-version }}
-      - name: Check Poetry lock file
+      - name: Check uv lock file
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry check --lock
+        run: uv lock --check
       - name: Lint with flake8
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api,skills
+        run: uv run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude .venv,contrib,ui,api,skills,mcp_server
       - name: Check format with black
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run black --exclude "api|ui|skills" --check .
+        # mcp_server has its own pyproject and uses ruff format, exclude it so SDK black
+        # does not fight ruff over rules it never formatted.
+        run: uv run black --exclude "\.venv|api|ui|skills|mcp_server" --check .
       - name: Lint with pylint
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
+        run: uv run pylint --disable=W,C,R,E -j 0 -rn -sn prowler/
+1 -11
@@ -73,20 +73,10 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup Python with Poetry
-        uses: ./.github/actions/setup-python-poetry
-        with:
-          python-version: ${{ env.PYTHON_VERSION }}
-          install-dependencies: 'false'
-          enable-cache: 'false'
-      - name: Inject poetry-bumpversion plugin
-        run: pipx inject poetry poetry-bumpversion
       - name: Get Prowler version and set tags
         id: get-prowler-version
         run: |
-          PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
+          PROWLER_VERSION="$(grep -E '^version = ' pyproject.toml | sed -E 's/version = "([^"]+)"/\1/' | tr -d '[:space:]')"
           echo "prowler_version=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"
           PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
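The new version extraction reads `pyproject.toml` directly instead of asking Poetry. It can be tried standalone against a throwaway file; the version number below is invented for illustration:

```shell
#!/bin/sh
# Standalone sketch of the grep/sed version extraction above.
tmpdir=$(mktemp -d)
cat > "$tmpdir/pyproject.toml" <<'EOF'
[project]
name = "example"
version = "5.14.0"
EOF

# Same pipeline as the workflow: take the `version = "..."` line, strip the
# key and the quotes, and drop any stray whitespace.
PROWLER_VERSION="$(grep -E '^version = ' "$tmpdir/pyproject.toml" | sed -E 's/version = "([^"]+)"/\1/' | tr -d '[:space:]')"

# `%%.*` removes the longest suffix starting at the first dot, leaving the major version.
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"

echo "version=${PROWLER_VERSION} major=${PROWLER_VERSION_MAJOR}"
rm -rf "$tmpdir"
```

Note the `^version = ` anchor assumes the exact spacing used in the project's `pyproject.toml`; a differently formatted file would need a looser pattern.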
+2 -2
@@ -9,7 +9,7 @@ on:
       - 'prowler/**'
       - 'Dockerfile*'
       - 'pyproject.toml'
-      - 'poetry.lock'
+      - 'uv.lock'
       - '.github/workflows/sdk-container-checks.yml'
   pull_request:
     branches:
@@ -19,7 +19,7 @@ on:
       - 'prowler/**'
       - 'Dockerfile*'
       - 'pyproject.toml'
-      - 'poetry.lock'
+      - 'uv.lock'
       - '.github/workflows/sdk-container-checks.yml'
 
 concurrency:
+6 -8
@@ -75,15 +75,14 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup Python with Poetry
-        uses: ./.github/actions/setup-python-poetry
+      - name: Setup Python with uv
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           install-dependencies: 'false'
-          enable-cache: 'false'
       - name: Build Prowler package
-        run: poetry build
+        run: uv build
       - name: Publish Prowler package to PyPI
         uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
@@ -112,12 +111,11 @@ jobs:
         with:
           persist-credentials: false
-      - name: Setup Python with Poetry
-        uses: ./.github/actions/setup-python-poetry
+      - name: Setup Python with uv
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ env.PYTHON_VERSION }}
           install-dependencies: 'false'
-          enable-cache: 'false'
       - name: Install toml package
         run: pip install toml
@@ -128,7 +126,7 @@ jobs:
           python util/replicate_pypi_package.py
       - name: Build prowler-cloud package
-        run: poetry build
+        run: uv build
       - name: Publish prowler-cloud package to PyPI
         uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
+24 -15
@@ -9,10 +9,12 @@ on:
       - 'prowler/**'
       - 'tests/**'
       - 'pyproject.toml'
-      - 'poetry.lock'
+      - 'uv.lock'
       - '.github/workflows/sdk-tests.yml'
       - '.github/workflows/sdk-security.yml'
-      - '.github/actions/setup-python-poetry/**'
+      - '.github/actions/setup-python-uv/**'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
   pull_request:
     branches:
       - 'master'
@@ -21,10 +23,12 @@ on:
       - 'prowler/**'
       - 'tests/**'
       - 'pyproject.toml'
-      - 'poetry.lock'
+      - 'uv.lock'
       - '.github/workflows/sdk-tests.yml'
       - '.github/workflows/sdk-security.yml'
-      - '.github/actions/setup-python-poetry/**'
+      - '.github/actions/setup-python-uv/**'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
 
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
@@ -39,6 +43,7 @@ jobs:
     timeout-minutes: 15
     permissions:
       contents: read
+      pull-requests: write # osv-scanner action posts/updates a PR comment with findings
     steps:
       - name: Harden Runner
@@ -49,10 +54,12 @@ jobs:
             pypi.org:443
             files.pythonhosted.org:443
             github.com:443
-            auth.safetycli.com:443
-            pyup.io:443
-            data.safetycli.com:443
             api.github.com:443
+            objects.githubusercontent.com:443
+            release-assets.githubusercontent.com:443
+            api.osv.dev:443
+            api.deps.dev:443
+            osv-vulnerabilities.storage.googleapis.com:443
       - name: Checkout repository
         uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -87,21 +94,23 @@ jobs:
             contrib/**
             **/AGENTS.md
-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: '3.12'
       - name: Security scan with Bandit
         if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run bandit -q -lll -x '*_test.py,./contrib/,./api/,./ui' -r .
+        run: uv run bandit -q -lll -x '*_test.py,./.venv/,./contrib/,./api/,./ui' -r .
-      - name: Security scan with Safety
+      - name: Dependency vulnerability scan with osv-scanner
         if: steps.check-changes.outputs.any_changed == 'true'
-        # Accepted CVEs, severity threshold, and ignore expirations live in .safety-policy.yml
-        run: poetry run safety check -r pyproject.toml --policy-file .safety-policy.yml
+        uses: ./.github/actions/osv-scanner
+        with:
+          lockfile: uv.lock
       - name: Dead code detection with Vulture
-        if: steps.check-changes.outputs.any_changed == 'true'
-        run: poetry run vulture --exclude "contrib,api,ui" --min-confidence 100 .
+        # Run even when osv-scanner reports findings so dead-code signal isn't masked by SCA failures.
+        if: ${{ !cancelled() && steps.check-changes.outputs.any_changed == 'true' }}
+        run: uv run vulture --exclude ".venv,contrib,api,ui" --min-confidence 100 .
+57 -33
@@ -92,9 +92,9 @@ jobs:
             contrib/**
             **/AGENTS.md
-      - name: Setup Python with Poetry
+      - name: Setup Python with uv
         if: steps.check-changes.outputs.any_changed == 'true'
-        uses: ./.github/actions/setup-python-poetry
+        uses: ./.github/actions/setup-python-uv
         with:
           python-version: ${{ matrix.python-version }}
@@ -107,7 +107,7 @@ jobs:
           files: |
             ./prowler/**/aws/**
             ./tests/**/aws/**
-            ./poetry.lock
+            ./uv.lock
       - name: Resolve AWS services under test
         if: steps.changed-aws.outputs.any_changed == 'true'
@@ -209,11 +209,11 @@ jobs:
           echo "AWS service_paths='${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}'"
           if [ "${STEPS_AWS_SERVICES_OUTPUTS_RUN_ALL}" = "true" ]; then
-            poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
+            uv run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
           elif [ -z "${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}" ]; then
             echo "No AWS service paths detected; skipping AWS tests."
           else
-            poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml ${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}
+            uv run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml ${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}
           fi
         env:
           STEPS_AWS_SERVICES_OUTPUTS_RUN_ALL: ${{ steps.aws-services.outputs.run_all }}
@@ -237,11 +237,11 @@ jobs:
           files: |
             ./prowler/**/azure/**
             ./tests/**/azure/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Azure tests
         if: steps.changed-azure.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
+        run: uv run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
       - name: Upload Azure coverage to Codecov
         if: steps.changed-azure.outputs.any_changed == 'true'
@@ -261,11 +261,11 @@ jobs:
           files: |
             ./prowler/**/gcp/**
             ./tests/**/gcp/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run GCP tests
         if: steps.changed-gcp.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
+        run: uv run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
       - name: Upload GCP coverage to Codecov
         if: steps.changed-gcp.outputs.any_changed == 'true'
@@ -285,11 +285,11 @@ jobs:
           files: |
             ./prowler/**/kubernetes/**
             ./tests/**/kubernetes/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Kubernetes tests
         if: steps.changed-kubernetes.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
+        run: uv run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
       - name: Upload Kubernetes coverage to Codecov
         if: steps.changed-kubernetes.outputs.any_changed == 'true'
@@ -309,11 +309,11 @@ jobs:
           files: |
             ./prowler/**/github/**
             ./tests/**/github/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run GitHub tests
         if: steps.changed-github.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github
+        run: uv run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github
       - name: Upload GitHub coverage to Codecov
         if: steps.changed-github.outputs.any_changed == 'true'
@@ -324,6 +324,30 @@ jobs:
           flags: prowler-py${{ matrix.python-version }}-github
           files: ./github_coverage.xml
+      # Okta Provider
+      - name: Check if Okta files changed
+        if: steps.check-changes.outputs.any_changed == 'true'
+        id: changed-okta
+        uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
+        with:
+          files: |
+            ./prowler/**/okta/**
+            ./tests/**/okta/**
+            ./uv.lock
+      - name: Run Okta tests
+        if: steps.changed-okta.outputs.any_changed == 'true'
+        run: uv run pytest -n auto --cov=./prowler/providers/okta --cov-report=xml:okta_coverage.xml tests/providers/okta
+      - name: Upload Okta coverage to Codecov
+        if: steps.changed-okta.outputs.any_changed == 'true'
+        uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
+        with:
+          flags: prowler-py${{ matrix.python-version }}-okta
+          files: ./okta_coverage.xml
       # NHN Provider
       - name: Check if NHN files changed
         if: steps.check-changes.outputs.any_changed == 'true'
@@ -333,11 +357,11 @@ jobs:
           files: |
             ./prowler/**/nhn/**
             ./tests/**/nhn/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run NHN tests
         if: steps.changed-nhn.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
+        run: uv run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
       - name: Upload NHN coverage to Codecov
         if: steps.changed-nhn.outputs.any_changed == 'true'
@@ -357,11 +381,11 @@ jobs:
           files: |
             ./prowler/**/m365/**
             ./tests/**/m365/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run M365 tests
         if: steps.changed-m365.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
+        run: uv run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
       - name: Upload M365 coverage to Codecov
         if: steps.changed-m365.outputs.any_changed == 'true'
@@ -381,11 +405,11 @@ jobs:
           files: |
             ./prowler/**/iac/**
             ./tests/**/iac/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run IaC tests
         if: steps.changed-iac.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
+        run: uv run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
       - name: Upload IaC coverage to Codecov
         if: steps.changed-iac.outputs.any_changed == 'true'
@@ -405,11 +429,11 @@ jobs:
           files: |
             ./prowler/**/mongodbatlas/**
             ./tests/**/mongodbatlas/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run MongoDB Atlas tests
         if: steps.changed-mongodbatlas.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/mongodbatlas --cov-report=xml:mongodbatlas_coverage.xml tests/providers/mongodbatlas
+        run: uv run pytest -n auto --cov=./prowler/providers/mongodbatlas --cov-report=xml:mongodbatlas_coverage.xml tests/providers/mongodbatlas
       - name: Upload MongoDB Atlas coverage to Codecov
         if: steps.changed-mongodbatlas.outputs.any_changed == 'true'
@@ -429,11 +453,11 @@ jobs:
           files: |
             ./prowler/**/oraclecloud/**
             ./tests/**/oraclecloud/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run OCI tests
         if: steps.changed-oraclecloud.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/oraclecloud --cov-report=xml:oraclecloud_coverage.xml tests/providers/oraclecloud
+        run: uv run pytest -n auto --cov=./prowler/providers/oraclecloud --cov-report=xml:oraclecloud_coverage.xml tests/providers/oraclecloud
       - name: Upload OCI coverage to Codecov
         if: steps.changed-oraclecloud.outputs.any_changed == 'true'
@@ -453,11 +477,11 @@ jobs:
           files: |
             ./prowler/**/openstack/**
             ./tests/**/openstack/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run OpenStack tests
         if: steps.changed-openstack.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/openstack --cov-report=xml:openstack_coverage.xml tests/providers/openstack
+        run: uv run pytest -n auto --cov=./prowler/providers/openstack --cov-report=xml:openstack_coverage.xml tests/providers/openstack
       - name: Upload OpenStack coverage to Codecov
         if: steps.changed-openstack.outputs.any_changed == 'true'
@@ -477,11 +501,11 @@ jobs:
           files: |
             ./prowler/**/googleworkspace/**
             ./tests/**/googleworkspace/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Google Workspace tests
         if: steps.changed-googleworkspace.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/googleworkspace --cov-report=xml:googleworkspace_coverage.xml tests/providers/googleworkspace
+        run: uv run pytest -n auto --cov=./prowler/providers/googleworkspace --cov-report=xml:googleworkspace_coverage.xml tests/providers/googleworkspace
       - name: Upload Google Workspace coverage to Codecov
         if: steps.changed-googleworkspace.outputs.any_changed == 'true'
@@ -501,11 +525,11 @@ jobs:
           files: |
             ./prowler/**/vercel/**
             ./tests/**/vercel/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Vercel tests
         if: steps.changed-vercel.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/providers/vercel --cov-report=xml:vercel_coverage.xml tests/providers/vercel
+        run: uv run pytest -n auto --cov=./prowler/providers/vercel --cov-report=xml:vercel_coverage.xml tests/providers/vercel
       - name: Upload Vercel coverage to Codecov
         if: steps.changed-vercel.outputs.any_changed == 'true'
@@ -525,11 +549,11 @@ jobs:
           files: |
             ./prowler/lib/**
             ./tests/lib/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Lib tests
         if: steps.changed-lib.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
+        run: uv run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
       - name: Upload Lib coverage to Codecov
         if: steps.changed-lib.outputs.any_changed == 'true'
@@ -549,11 +573,11 @@ jobs:
           files: |
             ./prowler/config/**
             ./tests/config/**
-            ./poetry.lock
+            ./uv.lock
       - name: Run Config tests
         if: steps.changed-config.outputs.any_changed == 'true'
-        run: poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
+        run: uv run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
       - name: Upload Config coverage to Codecov
         if: steps.changed-config.outputs.any_changed == 'true'
+7 -1
@@ -130,6 +130,12 @@ jobs:
           echo "AWS_ACCESS_KEY_ID=${{ secrets.E2E_AWS_PROVIDER_ACCESS_KEY }}" >> .env
           echo "AWS_SECRET_ACCESS_KEY=${{ secrets.E2E_AWS_PROVIDER_SECRET_KEY }}" >> .env
+      - name: Build API image from current code
+        # docker-compose.yml references prowlercloud/prowler-api:latest from the registry,
+        # which lags behind PR changes; build locally so E2E exercises the API image
+        # produced by this PR.
+        run: docker build -t prowlercloud/prowler-api:latest ./api
       - name: Start API services
         run: |
           export PROWLER_API_VERSION=latest
@@ -158,7 +164,7 @@ jobs:
           for fixture in api/fixtures/dev/*.json; do
             if [ -f "$fixture" ]; then
               echo "Loading $fixture"
-              poetry run python manage.py loaddata "$fixture" --database admin
+              uv run python manage.py loaddata "$fixture" --database admin
             fi
           done
         '
+75
@@ -0,0 +1,75 @@
+name: 'UI: Security'
+on:
+  push:
+    branches:
+      - 'master'
+      - 'v5.*'
+    paths:
+      - 'ui/package.json'
+      - 'ui/pnpm-lock.yaml'
+      - '.github/workflows/ui-security.yml'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
+  pull_request:
+    branches:
+      - 'master'
+      - 'v5.*'
+    paths:
+      - 'ui/package.json'
+      - 'ui/pnpm-lock.yaml'
+      - '.github/workflows/ui-security.yml'
+      - '.github/actions/osv-scanner/**'
+      - '.github/scripts/osv-scan.sh'
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
+permissions: {}
+jobs:
+  ui-security-scans:
+    if: github.repository == 'prowler-cloud/prowler'
+    runs-on: ubuntu-latest
+    timeout-minutes: 15
+    permissions:
+      contents: read
+      pull-requests: write # osv-scanner action posts/updates a PR comment with findings
+    steps:
+      - name: Harden Runner
+        uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
+        with:
+          egress-policy: block
+          allowed-endpoints: >
+            github.com:443
+            api.github.com:443
+            objects.githubusercontent.com:443
+            release-assets.githubusercontent.com:443
+            api.osv.dev:443
+            api.deps.dev:443
+            osv-vulnerabilities.storage.googleapis.com:443
+      - name: Checkout repository
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
+        with:
+          # zizmor: ignore[artipacked]
+          persist-credentials: true # Required by tj-actions/changed-files to fetch PR branch
+      - name: Check for UI dependency changes
+        id: check-changes
+        uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
+        with:
+          files: |
+            ui/package.json
+            ui/pnpm-lock.yaml
+            .github/workflows/ui-security.yml
+            .github/actions/osv-scanner/**
+            .github/scripts/osv-scan.sh
+      - name: Dependency vulnerability scan with osv-scanner
+        if: steps.check-changes.outputs.any_changed == 'true'
+        uses: ./.github/actions/osv-scanner
+        with:
+          lockfile: ui/pnpm-lock.yaml
+34 -9
@@ -1,14 +1,14 @@
-name: 'UI: Tests'
+name: "UI: Tests"
 on:
   push:
     branches:
-      - 'master'
-      - 'v5.*'
+      - "master"
+      - "v5.*"
   pull_request:
     branches:
-      - 'master'
-      - 'v5.*'
+      - "master"
+      - "v5.*"
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
@@ -16,7 +16,7 @@ concurrency:
 env:
   UI_WORKING_DIR: ./ui
-  NODE_VERSION: '24.13.0'
+  NODE_VERSION: "24.13.0"
 permissions: {}
@@ -42,6 +42,9 @@ jobs:
             fonts.gstatic.com:443
             api.github.com:443
             release-assets.githubusercontent.com:443
+            cdn.playwright.dev:443
+            objects.githubusercontent.com:443
+            playwright.download.prss.microsoft.com:443
       - name: Checkout repository
         uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -129,11 +132,15 @@ jobs:
         if: steps.check-changes.outputs.any_changed == 'true'
         run: pnpm run healthcheck
+      - name: Run pnpm audit
+        if: steps.check-changes.outputs.any_changed == 'true'
+        run: pnpm run audit
       - name: Run unit tests (all - critical paths changed)
         if: steps.check-changes.outputs.any_changed == 'true' && steps.critical-changes.outputs.any_changed == 'true'
         run: |
           echo "Critical paths changed - running ALL unit tests"
-          pnpm run test:run
+          pnpm run test:unit
       - name: Run unit tests (related to changes only)
         if: steps.check-changes.outputs.any_changed == 'true' && steps.critical-changes.outputs.any_changed != 'true' && steps.changed-source.outputs.all_changed_files != ''
@@ -142,7 +149,7 @@ jobs:
           echo "${STEPS_CHANGED_SOURCE_OUTPUTS_ALL_CHANGED_FILES}"
           # Convert space-separated to vitest related format (remove ui/ prefix for relative paths)
           CHANGED_FILES=$(echo "${STEPS_CHANGED_SOURCE_OUTPUTS_ALL_CHANGED_FILES}" | tr ' ' '\n' | sed 's|^ui/||' | tr '\n' ' ')
-          pnpm exec vitest related $CHANGED_FILES --run
+          pnpm exec vitest related $CHANGED_FILES --run --project unit
         env:
           STEPS_CHANGED_SOURCE_OUTPUTS_ALL_CHANGED_FILES: ${{ steps.changed-source.outputs.all_changed_files }}
@@ -150,7 +157,25 @@ jobs:
         if: steps.check-changes.outputs.any_changed == 'true' && steps.critical-changes.outputs.any_changed != 'true' && steps.changed-source.outputs.all_changed_files == ''
         run: |
           echo "Only test files changed - running ALL unit tests"
-          pnpm run test:run
+          pnpm run test:unit
+      - name: Cache Playwright browsers
+        if: steps.check-changes.outputs.any_changed == 'true'
+        id: playwright-cache
+        uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
+        with:
+          path: ~/.cache/ms-playwright
+          key: ${{ runner.os }}-playwright-chromium-${{ hashFiles('ui/pnpm-lock.yaml') }}
+          restore-keys: |
+            ${{ runner.os }}-playwright-chromium-
+      - name: Install Playwright Chromium browser
+        if: steps.check-changes.outputs.any_changed == 'true' && steps.playwright-cache.outputs.cache-hit != 'true'
+        run: pnpm exec playwright install chromium
+      - name: Run browser tests
+        if: steps.check-changes.outputs.any_changed == 'true'
+        run: pnpm run test:browser
       - name: Build application
         if: steps.check-changes.outputs.any_changed == 'true'
+14 -36
@@ -44,7 +44,9 @@ repos:
     rev: v1.24.1
     hooks:
       - id: zizmor
-        files: ^\.github/
+        # zizmor only audits workflows, composite actions and dependabot
+        # config; broader paths trip exit 3 ("no audit was performed").
+        files: ^\.github/(workflows|actions)/.+\.ya?ml$|^\.github/dependabot\.ya?ml$
         priority: 30
   ## BASH
@@ -105,35 +107,21 @@ repos:
       files: { glob: ["{api,mcp_server}/**/*.py"] }
       priority: 20
-  ## PYTHON — Poetry
-  - repo: https://github.com/python-poetry/poetry
-    rev: 2.3.4
+  ## PYTHON — uv (API + SDK)
+  - repo: https://github.com/astral-sh/uv-pre-commit
+    rev: 0.11.14
     hooks:
-      - id: poetry-check
-        name: API - poetry-check
-        args: ["--directory=./api"]
-        files: { glob: ["api/{pyproject.toml,poetry.lock}"] }
+      - id: uv-lock
+        name: API - uv-lock
+        args: ["--check", "--project=./api"]
+        files: { glob: ["api/{pyproject.toml,uv.lock}"] }
         pass_filenames: false
         priority: 50
-      - id: poetry-lock
-        name: API - poetry-lock
-        args: ["--directory=./api"]
-        files: { glob: ["api/{pyproject.toml,poetry.lock}"] }
-        pass_filenames: false
-        priority: 50
-      - id: poetry-check
-        name: SDK - poetry-check
-        args: ["--directory=./"]
-        files: { glob: ["{pyproject.toml,poetry.lock}"] }
-        pass_filenames: false
-        priority: 50
-      - id: poetry-lock
-        name: SDK - poetry-lock
-        args: ["--directory=./"]
-        files: { glob: ["{pyproject.toml,poetry.lock}"] }
+      - id: uv-lock
+        name: SDK - uv-lock
+        args: ["--check", "--project=./"]
+        files: { glob: ["{pyproject.toml,uv.lock}"] }
         pass_filenames: false
         priority: 50
@@ -177,16 +165,6 @@ repos:
       exclude: { glob: ["{contrib,skills}/**", "**/.venv/**", "**/*_test.py"] }
       priority: 40
-      - id: safety
-        name: safety
-        description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
-        # Accepted CVEs, severity threshold, and ignore expirations live in .safety-policy.yml
-        entry: safety check --policy-file .safety-policy.yml
-        language: system
-        pass_filenames: false
-        files: { glob: ["**/pyproject.toml", "**/poetry.lock", "**/requirements*.txt", ".safety-policy.yml"] }
-        priority: 40
       - id: vulture
         name: vulture
         description: "Vulture finds unused code in Python programs."
+2 -6
@@ -11,15 +11,11 @@ build:
     python: "3.11"
   jobs:
     post_create_environment:
-      # Install poetry
-      # https://python-poetry.org/docs/#installing-manually
-      - python -m pip install poetry==2.3.4
+      - python -m pip install uv==0.11.14
     post_install:
-      # Install dependencies with 'docs' dependency group
-      # https://python-poetry.org/docs/managing-dependencies/#dependency-groups
       # VIRTUAL_ENV needs to be set manually for now.
       # See https://github.com/readthedocs/readthedocs.org/pull/11152/
-      - VIRTUAL_ENV=${READTHEDOCS_VIRTUALENV_PATH} python -m poetry install --only=docs
+      - VIRTUAL_ENV=${READTHEDOCS_VIRTUALENV_PATH} uv sync --group docs --no-install-project
 mkdocs:
   configuration: mkdocs.yml
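Since this hunk only shows a fragment, here is a sketch of the resulting Read the Docs build section after the change — surrounding keys are reconstructed from the context lines, and any top-level keys not visible in the hunk are assumed unchanged:

```yaml
# Sketch of .readthedocs.yaml after the uv migration (keys outside this
# hunk, such as build.os, are assumptions and omitted).
build:
  tools:
    python: "3.11"
  jobs:
    post_create_environment:
      - python -m pip install uv==0.11.14
    post_install:
      # VIRTUAL_ENV needs to be set manually for now.
      # See https://github.com/readthedocs/readthedocs.org/pull/11152/
      - VIRTUAL_ENV=${READTHEDOCS_VIRTUALENV_PATH} uv sync --group docs --no-install-project
mkdocs:
  configuration: mkdocs.yml
```

Pointing `VIRTUAL_ENV` at the Read the Docs virtualenv makes `uv sync` install the `docs` dependency group into the environment RTD actually builds from, and `--no-install-project` skips installing Prowler itself, which the docs build does not need.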
-58
@@ -1,58 +0,0 @@
-# Safety policy for `safety check` (Safety CLI 3.x, v2 schema).
-# Applied in: .pre-commit-config.yaml, .github/workflows/api-security.yml,
-# .github/workflows/sdk-security.yml via `--policy-file`.
-#
-# Validate: poetry run safety validate policy_file --path .safety-policy.yml
-security:
-  # Scan unpinned requirements too. Prowler pins via poetry.lock, so this is
-  # defensive against accidental unpinned entries.
-  ignore-unpinned-requirements: False
-  # CVSS severity filter. 7 = report only HIGH (7.0-8.9) and CRITICAL (9.0-10.0).
-  # Reference: 9=CRITICAL only, 7=CRITICAL+HIGH, 4=CRITICAL+HIGH+MEDIUM.
-  ignore-cvss-severity-below: 7
-  # Unknown severity is unrated, not safe. Keep False so unrated CVEs still fail
-  # the build and get a human eye. Flip to True only if noise is unmanageable.
-  ignore-cvss-unknown-severity: False
-  # Fail the build when a non-ignored vulnerability is found.
-  continue-on-vulnerability-error: False
-  # Explicit accepted vulnerabilities. Each entry MUST have a reason and an
-  # expiry. Expired entries fail the scan, forcing re-audit.
-  ignore-vulnerabilities:
-    77744:
-      reason: "Botocore requires urllib3 1.X. Remove once upgraded to urllib3 2.X."
-      expires: '2026-10-22'
-    77745:
-      reason: "Botocore requires urllib3 1.X. Remove once upgraded to urllib3 2.X."
-      expires: '2026-10-22'
-    79023:
-      reason: "knack ReDoS; blocked until azure-cli-core (via cartography) allows knack >=0.13.0."
-      expires: '2026-10-22'
-    79027:
-      reason: "knack ReDoS; blocked until azure-cli-core (via cartography) allows knack >=0.13.0."
-      expires: '2026-10-22'
-    86217:
-      reason: "alibabacloud-tea-openapi==0.4.3 blocks upgrade to cryptography >=46.0.0."
-      expires: '2026-10-22'
-    71600:
-      reason: "CVE-2024-1135 false positive. Fixed in gunicorn 22.0.0; project uses 23.0.0."
-      expires: '2026-10-22'
-    70612:
-      reason: "TBD - audit required. Reason not documented in prior --ignore list."
-      expires: '2026-07-22'
-    66963:
-      reason: "TBD - audit required. Reason not documented in prior --ignore list."
-      expires: '2026-07-22'
-    74429:
-      reason: "TBD - audit required. Reason not documented in prior --ignore list."
-      expires: '2026-07-22'
-    76352:
-      reason: "TBD - audit required. Reason not documented in prior --ignore list."
-      expires: '2026-07-22'
-    76353:
-      reason: "TBD - audit required. Reason not documented in prior --ignore list."
-      expires: '2026-07-22'
+17 -9
@@ -15,7 +15,7 @@ Use these skills for detailed patterns on-demand:
 |-------|-------------|-----|
 | `typescript` | Const types, flat interfaces, utility types | [SKILL.md](skills/typescript/SKILL.md) |
 | `react-19` | No useMemo/useCallback, React Compiler | [SKILL.md](skills/react-19/SKILL.md) |
-| `nextjs-15` | App Router, Server Actions, streaming | [SKILL.md](skills/nextjs-15/SKILL.md) |
+| `nextjs-16` | App Router, Server Actions, proxy.ts, streaming | [SKILL.md](skills/nextjs-16/SKILL.md) |
 | `tailwind-4` | cn() utility, no var() in className | [SKILL.md](skills/tailwind-4/SKILL.md) |
 | `playwright` | Page Object Model, MCP workflow, selectors | [SKILL.md](skills/playwright/SKILL.md) |
 | `pytest` | Fixtures, mocking, markers, parametrize | [SKILL.md](skills/pytest/SKILL.md) |
@@ -60,11 +60,14 @@ When performing these actions, ALWAYS invoke the corresponding skill FIRST:
 |--------|-------|
 | Add changelog entry for a PR or feature | `prowler-changelog` |
 | Adding DRF pagination or permissions | `django-drf` |
+| Adding a compliance output formatter (per-provider class + table dispatcher) | `prowler-compliance` |
+| Adding indexes or constraints to database tables | `django-migration-psql` |
 | Adding new providers | `prowler-provider` |
 | Adding privilege escalation detection queries | `prowler-attack-paths-query` |
 | Adding services to existing providers | `prowler-provider` |
 | After creating/modifying a skill | `skill-sync` |
-| App Router / Server Actions | `nextjs-15` |
+| App Router / Server Actions | `nextjs-16` |
+| Auditing check-to-requirement mappings as a cloud auditor | `prowler-compliance` |
 | Building AI chat features | `ai-sdk-5` |
 | Committing changes | `prowler-commit` |
 | Configuring MCP servers in agentic workflows | `gh-aw` |
@@ -78,6 +81,7 @@ When performing these actions, ALWAYS invoke the corresponding skill FIRST:
 | Creating a git commit | `prowler-commit` |
 | Creating new checks | `prowler-sdk-check` |
 | Creating new skills | `skill-creator` |
+| Creating or reviewing Django migrations | `django-migration-psql` |
 | Creating/modifying Prowler UI components | `prowler-ui` |
 | Creating/modifying models, views, serializers | `prowler-api` |
 | Creating/updating compliance frameworks | `prowler-compliance` |
@@ -85,6 +89,7 @@ When performing these actions, ALWAYS invoke the corresponding skill FIRST:
 | Debugging gh-aw compilation errors | `gh-aw` |
 | Fill .github/pull_request_template.md (Context/Description/Steps to review/Checklist) | `prowler-pr` |
 | Fixing bug | `tdd` |
+| Fixing compliance JSON bugs (duplicate IDs, empty Section, stale refs) | `prowler-compliance` |
 | General Prowler development questions | `prowler` |
 | Implementing JSON:API endpoints | `django-drf` |
 | Implementing feature | `tdd` |
@@ -102,6 +107,8 @@ When performing these actions, ALWAYS invoke the corresponding skill FIRST:
 | Review changelog format and conventions | `prowler-changelog` |
 | Reviewing JSON:API compliance | `jsonapi` |
 | Reviewing compliance framework PRs | `prowler-compliance-review` |
+| Running makemigrations or pgmakemigrations | `django-migration-psql` |
+| Syncing compliance framework with upstream catalog | `prowler-compliance` |
 | Testing RLS tenant isolation | `prowler-test-api` |
 | Testing hooks or utilities | `vitest` |
 | Troubleshoot why a skill is missing from AGENTS.md auto-invoke | `skill-sync` |
@@ -129,6 +136,7 @@ When performing these actions, ALWAYS invoke the corresponding skill FIRST:
| Writing React components | `react-19` | | Writing React components | `react-19` |
| Writing TypeScript types/interfaces | `typescript` | | Writing TypeScript types/interfaces | `typescript` |
| Writing Vitest tests | `vitest` | | Writing Vitest tests | `vitest` |
| Writing data backfill or data migration | `django-migration-psql` |
| Writing documentation | `prowler-docs` | | Writing documentation | `prowler-docs` |
| Writing unit tests for UI | `vitest` | | Writing unit tests for UI | `vitest` |
@@ -140,9 +148,9 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
| Component | Location | Tech Stack | | Component | Location | Tech Stack |
|-----------|----------|------------| |-----------|----------|------------|
| SDK | `prowler/` | Python 3.10+, Poetry 2.3+ | | SDK | `prowler/` | Python 3.10+, uv |
| API | `api/` | Django 5.1, DRF, Celery | | API | `api/` | Django 5.1, DRF, Celery |
| UI | `ui/` | Next.js 15, React 19, Tailwind 4 | | UI | `ui/` | Next.js 16, React 19, Tailwind 4 |
| MCP Server | `mcp_server/` | FastMCP, Python 3.12+ | | MCP Server | `mcp_server/` | FastMCP, Python 3.12+ |
| Dashboard | `dashboard/` | Dash, Plotly | | Dashboard | `dashboard/` | Dash, Plotly |
@@ -152,13 +160,13 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
```bash ```bash
# Setup # Setup
poetry install --with dev uv sync
poetry run prek install uv run prek install
# Code quality # Code quality
poetry run make lint uv run make lint
poetry run make format uv run make format
poetry run prek run --all-files uv run prek run --all-files
``` ```
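The command changes above are a mechanical `poetry` → `uv` rename. A throwaway helper in that spirit (hypothetical, not part of this PR) can migrate stray invocations left in local scripts or notes:

```shell
#!/bin/sh
# Hypothetical helper (not part of this PR): map the common Poetry
# invocations seen in this migration to their uv equivalents.
migrate_cmd() {
  printf '%s\n' "$1" | sed \
    -e 's/^poetry install --with dev$/uv sync/' \
    -e 's/^poetry install$/uv sync/' \
    -e 's/^poetry run /uv run /'
}

migrate_cmd "poetry run make lint"        # prints: uv run make lint
migrate_cmd "poetry install --with dev"   # prints: uv sync
```

The order of the `sed` expressions matters: the more specific `poetry install --with dev` pattern must be tried before the bare `poetry install` one.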
---
+29 -6
@@ -1,11 +1,34 @@
# Do you want to learn on how to...
-- Contribute with your code or fixes to Prowler
-- Create a new check for a provider
-- Create a new security compliance framework
-- Add a custom output format
-- Add a new integration
-- Contribute with documentation
+- [Contribute with your code or fixes to Prowler](https://docs.prowler.com/developer-guide/introduction)
+- [Create a new provider](https://docs.prowler.com/developer-guide/provider)
+- [Create a new service](https://docs.prowler.com/developer-guide/services)
+- [Create a new check for a provider](https://docs.prowler.com/developer-guide/checks)
+- [Create a new security compliance framework](https://docs.prowler.com/developer-guide/security-compliance-framework)
+- [Add a custom output format](https://docs.prowler.com/developer-guide/outputs)
+- [Add a new integration](https://docs.prowler.com/developer-guide/integrations)
+- [Contribute with documentation](https://docs.prowler.com/developer-guide/documentation)
+- [Write unit tests](https://docs.prowler.com/developer-guide/unit-testing)
+- [Write integration tests](https://docs.prowler.com/developer-guide/integration-testing)
+- [Write end-to-end tests](https://docs.prowler.com/developer-guide/end2end-testing)
+- [Debug Prowler](https://docs.prowler.com/developer-guide/debugging)
+- [Configure checks](https://docs.prowler.com/developer-guide/configurable-checks)
+- [Rename checks](https://docs.prowler.com/developer-guide/renaming-checks)
+- [Follow the check metadata guidelines](https://docs.prowler.com/developer-guide/check-metadata-guidelines)
+- [Extend the MCP server](https://docs.prowler.com/developer-guide/mcp-server)
+- [Extend Lighthouse AI](https://docs.prowler.com/developer-guide/lighthouse-architecture)
+- [Add AI skills](https://docs.prowler.com/developer-guide/ai-skills)
+
+Provider-specific developer notes:
+- [AWS](https://docs.prowler.com/developer-guide/aws-details)
+- [Azure](https://docs.prowler.com/developer-guide/azure-details)
+- [Google Cloud](https://docs.prowler.com/developer-guide/gcp-details)
+- [Alibaba Cloud](https://docs.prowler.com/developer-guide/alibabacloud-details)
+- [Kubernetes](https://docs.prowler.com/developer-guide/kubernetes-details)
+- [Microsoft 365](https://docs.prowler.com/developer-guide/m365-details)
+- [GitHub](https://docs.prowler.com/developer-guide/github-details)
+- [LLM](https://docs.prowler.com/developer-guide/llm-details)

Want some swag as appreciation for your contribution?
+6 -6
@@ -78,7 +78,7 @@ WORKDIR /home/prowler
# Copy necessary files
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
-COPY pyproject.toml /home/prowler
+COPY pyproject.toml uv.lock /home/prowler/
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py
@@ -87,17 +87,17 @@ ENV HOME='/home/prowler'
ENV PATH="${HOME}/.local/bin:${PATH}"
#hadolint ignore=DL3013
RUN pip install --no-cache-dir --upgrade pip && \
-    pip install --no-cache-dir poetry==2.3.4
+    pip install --no-cache-dir uv==0.11.14
-RUN poetry install --compile && \
-    rm -rf ~/.cache/pip
+RUN uv sync --compile-bytecode && \
+    rm -rf ~/.cache/uv
# Install PowerShell modules
-RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py
+RUN .venv/bin/python prowler/providers/m365/lib/powershell/m365_powershell.py
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
    pip uninstall dash-core-components -y
USER prowler
-ENTRYPOINT ["poetry", "run", "prowler"]
+ENTRYPOINT [".venv/bin/prowler"]
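Condensed, the uv-based build stage above follows this shape. This is a minimal sketch, not the repo's exact Dockerfile: the base image tag and the trimmed file list are illustrative, while the `uv` commands and venv paths match the diff.

```dockerfile
# Minimal sketch of the uv-based build (base image tag illustrative).
FROM python:3.12-slim
WORKDIR /home/prowler

# Sources plus the pinned manifests uv needs to reproduce the environment.
COPY prowler/ /home/prowler/prowler/
COPY pyproject.toml uv.lock README.md /home/prowler/

# Pin the installer itself, then create .venv from uv.lock;
# --compile-bytecode pre-compiles .pyc files, much like `poetry install --compile`.
RUN pip install --no-cache-dir uv==0.11.14 && \
    uv sync --compile-bytecode && \
    rm -rf ~/.cache/uv

# No `poetry run` wrapper anymore: invoke the console script in the project venv.
ENTRYPOINT [".venv/bin/prowler"]
```

Pointing `ENTRYPOINT` straight at `.venv/bin/prowler` drops the Poetry process wrapper, so the container's PID 1 is the CLI itself.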
+2 -3
@@ -23,7 +23,7 @@ format: ## Format Code
lint: ## Lint Code
	@echo "Running flake8..."
-	flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
+	flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude .venv,contrib
	@echo "Running black... "
	black --check .
	@echo "Running pylint..."
@@ -35,7 +35,7 @@ pypi-clean: ## Delete the distribution files
pypi-build: ## Build package
	$(MAKE) pypi-clean && \
-	poetry build
+	uv build
pypi-upload: ## Upload package
	python3 -m twine upload --repository pypi dist/*
@@ -56,4 +56,3 @@ run-api-dev: ## Start development environment with API, PostgreSQL, Valkey, MCP,
##@ Development Environment
build-and-run-api-dev: build-no-cache-dev run-api-dev
+13 -23
@@ -117,9 +117,10 @@ Every AWS provider scan will enqueue an Attack Paths ingestion job automatically
| MongoDB Atlas | 10 | 3 | 0 | 8 | Official | UI, API, CLI |
| LLM | [See `promptfoo` docs.](https://www.promptfoo.dev/docs/red-team/plugins/) | N/A | N/A | N/A | Official | CLI |
| Image | N/A | N/A | N/A | N/A | Official | CLI, API |
-| Google Workspace | 25 | 4 | 2 | 4 | Official | CLI |
+| Google Workspace | 25 | 4 | 2 | 4 | Official | UI, API, CLI |
| OpenStack | 34 | 5 | 0 | 9 | Official | UI, API, CLI |
-| Vercel | 26 | 6 | 0 | 5 | Official | CLI |
+| Vercel | 26 | 6 | 0 | 5 | Official | UI, API, CLI |
+| Okta | 1 | 1 | 0 | 1 | Official | CLI |
| NHN | 6 | 2 | 1 | 0 | Unofficial | CLI |

> [!Note]
@@ -176,7 +177,7 @@ You can find more information in the [Troubleshooting](./docs/troubleshooting.md
**Requirements**
* `git` installed.
-* `poetry` v2 installed: [poetry installation](https://python-poetry.org/docs/#installation).
+* `uv` installed: [uv installation](https://docs.astral.sh/uv/getting-started/installation/).
* `pnpm` installed: [pnpm installation](https://pnpm.io/installation).
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
@@ -185,8 +186,8 @@ You can find more information in the [Troubleshooting](./docs/troubleshooting.md
``` console
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
-poetry install
-eval $(poetry env activate)
+uv sync
+source .venv/bin/activate
set -a
source .env
docker compose up postgres valkey -d
@@ -194,11 +195,6 @@ cd src/backend
python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
-> [!IMPORTANT]
-> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
->
-> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
-> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.

> After completing the setup, access the API documentation at http://localhost:8080/api/v1/docs.
@@ -207,8 +203,8 @@ gunicorn -c config/guniconf.py config.wsgi:application
``` console
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
-poetry install
-eval $(poetry env activate)
+uv sync
+source .venv/bin/activate
set -a
source .env
cd src/backend
@@ -220,8 +216,8 @@ python -m celery -A config.celery worker -l info -E
``` console
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
-poetry install
-eval $(poetry env activate)
+uv sync
+source .venv/bin/activate
set -a
source .env
cd src/backend
@@ -282,24 +278,18 @@ The container images are available here:
### From GitHub

-Python >=3.10, <3.13 is required with pip and Poetry:
+Python >=3.10, <3.13 is required with [uv](https://docs.astral.sh/uv/):

``` console
git clone https://github.com/prowler-cloud/prowler
cd prowler
-eval $(poetry env activate)
-poetry install
+uv sync
+source .venv/bin/activate
python prowler-cli.py -v
```

> [!IMPORTANT]
> To clone Prowler on Windows, configure Git to support long file paths by running the following command: `git config core.longpaths true`.
-> [!IMPORTANT]
-> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
->
-> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
-> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.

# 🛡️ GitHub Action

The official **Prowler GitHub Action** runs Prowler scans in your GitHub workflows using the official [`prowlercloud/prowler`](https://hub.docker.com/r/prowlercloud/prowler) Docker image. Scans run on any [supported provider](https://docs.prowler.com/user-guide/providers/), with optional [`--push-to-cloud`](https://docs.prowler.com/user-guide/tutorials/prowler-app-import-findings) to send findings to Prowler Cloud and optional SARIF upload so findings show up in the repo's **Security → Code scanning** tab and as inline PR annotations.
+1 -1
@@ -167,7 +167,7 @@ runs:
      - name: Upload SARIF to GitHub Code Scanning
        if: always() && inputs.upload-sarif == 'true' && steps.find-sarif.outputs.sarif_path != ''
-        uses: github/codeql-action/upload-sarif@d4b3ca9fa7f69d38bfcd667bdc45bc373d16277e # v4
+        uses: github/codeql-action/upload-sarif@68bde559dea0fdcac2102bfdf6230c5f70eb485e # v4.35.4
        with:
          sarif_file: ${{ steps.find-sarif.outputs.sarif_path }}
          category: ${{ inputs.sarif-category }}
+8 -8
@@ -124,24 +124,24 @@ api/src/backend/
```bash
# Development
-poetry run python src/backend/manage.py runserver
-poetry run celery -A config.celery worker -l INFO
+uv run python src/backend/manage.py runserver
+uv run celery -A config.celery worker -l INFO
# Database
-poetry run python src/backend/manage.py makemigrations
-poetry run python src/backend/manage.py migrate
+uv run python src/backend/manage.py makemigrations
+uv run python src/backend/manage.py migrate
# Testing & Linting
-poetry run pytest -x --tb=short
-poetry run make lint
+uv run pytest -x --tb=short
+uv run make lint
```

---

## QA CHECKLIST

-- [ ] `poetry run pytest` passes
-- [ ] `poetry run make lint` passes
+- [ ] `uv run pytest` passes
+- [ ] `uv run make lint` passes
- [ ] Migrations created if models changed
- [ ] New endpoints have `@extend_schema` decorators
- [ ] RLS properly applied for tenant data
+32
@@ -2,11 +2,43 @@
All notable changes to the **Prowler API** are documented in this file.

+## [1.28.0] (Prowler UNRELEASED)
+
+### 🚀 Added
+- GIN index on `findings(categories, resource_services, resource_regions, resource_types)` to speed up `/api/v1/finding-groups` array filters [(#11001)](https://github.com/prowler-cloud/prowler/pull/11001)
+
+### 🔄 Changed
+- Replace `poetry` with `uv` (`0.11.14`) as the API package manager; migrate `pyproject.toml` to `[dependency-groups]` and regenerate as `uv.lock` [(#10775)](https://github.com/prowler-cloud/prowler/pull/10775)
+- Remove orphaned `gin_resources_search_idx` declaration from `Resource.Meta.indexes` (DB index dropped in `0072_drop_unused_indexes`) [(#11001)](https://github.com/prowler-cloud/prowler/pull/11001)
+
+---
+
+## [1.27.2] (Prowler UNRELEASED)
+
+### 🐞 Fixed
+- Attack Paths: BEDROCK-001 and BEDROCK-002 now target roles trusting `bedrock-agentcore.amazonaws.com` instead of `bedrock.amazonaws.com`, eliminating false positives against regular Bedrock service roles (Agents, Knowledge Bases, model invocation) [(#11141)](https://github.com/prowler-cloud/prowler/pull/11141)
+
+---
+
+## [1.27.1] (Prowler v5.26.1)
+
+### 🐞 Fixed
+- `POST /api/v1/scans` was intermittently failing with `Scan matching query does not exist` in the `scan-perform` worker; the Celery task is now published via `transaction.on_commit` so the worker cannot read the Scan before the dispatch-wide transaction commits [(#11122)](https://github.com/prowler-cloud/prowler/pull/11122)
+
+---
+
## [1.27.0] (Prowler v5.26.0)

### 🚀 Added
- `scan-reset-ephemeral-resources` post-scan task zeroes `failed_findings_count` for resources missing from the latest full-scope scan, keeping ephemeral resources from polluting the Resources page sort [(#10929)](https://github.com/prowler-cloud/prowler/pull/10929)

+### 🔄 Changed
- ASD Essential Eight (AWS) compliance framework support [(#10982)](https://github.com/prowler-cloud/prowler/pull/10982)

### 🔐 Security
+7 -6
@@ -14,6 +14,7 @@ ENV ZIZMOR_VERSION=${ZIZMOR_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
    wget \
+    git \
    libicu72 \
    gcc \
    g++ \
@@ -88,18 +89,18 @@ WORKDIR /home/prowler
# Ensure output directory exists
RUN mkdir -p /tmp/prowler_api_output

-COPY pyproject.toml ./
+COPY pyproject.toml uv.lock ./
RUN pip install --no-cache-dir --upgrade pip && \
-    pip install --no-cache-dir poetry==2.3.4
+    pip install --no-cache-dir uv==0.11.14

ENV PATH="/home/prowler/.local/bin:$PATH"

-# Add `--no-root` to avoid installing the current project as a package
-RUN poetry install --no-root && \
-    rm -rf ~/.cache/pip
+# Add `--no-install-project` to avoid installing the current project as a package
+RUN uv sync --no-install-project && \
+    rm -rf ~/.cache/uv

-RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
+RUN .venv/bin/python .venv/lib/python3.12/site-packages/prowler/providers/m365/lib/powershell/m365_powershell.py

COPY src/backend/ ./backend/
COPY docker-entrypoint.sh ./docker-entrypoint.sh
+12 -19
@@ -25,12 +25,11 @@ If you dont set `DJANGO_TOKEN_SIGNING_KEY` or `DJANGO_TOKEN_VERIFYING_KEY`, t
**Important note**: Every Prowler version (or repository branches and tags) could have different variables set in its `.env` file. Please use the `.env` file that corresponds with each version.

## Local deployment

-Keep in mind if you export the `.env` file to use it with local deployment that you will have to do it within the context of the Poetry interpreter, not before. Otherwise, variables will not be loaded properly.
+Keep in mind if you export the `.env` file to use it with local deployment that you will have to do it within the context of the virtual environment, not before. Otherwise, variables will not be loaded properly.

To do this, you can run:

```console
-poetry shell
set -a
source .env
```
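The `set -a` / `source .env` pattern exports every variable the file defines, so child processes started afterwards inherit them. A tiny self-contained sketch (the file path and variable are hypothetical):

```shell
# Sketch: `set -a` marks subsequently defined variables for export, so
# sourcing a KEY=VALUE .env file publishes them to child processes.
load_env() {
  set -a
  # shellcheck disable=SC1090
  . "$1"
  set +a
}

printf 'DJANGO_PORT=8080\n' > /tmp/demo.env   # hypothetical .env file
load_env /tmp/demo.env
echo "$DJANGO_PORT"   # prints: 8080
```

Without `set -a`, the sourced variables would exist only in the current shell and Django or Celery started from it would not see them.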
@@ -78,7 +77,7 @@ docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
## Local deployment

-To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `poetry` and `docker compose` are installed.
+To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `uv` and `docker compose` are installed.

### Clone the repository
@@ -90,11 +89,10 @@ git clone https://github.com/prowler-cloud/api.git
git clone git@github.com:prowler-cloud/api.git
```

-### Install all dependencies with Poetry
+### Install all dependencies with uv

```console
-poetry install
-poetry shell
+uv sync
```

## Start the PostgreSQL Database and Valkey
@@ -139,7 +137,7 @@ gunicorn -c config/guniconf.py config.wsgi:application
## Local deployment

-To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `poetry` and `docker compose` are installed.
+To use this method, you'll need to set up a Python virtual environment (version ">=3.11,<3.13") and keep dependencies updated. Additionally, ensure that `uv` and `docker compose` are installed.

### Clone the repository
@@ -165,11 +163,10 @@ docker compose up postgres valkey -d
### Install the Python dependencies

-> You must have Poetry installed
+> You must have uv installed

```console
-poetry install
-poetry shell
+uv sync
```

### Apply migrations
@@ -246,9 +243,8 @@ docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
For migrations, you need to force the `admin` database router. Assuming you have the correct environment variables and Python virtual environment, run:

```console
-poetry shell
cd src/backend
-python manage.py migrate --database admin
+uv run python manage.py migrate --database admin
```

## Apply fixtures
@@ -256,9 +252,8 @@ python manage.py migrate --database admin
Fixtures are used to populate the database with initial development data.

```console
-poetry shell
cd src/backend
-python manage.py loaddata api/fixtures/0_dev_users.json --database admin
+uv run python manage.py loaddata api/fixtures/0_dev_users.json --database admin
```

> The default credentials are `dev@prowler.com:Thisisapassword123@` or `dev2@prowler.com:Thisisapassword123@`
@@ -270,9 +265,8 @@ Note that the tests will fail if you use the same `.env` file as the development
For best results, run in a new shell with no environment variables set.

```console
-poetry shell
cd src/backend
-pytest
+uv run pytest
```

# Custom commands
@@ -284,8 +278,7 @@ Django provides a way to create custom commands that can be run from the command
To run a custom command, you need to be in the `prowler/api/src/backend` directory and run:

```console
-poetry shell
-python manage.py <command_name>
+uv run python manage.py <command_name>
```

## Generate dummy data
@@ -308,7 +301,7 @@ This command creates, for a given tenant, a provider, scan and a set of findings
### Example

```console
-~/backend $ poetry run python manage.py findings --tenant fffb1893-3fc7-4623-a5d9-fae47da1c528 --findings 25000 --resources 1000 --batch 5000 --alias test-script
+~/backend $ uv run python manage.py findings --tenant fffb1893-3fc7-4623-a5d9-fae47da1c528 --findings 25000 --resources 1000 --batch 5000 --alias test-script
+8 -8
@@ -5,9 +5,9 @@ apply_migrations() {
    echo "Applying database migrations..."
    # Fix Inconsistent migration history after adding sites app
-    poetry run python manage.py check_and_fix_socialaccount_sites_migration --database admin
-    poetry run python manage.py migrate --database admin
+    uv run python manage.py check_and_fix_socialaccount_sites_migration --database admin
+    uv run python manage.py migrate --database admin
}

apply_fixtures() {
@@ -15,19 +15,19 @@ apply_fixtures() {
    for fixture in api/fixtures/dev/*.json; do
        if [ -f "$fixture" ]; then
            echo "Loading $fixture"
-            poetry run python manage.py loaddata "$fixture" --database admin
+            uv run python manage.py loaddata "$fixture" --database admin
        fi
    done
}

start_dev_server() {
    echo "Starting the development server..."
-    poetry run python manage.py runserver 0.0.0.0:"${DJANGO_PORT:-8080}"
+    uv run python manage.py runserver 0.0.0.0:"${DJANGO_PORT:-8080}"
}

start_prod_server() {
    echo "Starting the Gunicorn server..."
-    poetry run gunicorn -c config/guniconf.py config.wsgi:application
+    uv run gunicorn -c config/guniconf.py config.wsgi:application
}

resolve_worker_hostname() {
@@ -47,7 +47,7 @@ resolve_worker_hostname() {
start_worker() {
    echo "Starting the worker..."
-    poetry run python -m celery -A config.celery worker \
+    uv run python -m celery -A config.celery worker \
        -n "$(resolve_worker_hostname)" \
        -l "${DJANGO_LOGGING_LEVEL:-info}" \
        -Q celery,scans,scan-reports,deletion,backfill,overview,integrations,compliance,attack-paths-scans \
@@ -56,7 +56,7 @@ start_worker() {
start_worker_beat() {
    echo "Starting the worker-beat..."
-    poetry run python -m celery -A config.celery beat -l "${DJANGO_LOGGING_LEVEL:-info}" --scheduler django_celery_beat.schedulers:DatabaseScheduler
+    uv run python -m celery -A config.celery beat -l "${DJANGO_LOGGING_LEVEL:-info}" --scheduler django_celery_beat.schedulers:DatabaseScheduler
}

manage_db_partitions() {
@@ -64,7 +64,7 @@ manage_db_partitions() {
        echo "Managing DB partitions..."
        # For now we skip the deletion of partitions until we define the data retention policy
        # --yes auto approves the operation without the need of an interactive terminal
-        poetry run python manage.py pgpartition --using admin --skip-delete --yes
+        uv run python manage.py pgpartition --using admin --skip-delete --yes
    fi
}
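The worker name passed with `-n` comes from `resolve_worker_hostname`, whose body falls outside the hunks shown here. A plausible sketch of such a helper, offered purely as an assumption about its shape (the `WORKER_HOSTNAME` variable name is hypothetical):

```shell
# Assumed shape of a worker-name helper (the real resolve_worker_hostname
# body is not shown in this diff): prefer an explicit override,
# fall back to the container hostname.
resolve_worker_hostname() {
  printf 'worker@%s\n' "${WORKER_HOSTNAME:-$(hostname)}"
}

WORKER_HOSTNAME=api-1 resolve_worker_hostname   # prints: worker@api-1
```

Giving each Celery worker a stable, unique `-n` name keeps multiple containers from clobbering each other's identity in the broker.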
-9427
File diff suppressed because it is too large
+393 -27
@@ -1,6 +1,24 @@
-[build-system]
-build-backend = "poetry.core.masonry.api"
-requires = ["poetry-core"]
+[dependency-groups]
+dev = [
+    "bandit==1.7.9",
+    "coverage==7.5.4",
+    "django-silk==5.3.2",
+    "docker==7.1.0",
+    "filelock==3.20.3",
+    "freezegun==1.5.1",
+    "mypy==1.10.1",
+    "pylint==3.2.5",
+    "pytest==9.0.3",
+    "pytest-cov==5.0.0",
+    "pytest-django==4.8.0",
+    "pytest-env==1.1.3",
+    "pytest-randomly==3.15.0",
+    "pytest-xdist==3.6.1",
+    "ruff==0.5.0",
+    "tqdm==4.67.1",
+    "vulture==2.14",
+    "prek==0.3.9"
+]
[project]
authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
@@ -50,28 +68,376 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
-version = "1.27.0"
+version = "1.28.0"

-[project.scripts]
-celery = "src.backend.config.settings.celery"
-
-[tool.poetry.group.dev.dependencies]
-bandit = "1.7.9"
-coverage = "7.5.4"
-django-silk = "5.3.2"
-docker = "7.1.0"
-filelock = "3.20.3"
-freezegun = "1.5.1"
-mypy = "1.10.1"
-prek = "0.3.9"
-pylint = "3.2.5"
-pytest = "9.0.3"
-pytest-cov = "5.0.0"
-pytest-django = "4.8.0"
-pytest-env = "1.1.3"
-pytest-randomly = "3.15.0"
-pytest-xdist = "3.6.1"
-ruff = "0.5.0"
-safety = "3.7.0"
-tqdm = "4.67.1"
-vulture = "2.14"
+[tool.uv]
+# Transitive pins matching master to avoid silent drift; bump deliberately.
+constraint-dependencies = [
+    "about-time==4.2.1",
+    "adal==1.2.7",
+    "aioboto3==15.5.0",
+    "aiobotocore==2.25.1",
+    "aiofiles==24.1.0",
+    "aiohappyeyeballs==2.6.1",
+    "aiohttp==3.13.5",
+    "aioitertools==0.13.0",
+    "aiosignal==1.4.0",
+    "alibabacloud-actiontrail20200706==2.4.1",
+    "alibabacloud-credentials==1.0.3",
+    "alibabacloud-credentials-api==1.0.0",
+    "alibabacloud-cs20151215==6.1.0",
+    "alibabacloud-darabonba-array==0.1.0",
+    "alibabacloud-darabonba-encode-util==0.0.2",
+    "alibabacloud-darabonba-map==0.0.1",
+    "alibabacloud-darabonba-signature-util==0.0.4",
+    "alibabacloud-darabonba-string==0.0.4",
+    "alibabacloud-darabonba-time==0.0.1",
+    "alibabacloud-ecs20140526==7.2.5",
"alibabacloud-endpoint-util==0.0.4",
"alibabacloud-gateway-oss==0.0.17",
"alibabacloud-gateway-oss-util==0.0.3",
"alibabacloud-gateway-sls==0.4.0",
"alibabacloud-gateway-sls-util==0.4.0",
"alibabacloud-gateway-spi==0.0.3",
"alibabacloud-openapi-util==0.2.4",
"alibabacloud-oss-util==0.0.6",
"alibabacloud-oss20190517==1.0.6",
"alibabacloud-ram20150501==1.2.0",
"alibabacloud-rds20140815==12.0.0",
"alibabacloud-sas20181203==6.1.0",
"alibabacloud-sls20201230==5.9.0",
"alibabacloud-sts20150401==1.1.6",
"alibabacloud-tea==0.4.3",
"alibabacloud-tea-openapi==0.4.4",
"alibabacloud-tea-util==0.3.14",
"alibabacloud-tea-xml==0.0.3",
"alibabacloud-vpc20160428==6.13.0",
"alive-progress==3.3.0",
"aliyun-log-fastpb==0.2.0",
"amqp==5.3.1",
"annotated-types==0.7.0",
"anyio==4.12.1",
"applicationinsights==0.11.10",
"apscheduler==3.11.2",
"argcomplete==3.5.3",
"asgiref==3.11.0",
"astroid==3.2.4",
"async-timeout==5.0.1",
"attrs==25.4.0",
"authlib==1.6.9",
"autopep8==2.3.2",
"awsipranges==0.3.3",
"azure-cli-core==2.83.0",
"azure-cli-telemetry==1.1.0",
"azure-common==1.1.28",
"azure-core==1.38.1",
"azure-identity==1.21.0",
"azure-keyvault-certificates==4.10.0",
"azure-keyvault-keys==4.10.0",
"azure-keyvault-secrets==4.10.0",
"azure-mgmt-apimanagement==5.0.0",
"azure-mgmt-applicationinsights==4.1.0",
"azure-mgmt-authorization==4.0.0",
"azure-mgmt-compute==34.0.0",
"azure-mgmt-containerinstance==10.1.0",
"azure-mgmt-containerregistry==12.0.0",
"azure-mgmt-containerservice==34.1.0",
"azure-mgmt-core==1.6.0",
"azure-mgmt-cosmosdb==9.7.0",
"azure-mgmt-databricks==2.0.0",
"azure-mgmt-datafactory==9.2.0",
"azure-mgmt-eventgrid==10.4.0",
"azure-mgmt-eventhub==11.2.0",
"azure-mgmt-keyvault==10.3.1",
"azure-mgmt-loganalytics==12.0.0",
"azure-mgmt-logic==10.0.0",
"azure-mgmt-monitor==6.0.2",
"azure-mgmt-network==28.1.0",
"azure-mgmt-postgresqlflexibleservers==1.1.0",
"azure-mgmt-rdbms==10.1.0",
"azure-mgmt-recoveryservices==3.1.0",
"azure-mgmt-recoveryservicesbackup==9.2.0",
"azure-mgmt-resource==24.0.0",
"azure-mgmt-search==9.1.0",
"azure-mgmt-security==7.0.0",
"azure-mgmt-sql==3.0.1",
"azure-mgmt-storage==22.1.1",
"azure-mgmt-subscription==3.1.1",
"azure-mgmt-synapse==2.0.0",
"azure-mgmt-web==8.0.0",
"azure-monitor-query==2.0.0",
"azure-storage-blob==12.24.1",
"azure-synapse-artifacts==0.21.0",
"backoff==2.2.1",
"bandit==1.7.9",
"billiard==4.2.4",
"blinker==1.9.0",
"boto3==1.40.61",
"botocore==1.40.61",
"cartography==0.135.0",
"celery==5.6.2",
"certifi==2026.1.4",
"cffi==2.0.0",
"charset-normalizer==3.4.4",
"circuitbreaker==2.1.3",
"click==8.3.1",
"click-didyoumean==0.3.1",
"click-plugins==1.1.1.2",
"click-repl==0.3.0",
"cloudflare==4.3.1",
"colorama==0.4.6",
"contextlib2==21.6.0",
"contourpy==1.3.3",
"coverage==7.5.4",
"cron-descriptor==1.4.5",
"crowdstrike-falconpy==1.6.0",
"cryptography==46.0.7",
"cycler==0.12.1",
"darabonba-core==1.0.5",
"dash==3.1.1",
"dash-bootstrap-components==2.0.3",
"debugpy==1.8.20",
"decorator==5.2.1",
"defusedxml==0.7.1",
"detect-secrets==1.5.0",
"dill==0.4.1",
"distro==1.9.0",
"dj-rest-auth==7.0.1",
"django==5.1.15",
"django-allauth==65.15.0",
"django-celery-beat==2.9.0",
"django-celery-results==2.6.0",
"django-cors-headers==4.4.0",
"django-environ==0.11.2",
"django-filter==24.3",
"django-guid==3.5.0",
"django-postgres-extra==2.0.9",
"django-silk==5.3.2",
"django-timezone-field==7.2.1",
"djangorestframework==3.15.2",
"djangorestframework-jsonapi==7.0.2",
"djangorestframework-simplejwt==5.5.1",
"dnspython==2.8.0",
"docker==7.1.0",
"dogpile-cache==1.5.0",
"dparse==0.6.4",
"drf-extensions==0.8.0",
"drf-nested-routers==0.95.0",
"drf-simple-apikey==2.2.1",
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"dulwich==0.23.0",
"duo-client==5.5.0",
"durationpy==0.10",
"email-validator==2.2.0",
"execnet==2.1.2",
"filelock==3.20.3",
"flask==3.1.3",
"fonttools==4.62.1",
"freezegun==1.5.1",
"frozenlist==1.8.0",
"gevent==25.9.1",
"google-api-core==2.29.0",
"google-api-python-client==2.163.0",
"google-auth==2.48.0",
"google-auth-httplib2==0.2.0",
"google-cloud-access-context-manager==0.3.0",
"google-cloud-asset==4.2.0",
"google-cloud-org-policy==1.16.0",
"google-cloud-os-config==1.23.0",
"google-cloud-resource-manager==1.16.0",
"googleapis-common-protos==1.72.0",
"gprof2dot==2025.4.14",
"graphemeu==0.7.2",
"greenlet==3.3.1",
"grpc-google-iam-v1==0.14.3",
"grpcio==1.76.0",
"grpcio-status==1.76.0",
"gunicorn==23.0.0",
"h11==0.16.0",
"h2==4.3.0",
"hpack==4.1.0",
"httpcore==1.0.9",
"httplib2==0.31.2",
"httpx==0.28.1",
"humanfriendly==10.0",
"hyperframe==6.1.0",
"iamdata==0.1.202602021",
"idna==3.11",
"importlib-metadata==8.7.1",
"inflection==0.5.1",
"iniconfig==2.3.0",
"iso8601==2.1.0",
"isodate==0.7.2",
"isort==5.13.2",
"itsdangerous==2.2.0",
"jinja2==3.1.6",
"jiter==0.13.0",
"jmespath==1.1.0",
"joblib==1.5.3",
"jsonpatch==1.33",
"jsonpickle==4.1.1",
"jsonpointer==3.0.0",
"jsonschema==4.23.0",
"jsonschema-specifications==2025.9.1",
"keystoneauth1==5.13.0",
"kiwisolver==1.4.9",
"knack==0.11.0",
"kombu==5.6.2",
"kubernetes==32.0.1",
"lxml==5.3.2",
"lz4==4.4.5",
"markdown==3.10.2",
"markdown-it-py==4.0.0",
"markupsafe==3.0.3",
"marshmallow==4.3.0",
"matplotlib==3.10.8",
"mccabe==0.7.0",
"mdurl==0.1.2",
"microsoft-kiota-abstractions==1.9.2",
"microsoft-kiota-authentication-azure==1.9.2",
"microsoft-kiota-http==1.9.2",
"microsoft-kiota-serialization-form==1.9.2",
"microsoft-kiota-serialization-json==1.9.2",
"microsoft-kiota-serialization-multipart==1.9.2",
"microsoft-kiota-serialization-text==1.9.2",
"microsoft-security-utilities-secret-masker==1.0.0b4",
"msal==1.35.0b1",
"msal-extensions==1.2.0",
"msgraph-core==1.3.8",
"msgraph-sdk==1.55.0",
"msrest==0.7.1",
"msrestazure==0.6.4.post1",
"multidict==6.7.1",
"mypy==1.10.1",
"mypy-extensions==1.1.0",
"narwhals==2.16.0",
"neo4j==6.1.0",
"nest-asyncio==1.6.0",
"nltk==3.9.4",
"numpy==2.0.2",
"oauthlib==3.3.1",
"oci==2.169.0",
"openai==1.109.1",
"openstacksdk==4.2.0",
"opentelemetry-api==1.39.1",
"opentelemetry-sdk==1.39.1",
"opentelemetry-semantic-conventions==0.60b1",
"os-service-types==1.8.2",
"packageurl-python==0.17.6",
"packaging==26.0",
"pagerduty==6.1.0",
"pandas==2.2.3",
"pbr==7.0.3",
"pillow==12.2.0",
"pkginfo==1.12.1.2",
"platformdirs==4.5.1",
"plotly==6.5.2",
"pluggy==1.6.0",
"policyuniverse==1.5.1.20231109",
"portalocker==2.10.1",
"prek==0.3.9",
"prompt-toolkit==3.0.52",
"propcache==0.4.1",
"proto-plus==1.27.0",
"protobuf==6.33.5",
"psutil==7.2.2",
"psycopg2-binary==2.9.9",
"py-deviceid==0.1.1",
"py-iam-expand==0.1.0",
"py-ocsf-models==0.8.1",
"pyasn1==0.6.3",
"pyasn1-modules==0.4.2",
"pycodestyle==2.14.0",
"pycparser==3.0",
"pydantic==2.12.5",
"pydantic-core==2.41.5",
"pygithub==2.8.0",
"pygments==2.20.0",
"pyjwt==2.12.1",
"pylint==3.2.5",
"pymsalruntime==0.18.1",
"pynacl==1.6.2",
"pyopenssl==26.0.0",
"pyparsing==3.3.2",
"pyreadline3==3.5.4",
"pysocks==1.7.1",
"pytest==9.0.3",
"pytest-celery==1.3.0",
"pytest-cov==5.0.0",
"pytest-django==4.8.0",
"pytest-docker-tools==3.1.9",
"pytest-env==1.1.3",
"pytest-randomly==3.15.0",
"pytest-xdist==3.6.1",
"python-crontab==3.3.0",
"python-dateutil==2.9.0.post0",
"python-digitalocean==1.17.0",
"python3-saml==1.16.0",
"pytz==2025.1",
"pywin32==311",
"pyyaml==6.0.3",
"redis==7.1.0",
"referencing==0.37.0",
"regex==2026.1.15",
"reportlab==4.4.10",
"requests==2.33.1",
"requests-file==3.0.1",
"requests-oauthlib==2.0.0",
"requestsexceptions==1.4.0",
"retrying==1.4.2",
"rich==14.3.2",
"rpds-py==0.30.0",
"rsa==4.9.1",
"ruamel-yaml==0.19.1",
"ruff==0.5.0",
"s3transfer==0.14.0",
"scaleway==2.10.3",
"scaleway-core==2.10.3",
"schema==0.7.5",
"sentry-sdk==2.56.0",
"setuptools==80.10.2",
"shellingham==1.5.4",
"shodan==1.31.0",
"six==1.17.0",
"slack-sdk==3.39.0",
"sniffio==1.3.1",
"sqlparse==0.5.5",
"statsd==4.0.1",
"std-uritemplate==2.0.8",
"stevedore==5.6.0",
"tabulate==0.9.0",
"tenacity==9.1.2",
"tldextract==5.3.1",
"tomlkit==0.14.0",
"tqdm==4.67.1",
"typer==0.21.1",
"types-aiobotocore-ecr==3.1.1",
"typing-extensions==4.15.0",
"typing-inspection==0.4.2",
"tzdata==2025.3",
"tzlocal==5.3.1",
"uritemplate==4.2.0",
"urllib3==2.6.3",
"uuid6==2024.7.10",
"vine==5.1.0",
"vulture==2.14",
"wcwidth==0.5.3",
"websocket-client==1.9.0",
"werkzeug==3.1.7",
"workos==6.0.4",
"wrapt==1.17.3",
"xlsxwriter==3.2.9",
"xmlsec==1.3.14",
"xmltodict==1.0.2",
"yarl==1.22.0",
"zipp==3.23.0",
"zope-event==6.1",
"zope-interface==8.2",
"zstd==1.5.7.3"
]
# prowler@master needs okta==3.4.2; cartography 0.135.0 declares okta<1.0.0 for an
# integration prowler does not import.
override-dependencies = [
"okta==3.4.2"
]
@@ -484,8 +484,8 @@ AWS_BEDROCK_PRIVESC_PASSROLE_CODE_INTERPRETER = AttackPathsQueryDefinition(
     OR action = '*'
 ) )
-// Find roles that trust Bedrock service (can be passed to Bedrock)
+// Find roles that trust the Bedrock AgentCore service (can be passed to a code interpreter)
-MATCH path_target = (aws)--(target_role:AWSRole)-[:TRUSTS_AWS_PRINCIPAL]->(:AWSPrincipal {{arn: 'bedrock.amazonaws.com'}})
+MATCH path_target = (aws)--(target_role:AWSRole)-[:TRUSTS_AWS_PRINCIPAL]->(:AWSPrincipal {{arn: 'bedrock-agentcore.amazonaws.com'}})
 WHERE any(resource IN stmt_passrole.resource WHERE
     resource = '*'
     OR target_role.arn CONTAINS resource
@@ -536,8 +536,8 @@ AWS_BEDROCK_PRIVESC_INVOKE_CODE_INTERPRETER = AttackPathsQueryDefinition(
     OR action = '*'
 ) )
-// Find roles that trust Bedrock service (already attached to existing code interpreters)
+// Find roles that trust the Bedrock AgentCore service (already attached to existing code interpreters)
-MATCH path_target = (aws)--(target_role:AWSRole)-[:TRUSTS_AWS_PRINCIPAL]->(:AWSPrincipal {{arn: 'bedrock.amazonaws.com'}})
+MATCH path_target = (aws)--(target_role:AWSRole)-[:TRUSTS_AWS_PRINCIPAL]->(:AWSPrincipal {{arn: 'bedrock-agentcore.amazonaws.com'}})
 WITH collect(path_principal) + collect(path_target) AS paths
 UNWIND paths AS p
@@ -0,0 +1,31 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0090_attack_paths_cleanup_priority"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_arrays_idx",
columns="categories, resource_services, resource_regions, resource_types",
method="GIN",
all_partitions=True,
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_arrays_idx",
),
)
]
@@ -0,0 +1,73 @@
import django.contrib.postgres.indexes
from django.db import migrations
INDEX_NAME = "gin_find_arrays_idx"
PARENT_TABLE = "findings"
def create_parent_and_attach(apps, schema_editor):
with schema_editor.connection.cursor() as cursor:
# Idempotent: the parent index may already exist if it was created
# manually on an environment before this migration ran.
cursor.execute(
f"CREATE INDEX IF NOT EXISTS {INDEX_NAME} ON ONLY {PARENT_TABLE} "
f"USING gin (categories, resource_services, resource_regions, resource_types)"
)
cursor.execute(
"SELECT inhrelid::regclass::text "
"FROM pg_inherits "
"WHERE inhparent = %s::regclass",
[PARENT_TABLE],
)
for (partition,) in cursor.fetchall():
child_idx = f"{partition.replace('.', '_')}_{INDEX_NAME}"
# ALTER INDEX ... ATTACH PARTITION has no IF NOT ATTACHED clause,
# so check pg_inherits first to keep the migration re-runnable.
cursor.execute(
"""
SELECT 1
FROM pg_inherits i
JOIN pg_class p ON p.oid = i.inhparent
JOIN pg_class c ON c.oid = i.inhrelid
WHERE p.relname = %s AND c.relname = %s
""",
[INDEX_NAME, child_idx],
)
if cursor.fetchone() is None:
cursor.execute(f"ALTER INDEX {INDEX_NAME} ATTACH PARTITION {child_idx}")
def drop_parent_index(apps, schema_editor):
with schema_editor.connection.cursor() as cursor:
cursor.execute(f"DROP INDEX IF EXISTS {INDEX_NAME}")
class Migration(migrations.Migration):
dependencies = [
("api", "0091_findings_arrays_gin_index_partitions"),
]
operations = [
migrations.SeparateDatabaseAndState(
state_operations=[
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=[
"categories",
"resource_services",
"resource_regions",
"resource_types",
],
name=INDEX_NAME,
),
),
],
database_operations=[
migrations.RunPython(
create_parent_and_attach,
reverse_code=drop_parent_index,
),
],
),
]
@@ -946,7 +946,6 @@ class Resource(RowLevelSecurityProtectedModel):
     OpClass(Upper("name"), name="gin_trgm_ops"),
     name="res_name_trgm_idx",
 ),
-GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
 models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
 models.Index(
     fields=["tenant_id", "provider_id"],
@@ -1152,6 +1151,15 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
     fields=["tenant_id", "scan_id", "check_id"],
     name="find_tenant_scan_check_idx",
 ),
+GinIndex(
+    fields=[
+        "categories",
+        "resource_services",
+        "resource_regions",
+        "resource_types",
+    ],
+    name="gin_find_arrays_idx",
+),
 ]
 class JSONAPIMeta:
@@ -1,7 +1,7 @@
 openapi: 3.0.3
 info:
   title: Prowler API
-  version: 1.27.0
+  version: 1.28.0
   description: |-
     Prowler API specification.
@@ -4,6 +4,7 @@ import json
 import logging
 import os
 import time
+import uuid
 from collections import defaultdict
 from copy import deepcopy
 from datetime import datetime, timedelta, timezone
@@ -16,7 +17,7 @@ from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
 from allauth.socialaccount.providers.google.views import GoogleOAuth2Adapter
 from allauth.socialaccount.providers.saml.views import FinishACSView, LoginView
 from botocore.exceptions import ClientError, NoCredentialsError, ParamValidationError
-from celery import chain
+from celery import chain, states
 from celery.result import AsyncResult
 from config.custom_logging import BackendLogger
 from config.env import env
@@ -60,6 +61,7 @@ from django.utils.dateparse import parse_date
 from django.utils.decorators import method_decorator
 from django.views.decorators.cache import cache_control
 from django_celery_beat.models import PeriodicTask
+from django_celery_results.models import TaskResult
 from drf_spectacular.settings import spectacular_settings
 from drf_spectacular.types import OpenApiTypes
 from drf_spectacular.utils import (
@@ -422,7 +424,7 @@ class SchemaView(SpectacularAPIView):
     def get(self, request, *args, **kwargs):
         spectacular_settings.TITLE = "Prowler API"
-        spectacular_settings.VERSION = "1.27.0"
+        spectacular_settings.VERSION = "1.28.0"
         spectacular_settings.DESCRIPTION = (
             "Prowler API specification.\n\nThis file is auto-generated."
         )
@@ -2534,28 +2536,45 @@ class ScanViewSet(BaseRLSViewSet):
     def create(self, request, *args, **kwargs):
         input_serializer = self.get_serializer(data=request.data)
         input_serializer.is_valid(raise_exception=True)
+        # Broker publish is deferred to on_commit so the worker cannot read
+        # Scan before BaseRLSViewSet's dispatch-wide atomic commits.
+        pre_task_id = str(uuid.uuid4())
         with transaction.atomic():
             scan = input_serializer.save()
-        with transaction.atomic():
-            task = perform_scan_task.apply_async(
-                kwargs={
-                    "tenant_id": self.request.tenant_id,
-                    "scan_id": str(scan.id),
-                    "provider_id": str(scan.provider_id),
-                    # Disabled for now
-                    # checks_to_execute=scan.scanner_args.get("checks_to_execute")
-                },
-            )
+            scan.task_id = pre_task_id
+            scan.save(update_fields=["task_id"])
+
             attack_paths_db_utils.create_attack_paths_scan(
                 tenant_id=self.request.tenant_id,
                 scan_id=str(scan.id),
                 provider_id=str(scan.provider_id),
             )
+            task_result, _ = TaskResult.objects.get_or_create(
+                task_id=pre_task_id,
+                defaults={"status": states.PENDING, "task_name": "scan-perform"},
+            )
+            prowler_task, _ = Task.objects.update_or_create(
+                id=pre_task_id,
+                tenant_id=self.request.tenant_id,
+                defaults={"task_runner_task": task_result},
+            )
-        prowler_task = Task.objects.get(id=task.id)
-        scan.task_id = task.id
-        scan.save(update_fields=["task_id"])
+            scan_kwargs = {
+                "tenant_id": self.request.tenant_id,
+                "scan_id": str(scan.id),
+                "provider_id": str(scan.provider_id),
+                # Disabled for now
+                # checks_to_execute=scan.scanner_args.get("checks_to_execute")
+            }
+            transaction.on_commit(
+                lambda: perform_scan_task.apply_async(
+                    kwargs=scan_kwargs, task_id=pre_task_id
+                )
+            )
         self.response_serializer_class = TaskSerializer
         output_serializer = self.get_serializer(prowler_task)
(Generated file, +5983 lines: diff suppressed because it is too large.)
@@ -1,8 +1,9 @@
 #!/bin/bash
 # Run Prowler against All AWS Accounts in an AWS Organization
-# Activate Poetry Environment
-eval "$(poetry env activate)"
+# Activate uv-managed virtualenv
+# shellcheck disable=SC1091
+source .venv/bin/activate
 # Show Prowler Version
 prowler -v
@@ -0,0 +1,335 @@
# AWS Inventory Connectivity Graph
A community-contributed tool that generates interactive connectivity graphs from Prowler AWS scans, visualizing relationships between AWS resources with zero additional API calls.
## Overview
This tool extends Prowler by producing two artifacts after a scan completes:
- **`<output>.inventory.json`**: Machine-readable graph (nodes + edges)
- **`<output>.inventory.html`**: Interactive D3.js force-directed visualization
### Why?
Prowler's existing outputs (CSV, ASFF, OCSF, HTML) report individual check findings but provide no cross-service topology view. Security engineers need to understand **how** resources are connected—which Lambda functions sit inside which VPC, which IAM roles can be assumed by which services, which event sources trigger which functions—before they can reason about attack paths, blast-radius, or lateral-movement risk.
This tool fills that gap by building a connectivity graph from the service clients that are already loaded during a Prowler scan.
## Features
### Supported AWS Services
The tool currently extracts connectivity information from:
- **Lambda**: Functions, VPC/subnet/SG edges, event source mappings, layers, DLQ, KMS
- **EC2**: Instances, security groups, subnet/VPC edges
- **VPC**: VPCs, subnets, peering connections
- **RDS**: DB instances, VPC/SG/cluster/KMS edges
- **ELBv2**: ALB/NLB load balancers, SG and VPC edges
- **S3**: Buckets, replication targets, logging buckets, KMS keys
- **IAM**: Roles, trust-relationship edges (who can assume what)
### Edge Semantic Types
Edges are typed for downstream filtering and attack-path analysis:
- `network`: Resources share a network path (VPC/subnet/SG)
- `iam`: IAM trust or permission relationship
- `triggers`: One resource can invoke another (event source → Lambda)
- `data_flow`: Data is written/read (Lambda → SQS dead-letter queue)
- `depends_on`: Soft dependency (Lambda layer, subnet belongs to VPC)
- `routes_to`: Traffic routing (LB → target)
- `replicates_to`: S3 replication
- `encrypts`: KMS key encrypts the resource
- `logs_to`: Logging relationship
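
Because edge types are plain strings in the emitted graph, downstream filtering is a one-liner. A minimal sketch (the inline graph here is fabricated sample data, not real scan output):

```python
import json

# Fabricated sample graph in the tool's JSON shape; only the fields
# needed for filtering are included.
graph = json.loads("""
{
  "edges": [
    {"source_id": "role-a", "target_id": "role-b", "edge_type": "iam", "label": "can-assume"},
    {"source_id": "fn-x", "target_id": "vpc-1", "edge_type": "network", "label": "in-vpc"}
  ]
}
""")

# Keep only IAM trust/permission relationships for attack-path analysis.
iam_edges = [e for e in graph["edges"] if e["edge_type"] == "iam"]
print(len(iam_edges))  # → 1
```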
### Interactive HTML Graph Features
- Force-directed layout with drag-and-drop node pinning
- Zoom / pan (mouse wheel + click-drag on background)
- Per-service color-coded nodes with a legend
- Hover tooltips showing ARN + all metadata properties
- Service filter dropdown (show only Lambda, EC2, RDS, etc.)
- Adjustable link-distance and charge-strength physics sliders
- Edge labels on every arrow
## Installation
### Prerequisites
- Python 3.9.1 or higher
- Prowler installed and configured (see [Prowler documentation](https://docs.prowler.com/))
### Setup
1. Clone or download this directory to your local machine
2. Ensure Prowler is installed and working
3. No additional dependencies required beyond Prowler's existing requirements
## Usage
### Basic Usage
Run Prowler with your desired checks, then use the inventory graph script:
```bash
# Run Prowler scan (example)
prowler aws --output-formats csv
# Generate inventory graph from the scan
python contrib/inventory-graph/inventory_graph.py --output-directory ./output
```
### Command-Line Options
```bash
python contrib/inventory-graph/inventory_graph.py [OPTIONS]
Options:
--output-directory DIR Directory to save output files (default: ./output)
--output-filename NAME Base filename without extension (default: prowler-inventory-<timestamp>)
--help Show this help message and exit
```
### Example Workflow
```bash
# 1. Run a Prowler scan on your AWS account
prowler aws --profile my-aws-profile --output-formats csv html
# 2. Generate the inventory graph
python contrib/inventory-graph/inventory_graph.py \
--output-directory ./output \
--output-filename my-aws-inventory
# 3. Open the HTML file in your browser
open output/my-aws-inventory.inventory.html
```
### Integration with Prowler Scans
The tool reads from already-loaded AWS service clients in memory (via `sys.modules`). This means:
- **Zero extra AWS API calls**: Uses data already collected during the Prowler scan
- **Graceful degradation**: Services not scanned are silently skipped
- **Flexible**: Works with any subset of Prowler checks
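
The graceful-degradation behavior boils down to `sys.modules.get()` returning `None` for client modules that were never imported, so unscanned services simply yield nothing. The helper below is an illustrative sketch, not the actual `graph_builder` implementation:

```python
import sys

def get_loaded_client(module_name, attr):
    """Return a Prowler service client only if its module was already imported."""
    module = sys.modules.get(module_name)  # None if the service was never scanned
    return getattr(module, attr, None) if module else None

# Module path mirrors the Prowler client layout; in a fresh process with
# no scan run, nothing is loaded, so the lookup degrades to None.
client = get_loaded_client(
    "prowler.providers.aws.services.ec2.ec2_client", "ec2_client"
)
print(client is None)  # True when no EC2 scan ran in this process
```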
## Output Files
### JSON Output (`*.inventory.json`)
Machine-readable graph structure:
```json
{
"generated_at": "2026-03-19T12:34:56Z",
"nodes": [
{
"id": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
"type": "lambda_function",
"name": "my-function",
"service": "lambda",
"region": "us-east-1",
"account_id": "123456789012",
"properties": {
"runtime": "python3.9",
"vpc_id": "vpc-abc123"
}
}
],
"edges": [
{
"source_id": "arn:aws:lambda:...",
"target_id": "arn:aws:ec2:...:vpc/vpc-abc123",
"edge_type": "network",
"label": "in-vpc"
}
],
"stats": {
"node_count": 42,
"edge_count": 87
}
}
```
### HTML Output (`*.inventory.html`)
Self-contained interactive visualization that opens in any modern browser. No server or build step required.
## Architecture
### Design Decisions
| Decision | Rationale |
|----------|-----------|
| **Read from sys.modules** | Zero extra AWS API calls; services not scanned are silently skipped |
| **Self-contained HTML** | D3.js v7 via CDN; no server, no build step; opens in any browser |
| **One extractor per service** | Each extractor is independently testable; adding a new service = one new file + one line in the registry |
| **Typed edges** | Semantic types allow downstream consumers (attack-path tools, Neo4j import) to filter by relationship class |
### Project Structure
```
contrib/inventory-graph/
├── README.md # This file
├── inventory_graph.py # Main entry point script
├── lib/
│ ├── __init__.py
│ ├── models.py # ResourceNode, ResourceEdge, ConnectivityGraph dataclasses
│ ├── graph_builder.py # Reads loaded service clients from sys.modules
│ ├── inventory_output.py # write_json(), write_html()
│ └── extractors/
│ ├── __init__.py
│ ├── lambda_extractor.py # Lambda functions → VPC/subnet/SG/event-sources/layers/DLQ/KMS
│ ├── ec2_extractor.py # EC2 instances + security groups → subnet/VPC
│ ├── vpc_extractor.py # VPCs, subnets, peering connections
│ ├── rds_extractor.py # RDS instances → VPC/SG/cluster/KMS
│ ├── elbv2_extractor.py # ALB/NLB load balancers → SG/VPC
│ ├── s3_extractor.py # S3 buckets → replication targets/logging buckets/KMS keys
│ └── iam_extractor.py # IAM roles + trust-relationship edges
└── examples/
└── sample_output.html # Example output (optional)
```
## Testing
### Smoke Test (No AWS Credentials Needed)
```python
import sys
from unittest.mock import MagicMock
# Wire a fake Lambda client
mock_module = MagicMock()
mock_fn = MagicMock()
mock_fn.arn = "arn:aws:lambda:us-east-1:123:function:test"
mock_fn.name = "test"
mock_fn.region = "us-east-1"
mock_fn.vpc_id = "vpc-abc"
mock_fn.security_groups = ["sg-111"]
mock_fn.subnet_ids = {"subnet-aaa"}
mock_fn.environment = None
mock_fn.kms_key_arn = None
mock_fn.layers = []
mock_fn.dead_letter_config = None
mock_fn.event_source_mappings = []
mock_module.awslambda_client.functions = {mock_fn.arn: mock_fn}
mock_module.awslambda_client.audited_account = "123"
sys.modules["prowler.providers.aws.services.awslambda.awslambda_client"] = mock_module
from contrib.inventory_graph.lib.graph_builder import build_graph
from contrib.inventory_graph.lib.inventory_output import write_json, write_html
graph = build_graph()
write_json(graph, "/tmp/test.inventory.json")
write_html(graph, "/tmp/test.inventory.html")
# Open /tmp/test.inventory.html in a browser
```
## Extending
### Adding a New Service
1. Create a new extractor file in `lib/extractors/` (e.g., `dynamodb_extractor.py`)
2. Implement the `extract(client)` function that returns `(nodes, edges)`
3. Register it in `lib/graph_builder.py` in the `_SERVICE_REGISTRY` tuple
Example extractor template:
```python
from typing import List, Tuple
from prowler.lib.outputs.inventory.models import ResourceNode, ResourceEdge
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""Extract DynamoDB tables and their relationships."""
nodes = []
edges = []
for table in client.tables:
nodes.append(
ResourceNode(
id=table.arn,
type="dynamodb_table",
name=table.name,
service="dynamodb",
region=table.region,
account_id=client.audited_account,
properties={"billing_mode": table.billing_mode}
)
)
# Add edges for KMS encryption, streams, etc.
if table.kms_key_arn:
edges.append(
ResourceEdge(
source_id=table.kms_key_arn,
target_id=table.arn,
edge_type="encrypts",
label="encrypts"
)
)
return nodes, edges
```
## Troubleshooting
### No nodes discovered
**Problem:** The tool reports "no nodes discovered" after running.
**Solution:** Ensure you've run a Prowler scan first. The tool reads from in-memory service clients loaded during the scan. If no services were scanned, no nodes will be discovered.
### Missing services in the graph
**Problem:** Some AWS services are not appearing in the graph.
**Solution:** The tool only includes services that have been scanned by Prowler. Run Prowler with the services you want to include, or run without service filters to scan all available services.
### HTML file doesn't display properly
**Problem:** The HTML visualization doesn't load or shows errors.
**Solution:**
- Ensure you're opening the file in a modern browser (Chrome, Firefox, Safari, Edge)
- Check your browser's console for JavaScript errors
- Verify the file was generated completely (check file size > 0)
- The HTML requires internet access to load D3.js from CDN
## Roadmap
Potential future enhancements:
- [ ] Support for additional AWS services (DynamoDB, SQS, SNS, etc.)
- [ ] Export to Neo4j / Cartography format
- [ ] Attack path analysis integration
- [ ] Multi-account/multi-region aggregation
- [ ] Custom edge type filtering in HTML UI
- [ ] Graph diff between two scans
## Contributing
This is a community contribution. If you'd like to enhance it:
1. Fork the Prowler repository
2. Make your changes in `contrib/inventory-graph/`
3. Test thoroughly
4. Submit a pull request with a clear description
## License
This tool is part of the Prowler project and is licensed under the Apache License 2.0.
## Credits
- **Author:** [@sandiyochristan](https://github.com/sandiyochristan)
- **Related PR:** [#10382](https://github.com/prowler-cloud/prowler/pull/10382)
- **Prowler Project:** [prowler-cloud/prowler](https://github.com/prowler-cloud/prowler)
## Support
For issues or questions:
- Open an issue in the [Prowler repository](https://github.com/prowler-cloud/prowler/issues)
- Join the [Prowler Community Slack](https://goto.prowler.com/slack)
- Tag your issue with `contrib:inventory-graph`
@@ -0,0 +1,181 @@
#!/usr/bin/env python3
"""
Example: Generate AWS Inventory Graph with Mock Data
This example demonstrates how to use the inventory graph tool with mock AWS data.
No AWS credentials required.
"""
import sys
from pathlib import Path
from unittest.mock import MagicMock
# Add parent directory to path
sys.path.insert(0, str(Path(__file__).parent.parent))
from lib.graph_builder import build_graph
from lib.inventory_output import write_json, write_html
def create_mock_lambda_client():
"""Create a mock Lambda client with sample data."""
mock_module = MagicMock()
# Create a mock Lambda function
mock_fn = MagicMock()
mock_fn.arn = "arn:aws:lambda:us-east-1:123456789012:function:my-test-function"
mock_fn.name = "my-test-function"
mock_fn.region = "us-east-1"
mock_fn.vpc_id = "vpc-abc123"
mock_fn.security_groups = ["sg-111222"]
mock_fn.subnet_ids = {"subnet-aaa111", "subnet-bbb222"}
mock_fn.environment = {"Variables": {"ENV": "production"}}
mock_fn.kms_key_arn = (
"arn:aws:kms:us-east-1:123456789012:key/12345678-1234-1234-1234-123456789012"
)
mock_fn.layers = []
mock_fn.dead_letter_config = None
mock_fn.event_source_mappings = []
mock_module.awslambda_client.functions = {mock_fn.arn: mock_fn}
mock_module.awslambda_client.audited_account = "123456789012"
return mock_module
def create_mock_ec2_client():
"""Create a mock EC2 client with sample data."""
mock_module = MagicMock()
# Create a mock EC2 instance
mock_instance = MagicMock()
mock_instance.arn = (
"arn:aws:ec2:us-east-1:123456789012:instance/i-1234567890abcdef0"
)
mock_instance.id = "i-1234567890abcdef0"
mock_instance.region = "us-east-1"
mock_instance.vpc_id = "vpc-abc123"
mock_instance.subnet_id = "subnet-aaa111"
mock_instance.security_groups = [MagicMock(id="sg-111222")]
mock_instance.state = "running"
mock_instance.type = "t3.micro"
mock_instance.tags = [{"Key": "Name", "Value": "test-instance"}]
# Create a mock security group
mock_sg = MagicMock()
mock_sg.arn = "arn:aws:ec2:us-east-1:123456789012:security-group/sg-111222"
mock_sg.id = "sg-111222"
mock_sg.name = "test-security-group"
mock_sg.region = "us-east-1"
mock_sg.vpc_id = "vpc-abc123"
mock_module.ec2_client.instances = [mock_instance]
mock_module.ec2_client.security_groups = [mock_sg]
mock_module.ec2_client.audited_account = "123456789012"
return mock_module
def create_mock_vpc_client():
"""Create a mock VPC client with sample data."""
mock_module = MagicMock()
# Create a mock VPC
mock_vpc = MagicMock()
mock_vpc.arn = "arn:aws:ec2:us-east-1:123456789012:vpc/vpc-abc123"
mock_vpc.id = "vpc-abc123"
mock_vpc.region = "us-east-1"
mock_vpc.cidr_block = "10.0.0.0/16"
mock_vpc.tags = [{"Key": "Name", "Value": "test-vpc"}]
# Create mock subnets
mock_subnet1 = MagicMock()
mock_subnet1.arn = "arn:aws:ec2:us-east-1:123456789012:subnet/subnet-aaa111"
mock_subnet1.id = "subnet-aaa111"
mock_subnet1.region = "us-east-1"
mock_subnet1.vpc_id = "vpc-abc123"
mock_subnet1.cidr_block = "10.0.1.0/24"
mock_subnet1.availability_zone = "us-east-1a"
mock_subnet2 = MagicMock()
mock_subnet2.arn = "arn:aws:ec2:us-east-1:123456789012:subnet/subnet-bbb222"
mock_subnet2.id = "subnet-bbb222"
mock_subnet2.region = "us-east-1"
mock_subnet2.vpc_id = "vpc-abc123"
mock_subnet2.cidr_block = "10.0.2.0/24"
mock_subnet2.availability_zone = "us-east-1b"
mock_module.vpc_client.vpcs = [mock_vpc]
mock_module.vpc_client.subnets = [mock_subnet1, mock_subnet2]
mock_module.vpc_client.vpc_peering_connections = []
mock_module.vpc_client.audited_account = "123456789012"
return mock_module
def main():
"""Main function to demonstrate the inventory graph generation."""
print("=" * 70)
print("AWS Inventory Graph - Mock Data Example")
print("=" * 70)
print()
# Create mock clients and inject them into sys.modules
print("Creating mock AWS service clients...")
sys.modules["prowler.providers.aws.services.awslambda.awslambda_client"] = (
create_mock_lambda_client()
)
sys.modules["prowler.providers.aws.services.ec2.ec2_client"] = (
create_mock_ec2_client()
)
sys.modules["prowler.providers.aws.services.vpc.vpc_client"] = (
create_mock_vpc_client()
)
print("✓ Mock clients created")
print()
# Build the graph
print("Building connectivity graph...")
graph = build_graph()
print(f"✓ Graph built: {len(graph.nodes)} nodes, {len(graph.edges)} edges")
print()
# Display discovered nodes
print("Discovered nodes:")
for node in graph.nodes:
print(f" - {node.type}: {node.name} ({node.region})")
print()
# Display discovered edges
print("Discovered edges:")
for edge in graph.edges:
source_node = next((n for n in graph.nodes if n.id == edge.source_id), None)
target_node = next((n for n in graph.nodes if n.id == edge.target_id), None)
source_name = source_node.name if source_node else edge.source_id
target_name = target_node.name if target_node else edge.target_id
print(f" - {source_name} --[{edge.edge_type}]--> {target_name}")
print()
# Write outputs
output_dir = Path(__file__).parent
json_path = output_dir / "example_output.inventory.json"
html_path = output_dir / "example_output.inventory.html"
print("Writing output files...")
write_json(graph, str(json_path))
write_html(graph, str(html_path))
print(f"✓ JSON written to: {json_path}")
print(f"✓ HTML written to: {html_path}")
print()
print("=" * 70)
print("✓ Example complete!")
print("=" * 70)
print()
    print("Open the HTML file to view the interactive graph:")
print(f" open {html_path}")
print()
if __name__ == "__main__":
main()
@@ -0,0 +1,158 @@
#!/usr/bin/env python3
"""
AWS Inventory Connectivity Graph Generator
A standalone tool that generates interactive connectivity graphs from Prowler AWS scans.
This tool reads from already-loaded AWS service clients in memory and produces:
- JSON graph (nodes + edges)
- Interactive HTML visualization
Usage:
python inventory_graph.py --output-directory ./output --output-filename my-inventory
For more information, see README.md
"""
import argparse
import sys
from datetime import datetime
from pathlib import Path
# Add the contrib directory to the path so we can import the lib modules
CONTRIB_DIR = Path(__file__).parent
sys.path.insert(0, str(CONTRIB_DIR))
from lib.graph_builder import build_graph
from lib.inventory_output import write_json, write_html
def parse_arguments():
"""Parse command-line arguments."""
parser = argparse.ArgumentParser(
description="Generate AWS inventory connectivity graph from Prowler scan data",
formatter_class=argparse.RawDescriptionHelpFormatter,
epilog="""
Examples:
# Generate graph with default settings
python inventory_graph.py
# Specify custom output directory and filename
python inventory_graph.py --output-directory ./my-output --output-filename aws-inventory
# After running a Prowler scan
prowler aws --profile my-profile
python inventory_graph.py --output-directory ./output
For more information, see README.md
""",
)
parser.add_argument(
"--output-directory",
"-o",
default="./output",
help="Directory to save output files (default: ./output)",
)
parser.add_argument(
"--output-filename",
"-f",
default=None,
help="Base filename without extension (default: prowler-inventory-<timestamp>)",
)
parser.add_argument(
"--verbose",
"-v",
action="store_true",
help="Enable verbose output",
)
return parser.parse_args()
def main():
"""Main entry point for the inventory graph generator."""
args = parse_arguments()
# Set up output paths
output_dir = Path(args.output_directory)
output_dir.mkdir(parents=True, exist_ok=True)
# Generate filename with timestamp if not provided
if args.output_filename:
base_filename = args.output_filename
else:
timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
base_filename = f"prowler-inventory-{timestamp}"
json_path = output_dir / f"{base_filename}.inventory.json"
html_path = output_dir / f"{base_filename}.inventory.html"
print("=" * 70)
print("AWS Inventory Connectivity Graph Generator")
print("=" * 70)
print()
# Build the graph from loaded service clients
if args.verbose:
print("Building connectivity graph from loaded AWS service clients...")
graph = build_graph()
# Check if any nodes were discovered
if not graph.nodes:
print("⚠️ WARNING: No nodes discovered!")
print()
print("This usually means:")
print(" 1. No Prowler scan has been run yet in this Python session")
print(" 2. No AWS service clients are loaded in memory")
print()
print("To fix this:")
print(" 1. Run a Prowler scan first: prowler aws --output-formats csv")
print(" 2. Then run this script in the same session")
print()
print(
"Alternatively, integrate this tool directly into Prowler's output pipeline."
)
sys.exit(1)
print(f"✓ Discovered {len(graph.nodes)} nodes and {len(graph.edges)} edges")
print()
# Write outputs
if args.verbose:
print(f"Writing JSON output to: {json_path}")
write_json(graph, str(json_path))
if args.verbose:
print(f"Writing HTML output to: {html_path}")
write_html(graph, str(html_path))
print()
print("=" * 70)
print("✓ Graph generation complete!")
print("=" * 70)
print()
print(f"📄 JSON: {json_path}")
print(f"🌐 HTML: {html_path}")
print()
    print("Open the HTML file in your browser to explore the interactive graph:")
print(f" open {html_path}")
print()
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:
print("\n\nInterrupted by user. Exiting...")
sys.exit(130)
except Exception as e:
print(f"\n❌ Error: {e}", file=sys.stderr)
if "--verbose" in sys.argv or "-v" in sys.argv:
import traceback
traceback.print_exc()
sys.exit(1)
@@ -0,0 +1,94 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract EC2 instance and security-group nodes with their edges.
    Edges produced:
      - instance → security-group [network]
      - instance → subnet [network]
      - security-group → VPC [network]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
# EC2 Instances
for instance in client.instances:
name = instance.id
for tag in instance.tags or []:
if tag.get("Key") == "Name":
name = tag["Value"]
break
props = {
"instance_type": getattr(instance, "type", None),
"state": getattr(instance, "state", None),
"vpc_id": getattr(instance, "vpc_id", None),
"subnet_id": getattr(instance, "subnet_id", None),
"public_ip": getattr(instance, "public_ip_address", None),
"private_ip": getattr(instance, "private_ip_address", None),
}
nodes.append(
ResourceNode(
id=instance.arn,
type="ec2_instance",
name=name,
service="ec2",
region=instance.region,
account_id=client.audited_account,
properties={k: v for k, v in props.items() if v is not None},
)
)
for sg_id in instance.security_groups or []:
edges.append(
ResourceEdge(
source_id=instance.arn,
target_id=sg_id,
edge_type="network",
label="sg",
)
)
if instance.subnet_id:
edges.append(
ResourceEdge(
source_id=instance.arn,
target_id=instance.subnet_id,
edge_type="network",
label="subnet",
)
)
# Security Groups
for sg in client.security_groups.values():
name = (
sg.name if hasattr(sg, "name") else sg.id if hasattr(sg, "id") else sg.arn
)
nodes.append(
ResourceNode(
id=sg.arn,
type="security_group",
name=name,
service="ec2",
region=sg.region,
account_id=client.audited_account,
properties={"vpc_id": sg.vpc_id},
)
)
if sg.vpc_id:
edges.append(
ResourceEdge(
source_id=sg.arn,
target_id=sg.vpc_id,
edge_type="network",
label="in-vpc",
)
)
return nodes, edges
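The Name-tag lookup that the EC2 and VPC extractors both inline can be sketched as a standalone helper (`name_from_tags` is a hypothetical name for illustration, not part of the extractor API):

```python
def name_from_tags(resource_id, tags):
    """Return the value of the Name tag, falling back to the resource id."""
    for tag in tags or []:
        # Tags arrive as {"Key": ..., "Value": ...} dicts from the AWS API.
        if isinstance(tag, dict) and tag.get("Key") == "Name":
            return tag["Value"]
    return resource_id


print(name_from_tags("i-0abc", [{"Key": "Name", "Value": "web-1"}]))  # → web-1
print(name_from_tags("i-0abc", None))  # → i-0abc
```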
@@ -0,0 +1,60 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract ELBv2 (ALB/NLB) load balancer nodes and their edges.
    Edges produced:
      - load_balancer → security-group [network]
      - load_balancer → VPC [network]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
for lb in client.loadbalancersv2.values():
props = {
"type": getattr(lb, "type", None),
"scheme": getattr(lb, "scheme", None),
"dns_name": getattr(lb, "dns", None),
"vpc_id": getattr(lb, "vpc_id", None),
}
name = getattr(lb, "name", lb.arn.split("/")[-2] if "/" in lb.arn else lb.arn)
nodes.append(
ResourceNode(
id=lb.arn,
type="load_balancer",
name=name,
service="elbv2",
region=lb.region,
account_id=client.audited_account,
properties={k: v for k, v in props.items() if v is not None},
)
)
for sg_id in lb.security_groups or []:
edges.append(
ResourceEdge(
source_id=lb.arn,
target_id=sg_id,
edge_type="network",
label="sg",
)
)
vpc_id = getattr(lb, "vpc_id", None)
if vpc_id:
edges.append(
ResourceEdge(
source_id=lb.arn,
target_id=vpc_id,
edge_type="network",
label="in-vpc",
)
)
return nodes, edges
@@ -0,0 +1,84 @@
import json
from typing import Any, Dict, List, Tuple
from prowler.lib.logger import logger
from lib.models import ResourceEdge, ResourceNode
def _parse_trust_principals(assume_role_policy: Any) -> List[str]:
"""
Return a flat list of principal strings from an IAM assume-role policy document.
The policy may be a dict already or a JSON string.
"""
if not assume_role_policy:
return []
if isinstance(assume_role_policy, str):
try:
assume_role_policy = json.loads(assume_role_policy)
except (json.JSONDecodeError, ValueError):
return []
principals = []
for statement in assume_role_policy.get("Statement", []):
principal = statement.get("Principal", {})
if isinstance(principal, str):
principals.append(principal)
elif isinstance(principal, dict):
for v in principal.values():
if isinstance(v, list):
principals.extend(v)
else:
principals.append(v)
elif isinstance(principal, list):
principals.extend(principal)
return principals
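As a worked example, the parser above flattens a typical service trust policy like this (the flattening logic is restated locally so the snippet runs standalone):

```python
import json


def parse_trust_principals(assume_role_policy):
    # Same flattening as _parse_trust_principals above.
    if not assume_role_policy:
        return []
    if isinstance(assume_role_policy, str):
        try:
            assume_role_policy = json.loads(assume_role_policy)
        except (json.JSONDecodeError, ValueError):
            return []
    principals = []
    for statement in assume_role_policy.get("Statement", []):
        principal = statement.get("Principal", {})
        if isinstance(principal, str):
            principals.append(principal)
        elif isinstance(principal, dict):
            for v in principal.values():
                principals.extend(v if isinstance(v, list) else [v])
        elif isinstance(principal, list):
            principals.extend(principal)
    return principals


policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Principal": {"Service": "lambda.amazonaws.com",
                       "AWS": ["arn:aws:iam::123456789012:root"]},
         "Action": "sts:AssumeRole"}
    ],
})
print(parse_trust_principals(policy))
# → ['lambda.amazonaws.com', 'arn:aws:iam::123456789012:root']
```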
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract IAM role nodes and their trust-relationship edges.
    Edges produced:
      - trusted-principal → role [iam] (who can assume this role)
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
for role in client.roles:
props: Dict[str, Any] = {
"path": getattr(role, "path", None),
"create_date": str(getattr(role, "create_date", "") or ""),
}
nodes.append(
ResourceNode(
id=role.arn,
type="iam_role",
name=role.name,
service="iam",
region="global",
account_id=client.audited_account,
properties={k: v for k, v in props.items() if v},
)
)
# Trust-relationship edges: principal → role (principal CAN assume role)
try:
for principal in _parse_trust_principals(role.assume_role_policy):
if principal and principal != "*":
edges.append(
ResourceEdge(
source_id=principal,
target_id=role.arn,
edge_type="iam",
label="can-assume",
)
)
except Exception as e:
logger.debug(
f"inventory iam_extractor: could not parse trust policy for {role.arn}: {e}"
)
return nodes, edges
@@ -0,0 +1,118 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract Lambda function nodes and their edges from an awslambda_client.
    Edges produced:
      - lambda → VPC [network]
      - lambda → subnet [network]
      - lambda → security-group [network]
      - event source → lambda [triggers] (from EventSourceMapping)
      - lambda → layer ARN [depends_on]
      - lambda → DLQ target [data_flow]
      - KMS key → lambda [encrypts]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
for fn in client.functions.values():
props = {
"runtime": fn.runtime,
"vpc_id": fn.vpc_id,
}
if fn.environment:
props["has_env_vars"] = True
if fn.kms_key_arn:
props["kms_key_arn"] = fn.kms_key_arn
nodes.append(
ResourceNode(
id=fn.arn,
type="lambda_function",
name=fn.name,
service="lambda",
region=fn.region,
account_id=client.audited_account,
properties=props,
)
)
# Network edges → VPC, subnets, security groups
if fn.vpc_id:
edges.append(
ResourceEdge(
source_id=fn.arn,
target_id=fn.vpc_id,
edge_type="network",
label="in-vpc",
)
)
for sg_id in fn.security_groups or []:
edges.append(
ResourceEdge(
source_id=fn.arn,
target_id=sg_id,
edge_type="network",
label="sg",
)
)
for subnet_id in fn.subnet_ids or set():
edges.append(
ResourceEdge(
source_id=fn.arn,
target_id=subnet_id,
edge_type="network",
label="subnet",
)
)
# Trigger edges from event source mappings
for esm in getattr(fn, "event_source_mappings", []):
edges.append(
ResourceEdge(
source_id=esm.event_source_arn,
target_id=fn.arn,
edge_type="triggers",
label=f"esm:{esm.state}",
)
)
# Layer dependency edges
for layer in getattr(fn, "layers", []):
edges.append(
ResourceEdge(
source_id=fn.arn,
target_id=layer.arn,
edge_type="depends_on",
label="layer",
)
)
# Dead-letter queue data-flow edge
dlq = getattr(fn, "dead_letter_config", None)
if dlq and dlq.target_arn:
edges.append(
ResourceEdge(
source_id=fn.arn,
target_id=dlq.target_arn,
edge_type="data_flow",
label="dlq",
)
)
# KMS encryption edge
if fn.kms_key_arn:
edges.append(
ResourceEdge(
source_id=fn.kms_key_arn,
target_id=fn.arn,
edge_type="encrypts",
label="kms",
)
)
return nodes, edges
@@ -0,0 +1,86 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract RDS DB instance nodes and their edges.
    Edges produced:
      - db_instance → security-group [network]
      - db_instance → VPC [network]
      - db_instance → cluster [depends_on]
      - KMS key → db_instance [encrypts]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
for db in client.db_instances.values():
props = {
"engine": getattr(db, "engine", None),
"engine_version": getattr(db, "engine_version", None),
"instance_class": getattr(db, "db_instance_class", None),
"vpc_id": getattr(db, "vpc_id", None),
"multi_az": getattr(db, "multi_az", None),
"publicly_accessible": getattr(db, "publicly_accessible", None),
"storage_encrypted": getattr(db, "storage_encrypted", None),
}
nodes.append(
ResourceNode(
id=db.arn,
type="rds_instance",
name=db.id,
service="rds",
region=db.region,
account_id=client.audited_account,
properties={k: v for k, v in props.items() if v is not None},
)
)
for sg in getattr(db, "security_groups", []):
sg_id = sg if isinstance(sg, str) else getattr(sg, "id", str(sg))
edges.append(
ResourceEdge(
source_id=db.arn,
target_id=sg_id,
edge_type="network",
label="sg",
)
)
vpc_id = getattr(db, "vpc_id", None)
if vpc_id:
edges.append(
ResourceEdge(
source_id=db.arn,
target_id=vpc_id,
edge_type="network",
label="in-vpc",
)
)
cluster_arn = getattr(db, "cluster_arn", None)
if cluster_arn:
edges.append(
ResourceEdge(
source_id=db.arn,
target_id=cluster_arn,
edge_type="depends_on",
label="cluster-member",
)
)
kms_key_id = getattr(db, "kms_key_id", None)
if kms_key_id:
edges.append(
ResourceEdge(
source_id=kms_key_id,
target_id=db.arn,
edge_type="encrypts",
label="kms",
)
)
return nodes, edges
@@ -0,0 +1,92 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract S3 bucket nodes and their edges.
    Edges produced:
      - bucket → replication-target bucket [replicates_to]
      - KMS key → bucket [encrypts]
      - bucket → logging bucket [logs_to]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
for bucket in client.buckets.values():
encryption = getattr(bucket, "encryption", None)
versioning = getattr(bucket, "versioning_enabled", None)
logging = getattr(bucket, "logging", None)
props = {}
if versioning is not None:
props["versioning"] = versioning
if encryption:
enc_type = getattr(encryption, "type", str(encryption))
props["encryption"] = enc_type
nodes.append(
ResourceNode(
id=bucket.arn,
type="s3_bucket",
name=bucket.name,
service="s3",
region=bucket.region,
account_id=client.audited_account,
properties=props,
)
)
# Replication edges
for rule in getattr(bucket, "replication_rules", None) or []:
dest_bucket = getattr(rule, "destination_bucket", None)
if dest_bucket:
dest_arn = (
dest_bucket
if dest_bucket.startswith("arn:")
else f"arn:aws:s3:::{dest_bucket}"
)
edges.append(
ResourceEdge(
source_id=bucket.arn,
target_id=dest_arn,
edge_type="replicates_to",
label="s3-replication",
)
)
# Logging edges
if logging:
target_bucket = getattr(logging, "target_bucket", None)
if target_bucket:
target_arn = (
target_bucket
if target_bucket.startswith("arn:")
else f"arn:aws:s3:::{target_bucket}"
)
edges.append(
ResourceEdge(
source_id=bucket.arn,
target_id=target_arn,
edge_type="logs_to",
label="access-logs",
)
)
# KMS encryption edges
if encryption:
kms_arn = getattr(encryption, "kms_master_key_id", None)
if kms_arn:
edges.append(
ResourceEdge(
source_id=kms_arn,
target_id=bucket.arn,
edge_type="encrypts",
label="kms",
)
)
return nodes, edges
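Both the replication and logging branches above normalise a bare bucket name to an ARN before emitting an edge; the pattern can be sketched as a standalone helper (`to_bucket_arn` is a hypothetical name for illustration):

```python
def to_bucket_arn(ref: str) -> str:
    """Accept either a bucket name or a full ARN and return the ARN form."""
    return ref if ref.startswith("arn:") else f"arn:aws:s3:::{ref}"


print(to_bucket_arn("my-logs"))               # → arn:aws:s3:::my-logs
print(to_bucket_arn("arn:aws:s3:::my-logs"))  # → arn:aws:s3:::my-logs
```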
@@ -0,0 +1,92 @@
from typing import List, Tuple
from lib.models import ResourceEdge, ResourceNode
def extract(client) -> Tuple[List[ResourceNode], List[ResourceEdge]]:
"""
Extract VPC and subnet nodes with their edges.
    Edges produced:
      - subnet → VPC [depends_on]
      - peering connection → accepter VPC [network]
"""
nodes: List[ResourceNode] = []
edges: List[ResourceEdge] = []
# VPCs
for vpc in client.vpcs.values():
name = vpc.id if hasattr(vpc, "id") else vpc.arn
for tag in vpc.tags or []:
if isinstance(tag, dict) and tag.get("Key") == "Name":
name = tag["Value"]
break
nodes.append(
ResourceNode(
id=vpc.arn,
type="vpc",
name=name,
service="vpc",
region=vpc.region,
account_id=client.audited_account,
properties={
"cidr_block": getattr(vpc, "cidr_block", None),
"is_default": getattr(vpc, "is_default", None),
},
)
)
# VPC Subnets
for subnet in client.vpc_subnets.values():
name = subnet.id if hasattr(subnet, "id") else subnet.arn
for tag in getattr(subnet, "tags", None) or []:
if isinstance(tag, dict) and tag.get("Key") == "Name":
name = tag["Value"]
break
nodes.append(
ResourceNode(
id=subnet.arn,
type="subnet",
name=name,
service="vpc",
region=subnet.region,
account_id=client.audited_account,
properties={
"vpc_id": getattr(subnet, "vpc_id", None),
"cidr_block": getattr(subnet, "cidr_block", None),
"availability_zone": getattr(subnet, "availability_zone", None),
"public": getattr(subnet, "public", None),
},
)
)
vpc_id = getattr(subnet, "vpc_id", None)
if vpc_id:
# Find the VPC ARN for this vpc_id
vpc_arn = next(
(v.arn for v in client.vpcs.values() if v.id == vpc_id),
vpc_id,
)
edges.append(
ResourceEdge(
source_id=subnet.arn,
target_id=vpc_arn,
edge_type="depends_on",
label="subnet-of",
)
)
    # VPC peering connections may be exposed as a list or a dict depending on
    # the client; normalise to an iterable of peering objects.
    peerings = getattr(client, "vpc_peering_connections", None) or []
    if isinstance(peerings, dict):
        peerings = peerings.values()
    for peering in peerings:
        edges.append(
            ResourceEdge(
                source_id=peering.arn,
                target_id=getattr(peering, "accepter_vpc_id", peering.arn),
                edge_type="network",
                label="vpc-peer",
            )
        )
return nodes, edges
@@ -0,0 +1,106 @@
"""
graph_builder.py
----------------
Builds a ConnectivityGraph by reading already-loaded AWS service clients from
sys.modules. Only services that were actually scanned (i.e. whose client
module is already imported) contribute nodes and edges. Unknown / unloaded
services are silently skipped, so the output degrades gracefully when only a
subset of checks has been run.
"""
import sys
from typing import Tuple
from prowler.lib.logger import logger
from lib.models import ConnectivityGraph
# Registry: (sys.modules key, attribute name inside that module, extractor module path)
_SERVICE_REGISTRY: Tuple[Tuple[str, str, str], ...] = (
(
"prowler.providers.aws.services.awslambda.awslambda_client",
"awslambda_client",
"lib.extractors.lambda_extractor",
),
(
"prowler.providers.aws.services.ec2.ec2_client",
"ec2_client",
"lib.extractors.ec2_extractor",
),
(
"prowler.providers.aws.services.vpc.vpc_client",
"vpc_client",
"lib.extractors.vpc_extractor",
),
(
"prowler.providers.aws.services.rds.rds_client",
"rds_client",
"lib.extractors.rds_extractor",
),
(
"prowler.providers.aws.services.elbv2.elbv2_client",
"elbv2_client",
"lib.extractors.elbv2_extractor",
),
(
"prowler.providers.aws.services.s3.s3_client",
"s3_client",
"lib.extractors.s3_extractor",
),
(
"prowler.providers.aws.services.iam.iam_client",
"iam_client",
"lib.extractors.iam_extractor",
),
)
def build_graph() -> ConnectivityGraph:
"""
Iterate over every registered service, check whether its client module is
already loaded, and call the corresponding extractor.
Returns a ConnectivityGraph with all discovered nodes and edges.
Duplicate node IDs are silently deduplicated (first occurrence wins).
"""
graph = ConnectivityGraph()
seen_node_ids: set = set()
for client_module_key, client_attr, extractor_module_key in _SERVICE_REGISTRY:
client_module = sys.modules.get(client_module_key)
if client_module is None:
continue
service_client = getattr(client_module, client_attr, None)
if service_client is None:
continue
extractor_module = sys.modules.get(extractor_module_key)
if extractor_module is None:
try:
import importlib
extractor_module = importlib.import_module(extractor_module_key)
except ImportError as e:
logger.debug(
f"inventory graph_builder: cannot import extractor {extractor_module_key}: {e}"
)
continue
try:
nodes, edges = extractor_module.extract(service_client)
except Exception as e:
logger.error(
f"inventory graph_builder: extractor {extractor_module_key} failed: "
f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}]: {e}"
)
continue
for node in nodes:
if node.id not in seen_node_ids:
graph.add_node(node)
seen_node_ids.add(node.id)
for edge in edges:
graph.add_edge(edge)
return graph
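The "only already-imported clients contribute" behaviour relies on `sys.modules` acting as Python's import cache; a minimal sketch of the detection step, using stdlib modules as stand-ins for the Prowler client modules:

```python
import sys


def loaded_client(module_key, attr):
    """Return the client singleton if its module was already imported, else None."""
    module = sys.modules.get(module_key)
    return getattr(module, attr, None) if module is not None else None


import json  # stands in for a scanned service client module

print(loaded_client("json", "loads") is json.loads)        # → True
print(loaded_client("prowler.never.scanned", "client"))    # → None
```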
@@ -0,0 +1,502 @@
"""
inventory_output.py
-------------------
Writes the ConnectivityGraph produced by graph_builder to two files:
<output_path>.inventory.json machine-readable graph (nodes + edges)
<output_path>.inventory.html interactive D3.js force-directed graph
"""
import json
import os
from dataclasses import asdict
from datetime import datetime
from typing import Optional
from prowler.lib.logger import logger
from lib.models import ConnectivityGraph
# ---------------------------------------------------------------------------
# JSON output
# ---------------------------------------------------------------------------
def write_json(graph: ConnectivityGraph, file_path: str) -> None:
"""Serialise the graph to a JSON file."""
try:
os.makedirs(os.path.dirname(file_path), exist_ok=True)
data = {
"generated_at": datetime.utcnow().isoformat() + "Z",
"nodes": [asdict(n) for n in graph.nodes],
"edges": [asdict(e) for e in graph.edges],
"stats": {
"node_count": len(graph.nodes),
"edge_count": len(graph.edges),
},
}
with open(file_path, "w", encoding="utf-8") as fh:
json.dump(data, fh, indent=2, default=str)
logger.info(f"Inventory graph JSON written to {file_path}")
except Exception as e:
logger.error(
f"inventory_output.write_json: {e.__class__.__name__}[{e.__traceback__.tb_lineno}]: {e}"
)
# ---------------------------------------------------------------------------
# HTML output (self-contained, D3.js CDN)
# ---------------------------------------------------------------------------
# Colour palette per node type
_NODE_COLOURS = {
"lambda_function": "#f59e0b",
"ec2_instance": "#3b82f6",
"security_group": "#6366f1",
"vpc": "#10b981",
"subnet": "#34d399",
"rds_instance": "#ef4444",
"load_balancer": "#8b5cf6",
"s3_bucket": "#06b6d4",
"iam_role": "#f97316",
"default": "#94a3b8",
}
# Edge stroke colours per edge type
_EDGE_COLOURS = {
"network": "#64748b",
"iam": "#f97316",
"triggers": "#a855f7",
"data_flow": "#0ea5e9",
"depends_on": "#94a3b8",
"routes_to": "#22c55e",
"replicates_to": "#ec4899",
"encrypts": "#eab308",
"logs_to": "#78716c",
}
_HTML_TEMPLATE = """\
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8"/>
<meta name="viewport" content="width=device-width, initial-scale=1.0"/>
<title>Prowler AWS Connectivity Graph</title>
<script src="https://d3js.org/d3.v7.min.js"></script>
<style>
*, *::before, *::after {{ box-sizing: border-box; }}
body {{
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, sans-serif;
background: #0f172a;
color: #e2e8f0;
}}
#header {{
padding: 12px 20px;
background: #1e293b;
border-bottom: 1px solid #334155;
display: flex;
align-items: center;
gap: 16px;
}}
#header h1 {{ margin: 0; font-size: 18px; font-weight: 700; }}
#header .stats {{ font-size: 13px; color: #94a3b8; }}
#controls {{
padding: 8px 20px;
background: #1e293b;
border-bottom: 1px solid #334155;
display: flex;
gap: 12px;
align-items: center;
flex-wrap: wrap;
}}
#controls label {{ font-size: 12px; color: #94a3b8; }}
#controls select, #controls input[type=range] {{
background: #0f172a;
color: #e2e8f0;
border: 1px solid #334155;
border-radius: 4px;
padding: 3px 6px;
font-size: 12px;
}}
#graph-container {{ width: 100%; height: calc(100vh - 100px); position: relative; }}
svg {{ width: 100%; height: 100%; }}
.node circle {{
stroke: #1e293b;
stroke-width: 1.5px;
cursor: pointer;
transition: r 0.15s;
}}
.node circle:hover {{ stroke-width: 3px; }}
.node text {{
font-size: 10px;
fill: #e2e8f0;
pointer-events: none;
text-shadow: 0 0 4px #0f172a;
}}
.link {{
stroke-opacity: 0.6;
stroke-width: 1.5px;
}}
.link-label {{
font-size: 8px;
fill: #94a3b8;
pointer-events: none;
}}
#tooltip {{
position: fixed;
background: #1e293b;
border: 1px solid #334155;
border-radius: 6px;
padding: 10px 14px;
font-size: 12px;
pointer-events: none;
max-width: 320px;
word-break: break-all;
z-index: 9999;
display: none;
}}
#tooltip strong {{ color: #f8fafc; }}
#tooltip .prop {{ color: #94a3b8; margin-top: 4px; }}
#legend {{
position: absolute;
top: 10px;
right: 10px;
background: rgba(30,41,59,0.9);
border: 1px solid #334155;
border-radius: 6px;
padding: 10px 14px;
font-size: 11px;
}}
#legend h3 {{ margin: 0 0 6px; font-size: 12px; }}
.legend-row {{ display: flex; align-items: center; gap: 6px; margin: 3px 0; }}
.legend-dot {{ width: 12px; height: 12px; border-radius: 50%; flex-shrink: 0; }}
.legend-line {{ width: 20px; height: 2px; flex-shrink: 0; }}
</style>
</head>
<body>
<div id="header">
<h1>🔗 AWS Connectivity Graph</h1>
<span class="stats" id="stat-label">Generated: {generated_at}</span>
</div>
<div id="controls">
<label>Filter service:
<select id="filter-service">
<option value="">All services</option>
</select>
</label>
<label>Link distance:
<input type="range" id="link-distance" min="40" max="300" value="120"/>
</label>
<label>Charge strength:
<input type="range" id="charge-strength" min="-800" max="-20" value="-250"/>
</label>
<span class="stats" id="visible-count"></span>
</div>
<div id="graph-container">
<svg id="graph-svg"></svg>
<div id="tooltip"></div>
<div id="legend">
<h3>Node types</h3>
{legend_nodes_html}
<h3 style="margin-top:8px">Edge types</h3>
{legend_edges_html}
</div>
</div>
<script>
const RAW_NODES = {nodes_json};
const RAW_EDGES = {edges_json};
const NODE_COLOURS = {node_colours_json};
const EDGE_COLOURS = {edge_colours_json};
// helpers
function nodeColour(d) {{
return NODE_COLOURS[d.type] || NODE_COLOURS["default"];
}}
function edgeColour(d) {{
return EDGE_COLOURS[d.edge_type] || "#94a3b8";
}}
function nodeRadius(d) {{
const base = {{
lambda_function: 9, ec2_instance: 10, vpc: 14, subnet: 8,
security_group: 7, rds_instance: 11, load_balancer: 12,
s3_bucket: 9, iam_role: 9
}};
return base[d.type] || 8;
}}
// filter controls
const services = [...new Set(RAW_NODES.map(n => n.service))].sort();
const sel = document.getElementById("filter-service");
services.forEach(s => {{
const o = document.createElement("option");
o.value = s; o.textContent = s;
sel.appendChild(o);
}});
// D3 setup
const svg = d3.select("#graph-svg");
const container = svg.append("g");
// zoom
svg.call(
d3.zoom().scaleExtent([0.05, 8])
.on("zoom", e => container.attr("transform", e.transform))
);
// arrowhead marker
const defs = svg.append("defs");
defs.append("marker")
.attr("id", "arrow")
.attr("viewBox", "0 -5 10 10")
.attr("refX", 20).attr("refY", 0)
.attr("markerWidth", 6).attr("markerHeight", 6)
.attr("orient", "auto")
.append("path")
.attr("d", "M0,-5L10,0L0,5")
.attr("fill", "#94a3b8");
// tooltip
const tooltip = document.getElementById("tooltip");
// simulation
let simulation, linkSel, nodeSel, labelSel;
function buildGraph(nodeFilter) {{
// Determine which nodes to show
const visibleNodes = nodeFilter
? RAW_NODES.filter(n => n.service === nodeFilter)
: RAW_NODES;
const visibleIds = new Set(visibleNodes.map(n => n.id));
// Only show edges where BOTH endpoints are visible
const visibleEdges = RAW_EDGES.filter(
e => visibleIds.has(e.source_id) && visibleIds.has(e.target_id)
);
document.getElementById("visible-count").textContent =
`Showing ${{visibleNodes.length}} nodes · ${{visibleEdges.length}} edges`;
container.selectAll("*").remove();
if (simulation) simulation.stop();
const nodes = visibleNodes.map(n => ({{ ...n }}));
const nodeIndex = Object.fromEntries(nodes.map(n => [n.id, n]));
const links = visibleEdges.map(e => ({{
...e,
source: nodeIndex[e.source_id] || e.source_id,
target: nodeIndex[e.target_id] || e.target_id,
}}));
const dist = +document.getElementById("link-distance").value;
const charge = +document.getElementById("charge-strength").value;
simulation = d3.forceSimulation(nodes)
.force("link", d3.forceLink(links).id(d => d.id).distance(dist))
.force("charge", d3.forceManyBody().strength(charge))
.force("center", d3.forceCenter(
document.getElementById("graph-container").clientWidth / 2,
document.getElementById("graph-container").clientHeight / 2
))
.force("collision", d3.forceCollide().radius(d => nodeRadius(d) + 6));
// Edges
linkSel = container.append("g").attr("class", "links")
.selectAll("line")
.data(links)
.join("line")
.attr("class", "link")
.attr("stroke", edgeColour)
.attr("marker-end", "url(#arrow)");
// Edge labels
labelSel = container.append("g").attr("class", "link-labels")
.selectAll("text")
.data(links)
.join("text")
.attr("class", "link-label")
.text(d => d.label || "");
// Nodes
nodeSel = container.append("g").attr("class", "nodes")
.selectAll("g")
.data(nodes)
.join("g")
.attr("class", "node")
.call(
d3.drag()
.on("start", (event, d) => {{
if (!event.active) simulation.alphaTarget(0.3).restart();
d.fx = d.x; d.fy = d.y;
}})
.on("drag", (event, d) => {{ d.fx = event.x; d.fy = event.y; }})
.on("end", (event, d) => {{
if (!event.active) simulation.alphaTarget(0);
d.fx = null; d.fy = null;
}})
)
.on("mouseover", (event, d) => {{
const props = Object.entries(d.properties || {{}})
.map(([k, v]) => `<div class="prop"><b>${{k}}</b>: ${{v}}</div>`)
.join("");
tooltip.innerHTML = `
<strong>${{d.name}}</strong>
<div class="prop"><b>type</b>: ${{d.type}}</div>
<div class="prop"><b>service</b>: ${{d.service}}</div>
<div class="prop"><b>region</b>: ${{d.region}}</div>
<div class="prop"><b>account</b>: ${{d.account_id}}</div>
<div class="prop" style="word-break:break-all"><b>arn</b>: ${{d.id}}</div>
${{props}}
`;
tooltip.style.display = "block";
tooltip.style.left = (event.clientX + 12) + "px";
tooltip.style.top = (event.clientY - 10) + "px";
}})
.on("mousemove", event => {{
tooltip.style.left = (event.clientX + 12) + "px";
tooltip.style.top = (event.clientY - 10) + "px";
}})
.on("mouseout", () => {{ tooltip.style.display = "none"; }});
nodeSel.append("circle")
.attr("r", nodeRadius)
.attr("fill", nodeColour);
nodeSel.append("text")
.attr("dx", d => nodeRadius(d) + 3)
.attr("dy", "0.35em")
        .text(d => d.name.length > 24 ? d.name.slice(0, 22) + "…" : d.name);
simulation.on("tick", () => {{
linkSel
.attr("x1", d => d.source.x)
.attr("y1", d => d.source.y)
.attr("x2", d => d.target.x)
.attr("y2", d => d.target.y);
labelSel
.attr("x", d => (d.source.x + d.target.x) / 2)
.attr("y", d => (d.source.y + d.target.y) / 2);
nodeSel.attr("transform", d => `translate(${{d.x}},${{d.y}})`);
}});
}}
// Initial render
buildGraph(null);
// Filter change
sel.addEventListener("change", () => buildGraph(sel.value || null));
// Simulation control sliders restart on change
document.getElementById("link-distance").addEventListener("input", () => buildGraph(sel.value || null));
document.getElementById("charge-strength").addEventListener("input", () => buildGraph(sel.value || null));
</script>
</body>
</html>
"""
def _build_legend_html(colours: dict, shape: str) -> str:
rows = []
for key, colour in sorted(colours.items()):
if shape == "dot":
rows.append(
f'<div class="legend-row">'
f'<div class="legend-dot" style="background:{colour}"></div>'
f"<span>{key}</span></div>"
)
else:
rows.append(
f'<div class="legend-row">'
f'<div class="legend-line" style="background:{colour}"></div>'
f"<span>{key}</span></div>"
)
return "\n".join(rows)
def write_html(graph: ConnectivityGraph, file_path: str) -> None:
"""Render the graph as a self-contained interactive HTML page."""
try:
os.makedirs(os.path.dirname(file_path), exist_ok=True)
nodes_json = json.dumps(
[
{
"id": n.id,
"type": n.type,
"name": n.name,
"service": n.service,
"region": n.region,
"account_id": n.account_id,
"properties": n.properties,
}
for n in graph.nodes
],
indent=None,
default=str,
)
edges_json = json.dumps(
[
{
"source_id": e.source_id,
"target_id": e.target_id,
"edge_type": e.edge_type,
"label": e.label or "",
}
for e in graph.edges
],
indent=None,
default=str,
)
html = _HTML_TEMPLATE.format(
generated_at=datetime.utcnow().strftime("%Y-%m-%d %H:%M UTC"),
nodes_json=nodes_json,
edges_json=edges_json,
node_colours_json=json.dumps(_NODE_COLOURS),
edge_colours_json=json.dumps(_EDGE_COLOURS),
legend_nodes_html=_build_legend_html(_NODE_COLOURS, "dot"),
legend_edges_html=_build_legend_html(_EDGE_COLOURS, "line"),
)
with open(file_path, "w", encoding="utf-8") as fh:
fh.write(html)
logger.info(f"Inventory graph HTML written to {file_path}")
except Exception as e:
logger.error(
f"inventory_output.write_html: {e.__class__.__name__}[{e.__traceback__.tb_lineno}]: {e}"
)
# ---------------------------------------------------------------------------
# Convenience entry-point called from __main__.py
# ---------------------------------------------------------------------------
def generate_inventory_outputs(output_path: str) -> None:
"""
Build the connectivity graph from currently-loaded service clients and write
both JSON and HTML outputs.
Args:
output_path: base file path WITHOUT extension, e.g.
"output/prowler-output-20240101120000".
The function appends .inventory.json and .inventory.html.
"""
from lib.graph_builder import build_graph
graph = build_graph()
if not graph.nodes:
logger.warning(
"Inventory graph: no nodes discovered. "
"Make sure at least one AWS service was scanned before generating the inventory."
)
write_json(graph, f"{output_path}.inventory.json")
write_html(graph, f"{output_path}.inventory.html")
+71
View File
@@ -0,0 +1,71 @@
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional
@dataclass
class ResourceNode:
"""
Represents a single AWS resource as a node in the connectivity graph.
id : globally unique identifier (always the resource ARN)
type : coarse resource type used for grouping/colour, e.g. "lambda_function"
name : human-readable label shown on the graph
service : AWS service name, e.g. "lambda", "ec2", "rds"
region : AWS region the resource lives in
account_id: AWS account ID
properties: additional resource-specific metadata (runtime, vpc_id, etc.)
"""
id: str
type: str
name: str
service: str
region: str
account_id: str
properties: Dict[str, Any] = field(default_factory=dict)
@dataclass
class ResourceEdge:
"""
Represents a directional relationship between two resource nodes.
source_id : ARN of the source node
target_id : ARN of the target node
edge_type : semantic type of the relationship, e.g.:
            "network"    resources share a network path (VPC/subnet/SG)
            "iam"        IAM trust or permission relationship
            "triggers"   one resource can invoke another (event source -> Lambda)
            "data_flow"  data is written/read (Lambda -> SQS dead-letter queue)
            "depends_on" soft dependency (Lambda layer, subnet belongs to VPC)
            "routes_to"  traffic routing (LB -> target)
            "encrypts"   KMS key encrypts the resource
label : optional short label rendered on the edge in the HTML graph
"""
source_id: str
target_id: str
edge_type: str
label: Optional[str] = None
@dataclass
class ConnectivityGraph:
"""
Container for the full inventory connectivity graph.
nodes: all discovered resource nodes
edges: all discovered edges between nodes
"""
nodes: List[ResourceNode] = field(default_factory=list)
edges: List[ResourceEdge] = field(default_factory=list)
def add_node(self, node: ResourceNode) -> None:
self.nodes.append(node)
def add_edge(self, edge: ResourceEdge) -> None:
self.edges.append(edge)
def node_ids(self) -> set:
return {n.id for n in self.nodes}
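A minimal sketch of how these models compose (the dataclasses are re-declared here so the snippet runs standalone; the ARNs and property values are invented examples, not real scan output):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class ResourceNode:
    id: str
    type: str
    name: str
    service: str
    region: str
    account_id: str
    properties: Dict[str, Any] = field(default_factory=dict)

@dataclass
class ResourceEdge:
    source_id: str
    target_id: str
    edge_type: str
    label: Optional[str] = None

@dataclass
class ConnectivityGraph:
    nodes: List[ResourceNode] = field(default_factory=list)
    edges: List[ResourceEdge] = field(default_factory=list)

    def add_node(self, node: ResourceNode) -> None:
        self.nodes.append(node)

    def add_edge(self, edge: ResourceEdge) -> None:
        self.edges.append(edge)

    def node_ids(self) -> set:
        return {n.id for n in self.nodes}

graph = ConnectivityGraph()
graph.add_node(ResourceNode(
    id="arn:aws:lambda:eu-west-1:123456789012:function:ingest",
    type="lambda_function", name="ingest", service="lambda",
    region="eu-west-1", account_id="123456789012",
    properties={"runtime": "python3.12"},
))
graph.add_node(ResourceNode(
    id="arn:aws:sqs:eu-west-1:123456789012:ingest-dlq",
    type="sqs_queue", name="ingest-dlq", service="sqs",
    region="eu-west-1", account_id="123456789012",
))
graph.add_edge(ResourceEdge(
    source_id="arn:aws:lambda:eu-west-1:123456789012:function:ingest",
    target_id="arn:aws:sqs:eu-west-1:123456789012:ingest-dlq",
    edge_type="data_flow", label="dead-letter",
))

# Every edge endpoint should be a known node id before rendering
edge = graph.edges[0]
assert {edge.source_id, edge.target_id} <= graph.node_ids()
```

The `node_ids()` helper makes that endpoint check cheap, which is useful before handing the graph to `write_json`/`write_html`.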
+1 -1
View File
@@ -73,7 +73,7 @@ secrets:
  DJANGO_SECRETS_ENCRYPTION_KEY:
  DJANGO_BROKER_VISIBILITY_TIMEOUT: 86400
- releaseConfigRoot: /home/prowler/.cache/pypoetry/virtualenvs/prowler-api-NnJNioq7-py3.12/lib/python3.12/site-packages/
+ releaseConfigRoot: /home/prowler/.venv/lib/python3.12/site-packages/
  releaseConfigPath: prowler/config/config.yaml
  mainConfig:
+7 -1
View File
@@ -48,10 +48,16 @@ services:
      - path: .env
        required: false
    ports:
-     - ${UI_PORT:-3000}:${UI_PORT:-3000}
+     - ${UI_PORT:-3000}:3000
    depends_on:
      mcp-server:
        condition: service_healthy
+   healthcheck:
+     test: ["CMD-SHELL", "wget -q -O /dev/null http://127.0.0.1:3000/api/health || exit 1"]
+     interval: 10s
+     timeout: 5s
+     retries: 3
+     start_period: 60s
  postgres:
    image: postgres:16.3-alpine3.20
+1 -1
View File
@@ -467,7 +467,7 @@ Effective headers and section titles enhance document readability and structure,
  * **Example:**
      * How to Clone and Install Prowler from GitHub (header: Title case)
-     * How to install poetry dependencies (subheading: Sentence case)
+     * How to install uv dependencies (subheading: Sentence case)
  5. **Using Keywords in Headers**
  Headers should include relevant keywords to improve document searchability:
  * **Good:** Scanning AWS Accounts in Parallel
+2 -2
View File
@@ -10,10 +10,10 @@ This repository contains the Prowler Open Source documentation powered by [Mintl
  ## Local Development
- Install the [Mintlify CLI](https://www.npmjs.com/package/mint) to preview documentation changes locally:
+ Install a reviewed version of the [Mintlify CLI](https://www.npmjs.com/package/mint) to preview documentation changes locally:
  ```bash
- npm i -g mint
+ npm install --global mint@4.2.560
  ```
  Run the following command at the root of your documentation (where `mint.json` is located):
+2 -2
View File
@@ -20,8 +20,8 @@ The most common high level steps to create a new check are:
  3. Create a check-specific folder. The path should follow this pattern: `prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>`. Adhere to the [Naming Format for Checks](#naming-format-for-checks).
  4. Populate the folder with files as specified in [File Creation](#file-creation).
  5. Run the check locally to ensure it works as expected. You can use the CLI in the following way:
-     - To ensure the check has been detected by Prowler: `poetry run python prowler-cli.py <provider> --list-checks | grep <check_name>`.
-     - To run the check, to find possible issues: `poetry run python prowler-cli.py <provider> --log-level ERROR --verbose --check <check_name>`.
+     - To ensure the check has been detected by Prowler: `uv run python prowler-cli.py <provider> --list-checks | grep <check_name>`.
+     - To run the check and find possible issues: `uv run python prowler-cli.py <provider> --log-level ERROR --verbose --check <check_name>`.
  6. Create comprehensive tests for the check that cover multiple scenarios including both PASS (compliant) and FAIL (non-compliant) cases. For detailed information about test structure and implementation guidelines, refer to the [Testing](/developer-guide/unit-testing) documentation.
  7. If the check and its corresponding tests are working as expected, you can submit a PR to Prowler.
+1 -1
View File
@@ -28,7 +28,7 @@ This includes the [AGENTS.md](https://github.com/prowler-cloud/prowler/blob/mast
  <Steps>
  <Step title="Install Mintlify CLI">
  ```bash
- npm i -g mint
+ npm install --global mint@4.2.560
  ```
  For detailed instructions, check the [Mintlify documentation](https://www.mintlify.com/docs/installation).
  </Step>
+9 -16
View File
@@ -80,7 +80,7 @@ Before proceeding, ensure the following:
  - Git is installed.
  - Python 3.10 or higher is installed.
- - `poetry` is installed to manage dependencies.
+ - `uv` is installed to manage dependencies.
  ### Forking the Prowler Repository
@@ -97,28 +97,21 @@ cd prowler
  ### Dependency Management and Environment Isolation
- To prevent conflicts between environments, we recommend using `poetry`, a Python dependency management solution. Install it by following the [instructions](https://python-poetry.org/docs/#installation).
+ To prevent conflicts between environments, we recommend using [`uv`](https://docs.astral.sh/uv/), a fast Python package and project manager. Install it by following the [official instructions](https://docs.astral.sh/uv/getting-started/installation/).
  ### Installing Dependencies
  To install all required dependencies, including those needed for development, run:
  ```
- poetry install --with dev
- eval $(poetry env activate)
+ uv sync
+ source .venv/bin/activate
  ```
- <Warning>
- Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
- If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
- In case you have any doubts, consult the [Poetry environment activation guide](https://python-poetry.org/docs/managing-environments/#activating-the-environment).
- </Warning>
  ### Pre-Commit Hooks
- This repository uses Git pre-commit hooks managed by the [prek](https://prek.j178.dev/) tool, it is installed with `poetry install --with dev`. Next, run the following command in the root of this repository:
+ This repository uses Git pre-commit hooks managed by the [prek](https://prek.j178.dev/) tool, which is installed with `uv sync`. Next, run the following command in the root of this repository:
  ```shell
  prek install
@@ -155,7 +148,7 @@ Once installed, TruffleHog runs before each push and blocks the operation when v
  Before merging pull requests, several automated checks and utilities ensure code security and updated dependencies:
  <Note>
- These should have been already installed if `poetry install --with dev` was already run.
+ These should have been already installed if `uv sync` was already run.
  </Note>
  - [`bandit`](https://pypi.org/project/bandit/) for code security review.
@@ -183,7 +176,7 @@ These resources help ensure that AI-assisted contributions maintain consistency
  All dependencies are listed in the `pyproject.toml` file.
- The SDK keeps direct dependencies pinned to exact versions, while `poetry.lock` records the full resolved dependency tree and the artifact hashes for every package. Use `poetry install` from the lock file instead of ad-hoc `pip` installs when you need a reproducible environment.
+ The SDK keeps direct dependencies pinned to exact versions, while `uv.lock` records the full resolved dependency tree and the artifact hashes for every package. Use `uv sync` from the lock file instead of ad-hoc `pip` installs when you need a reproducible environment.
  For proper code documentation, refer to the following and follow the code documentation practices presented there: [Google Python Style Guide - Comments and Docstrings](https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings).
@@ -209,8 +202,8 @@ prowler/
  ├── contrib/ # Community-contributed scripts or modules
  ├── kubernetes/ # Kubernetes deployment files
  ├── .github/ # GitHub-related files (workflows, issue templates, etc.)
- ├── pyproject.toml # Python project configuration (Poetry)
- ├── poetry.lock # Poetry lock file
+ ├── pyproject.toml # Python project configuration (uv)
+ ├── uv.lock # uv lock file
  ├── README.md # Project overview and getting started
  ├── Makefile # Common development commands
  ├── Dockerfile # SDK Docker container
+6 -4
View File
@@ -1277,10 +1277,12 @@ Dependencies ensure that your provider's required libraries are available when P
  **File:** `pyproject.toml`
  ```toml
- [tool.poetry.dependencies]
- python = ">=3.10,<3.13"
- # ... other dependencies
- your-sdk-library = "^1.0.0" # Add your SDK dependency
+ [project]
+ requires-python = ">=3.10,<3.13"
+ dependencies = [
+     # ... other dependencies
+     "your-sdk-library>=1.0.0,<2.0.0", # Add your SDK dependency
+ ]
  ```
  #### Step 18: Create Tests
@@ -2,47 +2,378 @@
title: 'Creating a New Security Compliance Framework in Prowler'
---
This guide explains how to add a new security compliance framework to Prowler, end to end. It covers directory layout, the JSON schema, check mapping conventions, the Pydantic models that validate each framework, the CSV output formatter, local validation, testing, and the pull request process.
## Introduction
A compliance framework in Prowler maps a public or custom control catalog (for example CIS, NIST 800-53, PCI DSS, HIPAA, ENS, CCC) to the security checks that Prowler already runs. Each requirement links to zero, one, or more Prowler checks. When a scan executes, findings are aggregated per requirement to produce the compliance report rendered by Prowler CLI and Prowler Cloud.
Prowler ships with 85+ compliance frameworks across all providers. The catalog lives under `prowler/compliance/<provider>/` (or `prowler/compliance/` for universal compliance frameworks).
<Warning>
A compliance framework must represent the **complete state** of the source catalog. Every requirement defined by the framework has to be present in the JSON file, even when none of the existing Prowler checks can automate it. In that case, leave `Checks` as an empty array, but do not omit the requirement.
Requirement coverage feeds the compliance percentage calculations and the metadata surfaces (dashboards, widgets, exports). Missing requirements skew those metrics and break the report as a faithful snapshot of the framework.
</Warning>
### Prerequisites
Before adding a new framework, complete the following checks:
- **Verify the framework is not already supported.** Inspect `prowler/compliance/<provider>/` for an existing JSON file matching the name and version.
- **Confirm the required checks exist.** Every requirement that can be automated must point to one or more existing Prowler checks. For each missing check, implement it first by following the [Prowler Checks](/developer-guide/checks) guide.
- **Review a reference framework.** Use an existing framework with a similar structure as your template. `cis_2.0_aws.json` is the canonical reference for CIS-style frameworks. `ccc_aws.json`, `ens_rd2022_aws.json`, and `nist_800_53_revision_5_aws.json` illustrate other attribute shapes.
## Four-Layer Architecture
A compliance framework spans four layers. A complete contribution must touch each layer that applies.
- **Layer 1 - Schema validation:** The Pydantic models in `prowler/lib/check/compliance_models.py` define the canonical schema for each attribute shape (CIS, ENS, Mitre, CCC, C5, CSA CCM, ISO 27001, KISA ISMS-P, AWS Well-Architected, Prowler ThreatScore, and a generic fallback).
- **Layer 2 - JSON catalog:** The framework JSON file in `prowler/compliance/<provider>/` lists every requirement and maps it to checks.
- **Layer 3 - Output formatter:** The Python module in `prowler/lib/outputs/compliance/<framework>/` builds the CSV row model, the per-provider transformer, and the CLI summary table.
- **Layer 4 - Output dispatchers:** The dispatchers in `prowler/lib/outputs/compliance/compliance.py` and `prowler/lib/outputs/compliance/compliance_output.py` route findings to the right formatter based on the framework identifier.
The rest of this guide walks through each layer in order.
## Directory Structure and File Naming
Compliance frameworks live at:
```
prowler/compliance/<provider>/<framework>_<version>_<provider>.json
```
The filename conventions are:
- All lowercase, words separated with underscores.
- `<provider>` is a supported provider identifier: `aws`, `azure`, `gcp`, `kubernetes`, `m365`, `github`, `googleworkspace`, `alibabacloud`, `oraclecloud`, `cloudflare`, `mongodbatlas`, `nhn`, `openstack`, `iac`, `llm`.
- `<version>` is optional. Omit it when the framework has no versioning, as in `ccc_aws.json`.
- The file basename (without `.json`) is the framework key that Prowler CLI accepts via `--compliance`.
Examples:
- `prowler/compliance/aws/cis_2.0_aws.json`
- `prowler/compliance/aws/nist_800_53_revision_5_aws.json`
- `prowler/compliance/azure/ens_rd2022_azure.json`
- `prowler/compliance/kubernetes/cis_1.10_kubernetes.json`
- `prowler/compliance/aws/ccc_aws.json`
The output formatter directory mirrors the framework name:
```
prowler/lib/outputs/compliance/<framework>/
├── <framework>.py # CLI summary-table dispatcher
├── <framework>_<provider>.py # Per-provider transformer class
├── models.py # Pydantic CSV row model
└── __init__.py
```
## JSON Schema Reference
Every compliance file is a JSON document with the following top-level keys.
| Field | Type | Required | Description |
|---|---|---|---|
| `Framework` | string | Yes | Canonical framework identifier, for example `CIS`, `NIST-800-53-Revision-5`, `ENS`, `CCC`. |
| `Name` | string | Yes | Human-readable framework name displayed by Prowler App. |
| `Version` | string | Yes | Framework version, for example `2.0`. Frameworks without native versioning still need a non-empty value; see [Version Handling](#version-handling). |
| `Provider` | string | Yes | Upper-cased provider identifier: `AWS`, `AZURE`, `GCP`, `KUBERNETES`, `M365`, `GITHUB`, `GOOGLEWORKSPACE`, and so on. |
| `Description` | string | Yes | Short description of the framework's scope and purpose. |
| `Requirements` | array | Yes | List of [requirement objects](#requirement-object). |
### Requirement Object
Each entry in `Requirements` describes one control or requirement.
| Field | Type | Required | Description |
|---|---|---|---|
| `Id` | string | Yes | Unique identifier within the framework, for example `1.10` or `CCC.Core.CN01.AR01`. |
| `Name` | string | No | Optional human-readable name used by frameworks that distinguish control name from description, such as NIST. |
| `Description` | string | Yes | Verbatim description from the source framework. |
| `Attributes` | array | Yes | List of [attribute objects](#attribute-objects). The shape depends on the framework. |
| `Checks` | array of strings | Yes | Prowler check identifiers that automate the requirement. Leave the list empty when the control cannot be automated. |
### Attribute Objects
Attributes carry the metadata that Prowler App and the CSV output display for each requirement. The object shape is framework-specific and is validated by a dedicated Pydantic model in `prowler/lib/check/compliance_models.py`. The most common shapes are summarized below.
#### CIS_Requirement_Attribute
Used by every CIS benchmark.
| Field | Type | Required | Notes |
|---|---|---|---|
| `Section` | string | Yes | Top-level section, for example `1 Identity and Access Management`. |
| `SubSection` | string | No | Optional second-level grouping. |
| `Profile` | enum | Yes | One of `Level 1`, `Level 2`, `E3 Level 1`, `E3 Level 2`, `E5 Level 1`, `E5 Level 2`. |
| `AssessmentStatus` | enum | Yes | `Manual` or `Automated`. |
| `Description` | string | Yes | Control description. |
| `RationaleStatement` | string | Yes | Reason the control exists. |
| `ImpactStatement` | string | Yes | Impact of non-compliance. |
| `RemediationProcedure` | string | Yes | Remediation steps. |
| `AuditProcedure` | string | Yes | Audit steps. |
| `AdditionalInformation` | string | Yes | Free-form notes. |
| `DefaultValue` | string | No | Default configuration value, when relevant. |
| `References` | string | Yes | Colon-separated list of reference URLs. |
#### ENS_Requirement_Attribute
Used by the Spanish ENS (Esquema Nacional de Seguridad) frameworks.
| Field | Type | Required | Notes |
|---|---|---|---|
| `IdGrupoControl` | string | Yes | Control group identifier. |
| `Marco` | string | Yes | Framework block (`operacional`, `organizativo`, `proteccion`). |
| `Categoria` | string | Yes | Control category. |
| `DescripcionControl` | string | Yes | Control description in Spanish. |
| `Tipo` | enum | Yes | `refuerzo`, `requisito`, `recomendacion`, `medida`. |
| `Nivel` | enum | Yes | `opcional`, `bajo`, `medio`, `alto`. |
| `Dimensiones` | array of enum | Yes | Subset of `confidencialidad`, `integridad`, `trazabilidad`, `autenticidad`, `disponibilidad`. |
| `ModoEjecucion` | string | Yes | Execution mode (`manual`, `automático`, `híbrido`). |
| `Dependencias` | array of strings | Yes | Ids of prerequisite controls. Empty list when none. |
#### CCC_Requirement_Attribute
Used by the Common Cloud Controls Catalog.
| Field | Type | Required | Notes |
|---|---|---|---|
| `FamilyName` | string | Yes | Control family, for example `Data`. |
| `FamilyDescription` | string | Yes | Description of the family. |
| `Section` | string | Yes | Section title. |
| `SubSection` | string | Yes | Subsection title, or empty string. |
| `SubSectionObjective` | string | Yes | Stated objective for the subsection. |
| `Applicability` | array of strings | Yes | Applicability tags such as `tlp-green`, `tlp-amber`, `tlp-red`. |
| `Recommendation` | string | Yes | Implementation recommendation. |
| `SectionThreatMappings` | array of objects | Yes | Each entry has `ReferenceId` and `Identifiers`. |
| `SectionGuidelineMappings` | array of objects | Yes | Each entry has `ReferenceId` and `Identifiers`. |
#### Generic_Compliance_Requirement_Attribute
The fallback attribute model used when no framework-specific schema applies (for example NIST 800-53, PCI DSS, GDPR, HIPAA).
| Field | Type | Required | Notes |
|---|---|---|---|
| `ItemId` | string | No | Item identifier. |
| `Section` | string | No | Section name. |
| `SubSection` | string | No | Subsection name. |
| `SubGroup` | string | No | Subgroup name. |
| `Service` | string | No | Affected service, for example `aws`, `iam`. |
| `Type` | string | No | Control type. |
| `Comment` | string | No | Free-form comment. |
Additional per-framework attribute models exist for `AWS_Well_Architected_Requirement_Attribute`, `ISO27001_2013_Requirement_Attribute`, `Mitre_Requirement_Attribute_<Provider>`, `KISA_ISMSP_Requirement_Attribute`, `Prowler_ThreatScore_Requirement_Attribute`, `C5Germany_Requirement_Attribute`, and `CSA_CCM_Requirement_Attribute`. Consult `prowler/lib/check/compliance_models.py` for their full field sets.
<Note>
The `Attributes` field is a Pydantic `Union`. The generic attribute model must remain the last element of that Union, otherwise Pydantic v1 silently coerces every framework into the generic shape and your specialized fields are dropped.
</Note>
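The "first match wins" behavior behind that warning can be illustrated with a stdlib-only analogy (this is not the actual Pydantic code; the model names and the `parse` helper are invented for the example):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CISAttribute:          # specialized shape: requires Section and Profile
    Section: str
    Profile: str

@dataclass
class GenericAttribute:      # permissive fallback: everything optional
    Section: Optional[str] = None
    Profile: Optional[str] = None

def parse(payload, models):
    # Mimics left-to-right Union resolution: the first model that accepts
    # the payload wins, so later models are never tried.
    for model in models:
        try:
            return model(**payload)
        except TypeError:
            continue
    raise ValueError("no model matched")

payload = {"Section": "1 IAM", "Profile": "Level 1"}
good = parse(payload, [CISAttribute, GenericAttribute])  # specialized first
bad = parse(payload, [GenericAttribute, CISAttribute])   # generic first

assert isinstance(good, CISAttribute)
assert isinstance(bad, GenericAttribute)  # specialized model never tried
```

Because the generic model accepts everything, it must sit last in the Union; the same ordering rule applies to the real attribute models.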
## Minimal Working Example
The following snippet is a complete, valid framework file named `my_framework_1.0_aws.json`, saved at `prowler/compliance/aws/my_framework_1.0_aws.json`. It uses the generic attribute shape for simplicity.
```json title="prowler/compliance/aws/my_framework_1.0_aws.json"
{
  "Framework": "My-Framework",
  "Name": "My Framework 1.0 for AWS",
  "Version": "1.0",
  "Provider": "AWS",
  "Description": "Internal baseline for AWS accounts.",
  "Requirements": [
    {
      "Id": "MF-1.1",
      "Description": "Root account must have multi-factor authentication enabled.",
      "Attributes": [
        {
          "ItemId": "MF-1.1",
          "Section": "Identity and Access Management",
          "SubSection": "Root Account",
          "Service": "iam"
        }
      ],
      "Checks": [
        "iam_root_mfa_enabled",
        "iam_root_hardware_mfa_enabled"
      ]
    },
    {
      "Id": "MF-2.1",
      "Description": "S3 buckets must block public access at the account level.",
      "Attributes": [
        {
          "ItemId": "MF-2.1",
          "Section": "Data Protection",
          "Service": "s3"
        }
      ],
      "Checks": [
        "s3_account_level_public_access_blocks"
      ]
    }
  ]
}
```
## Mapping Checks to Requirements
Each requirement links to the Prowler checks that, together, produce a PASS or FAIL verdict for that control.
- **Include every requirement from the source catalog.** The framework file must mirror the full control list, one-to-one. Compliance percentages, dashboards, and exported metadata are computed against the total requirement count, so omitting an unmappable control inflates coverage and misrepresents the framework.
- List every check by its canonical identifier, the value of `CheckID` inside the check's `.metadata.json` file.
- One requirement can reference multiple checks. The requirement is evaluated as FAIL when any referenced check produces a FAIL finding for a resource in scope.
- Leave `Checks` as an empty array when the requirement cannot be automated. The requirement still appears in the report, contributes to the total, and resolves to `MANUAL`. An empty mapping is valid; a missing requirement is not.
- Reuse checks across requirements when the same control applies in multiple places. Do not duplicate check logic to match framework structure.
- Avoid referencing checks from a different provider. A compliance file is bound to one provider, and cross-provider checks will never match findings in the scan.
To discover available checks, run:
```bash
uv run python prowler-cli.py <provider> --list-checks
```
## Supporting Multiple Providers
Each compliance file targets a single provider. To cover several providers with the same framework (for example CIS across AWS, Azure, and GCP), ship one JSON file per provider:
```
prowler/compliance/aws/cis_2.0_aws.json
prowler/compliance/azure/cis_2.0_azure.json
prowler/compliance/gcp/cis_2.0_gcp.json
```
Keep the `Framework` and `Version` values identical across the files so the dispatcher matches them, and change only the `Provider`, `Checks`, and provider-specific metadata.
The CIS output formatter already supports every provider listed above. For a brand-new framework that spans several providers, add one transformer per provider in `prowler/lib/outputs/compliance/<framework>/` and extend the summary-table dispatcher accordingly. See [Output Formatter](#output-formatter).
## Output Formatter
Prowler renders every compliance framework in two forms: a detailed CSV report written to disk, and a summary table printed in the CLI. Both are produced by the output formatter package for the framework.
For a new framework named `my_framework`, create:
```
prowler/lib/outputs/compliance/my_framework/
├── __init__.py
├── my_framework.py # CLI summary table dispatcher
├── my_framework_aws.py # Per-provider transformer
└── models.py # CSV row Pydantic model
```
### Step 1 Define the CSV Row Model
In `models.py`, declare a Pydantic v1 model with one field per CSV column. Use existing models such as `AWSCISModel` in `prowler/lib/outputs/compliance/cis/models.py` as the reference. Fields typically include `Provider`, `Description`, `AccountId`, `Region`, `AssessmentDate`, `Requirements_Id`, `Requirements_Description`, one `Requirements_Attributes_*` field per attribute key, plus the finding fields `Status`, `StatusExtended`, `ResourceId`, `ResourceName`, `CheckId`, `Muted`, `Framework`, `Name`.
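A sketch of such a row model for the hypothetical `my_framework` (the real models are Pydantic v1 `BaseModel` subclasses; a dataclass is used here so the sketch runs without dependencies, and the attribute columns are invented examples):

```python
from dataclasses import dataclass, asdict

@dataclass
class AWSMyFrameworkModel:
    # One field per CSV column; declaration order is column order.
    Provider: str
    Description: str
    AccountId: str
    Region: str
    AssessmentDate: str
    Requirements_Id: str
    Requirements_Description: str
    Requirements_Attributes_Section: str  # one field per attribute key
    Requirements_Attributes_Service: str
    Status: str
    StatusExtended: str
    ResourceId: str
    ResourceName: str
    CheckId: str
    Muted: bool
    Framework: str
    Name: str

row = AWSMyFrameworkModel(
    Provider="aws", Description="Internal baseline", AccountId="123456789012",
    Region="eu-west-1", AssessmentDate="2026-05-14", Requirements_Id="MF-1.1",
    Requirements_Description="Root MFA enabled",
    Requirements_Attributes_Section="Identity and Access Management",
    Requirements_Attributes_Service="iam", Status="PASS",
    StatusExtended="Root MFA is enabled.", ResourceId="root",
    ResourceName="root", CheckId="iam_root_mfa_enabled", Muted=False,
    Framework="My-Framework", Name="My Framework 1.0 for AWS",
)

# Column order follows field declaration order
assert list(asdict(row))[0] == "Provider"
```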
### Step 2 Implement the Transformer Class
In `my_framework_aws.py`, subclass `ComplianceOutput` from `prowler.lib.outputs.compliance.compliance_output` and implement `transform(findings, compliance, compliance_name)`. Iterate over `findings`, match each finding to the requirements it satisfies through `finding.compliance.get(compliance_name, [])`, and append one row per attribute to `self._data`.
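The loop has roughly this shape (stand-in types only; the real class subclasses `ComplianceOutput` and iterates Prowler's own finding and requirement objects, so the `FakeFinding` and `Requirement` classes below are illustrative):

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FakeFinding:
    check_id: str
    status: str
    resource_uid: str
    compliance: Dict[str, List[str]]  # framework name -> requirement ids

@dataclass
class Requirement:
    Id: str
    Description: str
    Attributes: List[dict]

class MyFrameworkAWS:
    def __init__(self):
        self._data = []

    def transform(self, findings, requirements, compliance_name):
        by_id = {r.Id: r for r in requirements}
        for finding in findings:
            # Requirement ids this finding maps to, read the same way the
            # real code uses finding.compliance.get(compliance_name, [])
            for req_id in finding.compliance.get(compliance_name, []):
                requirement = by_id.get(req_id)
                if requirement is None:
                    continue
                # One row per attribute, as described above
                for attribute in requirement.Attributes:
                    self._data.append({
                        "Requirements_Id": requirement.Id,
                        "Requirements_Attributes_Section": attribute.get("Section", ""),
                        "Status": finding.status,
                        "ResourceId": finding.resource_uid,
                        "CheckId": finding.check_id,
                    })

reqs = [Requirement("MF-1.1", "Root MFA", [{"Section": "IAM"}])]
findings = [FakeFinding("iam_root_mfa_enabled", "PASS", "root",
                        {"my_framework_1.0_aws": ["MF-1.1"]})]
out = MyFrameworkAWS()
out.transform(findings, reqs, "my_framework_1.0_aws")
```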
### Step 3 Add the Summary-Table Dispatcher
In `my_framework.py`, implement `get_my_framework_table(findings, bulk_checks_metadata, compliance_framework, output_filename, output_directory, compliance_overview)` following the pattern in `prowler/lib/outputs/compliance/cis/cis.py`.
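At its core, the table dispatcher aggregates PASS/FAIL counts per section before rendering. A minimal sketch of that aggregation (the `rows` input is hypothetical; the real function receives findings plus compliance metadata):

```python
from collections import defaultdict

# Hypothetical rows, shaped like the transformer output
rows = [
    {"Section": "Identity and Access Management", "Status": "PASS"},
    {"Section": "Identity and Access Management", "Status": "FAIL"},
    {"Section": "Data Protection", "Status": "PASS"},
]

# Tally results per framework section
table = defaultdict(lambda: {"PASS": 0, "FAIL": 0})
for row in rows:
    table[row["Section"]][row["Status"]] += 1

for section, counts in table.items():
    print(f"{section}: {counts['PASS']} PASS / {counts['FAIL']} FAIL")
```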
### Step 4 Register the Framework in the Dispatchers
- Add the dispatcher call in `prowler/lib/outputs/compliance/compliance.py`, inside `display_compliance_table`, with a branch such as `elif "my_framework" in compliance_framework:`.
- Register the CSV model and transformer in `prowler/lib/outputs/compliance/compliance_output.py` so the CSV file is emitted during the scan.
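The new branch keys on a substring of the framework identifier, roughly like this (an isolated sketch; the real `display_compliance_table` lives in `prowler/lib/outputs/compliance/compliance.py` and calls the table functions instead of returning strings):

```python
def display_compliance_table(compliance_framework: str) -> str:
    # Existing branches select the matching formatter by substring
    if "cis_" in compliance_framework:
        return "cis-table"
    # New branch added for the hypothetical framework
    elif "my_framework" in compliance_framework:
        return "my-framework-table"
    return "generic-table"

assert display_compliance_table("my_framework_1.0_aws") == "my-framework-table"
```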
<Note>
For NIST-style catalogs that use `Generic_Compliance_Requirement_Attribute`, no custom formatter is needed. The generic formatter in `prowler/lib/outputs/compliance/generic/` handles them automatically, provided the JSON validates against the generic attribute schema.
</Note>
## Version Handling
Prowler matches frameworks by concatenating `Framework` and `Version`. A missing or empty `Version` collapses several frameworks to the same key and breaks CLI filtering with `--compliance`.
- Always set `Version` to a non-empty string, even for frameworks that rename editions rather than version them. Use the edition identifier (for example `RD2022`, `v2025.10`, `4.0`).
- When the source catalog has no version, use the first year of adoption or the release date.
- Make sure the version substring embedded in the filename matches `Version`, because the CLI dispatcher reads `compliance_framework.split("_")[1]` to select the correct version.
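A toy parser shows why an empty version breaks the lookup (`parse_compliance_key` is illustrative, not the actual dispatcher code, which also has to cope with underscores inside framework names):

```python
# Toy parser mirroring the split("_")[1] lookup described above.
def parse_compliance_key(compliance_framework: str):
    parts = compliance_framework.split("_")
    framework, version, provider = parts[0], parts[1], parts[-1]
    return framework, version, provider


# With a version token present, the pieces line up:
ok = parse_compliance_key("ens_rd2022_azure")

# With the version omitted, the provider lands in the version slot:
broken = parse_compliance_key("ens_azure")
```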
## Validating the Framework Locally
Follow the steps below before opening a pull request.
### 1. Run the Compliance Model Validator
```bash
uv run python prowler-cli.py <provider> --list-compliance
```
The framework must appear in the output. A validation error indicates a schema mismatch between the JSON file and the attribute model.
### 2. Run a Scan Filtered by the Framework
```bash
uv run python prowler-cli.py <provider> \
--compliance <framework>_<version>_<provider> \
--log-level ERROR
```
Verify that:
- Prowler produces a CSV file under `output/compliance/` with the expected name.
- The CLI summary table lists every section in the framework.
- Findings roll up under the expected requirements.
### 3. Inspect the CSV Output
Open the generated CSV and confirm:
- All columns defined in `models.py` appear.
- Every requirement has at least one row per scanned resource.
- Values such as `Requirements_Attributes_Section` reflect the JSON content.
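These checks are easy to script with the stdlib `csv` module. The inline sample below stands in for a real file under `output/compliance/`, and the `;` delimiter is assumed to match Prowler's CSV output (adjust if your build differs):

```python
import csv
import io

# Quick sanity pass over a generated compliance CSV. Replace the inline
# sample with open("output/compliance/<file>.csv") for a real run.
sample = io.StringIO(
    "Provider;Requirements_Id;Requirements_Attributes_Section;Status\n"
    "aws;1.1;Logging;PASS\n"
)
reader = csv.DictReader(sample, delimiter=";")
rows = list(reader)

# Columns expected from the model in models.py (subset shown here).
expected_columns = {
    "Provider",
    "Requirements_Id",
    "Requirements_Attributes_Section",
    "Status",
}
missing = expected_columns - set(reader.fieldnames)
```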
### 4. Verify the Framework in Prowler App
Launch Prowler App locally (`docker compose up` from the repository root) and run a scan with the new compliance framework. Confirm the compliance page renders the requirements, sections, and status widgets correctly.
## Testing
Compliance contributions require two layers of tests.
- **Schema tests** exercise the Pydantic models. Extend `tests/lib/check/universal_compliance_models_test.py` with a case that loads the new JSON file and asserts the attribute type matches the expected model.
- **Output tests** exercise the transformer. Mirror the structure under `tests/lib/outputs/compliance/<framework>/` with fixtures that feed synthetic findings through the transformer and assert the resulting CSV rows.
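The shape of such an output test can be sketched as follows. A toy `transform()` is inlined so the sketch runs standalone; real tests import the framework's output class and feed it Prowler finding fixtures:

```python
# Sketch of an output test: synthetic finding in, CSV row dicts out.
def transform(findings, compliance_name):
    return [
        {"Requirements_Id": requirement_id, "Status": finding["status"]}
        for finding in findings
        for requirement_id in finding["compliance"].get(compliance_name, [])
    ]


def test_transform_maps_finding_to_requirement():
    synthetic_finding = {
        "status": "FAIL",
        "compliance": {"my_framework_aws": ["2.3"]},
    }
    rows = transform([synthetic_finding], "my_framework_aws")
    assert rows == [{"Requirements_Id": "2.3", "Status": "FAIL"}]


test_transform_maps_finding_to_requirement()
```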
Run the suite with:
```bash
uv run pytest -n auto tests/lib/check/universal_compliance_models_test.py \
tests/lib/outputs/compliance/
```
For guidance on writing Prowler SDK tests, refer to [Unit Testing](/developer-guide/unit-testing).
## Submitting the Pull Request
Before opening the pull request:
1. Run the complete QA pipeline:
```bash
uv run pre-commit run --all-files
uv run pytest -n auto
```
2. Add a changelog entry under the `### 🚀 Added` section of `prowler/CHANGELOG.md`, describing the new framework and the providers it covers.
3. Follow the [Pull Request Template](https://github.com/prowler-cloud/prowler/blob/master/.github/pull_request_template.md) and set the PR title using Conventional Commits, for example `feat(compliance): add My Framework 1.0 for AWS`.
4. Request review from the compliance codeowners listed in `.github/CODEOWNERS`.
## Troubleshooting
The following issues are the most common when contributing a compliance framework.
- **`ValidationError: field required` during scan.** The JSON is missing a required attribute field. Re-check the matching Pydantic model in `prowler/lib/check/compliance_models.py`.
- **All attributes collapse to `Generic_Compliance_Requirement_Attribute` values.** The Pydantic `Union` is ordered incorrectly, or the JSON matches only the generic shape. Move the generic model to the last Union position and ensure every required field is present in the JSON.
- **`--compliance` filter does not find the framework.** The filename does not match the expected pattern `<framework>_<version>_<provider>.json`, the version is empty, or the file lives outside `prowler/compliance/<provider>/`.
- **CLI summary table is empty but the CSV is populated.** The dispatcher branch in `prowler/lib/outputs/compliance/compliance.py` is missing or its substring match does not catch the framework key.
- **CSV file is missing after the scan.** The transformer class is not registered in `prowler/lib/outputs/compliance/compliance_output.py`, or `transform()` raises silently. Run the scan with `--log-level DEBUG`.
- **Findings do not roll up under a requirement.** A check listed in `Checks` either does not exist for that provider or is spelled incorrectly. Run `--list-checks | grep <check_name>` to confirm.
## Reference Examples
Use the following files as templates when modeling a new contribution.
- `prowler/compliance/aws/cis_2.0_aws.json` CIS attribute shape.
- `prowler/compliance/aws/nist_800_53_revision_5_aws.json` Generic attribute shape.
- `prowler/compliance/aws/ccc_aws.json` CCC attribute shape.
- `prowler/compliance/azure/ens_rd2022_azure.json` ENS attribute shape.
- `prowler/lib/check/compliance_models.py` Canonical Pydantic schemas.
- `prowler/lib/outputs/compliance/cis/` Reference implementation of a multi-provider output formatter.
- `prowler/lib/outputs/compliance/generic/` Reference implementation of a generic output formatter.
```diff
@@ -23,7 +23,7 @@ Within this folder the following files are also to be created:
 - `<new_service_name>_service.py` Contains all the logic and API calls of the service.
 - `<new_service_name>_client_.py` Contains the initialization of the freshly created service's class so that the checks can use it.
-Once the files are create, you can check that the service has been created by running the following command: `poetry run python prowler-cli.py <provider> --list-services | grep <new_service_name>`.
+Once the files are create, you can check that the service has been created by running the following command: `uv run python prowler-cli.py <provider> --list-services | grep <new_service_name>`.
 ## Service Structure and Initialisation
```
```diff
@@ -177,7 +177,6 @@
           "pages": [
             "user-guide/cli/tutorials/misc",
             "user-guide/cli/tutorials/reporting",
-            "user-guide/cli/tutorials/compliance",
             "user-guide/cli/tutorials/dashboard",
             "user-guide/cli/tutorials/configuration_file",
             "user-guide/cli/tutorials/logging",
@@ -333,12 +332,20 @@
             "user-guide/providers/vercel/getting-started-vercel",
             "user-guide/providers/vercel/authentication"
           ]
+        },
+        {
+          "group": "Okta",
+          "pages": [
+            "user-guide/providers/okta/getting-started-okta",
+            "user-guide/providers/okta/authentication"
+          ]
         }
       ]
     },
     {
       "group": "Compliance",
       "pages": [
+        "user-guide/compliance/tutorials/compliance",
         "user-guide/compliance/tutorials/threatscore"
       ]
     },
@@ -504,6 +511,10 @@
     }
   },
   "redirects": [
+    {
+      "source": "/user-guide/cli/tutorials/compliance",
+      "destination": "/user-guide/compliance/tutorials/compliance"
+    },
     {
       "source": "/projects/prowler-open-source/en/latest/tutorials/prowler-app-lighthouse",
       "destination": "/user-guide/tutorials/prowler-app-lighthouse"
```
```diff
@@ -10,7 +10,7 @@ Complete reference guide for all tools available in the Prowler MCP Server. Tool
 |----------|------------|------------------------|
 | Prowler Hub | 10 tools | No |
 | Prowler Documentation | 2 tools | No |
-| Prowler Cloud/App | 29 tools | Yes |
+| Prowler Cloud/App | 32 tools | Yes |
 ## Tool Naming Convention
@@ -36,6 +36,14 @@ Tools for searching, viewing, and analyzing security findings across all cloud p
 - **`prowler_app_get_finding_details`** - Get comprehensive details about a specific finding including remediation guidance, check metadata, and resource relationships
 - **`prowler_app_get_findings_overview`** - Get aggregate statistics and trends about security findings as a markdown report
+### Finding Groups Management
+Tools for listing finding groups aggregated by check ID, viewing complete group counters, and drilling down into affected resources.
+- **`prowler_app_list_finding_groups`** - List latest or historical finding groups with filters for provider, region, service, resource, category, check, severity, status, muted state, delta, date range, and sorting
+- **`prowler_app_get_finding_group_details`** - Get complete details for a specific finding group including counters, description, timestamps, and impacted providers
+- **`prowler_app_list_finding_group_resources`** - List actionable unmuted resources affected by a finding group by default, including nested resource and provider data plus the `finding_id` for remediation details. Set `include_muted` to include suppressed resources
 ### Provider Management
 Tools for managing cloud provider connections in Prowler.
```
````diff
@@ -44,13 +44,21 @@ Choose the configuration based on your deployment:
 <Tab title="Generic without Native HTTP Support">
 **Configuration:**
+<Warning>
+Avoid configuring MCP clients to run `npx mcp-remote` directly. `npx` can download and execute a new package version on each run. Install a reviewed version of `mcp-remote` in a dedicated local workspace, then point the MCP client to the installed binary.
+</Warning>
+```bash
+mkdir -p ~/.local/share/prowler-mcp-bridge
+cd ~/.local/share/prowler-mcp-bridge
+npm init -y
+npm install --save-exact mcp-remote@0.1.38
+```
 ```json
 {
   "mcpServers": {
     "prowler": {
-      "command": "npx",
+      "command": "/absolute/path/to/.local/share/prowler-mcp-bridge/node_modules/.bin/mcp-remote",
       "args": [
-        "mcp-remote",
         "https://mcp.prowler.com/mcp", // or your self-hosted Prowler MCP Server URL
         "--header",
         "Authorization: Bearer ${PROWLER_APP_API_KEY}"
@@ -72,14 +80,20 @@ Choose the configuration based on your deployment:
 2. Go to "Developer" tab
 3. Click in "Edit Config" button
 4. Edit the `claude_desktop_config.json` file with your favorite editor
-5. Add the following configuration:
+5. Install a reviewed version of `mcp-remote` in a dedicated local workspace:
+```bash
+mkdir -p ~/.local/share/prowler-mcp-bridge
+cd ~/.local/share/prowler-mcp-bridge
+npm init -y
+npm install --save-exact mcp-remote@0.1.38
+```
+6. Add the following configuration:
 ```json
 {
   "mcpServers": {
     "prowler": {
-      "command": "npx",
+      "command": "/absolute/path/to/.local/share/prowler-mcp-bridge/node_modules/.bin/mcp-remote",
       "args": [
-        "mcp-remote",
         "https://mcp.prowler.com/mcp",
         "--header",
         "Authorization: Bearer ${PROWLER_APP_API_KEY}"
````
````diff
@@ -37,8 +37,8 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 _Requirements_:
 - `git` installed.
-- `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
-- `npm` installed: [npm installation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
+- `uv` installed: [uv installation](https://docs.astral.sh/uv/getting-started/installation/).
+- `pnpm` installed through [Corepack](https://pnpm.io/installation#using-corepack) or the standalone [pnpm installation](https://pnpm.io/installation).
 - `Docker Compose` installed: https://docs.docker.com/compose/install/.
 <Warning>
@@ -49,8 +49,8 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 ```bash
 git clone https://github.com/prowler-cloud/prowler \
 cd prowler/api \
-poetry install \
-eval $(poetry env activate) \
+uv sync \
+source .venv/bin/activate \
 set -a \
 source .env \
 docker compose up postgres valkey -d \
@@ -59,11 +59,6 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 gunicorn -c config/guniconf.py config.wsgi:application
 ```
-<Warning>
-Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
-If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment. In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
-</Warning>
 > Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
 _Commands to run the API Worker_:
@@ -71,8 +66,8 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 ```bash
 git clone https://github.com/prowler-cloud/prowler \
 cd prowler/api \
-poetry install \
-eval $(poetry env activate) \
+uv sync \
+source .venv/bin/activate \
 set -a \
 source .env \
 cd src/backend \
@@ -84,8 +79,8 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 ```bash
 git clone https://github.com/prowler-cloud/prowler \
 cd prowler/api \
-poetry install \
-eval $(poetry env activate) \
+uv sync \
+source .venv/bin/activate \
 set -a \
 source .env \
 cd src/backend \
@@ -97,9 +92,11 @@ Refer to the [Prowler App Tutorial](/user-guide/tutorials/prowler-app) for detai
 ```bash
 git clone https://github.com/prowler-cloud/prowler \
 cd prowler/ui \
-npm install \
-npm run build \
-npm start
+corepack enable \
+corepack install \
+pnpm install --frozen-lockfile \
+pnpm run build \
+pnpm start
 ```
 > Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
@@ -121,8 +118,8 @@ To update the environment file:
 Edit the `.env` file and change version values:
 ```env
-PROWLER_UI_VERSION="5.25.3"
-PROWLER_API_VERSION="5.25.3"
+PROWLER_UI_VERSION="5.26.1"
+PROWLER_API_VERSION="5.26.1"
 ```
 <Note>
````
````diff
@@ -68,7 +68,7 @@ To install Prowler as a Python package, use `Python >= 3.10, <= 3.12`. Prowler i
 _Requirements for Developers_:
 * `git`
-* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
+* `uv` installed: [uv installation](https://docs.astral.sh/uv/getting-started/installation/).
 * AWS, GCP, Azure and/or Kubernetes credentials
 _Commands_:
@@ -76,8 +76,8 @@ To install Prowler as a Python package, use `Python >= 3.10, <= 3.12`. Prowler i
 ```bash
 git clone https://github.com/prowler-cloud/prowler
 cd prowler
-poetry install
-poetry run python prowler-cli.py -v
+uv sync
+uv run python prowler-cli.py -v
 ```
 <Note>
````
*(Seven binary image files added; not shown.)*
```diff
@@ -47,11 +47,12 @@ Prowler supports a wide range of providers organized by category:
 | Provider | Support | Audit Scope/Entities | Interface |
 | ----------------------------------------------------------------------------------------- | -------- | ---------------------------- | ------------ |
 | [GitHub](/user-guide/providers/github/getting-started-github) | Official | Organizations / Repositories | UI, API, CLI |
-| [Google Workspace](/user-guide/providers/googleworkspace/getting-started-googleworkspace) | Official | Domains | CLI |
+| [Google Workspace](/user-guide/providers/googleworkspace/getting-started-googleworkspace) | Official | Domains | UI, API, CLI |
 | [LLM](/user-guide/providers/llm/getting-started-llm) | Official | Models | CLI |
 | [M365](/user-guide/providers/microsoft365/getting-started-m365) | Official | Tenants | UI, API, CLI |
 | [MongoDB Atlas](/user-guide/providers/mongodbatlas/getting-started-mongodbatlas) | Official | Organizations | UI, API, CLI |
-| [Vercel](/user-guide/providers/vercel/getting-started-vercel) | Official | Teams / Projects | CLI |
+| [Okta](/user-guide/providers/okta/getting-started-okta) | Official | Organizations | CLI |
+| [Vercel](/user-guide/providers/vercel/getting-started-vercel) | Official | Teams / Projects | UI, API, CLI |
 ### Kubernetes
```
````diff
@@ -1,80 +0,0 @@
----
-title: 'Compliance'
----
-Prowler allows you to execute checks based on requirements defined in compliance frameworks. By default, it will execute and give you an overview of the status of each compliance framework:
-<img src="/images/cli/compliance/compliance.png" />
-You can find CSVs containing detailed compliance results in the compliance folder within Prowler's output folder.
-## Execute Prowler based on Compliance Frameworks
-Prowler can analyze your environment based on a specific compliance framework and get more details, to do it, you can use option `--compliance`:
-```sh
-prowler <provider> --compliance <compliance_framework>
-```
-Standard results will be shown and additionally the framework information as the sample below for CIS AWS 2.0. For details a CSV file has been generated as well.
-<img src="/images/cli/compliance/compliance-cis-sample1.png" />
-<Note>
-**If Prowler can't find a resource related with a check from a compliance requirement, this requirement won't appear on the output**
-</Note>
-## List Available Compliance Frameworks
-To see which compliance frameworks are covered by Prowler, use the `--list-compliance` option:
-```sh
-prowler <provider> --list-compliance
-```
-Or you can visit [Prowler Hub](https://hub.prowler.com/compliance).
-## List Requirements of Compliance Frameworks
-To list requirements for a compliance framework, use the `--list-compliance-requirements` option:
-```sh
-prowler <provider> --list-compliance-requirements <compliance_framework(s)>
-```
-Example for the first requirements of CIS 1.5 for AWS:
-```
-Listing CIS 1.5 AWS Compliance Requirements:
-Requirement Id: 1.1
-	- Description: Maintain current contact details
-	- Checks:
-		account_maintain_current_contact_details
-Requirement Id: 1.2
-	- Description: Ensure security contact information is registered
-	- Checks:
-		account_security_contact_information_is_registered
-Requirement Id: 1.3
-	- Description: Ensure security questions are registered in the AWS account
-	- Checks:
-		account_security_questions_are_registered_in_the_aws_account
-Requirement Id: 1.4
-	- Description: Ensure no 'root' user account access key exists
-	- Checks:
-		iam_no_root_access_key
-Requirement Id: 1.5
-	- Description: Ensure MFA is enabled for the 'root' user account
-	- Checks:
-		iam_root_mfa_enabled
-[redacted]
-```
-## Create and contribute adding other Security Frameworks
-This information is part of the Developer Guide and can be found [here](/developer-guide/security-compliance-framework).
````
```diff
@@ -56,6 +56,7 @@ The following list includes all the AWS checks with configurable variables that
 | `elb_is_in_multiple_az` | `elb_min_azs` | Integer |
 | `elbv2_is_in_multiple_az` | `elbv2_min_azs` | Integer |
 | `guardduty_is_enabled` | `mute_non_default_regions` | Boolean |
+| `iam_user_access_not_stale_to_sagemaker` | `max_unused_sagemaker_access_days` | Integer |
 | `iam_user_accesskey_unused` | `max_unused_access_keys_days` | Integer |
 | `iam_user_console_access_unused` | `max_console_access_days` | Integer |
 | `organizations_delegated_administrators` | `organizations_trusted_delegated_administrators` | List of Strings |
@@ -157,6 +158,15 @@ The following list includes all the Vercel checks with configurable variables th
 | `team_member_role_least_privilege` | `max_owners` | Integer |
 | `team_no_stale_invitations` | `stale_invitation_threshold_days` | Integer |
+## Okta
+### Configurable Checks
+The following list includes all the Okta checks with configurable variables that can be changed in the configuration YAML file:
+| Check Name | Value | Type |
+|---------------------------------------------------------------|------------------------------------|---------|
+| `signon_global_session_idle_timeout_15min` | `okta_max_session_idle_minutes` | Integer |
 ## Config YAML File Structure
 <Note>
@@ -186,6 +196,8 @@ aws:
     max_unused_access_keys_days: 45
     # aws.iam_user_console_access_unused --> CIS recommends 45 days
     max_console_access_days: 45
+    # aws.iam_user_access_not_stale_to_sagemaker --> default 90 days
+    max_unused_sagemaker_access_days: 90
     # AWS EC2 Configuration
     # aws.ec2_elastic_ip_shodan
```
```diff
@@ -18,9 +18,11 @@ prowler <provider> --scan-unused-services
 #### ACM (AWS Certificate Manager)
-Certificates stored in ACM without active usage in AWS resources are excluded. By default, Prowler only scans actively used certificates. Unused certificates will not be checked if they are expired, if their expiring date is near or if they are good.
+Certificates stored in ACM without active usage in AWS resources are excluded. By default, Prowler only scans actively used certificates. Unused certificates are not evaluated for expiration, transparency logging, or weak key algorithms.
 - `acm_certificates_expiration_check`
+- `acm_certificates_transparency_logs_enabled`
+- `acm_certificates_with_secure_key_algorithms`
 #### Athena
@@ -28,6 +30,13 @@ Upon AWS account creation, Athena provisions a default primary workgroup for the
 - `athena_workgroup_encryption`
 - `athena_workgroup_enforce_configuration`
+- `athena_workgroup_logging_enabled`
+#### Amazon Bedrock
+Generative AI workloads benefit from private VPC endpoint connectivity to keep prompt and model traffic off the public internet. Prowler only evaluates this configuration for VPCs in use (with active ENIs).
+- `bedrock_vpc_endpoints_configured`
 #### AWS CloudTrail
@@ -38,15 +47,23 @@ AWS CloudTrail should have at least one trail with a data event to record all S3
 #### AWS Elastic Compute Cloud (EC2)
-If Amazon Elastic Block Store (EBS) default encyption is not enabled, sensitive data at rest will remain unprotected in EC2. However, Prowler will only generate a finding if EBS volumes exist where default encryption could be enforced.
+If Amazon Elastic Block Store (EBS) default encryption is not enabled, sensitive data at rest remains unprotected in EC2. Prowler only generates a finding if EBS volumes exist where default encryption could be enforced.
 - `ec2_ebs_default_encryption`
+**EBS Snapshot Public Access**: Public EBS snapshots can leak data. Prowler only evaluates the account-level block setting if EBS snapshots exist in the account.
+- `ec2_ebs_snapshot_account_block_public_access`
+**EC2 Instance Metadata Service (IMDS)**: Enforcing IMDSv2 at the account level mitigates SSRF-based credential theft. Prowler only evaluates the account-level setting if EC2 instances exist in the account.
+- `ec2_instance_account_imdsv2_enabled`
 **Security Groups**: Misconfigured security groups increase the attack surface.
 Prowler scans only attached security groups to report vulnerabilities in actively used configurations. Applies to:
-- 15 security group-related checks, including open ports and ingress/egress traffic rules.
+- 20 security group-related checks, including open ports and ingress/egress traffic rules.
 - `ec2_securitygroup_allow_ingress_from_internet_to_port_X`
 - `ec2_securitygroup_default_restrict_traffic`
@@ -56,6 +73,18 @@ Prowler scans only attached security groups to report vulnerabilities in activel
 - `ec2_networkacl_allow_ingress_X_port`
+#### AWS Identity and Access Management (IAM)
+Customer-managed IAM policies that are not attached to any user, group, or role grant no effective permissions until a principal is bound to them. Prowler treats such policies as dormant by default and skips the content-evaluation checks below when `--scan-unused-services` is not set. Enable the flag to surface findings on unattached policies as well.
+- `iam_policy_allows_privilege_escalation`
+- `iam_policy_no_full_access_to_cloudtrail`
+- `iam_policy_no_full_access_to_kms`
+- `iam_policy_no_wildcard_marketplace_subscribe`
+- `iam_no_custom_policy_permissive_role_assumption`
+The dedicated `iam_customer_unattached_policy_no_administrative_privileges` check still inspects unattached policies regardless of the flag, since its purpose is to highlight dormant administrator privileges.
 #### AWS Glue
 AWS Glue best practices recommend encrypting metadata and connection passwords in Data Catalogs.
@@ -71,6 +100,12 @@ Amazon Inspector is a vulnerability discovery service that automates continuous
 - `inspector2_is_enabled`
+#### AWS Key Management Service (KMS)
+Customer managed Customer Master Keys (CMKs) in the `Disabled` state cannot be used for cryptographic operations, so Prowler skips the unintentional-deletion check on them by default. Enable the flag to evaluate disabled CMKs as well.
+- `kms_cmk_not_deleted_unintentionally`
 #### Amazon Macie
 Amazon Macie leverages machine learning to automatically discover, classify, and protect sensitive data in S3 buckets. Prowler only generates findings if Macie is disabled and there are S3 buckets in the AWS account.
@@ -83,6 +118,15 @@ A network firewall is essential for monitoring and controlling traffic within a
 - `networkfirewall_in_all_vpc`
+#### Amazon Relational Database Service (RDS)
+RDS event subscriptions notify operators of critical database events. Prowler only evaluates these subscription checks when RDS clusters or instances exist in the account.
+- `rds_cluster_critical_event_subscription`
+- `rds_instance_critical_event_subscription`
+- `rds_instance_event_subscription_parameter_groups`
+- `rds_instance_event_subscription_security_groups`
 #### Amazon S3
 To prevent unintended data exposure:
@@ -99,6 +143,10 @@ VPC settings directly impact network security and availability.
 - `vpc_flow_logs_enabled`
+- VPC Endpoint for EC2: Routes EC2 API calls through a private VPC endpoint to keep traffic off the public internet. Prowler only evaluates this configuration for VPCs in use, i.e., those with active ENIs.
+- `vpc_endpoint_for_ec2_enabled`
 - VPC Subnet Public IP Restrictions: Prevent unintended exposure of resources to the internet. Prowler only checks this configuration for VPCs in use, i.e., those with active ENIs.
 - `vpc_subnet_no_public_ip_by_default`
```
@@ -0,0 +1,259 @@
---
title: 'Compliance'
description: 'Run security checks against compliance frameworks, review posture across providers, and download CSV or PDF reports from Prowler Cloud, Prowler App, and Prowler CLI.'
---
Prowler maps every security check to one or more industry-standard compliance frameworks, so a single scan produces both technical findings and framework-aligned evidence. The same evaluation runs identically whether scans are launched from Prowler Cloud, Prowler App, or Prowler CLI.
Out of the box, Prowler covers frameworks such as CIS Benchmarks, NIST 800-53, NIST CSF, NIS2, ENS RD2022, ISO 27001, PCI-DSS, SOC 2, GDPR, HIPAA, AWS Well-Architected, BSI C5, CSA CCM, MITRE ATT&CK, KISA ISMS-P, FedRAMP, and Prowler ThreatScore. The full catalog is available at [Prowler Hub](https://hub.prowler.com/compliance).
<Note>
For the unified compliance score methodology used across frameworks, see [Prowler ThreatScore Documentation](/user-guide/compliance/tutorials/threatscore).
</Note>
<CardGroup cols={2}>
<Card title="Prowler Cloud" icon="cloud" href="#prowler-cloud">
Review compliance posture using Prowler Cloud
</Card>
<Card title="Prowler CLI" icon="terminal" href="#prowler-cli">
Run compliance scans using Prowler CLI
</Card>
</CardGroup>
## Prowler Cloud
The Compliance section in Prowler Cloud and Prowler App centralizes compliance posture across every connected provider. It aggregates scan results, surfaces Prowler ThreatScore, and exposes detailed requirement-level evidence for each supported framework.
### Accessing the Compliance Section
To open the compliance overview, follow these steps:
1. Sign in to Prowler Cloud at [cloud.prowler.com](https://cloud.prowler.com/sign-in) or to a self-hosted Prowler App instance.
2. Select **Compliance** from the left navigation.
The page lists every framework evaluated by the most recent completed scan of the selected provider.
<img src="/images/compliance/prowler-app-compliance-overview.png" alt="Compliance overview page in Prowler Cloud and App showing filters, the Prowler ThreatScore card, and the framework grid" width="900" />
<Note>
Compliance results require at least one completed scan. If no scan has finished yet, Prowler Cloud and App display a notice prompting you to launch a scan or wait for one to complete.
</Note>
### Filtering Compliance Results
The filters bar at the top of the overview controls which scan and which regions feed every card on the page.
#### Scan Selector
The scan selector lists completed scans across all connected providers. Each entry includes the provider type, alias, and completion timestamp. Selecting a scan updates the entire page, including ThreatScore and every framework card.
#### Region Filter
The region multi-select narrows results to one or more regions detected in the selected scan. Use it to evaluate compliance posture for a specific geography or account boundary. The filter applies to:
* The framework grid scores and pass/fail counts.
* The detailed requirement view inside each framework.
<Note>
Region filters apply only to providers that report a region attribute (for example, AWS, Azure, and Google Cloud). Providers without regions ignore the filter.
</Note>
#### Clearing Filters
Select **Clear filters** to reset the region filter and any other applied filters to their defaults. The scan selector is preserved.
### Reviewing the Prowler ThreatScore Card
When the selected scan includes Prowler ThreatScore data, a dedicated card appears at the top of the overview, showing:
* The overall ThreatScore (0–100) with a color-coded indicator.
* A progress bar reflecting current posture.
* Per-pillar bars for IAM, Attack Surface, and Logging and Monitoring.
<img src="/images/compliance/prowler-app-compliance-threatscore-card.png" alt="Prowler ThreatScore badge on the Compliance overview showing the overall score and per-pillar bars" width="900" />
Selecting the card opens the ThreatScore framework detail page, covered in [Working With the Framework Detail Page](#working-with-the-framework-detail-page).
For a complete explanation of the methodology, formula, and weighting, see [Prowler ThreatScore Documentation](/user-guide/compliance/tutorials/threatscore).
### Exploring the Framework Grid
Below ThreatScore, the framework grid shows one card per supported compliance framework. Each card includes:
* **Framework logo and name:** Identifies the standard (CIS, NIST, ENS, ISO 27001, PCI-DSS, SOC 2, NIS2, CSA CCM, MITRE ATT&CK, and more).
* **Version:** Indicates the framework version applied to the scan.
* **Score:** The percentage of passing requirements over the total evaluated.
* **Passing Requirements:** A `passed / total` counter for additional context.
* **Download dropdown:** Quick access to the CSV report and, when supported, the PDF report.
<img src="/images/compliance/prowler-app-compliance-card-download.png" alt="Download dropdown on a framework card showing CSV and PDF report options" width="500" />
Select any card to open the framework detail page.
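The score shown on each card is simply the pass ratio expressed as a percentage. As a quick sketch of the arithmetic, with illustrative counts:

```bash
# Illustrative counts — a card showing "113 / 145" passing requirements.
passed=113
total=145
# Integer percentage, matching "passing requirements over the total evaluated".
score=$(( passed * 100 / total ))
echo "${score}%"
```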
<Note>
Score color coding follows three thresholds: red for severely low compliance, amber for partial compliance, and green for healthy posture. Hover over the score for the exact percentage.
</Note>
### Working With the Framework Detail Page
The detail page provides everything needed to evaluate a single framework: aggregate metrics, top failure sections, and a requirement-by-requirement view.
#### Header, Summary Cards, and Download Actions
The header shows the framework name, version, the provider scan being reviewed, and CSV / PDF download buttons. Below the header, summary cards condense the framework state at a glance:
* **Requirements Status:** Donut chart with `Pass`, `Fail`, and `Manual` counts plus the total number of requirements.
* **Top Failed Sections:** Ranks the sections or pillars with the highest number of failing requirements.
* **ThreatScore Breakdown:** Appears only on the ThreatScore framework. It shows the overall score and per-pillar scores aligned with the ThreatScore pillars (IAM, Attack Surface, Logging and Monitoring, Encryption).
The same layout applies to every compliance framework. ThreatScore is the only framework that includes the extra Breakdown card on the left; for any other framework, the Requirements Status and Top Failed Sections cards span the full row.
<img src="/images/compliance/prowler-app-compliance-threatscore-detail.png" alt="Prowler ThreatScore detail page including the extra Breakdown card alongside Requirements Status and Top Failed Sections" width="900" />
<img src="/images/compliance/prowler-app-compliance-detail-header.png" alt="CIS framework detail page showing only the Requirements Status donut and the Top Failed Sections card, without the ThreatScore Breakdown" width="900" />
#### Requirements Accordion
Below the summary cards, an accordion organizes every requirement of the framework. Expand a section to see:
* **Requirement ID and title:** Reflect the official identifier from the framework.
* **Pass / Fail / Manual badges:** Indicate the status of each requirement based on the underlying checks.
* **Custom details panel:** Opens additional context tailored to the framework. For frameworks with custom layouts, the panel surfaces fields such as control objectives, severity, attack tactics, regulatory references, or required evidence.
Select a requirement to open the detail panel and review the failing checks, the resources affected, and remediation guidance.
<img src="/images/compliance/prowler-app-compliance-requirements-accordion.png" alt="Expanded CIS requirement showing description, rationale, remediation procedure, audit procedure, profile and assessment tags, references, and the underlying check" width="900" />
##### Frameworks With Custom Detail Layouts
Several frameworks include enriched detail panels that highlight fields specific to the standard:
* ASD Essential Eight
* AWS Well-Architected Framework
* BSI C5
* Cloud Controls Matrix (CSA CCM)
* CIS Benchmarks
* CCC (Common Cloud Controls)
* ENS RD2022
* ISO 27001
* KISA ISMS-P
* MITRE ATT&CK
* Prowler ThreatScore
Frameworks without a custom layout fall back to the generic details panel, which still exposes the official requirement metadata captured by Prowler.
### Downloading Compliance Reports
Prowler Cloud and App expose two formats:
* **CSV report:** Every requirement, every check, and every finding for the selected scan and filters. Available for all supported frameworks.
* **PDF report:** Curated executive-style report. Currently supported for Prowler ThreatScore, ENS RD2022, NIS2, and CSA CCM. Additional PDF reports will be added in subsequent Prowler releases.
#### Downloading From the Detail Page
Inside any framework detail page, the **CSV** and **PDF** buttons in the header trigger the same downloads as the overview dropdown. The PDF button only appears for frameworks that support it.
<img src="/images/compliance/prowler-app-compliance-detail-download.png" alt="Top of a framework detail page showing the CSV and PDF download buttons in the header" width="900" />
<Note>
Region filters disable the per-card download dropdown to avoid generating partial reports. Open the framework detail page when downloads scoped to a region are required, or remove the region filter to download the full report.
</Note>
#### Downloading the Full Scan Output
To export every framework, finding, and resource at once, use the **Scan Jobs** section instead. The ZIP archive contains the CSV, JSON-OCSF, and HTML reports plus a `compliance/` subfolder with one CSV per framework. See [Prowler App — Getting Started](/user-guide/tutorials/prowler-app) for details.
### API Access
Every report available in the UI is also reachable through the Prowler API. The following endpoints are the most relevant:
* [Retrieve a scan compliance report as CSV](https://api.prowler.com/api/v1/docs#tag/Scan/operation/scans_compliance_retrieve)
* [Download a complete scan output (ZIP)](https://api.prowler.com/api/v1/docs#tag/Scan/operation/scans_report_retrieve)
Use the API to integrate compliance evidence into ticketing systems, executive dashboards, or downstream pipelines.
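As a sketch, a compliance report can be pulled with any HTTP client. The path shape below is an assumption inferred from the linked operation IDs, so verify it against the API reference; `<scan-id>` is a placeholder for a real scan identifier:

```bash
# Assumed endpoint shape — confirm against https://api.prowler.com/api/v1/docs.
PROWLER_API_BASE="https://api.prowler.com/api/v1"
SCAN_ID="<scan-id>"   # placeholder for a real scan UUID
url="${PROWLER_API_BASE}/scans/${SCAN_ID}/compliance"
echo "$url"
# e.g. curl -H "Authorization: Bearer $PROWLER_API_TOKEN" "$url" -o compliance.csv
```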
## Prowler CLI
Prowler CLI evaluates the same compliance frameworks as Prowler Cloud and App, and produces detailed CSV outputs alongside the standard scan results. By default, it runs every supported framework and prints a status summary at the end of the scan:
<img src="/images/cli/compliance/compliance.png" />
Detailed compliance results are stored as CSV files under the `compliance/` subfolder of Prowler's output directory.
### Scan a Specific Compliance Framework
To scope a scan to a single framework and get the framework-specific summary, use the `--compliance` option:
```sh
prowler <provider> --compliance <compliance_framework>
```
Standard results plus the framework breakdown are printed to the terminal. A dedicated CSV is also generated under the `compliance/` output folder. Sample output for CIS AWS 2.0:
<img src="/images/cli/compliance/compliance-cis-sample1.png" />
<Note>
If Prowler cannot find a resource associated with a check in a compliance requirement, that requirement is omitted from the output.
</Note>
### List Available Compliance Frameworks
To see which compliance frameworks are covered by a given provider, use the `--list-compliance` option:
```sh
prowler <provider> --list-compliance
```
The full catalog is also browsable at [Prowler Hub](https://hub.prowler.com/compliance).
### List Requirements of a Compliance Framework
To inspect the requirements that compose a specific framework, use the `--list-compliance-requirements` option:
```sh
prowler <provider> --list-compliance-requirements <compliance_framework(s)>
```
Sample output for the first requirements of CIS 1.5 for AWS:
```
Listing CIS 1.5 AWS Compliance Requirements:
Requirement Id: 1.1
- Description: Maintain current contact details
- Checks:
account_maintain_current_contact_details
Requirement Id: 1.2
- Description: Ensure security contact information is registered
- Checks:
account_security_contact_information_is_registered
Requirement Id: 1.3
- Description: Ensure security questions are registered in the AWS account
- Checks:
account_security_questions_are_registered_in_the_aws_account
Requirement Id: 1.4
- Description: Ensure no 'root' user account access key exists
- Checks:
iam_no_root_access_key
Requirement Id: 1.5
- Description: Ensure MFA is enabled for the 'root' user account
- Checks:
iam_root_mfa_enabled
[redacted]
```
## Contributing New Compliance Frameworks
To request a new framework or contribute one, see [Creating a New Security Compliance Framework in Prowler](/developer-guide/security-compliance-framework). The developer guide covers the Pydantic schema, JSON catalog, output formatter, and PR submission steps required to ship a new framework end to end.
## Related Documentation
* [Prowler ThreatScore Documentation](/user-guide/compliance/tutorials/threatscore)
* [Creating a New Security Compliance Framework in Prowler](/developer-guide/security-compliance-framework)
* [Prowler App — Getting Started](/user-guide/tutorials/prowler-app)
@@ -27,7 +27,7 @@ To download results from AWS CloudShell:
## Cloning Prowler from GitHub
Due to the limited storage in AWS CloudShell's home directory, installing uv dependencies for running Prowler from GitHub can be problematic.
The following workaround ensures successful installation:
@@ -37,17 +37,9 @@ adduser prowler
su prowler
git clone https://github.com/prowler-cloud/prowler.git
cd prowler
pip install uv
mkdir /tmp/uv-cache
UV_CACHE_DIR=/tmp/uv-cache uv sync
source .venv/bin/activate
python prowler-cli.py -v
```
@@ -4,7 +4,7 @@ title: 'Check Mapping Prowler v4/v3 to v2'
Prowler v3 and v4 introduce distinct identifiers while preserving the checks originally implemented in v2. This change was made because, in previous versions, check names were primarily derived from the CIS Benchmark for AWS. Starting with v3 and v4, all checks are independent of any security framework and have unique names and IDs.
For more details on the updated compliance implementation in Prowler v4 and v3, refer to the [Compliance](/user-guide/compliance/tutorials/compliance) section.
```
checks_v4_v3_to_v2_mapping = {
@@ -44,6 +44,15 @@ User API Tokens are the recommended authentication method because they:
Create a **User API Token**, not an Account API Token. User API Tokens are created from the profile settings and offer finer permission control.
</Note>
**Quick Setup:** Use these pre-configured links to open the Cloudflare Dashboard with the required permissions already selected:
- [Create User API Token](https://dash.cloudflare.com/profile/api-tokens?permissionGroupKeys=%5B%7B%22key%22%3A%22account_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22dns%22%2C%22type%22%3A%22read%22%7D%5D&accountId=%2A&zoneId=all&name=Prowler%20Security%20Scanner) — creates a **User API Token** (recommended). Opens the **Create Custom Token** form prefilled with the four required read-only scopes (`Account Settings`, `Zone`, `Zone Settings`, `DNS`) and the name `Prowler Security Scanner`. Adjust **Account Resources** and **Zone Resources** to match the accounts and zones you want to scan, then click **Create Token**.
- [Create Account-Owned API Token](https://dash.cloudflare.com/?to=/:account/api-tokens&permissionGroupKeys=%5B%7B%22key%22%3A%22account_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22dns%22%2C%22type%22%3A%22read%22%7D%5D&name=Prowler%20Security%20Scanner) — creates an [account-owned token](https://developers.cloudflare.com/fundamentals/api/how-to/account-owned-token-template/) instead. Use this for automation or CI/CD where the token should not depend on a specific user account remaining active. Requires the **Super Administrator** or **Administrator** role on the account.
<Note>
Template URLs only pre-fill the token creation form. Review the permissions, configure resources, and click **Create Token** to complete the process.
</Note>
### Step 1: Create a User API Token
1. Log into the [Cloudflare Dashboard](https://dash.cloudflare.com).
@@ -14,6 +14,15 @@ Set up authentication for Cloudflare with the [Cloudflare Authentication](/user-
- Grant the required read-only permissions (`Account Settings:Read`, `Zone:Read`, `Zone Settings:Read`, `DNS:Read`)
- Identify the Cloudflare Account ID to use as the provider identifier
<Note>
**Quick Setup:** Use these pre-configured links to create a token with the required permissions already selected:
- [Create User API Token](https://dash.cloudflare.com/profile/api-tokens?permissionGroupKeys=%5B%7B%22key%22%3A%22account_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22dns%22%2C%22type%22%3A%22read%22%7D%5D&accountId=%2A&zoneId=all&name=Prowler%20Security%20Scanner) — creates a User API Token (recommended).
- [Create Account-Owned API Token](https://dash.cloudflare.com/?to=/:account/api-tokens&permissionGroupKeys=%5B%7B%22key%22%3A%22account_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22zone_settings%22%2C%22type%22%3A%22read%22%7D%2C%7B%22key%22%3A%22dns%22%2C%22type%22%3A%22read%22%7D%5D&name=Prowler%20Security%20Scanner) — creates an [account-owned token](https://developers.cloudflare.com/fundamentals/api/how-to/account-owned-token-template/), better suited for automation and CI/CD.
Both links open the Cloudflare Dashboard with the four required read-only scopes (`Account Settings`, `Zone`, `Zone Settings`, `DNS`) and the name `Prowler Security Scanner` prefilled. See [Cloudflare Authentication](/user-guide/providers/cloudflare/authentication#api-token-recommended) for the equivalent manual steps.
</Note>
<CardGroup cols={2}>
<Card title="Prowler Cloud" icon="cloud" href="#prowler-cloud">
Onboard Cloudflare using Prowler Cloud
@@ -153,8 +153,8 @@ Before running Prowler CLI for GitHub, ensure you have:
# Install via pip
pip install prowler
# Or via uv (from the cloned repo)
uv sync
```
2. **Authentication Credentials**
@@ -22,7 +22,7 @@ Install promptfoo using one of the following methods:
**Using npm:**
```bash
npm install --global promptfoo@0.121.11
```
**Using Homebrew (macOS):**
@@ -46,7 +46,7 @@ Before you begin, ensure you have:
```bash
pip install prowler
# or for development:
uv sync
```
2. **OCI Python SDK** (automatically installed with Prowler):
@@ -398,7 +398,7 @@ prowler oci --severity critical high
### Next Steps
- Learn about [Compliance Frameworks](/user-guide/compliance/tutorials/compliance) in Prowler
- Review [Prowler Output Formats](/user-guide/cli/tutorials/reporting)
- Explore [Integrations](/user-guide/cli/tutorials/integrations) with SIEM and ticketing systems
@@ -0,0 +1,186 @@
---
title: 'Okta Authentication in Prowler'
---
import { VersionBadge } from "/snippets/version-badge.mdx"
<VersionBadge version="5.27.0" />
Prowler authenticates to Okta as a **service application** using **OAuth 2.0 with a private-key JWT** (Client Credentials grant). The integration is read-only by scope and follows DISA STIG guidance for least-privilege access.
## Common Setup
### Prerequisites
- An Okta organization. The UI examples below use **Identity Engine** terminology such as **Global Session Policy**; Classic Engine exposes equivalent sign-on policy concepts under older naming.
- A **Super Administrator** account on that organization for the one-time service-app setup.
- An **API Services** app integration created in the Okta Admin Console.
### Authentication Method Overview
| Method | Status | Use Case |
|---|---|---|
| **OAuth 2.0 (private-key JWT)** | Supported | Production scans, CI/CD, Prowler App. |
The private-key JWT flow is the only supported authentication method in the initial release. The service application proves possession of a private key on every token request; Okta returns a short-lived access token, refreshed automatically by the SDK.
<Note>
If a different authentication method is needed (SSWS API token, OAuth with user delegation, etc.), please open a [feature request](https://github.com/prowler-cloud/prowler/issues/new?template=feature-request.yml) describing the use case.
</Note>
### Required OAuth Scopes
For the initial check (`signon_global_session_idle_timeout_15min`) only one scope is required:
- `okta.policies.read`
Additional scopes will be required as more services and checks are added. These are the scopes currently in use:
| Scope | Used by |
|---|---|
| `okta.policies.read` | Sign-on / password / authentication policies |
### Required Admin Role
The service application must be assigned the built-in **Read-Only Administrator** role.
Okta's Management API enforces a two-layer authorization model: an OAuth **scope** determines which API endpoints the token can call, and an **admin role** determines whether the call returns data. With only a scope granted, the token is minted successfully but every read returns `403 Forbidden`. The Read-Only Administrator role is the minimum that lets the granted `okta.*.read` scopes actually return configuration data to Prowler's checks; without it, the credential probe at provider startup fails and the scan never evaluates any check.
Read-Only Administrator is intentionally the narrowest role that satisfies this requirement and aligns with the least-privilege guidance in DISA STIG.
## Step-by-Step Setup
### 1. Go to the admin console
![Okta — admin console page](/user-guide/providers/okta/images/select-admin-console.png)
### 2. [Optional] Disable the privilege-escalation bypass (org-wide, one-time)
In the Okta Admin Console, go to **Settings → Account → Public client app admins** and ensure it is **off**. When enabled, every API Services app can be auto-assigned the Super Administrator role after scopes are granted, which would invalidate the read-only premise of this integration.
![Okta — disable Public client app admins](/user-guide/providers/okta/images/public-client-app-admins.png)
### 3. Create the API Services app
1. Go to **Applications → Applications**.
![Okta — create API Services app](/user-guide/providers/okta/images/go-to-applications.png)
2. **Create App Integration**
![Okta — create App integration](/user-guide/providers/okta/images/create-new-application.png)
3. Sign-in method: **API Services**. Click **Next**.
4. Name the app (for example, `Prowler Scanner`) and click **Save**.
5. Copy the displayed **Client ID** — you'll use it as `OKTA_CLIENT_ID`.
![Okta — copy client id](/user-guide/providers/okta/images/copy-client-id.png)
### 4. Switch to private-key authentication and generate a keypair
On the new app's **General** tab, scroll to **Client Credentials**:
1. Click **Edit**.
2. Set **Client authentication** to **Public key / Private key**.
3. Under **Public Keys**, click **Add key**.
4. In the modal, click **Generate new key**. Okta creates a JWK pair.
5. Click the **PEM** tab to switch the displayed format (or keep JWK — Prowler accepts both).
6. Copy the entire `-----BEGIN PRIVATE KEY-----` block (or the JWK JSON).
7. Click **Done**, then **Save**.
<Warning>
Okta displays the private key **only once**. If you close the modal without copying, you must generate a new key.
</Warning>
![Okta — create Public Key](/user-guide/providers/okta/images/create-public-key.png)
### 5. Grant the required OAuth scopes
On the app, open the **Okta API Scopes** tab and click **Grant** on every scope Prowler needs. For the initial release, granting only `okta.policies.read` is sufficient.
![Okta — grant OAuth scopes](/user-guide/providers/okta/images/grant-permissions.png)
### 6. Assign the Read-Only Administrator role
On the app, open the **Admin roles** tab and click **Edit assignments → Add assignment**:
- **Role:** Read-Only Administrator
- **Resources:** All resources
Save the changes.
![Okta — grant Read-Only role](/user-guide/providers/okta/images/grant-roles.png)
### 7. [Optional] Verify DPoP setting
Prowler sends DPoP (Demonstrating Proof of Possession) proofs on every token request. The integration works whether the **Require Demonstrating Proof of Possession (DPoP) header in token requests** setting on the service app is on or off — but enabling it is the more secure default.
## Prowler CLI Authentication
### Using Environment Variables (Required for Secrets)
Private key material **must** be supplied via environment variables — Prowler does not accept secrets through CLI flags.
```bash
export OKTA_ORG_DOMAIN="YOUR-ORG.okta.com"
export OKTA_CLIENT_ID="0oa1234567890abcdef"
# Either of the two — content takes precedence over file when both are set.
export OKTA_PRIVATE_KEY_FILE="/secure/path/to/prowler-okta.pem"
# or
export OKTA_PRIVATE_KEY="$(cat /secure/path/to/prowler-okta.pem)"
# Optional — defaults to "okta.policies.read"
export OKTA_SCOPES="okta.policies.read"
uv run python prowler-cli.py okta
```
### Non-Secret CLI Flags
Non-secret values are also available as CLI flags for ergonomic overrides:
| Flag | Equivalent env var |
|---|---|
| `--okta-org-domain` | `OKTA_ORG_DOMAIN` |
| `--okta-client-id` | `OKTA_CLIENT_ID` |
| `--okta-scopes` | `OKTA_SCOPES` |
Run a single check directly:
```bash
uv run python prowler-cli.py okta --check signon_global_session_idle_timeout_15min
```
## Troubleshooting
### `OktaInvalidOrgDomainError`
The org domain must be `<org>.okta.com` (or `.oktapreview.com` / `.okta-emea.com` / `.okta-gov.com` / `.okta.mil` / `.okta-miltest.com` / `.trex-govcloud.com`). Pass the bare hostname only — no `https://` scheme, no path, no trailing slash. Custom (vanity) domains are not currently accepted.
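A minimal sketch of that rule, useful for catching a malformed value before exporting it (the org domain below is illustrative):

```bash
# Bare hostname only — no scheme, no path, no trailing slash.
OKTA_ORG_DOMAIN="acme.okta.com"   # illustrative org
case "$OKTA_ORG_DOMAIN" in
  *://*|*/*) domain_check="invalid" ;;  # scheme or path present
  *)         domain_check="ok" ;;
esac
echo "$domain_check"
```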
### `OktaPrivateKeyFileError`
The file at `OKTA_PRIVATE_KEY_FILE` is missing, unreadable, or empty. Confirm the path and that the file contains a non-empty PEM block or JWK JSON document.
### `OktaInvalidCredentialsError` at provider init
Prowler validates credentials at startup by listing one sign-on policy. This error indicates the credential material itself was rejected:
- **`invalid_client`** — the public key registered in Okta does not match the private key on disk. Generate a fresh keypair and try again.
### `OktaInsufficientPermissionsError` at provider init
Raised when the credential probe succeeds at the OAuth layer but the request is rejected because the service app lacks the required scope or admin role:
- **`invalid_scope`** — the `okta.policies.read` scope is not granted on the service app. Grant it from **Okta API Scopes**.
- **`Forbidden` / `not authorized`** — the **Read-Only Administrator** role is not assigned to the service app. Assign it from **Admin roles**.
### `invalid_dpop_proof`
The org or the service app requires DPoP. The provider always sends DPoP proofs, so this error indicates the SDK could not build a valid proof — typically because the private key on disk does not match the public key uploaded to Okta. Regenerate the keypair.
## Additional Resources
- [Implement OAuth 2.0 for an Okta service app](https://developer.okta.com/docs/guides/implement-oauth-for-okta-serviceapp/main/)
- [Okta Policy API reference](https://developer.okta.com/docs/api/openapi/okta-management/management/tag/Policy/)
- [DISA STIG for Okta (V-273186)](https://stigviewer.com/stigs/okta/)
@@ -0,0 +1,144 @@
---
title: 'Getting Started With Okta on Prowler'
---
import { VersionBadge } from "/snippets/version-badge.mdx"
Prowler for Okta scans an Okta organization for identity and session-management misconfigurations. The provider authenticates as a service application using **OAuth 2.0 with a private-key JWT** (Client Credentials grant) — no end-user login, read-only by scope.
## Prerequisites
Set up authentication for Okta with the [Okta Authentication](/user-guide/providers/okta/authentication) guide before starting:
- An Okta organization. The UI examples below use **Identity Engine** terminology such as **Global Session Policy**; Classic Engine exposes the equivalent sign-on policy concepts under older names.
- A **Super Administrator** account on that organization for the one-time service-app setup.
- An **API Services** app integration in the Okta Admin Console with the `okta.policies.read` scope granted and the **Read-Only Administrator** role assigned.
- Python 3.10+ and Prowler 5.27.0 or later installed locally.
<CardGroup cols={2}>
<Card title="Prowler Cloud" icon="cloud" href="#prowler-cloud">
Onboard Okta using Prowler Cloud
</Card>
<Card title="Prowler CLI" icon="terminal" href="#prowler-cli">
Onboard Okta using Prowler CLI
</Card>
</CardGroup>
## Prowler Cloud
<Note>
Prowler Cloud onboarding for Okta is coming soon. Track the [Prowler GitHub repository](https://github.com/prowler-cloud/prowler) for release updates. Use the [Prowler CLI](#prowler-cli) workflow below in the meantime.
</Note>
---
## Prowler CLI
<VersionBadge version="5.27.0" />
### Step 1: Set Up Authentication
Follow the [Okta Authentication](/user-guide/providers/okta/authentication) guide to create the service application, generate a keypair, grant scopes, and assign the Read-Only Administrator role. Then export the credentials:
```bash
export OKTA_ORG_DOMAIN="acme.okta.com"
export OKTA_CLIENT_ID="0oa1234567890abcdef"
export OKTA_PRIVATE_KEY_FILE="/secure/path/to/prowler-okta.pem"
# Optional — defaults to "okta.policies.read"
export OKTA_SCOPES="okta.policies.read"
```
The private key file may contain either a PEM-encoded RSA key or a JWK JSON document.
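As a rough sketch, the two formats can be told apart by how the material begins; the key fragment below is an illustrative placeholder, not a real key:

```bash
# A PEM block starts with "-----BEGIN", a JWK document starts with "{".
key_material='-----BEGIN PRIVATE KEY-----'   # illustrative first line
case "$key_material" in
  "{"*)          key_format="jwk" ;;
  "-----BEGIN"*) key_format="pem" ;;
  *)             key_format="unknown" ;;
esac
echo "$key_format"
```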
#### Supplying the Private Key as Content
For automated environments where writing the key to disk is not desirable (CI runners, container secrets, etc.), the private key may be passed directly as a string:
```bash
export OKTA_ORG_DOMAIN="acme.okta.com"
export OKTA_CLIENT_ID="0oa1234567890abcdef"
export OKTA_PRIVATE_KEY="$(cat /secure/path/to/prowler-okta.pem)"
```
`OKTA_PRIVATE_KEY` takes precedence over `OKTA_PRIVATE_KEY_FILE` when both are set. The private key is intentionally not exposed as a CLI flag — secrets must be supplied via environment variables only.
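The documented precedence amounts to a simple non-empty check; the values below are illustrative placeholders:

```bash
OKTA_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----"             # illustrative content
OKTA_PRIVATE_KEY_FILE="/secure/path/to/prowler-okta.pem"   # illustrative path
if [ -n "$OKTA_PRIVATE_KEY" ]; then
  key_source="content"   # content wins when both are set
else
  key_source="file"
fi
echo "$key_source"
```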
### Step 2: Run the First Scan
Run a baseline scan after credentials are configured:
```bash
prowler okta
```
Or run a specific check directly:
```bash
prowler okta --check signon_global_session_idle_timeout_15min
```
Prowler prints a summary table; full findings are written to the configured output formats.
### Step 3: Use a Custom Configuration (Optional)
Prowler uses a configuration file to customize check thresholds. The Okta configuration currently includes:
```yaml
okta:
  # okta.signon_global_session_idle_timeout_15min
  # Defaults to 15 minutes per DISA STIG V-273186.
  okta_max_session_idle_minutes: 15
```
To use a custom configuration:
```bash
prowler okta --config-file /path/to/config.yaml
```
## Supported Services
Prowler for Okta includes security checks across the following services:
| Service | Description |
| ----------- | ----------------------------------------------------------------------------------- |
| **Sign-On** | Global session policy controls (idle timeout, lifetime, rule priority and ordering) |
## Troubleshooting
### STIG Rule Ordering
The initial check is mapped to DISA STIG `V-273186` / `OKTA-APP-000020`. Prowler implements the STIG procedure as written: the **Default Policy** must have a **Priority 1** rule that is **not** `Default Rule`, and that rule must set **Maximum Okta global session idle time** to 15 minutes or less.
This is stricter than simply finding the same timeout value somewhere else in the policy set. A compliant custom rule in another policy, or a compliant timeout on the built-in `Default Rule`, does not satisfy this STIG procedure.
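The evaluation order described above can be sketched as follows. This is an illustrative simplification, not Prowler's implementation; the rule fields (`priority`, `name`, `max_session_idle_minutes`) are hypothetical stand-ins for the corresponding Okta policy rule attributes:

```python
def evaluate_default_policy(rules: list[dict], max_idle_minutes: int = 15) -> str:
    """PASS only when the Priority 1 rule of the Default Policy is a custom
    rule (not 'Default Rule') with a compliant global session idle timeout."""
    top = min(rules, key=lambda rule: rule["priority"])
    if top["priority"] != 1 or top["name"] == "Default Rule":
        # A compliant timeout elsewhere in the policy set does not count
        return "FAIL"
    idle = top.get("max_session_idle_minutes")
    return "PASS" if idle is not None and idle <= max_idle_minutes else "FAIL"
```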
### Default Scopes
Prowler requests a fixed set of OAuth scopes on every token exchange. The default is a single scope that covers the bundled initial check:
- `okta.policies.read`
The service app must have that scope granted in the **Okta API Scopes** tab. When the granted set is narrower than the requested set, the token request fails with an `invalid_scope` error and the scan stops at provider initialization.
When additional checks are enabled — or when running against a service app that exposes a different scope set — override the default with `OKTA_SCOPES` (comma-separated string for the env var) or `--okta-scopes` (space-separated list for the CLI):
```bash
# Environment variable — comma-separated
export OKTA_SCOPES="okta.policies.read,okta.apps.read,okta.users.read"
# CLI flag — space-separated
prowler okta --okta-scopes okta.policies.read okta.apps.read okta.users.read
```
For the full catalog of OAuth scopes exposed by the Okta Management API, refer to the [Okta OAuth 2.0 scopes documentation](https://developer.okta.com/docs/api/oauth2/).
<Note>
As new services and checks land in the Okta provider, the default scope list grows alongside them. Re-check the granted scopes on the service app after each Prowler upgrade and grant any newly required `okta.*.read` scopes in the Admin Console.
</Note>
### Common Errors
- **`OktaInvalidOrgDomainError`** — the org domain must be `<org>.okta.com` (or `.oktapreview.com` / `.okta-emea.com` / `.okta-gov.com` / `.okta.mil` / `.okta-miltest.com` / `.trex-govcloud.com`). Pass the bare hostname only — no `https://` scheme, no path, no trailing slash.
- **`OktaPrivateKeyFileError`** — confirm the file is readable and contains a non-empty PEM or JWK body.
- **`OktaInsufficientPermissionsError`** — the credential probe reached Okta but the service app cannot perform the request. The error string carries `invalid_scope`, `Forbidden`, `not authorized`, or `permission`. Fix by granting the missing `okta.*.read` scope from **Okta API Scopes** and confirming the **Read-Only Administrator** role is assigned to the service app.
- **`OktaInvalidCredentialsError`** — the credential probe reached Okta but Okta rejected the JWT. Typically the private key on disk does not match the public JWK uploaded to the service app, or the JWT signing parameters are wrong. Regenerate the keypair and re-upload the public JWK.
- **Token requests failing for an unknown scope** — the app was granted a narrower scope set than `OKTA_SCOPES` requests. Either narrow `OKTA_SCOPES` or grant the missing scopes in the Admin Console.
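The org-domain rule from the first bullet can be sketched as a suffix check — an illustrative validator, not Prowler's actual one:

```python
ALLOWED_SUFFIXES = (
    ".okta.com", ".oktapreview.com", ".okta-emea.com",
    ".okta-gov.com", ".okta.mil", ".okta-miltest.com", ".trex-govcloud.com",
)

def is_valid_org_domain(domain: str) -> bool:
    """Accept bare hostnames like acme.okta.com; reject schemes, paths, slashes."""
    if "://" in domain or "/" in domain:
        return False
    # Require a non-empty org label before an allowed suffix
    return any(domain.endswith(suffix) and len(domain) > len(suffix)
               for suffix in ALLOWED_SUFFIXES)
```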