Compare commits

...

272 Commits

Author SHA1 Message Date
Hugo P.Brito 4dbb7c7bb7 fix(ui): restore View Resource action in findings drawer
Reapplies the View Resource link that was inadvertently dropped while
removing this PR's overlap with #10847. That feature is already on master
and removing it here would have regressed the findings drawer.

Restores buildResourceDetailHref, the resourceDetailHref binding, the
JSX action below the resource actions menu, and the original positive
assertion in the drawer test.
2026-05-06 09:34:20 +01:00
Hugo P.Brito 64df3a22c0 chore: merge master into PR 10853 2026-05-06 09:09:57 +01:00
César Arroba 16798e293d ci(pr-conflict-checker): restore persist-credentials so base ref fetch works on private mirrors (#11019) 2026-05-06 00:33:40 +02:00
César Arroba 1194d34396 ci(ui-e2e): reduce Playwright artifact retention to 7 days (#11018) 2026-05-06 00:09:34 +02:00
César Arroba 98277689f5 ci: reduce GitHub Actions consumption across CI workflows (#11007) 2026-05-05 17:08:34 +02:00
BMO 0ddd7fbd69 docs(aws): add guide for extending existing services (#10924)
Co-authored-by: Mohamed Solaiman <mohamedsolaiman@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2026-05-05 16:51:58 +02:00
Pedro Martín 22b233f206 chore(deps): bump requests to 2.33.1 to fix CVE-2026-25645 (#10983) 2026-05-05 16:43:18 +02:00
Daniel Barranquero aa759ab6b7 fix(attack-surface): restore ec2-imdsv1 category alignment (#10998) 2026-05-05 16:42:47 +02:00
Hugo Pereira Brito 369d6cecc1 fix: patch CVE-2026-39892 and CVE-2026-33186 across SDK, API and MCP images (#10978)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-05-05 15:04:44 +01:00
Pablo Fernandez Guerra (PFE) d23c2f3b53 refactor(ui): standardize "Providers" wording across UI and docs (#10971)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-05 14:39:54 +02:00
Prowler Bot 786059bfb2 chore(docs): Bump version to v5.25.2 (#10993)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-05-05 10:45:07 +02:00
Pepe Fagoaga 703a33108c chore(changelog): prepare for v5.25.2 (#10991) 2026-05-05 08:47:28 +02:00
Pepe Fagoaga 7c6d658154 fix(k8s): match RBAC rules by apiGroup, not just core (#10969)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-05-04 19:54:03 +02:00
Pepe Fagoaga 21d7d08b4b fix(timeline): Return a compact actor name from CloudTrail events (#10986) 2026-05-04 19:39:17 +02:00
Pepe Fagoaga f314725f4d fix(k8s): deduplicate RBAC findings by unique subject (#10242)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-05-04 18:11:38 +02:00
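One plausible reading of the dedup fix: emit at most one finding per RBAC subject, rather than one per binding that mentions the same subject. A minimal Python sketch under that assumption; the field names are hypothetical, not the SDK's actual models:

```python
def dedupe_by_subject(findings: list[dict]) -> list[dict]:
    """Keep one finding per unique RBAC subject (kind, name, namespace).

    Illustrative only; field names are assumptions, see #10242 for the real fix.
    """
    seen: set[tuple] = set()
    unique = []
    for finding in findings:
        subject = finding["subject"]
        key = (subject.get("kind"), subject.get("name"), subject.get("namespace"))
        if key not in seen:
            seen.add(key)
            unique.append(finding)
    return unique
```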
Rubén De la Torre Vico 02f43a7ad6 docs: add Prowler Studio page and remove check-kreator pages (#10981) 2026-05-04 17:51:02 +02:00
Daniel Barranquero 0dd8981ee4 feat: add issue template for creating new checks (#10976) 2026-05-04 17:47:39 +02:00
Rubén De la Torre Vico 269e51259d docs: add troubleshooting guide for stuck scans after worker crash (#10938) 2026-05-04 17:24:09 +02:00
Hugo Pereira Brito f4afdf0541 chore(ui): decrement changelog entry version to 1.25.2 (#10974) 2026-05-04 14:59:27 +01:00
Hugo Pereira Brito 652cb69216 fix(ui): compliance card layout polish (#10939) 2026-05-04 12:59:06 +01:00
Daniel Barranquero 921f49a0de feat(aws): add bedrock_prompt_management_exists security check (#10878) 2026-05-04 12:38:15 +02:00
Hugo Pereira Brito 6cb770fcc8 fix(ui): clean up findings expanded resource row layout (#10949) 2026-05-04 11:17:54 +01:00
Hugo P.Brito e6fa47be67 fix(ui): refine recommendation link handling
- Move recommendation URL labels to shared UI utilities

- Remove unrelated resource navigation from the drawer diff

- Keep URL-only remediation cards labeled and update tests
2026-05-04 09:32:15 +01:00
Daniel Barranquero 86449fb99d chore(vercel): add disclaimer for checks depending on billing plan (#10663) 2026-05-04 08:56:50 +02:00
Andoni Alonso 40dd0e640b fix(sdk): strip http(s):// scheme from image registry URLs (#10950) 2026-05-04 08:37:46 +02:00
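The scheme strip itself is a one-liner: image reference parsers expect a bare registry host, so a pasted `https://ghcr.io` must become `ghcr.io`. A sketch of the normalization the title describes; the helper name is hypothetical:

```python
import re

def normalize_registry(registry: str) -> str:
    """Drop a leading http:// or https:// so the registry parses as a bare host."""
    return re.sub(r"^https?://", "", registry.strip(), flags=re.IGNORECASE)

assert normalize_registry("https://ghcr.io") == "ghcr.io"
assert normalize_registry("ghcr.io") == "ghcr.io"  # already-bare hosts pass through
```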
Hugo Pereira Brito 8db3a89669 ci: remove andoniaf from prowler-cloud (#10926) 2026-04-30 18:07:25 +02:00
Danny Lyubenov c802dc8a36 feat(codebuild): use batched API calls to prevent throttling and false positives (#10639)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-30 17:19:21 +02:00
Pedro Martín 3ab9a4efa5 chore(changelog): update with latest changes (#10948) 2026-04-30 14:13:40 +02:00
Pepe Fagoaga 36b8aa1b79 fix(boto3): pass config to clients (#10944) 2026-04-30 14:11:29 +02:00
Pedro Martín e821e07d7d docs(rbac): add Manage Alerts permission (#10947) 2026-04-30 13:58:17 +02:00
Boon 228fe6d579 feat: add ASD Essential Eight compliance framework for AWS (#10808)
Co-authored-by: Boon <boon@security8.work>
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2026-04-30 13:49:08 +02:00
Pedro Martín 578186aa40 feat(sdk): integrate universal compliance into CLI pipeline (#10301) 2026-04-30 13:49:00 +02:00
Hugo P.Brito 545c14d08c Merge remote-tracking branch 'origin/master' into fix/ui-cve-fix-available-links
# Conflicts:
#	ui/CHANGELOG.md
2026-04-30 12:26:35 +01:00
Hugo P.Brito de0ddf5c48 docs: move CVE link entries to next unreleased version
- Move SDK entry from 5.25.0 to 5.26.0 (Prowler UNRELEASED)
- Add a 1.26.0 (Prowler UNRELEASED) section in the UI changelog and move the entry there
2026-04-30 12:16:03 +01:00
Andoni Alonso 4608e45c8a fix(image): block parser-mismatch SSRF in registry auth (#10945) 2026-04-30 12:56:35 +02:00
Pedro Martín 5987651aee chore(README): update with latest changes (#10946) 2026-04-30 12:56:06 +02:00
Adrián Tomás 85800f2ddd chore(pre-commit): add priority tiers to .pre-commit-config.yaml (#10842)
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-30 12:33:09 +02:00
Pablo Fernandez Guerra (PFE) 4fb5272362 refactor(ui): unify DataTable pagination into a single callback (#10863)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-04-30 08:58:11 +02:00
Pepe Fagoaga 85d38b5f71 feat(scans): Reset resource failed findings to 0 for ephemeral resources (#10929) 2026-04-29 19:08:16 +02:00
Prowler Bot 59dcdb87c4 chore(docs): Bump version to v5.25.1 (#10940)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-29 18:44:12 +02:00
Josema Camacho 9297453b8a fix(sdk): add autouse mock_aws fixture and leak detector to prevent AWS test leaks (#10605) 2026-04-29 17:49:40 +02:00
Hugo P.Brito 5310b9e402 fix(ui): label remediation action by destination
- Render "View in Prowler Hub" for hub.prowler.com URLs and "View Advisory" for GitHub Security Advisory URLs alongside the existing "View CVE" action
- Fall back to a generic "View Reference" for other destinations
- Cover the new label with a unit test; refresh changelog entry
2026-04-29 16:11:50 +01:00
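The labeling rule above is a plain mapping from URL destination to CTA text. The actual change lives in the UI's shared TypeScript utilities; this Python restatement is only illustrative, and every name in it is hypothetical:

```python
from urllib.parse import urlparse

def remediation_label(url: str) -> str:
    """Map a remediation URL to its action label, per the rule in this commit."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host == "hub.prowler.com":
        return "View in Prowler Hub"
    if host == "github.com" and parts.path.startswith("/advisories"):
        return "View Advisory"
    if host in ("cve.org", "www.cve.org"):
        return "View CVE"
    return "View Reference"  # generic fallback for other destinations
```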
Hugo P.Brito ab6f36455f fix(sdk): route Trivy findings to real CVE, Hub and Advisory URLs
- Add `build_finding_reference_url` mapping a finding ID to its canonical reference (`cve.org`, `github.com/advisories`, or `hub.prowler.com/check`)
- IaC provider falls back to the helper when no canonical CVE URL is resolved, so misconfigs and non-CVE vulns get a working remediation link instead of an Aqua URL
- Strip leading `AVD-` so Prowler Hub URLs resolve, since Hub indexes Trivy rules without the prefix
- Cover the helper and IAC behavior with unit tests; refresh changelog entry
2026-04-29 16:09:49 +01:00
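Per the bullets above, the helper keys off the shape of the finding ID: CVE IDs resolve to cve.org, GHSA IDs to GitHub advisories, and everything else to Prowler Hub after stripping the `AVD-` prefix. A minimal sketch consistent with that description; the exact URL formats are assumptions:

```python
def build_finding_reference_url(finding_id: str) -> str:
    """Route a Trivy finding ID to its canonical reference URL (sketch)."""
    if finding_id.startswith("CVE-"):
        return f"https://www.cve.org/CVERecord?id={finding_id}"
    if finding_id.startswith("GHSA-"):
        return f"https://github.com/advisories/{finding_id}"
    # Hub indexes Trivy rules without the AVD- prefix, so strip it first.
    return f"https://hub.prowler.com/check/{finding_id.removeprefix('AVD-')}"
```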
Davlet Dzhakishev dd37f4ee1f fix(azure): update flow log compliance text for NSG retirement (#10937) 2026-04-29 16:45:58 +02:00
Pepe Fagoaga 20f36f7c84 chore: changelog v5.25.1 (#10934) 2026-04-29 14:00:53 +02:00
Pablo Fernandez Guerra (PFE) ec4d27746f fix(ui): reposition compliance card export menu (#10918)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-29 13:52:36 +02:00
Andoni Alonso 7076900fb1 fix(kubernetes): use cluster name as provider_uid in OCSF output (#10483)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-29 13:45:49 +02:00
Josema Camacho 5d90352a0f fix(api): redirect scan report and compliance downloads to presigned S3 URLs (#10927) 2026-04-29 13:19:19 +02:00
Hugo P.Brito 67472fa378 chore: merge master into CVE link PR 2026-04-29 11:33:09 +01:00
Hugo P.Brito a22db1172a docs(sdk): add CVE link changelog entry 2026-04-29 11:31:05 +01:00
Hugo P.Brito 9b50dbceb3 fix(ui): label remediation links by destination
- Use recommendation URLs as the single CTA source

- Keep Prowler Hub and CVE labels distinct

- Assert official CVE references are rendered without Aqua URLs
2026-04-29 11:30:35 +01:00
Hugo P.Brito 9645544552 fix(sdk): use official CVE URLs for Trivy findings
- Resolve CVE recommendations from Trivy references

- Remove Aqua advisory URLs from provider metadata

- Preserve Prowler Hub remediation links for IaC checks

- Cover CVE fallback and non-CVE advisory cases
2026-04-29 11:29:01 +01:00
Hugo Pereira Brito a981dc64a7 docs(sdk): link route53 changelog entry to PR (#10928) 2026-04-29 12:24:27 +02:00
Josema Camacho d2086cad3f fix(api): Attack Paths AWS region fallback and stale SCHEDULED cleanup (#10917) 2026-04-29 12:20:43 +02:00
Hugo Pereira Brito 380b89cfb6 fix(sdk): cover CNAME → dangling S3 in route53 takeover check (#10920) 2026-04-29 11:14:33 +01:00
Pablo Fernandez Guerra (PFE) 13b04d339b test(ui): add E2E tests for invitation accept smart router (#10814)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-04-29 10:27:30 +02:00
Pepe Fagoaga be3c5fb3c1 fix(cli): generate compliance after scan (#10919) 2026-04-28 17:18:30 +02:00
Davlet Dzhakishev 1de01bcb78 fix(azure): tighten flow log workspace checks (#10645)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-28 16:57:04 +02:00
baggers27 13d983450c fix(azure): broken link for minimum TLS version (#10916) 2026-04-28 14:23:00 +02:00
Daniel Barranquero 8b368e1343 feat(aws): add bedrock_guardrails_configured security check (#10844) 2026-04-28 14:16:19 +02:00
Prowler Bot c76a9baa20 chore(ui): Bump version to v5.26.0 (#10912)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-28 12:35:54 +02:00
Prowler Bot 30e2813e02 chore(docs): Bump version to v5.25.0 (#10909)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-28 12:35:32 +02:00
Prowler Bot 0f874c6ffd chore(sdk): Bump version to v5.26.0 (#10910)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-28 12:35:21 +02:00
Prowler Bot 2242689295 chore(api): Bump version to v1.27.0 (#10913)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-28 12:34:43 +02:00
Hugo Pereira Brito e252058af4 fix(m365): exclude guest users from entra_users_mfa_capable (#10785) 2026-04-28 08:58:16 +01:00
Pepe Fagoaga 37e6c9761f chore: changelog for v5.25.0 (#10900) 2026-04-28 08:47:20 +02:00
Pepe Fagoaga ebe666bec7 chore(boto3): configure user agent extra via env (#10904) 2026-04-28 08:01:11 +02:00
Pepe Fagoaga 7df2703db1 fix(aws): get organization's metadata with assumed role (#10894) 2026-04-27 22:15:11 +01:00
Kay Agahd 67234210ba feat(aws): add check secretsmanager_has_restrictive_resource_policy (#6985) 2026-04-27 21:49:34 +01:00
Josema Camacho 15ca69942d fix(api): align get_compliance_frameworks with Compliance.get_bulk (#10903) 2026-04-27 18:10:08 +02:00
Adrián Peña df76efc197 fix(api): skip null service/region in scan summary aggregation (#10902) 2026-04-27 17:46:46 +02:00
Hugo Pereira Brito 3441ad7f70 fix(sdk): align googleworkspace finding resources (#10901) 2026-04-27 15:17:29 +01:00
Hugo Pereira Brito 059b71d34b feat(ui): add View Resource action to findings drawer (#10847) 2026-04-27 13:19:18 +01:00
lydiavilchez 013809919c feat(googleworkspace): add Gmail service with first batch of checks (#10683) 2026-04-27 13:49:07 +02:00
Daniel Barranquero 368d9c1519 fix(admincenter): restrict admincenter group visibility check to Unified groups (#10899) 2026-04-27 13:23:03 +02:00
Hugo P.Brito e970a26ad8 Merge remote-tracking branch 'origin/master' into pr-10853-conflicts
# Conflicts:
#	ui/components/findings/table/resource-detail-drawer/resource-detail-drawer-content.tsx
2026-04-27 11:06:53 +01:00
Adrián Peña fb6da427f8 fix(api): prevent /tmp saturation from compliance report generation (#10874) 2026-04-27 11:05:34 +02:00
Adrián Peña 65fd3335d3 fix(api): reaggregate resource inventory and attack surface after muting findings (#10843) 2026-04-27 11:03:28 +02:00
César Arroba d6288be472 chore(ci): align sdk-bump-version PR titles with other bump workflows (#10897) 2026-04-27 10:20:56 +02:00
César Arroba 0cddb71d1c fix(ci): simplify docs-bump-version to a single master-only PR (#10896) 2026-04-27 10:20:47 +02:00
Andoni Alonso af2930130c fix(check): break circular import between config and check.utils (#10895) 2026-04-27 10:11:50 +02:00
Andoni Alonso b668770480 feat(github): add zizmor GitHub Actions scanning as a service of the GitHub provider (#10607) 2026-04-27 08:55:07 +02:00
Andoni Alonso f31c5717e9 chore(devex): add worktrunk worktree bootstrap config (#10867) 2026-04-27 08:45:04 +02:00
Alejandro Bailo 4788dcade2 fix(ui): polish shared table pagination and provider spacing (#10891) 2026-04-24 15:40:40 +02:00
Alejandro Bailo 22a6cc9e73 fix(ui): align resources filters and resource drawer behavior (#10861) 2026-04-24 15:03:47 +02:00
Pablo Fernandez Guerra (PFE) 06bb382f8e chore(ui): add knip for dead code detection (#10654)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-24 14:45:59 +02:00
Pedro Martín d4ece2b43e feat(sdk): add multi-provider compliance framework JSONs (#10300)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-04-24 13:27:31 +02:00
César Arroba b97d68fbd5 fix(ci): also gate cache-dependency-path on enable-cache in setup-python-poetry (#10885) 2026-04-24 12:38:13 +02:00
César Arroba ca79300440 fix(ci): poetry cache post-step failure on release workflows (#10881) 2026-04-24 11:57:30 +02:00
Pepe Fagoaga 7a0e107617 chore(api): changelog for v5.24.4 (#10882) 2026-04-24 11:57:02 +02:00
César Arroba 6d3fcec5da ci: bump docs version against master on patch releases (#10879) 2026-04-24 11:49:14 +02:00
César Arroba ce1cf51d37 fix(ci): allow github.com egress in backport workflow (#10876) 2026-04-24 10:00:55 +02:00
Pablo Fernandez Guerra (PFE) 3554859a5c fix(ui): load every Attack Paths scan before displaying the selector (#10864)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-24 09:41:47 +02:00
Daniel Barranquero 80d62f355f fix(alibabacloud): fix CS service SDK compatibility and harden Alibaba provider (#10871) 2026-04-24 09:26:09 +02:00
Josema Camacho 0df24eeff6 fix(api): make Neo4j connection acquisition timeout configurable and enable Sentry tracing (#10873) 2026-04-23 17:52:14 +02:00
Alejandro Bailo d1fc482832 feat(ui): improve Mutelist UX and mute modal (#10846) 2026-04-23 17:36:32 +02:00
Andoni Alonso ffb1bb89e1 feat(ci): add official Prowler GitHub Action (#10872) 2026-04-23 16:15:13 +02:00
Alejandro Bailo d877bea0e3 chore(ui): unify filter search and batch patterns (#10859) 2026-04-23 16:03:33 +02:00
Pedro Martín 2304bf0093 feat(compliance): add CIS pdf reporting (#10650) 2026-04-23 13:28:30 +02:00
Pepe Fagoaga 2ca74102a9 chore(poetry): lock poetry with 2.3.4 and install git as required (#10868) 2026-04-23 12:30:14 +02:00
Hugo P.Brito 5da90f9ea3 refactor(ui): drop status-extended linkification, keep CVE action only 2026-04-23 11:29:27 +01:00
Pablo Fernandez Guerra (PFE) 6ae129fcc0 chore: remove unused submodule (#10869)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-04-23 12:13:35 +02:00
Hugo P.Brito e06a682cef fix(ui): show CVE action in remediation drawer
- Use external CVE references when a finding has no Hub recommendation
- Keep the remediation CTA slot as View in Prowler Hub or View CVE
- Cover the CVE drawer behavior with resource detail tests
2026-04-23 11:12:11 +01:00
Hugo P.Brito dd45eeea48 Merge remote-tracking branch 'origin/master' into fix/ui-cve-fix-available-links 2026-04-23 11:08:56 +01:00
Alejandro Bailo e9731f53ad chore(ui): reorganize changelog and open 1.24.4 section (#10866) 2026-04-23 11:22:32 +02:00
Pablo Fernandez Guerra (PFE) db2f92e6d5 chore: add prowler-openspec-opensource as git submodule (#10680)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-23 10:52:17 +02:00
Alejandro Bailo f4b0f8fa22 fix(ui): prevent rescheduling scans during credential update (#10851) 2026-04-23 09:45:16 +02:00
Pedro Martín dff5541e11 fix(ci): improve compliance check action (#10850) 2026-04-22 16:31:05 +02:00
Hugo P.Brito ca7bba28c4 docs(ui): link changelog entry to PR 2026-04-22 15:28:09 +01:00
Hugo P.Brito 4205ebda04 fix(ui): link CVE fix-available versions
- Link fix available versions in finding details for external CVE advisories
- Keep Prowler Hub-backed checks on the existing remediation path
- Cover the drawer behavior with focused UI tests
2026-04-22 15:26:03 +01:00
Mathisdjango 927be17fb7 feat(github): add check for dismissing stale PR approvals on default branch (CIS 1.1.4) (#10569)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-22 16:14:10 +02:00
Pepe Fagoaga c27cb28a2a chore(safety): define policy for high and critical (#10845) 2026-04-22 13:28:59 +02:00
Pepe Fagoaga 94ee24071a refactor: unify filtering and sorting for finding (#10803)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-22 13:11:50 +02:00
Josema Camacho 1093f6c99b fix(api): merge Attack Paths findings on short UIDs for AWS resources (#10839) 2026-04-22 12:19:03 +02:00
Hugo Pereira Brito 48060c47ba fix(ui): improve Resource Inventory cards light mode (#10757)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-22 12:05:09 +02:00
Pedro Martín 72acc2119d fix(aws): disallow me-south-1 & me-central-1 to avoid stuck scans (#10837) 2026-04-22 11:16:41 +02:00
Rubén De la Torre Vico b1ebea4a7e chore(pre-commit): scope hooks per monorepo component (#10815) 2026-04-22 11:04:31 +02:00
dependabot[bot] 001057644e chore(deps): bump pyasn1 from 0.6.2 to 0.6.3 (#10365)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-04-22 10:53:39 +02:00
Adrián Peña 1456def7d4 fix(api): reaggregate overview summaries after muting findings (#10827) 2026-04-22 10:44:21 +02:00
dependabot[bot] 12d475e7af chore(deps-dev): bump pygments from 2.19.2 to 2.20.0 (#10521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2026-04-22 10:09:06 +02:00
Andoni Alonso 43bd1083e0 feat(sdk): add SARIF output format for IaC provider (#10626)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-22 09:32:20 +02:00
dependabot[bot] bbd4ce7565 chore(deps): bump pygments from 2.19.2 to 2.20.0 in /mcp_server (#10523)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-22 09:31:04 +02:00
Davidm4r 97a085bf21 feat(ui): Add user expulsion from tenants with JWT authentication fix (#10787)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Adrián Peña <adrianjpr@gmail.com>
2026-04-22 09:28:39 +02:00
Pablo Fernandez Guerra (PFE) 29a2f8fac8 chore: remove legacy ui-checks hook from root pre-commit config (#10834)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-04-22 09:18:39 +02:00
Pedro Martín a24869fc26 feat(sdk): add universal compliance output modules (CSV, OCSF, table) (#10299) 2026-04-22 09:01:45 +02:00
dependabot[bot] 72c94db1cf chore(deps): bump pygments from 2.19.2 to 2.20.0 in /api (#10522)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-22 08:59:21 +02:00
Pepe Fagoaga 4ef7bbdb7c docs: how to configure AWS SDK Default for IAM Role authentication (#10807) 2026-04-21 18:46:18 +02:00
Pepe Fagoaga f2c5d2ec87 fix(aws): fallback lookup events to resource name (#10828) 2026-04-21 18:31:50 +02:00
Adrián Peña 61a62fd6e0 fix(api): treat muted findings as resolved in finding-groups status (#10825) 2026-04-21 17:31:44 +02:00
Raajhesh Kannaa Chidambaram 39911e3ab7 feat(github): add --repo-list-file flag for GitHub scanning (#10501)
Co-authored-by: Raajhesh Kannaa Chidambaram <495042+raajheshkannaa@users.noreply.github.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-21 15:31:34 +02:00
Alejandro Bailo bcce8d6236 fix(ui): centralize default muted findings filter on finding groups (#10818)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-21 14:26:51 +02:00
Pablo Fernandez Guerra (PFE) 570c86948e chore: prek workspace for UI + builtin hooks + parallel execution (#10651)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-21 13:26:07 +02:00
Adrián Peña 548389d79f perf(api): speed up finding-groups /resources endpoint (#10816) 2026-04-21 12:53:59 +02:00
Alejandro Bailo fc3066bc60 refactor(ui): redesign compliance page layout and components (#10767) 2026-04-21 12:48:57 +02:00
Pedro Martín ac6dd03fb8 feat(sdk): add universal compliance schema models and loaders (#10298) 2026-04-21 11:39:04 +02:00
Javier Grau d3a1df3473 chore(skills): centralize AI assistant config via symlinks (#9951)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-21 09:29:42 +02:00
César Arroba 858dfc2a00 fix(ci): remove broken resolved_reference step from setup-python-poetry (#10687) 2026-04-21 08:58:24 +02:00
Pepe Fagoaga 6b0ba79652 fix(changelog): relocate entries for the SDK (#10812) 2026-04-21 08:17:14 +02:00
Pablo Fernandez Guerra (PFE) 390bbdd1a6 refactor(ui): remove backward-compat redirect for legacy invitation links (#10797)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-21 08:11:51 +02:00
Pepe Fagoaga 8d48c26c1e chore(secrets): don't block for trufflehog (#10806) 2026-04-20 17:57:32 +02:00
Boon 98b9449e14 feat: add nginx reverse proxy configuration (#8516) (#10780)
Co-authored-by: Boon <boon@security8.work>
2026-04-20 17:30:21 +02:00
Pedro Martín 3406c5ec64 chore(skills): improve prowler-compliance (#10627) 2026-04-20 17:22:05 +02:00
Adrián Peña 4346401a0a fix(api): align latest_resources scan selection with completed_at (#10802) 2026-04-20 17:16:01 +02:00
dependabot[bot] dcec79d259 chore(deps): bump pyasn1 from 0.6.2 to 0.6.3 in /api (#10366) 2026-04-20 16:43:19 +02:00
Pepe Fagoaga 2a9c538aff chore: review changelog for v5.24.1 (#10791) 2026-04-20 14:01:29 +02:00
Pepe Fagoaga bf1b53bbd2 fix(ui): sorting and filtering for findings (#10778)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-20 13:34:31 +02:00
César Arroba 94a2ea1e8f chore: update CODEOWNERS for new team hierarchy (#10706) 2026-04-20 11:39:00 +02:00
Daniel Barranquero f7194b32de docs: remove prowler ctf page (#10782) 2026-04-20 09:37:30 +02:00
Pedro Martín 6ffe4e95bf fix(api): detect silent failures in ResourceFindingMapping (#10724)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-20 09:00:43 +02:00
Alan Buscaglia 577aa14acc fix(ui): correct IaC findings counters (#10736)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-17 12:48:57 +02:00
Andoni Alonso 19c752c127 fix(cloudflare): guard validate_credentials against paginator infinite loops (#10771) 2026-04-17 11:23:31 +02:00
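A common shape for this bug class: the API keeps handing back the same page cursor, so the pagination loop never terminates. A defensive sketch of the kind of guard the title implies; `fetch_page` and the cursor protocol are hypothetical, not Prowler's actual Cloudflare client:

```python
def iterate_pages(fetch_page, max_pages: int = 100):
    """Yield result pages while guarding against a cursor that never advances."""
    seen_cursors = set()
    cursor = None
    for _ in range(max_pages):  # hard cap as a second line of defense
        items, cursor = fetch_page(cursor)
        yield items
        if cursor is None or cursor in seen_cursors:
            break  # finished, or the API repeated a cursor: stop instead of spinning
        seen_cursors.add(cursor)
```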
Alejandro Bailo f2d35f5885 fix(ui): exclude muted findings and polish filter selectors (#10734) 2026-04-17 11:07:22 +02:00
Josema Camacho 536e90f2a5 perf(attack-paths): cleanup task prioritization, restore default batch sizes to 1000, upgrade Cartography to 0.135.0 (#10729) 2026-04-17 10:22:30 +02:00
Daniel Barranquero 276a5d66bd feat(docs): add ctf documentation (#10761) 2026-04-16 19:35:52 +02:00
Alejandro Bailo 489c6c1073 fix: CHANGELOG minor issue (#10758) 2026-04-16 17:07:22 +02:00
Adrián Peña b08b072288 fix(api): exclude muted findings from pass_count, fail_count and manual_count (#10753) 2026-04-16 15:56:08 +02:00
Josema Camacho ca29e354b6 chore(deps): bump msgraph-sdk to 1.55.0 and azure-mgmt-resource to 24.0.0, remove marshmallow (#10733) 2026-04-16 15:34:28 +02:00
Alejandro Bailo 85a3927950 fix(ui): upgrade React 19.2.5 and Next.js 16.2.3 to mitigate CVE-2026-23869 (#10752) 2026-04-16 15:24:10 +02:00
Rubén De la Torre Vico 04fe3f65e0 chore(deps): enable Dependabot pre-commit ecosystem and bump hooks (#10732) 2026-04-16 13:38:11 +02:00
Andoni Alonso 297c9d0734 fix(sdk): move #10726 changelog entry to unreleased version (#10728) 2026-04-16 13:10:00 +02:00
Erich Blume a2a1a73749 fix(image): --registry-list crashes with AttributeError on global_provider (#10691)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-16 13:02:25 +02:00
lydiavilchez 08fbe17e29 fix(googleworkspace): treat secure Google defaults as PASS for Drive checks (#10727) 2026-04-16 13:01:55 +02:00
lydiavilchez d920f78059 fix(googleworkspace): treat secure Google defaults as PASS for Calendar checks (#10726) 2026-04-16 12:51:40 +02:00
Pepe Fagoaga 12bf3d5e70 fix(db): add missing tenant_id filter in queries (#10722) 2026-04-16 11:55:38 +02:00
Adrián Peña 4002c28b5d fix(api): add fallback handling for missing resources in findings (#10708) 2026-04-16 11:45:06 +02:00
Andoni Alonso 2439f54280 fix(sdk): allow account-scoped tokens in Cloudflare connection test (#10723) 2026-04-16 11:38:15 +02:00
Prowler Bot b0e59156e6 chore(ui): Bump version to v5.25.0 (#10711)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:14:46 +02:00
Prowler Bot f013bd4a53 docs: Update version to v5.24.0 (#10714)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:14:17 +02:00
Prowler Bot 6ad15f900f chore(release): Bump version to v5.25.0 (#10710)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:14:06 +02:00
Prowler Bot 1784bf38ab chore(api): Bump version to v1.26.0 (#10715)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:13:33 +02:00
Pepe Fagoaga ba5b23245f chore: review changelog for v5.24 (#10707) 2026-04-15 18:05:55 +02:00
Daniel Barranquero 43913b1592 feat(aws): support excluding regions from scans via CLI, env var, and config (#10688) 2026-04-15 17:59:46 +02:00
Alan Buscaglia 9e31160887 fix(ui): improve attack paths scan table UX and fix info banner variant (#10704)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-15 17:33:29 +02:00
Pepe Fagoaga 9a0c73256e chore: delete .opencode (#10702) 2026-04-15 15:10:40 +02:00
Alejandro Bailo 2a160a10df refactor(ui): remove legacy side drawers and clean code (#10692) 2026-04-15 13:55:57 +02:00
Alan Buscaglia 8d8bee165b feat(ui): improve attack paths scan selection UX (#10685) 2026-04-15 13:54:25 +02:00
Alan Buscaglia 606efec9f8 fix(ui): keep update credentials wizard open (#10675) 2026-04-15 13:50:20 +02:00
Alan Buscaglia d5354e8b1d feat(ui): add syntax highlighting to finding groups remediation code (#10698) 2026-04-15 12:58:35 +02:00
Rubén De la Torre Vico a96e5890dc docs: replace Excalidraw diagrams with Mermaid and fix architecture connections (#10697) 2026-04-15 12:51:29 +02:00
Pepe Fagoaga bb81c5dd2d docs: add contextual menu for copy and issue/feat (#10699) 2026-04-15 12:50:29 +02:00
Daniel Barranquero c3acb818d9 fix(vercel): handle team-scoped firewall config responses (#10695) 2026-04-15 11:59:20 +02:00
Andoni Alonso e6fc59267b docs: add Finding Groups documentation page (#10696)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-15 11:58:39 +02:00
Josema Camacho 62f114f5d0 refactor(api): remove dead cleanup_findings no-op from attack-paths module (#10684) 2026-04-15 09:16:38 +02:00
Pepe Fagoaga 392ffd5a60 fix(beat): make it dependent on the API service (#10603)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-04-14 18:35:26 +02:00
Alejandro Bailo 507b0882d5 fix(ui): fix findings group resource filters and mute modal migration (#10662) 2026-04-14 18:01:45 +02:00
Alejandro Bailo 89d72cf8fd feat(ui): new resources side drawer with redesigned detail panel (#10673) 2026-04-14 17:20:19 +02:00
Rubén De la Torre Vico f3a042933f chore(deps): replace pre-commit and husky with prek (#10601) 2026-04-14 16:34:54 +02:00
stepsecurity-app[bot] 96e7d6cb3a feat(security): security best practices from StepSecurity (#10682)
Signed-off-by: StepSecurity Bot <bot@stepsecurity.io>
Co-authored-by: stepsecurity-app[bot] <188008098+stepsecurity-app[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-14 15:13:12 +02:00
Hugo Pereira Brito a82eaa885d refactor(m365): normalize CA platforms at model level (#10635)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 15:00:23 +02:00
Hugo Pereira Brito 90a619a8b4 feat(m365): add entra_conditional_access_policy_block_unknown_device_platforms security check (#10615)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:32:37 +02:00
Hugo Pereira Brito 638bf62d76 feat(entra): directory sync account exclusion (#10620)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:16:32 +02:00
Pablo Fernandez Guerra (PFE) 962615ca1f chore(ui): bump serialize-javascript override from 7.0.3 to 7.0.5 (#10653)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 14:11:59 +02:00
Hugo Pereira Brito 5610f5ad90 feat(m365): add entra_conditional_access_policy_corporate_device_sign_in_frequency_enforced security check (#10618)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:10:00 +02:00
Pepe Fagoaga be6fe1db04 chore(security): bump pytest to 9.0.3 (#10678) 2026-04-14 13:59:30 +02:00
Hugo Pereira Brito 92b838866a feat(m365): add entra_conditional_access_policy_mfa_enforced_for_guest_users security check (#10616)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 13:45:12 +02:00
Josema Camacho 51591cb8cd build: bump poetry to 2.3.4 and consolidate SDK workflows (#10681) 2026-04-14 13:32:46 +02:00
Hugo Pereira Brito e24e1ab771 feat(m365): add exchange_organization_delicensing_resiliency_enabled security check (#10608) 2026-04-14 13:30:45 +02:00
Hugo Pereira Brito bc3fd79457 feat(intune): add device compliance policy marks noncompliant check (#10599) 2026-04-14 13:01:47 +02:00
Hugo Pereira Brito 4941ed5797 feat(entra): add new check entra_conditional_access_policy_all_apps_all_users (#10619)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 12:47:57 +02:00
Daniel Barranquero 0f4d8ff891 feat(aws): add bedrock_vpc_endpoints_configured security check (#10591) 2026-04-14 12:22:22 +02:00
Daniel Barranquero d1ab8b8ae5 feat(aws): add iam_policy_no_wildcard_marketplace_subscribe and iam_inline_policy_no_wildcard_marketplace_subscribe checks (#10525)
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-04-14 12:08:40 +02:00
Daniel Barranquero 65e9593b41 feat(aws): add bedrock_access_not_stale security check (#10536) 2026-04-14 11:20:40 +02:00
Daniel Barranquero 131112398b feat(aws): add bedrock_full_access_policy_attached security check (#10577) 2026-04-14 11:00:40 +02:00
Pedro Martín c952ea018e fix(ui): reflect actual provider in compliance detail header (#10674)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-04-14 10:22:42 +02:00
Pedro Martín 31b645ee53 chore(github): allow GitHub release CDN in trivy scan allowlist (#10679) 2026-04-14 10:09:54 +02:00
harshadkhetpal 0123e603d8 fix: replace bare except with except Exception in prowler-wrapper (#10499) 2026-04-14 08:11:53 +02:00
Prowler Bot b65265da4b feat(aws): Update regions for AWS services (#10659)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-14 08:03:14 +02:00
Prowler Bot 1335332fe9 chore(api): Bump version to v1.25.0 (#10668)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:59 +02:00
Prowler Bot f37a2a1efe chore(release): Bump version to v5.24.0 (#10666)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:54 +02:00
Prowler Bot 3e0e1398c4 docs: Update version to v5.23.0 (#10667)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:13 +02:00
Prowler Bot a4ad9ba01f chore(ui): Bump version to v5.24.0 (#10665)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:17:44 +02:00
Adrián Peña c6d5f44c5e chore: update pyjwt (#10661) 2026-04-13 14:09:37 +02:00
Adrián Peña 5d24a41625 feat(api): add sort support for all finding group counter fields (#10655) 2026-04-13 13:34:35 +02:00
lydiavilchez e33825747f fix(googleworkspace): apply customer-level policy filter to Calendar service (#10658) 2026-04-13 11:26:35 +02:00
lydiavilchez d919d979dd feat(googleworkspace): add Drive and Docs service checks using Cloud Identity Policy API (#10648) 2026-04-13 10:48:24 +02:00
Pepe Fagoaga 6534faf678 chore: review changelog for v5.23 (#10631) 2026-04-13 08:59:07 +02:00
Alan Buscaglia 1aa91cf60f fix(ui): exclude service filter from finding group resources endpoint (#10652) 2026-04-10 14:06:47 +02:00
Josema Camacho dad84f0ee2 docs(attack-paths): replace basic query examples with graph traversal patterns (#10649) 2026-04-10 12:23:02 +02:00
Alejandro Bailo 0d7c5f6ac5 feat(ui): make finding group delta indicator status-filter aware (#10647) 2026-04-10 11:29:11 +02:00
Hugo Pereira Brito 431776bcfd docs(attack-paths): link custom queries to Prowler docs (#10640) 2026-04-10 10:17:45 +01:00
Alejandro Bailo 0e8080f09c fix(ui): findings groups fixes (#10633) 2026-04-10 10:44:10 +02:00
Adrián Peña e4b2950436 refactor(api): split finding-groups status from muted state (#10630) 2026-04-09 18:07:43 +02:00
Pablo Fernandez Guerra (PFE) 63174caf98 docs: add multi-tenant (organizations) management guide (#10638)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: David <david.copo@gmail.com>
2026-04-09 17:51:54 +02:00
Alejandro Bailo 4e508b69c9 fix(vercel): use canonical Hub URLs in check metadata (#10636) 2026-04-09 16:23:50 +02:00
Andoni Alonso 18cfb191f5 docs: rename Prowler App to Prowler Cloud in provider headers (#10634) 2026-04-09 15:58:35 +02:00
Avula Jeevan Yadav b898f257f1 feat(stepfunctions): add check for secrets in state machine definition (#10570)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-09 15:56:29 +02:00
Hugo Pereira Brito cccb3a4b94 chore(sdk,mcp): pin direct dependencies to exact versions (#10593)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-04-09 14:42:49 +01:00
Daniel Barranquero ca50b24d77 docs: add Vercel Cloud getting started (#10609) 2026-04-09 15:40:44 +02:00
mintlify[bot] 7eb204fff0 docs: classify supported providers by category on main page (#10621)
Co-authored-by: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-09 15:39:43 +02:00
Pedro Martín 56c370d3a4 chore(ccc): update with latest version and improve mapping (#10625) 2026-04-09 15:27:18 +02:00
Pedro Martín b0d8534907 feat(api): add needed changes for GoogleWorkspace compliance (#10629) 2026-04-09 14:36:55 +02:00
dependabot[bot] ad36938717 chore(deps): bump actions/download-artifact from 6.0.0 to 8.0.1 (#10541)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:25:14 +02:00
dependabot[bot] 10dd9460e9 chore(deps): bump azure/setup-helm from 4.3.0 to 5.0.0 (#10543)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:24:42 +02:00
dependabot[bot] c8d41745dd chore(deps): bump softprops/action-gh-release from 2.5.0 to 2.6.1 (#10544)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:23:44 +02:00
dependabot[bot] c6c000a369 chore(deps): bump actions/setup-node from 6.2.0 to 6.3.0 (#10545)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:23:18 +02:00
dependabot[bot] a2b083e8c8 chore(deps): bump actions/cache from 5.0.3 to 5.0.4 (#10546)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:22:58 +02:00
dependabot[bot] d2f7169537 chore(deps): bump actions/checkout from 6.0.1 to 6.0.2 (#10548)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:22:26 +02:00
dependabot[bot] 632f2633c1 chore(deps): bump zizmorcore/zizmor-action from 0.5.0 to 0.5.2 (#10550)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:20:34 +02:00
dependabot[bot] 82d487a1e7 chore(deps): bump sorenlouv/backport-github-action from 10.2.0 to 11.0.0 (#10540)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:20:11 +02:00
dependabot[bot] 9a6a43637d chore(deps): bump pnpm/action-setup from 4.2.0 to 5.0.0 (#10551)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:50 +02:00
dependabot[bot] c21cf0ac20 chore(deps): bump tj-actions/changed-files from 47.0.4 to 47.0.5 (#10552)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:28 +02:00
dependabot[bot] f3b142c0cf chore(deps): bump docker/login-action from 3.7.0 to 4.0.0 (#10554)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:00 +02:00
dependabot[bot] eda90c4673 chore(deps): bump actions/upload-artifact from 6.0.0 to 7.0.0 (#10555)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:18:16 +02:00
dependabot[bot] def59a8cc2 chore(deps): bump docker/setup-buildx-action from 3.12.0 to 4.0.0 (#10556)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:16:00 +02:00
dependabot[bot] 1bfed74db5 chore(deps): bump docker/build-push-action from 6.19.2 to 7.0.0 (#10557)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:14:27 +02:00
Davidm4r baf1194824 feat(ui): invitation flow smart routing (#10589)
Co-authored-by: Pablo Fernandez Guerra (PFE) <148432447+pfe-nazaries@users.noreply.github.com>
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 10:11:52 +02:00
Alejandro Bailo b9270df3e6 feat(ui): improvements over findings groups feature (#10590) 2026-04-09 09:39:52 +02:00
dependabot[bot] 379df7800d chore(deps): bump aiohttp from 3.13.3 to 3.13.5 in /api (#10538)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-09 09:27:55 +02:00
dependabot[bot] fcabe1f99e chore(deps): bump aiohttp from 3.13.3 to 3.13.5 (#10537)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-09 08:57:16 +02:00
Davidm4r ad7a56d010 fix(ui): show active organization ID in profile page (#10617) 2026-04-09 08:51:39 +02:00
Pablo Fernandez Guerra (PFE) 406eedd68a chore(ui): unset GIT_WORK_TREE in pre-commit hook (#10574)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 14:27:12 +02:00
lydiavilchez bc38104903 feat(googleworkspace): add calendar service checks using Cloud Identity Policy API (#10597) 2026-04-08 13:26:56 +02:00
Andoni Alonso 9290d7e105 feat(sdk): warn when sensitive CLI flags receive explicit values (#10532) 2026-04-08 13:15:05 +02:00
lydiavilchez 72e8f09c07 feat(googleworkspace): add directory check for CIS 1.1.3 - super admin only admin roles (#10488)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-08 12:05:15 +02:00
Pepe Fagoaga 1d43885230 docs: update architecture diagram (#10604) 2026-04-08 11:05:28 +02:00
Adrián Peña e6aedcb207 feat(api): support sort by delta on finding-groups endpoints (#10606) 2026-04-08 11:04:57 +02:00
Kay Agahd 89fe867944 fix(aws): recognize service-specific condition keys as restrictive in is_policy_public (#10600)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 10:55:55 +02:00
Pepe Fagoaga 2be2753c55 fix(codeartifact): only retrieve the latest version from a package (#10243)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-04-08 09:21:19 +02:00
Josema Camacho 283259f34c fix(sdk): resolve empty-set bug in _enabled_regions causing 36-region client creation and CI timeouts (#10598) 2026-04-08 08:40:58 +02:00
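The title reads like the classic truthiness pitfall: an empty set of enabled regions is falsy, so an `or`-style fallback treats 'nothing enabled' the same as 'unknown' and fans out clients to all 36 regions. A hedged sketch of the pattern and its fix; the real `_enabled_regions` logic may differ:

```python
ALL_REGIONS = {"us-east-1", "eu-west-1"}  # stand-in for the full 36-region list

def regions_to_scan(enabled_regions: set[str] | None) -> set[str]:
    # Buggy form: `return enabled_regions or ALL_REGIONS` conflates an empty
    # set (nothing enabled) with None (unknown), creating every regional client.
    if enabled_regions is None:  # fall back only when truly unknown
        return ALL_REGIONS
    return enabled_regions       # an empty set stays empty

assert regions_to_scan(set()) == set()
assert regions_to_scan(None) == ALL_REGIONS
```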
Adrián Peña abaacd7dbf feat(api): finding group first_seen_at semantics and resource delta (#10595) 2026-04-07 16:41:08 +02:00
rchotacode 5e1e4bd8e4 fix(oci): Mutelist support (#10566)
Co-authored-by: Ronan Chota <ronan.chota@saic.com>
Co-authored-by: Hugo P.Brito <hugopbrito@users.noreply.github.com>
2026-04-07 13:23:51 +01:00
Davidm4r 33efd72b97 chore(deps): bump authlib from 1.6.5 to 1.6.9 (#10579)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-07 13:31:59 +02:00
Pepe Fagoaga b2788df8cc chore(issues): automate conversation lock on issue close (#10596) 2026-04-07 13:07:02 +02:00
Andoni Alonso b1b361af8b chore(ci): update Pablo user for labeling purposes (#10594) 2026-04-07 12:54:04 +02:00
Josema Camacho 8bc03f8d04 fix(api): remove clear_cache from attack paths read-only query endpoints (#10586) 2026-04-07 12:46:51 +02:00
Andoni Alonso ca03d9c0a9 docs: add Google Workspace SAML SSO configuration guide (#10564)
Co-authored-by: Alan Buscaglia <Alan-TheGentleman@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-04-07 12:03:21 +02:00
Kay Agahd 8985280621 fix(azure): create distinct report per key/secret in keyvault checks (#10332)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-04-07 09:36:48 +01:00
Pepe Fagoaga b7ee2b9690 chore: rename UI tab regarding the environment (#10588) 2026-04-07 10:30:01 +02:00
Alejandro Bailo 6b2d9b5580 feat(ui): add Vercel provider (#10191)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-07 10:13:18 +02:00
kaiisfree c99ed991b7 fix: show all checks including threat-detection in --list-checks (#10578)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: kaiisfree <kai@users.noreply.github.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-04-06 16:55:15 +01:00
Hugo Pereira Brito 7c0034524a fix(sdk): add missing __init__.py for codebuild GitHub orgs check (#10584) 2026-04-06 16:40:04 +01:00
Josema Camacho 749110de75 chore(sdk): bump cryptography to 46.0.6, oci to 2.169.0, and alibabacloud-tea-openapi to 0.4.4 (#10535) 2026-04-06 15:09:33 +02:00
1071 changed files with 105052 additions and 25876 deletions
+23
@@ -0,0 +1,23 @@
# Prowler worktree automation for worktrunk (wt CLI).
# Runs automatically on `wt switch --create`.
# Block 1: setup + copy gitignored env files (.envrc, ui/.env.local)
# from the primary worktree — patterns selected via .worktreeinclude.
[[pre-start]]
skills = "./skills/setup.sh --claude"
python = "poetry env use python3.12"
envs = "wt step copy-ignored"
# Block 2: install Python deps (requires `poetry env use` from block 1).
[[pre-start]]
deps = "poetry install --with dev"
# Block 3: reminder — last visible output before `wt switch` returns.
# Hooks can't mutate the parent shell, so venv activation is manual.
[[pre-start]]
reminder = "echo '>> Reminder: activate the venv in this shell with: eval $(poetry env activate)'"
# Background: pnpm install runs while you start working.
# Tail logs via `wt config state logs`.
[post-start]
ui = "cd ui && pnpm install"
+1 -1
@@ -145,7 +145,7 @@ SENTRY_RELEASE=local
NEXT_PUBLIC_SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT}
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.16.0
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.26.0
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
+12 -11
@@ -1,14 +1,15 @@
# SDK
/* @prowler-cloud/sdk
/prowler/ @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
/tests/ @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
/dashboard/ @prowler-cloud/sdk
/docs/ @prowler-cloud/sdk
/examples/ @prowler-cloud/sdk
/util/ @prowler-cloud/sdk
/contrib/ @prowler-cloud/sdk
/permissions/ @prowler-cloud/sdk
/codecov.yml @prowler-cloud/sdk @prowler-cloud/api
/* @prowler-cloud/detection-remediation
/prowler/ @prowler-cloud/detection-remediation
/prowler/compliance/ @prowler-cloud/compliance
/tests/ @prowler-cloud/detection-remediation
/dashboard/ @prowler-cloud/detection-remediation
/docs/ @prowler-cloud/detection-remediation
/examples/ @prowler-cloud/detection-remediation
/util/ @prowler-cloud/detection-remediation
/contrib/ @prowler-cloud/detection-remediation
/permissions/ @prowler-cloud/detection-remediation
/codecov.yml @prowler-cloud/detection-remediation @prowler-cloud/api
# API
/api/ @prowler-cloud/api
@@ -17,7 +18,7 @@
/ui/ @prowler-cloud/ui
# AI
/mcp_server/ @prowler-cloud/ai
/mcp_server/ @prowler-cloud/detection-remediation
# Platform
/.github/ @prowler-cloud/platform
@@ -0,0 +1,143 @@
name: "🔎 New Check Request"
description: Request a new Prowler security check
title: "[New Check]: "
labels: ["feature-request", "status/needs-triage"]
body:
- type: checkboxes
id: search
attributes:
label: Existing check search
description: Confirm this check does not already exist before opening a new request.
options:
- label: I have searched existing issues, Prowler Hub, and the public roadmap, and this check does not already exist.
required: true
- type: markdown
attributes:
value: |
Use this form to describe the security condition that Prowler should evaluate.
The most useful inputs for [Prowler Studio](https://github.com/prowler-cloud/prowler-studio) are:
- What should be detected
- What PASS and FAIL mean
- Vendor docs, API references, SDK methods, CLI commands, or reference code
- type: dropdown
id: provider
attributes:
label: Provider
description: Cloud or platform this check targets.
options:
- AWS
- Azure
- GCP
- Kubernetes
- GitHub
- Microsoft 365
- OCI
- Alibaba Cloud
- Cloudflare
- MongoDB Atlas
- Google Workspace
- OpenStack
- Vercel
- NHN
- Other / New provider
validations:
required: true
- type: input
id: other_provider_name
attributes:
label: New provider name
description: Only fill this if you selected "Other / New provider" above.
placeholder: "NewProviderName"
validations:
required: false
- type: input
id: service_name
attributes:
label: Service or product area
description: Optional. Main service, product, or feature to audit.
placeholder: "s3, bedrock, entra, repository, apiserver"
validations:
required: false
- type: input
id: suggested_check_name
attributes:
label: Suggested check name
description: Optional. Use `snake_case` following `<service>_<resource>_<best_practice>`, with lowercase letters and underscores only.
placeholder: "bedrock_guardrail_sensitive_information_filter_enabled"
validations:
required: false
- type: textarea
id: context
attributes:
label: Context and goal
description: Describe the security problem, why it matters, and what this new check should help detect.
placeholder: |-
- Security condition to validate:
- Why it matters:
- Resource, feature, or configuration involved:
validations:
required: true
- type: textarea
id: expected_behavior
attributes:
label: Expected behavior
description: Explain what the check should evaluate and what PASS, FAIL, or MANUAL should mean.
placeholder: |-
- Resource or scope to evaluate:
- PASS when:
- FAIL when:
- MANUAL when (if applicable):
- Exclusions, thresholds, or edge cases:
validations:
required: true
- type: textarea
id: references
attributes:
label: References
description: Add vendor docs, API references, SDK methods, CLI commands, endpoint docs, sample payloads, or similar reference material.
placeholder: |-
- Product or service documentation:
- API or SDK reference:
- CLI command or endpoint documentation:
- Sample payload or response:
- Security advisory or benchmark:
validations:
required: true
- type: dropdown
id: severity
attributes:
label: Suggested severity
description: Your best estimate. Reviewers will confirm during triage.
options:
- Critical
- High
- Medium
- Low
- Informational
- Not sure
validations:
required: true
- type: textarea
id: implementation_notes
attributes:
label: Additional implementation notes
description: Optional. Add permissions, unsupported regions, config knobs, product limitations, or anything else that may affect implementation.
placeholder: |-
- Required permissions or scopes:
- Region, tenant, or subscription limitations:
- Configurable behavior or thresholds:
- Other constraints:
validations:
required: false
+14 -17
@@ -13,11 +13,19 @@ inputs:
poetry-version:
description: 'Poetry version to install'
required: false
default: '2.1.1'
default: '2.3.4'
install-dependencies:
description: 'Install Python dependencies with Poetry'
required: false
default: 'true'
update-lock:
description: 'Run `poetry lock` during setup. Only enable when a prior step mutates pyproject.toml (e.g. API `@master` VCS rewrite). Default: false.'
required: false
default: 'false'
enable-cache:
description: 'Whether to enable Poetry dependency caching via actions/setup-python'
required: false
default: 'true'
runs:
using: 'composite'
@@ -60,21 +68,8 @@ runs:
echo "Updated resolved_reference:"
grep -A2 -B2 "resolved_reference" poetry.lock
- name: Update SDK resolved_reference to latest commit (prowler repo on push)
if: github.event_name == 'push' && github.ref == 'refs/heads/master' && github.repository == 'prowler-cloud/prowler'
shell: bash
working-directory: ${{ inputs.working-directory }}
run: |
LATEST_COMMIT=$(curl -s "https://api.github.com/repos/prowler-cloud/prowler/commits/master" | jq -r '.sha')
echo "Latest commit hash: $LATEST_COMMIT"
sed -i '/url = "https:\/\/github\.com\/prowler-cloud\/prowler\.git"/,/resolved_reference = / {
s/resolved_reference = "[a-f0-9]\{40\}"/resolved_reference = "'"$LATEST_COMMIT"'"/
}' poetry.lock
echo "Updated resolved_reference:"
grep -A2 -B2 "resolved_reference" poetry.lock
- name: Update poetry.lock (prowler repo only)
if: github.repository == 'prowler-cloud/prowler'
if: github.repository == 'prowler-cloud/prowler' && inputs.update-lock == 'true'
shell: bash
working-directory: ${{ inputs.working-directory }}
run: poetry lock
@@ -83,8 +78,10 @@ runs:
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ inputs.python-version }}
cache: 'poetry'
cache-dependency-path: ${{ inputs.working-directory }}/poetry.lock
# Disable cache when callers skip dependency install: Poetry 2.3.4 creates
# the venv in a path setup-python can't hash, breaking the post-step save-cache.
cache: ${{ inputs.enable-cache == 'true' && 'poetry' || '' }}
cache-dependency-path: ${{ inputs.enable-cache == 'true' && format('{0}/poetry.lock', inputs.working-directory) || '' }}
- name: Install Python dependencies
if: inputs.install-dependencies == 'true'
+12
@@ -66,6 +66,18 @@ updates:
cooldown:
default-days: 7
- package-ecosystem: "pre-commit"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
- "pre-commit"
cooldown:
default-days: 7
# Dependabot Updates are temporary disabled - 2025/04/15
# v4.6
# - package-ecosystem: "pip"
+2
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
+14 -1
@@ -5,10 +5,20 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/workflows/api-code-quality.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/workflows/api-code-quality.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -17,6 +27,8 @@ concurrency:
env:
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-code-quality:
runs-on: ubuntu-latest
@@ -50,7 +62,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -67,6 +79,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Poetry check
if: steps.check-changes.outputs.any_changed == 'true'
+2
@@ -24,6 +24,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
api-analyze:
name: CodeQL Security Analysis
@@ -33,6 +33,8 @@ env:
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-api
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -137,18 +139,18 @@ jobs:
sed -i "s|prowler-cloud/prowler.git@master|prowler-cloud/prowler.git@${LATEST_SHA}|" api/pyproject.toml
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push API container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
@@ -156,7 +158,7 @@ jobs:
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ needs.setup.outputs.short-sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }},scope=${{ matrix.arch }}
# Create and push multi-architecture manifest
create-manifest:
@@ -178,7 +180,7 @@ jobs:
auth.docker.io:443
production.cloudflare.docker.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
+19 -21
View File
@@ -5,10 +5,16 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-container-checks.yml'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-container-checks.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -18,6 +24,8 @@ env:
API_WORKING_DIR: ./api
IMAGE_NAME: prowler-api
permissions: {}
jobs:
api-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -42,7 +50,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: api/Dockerfile
@@ -55,16 +63,7 @@ jobs:
api-container-build-and-scan:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ${{ matrix.runner }}
strategy:
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
arch: amd64
- platform: linux/arm64
runner: ubuntu-24.04-arm
arch: arm64
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: read
@@ -104,7 +103,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: api/**
files_ignore: |
@@ -115,25 +114,24 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build container for ${{ matrix.arch }}
- name: Build container
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
load: true
platforms: ${{ matrix.platform }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}
cache-from: type=gha
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }}
- name: Scan container with Trivy for ${{ matrix.arch }}
- name: Scan container with Trivy
if: steps.check-changes.outputs.any_changed == 'true'
uses: ./.github/actions/trivy-scan
with:
image-name: ${{ env.IMAGE_NAME }}
image-tag: ${{ github.sha }}-${{ matrix.arch }}
image-tag: ${{ github.sha }}
fail-on-critical: 'false'
severity: 'CRITICAL'
+17 -4
View File
@@ -5,10 +5,20 @@ on:
branches:
- "master"
- "v5.*"
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/workflows/api-security.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- "master"
- "v5.*"
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/workflows/api-security.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -17,6 +27,8 @@ concurrency:
env:
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-security-scans:
runs-on: ubuntu-latest
@@ -53,11 +65,12 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
.github/workflows/api-security.yml
.safety-policy.yml
files_ignore: |
api/docs/**
api/README.md
@@ -70,6 +83,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Bandit
if: steps.check-changes.outputs.any_changed == 'true'
@@ -77,9 +91,8 @@ jobs:
- name: Safety
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run safety check --ignore 79023,79027,86217
# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
# TODO: 86217 because `alibabacloud-tea-openapi == 0.4.3` doesn't let us upgrade `cryptography >= 46.0.0`
# Accepted CVEs, severity threshold, and ignore expirations live in ../.safety-policy.yml
run: poetry run safety check --policy-file ../.safety-policy.yml
- name: Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+12 -1
View File
@@ -5,10 +5,18 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'api/**'
- '.github/workflows/api-tests.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -30,6 +38,8 @@ env:
VALKEY_DB: 0
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-tests:
runs-on: ubuntu-latest
@@ -99,7 +109,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -116,6 +126,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Run tests with pytest
if: steps.check-changes.outputs.any_changed == 'true'
+4 -1
View File
@@ -17,6 +17,8 @@ env:
BACKPORT_LABEL_PREFIX: backport-to-
BACKPORT_LABEL_IGNORE: was-backported
permissions: {}
jobs:
backport:
if: github.event.pull_request.merged == true && !(contains(github.event.pull_request.labels.*.name, 'backport')) && !(contains(github.event.pull_request.labels.*.name, 'was-backported'))
@@ -33,6 +35,7 @@ jobs:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
- name: Check labels
id: label_check
@@ -46,7 +49,7 @@ jobs:
- name: Backport PR
if: steps.label_check.outputs.label_check == 'success'
uses: sorenlouv/backport-github-action@516854e7c9f962b9939085c9a92ea28411d1ae90 # v10.2.0
uses: sorenlouv/backport-github-action@9460b7102fea25466026ce806c9ebf873ac48721 # v11.0.0
with:
github_token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
auto_backport_label_prefix: ${{ env.BACKPORT_LABEL_PREFIX }}
+3 -1
View File
@@ -21,6 +21,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
zizmor:
if: github.repository == 'prowler-cloud/prowler'
@@ -49,6 +51,6 @@ jobs:
persist-credentials: false
- name: Run zizmor
uses: zizmorcore/zizmor-action@0dce2577a4760a2749d8cfb7a84b7d5585ebcb7d # v0.5.0
uses: zizmorcore/zizmor-action@71321a20a9ded102f6e9ce5718a2fcec2c4f70d8 # v0.5.2
with:
token: ${{ github.token }}
@@ -9,6 +9,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.issue.number }}
cancel-in-progress: false
permissions: {}
jobs:
update-labels:
if: contains(github.event.issue.labels.*.name, 'status/awaiting-response')
+2 -2
View File
@@ -4,8 +4,6 @@ on:
pull_request:
branches:
- 'master'
- 'v3'
- 'v4.*'
- 'v5.*'
types:
- 'opened'
@@ -16,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
conventional-commit-check:
runs-on: ubuntu-latest
+2 -4
View File
@@ -13,6 +13,8 @@ env:
BACKPORT_LABEL_PREFIX: backport-to-
BACKPORT_LABEL_COLOR: B60205
permissions: {}
jobs:
create-label:
runs-on: ubuntu-latest
@@ -41,14 +43,11 @@ jobs:
echo "Processing release tag: $RELEASE_TAG"
# Remove 'v' prefix if present (e.g., v3.2.0 -> 3.2.0)
VERSION_ONLY="${RELEASE_TAG#v}"
# Check if it's a minor version (X.Y.0)
if [[ "$VERSION_ONLY" =~ ^([0-9]+)\.([0-9]+)\.0$ ]]; then
echo "Release $RELEASE_TAG (version $VERSION_ONLY) is a minor version. Proceeding to create backport label."
# Extract X.Y from X.Y.0 (e.g., 5.6 from 5.6.0)
MAJOR="${BASH_REMATCH[1]}"
MINOR="${BASH_REMATCH[2]}"
TWO_DIGIT_VERSION="${MAJOR}.${MINOR}"
@@ -60,7 +59,6 @@ jobs:
echo "Label name: $LABEL_NAME"
echo "Label description: $LABEL_DESC"
# Check if label already exists
if gh label list --repo ${{ github.repository }} --limit 1000 | grep -q "^${LABEL_NAME}[[:space:]]"; then
echo "Label '$LABEL_NAME' already exists."
else
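For reference, the minor-release gate above can be exercised outside Actions; a minimal bash sketch using the same regex and label prefix (the tag value is a hypothetical input):
#!/usr/bin/env bash
# Sketch of the backport-label gate above; RELEASE_TAG is a placeholder.
RELEASE_TAG="v5.6.0"
VERSION_ONLY="${RELEASE_TAG#v}"              # strip the 'v' prefix -> 5.6.0
if [[ "$VERSION_ONLY" =~ ^([0-9]+)\.([0-9]+)\.0$ ]]; then
  # X.Y.0 is a minor release; the label targets the X.Y line
  LABEL_NAME="backport-to-${BASH_REMATCH[1]}.${BASH_REMATCH[2]}"
  echo "minor release -> would create label ${LABEL_NAME}"
else
  echo "patch release -> no backport label"
fi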
+40 -225
View File
@@ -12,72 +12,12 @@ concurrency:
env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
DOCS_FILE: docs/getting-started/installation/prowler-app.mdx
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
outputs:
is_minor: ${{ steps.detect.outputs.is_minor }}
is_patch: ${{ steps.detect.outputs.is_patch }}
major_version: ${{ steps.detect.outputs.major_version }}
minor_version: ${{ steps.detect.outputs.minor_version }}
patch_version: ${{ steps.detect.outputs.patch_version }}
current_docs_version: ${{ steps.get_docs_version.outputs.current_docs_version }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Get current documentation version
id: get_docs_version
run: |
CURRENT_DOCS_VERSION=$(grep -oP 'PROWLER_UI_VERSION="\K[^"]+' docs/getting-started/installation/prowler-app.mdx)
echo "current_docs_version=${CURRENT_DOCS_VERSION}" >> "${GITHUB_OUTPUT}"
echo "Current documentation version: $CURRENT_DOCS_VERSION"
- name: Detect release type and parse version
id: detect
run: |
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
MAJOR_VERSION=${BASH_REMATCH[1]}
MINOR_VERSION=${BASH_REMATCH[2]}
PATCH_VERSION=${BASH_REMATCH[3]}
echo "major_version=${MAJOR_VERSION}" >> "${GITHUB_OUTPUT}"
echo "minor_version=${MINOR_VERSION}" >> "${GITHUB_OUTPUT}"
echo "patch_version=${PATCH_VERSION}" >> "${GITHUB_OUTPUT}"
if (( MAJOR_VERSION != 5 )); then
echo "::error::Releasing another Prowler major version, aborting..."
exit 1
fi
if (( PATCH_VERSION == 0 )); then
echo "is_minor=true" >> "${GITHUB_OUTPUT}"
echo "is_patch=false" >> "${GITHUB_OUTPUT}"
echo "✓ Minor release detected: $PROWLER_VERSION"
else
echo "is_minor=false" >> "${GITHUB_OUTPUT}"
echo "is_patch=true" >> "${GITHUB_OUTPUT}"
echo "✓ Patch release detected: $PROWLER_VERSION"
fi
else
echo "::error::Invalid version syntax: '$PROWLER_VERSION' (must be X.Y.Z)"
exit 1
fi
bump-minor-version:
needs: detect-release-type
if: needs.detect-release-type.outputs.is_minor == 'true'
bump-version:
runs-on: ubuntu-latest
timeout-minutes: 15
permissions:
@@ -89,185 +29,60 @@ jobs:
with:
egress-policy: audit
- name: Checkout repository
- name: Validate release version
run: |
if [[ ! $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
echo "::error::Invalid version syntax: '$PROWLER_VERSION' (must be X.Y.Z)"
exit 1
fi
if (( ${BASH_REMATCH[1]} != 5 )); then
echo "::error::Releasing another Prowler major version, aborting..."
exit 1
fi
- name: Checkout master branch
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ env.BASE_BRANCH }}
persist-credentials: false
- name: Calculate next minor version
- name: Read current docs version on master
id: docs_version
run: |
MAJOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION}
MINOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION}
CURRENT_DOCS_VERSION="${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION}"
NEXT_MINOR_VERSION=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).0
CURRENT_DOCS_VERSION=$(grep -oP 'PROWLER_UI_VERSION="\K[^"]+' "${DOCS_FILE}")
echo "CURRENT_DOCS_VERSION=${CURRENT_DOCS_VERSION}" >> "${GITHUB_ENV}"
echo "NEXT_MINOR_VERSION=${NEXT_MINOR_VERSION}" >> "${GITHUB_ENV}"
echo "Current docs version on master: $CURRENT_DOCS_VERSION"
echo "Target release version: $PROWLER_VERSION"
echo "Current documentation version: $CURRENT_DOCS_VERSION"
echo "Current release version: $PROWLER_VERSION"
echo "Next minor version: $NEXT_MINOR_VERSION"
env:
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION: ${{ needs.detect-release-type.outputs.major_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION: ${{ needs.detect-release-type.outputs.minor_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION: ${{ needs.detect-release-type.outputs.current_docs_version }}
# Skip if master is already at or ahead of the release version
# (re-run, or patch shipped against an older minor line)
HIGHEST=$(printf '%s\n%s\n' "${CURRENT_DOCS_VERSION}" "${PROWLER_VERSION}" | sort -V | tail -n1)
if [[ "${CURRENT_DOCS_VERSION}" == "${PROWLER_VERSION}" || "${HIGHEST}" != "${PROWLER_VERSION}" ]]; then
echo "skip=true" >> "${GITHUB_OUTPUT}"
echo "Skipping bump: current ($CURRENT_DOCS_VERSION) >= release ($PROWLER_VERSION)"
else
echo "skip=false" >> "${GITHUB_OUTPUT}"
fi
- name: Bump versions in documentation for master
- name: Bump versions in documentation
if: steps.docs_version.outputs.skip == 'false'
run: |
set -e
# Update prowler-app.mdx with current release version
sed -i "s|PROWLER_UI_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_UI_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
sed -i "s|PROWLER_API_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_API_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
sed -i "s|PROWLER_UI_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_UI_VERSION=\"${PROWLER_VERSION}\"|" "${DOCS_FILE}"
sed -i "s|PROWLER_API_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_API_VERSION=\"${PROWLER_VERSION}\"|" "${DOCS_FILE}"
echo "Files modified:"
git --no-pager diff
- name: Create PR for documentation update to master
if: steps.docs_version.outputs.skip == 'false'
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: master
commit-message: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
branch: docs-version-update-to-v${{ env.PROWLER_VERSION }}
title: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
Update Prowler documentation version references to v${{ env.PROWLER_VERSION }} after releasing Prowler v${{ env.PROWLER_VERSION }}.
### Files Updated
- `docs/getting-started/installation/prowler-app.mdx`: `PROWLER_UI_VERSION` and `PROWLER_API_VERSION`
- All `*.mdx` files with `<VersionBadge>` components
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: Checkout version branch
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: v${{ needs.detect-release-type.outputs.major_version }}.${{ needs.detect-release-type.outputs.minor_version }}
persist-credentials: false
- name: Calculate first patch version
run: |
MAJOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION}
MINOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION}
CURRENT_DOCS_VERSION="${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION}"
FIRST_PATCH_VERSION=${MAJOR_VERSION}.${MINOR_VERSION}.1
VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "CURRENT_DOCS_VERSION=${CURRENT_DOCS_VERSION}" >> "${GITHUB_ENV}"
echo "FIRST_PATCH_VERSION=${FIRST_PATCH_VERSION}" >> "${GITHUB_ENV}"
echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"
echo "First patch version: $FIRST_PATCH_VERSION"
echo "Version branch: $VERSION_BRANCH"
env:
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION: ${{ needs.detect-release-type.outputs.major_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION: ${{ needs.detect-release-type.outputs.minor_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION: ${{ needs.detect-release-type.outputs.current_docs_version }}
- name: Bump versions in documentation for version branch
run: |
set -e
# Update prowler-app.mdx with current release version
sed -i "s|PROWLER_UI_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_UI_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
sed -i "s|PROWLER_API_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_API_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
echo "Files modified:"
git --no-pager diff
- name: Create PR for documentation update to version branch
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
branch: docs-version-update-to-v${{ env.PROWLER_VERSION }}-branch
title: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
Update Prowler documentation version references to v${{ env.PROWLER_VERSION }} in version branch after releasing Prowler v${{ env.PROWLER_VERSION }}.
### Files Updated
- `docs/getting-started/installation/prowler-app.mdx`: `PROWLER_UI_VERSION` and `PROWLER_API_VERSION`
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
bump-patch-version:
needs: detect-release-type
if: needs.detect-release-type.outputs.is_patch == 'true'
runs-on: ubuntu-latest
timeout-minutes: 15
permissions:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Calculate next patch version
run: |
MAJOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION}
MINOR_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION}
PATCH_VERSION=${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_PATCH_VERSION}
CURRENT_DOCS_VERSION="${NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION}"
NEXT_PATCH_VERSION=${MAJOR_VERSION}.${MINOR_VERSION}.$((PATCH_VERSION + 1))
VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "CURRENT_DOCS_VERSION=${CURRENT_DOCS_VERSION}" >> "${GITHUB_ENV}"
echo "NEXT_PATCH_VERSION=${NEXT_PATCH_VERSION}" >> "${GITHUB_ENV}"
echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"
echo "Current documentation version: $CURRENT_DOCS_VERSION"
echo "Current release version: $PROWLER_VERSION"
echo "Next patch version: $NEXT_PATCH_VERSION"
echo "Target branch: $VERSION_BRANCH"
env:
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MAJOR_VERSION: ${{ needs.detect-release-type.outputs.major_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_MINOR_VERSION: ${{ needs.detect-release-type.outputs.minor_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_PATCH_VERSION: ${{ needs.detect-release-type.outputs.patch_version }}
NEEDS_DETECT_RELEASE_TYPE_OUTPUTS_CURRENT_DOCS_VERSION: ${{ needs.detect-release-type.outputs.current_docs_version }}
- name: Bump versions in documentation for patch version
run: |
set -e
# Update prowler-app.mdx with current release version
sed -i "s|PROWLER_UI_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_UI_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
sed -i "s|PROWLER_API_VERSION=\"${CURRENT_DOCS_VERSION}\"|PROWLER_API_VERSION=\"${PROWLER_VERSION}\"|" docs/getting-started/installation/prowler-app.mdx
echo "Files modified:"
git --no-pager diff
- name: Create PR for documentation update to version branch
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
branch: docs-version-update-to-v${{ env.PROWLER_VERSION }}
title: 'docs: Update version to v${{ env.PROWLER_VERSION }}'
base: ${{ env.BASE_BRANCH }}
commit-message: 'chore(docs): Bump version to v${{ env.PROWLER_VERSION }}'
branch: docs-version-bump-to-v${{ env.PROWLER_VERSION }}
title: 'chore(docs): Bump version to v${{ env.PROWLER_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
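The skip guard in the job above hinges on sort -V ordering; a minimal sketch of the same comparison, assuming hypothetical version values:
#!/usr/bin/env bash
# Sketch of the "docs already at or ahead of release" guard above.
CURRENT_DOCS_VERSION="5.25.2"   # placeholder for the grep -oP result
PROWLER_VERSION="5.25.1"        # placeholder release tag
HIGHEST=$(printf '%s\n%s\n' "${CURRENT_DOCS_VERSION}" "${PROWLER_VERSION}" | sort -V | tail -n1)
if [[ "${CURRENT_DOCS_VERSION}" == "${PROWLER_VERSION}" || "${HIGHEST}" != "${PROWLER_VERSION}" ]]; then
  echo "skip: docs (${CURRENT_DOCS_VERSION}) >= release (${PROWLER_VERSION})"
else
  echo "bump docs to ${PROWLER_VERSION}"
fi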
+14 -8
View File
@@ -14,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
scan-secrets:
runs-on: ubuntu-latest
@@ -25,19 +27,23 @@ jobs:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
ghcr.io:443
pkg-containers.githubusercontent.com:443
# We can't block, as TruffleHog needs to verify secrets against vendors
egress-policy: audit
# allowed-endpoints: >
# github.com:443
# ghcr.io:443
# pkg-containers.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
# PRs only need the diff range; push to master/release walks the new range from event.before.
# 50 is enough headroom for the longest realistic PR/push chain without paying for a full clone.
fetch-depth: 50
persist-credentials: false
- name: Scan for secrets with TruffleHog
- name: Scan diff for secrets with TruffleHog
# Action auto-injects --since-commit/--branch from event payload; passing them in extra_args produces duplicate flags.
uses: trufflesecurity/trufflehog@ef6e76c3c4023279497fab4721ffa071a722fd05 # v3.92.4
with:
extra_args: '--results=verified,unknown'
extra_args: --results=verified,unknown
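The injected diff-range behaviour can be approximated locally with the TruffleHog v3 CLI; a sketch, assuming master is the base branch (the action derives these flags from the event payload):
# Rough local equivalent of the scan step above (TruffleHog v3).
trufflehog git "file://$(pwd)" \
  --since-commit origin/master \
  --branch "$(git rev-parse --abbrev-ref HEAD)" \
  --results=verified,unknown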
+4 -2
View File
@@ -21,6 +21,8 @@ concurrency:
env:
CHART_PATH: contrib/k8s/helm/prowler-app
permissions: {}
jobs:
helm-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -36,12 +38,12 @@ jobs:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Helm
uses: azure/setup-helm@1a275c3b69536ee54be43f2070a358922e12c8d4 # v4.3.1
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
- name: Update chart dependencies
run: helm dependency update ${{ env.CHART_PATH }}
+4 -2
View File
@@ -13,6 +13,8 @@ concurrency:
env:
CHART_PATH: contrib/k8s/helm/prowler-app
permissions: {}
jobs:
release-helm-chart:
if: github.repository == 'prowler-cloud/prowler'
@@ -29,12 +31,12 @@ jobs:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Helm
uses: azure/setup-helm@b9e51907a09c216f16ebe8536097933489208112 # v4.3.0
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
- name: Set appVersion from release tag
run: |
+53
View File
@@ -0,0 +1,53 @@
name: 'Tools: Lock Issue on Close'
on:
issues:
types:
- closed
concurrency:
group: ${{ github.workflow }}-${{ github.event.issue.number }}
cancel-in-progress: false
permissions: {}
jobs:
lock:
if: |
github.repository == 'prowler-cloud/prowler' &&
github.event.issue.locked == false
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
issues: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Comment and lock issue
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const { owner, repo } = context.repo;
const issue_number = context.payload.issue.number;
try {
await github.rest.issues.createComment({
owner,
repo,
issue_number,
body: 'This issue is now locked as it has been closed. If you are still hitting a related problem, please open a new issue and link back to this one for context. Thanks!'
});
} catch (error) {
core.warning(`Failed to post lock comment on issue #${issue_number}: ${error.message}`);
}
const lockParams = { owner, repo, issue_number };
if (context.payload.issue.state_reason === 'completed') {
lockParams.lock_reason = 'resolved';
}
await github.rest.issues.lock(lockParams);
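The same comment-and-lock flow can be reproduced ad hoc with the gh CLI; a sketch with a placeholder issue number:
# Manual equivalent of the github-script step above (issue 1234 is hypothetical).
gh issue comment 1234 --body "This issue is now locked as it has been closed."
# PUT /repos/{owner}/{repo}/issues/{number}/lock; 'resolved' mirrors state_reason == completed.
gh api -X PUT "repos/${GITHUB_REPOSITORY}/issues/1234/lock" -f lock_reason=resolved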
+9 -9
View File
@@ -772,7 +772,7 @@ jobs:
SECRET_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload Safe Outputs
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: safe-output
path: ${{ env.GH_AW_SAFE_OUTPUTS }}
@@ -793,13 +793,13 @@ jobs:
await main();
- name: Upload sanitized agent output
if: always() && env.GH_AW_AGENT_OUTPUT
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent-output
path: ${{ env.GH_AW_AGENT_OUTPUT }}
if-no-files-found: warn
- name: Upload engine output files
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent_outputs
path: |
@@ -839,7 +839,7 @@ jobs:
- name: Upload agent artifacts
if: always()
continue-on-error: true
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent-artifacts
path: |
@@ -880,7 +880,7 @@ jobs:
destination: /opt/gh-aw/actions
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/safeoutputs/
@@ -992,13 +992,13 @@ jobs:
destination: /opt/gh-aw/actions
- name: Download agent artifacts
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-artifacts
path: /tmp/gh-aw/threat-detection/
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/threat-detection/
@@ -1071,7 +1071,7 @@ jobs:
await main();
- name: Upload threat detection log
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: threat-detection.log
path: /tmp/gh-aw/threat-detection/detection.log
@@ -1174,7 +1174,7 @@ jobs:
destination: /opt/gh-aw/actions
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/safeoutputs/
+4 -1
View File
@@ -15,6 +15,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
labeler:
runs-on: ubuntu-latest
@@ -60,7 +62,7 @@ jobs:
"Alan-TheGentleman"
"alejandrobailo"
"amitsharm"
"andoniaf"
# "andoniaf"
"cesararroba"
"danibarranqueroo"
"HugoPBrito"
@@ -76,6 +78,7 @@ jobs:
"StylusFrost"
"toniblyx"
"davidm4r"
"pfe-nazaries"
)
echo "Checking if $AUTHOR is a member of prowler-cloud organization"
@@ -32,6 +32,8 @@ env:
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-mcp
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -123,18 +125,18 @@ jobs:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push MCP container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
@@ -150,7 +152,7 @@ jobs:
org.opencontainers.image.created=${{ github.event_name == 'release' && github.event.release.published_at || github.event.head_commit.timestamp }}
${{ github.event_name == 'release' && format('org.opencontainers.image.version={0}', env.RELEASE_TAG) || '' }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }},scope=${{ matrix.arch }}
# Create and push multi-architecture manifest
create-manifest:
@@ -173,7 +175,7 @@ jobs:
release-assets.githubusercontent.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
+22 -21
View File
@@ -5,10 +5,16 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'mcp_server/**'
- '.github/workflows/mcp-container-checks.yml'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'mcp_server/**'
- '.github/workflows/mcp-container-checks.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -18,6 +24,8 @@ env:
MCP_WORKING_DIR: ./mcp_server
IMAGE_NAME: prowler-mcp
permissions: {}
jobs:
mcp-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -42,7 +50,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: mcp_server/Dockerfile
@@ -54,16 +62,7 @@ jobs:
mcp-container-build-and-scan:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ${{ matrix.runner }}
strategy:
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
arch: amd64
- platform: linux/arm64
runner: ubuntu-24.04-arm
arch: arm64
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: read
@@ -87,6 +86,9 @@ jobs:
api.github.com:443
mirror.gcr.io:443
check.trivy.dev:443
get.trivy.dev:443
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -96,7 +98,7 @@ jobs:
- name: Check for MCP changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: mcp_server/**
files_ignore: |
@@ -105,25 +107,24 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build MCP container for ${{ matrix.arch }}
- name: Build MCP container
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.MCP_WORKING_DIR }}
push: false
load: true
platforms: ${{ matrix.platform }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}
cache-from: type=gha
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }}
- name: Scan MCP container with Trivy for ${{ matrix.arch }}
- name: Scan MCP container with Trivy
if: steps.check-changes.outputs.any_changed == 'true'
uses: ./.github/actions/trivy-scan
with:
image-name: ${{ env.IMAGE_NAME }}
image-tag: ${{ github.sha }}-${{ matrix.arch }}
image-tag: ${{ github.sha }}
fail-on-critical: 'false'
severity: 'CRITICAL'
+23
View File
@@ -14,6 +14,8 @@ env:
PYTHON_VERSION: "3.12"
WORKING_DIRECTORY: ./mcp_server
permissions: {}
jobs:
validate-release:
if: github.repository == 'prowler-cloud/prowler'
@@ -84,11 +86,32 @@ jobs:
with:
python-version: ${{ env.PYTHON_VERSION }}
# The MCP server version (mcp_server/pyproject.toml) is decoupled from the Prowler release
# version: it only changes when MCP code changes. mcp-bump-version.yml normally keeps it in
# sync with mcp_server/CHANGELOG.md, but this publish workflow still runs on every release.
# The pre-flight PyPI check covers the legitimate "no MCP changes for this release" case (and any
# workflow_dispatch re-runs) without failing with HTTP 400 (version exists).
- name: Check if prowler-mcp version already exists on PyPI
id: pypi-check
working-directory: ${{ env.WORKING_DIRECTORY }}
run: |
MCP_VERSION=$(grep '^version' pyproject.toml | head -1 | sed -E 's/^version[[:space:]]*=[[:space:]]*"([^"]+)".*/\1/')
echo "mcp_version=${MCP_VERSION}" >> "$GITHUB_OUTPUT"
if curl -fsS "https://pypi.org/pypi/prowler-mcp/${MCP_VERSION}/json" >/dev/null 2>&1; then
echo "skip=true" >> "$GITHUB_OUTPUT"
echo "::notice title=Skipping prowler-mcp publish::Version ${MCP_VERSION} already exists on PyPI; bump mcp_server/pyproject.toml to publish a new release."
else
echo "skip=false" >> "$GITHUB_OUTPUT"
echo "::notice title=Publishing prowler-mcp::Version ${MCP_VERSION} not on PyPI yet; proceeding."
fi
- name: Build prowler-mcp package
if: steps.pypi-check.outputs.skip != 'true'
working-directory: ${{ env.WORKING_DIRECTORY }}
run: uv build
- name: Publish prowler-mcp package to PyPI
if: steps.pypi-check.outputs.skip != 'true'
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
with:
packages-dir: ${{ env.WORKING_DIRECTORY }}/dist/
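The pre-flight can also be run by hand before cutting a release; same endpoint and version extraction as the step above:
# Manual pre-flight from mcp_server/ (mirrors the pypi-check step above).
MCP_VERSION=$(grep '^version' pyproject.toml | head -1 | sed -E 's/^version[[:space:]]*=[[:space:]]*"([^"]+)".*/\1/')
if curl -fsS "https://pypi.org/pypi/prowler-mcp/${MCP_VERSION}/json" >/dev/null; then
  echo "prowler-mcp ${MCP_VERSION} already on PyPI; bump mcp_server/pyproject.toml first"
else
  echo "prowler-mcp ${MCP_VERSION} not published yet; safe to build and upload"
fi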
@@ -0,0 +1,98 @@
name: 'Nightly: ARM64 Container Builds'
# Mitigation for amd64-only PR container-checks: build arm64 images nightly against
# master so arm-specific Dockerfile regressions are caught quickly. Build only:
# no push, no Trivy (weekly checks already cover that).
on:
schedule:
- cron: '0 4 * * *'
workflow_dispatch: {}
concurrency:
group: ${{ github.workflow }}
cancel-in-progress: false
permissions: {}
jobs:
build-arm64:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-24.04-arm
timeout-minutes: 60
permissions:
contents: read
strategy:
fail-fast: false
matrix:
include:
- component: sdk
context: .
dockerfile: ./Dockerfile
image_name: prowler
- component: api
context: ./api
dockerfile: ./api/Dockerfile
image_name: prowler-api
- component: ui
context: ./ui
dockerfile: ./ui/Dockerfile
image_name: prowler-ui
target: prod
build_args: |
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_test_51LwpXXXX
- component: mcp
context: ./mcp_server
dockerfile: ./mcp_server/Dockerfile
image_name: prowler-mcp
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build ${{ matrix.component }} container (linux/arm64)
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ matrix.context }}
file: ${{ matrix.dockerfile }}
target: ${{ matrix.target }}
push: false
load: false
platforms: linux/arm64
tags: ${{ matrix.image_name }}:nightly-arm64
build-args: ${{ matrix.build_args }}
cache-from: type=gha,scope=arm64
cache-to: type=gha,mode=min,scope=arm64
notify-failure:
needs: build-arm64
if: failure() && github.event_name == 'schedule'
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Notify Slack on failure
uses: slackapi/slack-github-action@91efab103c0de0a537f72a35f6b8cda0ee76bf0a # v2.1.1
with:
method: chat.postMessage
token: ${{ secrets.SLACK_BOT_TOKEN }}
payload: |
channel: ${{ secrets.SLACK_PLATFORM_DEPLOYMENTS }}
text: ":rotating_light: Nightly arm64 container build failed for prowler — <${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|view run>"
errors: true
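An arm-specific Dockerfile regression can also be reproduced before the nightly run; a sketch for one matrix entry, assuming a buildx builder with arm64 emulation (binfmt/QEMU) on amd64 hosts:
# Local approximation of the nightly arm64 build for the api component.
docker buildx build \
  --platform linux/arm64 \
  -f api/Dockerfile \
  -t prowler-api:nightly-arm64-local \
  api/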
+9 -2
View File
@@ -16,6 +16,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-changelog:
if: contains(github.event.pull_request.labels.*.name, 'no-changelog') == false
@@ -39,13 +41,18 @@ jobs:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
fetch-depth: 1
# zizmor: ignore[artipacked]
persist-credentials: true # Required by tj-actions/changed-files to fetch PR branch
- name: Fetch PR base ref for tj-actions/changed-files
env:
BASE_REF: ${{ github.event.pull_request.base.ref }}
run: git fetch --depth=1 origin "${BASE_REF}"
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -16,9 +16,17 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-compliance-mapping:
if: contains(github.event.pull_request.labels.*.name, 'no-compliance-check') == false
if: >-
github.event.pull_request.state == 'open' &&
contains(github.event.pull_request.labels.*.name, 'no-compliance-check') == false &&
(
(github.event.action != 'labeled' && github.event.action != 'unlabeled')
|| github.event.label.name == 'no-compliance-check'
)
runs-on: ubuntu-latest
timeout-minutes: 15
permissions:
@@ -37,13 +45,18 @@ jobs:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
fetch-depth: 1
# zizmor: ignore[artipacked]
persist-credentials: true # Required by tj-actions/changed-files to fetch PR branch
- name: Fetch PR base ref for tj-actions/changed-files
env:
BASE_REF: ${{ github.event.pull_request.base.ref }}
run: git fetch --depth=1 origin "${BASE_REF}"
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
prowler/providers/**/services/**/*.metadata.json
+11 -3
View File
@@ -15,6 +15,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-conflicts:
runs-on: ubuntu-latest
@@ -34,12 +36,18 @@ jobs:
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
fetch-depth: 1
# zizmor: ignore[artipacked]
persist-credentials: true # Required by tj-actions/changed-files to fetch PR branch
- name: Fetch PR base ref for tj-actions/changed-files
env:
BASE_REF: ${{ github.event.pull_request.base.ref }}
run: git fetch --depth=1 origin "${BASE_REF}"
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: '**'
+2
View File
@@ -12,6 +12,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: false
permissions: {}
jobs:
trigger-cloud-pull-request:
if: |
+7 -8
View File
@@ -17,6 +17,8 @@ concurrency:
env:
PROWLER_VERSION: ${{ inputs.prowler_version }}
permissions: {}
jobs:
prepare-release:
if: github.event_name == 'workflow_dispatch' && github.repository == 'prowler-cloud/prowler'
@@ -38,15 +40,12 @@ jobs:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: '3.12'
- name: Install Poetry
run: |
python3 -m pip install --user poetry==2.1.1
echo "$HOME/.local/bin" >> $GITHUB_PATH
install-dependencies: 'false'
enable-cache: 'false'
- name: Configure Git
run: |
@@ -380,7 +379,7 @@ jobs:
no-changelog
- name: Create draft release
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b # v2.5.0
uses: softprops/action-gh-release@153bb8e04406b158c6c84fc1615b65b24149a1fe # v2.6.1
with:
tag_name: ${{ env.PROWLER_VERSION }}
name: Prowler ${{ env.PROWLER_VERSION }}
+11 -9
View File
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
@@ -111,9 +113,9 @@ jobs:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: master
commit-message: 'chore(release): Bump version to v${{ env.NEXT_MINOR_VERSION }}'
branch: version-bump-to-v${{ env.NEXT_MINOR_VERSION }}
title: 'chore(release): Bump version to v${{ env.NEXT_MINOR_VERSION }}'
commit-message: 'chore(sdk): Bump version to v${{ env.NEXT_MINOR_VERSION }}'
branch: sdk-version-bump-to-v${{ env.NEXT_MINOR_VERSION }}
title: 'chore(sdk): Bump version to v${{ env.NEXT_MINOR_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
@@ -163,9 +165,9 @@ jobs:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: 'chore(release): Bump version to v${{ env.FIRST_PATCH_VERSION }}'
branch: version-bump-to-v${{ env.FIRST_PATCH_VERSION }}
title: 'chore(release): Bump version to v${{ env.FIRST_PATCH_VERSION }}'
commit-message: 'chore(sdk): Bump version to v${{ env.FIRST_PATCH_VERSION }}'
branch: sdk-version-bump-to-v${{ env.FIRST_PATCH_VERSION }}
title: 'chore(sdk): Bump version to v${{ env.FIRST_PATCH_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
@@ -231,9 +233,9 @@ jobs:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: 'chore(release): Bump version to v${{ env.NEXT_PATCH_VERSION }}'
branch: version-bump-to-v${{ env.NEXT_PATCH_VERSION }}
title: 'chore(release): Bump version to v${{ env.NEXT_PATCH_VERSION }}'
commit-message: 'chore(sdk): Bump version to v${{ env.NEXT_PATCH_VERSION }}'
branch: sdk-version-bump-to-v${{ env.NEXT_PATCH_VERSION }}
title: 'chore(sdk): Bump version to v${{ env.NEXT_PATCH_VERSION }}'
labels: no-changelog,skip-sync
body: |
### Description
@@ -5,11 +5,16 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'tests/providers/**/*_test.py'
- '.github/workflows/sdk-check-duplicate-test-names.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
check-duplicate-test-names:
if: github.repository == 'prowler-cloud/prowler'
+21 -14
View File
@@ -5,15 +5,33 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/workflows/sdk-code-quality.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/workflows/sdk-code-quality.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-code-quality:
if: github.repository == 'prowler-cloud/prowler'
@@ -46,7 +64,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -69,22 +87,11 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: |
poetry install --no-root
poetry run pip list
- name: Check Poetry lock file
if: steps.check-changes.outputs.any_changed == 'true'
+2
View File
@@ -30,6 +30,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-analyze:
if: github.repository == 'prowler-cloud/prowler'
+24 -80
View File
@@ -3,9 +3,7 @@ name: 'SDK: Container Build and Push'
on:
push:
branches:
- 'v3' # For v3-latest
- 'v4.6' # For v4-latest
- 'master' # For latest
- 'master'
paths-ignore:
- '.github/**'
- '!.github/workflows/sdk-container-build-push.yml'
@@ -47,6 +45,8 @@ env:
# AWS configuration (for ECR)
AWS_REGION: us-east-1
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -54,7 +54,6 @@ jobs:
timeout-minutes: 5
outputs:
prowler_version: ${{ steps.get-prowler-version.outputs.prowler_version }}
prowler_version_major: ${{ steps.get-prowler-version.outputs.prowler_version_major }}
latest_tag: ${{ steps.get-prowler-version.outputs.latest_tag }}
stable_tag: ${{ steps.get-prowler-version.outputs.stable_tag }}
permissions:
@@ -74,15 +73,15 @@ jobs:
with:
persist-credentials: false
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
enable-cache: 'false'
- name: Install Poetry
run: |
pipx install poetry==2.1.1
pipx inject poetry poetry-bumpversion
- name: Inject poetry-bumpversion plugin
run: pipx inject poetry poetry-bumpversion
- name: Get Prowler version and set tags
id: get-prowler-version
@@ -90,32 +89,13 @@ jobs:
PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
echo "prowler_version=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"
# Extract major version
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
echo "prowler_version_major=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_OUTPUT}"
# Set version-specific tags
case ${PROWLER_VERSION_MAJOR} in
3)
echo "latest_tag=v3-latest" >> "${GITHUB_OUTPUT}"
echo "stable_tag=v3-stable" >> "${GITHUB_OUTPUT}"
echo "✓ Prowler v3 detected - tags: v3-latest, v3-stable"
;;
4)
echo "latest_tag=v4-latest" >> "${GITHUB_OUTPUT}"
echo "stable_tag=v4-stable" >> "${GITHUB_OUTPUT}"
echo "✓ Prowler v4 detected - tags: v4-latest, v4-stable"
;;
5)
echo "latest_tag=latest" >> "${GITHUB_OUTPUT}"
echo "stable_tag=stable" >> "${GITHUB_OUTPUT}"
echo "✓ Prowler v5 detected - tags: latest, stable"
;;
*)
echo "::error::Unsupported Prowler major version: ${PROWLER_VERSION_MAJOR}"
exit 1
;;
esac
if [[ "${PROWLER_VERSION_MAJOR}" != "5" ]]; then
echo "::error::Unsupported Prowler major version: ${PROWLER_VERSION_MAJOR}"
exit 1
fi
echo "latest_tag=latest" >> "${GITHUB_OUTPUT}"
echo "stable_tag=stable" >> "${GITHUB_OUTPUT}"
notify-release-started:
if: github.repository == 'prowler-cloud/prowler' && (github.event_name == 'release' || github.event_name == 'workflow_dispatch')
@@ -197,13 +177,13 @@ jobs:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -212,12 +192,12 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push SDK container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: .
file: ${{ env.DOCKERFILE_PATH }}
@@ -226,7 +206,7 @@ jobs:
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ needs.setup.outputs.latest_tag }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }},scope=${{ matrix.arch }}
# Create and push multi-architecture manifest
create-manifest:
@@ -252,13 +232,13 @@ jobs:
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -295,7 +275,7 @@ jobs:
# Push to toniblyx/prowler only for current version (latest/stable/release tags)
- name: Login to DockerHub (toniblyx)
if: needs.setup.outputs.latest_tag == 'latest'
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.TONIBLYX_DOCKERHUB_USERNAME }}
password: ${{ secrets.TONIBLYX_DOCKERHUB_PASSWORD }}
@@ -320,7 +300,7 @@ jobs:
# Re-login as prowlercloud for cleanup of intermediate tags
- name: Login to DockerHub (prowlercloud)
if: always()
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
@@ -384,39 +364,3 @@ jobs:
payload-file-path: "./.github/scripts/slack-messages/container-release-completed.json"
step-outcome: ${{ steps.outcome.outputs.outcome }}
update-ts: ${{ needs.notify-release-started.outputs.message-ts }}
dispatch-v3-deployment:
needs: [setup, container-build-push]
if: always() && needs.setup.outputs.prowler_version_major == '3' && needs.setup.result == 'success' && needs.container-build-push.result == 'success'
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Calculate short SHA
id: short-sha
run: echo "short_sha=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
- name: Dispatch v3 deployment (latest)
if: github.event_name == 'push'
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4.0.1
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}
event-type: dispatch
client-payload: '{"version":"v3-latest","tag":"${{ steps.short-sha.outputs.short_sha }}"}'
- name: Dispatch v3 deployment (release)
if: github.event_name == 'release'
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4.0.1
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}
event-type: dispatch
client-payload: '{"version":"release","tag":"${{ needs.setup.outputs.prowler_version }}"}'
+26 -21
View File
@@ -5,10 +5,22 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'Dockerfile*'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-container-checks.yml'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'Dockerfile*'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-container-checks.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -17,6 +29,8 @@ concurrency:
env:
IMAGE_NAME: prowler
permissions: {}
jobs:
sdk-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -41,7 +55,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: Dockerfile
@@ -54,16 +68,7 @@ jobs:
sdk-container-build-and-scan:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ${{ matrix.runner }}
strategy:
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
arch: amd64
- platform: linux/arm64
runner: ubuntu-24.04-arm
arch: arm64
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: read
@@ -85,6 +90,7 @@ jobs:
check.trivy.dev:443
debian.map.fastlydns.net:80
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
pypi.org:443
files.pythonhosted.org:443
www.powershellgallery.com:443
@@ -102,7 +108,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -127,25 +133,24 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build SDK container for ${{ matrix.arch }}
- name: Build SDK container
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: .
push: false
load: true
platforms: ${{ matrix.platform }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}
cache-from: type=gha
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }}
- name: Scan SDK container with Trivy for ${{ matrix.arch }}
- name: Scan SDK container with Trivy
if: steps.check-changes.outputs.any_changed == 'true'
uses: ./.github/actions/trivy-scan
with:
image-name: ${{ env.IMAGE_NAME }}
image-tag: ${{ github.sha }}-${{ matrix.arch }}
image-tag: ${{ github.sha }}
fail-on-critical: 'false'
severity: 'CRITICAL'
+10 -10
View File
@@ -13,6 +13,8 @@ env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: '3.12'
permissions: {}
jobs:
validate-release:
if: github.repository == 'prowler-cloud/prowler'
@@ -73,13 +75,12 @@ jobs:
with:
persist-credentials: false
- name: Install Poetry
run: pipx install poetry==2.1.1
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
enable-cache: 'false'
- name: Build Prowler package
run: poetry build
@@ -111,13 +112,12 @@ jobs:
with:
persist-credentials: false
- name: Install Poetry
run: pipx install poetry==2.1.1
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
enable-cache: 'false'
- name: Install toml package
run: pip install toml
@@ -13,6 +13,8 @@ env:
PYTHON_VERSION: '3.12'
AWS_REGION: 'us-east-1'
permissions: {}
jobs:
refresh-aws-regions:
if: github.repository == 'prowler-cloud/prowler'
@@ -12,6 +12,8 @@ concurrency:
env:
PYTHON_VERSION: '3.12'
permissions: {}
jobs:
refresh-oci-regions:
if: github.repository == 'prowler-cloud/prowler'
+23 -13
View File
@@ -5,15 +5,33 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/workflows/sdk-security.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/workflows/sdk-security.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-security-scans:
if: github.repository == 'prowler-cloud/prowler'
@@ -44,7 +62,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files:
./**
@@ -69,20 +87,11 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python 3.12
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: '3.12'
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry install --no-root
- name: Security scan with Bandit
if: steps.check-changes.outputs.any_changed == 'true'
@@ -90,7 +99,8 @@ jobs:
- name: Security scan with Safety
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run safety check -r pyproject.toml
# Accepted CVEs, severity threshold, and ignore expirations live in .safety-policy.yml
run: poetry run safety check -r pyproject.toml --policy-file .safety-policy.yml
- name: Dead code detection with Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+34 -27
View File
@@ -5,15 +5,31 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/actions/setup-python-poetry/**'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'prowler/**'
- 'tests/**'
- 'pyproject.toml'
- 'poetry.lock'
- '.github/workflows/sdk-tests.yml'
- '.github/actions/setup-python-poetry/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-tests:
if: github.repository == 'prowler-cloud/prowler'
@@ -67,7 +83,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -90,26 +106,17 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry install --no-root
# AWS Provider
- name: Check if AWS files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-aws
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/aws/**
@@ -239,7 +246,7 @@ jobs:
- name: Check if Azure files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-azure
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/azure/**
@@ -263,7 +270,7 @@ jobs:
- name: Check if GCP files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-gcp
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/gcp/**
@@ -287,7 +294,7 @@ jobs:
- name: Check if Kubernetes files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-kubernetes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/kubernetes/**
@@ -311,7 +318,7 @@ jobs:
- name: Check if GitHub files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-github
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/github/**
@@ -335,7 +342,7 @@ jobs:
- name: Check if NHN files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-nhn
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/nhn/**
@@ -359,7 +366,7 @@ jobs:
- name: Check if M365 files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-m365
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/m365/**
@@ -383,7 +390,7 @@ jobs:
- name: Check if IaC files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-iac
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/iac/**
@@ -407,7 +414,7 @@ jobs:
- name: Check if MongoDB Atlas files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-mongodbatlas
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/mongodbatlas/**
@@ -431,7 +438,7 @@ jobs:
- name: Check if OCI files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-oraclecloud
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/oraclecloud/**
@@ -455,7 +462,7 @@ jobs:
- name: Check if OpenStack files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-openstack
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/openstack/**
@@ -479,7 +486,7 @@ jobs:
- name: Check if Google Workspace files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-googleworkspace
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/googleworkspace/**
@@ -503,7 +510,7 @@ jobs:
- name: Check if Vercel files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-vercel
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/vercel/**
@@ -527,7 +534,7 @@ jobs:
- name: Check if Lib files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-lib
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/lib/**
@@ -551,7 +558,7 @@ jobs:
- name: Check if Config files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-config
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/config/**
+3 -1
View File
@@ -31,6 +31,8 @@ on:
description: "Whether there are UI E2E tests to run"
value: ${{ jobs.analyze.outputs.has-ui-e2e }}
permissions: {}
jobs:
analyze:
runs-on: ubuntu-latest
@@ -66,7 +68,7 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
- name: Setup Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
+2
View File
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
+2
View File
@@ -26,6 +26,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
ui-analyze:
if: github.repository == 'prowler-cloud/prowler'
@@ -35,6 +35,8 @@ env:
# Build args
NEXT_PUBLIC_API_BASE_URL: http://prowler-api:8080/api/v1
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -127,18 +129,18 @@ jobs:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push UI container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
@@ -149,7 +151,7 @@ jobs:
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ needs.setup.outputs.short-sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }},scope=${{ matrix.arch }}
# Create and push multi-architecture manifest
create-manifest:
@@ -172,7 +174,7 @@ jobs:
production.cloudflare.docker.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
+21 -21
View File
@@ -5,10 +5,16 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'ui/**'
- '.github/workflows/ui-container-checks.yml'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'ui/**'
- '.github/workflows/ui-container-checks.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -18,6 +24,8 @@ env:
UI_WORKING_DIR: ./ui
IMAGE_NAME: prowler-ui
permissions: {}
jobs:
ui-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -42,7 +50,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ui/Dockerfile
@@ -55,16 +63,7 @@ jobs:
ui-container-build-and-scan:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ${{ matrix.runner }}
strategy:
matrix:
include:
- platform: linux/amd64
runner: ubuntu-latest
arch: amd64
- platform: linux/arm64
runner: ubuntu-24.04-arm
arch: arm64
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: read
@@ -89,6 +88,8 @@ jobs:
mirror.gcr.io:443
check.trivy.dev:443
get.trivy.dev:443
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -98,7 +99,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ui/**
files_ignore: |
@@ -108,28 +109,27 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build UI container for ${{ matrix.arch }}
- name: Build UI container
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.UI_WORKING_DIR }}
target: prod
push: false
load: true
platforms: ${{ matrix.platform }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}-${{ matrix.arch }}
cache-from: type=gha,scope=${{ matrix.arch }}
cache-to: type=gha,mode=max,scope=${{ matrix.arch }}
tags: ${{ env.IMAGE_NAME }}:${{ github.sha }}
cache-from: type=gha
cache-to: type=gha,mode=${{ github.event_name == 'pull_request' && 'min' || 'max' }}
build-args: |
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_test_51LwpXXXX
- name: Scan UI container with Trivy for ${{ matrix.arch }}
- name: Scan UI container with Trivy
if: steps.check-changes.outputs.any_changed == 'true'
uses: ./.github/actions/trivy-scan
with:
image-name: ${{ env.IMAGE_NAME }}
image-tag: ${{ github.sha }}-${{ matrix.arch }}
image-tag: ${{ github.sha }}
fail-on-critical: 'false'
severity: 'CRITICAL'
+12 -6
View File
@@ -15,6 +15,12 @@ on:
- 'ui/**'
- 'api/**' # API changes can affect UI E2E
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
# First, analyze which tests need to run
impact-analysis:
@@ -158,12 +164,12 @@ jobs:
'
- name: Setup Node.js
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: '24.13.0'
- name: Setup pnpm
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
uses: pnpm/action-setup@fc06bc1257f339d1d5d8b3a19a8cae5388b55320 # v5.0.0
with:
package_json_file: ui/package.json
run_install: false
@@ -172,7 +178,7 @@ jobs:
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm and Next.js cache
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: |
${{ env.STORE_PATH }}
@@ -192,7 +198,7 @@ jobs:
run: pnpm run build
- name: Cache Playwright browsers
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
id: playwright-cache
with:
path: ~/.cache/ms-playwright
@@ -259,12 +265,12 @@ jobs:
fi
- name: Upload test reports
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
if: failure()
with:
name: playwright-report
path: ui/playwright-report/
retention-days: 30
retention-days: 7
- name: Cleanup services
if: always()
+14 -6
View File
@@ -5,10 +5,16 @@ on:
branches:
- 'master'
- 'v5.*'
paths:
- 'ui/**'
- '.github/workflows/ui-tests.yml'
pull_request:
branches:
- 'master'
- 'v5.*'
paths:
- 'ui/**'
- '.github/workflows/ui-tests.yml'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -18,6 +24,8 @@ env:
UI_WORKING_DIR: ./ui
NODE_VERSION: '24.13.0'
permissions: {}
jobs:
ui-tests:
runs-on: ubuntu-latest
@@ -49,7 +57,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/**
@@ -62,7 +70,7 @@ jobs:
- name: Get changed source files for targeted tests
id: changed-source
if: steps.check-changes.outputs.any_changed == 'true'
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/**/*.ts
@@ -78,7 +86,7 @@ jobs:
- name: Check for critical path changes (run all tests)
id: critical-changes
if: steps.check-changes.outputs.any_changed == 'true'
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/lib/**
@@ -90,13 +98,13 @@ jobs:
- name: Setup Node.js ${{ env.NODE_VERSION }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: ${{ env.NODE_VERSION }}
- name: Setup pnpm
if: steps.check-changes.outputs.any_changed == 'true'
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
uses: pnpm/action-setup@fc06bc1257f339d1d5d8b3a19a8cae5388b55320 # v5.0.0
with:
package_json_file: ui/package.json
run_install: false
@@ -108,7 +116,7 @@ jobs:
- name: Setup pnpm and Next.js cache
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: |
${{ env.STORE_PATH }}
+1
View File
@@ -8,6 +8,7 @@ rules:
- docs-bump-version.yml
- issue-triage.lock.yml
- mcp-container-build-push.yml
- nightly-arm64-container-builds.yml
- pr-merged.yml
- prepare-release.yml
- sdk-bump-version.yml
+3
View File
@@ -84,6 +84,7 @@ continue.json
.continuerc.json
# AI Coding Assistants - OpenCode
.opencode/
opencode.json
# AI Coding Assistants - GitHub Copilot
@@ -150,6 +151,8 @@ node_modules
# Persistent data
_data/
/openspec/
/.gitmodules
# AI Instructions (generated by skills/setup.sh from AGENTS.md)
CLAUDE.md
+102 -38
View File
@@ -1,148 +1,212 @@
# Priority tiers (lower = runs first, same priority = concurrent):
# P0 — fast file fixers
# P10 — validators and guards
# P20 — auto-formatters
# P30 — linters
# P40 — security scanners
# P50 — dependency validation
default_install_hook_types: [pre-commit, pre-push]
repos:
## GENERAL
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
## GENERAL (prek built-in — no external repo needed)
- repo: builtin
hooks:
- id: check-merge-conflict
priority: 10
- id: check-yaml
args: ["--unsafe"]
exclude: prowler/config/llm_config.yaml
args: ["--allow-multiple-documents"]
exclude: (prowler/config/llm_config.yaml|contrib/)
priority: 10
- id: check-json
priority: 10
- id: end-of-file-fixer
priority: 0
- id: trailing-whitespace
priority: 0
- id: no-commit-to-branch
priority: 10
- id: pretty-format-json
args: ["--autofix", --no-sort-keys, --no-ensure-ascii]
priority: 10
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.13.0
rev: v2.16.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
priority: 20
## GITHUB ACTIONS
- repo: https://github.com/zizmorcore/zizmor-pre-commit
rev: v1.6.0
rev: v1.24.1
hooks:
- id: zizmor
files: ^\.github/
priority: 30
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.10.0
rev: v0.11.0
hooks:
- id: shellcheck
exclude: contrib
priority: 30
## PYTHON
## PYTHON — SDK (prowler/, tests/, dashboard/, util/, scripts/)
- repo: https://github.com/myint/autoflake
rev: v2.3.1
rev: v2.3.3
hooks:
- id: autoflake
exclude: ^skills/
name: "SDK - autoflake"
files: { glob: ["{prowler,tests,dashboard,util,scripts}/**/*.py"] }
args:
[
"--in-place",
"--remove-all-unused-imports",
"--remove-unused-variable",
]
priority: 20
- repo: https://github.com/pycqa/isort
rev: 5.13.2
rev: 8.0.1
hooks:
- id: isort
exclude: ^skills/
name: "SDK - isort"
files: { glob: ["{prowler,tests,dashboard,util,scripts}/**/*.py"] }
args: ["--profile", "black"]
priority: 20
- repo: https://github.com/psf/black
rev: 24.4.2
rev: 26.3.1
hooks:
- id: black
exclude: ^skills/
name: "SDK - black"
files: { glob: ["{prowler,tests,dashboard,util,scripts}/**/*.py"] }
priority: 20
- repo: https://github.com/pycqa/flake8
rev: 7.0.0
rev: 7.3.0
hooks:
- id: flake8
exclude: (contrib|^skills/)
name: "SDK - flake8"
files: { glob: ["{prowler,tests,dashboard,util,scripts}/**/*.py"] }
args: ["--ignore=E266,W503,E203,E501,W605"]
priority: 30
## PYTHON — API + MCP Server (ruff)
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.15.11
hooks:
- id: ruff
name: "API + MCP - ruff check"
files: { glob: ["{api,mcp_server}/**/*.py"] }
args: ["--fix"]
priority: 30
- id: ruff-format
name: "API + MCP - ruff format"
files: { glob: ["{api,mcp_server}/**/*.py"] }
priority: 20
## PYTHON — Poetry
- repo: https://github.com/python-poetry/poetry
rev: 2.1.1
rev: 2.3.4
hooks:
- id: poetry-check
name: API - poetry-check
args: ["--directory=./api"]
files: { glob: ["api/{pyproject.toml,poetry.lock}"] }
pass_filenames: false
priority: 50
- id: poetry-lock
name: API - poetry-lock
args: ["--directory=./api"]
files: { glob: ["api/{pyproject.toml,poetry.lock}"] }
pass_filenames: false
priority: 50
- id: poetry-check
name: SDK - poetry-check
args: ["--directory=./"]
files: { glob: ["{pyproject.toml,poetry.lock}"] }
pass_filenames: false
priority: 50
- id: poetry-lock
name: SDK - poetry-lock
args: ["--directory=./"]
files: { glob: ["{pyproject.toml,poetry.lock}"] }
pass_filenames: false
priority: 50
## CONTAINERS
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
rev: v2.14.0
hooks:
- id: hadolint
args: ["--ignore=DL3013"]
priority: 30
## LOCAL HOOKS
- repo: local
hooks:
- id: pylint
name: pylint
entry: bash -c 'pylint --disable=W,C,R,E -j 0 -rn -sn prowler/'
name: "SDK - pylint"
entry: pylint --disable=W,C,R,E -j 0 -rn -sn
language: system
files: '.*\.py'
types: [python]
files: { glob: ["{prowler,tests,dashboard,util,scripts}/**/*.py"] }
priority: 30
- id: trufflehog
name: TruffleHog
description: Detect secrets in your data.
entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
entry: bash -c 'trufflehog --no-update git file://. --since-commit HEAD --only-verified --fail'
# For running trufflehog in docker, use the following entry instead:
# entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
language: system
pass_filenames: false
stages: ["pre-commit", "pre-push"]
priority: 40
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/,./skills/' -r .'
entry: bandit -q -lll
language: system
types: [python]
files: '.*\.py'
exclude:
{ glob: ["{contrib,skills}/**", "**/.venv/**", "**/*_test.py"] }
priority: 40
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
# TODO: Botocore needs urllib3 1.X so we need to ignore these vulnerabilities 77744,77745. Remove this once we upgrade to urllib3 2.X
# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
# TODO: 86217 because `alibabacloud-tea-openapi == 0.4.3` don't let us upgrade `cryptography >= 46.0.0`
entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353,77744,77745,79023,79027,86217'
# Accepted CVEs, severity threshold, and ignore expirations live in .safety-policy.yml
entry: safety check --policy-file .safety-policy.yml
language: system
pass_filenames: false
files:
{
glob:
[
"**/pyproject.toml",
"**/poetry.lock",
"**/requirements*.txt",
".safety-policy.yml",
],
}
priority: 40
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/,skills/" --min-confidence 100 .'
entry: vulture --min-confidence 100
language: system
types: [python]
files: '.*\.py'
- id: ui-checks
name: UI - Husky Pre-commit
description: "Run UI pre-commit checks (Claude Code validation + healthcheck)"
entry: bash -c 'cd ui && .husky/pre-commit'
language: system
files: '^ui/.*\.(ts|tsx|js|jsx|json|css)$'
pass_filenames: false
verbose: true
priority: 40
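The hook ids and priority tiers above can be exercised locally; the following is a sketch only, assuming prek keeps pre-commit's hook-selection CLI (the AGENTS.md snippet later in this diff confirms the install and bulk-run forms):

```console
# Install the git hooks once (pre-commit and pre-push stages).
poetry run prek install
# Run every hook against the whole tree.
poetry run prek run --all-files
# Run a single hook by id, e.g. only the SDK flake8 pass (assumed CLI form).
poetry run prek run flake8 --all-files
```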
+1 -1
View File
@@ -13,7 +13,7 @@ build:
post_create_environment:
# Install poetry
# https://python-poetry.org/docs/#installing-manually
- python -m pip install poetry
- python -m pip install poetry==2.3.4
post_install:
# Install dependencies with 'docs' dependency group
# https://python-poetry.org/docs/managing-dependencies/#dependency-groups
+58
View File
@@ -0,0 +1,58 @@
# Safety policy for `safety check` (Safety CLI 3.x, v2 schema).
# Applied in: .pre-commit-config.yaml, .github/workflows/api-security.yml,
# .github/workflows/sdk-security.yml via `--policy-file`.
#
# Validate: poetry run safety validate policy_file --path .safety-policy.yml
security:
# Scan unpinned requirements too. Prowler pins via poetry.lock, so this is
# defensive against accidental unpinned entries.
ignore-unpinned-requirements: False
# CVSS severity filter. 7 = report only HIGH (7.0–8.9) and CRITICAL (9.0–10.0).
# Reference: 9=CRITICAL only, 7=CRITICAL+HIGH, 4=CRITICAL+HIGH+MEDIUM.
ignore-cvss-severity-below: 7
# Unknown severity is unrated, not safe. Keep False so unrated CVEs still fail
# the build and get a human eye. Flip to True only if noise is unmanageable.
ignore-cvss-unknown-severity: False
# Fail the build when a non-ignored vulnerability is found.
continue-on-vulnerability-error: False
# Explicit accepted vulnerabilities. Each entry MUST have a reason and an
# expiry. Expired entries fail the scan, forcing re-audit.
ignore-vulnerabilities:
77744:
reason: "Botocore requires urllib3 1.X. Remove once upgraded to urllib3 2.X."
expires: '2026-10-22'
77745:
reason: "Botocore requires urllib3 1.X. Remove once upgraded to urllib3 2.X."
expires: '2026-10-22'
79023:
reason: "knack ReDoS; blocked until azure-cli-core (via cartography) allows knack >=0.13.0."
expires: '2026-10-22'
79027:
reason: "knack ReDoS; blocked until azure-cli-core (via cartography) allows knack >=0.13.0."
expires: '2026-10-22'
86217:
reason: "alibabacloud-tea-openapi==0.4.3 blocks upgrade to cryptography >=46.0.0."
expires: '2026-10-22'
71600:
reason: "CVE-2024-1135 false positive. Fixed in gunicorn 22.0.0; project uses 23.0.0."
expires: '2026-10-22'
70612:
reason: "TBD - audit required. Reason not documented in prior --ignore list."
expires: '2026-07-22'
66963:
reason: "TBD - audit required. Reason not documented in prior --ignore list."
expires: '2026-07-22'
74429:
reason: "TBD - audit required. Reason not documented in prior --ignore list."
expires: '2026-07-22'
76352:
reason: "TBD - audit required. Reason not documented in prior --ignore list."
expires: '2026-07-22'
76353:
reason: "TBD - audit required. Reason not documented in prior --ignore list."
expires: '2026-07-22'
+2
View File
@@ -0,0 +1,2 @@
.envrc
ui/.env.local
+3 -3
View File
@@ -140,7 +140,7 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
| Component | Location | Tech Stack |
|-----------|----------|------------|
| SDK | `prowler/` | Python 3.10+, Poetry |
| SDK | `prowler/` | Python 3.10+, Poetry 2.3+ |
| API | `api/` | Django 5.1, DRF, Celery |
| UI | `ui/` | Next.js 15, React 19, Tailwind 4 |
| MCP Server | `mcp_server/` | FastMCP, Python 3.12+ |
@@ -153,12 +153,12 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
```bash
# Setup
poetry install --with dev
poetry run pre-commit install
poetry run prek install
# Code quality
poetry run make lint
poetry run make format
poetry run pre-commit run --all-files
poetry run prek run --all-files
```
---
+21 -2
View File
@@ -6,9 +6,12 @@ LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
ARG TRIVY_VERSION=0.69.2
ARG TRIVY_VERSION=0.70.0
ENV TRIVY_VERSION=${TRIVY_VERSION}
ARG ZIZMOR_VERSION=1.24.1
ENV ZIZMOR_VERSION=${ZIZMOR_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
wget libicu72 libunwind8 libssl3 libcurl4 ca-certificates apt-transport-https gnupg \
@@ -48,6 +51,22 @@ RUN ARCH=$(uname -m) && \
mkdir -p /tmp/.cache/trivy && \
chmod 777 /tmp/.cache/trivy
# Install zizmor for GitHub Actions workflow scanning
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
ZIZMOR_ARCH="x86_64-unknown-linux-gnu" ; \
elif [ "$ARCH" = "aarch64" ]; then \
ZIZMOR_ARCH="aarch64-unknown-linux-gnu" ; \
else \
echo "Unsupported architecture for zizmor: $ARCH" && exit 1 ; \
fi && \
wget --progress=dot:giga "https://github.com/zizmorcore/zizmor/releases/download/v${ZIZMOR_VERSION}/zizmor-${ZIZMOR_ARCH}.tar.gz" -O /tmp/zizmor.tar.gz && \
mkdir -p /tmp/zizmor-extract && \
tar zxf /tmp/zizmor.tar.gz -C /tmp/zizmor-extract && \
mv /tmp/zizmor-extract/zizmor /usr/local/bin/zizmor && \
chmod +x /usr/local/bin/zizmor && \
rm -rf /tmp/zizmor.tar.gz /tmp/zizmor-extract
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
@@ -68,7 +87,7 @@ ENV HOME='/home/prowler'
ENV PATH="${HOME}/.local/bin:${PATH}"
#hadolint ignore=DL3013
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
pip install --no-cache-dir poetry==2.3.4
RUN poetry install --compile && \
rm -rf ~/.cache/pip
+49 -23
View File
@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler</b> is the Open Cloud Security platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
<b><i>Prowler</b> is the Open Cloud Security Platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
</p>
<p align="center">
<b>Secure ANY cloud at AI Speed at <a href="https://prowler.com">prowler.com</i></b>
@@ -41,7 +41,7 @@
# Description
**Prowler** is the world’s most widely used _open-source cloud security platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
**Prowler** is the world’s most widely used _Open-Source Cloud Security Platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY Cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:
@@ -104,22 +104,22 @@ Every AWS provider scan will enqueue an Attack Paths ingestion job automatically
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) | Support | Interface |
|---|---|---|---|---|---|---|
| AWS | 572 | 83 | 41 | 17 | Official | UI, API, CLI |
| Azure | 165 | 20 | 18 | 13 | Official | UI, API, CLI |
| GCP | 100 | 13 | 15 | 11 | Official | UI, API, CLI |
| Kubernetes | 83 | 7 | 7 | 9 | Official | UI, API, CLI |
| GitHub | 21 | 2 | 1 | 2 | Official | UI, API, CLI |
| M365 | 89 | 9 | 4 | 5 | Official | UI, API, CLI |
| OCI | 48 | 13 | 3 | 10 | Official | UI, API, CLI |
| Alibaba Cloud | 61 | 9 | 3 | 9 | Official | UI, API, CLI |
| Cloudflare | 29 | 2 | 0 | 5 | Official | UI, API, CLI |
| AWS | 595 | 84 | 43 | 17 | Official | UI, API, CLI |
| Azure | 167 | 22 | 19 | 16 | Official | UI, API, CLI |
| GCP | 102 | 18 | 17 | 12 | Official | UI, API, CLI |
| Kubernetes | 83 | 7 | 7 | 11 | Official | UI, API, CLI |
| GitHub | 24 | 3 | 1 | 5 | Official | UI, API, CLI |
| M365 | 101 | 10 | 4 | 10 | Official | UI, API, CLI |
| OCI | 51 | 14 | 4 | 10 | Official | UI, API, CLI |
| Alibaba Cloud | 61 | 9 | 4 | 9 | Official | UI, API, CLI |
| Cloudflare | 29 | 3 | 0 | 5 | Official | UI, API, CLI |
| IaC | [See `trivy` docs.](https://trivy.dev/latest/docs/coverage/iac/) | N/A | N/A | N/A | Official | UI, API, CLI |
| MongoDB Atlas | 10 | 3 | 0 | 8 | Official | UI, API, CLI |
| LLM | [See `promptfoo` docs.](https://www.promptfoo.dev/docs/red-team/plugins/) | N/A | N/A | N/A | Official | CLI |
| Image | N/A | N/A | N/A | N/A | Official | CLI, API |
| Google Workspace | 1 | 1 | 0 | 1 | Official | CLI |
| OpenStack | 27 | 4 | 0 | 8 | Official | UI, API, CLI |
| Vercel | 30 | 6 | 0 | 5 | Official | CLI |
| Google Workspace | 25 | 4 | 2 | 4 | Official | CLI |
| OpenStack | 34 | 5 | 0 | 9 | Official | UI, API, CLI |
| Vercel | 26 | 6 | 0 | 5 | Official | CLI |
| NHN | 6 | 2 | 1 | 0 | Unofficial | CLI |
> [!Note]
@@ -246,14 +246,7 @@ Some pre-commit hooks require tools installed on your system:
1. **Install [TruffleHog](https://github.com/trufflesecurity/trufflehog#install)** (secret scanning) — see the [official installation options](https://github.com/trufflesecurity/trufflehog#install).
2. **Install [Safety](https://github.com/pyupio/safety)** (dependency vulnerability checking):
```console
# Requires a Python environment (e.g. via pyenv)
pip install safety
```
3. **Install [Hadolint](https://github.com/hadolint/hadolint#install)** (Dockerfile linting) — see the [official installation options](https://github.com/hadolint/hadolint#install).
2. **Install [Hadolint](https://github.com/hadolint/hadolint#install)** (Dockerfile linting) — see the [official installation options](https://github.com/hadolint/hadolint#install).
## Prowler CLI
### Pip package
@@ -307,6 +300,36 @@ python prowler-cli.py -v
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
# 🛡️ GitHub Action
The official **Prowler GitHub Action** runs Prowler scans in your GitHub workflows using the official [`prowlercloud/prowler`](https://hub.docker.com/r/prowlercloud/prowler) Docker image. Scans run on any [supported provider](https://docs.prowler.com/user-guide/providers/), with optional [`--push-to-cloud`](https://docs.prowler.com/user-guide/tutorials/prowler-app-import-findings) to send findings to Prowler Cloud and optional SARIF upload so findings show up in the repo's **Security → Code scanning** tab and as inline PR annotations.
```yaml
name: Prowler IaC Scan
on:
pull_request:
permissions:
contents: read
security-events: write
actions: read
jobs:
prowler:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: prowler-cloud/prowler@5.25
with:
provider: iac
output-formats: sarif json-ocsf
upload-sarif: true
flags: --severity critical high
```
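For teams sending results to Prowler Cloud, a minimal variant of the workflow above (a sketch, assuming a `PROWLER_CLOUD_API_KEY` repository secret already exists) sets `push-to-cloud: "true"` and exposes the key via `env:`, which the action forwards into the container:

```yaml
name: Prowler IaC Scan (push to Prowler Cloud)
on:
  pull_request:
permissions:
  contents: read
jobs:
  prowler:
    runs-on: ubuntu-latest
    env:
      # Assumed secret name; generate the key following the docs linked above.
      PROWLER_CLOUD_API_KEY: ${{ secrets.PROWLER_CLOUD_API_KEY }}
    steps:
      - uses: actions/checkout@v4
      - uses: prowler-cloud/prowler@5.25
        with:
          provider: iac
          push-to-cloud: "true"
```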
Full configuration, per-provider authentication, and SARIF examples: [Prowler GitHub Action tutorial](docs/user-guide/tutorials/prowler-app-github-action.mdx). Marketplace listing: [Prowler Security Scan](https://github.com/marketplace/actions/prowler-security-scan).
# ✏️ High level architecture
## Prowler App
@@ -317,7 +340,10 @@ python prowler-cli.py -v
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
- **Prowler MCP Server**: A Model Context Protocol server that provides AI tools for Lighthouse, the AI-powered security assistant. This is a critical dependency for Lighthouse functionality.
![Prowler App Architecture](docs/products/img/prowler-app-architecture.png)
![Prowler App Architecture](docs/images/products/prowler-app-architecture.png)
<!-- Diagram source: docs/images/products/prowler-app-architecture.mmd — edit there, re-render at https://mermaid.live, and replace the PNG. -->
## Prowler CLI
+307
View File
@@ -0,0 +1,307 @@
name: Prowler Security Scan
description: Run Prowler cloud security scanner using the official Docker image
branding:
icon: cloud
color: green
inputs:
provider:
description: Cloud provider to scan (e.g. aws, azure, gcp, github, kubernetes, iac). See https://docs.prowler.com for supported providers.
required: true
image-tag:
description: >
Docker image tag for prowlercloud/prowler.
Default is "stable" (latest release). Available tags:
"stable" (latest release), "latest" (master branch, not stable),
"<x.y.z>" (pinned release version).
See all tags at https://hub.docker.com/r/prowlercloud/prowler/tags
required: false
default: stable
output-formats:
description: Output format(s) for scan results (e.g. "json-ocsf", "sarif json-ocsf")
required: false
default: json-ocsf
push-to-cloud:
description: Push scan findings to Prowler Cloud. Requires the PROWLER_CLOUD_API_KEY environment variable. See https://docs.prowler.com/user-guide/tutorials/prowler-app-import-findings#using-the-cli
required: false
default: "false"
flags:
description: 'Additional CLI flags passed to the Prowler scan (e.g. "--severity critical high --compliance cis_aws"). Values containing spaces can be quoted, e.g. "--resource-tag ''Environment=My Server''".'
required: false
default: ""
extra-env:
description: >
Space-, newline-, or comma-separated list of host environment variable NAMES to forward to the Prowler container
(e.g. "AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN" for AWS,
"GITHUB_PERSONAL_ACCESS_TOKEN" for GitHub, "CLOUDFLARE_API_TOKEN" for Cloudflare).
List names only; set the values via `env:` at the workflow or job level (typically from `secrets.*`).
See the README for per-provider examples.
required: false
default: ""
upload-sarif:
description: 'Upload SARIF results to GitHub Code Scanning (requires "sarif" in output-formats and both `security-events: write` and `actions: read` permissions)'
required: false
default: "false"
sarif-file:
description: Path to the SARIF file to upload (auto-detected from output/ if not set)
required: false
default: ""
sarif-category:
description: Category for the SARIF upload (used to distinguish multiple analyses)
required: false
default: prowler
fail-on-findings:
description: Fail the workflow step when Prowler detects findings (exit code 3). By default the action tolerates findings and succeeds.
required: false
default: "false"
runs:
using: composite
steps:
- name: Validate inputs
shell: bash
env:
INPUT_IMAGE_TAG: ${{ inputs.image-tag }}
INPUT_UPLOAD_SARIF: ${{ inputs.upload-sarif }}
INPUT_OUTPUT_FORMATS: ${{ inputs.output-formats }}
run: |
# Validate image tag format (alphanumeric, dots, hyphens, underscores only)
if [[ ! "$INPUT_IMAGE_TAG" =~ ^[a-zA-Z0-9._-]+$ ]]; then
echo "::error::Invalid image-tag '${INPUT_IMAGE_TAG}'. Must contain only alphanumeric characters, dots, hyphens, and underscores."
exit 1
fi
# Warn if upload-sarif is enabled but sarif not in output-formats
if [ "$INPUT_UPLOAD_SARIF" = "true" ]; then
if [[ ! "$INPUT_OUTPUT_FORMATS" =~ (^|[[:space:]])sarif($|[[:space:]]) ]]; then
echo "::warning::upload-sarif is enabled but 'sarif' is not included in output-formats ('${INPUT_OUTPUT_FORMATS}'). SARIF upload will fail unless you add 'sarif' to output-formats."
fi
fi
- name: Run Prowler scan
shell: bash
env:
INPUT_PROVIDER: ${{ inputs.provider }}
INPUT_IMAGE_TAG: ${{ inputs.image-tag }}
INPUT_OUTPUT_FORMATS: ${{ inputs.output-formats }}
INPUT_PUSH_TO_CLOUD: ${{ inputs.push-to-cloud }}
INPUT_FLAGS: ${{ inputs.flags }}
INPUT_EXTRA_ENV: ${{ inputs.extra-env }}
INPUT_FAIL_ON_FINDINGS: ${{ inputs.fail-on-findings }}
run: |
set -e
# Parse space-separated inputs with shlex so values with spaces can be quoted
# (e.g. `--resource-tag 'Environment=My Server'`).
mapfile -t OUTPUT_FORMATS < <(python3 -c 'import shlex, os; [print(t) for t in shlex.split(os.environ.get("INPUT_OUTPUT_FORMATS", ""))]')
mapfile -t EXTRA_FLAGS < <(python3 -c 'import shlex, os; [print(t) for t in shlex.split(os.environ.get("INPUT_FLAGS", ""))]')
mapfile -t EXTRA_ENV_NAMES < <(python3 -c 'import shlex, os; [print(t) for t in shlex.split(os.environ.get("INPUT_EXTRA_ENV", "").replace(",", " "))]')
env_args=()
for var in "${EXTRA_ENV_NAMES[@]}"; do
[ -z "$var" ] && continue
if [[ ! "$var" =~ ^[A-Za-z_][A-Za-z0-9_]*$ ]]; then
echo "::error::Invalid env var name '${var}' in extra-env. Names must match ^[A-Za-z_][A-Za-z0-9_]*$."
exit 1
fi
env_args+=("-e" "$var")
done
push_args=()
if [ "$INPUT_PUSH_TO_CLOUD" = "true" ]; then
push_args=("--push-to-cloud")
env_args+=("-e" "PROWLER_CLOUD_API_KEY")
fi
mkdir -p "$GITHUB_WORKSPACE/output"
chmod 777 "$GITHUB_WORKSPACE/output"
set +e
docker run --rm \
"${env_args[@]}" \
-v "$GITHUB_WORKSPACE:/home/prowler/workspace" \
-v "$GITHUB_WORKSPACE/output:/home/prowler/workspace/output" \
-w /home/prowler/workspace \
"prowlercloud/prowler:${INPUT_IMAGE_TAG}" \
"$INPUT_PROVIDER" \
--output-formats "${OUTPUT_FORMATS[@]}" \
"${push_args[@]}" \
"${EXTRA_FLAGS[@]}"
exit_code=$?
set -e
# Exit code 3 = findings detected
if [ "$exit_code" -eq 3 ] && [ "$INPUT_FAIL_ON_FINDINGS" != "true" ]; then
echo "::notice::Prowler detected findings (exit code 3). Set fail-on-findings to 'true' to fail the workflow on findings."
exit 0
fi
exit $exit_code
- name: Upload scan results
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
with:
name: prowler-${{ inputs.provider }}
path: output/
retention-days: 30
if-no-files-found: warn
- name: Find SARIF file
if: always() && inputs.upload-sarif == 'true'
id: find-sarif
shell: bash
env:
INPUT_SARIF_FILE: ${{ inputs.sarif-file }}
run: |
if [ -n "$INPUT_SARIF_FILE" ]; then
echo "sarif_path=$INPUT_SARIF_FILE" >> "$GITHUB_OUTPUT"
else
sarif_file=$(find output/ -name '*.sarif' -type f | head -1)
if [ -z "$sarif_file" ]; then
echo "::warning::No .sarif file found in output/. Ensure 'sarif' is included in output-formats."
echo "sarif_path=" >> "$GITHUB_OUTPUT"
else
echo "sarif_path=$sarif_file" >> "$GITHUB_OUTPUT"
fi
fi
- name: Upload SARIF to GitHub Code Scanning
if: always() && inputs.upload-sarif == 'true' && steps.find-sarif.outputs.sarif_path != ''
uses: github/codeql-action/upload-sarif@d4b3ca9fa7f69d38bfcd667bdc45bc373d16277e # v4
with:
sarif_file: ${{ steps.find-sarif.outputs.sarif_path }}
category: ${{ inputs.sarif-category }}
- name: Write scan summary
if: always()
shell: bash
env:
INPUT_PROVIDER: ${{ inputs.provider }}
INPUT_UPLOAD_SARIF: ${{ inputs.upload-sarif }}
INPUT_PUSH_TO_CLOUD: ${{ inputs.push-to-cloud }}
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
REPO_URL: ${{ github.server_url }}/${{ github.repository }}
BRANCH: ${{ github.head_ref || github.ref_name }}
GH_TOKEN: ${{ github.token }}
run: |
set +e
# Build a link to the scan step in the workflow logs. Requires `actions: read`
# on the caller's GITHUB_TOKEN; silently skips the link if unavailable.
scan_step_url=""
if [ -n "${GH_TOKEN:-}" ] && command -v gh >/dev/null 2>&1; then
job_info=$(gh api \
"repos/${GITHUB_REPOSITORY}/actions/runs/${GITHUB_RUN_ID}/attempts/${GITHUB_RUN_ATTEMPT:-1}/jobs" \
--jq ".jobs[] | select(.runner_name == \"${RUNNER_NAME:-}\")" 2>/dev/null)
if [ -n "$job_info" ]; then
job_id=$(jq -r '.id // empty' <<<"$job_info")
step_number=$(jq -r '[.steps[]? | select((.name // "") | test("Run Prowler scan"; "i")) | .number] | first // empty' <<<"$job_info")
if [ -z "$step_number" ]; then
step_number=$(jq -r '[.steps[]? | select(.status == "in_progress") | .number] | first // empty' <<<"$job_info")
fi
if [ -n "$job_id" ] && [ -n "$step_number" ]; then
scan_step_url="${REPO_URL}/actions/runs/${GITHUB_RUN_ID}/job/${job_id}#step:${step_number}:1"
elif [ -n "$job_id" ]; then
scan_step_url="${REPO_URL}/actions/runs/${GITHUB_RUN_ID}/job/${job_id}"
fi
fi
fi
# Map provider code to a properly-cased display name.
case "$INPUT_PROVIDER" in
alibabacloud) provider_name="Alibaba Cloud" ;;
aws) provider_name="AWS" ;;
azure) provider_name="Azure" ;;
cloudflare) provider_name="Cloudflare" ;;
gcp) provider_name="GCP" ;;
github) provider_name="GitHub" ;;
googleworkspace) provider_name="Google Workspace" ;;
iac) provider_name="IaC" ;;
image) provider_name="Container Image" ;;
kubernetes) provider_name="Kubernetes" ;;
llm) provider_name="LLM" ;;
m365) provider_name="Microsoft 365" ;;
mongodbatlas) provider_name="MongoDB Atlas" ;;
nhn) provider_name="NHN" ;;
openstack) provider_name="OpenStack" ;;
oraclecloud) provider_name="Oracle Cloud" ;;
vercel) provider_name="Vercel" ;;
*) provider_name="${INPUT_PROVIDER^}" ;;
esac
ocsf_file=$(find output/ -name '*.ocsf.json' -type f 2>/dev/null | head -1)
{
echo "## Prowler ${provider_name} Scan Summary"
echo ""
counts=""
if [ -n "$ocsf_file" ] && [ -s "$ocsf_file" ]; then
counts=$(jq -r '[
length,
([.[] | select(.status_code == "FAIL")] | length),
([.[] | select(.status_code == "PASS")] | length),
([.[] | select(.status_code == "MUTED")] | length),
([.[] | select(.status_code == "FAIL" and .severity == "Critical")] | length),
([.[] | select(.status_code == "FAIL" and .severity == "High")] | length),
([.[] | select(.status_code == "FAIL" and .severity == "Medium")] | length),
([.[] | select(.status_code == "FAIL" and .severity == "Low")] | length),
([.[] | select(.status_code == "FAIL" and .severity == "Informational")] | length)
] | @tsv' "$ocsf_file" 2>/dev/null)
fi
if [ -n "$counts" ]; then
read -r total fail pass muted critical high medium low info <<<"$counts"
line="**${fail:-0} failing** · ${pass:-0} passing"
[ "${muted:-0}" -gt 0 ] && line="${line} · ${muted} muted"
echo "${line} — ${total:-0} checks total"
echo ""
echo "| Severity | Failing |"
echo "|----------|---------|"
echo "| ‼️ Critical | ${critical:-0} |"
echo "| 🔴 High | ${high:-0} |"
echo "| 🟠 Medium | ${medium:-0} |"
echo "| 🔵 Low | ${low:-0} |"
echo "| ⚪ Informational | ${info:-0} |"
echo ""
else
echo "_No findings report was produced. Check the scan logs above._"
echo ""
fi
if [ -n "$scan_step_url" ]; then
echo "**Scan logs:** [view in workflow run](${scan_step_url})"
echo ""
fi
echo "**Get the full report:** [\`prowler-${INPUT_PROVIDER}\` artifact](${RUN_URL}#artifacts)"
if [ "$INPUT_UPLOAD_SARIF" = "true" ] && [ -n "$BRANCH" ]; then
encoded_branch=$(jq -nr --arg b "$BRANCH" '$b|@uri')
echo ""
echo "**See results in GitHub Code Security:** [open alerts on \`${BRANCH}\`](${REPO_URL}/security/code-scanning?query=is%3Aopen+branch%3A${encoded_branch})"
fi
if [ "$INPUT_PUSH_TO_CLOUD" != "true" ]; then
echo ""
echo "---"
echo ""
echo "### Scale ${provider_name} security with Prowler Cloud ☁️"
echo ""
echo "Send this scan's findings to **[Prowler Cloud](https://cloud.prowler.com)** and get:"
echo ""
echo "- **Unified findings** across every cloud, SaaS provider (M365, Google Workspace, GitHub, MongoDB Atlas), IaC repo, Kubernetes cluster, and container image"
echo "- **Posture over time** with alerts, and notifications"
echo "- **Prowler Lighthouse AI**: agentic assistant that triages findings, explains root cause and helps with remediation"
echo "- **50+ Compliance frameworks** mapped automatically"
echo "- **Enterprise-ready platform**: SOC 2 Type 2, SSO/SAML, AWS Security Hub, S3 and Jira integrations"
echo ""
echo "**Get started in 3 steps:**"
echo "1. Create an account at [cloud.prowler.com](https://cloud.prowler.com)"
echo "2. Generate a Prowler Cloud API key ([docs](https://docs.prowler.com/user-guide/tutorials/prowler-app-import-findings#using-the-cli))"
echo "3. Add \`PROWLER_CLOUD_API_KEY\` to your GitHub secrets and set \`push-to-cloud: true\` on this action"
echo ""
echo "See [prowler.com/pricing](https://prowler.com/pricing) for plan details."
fi
} >> "$GITHUB_STEP_SUMMARY"
+130 -5
View File
@@ -2,22 +2,143 @@
All notable changes to the **Prowler API** are documented in this file.
## [1.24.0] (Prowler UNRELEASED)
## [1.27.0] (Prowler UNRELEASED)
### 🚀 Added
- Pin all unpinned dependencies to exact versions to prevent supply chain attacks and ensure reproducible builds [(#10469)](https://github.com/prowler-cloud/prowler/pull/10469)
- Filter RBAC role lookup by `tenant_id` to prevent cross-tenant privilege leak [(#10491)](https://github.com/prowler-cloud/prowler/pull/10491)
- `VALKEY_SCHEME`, `VALKEY_USERNAME`, and `VALKEY_PASSWORD` environment variables to configure Celery broker TLS/auth connection details for Valkey/ElastiCache [(#10420)](https://github.com/prowler-cloud/prowler/pull/10420)
- `Vercel` provider support [(#10190)](https://github.com/prowler-cloud/prowler/pull/10190)
- `scan-reset-ephemeral-resources` post-scan task zeroes `failed_findings_count` for resources missing from the latest full-scope scan, keeping ephemeral resources from polluting the Resources page sort [(#10929)](https://github.com/prowler-cloud/prowler/pull/10929)
### 🔐 Security
- `trivy` binary from 0.69.2 to 0.70.0 and `cryptography` from 46.0.6 to 46.0.7 (transitive via prowler SDK) in the API image for CVE-2026-33186 and CVE-2026-39892 [(#10978)](https://github.com/prowler-cloud/prowler/pull/10978)
---
## [1.26.1] (Prowler v5.25.1)
### 🐞 Fixed
- Attack Paths: AWS scans no longer fail when enabled regions cannot be retrieved, and scans stuck in `scheduled` state are now cleaned up after the stale threshold [(#10917)](https://github.com/prowler-cloud/prowler/pull/10917)
- Scan report and compliance downloads now redirect to a presigned S3 URL instead of streaming through the API worker, preventing gunicorn timeouts on large files [(#10927)](https://github.com/prowler-cloud/prowler/pull/10927)
---
## [1.26.0] (Prowler v5.25.0)
### 🚀 Added
- CIS Benchmark PDF report generation for scans, exposing the latest CIS version per provider via `GET /scans/{id}/cis/{name}/` [(#10650)](https://github.com/prowler-cloud/prowler/pull/10650)
- `/overviews/resource-groups` (resource inventory), `/overviews/categories` and `/overviews/attack-surfaces` now reflect newly-muted findings without waiting for the next scan. The post-mute `reaggregate-all-finding-group-summaries` task now also dispatches `aggregate_scan_resource_group_summaries_task`, `aggregate_scan_category_summaries_task` and `aggregate_attack_surface_task` per latest scan of every `(provider, day)` pair, rebuilding `ScanGroupSummary`, `ScanCategorySummary` and `AttackSurfaceOverview` alongside the tables already covered in #10827 [(#10843)](https://github.com/prowler-cloud/prowler/pull/10843)
- Install zizmor v1.24.1 in API Docker image for GitHub Actions workflow scanning [(#10607)](https://github.com/prowler-cloud/prowler/pull/10607)
### 🔄 Changed
- Allows tenant owners to expel users from their organizations [(#10787)](https://github.com/prowler-cloud/prowler/pull/10787)
- `aggregate_findings`, `aggregate_attack_surface`, `aggregate_scan_resource_group_summaries` and `aggregate_scan_category_summaries` now upsert via `bulk_create(update_conflicts=True, ...)` instead of the prior `ignore_conflicts=True` / plain INSERT / `already backfilled` short-circuit. Re-runs triggered by the post-mute reaggregation pipeline no longer trip the `unique_*_per_scan` constraints nor silently drop updates, and are race-safe under concurrent writers (e.g. scan completion overlapping with a fresh mute rule) [(#10843)](https://github.com/prowler-cloud/prowler/pull/10843)
- Rename the scan-category and scan-resource-group summary aggregators from `backfill_*` to `aggregate_*` [(#10843)](https://github.com/prowler-cloud/prowler/pull/10843)
### 🐞 Fixed
- `generate_outputs_task` crashing with `KeyError` for compliance frameworks listed by `get_compliance_frameworks` but not loadable by `Compliance.get_bulk` [(#10903)](https://github.com/prowler-cloud/prowler/pull/10903)
---
## [1.25.4] (Prowler v5.24.4)
### 🚀 Added
- `DJANGO_SENTRY_TRACES_SAMPLE_RATE` env var (default `0.02`) enables Sentry performance tracing for the API [(#10873)](https://github.com/prowler-cloud/prowler/pull/10873)
### 🔄 Changed
- Attack Paths: Neo4j driver `connection_acquisition_timeout` is now configurable via `NEO4J_CONN_ACQUISITION_TIMEOUT` (default lowered from 120 s to 15 s) [(#10873)](https://github.com/prowler-cloud/prowler/pull/10873)
### 🐞 Fixed
- `/tmp/prowler_api_output` saturation in compliance report workers: the final `rmtree` in `generate_compliance_reports` now only waits on frameworks actually generated for the provider, so unsupported frameworks no longer leave a placeholder `results` entry that blocks cleanup; output directories are created lazily per enabled framework; and both `generate_compliance_reports` and `generate_outputs_task` run an opportunistic stale cleanup at task start with a 48h age threshold, a per-host `fcntl` throttle, a 50-deletions-per-run cap, and guards protecting EXECUTING scans and scans whose `output_location` still points to a local path (metadata lookups are routed through the admin DB so RLS does not hide those rows) [(#10874)](https://github.com/prowler-cloud/prowler/pull/10874)
---
## [1.25.3] (Prowler v5.24.3)
### 🚀 Added
- `/overviews/findings`, `/overviews/findings-severity` and `/overviews/services` now reflect newly-muted findings without waiting for the next scan. The post-mute `reaggregate-all-finding-group-summaries` task was extended to re-run the same per-scan pipeline that scan completion runs (`ScanSummary`, `DailySeveritySummary`, `FindingGroupDailySummary`) on the latest scan of every `(provider, day)` pair, keeping the pre-aggregated tables in sync with `Finding.muted` updates [(#10827)](https://github.com/prowler-cloud/prowler/pull/10827)
### 🐞 Fixed
- Finding groups aggregated `status` now treats muted findings as resolved: a group is `FAIL` only while at least one non-muted FAIL remains, otherwise it is `PASS` (including fully-muted groups). The `filter[status]` filter and the `sort=status` ordering share the same semantics, keeping `status` consistent with `fail_count` and the orthogonal `muted` flag [(#10825)](https://github.com/prowler-cloud/prowler/pull/10825)
- `aggregate_findings` is now idempotent: it deletes the scan's existing `ScanSummary` rows before `bulk_create`, so re-runs (such as the post-mute reaggregation pipeline) no longer violate the `unique_scan_summary` constraint and no longer abort the downstream `DailySeveritySummary` / `FindingGroupDailySummary` recomputation for the affected scan [(#10827)](https://github.com/prowler-cloud/prowler/pull/10827)
- Attack Paths: Findings on AWS were silently dropped during the Neo4j merge for resources whose Cartography node is keyed by a short identifier (e.g. EC2 instances) rather than the full ARN [(#10839)](https://github.com/prowler-cloud/prowler/pull/10839)
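For the status semantics in the first entry above, a sketch of the aggregation (MANUAL handling omitted; model and relation names are assumptions):

```python
from django.db.models import Case, CharField, Count, Q, Value, When

from api.models import FindingGroup  # assumed import path

# A group is FAIL only while at least one non-muted FAIL remains;
# otherwise (including fully-muted groups) it is PASS.
groups = FindingGroup.objects.annotate(
    open_fails=Count(
        "findings",  # assumed related name
        filter=Q(findings__status="FAIL", findings__muted=False),
    )
).annotate(
    status=Case(
        When(open_fails__gt=0, then=Value("FAIL")),
        default=Value("PASS"),
        output_field=CharField(),
    )
)
```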
---
## [1.25.2] (Prowler v5.24.2)
### 🔄 Changed
- Finding groups `/resources` endpoints now materialize the filtered finding IDs into a Python list before filtering `ResourceFindingMapping`, so PostgreSQL switches from a Merge Semi Join that read hundreds of thousands of RFM index entries to a Nested Loop Index Scan over `finding_id`. The `has_mappings.exists()` pre-check is removed, and a request-scoped cache deduplicates the finding-id round-trip across the helpers that build different RFM querysets [(#10816)](https://github.com/prowler-cloud/prowler/pull/10816)
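The shape of that change, as a sketch with assumed names:

```python
from api.models import ResourceFindingMapping  # assumed import path


def mappings_for(filtered_findings):
    # Materializing the IDs sends PostgreSQL a literal list, steering the
    # planner to a Nested Loop Index Scan over finding_id instead of a
    # Merge Semi Join across the whole mapping index.
    finding_ids = list(filtered_findings.values_list("id", flat=True))
    return ResourceFindingMapping.objects.filter(finding_id__in=finding_ids)
```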
### 🐞 Fixed
- `/finding-groups/latest/<check_id>/resources` now selects the latest completed scan per provider by `-completed_at` (then `-inserted_at`) instead of `-inserted_at`, matching the `/finding-groups/latest` summary path and the daily-summary upsert so overlapping scans no longer produce diverging `delta`/`new_count` between the two endpoints [(#10802)](https://github.com/prowler-cloud/prowler/pull/10802)
---
## [1.25.1] (Prowler v5.24.1)
### 🔄 Changed
- Attack Paths: Restore `SYNC_BATCH_SIZE` and `FINDINGS_BATCH_SIZE` defaults to 1000, upgrade Cartography to 0.135.0, enable Celery queue priority for the cleanup task, rewrite Finding insertion, remove AWS graph cleanup, and add timing logs [(#10729)](https://github.com/prowler-cloud/prowler/pull/10729)
### 🐞 Fixed
- Finding group resources endpoints now include findings without associated resources (orphaned IaC findings) as simulated resource rows, and return one row per finding when multiple findings share a resource [(#10708)](https://github.com/prowler-cloud/prowler/pull/10708)
- Attack Paths: Missing `tenant_id` filter while getting related findings after scan completes [(#10722)](https://github.com/prowler-cloud/prowler/pull/10722)
- Finding group counters `pass_count`, `fail_count` and `manual_count` now exclude muted findings [(#10753)](https://github.com/prowler-cloud/prowler/pull/10753)
- Silent data loss in `ResourceFindingMapping` bulk insert that left findings orphaned when `INSERT ... ON CONFLICT DO NOTHING` dropped rows without raising; added explicit `unique_fields` [(#10724)](https://github.com/prowler-cloud/prowler/pull/10724)
- `DELETE /tenants/{tenant_pk}/memberships/{id}` now deletes the expelled user's account when the removed membership was their last one, and blacklists every outstanding refresh token for that user so their existing sessions can no longer mint new access tokens [(#10787)](https://github.com/prowler-cloud/prowler/pull/10787)
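A sketch of the token-revocation half of that change, assuming the API uses `djangorestframework-simplejwt`'s blacklist app (where these models do exist):

```python
from rest_framework_simplejwt.token_blacklist.models import (
    BlacklistedToken,
    OutstandingToken,
)


def revoke_all_sessions(user) -> None:
    # Blacklisting every outstanding refresh token prevents the removed
    # user's existing sessions from minting new access tokens.
    for token in OutstandingToken.objects.filter(user=user):
        BlacklistedToken.objects.get_or_create(token=token)
```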
---
## [1.25.0] (Prowler v5.24.0)
### 🔄 Changed
- Bump Poetry to `2.3.4` in Dockerfile and pre-commit hooks. Regenerate `api/poetry.lock` [(#10681)](https://github.com/prowler-cloud/prowler/pull/10681)
- Attack Paths: Remove dead `cleanup_findings` no-op and its supporting `prowler_finding_lastupdated` index [(#10684)](https://github.com/prowler-cloud/prowler/pull/10684)
### 🐞 Fixed
- Worker-beat race condition on cold start: replaced `sleep 15` with API service healthcheck dependency (Docker Compose) and init containers (Helm), aligned Gunicorn default port to `8080` [(#10603)](https://github.com/prowler-cloud/prowler/pull/10603)
- API container startup crash on Linux due to root-owned bind-mount preventing JWT key generation [(#10646)](https://github.com/prowler-cloud/prowler/pull/10646)
### 🔐 Security
- `pytest` from 8.2.2 to 9.0.3 to fix CVE-2025-71176 [(#10678)](https://github.com/prowler-cloud/prowler/pull/10678)
---
## [1.24.0] (Prowler v5.23.0)
### 🚀 Added
- RBAC role lookup filtered by `tenant_id` to prevent cross-tenant privilege leak [(#10491)](https://github.com/prowler-cloud/prowler/pull/10491)
- `VALKEY_SCHEME`, `VALKEY_USERNAME`, and `VALKEY_PASSWORD` environment variables to configure Celery broker TLS/auth connection details for Valkey/ElastiCache (see the first sketch after this list) [(#10420)](https://github.com/prowler-cloud/prowler/pull/10420)
- `Vercel` provider support [(#10190)](https://github.com/prowler-cloud/prowler/pull/10190)
- Finding groups list and latest endpoints support `sort=delta`, ordering by `new_count` then `changed_count` so groups with the most new findings rank highest [(#10606)](https://github.com/prowler-cloud/prowler/pull/10606)
- Finding group resources endpoints (`/finding-groups/{check_id}/resources` and `/finding-groups/latest/{check_id}/resources`) now expose `finding_id` per row, pointing to the most recent matching Finding for each resource. UUIDv7 ordering guarantees `Max(finding__id)` resolves to the latest snapshot (see the second sketch after this list) [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Handle the CIS and CISA SCuBA compliance frameworks for Google Workspace [(#10629)](https://github.com/prowler-cloud/prowler/pull/10629)
- Sort support for all finding group counter fields: `pass_muted_count`, `fail_muted_count`, `manual_muted_count`, and all `new_*`/`changed_*` status-mute breakdown counters [(#10655)](https://github.com/prowler-cloud/prowler/pull/10655)
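For the Valkey variables above, a sketch of assembling the Celery broker URL; the host/port variable names and the final setting name are assumptions:

```python
import os

scheme = os.environ.get("VALKEY_SCHEME", "redis")  # e.g. "rediss" for TLS
username = os.environ.get("VALKEY_USERNAME", "")
password = os.environ.get("VALKEY_PASSWORD", "")
host = os.environ.get("VALKEY_HOST", "localhost")  # assumed env var
port = os.environ.get("VALKEY_PORT", "6379")       # assumed env var

auth = f"{username}:{password}@" if (username or password) else ""
CELERY_BROKER_URL = f"{scheme}://{auth}{host}:{port}/0"
```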
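And for the per-row `finding_id` above, a sketch of why a plain `Max` suffices: UUIDv7 values are time-ordered, so the maximum id is the newest finding. Model and relation names are assumptions:

```python
from django.db.models import Max

from api.models import ResourceFindingMapping  # assumed import path

# One row per resource, carrying the id of its most recent finding:
# UUIDv7 ids sort by creation time, so Max() picks the latest snapshot.
rows = ResourceFindingMapping.objects.values("resource_id").annotate(
    latest_finding_id=Max("finding_id")  # assumed FK column name
)
```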
### 🔄 Changed
- Finding groups list/latest/resources now expose `status` (`FAIL`, `PASS` or `MANUAL`) and a boolean `muted` as orthogonal fields. The aggregated `status` reflects the underlying check outcome regardless of mute state, and `muted=true` signals that every finding in the group/resource is muted. A new `manual_count` is exposed alongside `pass_count`/`fail_count`, plus `pass_muted_count`/`fail_muted_count`/`manual_muted_count` siblings so clients can isolate the muted half of each status. The `new_*`/`changed_*` deltas are now broken down by status and mute state via 12 new counters (`new_fail_count`, `new_fail_muted_count`, `new_pass_count`, `new_pass_muted_count`, `new_manual_count`, `new_manual_muted_count` and the matching `changed_*` set). New `filter[muted]=true|false` and `sort=status` (FAIL > PASS > MANUAL) / `sort=muted` are supported. `filter[status]=MUTED` is no longer accepted [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Attack Paths: Periodic cleanup of stale scans with dead-worker detection via Celery inspect, marking orphaned `EXECUTING` scans as `FAILED` and recovering `graph_data_ready` [(#10387)](https://github.com/prowler-cloud/prowler/pull/10387)
- Attack Paths: Replace `_provider_id` property with `_Provider_{uuid}` label for provider isolation, add regex-based label injection for custom queries [(#10402)](https://github.com/prowler-cloud/prowler/pull/10402)
### 🐞 Fixed
- `reaggregate_all_finding_group_summaries_task` now refreshes finding group daily summaries for every `(provider, day)` combination instead of only the latest scan per provider, matching the unbounded scope of `mute_historical_findings_task`. Mute rule operations no longer leave older daily summaries drifting from the underlying muted findings [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Finding groups list/latest now apply computed status/severity filters and finding-level prefilters (delta, region, service, category, resource group, scan, resource type), plus `check_title` support for sort/filter consistency [(#10428)](https://github.com/prowler-cloud/prowler/pull/10428)
- Populate compliance data inside `check_metadata` for findings, which was always returned as `null` [(#10449)](https://github.com/prowler-cloud/prowler/pull/10449)
- 403 error for admin users listing tenants due to roles query not using the admin database connection [(#10460)](https://github.com/prowler-cloud/prowler/pull/10460)
- Membership `post_delete` signal using raw FK ids to avoid `DoesNotExist` during cascade deletions [(#10497)](https://github.com/prowler-cloud/prowler/pull/10497)
- Finding group resources endpoints returning false 404 when filters match no results, and `sort` parameter being ignored [(#10510)](https://github.com/prowler-cloud/prowler/pull/10510)
- Jira integration failing with `JiraInvalidIssueTypeError` on non-English Jira instances due to hardcoded `"Task"` issue type; now dynamically fetches available issue types per project [(#10534)](https://github.com/prowler-cloud/prowler/pull/10534)
- Finding group `first_seen_at` now reflects when a new finding appeared in the scan instead of the oldest carry-forward date across all unchanged findings [(#10595)](https://github.com/prowler-cloud/prowler/pull/10595)
- Attack Paths: Remove `clear_cache` call from read-only query endpoints; cache clearing belongs to the scan/ingestion flow, not API queries [(#10586)](https://github.com/prowler-cloud/prowler/pull/10586)
### 🔐 Security
- Pin all unpinned dependencies to exact versions to prevent supply chain attacks and ensure reproducible builds [(#10469)](https://github.com/prowler-cloud/prowler/pull/10469)
- `authlib` bumped from 1.6.6 to 1.6.9 to fix CVE-2026-28802 (JWT `alg: none` validation bypass) [(#10579)](https://github.com/prowler-cloud/prowler/pull/10579)
- `aiohttp` bumped from 3.13.3 to 3.13.5 to fix CVE-2026-34520 (the C parser accepted null bytes and control characters in response headers) [(#10538)](https://github.com/prowler-cloud/prowler/pull/10538)
---
**Dockerfile** (+22 -2)
@@ -5,9 +5,12 @@ LABEL maintainer="https://github.com/prowler-cloud/api"
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
ARG TRIVY_VERSION=0.69.2
ARG TRIVY_VERSION=0.70.0
ENV TRIVY_VERSION=${TRIVY_VERSION}
ARG ZIZMOR_VERSION=1.24.1
ENV ZIZMOR_VERSION=${ZIZMOR_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
wget \
@@ -22,6 +25,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libtool \
libxslt1-dev \
python3-dev \
git \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
@@ -57,6 +61,22 @@ RUN ARCH=$(uname -m) && \
mkdir -p /tmp/.cache/trivy && \
chmod 777 /tmp/.cache/trivy
# Install zizmor for GitHub Actions workflow scanning
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
ZIZMOR_ARCH="x86_64-unknown-linux-gnu" ; \
elif [ "$ARCH" = "aarch64" ]; then \
ZIZMOR_ARCH="aarch64-unknown-linux-gnu" ; \
else \
echo "Unsupported architecture for zizmor: $ARCH" && exit 1 ; \
fi && \
wget --progress=dot:giga "https://github.com/zizmorcore/zizmor/releases/download/v${ZIZMOR_VERSION}/zizmor-${ZIZMOR_ARCH}.tar.gz" -O /tmp/zizmor.tar.gz && \
mkdir -p /tmp/zizmor-extract && \
tar zxf /tmp/zizmor.tar.gz -C /tmp/zizmor-extract && \
mv /tmp/zizmor-extract/zizmor /usr/local/bin/zizmor && \
chmod +x /usr/local/bin/zizmor && \
rm -rf /tmp/zizmor.tar.gz /tmp/zizmor-extract
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
@@ -71,7 +91,7 @@ RUN mkdir -p /tmp/prowler_api_output
COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
pip install --no-cache-dir poetry==2.3.4
ENV PATH="/home/prowler/.local/bin:$PATH"
**Worker entrypoint script** (-1)
@@ -56,7 +56,6 @@ start_worker() {
start_worker_beat() {
echo "Starting the worker-beat..."
sleep 15
poetry run python -m celery -A config.celery beat -l "${DJANGO_LOGGING_LEVEL:-info}" --scheduler django_celery_beat.schedulers:DatabaseScheduler
}
**poetry.lock** (+319 -267)
@@ -103,132 +103,132 @@ files = [
[[package]]
name = "aiohttp"
version = "3.13.3"
version = "3.13.5"
description = "Async http client/server framework (asyncio)"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d5a372fd5afd301b3a89582817fdcdb6c34124787c70dbcc616f259013e7eef7"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:147e422fd1223005c22b4fe080f5d93ced44460f5f9c105406b753612b587821"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:859bd3f2156e81dd01432f5849fc73e2243d4a487c4fd26609b1299534ee1845"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dca68018bf48c251ba17c72ed479f4dafe9dbd5a73707ad8d28a38d11f3d42af"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fee0c6bc7db1de362252affec009707a17478a00ec69f797d23ca256e36d5940"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c048058117fd649334d81b4b526e94bde3ccaddb20463a815ced6ecbb7d11160"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:215a685b6fbbfcf71dfe96e3eba7a6f58f10da1dfdf4889c7dd856abe430dca7"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2c184bb1fe2cbd2cefba613e9db29a5ab559323f994b6737e370d3da0ac455"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:75ca857eba4e20ce9f546cd59c7007b33906a4cd48f2ff6ccf1ccfc3b646f279"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:81e97251d9298386c2b7dbeb490d3d1badbdc69107fb8c9299dd04eb39bddc0e"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:c0e2d366af265797506f0283487223146af57815b388623f0357ef7eac9b209d"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4e239d501f73d6db1522599e14b9b321a7e3b1de66ce33d53a765d975e9f4808"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:0db318f7a6f065d84cb1e02662c526294450b314a02bd9e2a8e67f0d8564ce40"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:bfc1cc2fe31a6026a8a88e4ecfb98d7f6b1fec150cfd708adbfd1d2f42257c29"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:af71fff7bac6bb7508956696dce8f6eec2bbb045eceb40343944b1ae62b5ef11"},
{file = "aiohttp-3.13.3-cp310-cp310-win32.whl", hash = "sha256:37da61e244d1749798c151421602884db5270faf479cf0ef03af0ff68954c9dd"},
{file = "aiohttp-3.13.3-cp310-cp310-win_amd64.whl", hash = "sha256:7e63f210bc1b57ef699035f2b4b6d9ce096b5914414a49b0997c839b2bd2223c"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5b6073099fb654e0a068ae678b10feff95c5cae95bbfcbfa7af669d361a8aa6b"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cb93e166e6c28716c8c6aeb5f99dfb6d5ccf482d29fe9bf9a794110e6d0ab64"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:28e027cf2f6b641693a09f631759b4d9ce9165099d2b5d92af9bd4e197690eea"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3b61b7169ababd7802f9568ed96142616a9118dd2be0d1866e920e77ec8fa92a"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:80dd4c21b0f6237676449c6baaa1039abae86b91636b6c91a7f8e61c87f89540"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:65d2ccb7eabee90ce0503c17716fc77226be026dcc3e65cce859a30db715025b"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5b179331a481cb5529fca8b432d8d3c7001cb217513c94cd72d668d1248688a3"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d4c940f02f49483b18b079d1c27ab948721852b281f8b015c058100e9421dd1"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f9444f105664c4ce47a2a7171a2418bce5b7bae45fb610f4e2c36045d85911d3"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:694976222c711d1d00ba131904beb60534f93966562f64440d0c9d41b8cdb440"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f33ed1a2bf1997a36661874b017f5c4b760f41266341af36febaf271d179f6d7"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e636b3c5f61da31a92bf0d91da83e58fdfa96f178ba682f11d24f31944cdd28c"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5d2d94f1f5fcbe40838ac51a6ab5704a6f9ea42e72ceda48de5e6b898521da51"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2be0e9ccf23e8a94f6f0650ce06042cefc6ac703d0d7ab6c7a917289f2539ad4"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9af5e68ee47d6534d36791bbe9b646d2a7c7deb6fc24d7943628edfbb3581f29"},
{file = "aiohttp-3.13.3-cp311-cp311-win32.whl", hash = "sha256:a2212ad43c0833a873d0fb3c63fa1bacedd4cf6af2fee62bf4b739ceec3ab239"},
{file = "aiohttp-3.13.3-cp311-cp311-win_amd64.whl", hash = "sha256:642f752c3eb117b105acbd87e2c143de710987e09860d674e068c4c2c441034f"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b903a4dfee7d347e2d87697d0713be59e0b87925be030c9178c5faa58ea58d5c"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a45530014d7a1e09f4a55f4f43097ba0fd155089372e105e4bff4ca76cb1b168"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:27234ef6d85c914f9efeb77ff616dbf4ad2380be0cda40b4db086ffc7ddd1b7d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d32764c6c9aafb7fb55366a224756387cd50bfa720f32b88e0e6fa45b27dcf29"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b1a6102b4d3ebc07dad44fbf07b45bb600300f15b552ddf1851b5390202ea2e3"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c014c7ea7fb775dd015b2d3137378b7be0249a448a1612268b5a90c2d81de04d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2b8d8ddba8f95ba17582226f80e2de99c7a7948e66490ef8d947e272a93e9463"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ae8dd55c8e6c4257eae3a20fd2c8f41edaea5992ed67156642493b8daf3cecc"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:01ad2529d4b5035578f5081606a465f3b814c542882804e2e8cda61adf5c71bf"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bb4f7475e359992b580559e008c598091c45b5088f28614e855e42d39c2f1033"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:c19b90316ad3b24c69cd78d5c9b4f3aa4497643685901185b65166293d36a00f"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:96d604498a7c782cb15a51c406acaea70d8c027ee6b90c569baa6e7b93073679"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:084911a532763e9d3dd95adf78a78f4096cd5f58cdc18e6fdbc1b58417a45423"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7a4a94eb787e606d0a09404b9c38c113d3b099d508021faa615d70a0131907ce"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:87797e645d9d8e222e04160ee32aa06bc5c163e8499f24db719e7852ec23093a"},
{file = "aiohttp-3.13.3-cp312-cp312-win32.whl", hash = "sha256:b04be762396457bef43f3597c991e192ee7da460a4953d7e647ee4b1c28e7046"},
{file = "aiohttp-3.13.3-cp312-cp312-win_amd64.whl", hash = "sha256:e3531d63d3bdfa7e3ac5e9b27b2dd7ec9df3206a98e0b3445fa906f233264c57"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:5dff64413671b0d3e7d5918ea490bdccb97a4ad29b3f311ed423200b2203e01c"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:87b9aab6d6ed88235aa2970294f496ff1a1f9adcd724d800e9b952395a80ffd9"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:425c126c0dc43861e22cb1c14ba4c8e45d09516d0a3ae0a3f7494b79f5f233a3"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f9120f7093c2a32d9647abcaf21e6ad275b4fbec5b55969f978b1a97c7c86bf"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:697753042d57f4bf7122cab985bf15d0cef23c770864580f5af4f52023a56bd6"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6de499a1a44e7de70735d0b39f67c8f25eb3d91eb3103be99ca0fa882cdd987d"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:37239e9f9a7ea9ac5bf6b92b0260b01f8a22281996da609206a84df860bc1261"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f76c1e3fe7d7c8afad7ed193f89a292e1999608170dcc9751a7462a87dfd5bc0"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fc290605db2a917f6e81b0e1e0796469871f5af381ce15c604a3c5c7e51cb730"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4021b51936308aeea0367b8f006dc999ca02bc118a0cc78c303f50a2ff6afb91"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:49a03727c1bba9a97d3e93c9f93ca03a57300f484b6e935463099841261195d3"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3d9908a48eb7416dc1f4524e69f1d32e5d90e3981e4e37eb0aa1cd18f9cfa2a4"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2712039939ec963c237286113c68dbad80a82a4281543f3abf766d9d73228998"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:7bfdc049127717581866fa4708791220970ce291c23e28ccf3922c700740fdc0"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8057c98e0c8472d8846b9c79f56766bcc57e3e8ac7bfd510482332366c56c591"},
{file = "aiohttp-3.13.3-cp313-cp313-win32.whl", hash = "sha256:1449ceddcdbcf2e0446957863af03ebaaa03f94c090f945411b61269e2cb5daf"},
{file = "aiohttp-3.13.3-cp313-cp313-win_amd64.whl", hash = "sha256:693781c45a4033d31d4187d2436f5ac701e7bbfe5df40d917736108c1cc7436e"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:ea37047c6b367fd4bd632bff8077449b8fa034b69e812a18e0132a00fae6e808"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6fc0e2337d1a4c3e6acafda6a78a39d4c14caea625124817420abceed36e2415"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c685f2d80bb67ca8c3837823ad76196b3694b0159d232206d1e461d3d434666f"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e377758516d262bde50c2584fc6c578af272559c409eecbdd2bae1601184d6"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:34749271508078b261c4abb1767d42b8d0c0cc9449c73a4df494777dc55f0687"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:82611aeec80eb144416956ec85b6ca45a64d76429c1ed46ae1b5f86c6e0c9a26"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2fff83cfc93f18f215896e3a190e8e5cb413ce01553901aca925176e7568963a"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bbe7d4cecacb439e2e2a8a1a7b935c25b812af7a5fd26503a66dadf428e79ec1"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b928f30fe49574253644b1ca44b1b8adbd903aa0da4b9054a6c20fc7f4092a25"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7b5e8fe4de30df199155baaf64f2fcd604f4c678ed20910db8e2c66dc4b11603"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:8542f41a62bcc58fc7f11cf7c90e0ec324ce44950003feb70640fc2a9092c32a"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5e1d8c8b8f1d91cd08d8f4a3c2b067bfca6ec043d3ff36de0f3a715feeedf926"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:90455115e5da1c3c51ab619ac57f877da8fd6d73c05aacd125c5ae9819582aba"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:042e9e0bcb5fba81886c8b4fbb9a09d6b8a00245fd8d88e4d989c1f96c74164c"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2eb752b102b12a76ca02dff751a801f028b4ffbbc478840b473597fc91a9ed43"},
{file = "aiohttp-3.13.3-cp314-cp314-win32.whl", hash = "sha256:b556c85915d8efaed322bf1bdae9486aa0f3f764195a0fb6ee962e5c71ef5ce1"},
{file = "aiohttp-3.13.3-cp314-cp314-win_amd64.whl", hash = "sha256:9bf9f7a65e7aa20dd764151fb3d616c81088f91f8df39c3893a536e279b4b984"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:05861afbbec40650d8a07ea324367cb93e9e8cc7762e04dd4405df99fa65159c"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2fc82186fadc4a8316768d61f3722c230e2c1dcab4200d52d2ebdf2482e47592"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0add0900ff220d1d5c5ebbf99ed88b0c1bbf87aa7e4262300ed1376a6b13414f"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:568f416a4072fbfae453dcf9a99194bbb8bdeab718e08ee13dfa2ba0e4bebf29"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:add1da70de90a2569c5e15249ff76a631ccacfe198375eead4aadf3b8dc849dc"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:10b47b7ba335d2e9b1239fa571131a87e2d8ec96b333e68b2a305e7a98b0bae2"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3dd4dce1c718e38081c8f35f323209d4c1df7d4db4bab1b5c88a6b4d12b74587"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34bac00a67a812570d4a460447e1e9e06fae622946955f939051e7cc895cfab8"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a19884d2ee70b06d9204b2727a7b9f983d0c684c650254679e716b0b77920632"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ca7f2bb6ba8348a3614c7918cc4bb73268c5ac2a207576b7afea19d3d9f64"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:b0d95340658b9d2f11d9697f59b3814a9d3bb4b7a7c20b131df4bcef464037c0"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:a1e53262fd202e4b40b70c3aff944a8155059beedc8a89bba9dc1f9ef06a1b56"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d60ac9663f44168038586cab2157e122e46bdef09e9368b37f2d82d354c23f72"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:90751b8eed69435bac9ff4e3d2f6b3af1f57e37ecb0fbeee59c0174c9e2d41df"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fc353029f176fd2b3ec6cfc71be166aba1936fe5d73dd1992ce289ca6647a9aa"},
{file = "aiohttp-3.13.3-cp314-cp314t-win32.whl", hash = "sha256:2e41b18a58da1e474a057b3d35248d8320029f61d70a37629535b16a0c8f3767"},
{file = "aiohttp-3.13.3-cp314-cp314t-win_amd64.whl", hash = "sha256:44531a36aa2264a1860089ffd4dce7baf875ee5a6079d5fb42e261c704ef7344"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:31a83ea4aead760dfcb6962efb1d861db48c34379f2ff72db9ddddd4cda9ea2e"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:988a8c5e317544fdf0d39871559e67b6341065b87fceac641108c2096d5506b7"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9b174f267b5cfb9a7dba9ee6859cecd234e9a681841eb85068059bc867fb8f02"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:947c26539750deeaee933b000fb6517cc770bbd064bad6033f1cff4803881e43"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9ebf57d09e131f5323464bd347135a88622d1c0976e88ce15b670e7ad57e4bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4ae5b5a0e1926e504c81c5b84353e7a5516d8778fbbff00429fe7b05bb25cbce"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2ba0eea45eb5cc3172dbfc497c066f19c41bac70963ea1a67d51fc92e4cf9a80"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bae5c2ed2eae26cc382020edad80d01f36cb8e746da40b292e68fec40421dc6a"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8a60e60746623925eab7d25823329941aee7242d559baa119ca2b253c88a7bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e50a2e1404f063427c9d027378472316201a2290959a295169bcf25992d04558"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:9a9dc347e5a3dc7dfdbc1f82da0ef29e388ddb2ed281bfce9dd8248a313e62b7"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:b46020d11d23fe16551466c77823df9cc2f2c1e63cc965daf67fa5eec6ca1877"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:69c56fbc1993fa17043e24a546959c0178fe2b5782405ad4559e6c13975c15e3"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:b99281b0704c103d4e11e72a76f1b543d4946fea7dd10767e7e1b5f00d4e5704"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:40c5e40ecc29ba010656c18052b877a1c28f84344825efa106705e835c28530f"},
{file = "aiohttp-3.13.3-cp39-cp39-win32.whl", hash = "sha256:56339a36b9f1fc708260c76c87e593e2afb30d26de9ae1eb445b5e051b98a7a1"},
{file = "aiohttp-3.13.3-cp39-cp39-win_amd64.whl", hash = "sha256:c6b8568a3bb5819a0ad087f16d40e5a3fb6099f39ea1d5625a3edc1e923fc538"},
{file = "aiohttp-3.13.3.tar.gz", hash = "sha256:a949eee43d3782f2daae4f4a2819b2cb9b0c5d3b7f7a927067cc84dafdbb9f88"},
{file = "aiohttp-3.13.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:02222e7e233295f40e011c1b00e3b0bd451f22cf853a0304c3595633ee47da4b"},
{file = "aiohttp-3.13.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bace460460ed20614fa6bc8cb09966c0b8517b8c58ad8046828c6078d25333b5"},
{file = "aiohttp-3.13.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f546a4dc1e6a5edbb9fd1fd6ad18134550e096a5a43f4ad74acfbd834fc6670"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c86969d012e51b8e415a8c6ce96f7857d6a87d6207303ab02d5d11ef0cad2274"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b6f6cd1560c5fa427e3b6074bb24d2c64e225afbb7165008903bd42e4e33e28a"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:636bc362f0c5bbc7372bc3ae49737f9e3030dbce469f0f422c8f38079780363d"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6a7cbeb06d1070f1d14895eeeed4dac5913b22d7b456f2eb969f11f4b3993796"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bca9ef7517fd7874a1a08970ae88f497bf5c984610caa0bf40bd7e8450852b95"},
{file = "aiohttp-3.13.5-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:019a67772e034a0e6b9b17c13d0a8fe56ad9fb150fc724b7f3ffd3724288d9e5"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:f34ecee82858e41dd217734f0c41a532bd066bcaab636ad830f03a30b2a96f2a"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:4eac02d9af4813ee289cd63a361576da36dba57f5a1ab36377bc2600db0cbb73"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4beac52e9fe46d6abf98b0176a88154b742e878fdf209d2248e99fcdf73cd297"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:c180f480207a9b2475f2b8d8bd7204e47aec952d084b2a2be58a782ffcf96074"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2837fb92951564d6339cedae4a7231692aa9f73cbc4fb2e04263b96844e03b4e"},
{file = "aiohttp-3.13.5-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d9010032a0b9710f58012a1e9c222528763d860ba2ee1422c03473eab47703e7"},
{file = "aiohttp-3.13.5-cp310-cp310-win32.whl", hash = "sha256:7c4b6668b2b2b9027f209ddf647f2a4407784b5d88b8be4efcc72036f365baf9"},
{file = "aiohttp-3.13.5-cp310-cp310-win_amd64.whl", hash = "sha256:cd3db5927bf9167d5a6157ddb2f036f6b6b0ad001ac82355d43e97a4bde76d76"},
{file = "aiohttp-3.13.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7ab7229b6f9b5c1ba4910d6c41a9eb11f543eadb3f384df1b4c293f4e73d44d6"},
{file = "aiohttp-3.13.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8f14c50708bb156b3a3ca7230b3d820199d56a48e3af76fa21c2d6087190fe3d"},
{file = "aiohttp-3.13.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e7d2f8616f0ff60bd332022279011776c3ac0faa0f1b463f7bb12326fbc97a1c"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a2567b72e1ffc3ab25510db43f355b29eeada56c0a622e58dcdb19530eb0a3cb"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fb0540c854ac9c0c5ad495908fdfd3e332d553ec731698c0e29b1877ba0d2ec6"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c9883051c6972f58bfc4ebb2116345ee2aa151178e99c3f2b2bbe2af712abd13"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2294172ce08a82fb7c7273485895de1fa1186cc8294cfeb6aef4af42ad261174"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3a807cabd5115fb55af198b98178997a5e0e57dead43eb74a93d9c07d6d4a7dc"},
{file = "aiohttp-3.13.5-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:aa6d0d932e0f39c02b80744273cd5c388a2d9bc07760a03164f229c8e02662f6"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:60869c7ac4aaabe7110f26499f3e6e5696eae98144735b12a9c3d9eae2b51a49"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:26d2f8546f1dfa75efa50c3488215a903c0168d253b75fba4210f57ab77a0fb8"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f1162a1492032c82f14271e831c8f4b49f2b6078f4f5fc74de2c912fa225d51d"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:8b14eb3262fad0dc2f89c1a43b13727e709504972186ff6a99a3ecaa77102b6c"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:ca9ac61ac6db4eb6c2a0cd1d0f7e1357647b638ccc92f7e9d8d133e71ed3c6ac"},
{file = "aiohttp-3.13.5-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7996023b2ed59489ae4762256c8516df9820f751cf2c5da8ed2fb20ee50abab3"},
{file = "aiohttp-3.13.5-cp311-cp311-win32.whl", hash = "sha256:77dfa48c9f8013271011e51c00f8ada19851f013cde2c48fca1ba5e0caf5bb06"},
{file = "aiohttp-3.13.5-cp311-cp311-win_amd64.whl", hash = "sha256:d3a4834f221061624b8887090637db9ad4f61752001eae37d56c52fddade2dc8"},
{file = "aiohttp-3.13.5-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:023ecba036ddd840b0b19bf195bfae970083fd7024ce1ac22e9bba90464620e9"},
{file = "aiohttp-3.13.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:15c933ad7920b7d9a20de151efcd05a6e38302cbf0e10c9b2acb9a42210a2416"},
{file = "aiohttp-3.13.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ab2899f9fa2f9f741896ebb6fa07c4c883bfa5c7f2ddd8cf2aafa86fa981b2d2"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a60eaa2d440cd4707696b52e40ed3e2b0f73f65be07fd0ef23b6b539c9c0b0b4"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:55b3bdd3292283295774ab585160c4004f4f2f203946997f49aac032c84649e9"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c2b2355dc094e5f7d45a7bb262fe7207aa0460b37a0d87027dcf21b5d890e7d5"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b38765950832f7d728297689ad78f5f2cf79ff82487131c4d26fe6ceecdc5f8e"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b18f31b80d5a33661e08c89e202edabf1986e9b49c42b4504371daeaa11b47c1"},
{file = "aiohttp-3.13.5-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:33add2463dde55c4f2d9635c6ab33ce154e5ecf322bd26d09af95c5f81cfa286"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:327cc432fdf1356fb4fbc6fe833ad4e9f6aacb71a8acaa5f1855e4b25910e4a9"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:7c35b0bf0b48a70b4cb4fc5d7bed9b932532728e124874355de1a0af8ec4bc88"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:df23d57718f24badef8656c49743e11a89fd6f5358fa8a7b96e728fda2abf7d3"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:02e048037a6501a5ec1f6fc9736135aec6eb8a004ce48838cb951c515f32c80b"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:31cebae8b26f8a615d2b546fee45d5ffb76852ae6450e2a03f42c9102260d6fe"},
{file = "aiohttp-3.13.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:888e78eb5ca55a615d285c3c09a7a91b42e9dd6fc699b166ebd5dee87c9ccf14"},
{file = "aiohttp-3.13.5-cp312-cp312-win32.whl", hash = "sha256:8bd3ec6376e68a41f9f95f5ed170e2fcf22d4eb27a1f8cb361d0508f6e0557f3"},
{file = "aiohttp-3.13.5-cp312-cp312-win_amd64.whl", hash = "sha256:110e448e02c729bcebb18c60b9214a87ba33bac4a9fa5e9a5f139938b56c6cb1"},
{file = "aiohttp-3.13.5-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5029cc80718bbd545123cd8fe5d15025eccaaaace5d0eeec6bd556ad6163d61"},
{file = "aiohttp-3.13.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4bb6bf5811620003614076bdc807ef3b5e38244f9d25ca5fe888eaccea2a9832"},
{file = "aiohttp-3.13.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a84792f8631bf5a94e52d9cc881c0b824ab42717165a5579c760b830d9392ac9"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:57653eac22c6a4c13eb22ecf4d673d64a12f266e72785ab1c8b8e5940d0e8090"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5e5f7debc7a57af53fdf5c5009f9391d9f4c12867049d509bf7bb164a6e295b"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c719f65bebcdf6716f10e9eff80d27567f7892d8988c06de12bbbd39307c6e3a"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d97f93fdae594d886c5a866636397e2bcab146fd7a132fd6bb9ce182224452f8"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3df334e39d4c2f899a914f1dba283c1aadc311790733f705182998c6f7cae665"},
{file = "aiohttp-3.13.5-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fe6970addfea9e5e081401bcbadf865d2b6da045472f58af08427e108d618540"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7becdf835feff2f4f335d7477f121af787e3504b48b449ff737afb35869ba7bb"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:676e5651705ad5d8a70aeb8eb6936c436d8ebbd56e63436cb7dd9bb36d2a9a46"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:9b16c653d38eb1a611cc898c41e76859ca27f119d25b53c12875fd0474ae31a8"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:999802d5fa0389f58decd24b537c54aa63c01c3219ce17d1214cbda3c2b22d2d"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ec707059ee75732b1ba130ed5f9580fe10ff75180c812bc267ded039db5128c6"},
{file = "aiohttp-3.13.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2d6d44a5b48132053c2f6cd5c8cb14bc67e99a63594e336b0f2af81e94d5530c"},
{file = "aiohttp-3.13.5-cp313-cp313-win32.whl", hash = "sha256:329f292ed14d38a6c4c435e465f48bebb47479fd676a0411936cc371643225cc"},
{file = "aiohttp-3.13.5-cp313-cp313-win_amd64.whl", hash = "sha256:69f571de7500e0557801c0b51f4780482c0ec5fe2ac851af5a92cfce1af1cb83"},
{file = "aiohttp-3.13.5-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:eb4639f32fd4a9904ab8fb45bf3383ba71137f3d9d4ba25b3b3f3109977c5b8c"},
{file = "aiohttp-3.13.5-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:7e5dc4311bd5ac493886c63cbf76ab579dbe4641268e7c74e48e774c74b6f2be"},
{file = "aiohttp-3.13.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:756c3c304d394977519824449600adaf2be0ccee76d206ee339c5e76b70ded25"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ecc26751323224cf8186efcf7fbcbc30f4e1d8c7970659daf25ad995e4032a56"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10a75acfcf794edf9d8db50e5a7ec5fc818b2a8d3f591ce93bc7b1210df016d2"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0f7a18f258d124cd678c5fe072fe4432a4d5232b0657fca7c1847f599233c83a"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:df6104c009713d3a89621096f3e3e88cc323fd269dbd7c20afe18535094320be"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:241a94f7de7c0c3b616627aaad530fe2cb620084a8b144d3be7b6ecfe95bae3b"},
{file = "aiohttp-3.13.5-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c974fb66180e58709b6fc402846f13791240d180b74de81d23913abe48e96d94"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:6e27ea05d184afac78aabbac667450c75e54e35f62238d44463131bd3f96753d"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a79a6d399cef33a11b6f004c67bb07741d91f2be01b8d712d52c75711b1e07c7"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c632ce9c0b534fbe25b52c974515ed674937c5b99f549a92127c85f771a78772"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:fceedde51fbd67ee2bcc8c0b33d0126cc8b51ef3bbde2f86662bd6d5a6f10ec5"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f92995dfec9420bb69ae629abf422e516923ba79ba4403bc750d94fb4a6c68c1"},
{file = "aiohttp-3.13.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:20ae0ff08b1f2c8788d6fb85afcb798654ae6ba0b747575f8562de738078457b"},
{file = "aiohttp-3.13.5-cp314-cp314-win32.whl", hash = "sha256:b20df693de16f42b2472a9c485e1c948ee55524786a0a34345511afdd22246f3"},
{file = "aiohttp-3.13.5-cp314-cp314-win_amd64.whl", hash = "sha256:f85c6f327bf0b8c29da7d93b1cabb6363fb5e4e160a32fa241ed2dce21b73162"},
{file = "aiohttp-3.13.5-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:1efb06900858bb618ff5cee184ae2de5828896c448403d51fb633f09e109be0a"},
{file = "aiohttp-3.13.5-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:fee86b7c4bd29bdaf0d53d14739b08a106fdda809ca5fe032a15f52fae5fe254"},
{file = "aiohttp-3.13.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:20058e23909b9e65f9da62b396b77dfa95965cbe840f8def6e572538b1d32e36"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8cf20a8d6868cb15a73cab329ffc07291ba8c22b1b88176026106ae39aa6df0f"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:330f5da04c987f1d5bdb8ae189137c77139f36bd1cb23779ca1a354a4b027800"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6f1cbf0c7926d315c3c26c2da41fd2b5d2fe01ac0e157b78caefc51a782196cf"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:53fc049ed6390d05423ba33103ded7281fe897cf97878f369a527070bd95795b"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:898703aa2667e3c5ca4c54ca36cd73f58b7a38ef87a5606414799ebce4d3fd3a"},
{file = "aiohttp-3.13.5-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0494a01ca9584eea1e5fbd6d748e61ecff218c51b576ee1999c23db7066417d8"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:6cf81fe010b8c17b09495cbd15c1d35afbc8fb405c0c9cf4738e5ae3af1d65be"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:c564dd5f09ddc9d8f2c2d0a301cd30a79a2cc1b46dd1a73bef8f0038863d016b"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:2994be9f6e51046c4f864598fd9abeb4fba6e88f0b2152422c9666dcd4aea9c6"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:157826e2fa245d2ef46c83ea8a5faf77ca19355d278d425c29fda0beb3318037"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:a8aca50daa9493e9e13c0f566201a9006f080e7c50e5e90d0b06f53146a54500"},
{file = "aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3b13560160d07e047a93f23aaa30718606493036253d5430887514715b67c9d9"},
{file = "aiohttp-3.13.5-cp314-cp314t-win32.whl", hash = "sha256:9a0f4474b6ea6818b41f82172d799e4b3d29e22c2c520ce4357856fced9af2f8"},
{file = "aiohttp-3.13.5-cp314-cp314t-win_amd64.whl", hash = "sha256:18a2f6c1182c51baa1d28d68fea51513cb2a76612f038853c0ad3c145423d3d9"},
{file = "aiohttp-3.13.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:347542f0ea3f95b2a955ee6656461fa1c776e401ac50ebce055a6c38454a0adf"},
{file = "aiohttp-3.13.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:178c7b5e62b454c2bc790786e6058c3cc968613b4419251b478c153a4aec32b1"},
{file = "aiohttp-3.13.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:af545c2cffdb0967a96b6249e6f5f7b0d92cdfd267f9d5238d5b9ca63e8edb10"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:206b7b3ef96e4ce211754f0cd003feb28b7d81f0ad26b8d077a5d5161436067f"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ee5e86776273de1795947d17bddd6bb19e0365fd2af4289c0d2c5454b6b1d36b"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:95d14ca7abefde230f7639ec136ade282655431fd5db03c343b19dda72dd1643"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:912d4b6af530ddb1338a66229dac3a25ff11d4448be3ec3d6340583995f56031"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e999f0c88a458c836d5fb521814e92ed2172c649200336a6df514987c1488258"},
{file = "aiohttp-3.13.5-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:39380e12bd1f2fdab4285b6e055ad48efbaed5c836433b142ed4f5b9be71036a"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:9efcc0f11d850cefcafdd9275b9576ad3bfb539bed96807663b32ad99c4d4b88"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:147b4f501d0292077f29d5268c16bb7c864a1f054d7001c4c1812c0421ea1ed0"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:d147004fede1b12f6013a6dbb2a26a986a671a03c6ea740ddc76500e5f1c399f"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:9277145d36a01653863899c665243871434694bcc3431922c3b35c978061bdb8"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:4e704c52438f66fdd89588346183d898bb42167cf88f8b7ff1c0f9fc957c348f"},
{file = "aiohttp-3.13.5-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:a8a4d3427e8de1312ddf309cc482186466c79895b3a139fed3259fc01dfa9a5b"},
{file = "aiohttp-3.13.5-cp39-cp39-win32.whl", hash = "sha256:6f497a6876aa4b1a102b04996ce4c1170c7040d83faa9387dd921c16e30d5c83"},
{file = "aiohttp-3.13.5-cp39-cp39-win_amd64.whl", hash = "sha256:cb979826071c0986a5f08333a36104153478ce6018c58cba7f9caddaf63d5d67"},
{file = "aiohttp-3.13.5.tar.gz", hash = "sha256:9d98cc980ecc96be6eb4c1994ce35d28d8b1f5e5208a23b421187d1209dbb7d1"},
]
[package.dependencies]
@@ -682,21 +682,21 @@ requests = ">=2.21.0,<3.0.0"
[[package]]
name = "alibabacloud-tea-openapi"
version = "0.4.1"
version = "0.4.4"
description = "Alibaba Cloud openapi SDK Library for Python"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "alibabacloud_tea_openapi-0.4.1-py3-none-any.whl", hash = "sha256:e46bfa3ca34086d2c357d217a0b7284ecbd4b3bab5c88e075e73aec637b0e4a0"},
{file = "alibabacloud_tea_openapi-0.4.1.tar.gz", hash = "sha256:2384b090870fdb089c3c40f3fb8cf0145b8c7d6c14abbac521f86a01abb5edaf"},
{file = "alibabacloud_tea_openapi-0.4.4-py3-none-any.whl", hash = "sha256:cea6bc1fe35b0319a8752cb99eb0ecb0dab7ca1a71b99c12970ba0867410995f"},
{file = "alibabacloud_tea_openapi-0.4.4.tar.gz", hash = "sha256:1b0917bc03cd49417da64945e92731716d53e2eb8707b235f54e45b7473221ce"},
]
[package.dependencies]
alibabacloud-credentials = ">=1.0.2,<2.0.0"
alibabacloud-gateway-spi = ">=0.0.2,<1.0.0"
alibabacloud-tea-util = ">=0.3.13,<1.0.0"
cryptography = ">=3.0.0,<45.0.0"
cryptography = {version = ">=3.0.0,<47.0.0", markers = "python_version >= \"3.8\""}
darabonba-core = ">=1.0.3,<2.0.0"
[[package]]
@@ -943,14 +943,14 @@ files = [
[[package]]
name = "authlib"
version = "1.6.6"
version = "1.6.9"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "authlib-1.6.6-py2.py3-none-any.whl", hash = "sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd"},
{file = "authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e"},
{file = "authlib-1.6.9-py2.py3-none-any.whl", hash = "sha256:f08b4c14e08f0861dc18a32357b33fbcfd2ea86cfe3fe149484b4d764c4a0ac3"},
{file = "authlib-1.6.9.tar.gz", hash = "sha256:d8f2421e7e5980cc1ddb4e32d3f5fa659cfaf60d8eaf3281ebed192e4ab74f04"},
]
[package.dependencies]
@@ -1526,19 +1526,19 @@ typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-resource"
version = "23.3.0"
version = "24.0.0"
description = "Microsoft Azure Resource Management Client Library for Python"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "azure_mgmt_resource-23.3.0-py3-none-any.whl", hash = "sha256:ab216ee28e29db6654b989746e0c85a1181f66653929d2cb6e48fba66d9af323"},
{file = "azure_mgmt_resource-23.3.0.tar.gz", hash = "sha256:fc4f1fd8b6aad23f8af4ed1f913df5f5c92df117449dc354fea6802a2829fea4"},
{file = "azure_mgmt_resource-24.0.0-py3-none-any.whl", hash = "sha256:27b32cd223e2784269f5a0db3c282042886ee4072d79cedc638438ece7cd0df4"},
{file = "azure_mgmt_resource-24.0.0.tar.gz", hash = "sha256:cf6b8995fcdd407ac9ff1dd474087129429a1d90dbb1ac77f97c19b96237b265"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-mgmt-core = ">=1.3.2"
azure-mgmt-core = ">=1.5.0"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
@@ -1822,19 +1822,19 @@ crt = ["awscrt (==0.27.6)"]
[[package]]
name = "cartography"
version = "0.132.0"
version = "0.135.0"
description = "Explore assets and their relationships across your technical infrastructure."
optional = false
python-versions = ">=3.10"
groups = ["main"]
files = [
{file = "cartography-0.132.0-py3-none-any.whl", hash = "sha256:c070aa51d0ab4479cb043cae70b35e7df49f2fb5f1fa95ccf10000bbeb952262"},
{file = "cartography-0.132.0.tar.gz", hash = "sha256:7c6332bc57fd2629d7b83aee7bd95a7b2edb0d51ef746efa0461399e0b66625c"},
{file = "cartography-0.135.0-py3-none-any.whl", hash = "sha256:c62c32a6917b8f23a8b98fe2b6c7c4a918b50f55918482966c4dae1cf5f538e1"},
{file = "cartography-0.135.0.tar.gz", hash = "sha256:3f500cd22c3b392d00e8b49f62acc95fd4dcd559ce514aafe2eb8101133c7a49"},
]
[package.dependencies]
adal = ">=1.2.4"
aioboto3 = ">=13.0.0"
aioboto3 = ">=15.0.0"
azure-cli-core = ">=2.26.0"
azure-identity = ">=1.5.0"
azure-keyvault-certificates = ">=4.0.0"
@@ -1852,9 +1852,9 @@ azure-mgmt-keyvault = ">=10.0.0"
azure-mgmt-logic = ">=10.0.0"
azure-mgmt-monitor = ">=3.0.0"
azure-mgmt-network = ">=25.0.0"
azure-mgmt-resource = ">=10.2.0,<25.0.0"
azure-mgmt-resource = ">=24.0.0,<25"
azure-mgmt-security = ">=5.0.0"
azure-mgmt-sql = ">=3.0.1,<4"
azure-mgmt-sql = ">=3.0.1"
azure-mgmt-storage = ">=16.0.0"
azure-mgmt-synapse = ">=2.0.0"
azure-mgmt-web = ">=7.0.0"
@@ -1862,38 +1862,39 @@ azure-synapse-artifacts = ">=0.17.0"
backoff = ">=2.1.2"
boto3 = ">=1.15.1"
botocore = ">=1.18.1"
cloudflare = ">=4.1.0,<5.0.0"
cloudflare = ">=4.1.0"
crowdstrike-falconpy = ">=0.5.1"
cryptography = "*"
dnspython = ">=1.15.0"
duo-client = "*"
google-api-python-client = ">=1.7.8"
cryptography = ">=45.0.0"
dnspython = ">=2.0.0"
duo-client = ">=5.5.0"
google-api-python-client = ">=2.0.0"
google-auth = ">=2.37.0"
google-cloud-asset = ">=1.0.0"
google-cloud-resource-manager = ">=1.14.2"
httpx = ">=0.24.0"
kubernetes = ">=22.6.0"
marshmallow = ">=3.0.0rc7"
msgraph-sdk = "*"
marshmallow = ">=4.0.0"
msgraph-sdk = ">=1.53.0"
msrestazure = ">=0.6.4"
neo4j = ">=6.0.0"
oci = ">=2.71.0"
okta = "<1.0.0"
packageurl-python = "*"
packaging = "*"
packageurl-python = ">=0.17.0"
packaging = ">=26.0.0"
pagerduty = ">=4.0.1"
policyuniverse = ">=1.1.0.0"
PyJWT = {version = ">=2.0.0", extras = ["crypto"]}
python-dateutil = "*"
python-dateutil = ">=2.9.0"
python-digitalocean = ">=1.16.0"
pyyaml = ">=5.3.1"
requests = ">=2.22.0"
scaleway = ">=2.10.0"
slack-sdk = ">=3.37.0"
statsd = "*"
statsd = ">=4.0.0"
typer = ">=0.9.0"
types-aiobotocore-ecr = "*"
xmltodict = "*"
types-aiobotocore-ecr = ">=3.1.0"
workos = ">=5.44.0"
xmltodict = ">=1.0.0"
[[package]]
name = "celery"
@@ -2503,62 +2504,74 @@ dev = ["bandit", "coverage", "flake8", "pydocstyle", "pylint", "pytest", "pytest
[[package]]
name = "cryptography"
version = "44.0.3"
version = "46.0.7"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
python-versions = "!=3.9.0,!=3.9.1,>=3.7"
python-versions = "!=3.9.0,!=3.9.1,>=3.8"
groups = ["main", "dev"]
files = [
{file = "cryptography-44.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:962bc30480a08d133e631e8dfd4783ab71cc9e33d5d7c1e192f0b7c06397bb88"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffc61e8f3bf5b60346d89cd3d37231019c17a081208dfbbd6e1605ba03fa137"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58968d331425a6f9eedcee087f77fd3c927c88f55368f43ff7e0a19891f2642c"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:e28d62e59a4dbd1d22e747f57d4f00c459af22181f0b2f787ea83f5a876d7c76"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:af653022a0c25ef2e3ffb2c673a50e5a0d02fecc41608f4954176f1933b12359"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:157f1f3b8d941c2bd8f3ffee0af9b049c9665c39d3da9db2dc338feca5e98a43"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:c6cd67722619e4d55fdb42ead64ed8843d64638e9c07f4011163e46bc512cf01"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b424563394c369a804ecbee9b06dfb34997f19d00b3518e39f83a5642618397d"},
{file = "cryptography-44.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c91fc8e8fd78af553f98bc7f2a1d8db977334e4eea302a4bfd75b9461c2d8904"},
{file = "cryptography-44.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:25cd194c39fa5a0aa4169125ee27d1172097857b27109a45fadc59653ec06f44"},
{file = "cryptography-44.0.3-cp37-abi3-win32.whl", hash = "sha256:3be3f649d91cb182c3a6bd336de8b61a0a71965bd13d1a04a0e15b39c3d5809d"},
{file = "cryptography-44.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:3883076d5c4cc56dbef0b898a74eb6992fdac29a7b9013870b34efe4ddb39a0d"},
{file = "cryptography-44.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:5639c2b16764c6f76eedf722dbad9a0914960d3489c0cc38694ddf9464f1bb2f"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3ffef566ac88f75967d7abd852ed5f182da252d23fac11b4766da3957766759"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:192ed30fac1728f7587c6f4613c29c584abdc565d7417c13904708db10206645"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:7d5fe7195c27c32a64955740b949070f21cba664604291c298518d2e255931d2"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3f07943aa4d7dad689e3bb1638ddc4944cc5e0921e3c227486daae0e31a05e54"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:cb90f60e03d563ca2445099edf605c16ed1d5b15182d21831f58460c48bffb93"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:ab0b005721cc0039e885ac3503825661bd9810b15d4f374e473f8c89b7d5460c"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:3bb0847e6363c037df8f6ede57d88eaf3410ca2267fb12275370a76f85786a6f"},
{file = "cryptography-44.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b0cc66c74c797e1db750aaa842ad5b8b78e14805a9b5d1348dc603612d3e3ff5"},
{file = "cryptography-44.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6866df152b581f9429020320e5eb9794c8780e90f7ccb021940d7f50ee00ae0b"},
{file = "cryptography-44.0.3-cp39-abi3-win32.whl", hash = "sha256:c138abae3a12a94c75c10499f1cbae81294a6f983b3af066390adee73f433028"},
{file = "cryptography-44.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:5d186f32e52e66994dce4f766884bcb9c68b8da62d61d9d215bfe5fb56d21334"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:cad399780053fb383dc067475135e41c9fe7d901a97dd5d9c5dfb5611afc0d7d"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:21a83f6f35b9cc656d71b5de8d519f566df01e660ac2578805ab245ffd8523f8"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fc3c9babc1e1faefd62704bb46a69f359a9819eb0292e40df3fb6e3574715cd4"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:e909df4053064a97f1e6565153ff8bb389af12c5c8d29c343308760890560aff"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:dad80b45c22e05b259e33ddd458e9e2ba099c86ccf4e88db7bbab4b747b18d06"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:479d92908277bed6e1a1c69b277734a7771c2b78633c224445b5c60a9f4bc1d9"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:896530bc9107b226f265effa7ef3f21270f18a2026bc09fed1ebd7b66ddf6375"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9b4d4a5dbee05a2c390bf212e78b99434efec37b17a4bff42f50285c5c8c9647"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02f55fb4f8b79c1221b0961488eaae21015b69b210e18c386b69de182ebb1259"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:dd3db61b8fe5be220eee484a17233287d0be6932d056cf5738225b9c05ef4fff"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:978631ec51a6bbc0b7e58f23b68a8ce9e5f09721940933e9c217068388789fe5"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:5d20cc348cca3a8aa7312f42ab953a56e15323800ca3ab0706b8cd452a3a056c"},
{file = "cryptography-44.0.3.tar.gz", hash = "sha256:fe19d8bc5536a91a24a8133328880a41831b6c5df54599a8417b62fe015d3053"},
{file = "cryptography-46.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:ea42cbe97209df307fdc3b155f1b6fa2577c0defa8f1f7d3be7d31d189108ad4"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b36a4695e29fe69215d75960b22577197aca3f7a25b9cf9d165dcfe9d80bc325"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5ad9ef796328c5e3c4ceed237a183f5d41d21150f972455a9d926593a1dcb308"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:73510b83623e080a2c35c62c15298096e2a5dc8d51c3b4e1740211839d0dea77"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:cbd5fb06b62bd0721e1170273d3f4d5a277044c47ca27ee257025146c34cbdd1"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:420b1e4109cc95f0e5700eed79908cef9268265c773d3a66f7af1eef53d409ef"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:24402210aa54baae71d99441d15bb5a1919c195398a87b563df84468160a65de"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8a469028a86f12eb7d2fe97162d0634026d92a21f3ae0ac87ed1c4a447886c83"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9694078c5d44c157ef3162e3bf3946510b857df5a3955458381d1c7cfc143ddb"},
{file = "cryptography-46.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:42a1e5f98abb6391717978baf9f90dc28a743b7d9be7f0751a6f56a75d14065b"},
{file = "cryptography-46.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:91bbcb08347344f810cbe49065914fe048949648f6bd5c2519f34619142bbe85"},
{file = "cryptography-46.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:5d1c02a14ceb9148cc7816249f64f623fbfee39e8c03b3650d842ad3f34d637e"},
{file = "cryptography-46.0.7-cp311-abi3-win32.whl", hash = "sha256:d23c8ca48e44ee015cd0a54aeccdf9f09004eba9fc96f38c911011d9ff1bd457"},
{file = "cryptography-46.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:397655da831414d165029da9bc483bed2fe0e75dde6a1523ec2fe63f3c46046b"},
{file = "cryptography-46.0.7-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:d151173275e1728cf7839aaa80c34fe550c04ddb27b34f48c232193df8db5842"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:db0f493b9181c7820c8134437eb8b0b4792085d37dbb24da050476ccb664e59c"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ebd6daf519b9f189f85c479427bbd6e9c9037862cf8fe89ee35503bd209ed902"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:b7b412817be92117ec5ed95f880defe9cf18a832e8cafacf0a22337dc1981b4d"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:fbfd0e5f273877695cb93baf14b185f4878128b250cc9f8e617ea0c025dfb022"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:ffca7aa1d00cf7d6469b988c581598f2259e46215e0140af408966a24cf086ce"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:60627cf07e0d9274338521205899337c5d18249db56865f943cbe753aa96f40f"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:80406c3065e2c55d7f49a9550fe0c49b3f12e5bfff5dedb727e319e1afb9bf99"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:c5b1ccd1239f48b7151a65bc6dd54bcfcc15e028c8ac126d3fada09db0e07ef1"},
{file = "cryptography-46.0.7-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:d5f7520159cd9c2154eb61eb67548ca05c5774d39e9c2c4339fd793fe7d097b2"},
{file = "cryptography-46.0.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:fcd8eac50d9138c1d7fc53a653ba60a2bee81a505f9f8850b6b2888555a45d0e"},
{file = "cryptography-46.0.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:65814c60f8cc400c63131584e3e1fad01235edba2614b61fbfbfa954082db0ee"},
{file = "cryptography-46.0.7-cp314-cp314t-win32.whl", hash = "sha256:fdd1736fed309b4300346f88f74cd120c27c56852c3838cab416e7a166f67298"},
{file = "cryptography-46.0.7-cp314-cp314t-win_amd64.whl", hash = "sha256:e06acf3c99be55aa3b516397fe42f5855597f430add9c17fa46bf2e0fb34c9bb"},
{file = "cryptography-46.0.7-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:462ad5cb1c148a22b2e3bcc5ad52504dff325d17daf5df8d88c17dda1f75f2a4"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:84d4cced91f0f159a7ddacad249cc077e63195c36aac40b4150e7a57e84fffe7"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:128c5edfe5e5938b86b03941e94fac9ee793a94452ad1365c9fc3f4f62216832"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:5e51be372b26ef4ba3de3c167cd3d1022934bc838ae9eaad7e644986d2a3d163"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:cdf1a610ef82abb396451862739e3fc93b071c844399e15b90726ef7470eeaf2"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1d25aee46d0c6f1a501adcddb2d2fee4b979381346a78558ed13e50aa8a59067"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:cdfbe22376065ffcf8be74dc9a909f032df19bc58a699456a21712d6e5eabfd0"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:abad9dac36cbf55de6eb49badd4016806b3165d396f64925bf2999bcb67837ba"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:935ce7e3cfdb53e3536119a542b839bb94ec1ad081013e9ab9b7cfd478b05006"},
{file = "cryptography-46.0.7-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:35719dc79d4730d30f1c2b6474bd6acda36ae2dfae1e3c16f2051f215df33ce0"},
{file = "cryptography-46.0.7-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:7bbc6ccf49d05ac8f7d7b5e2e2c33830d4fe2061def88210a126d130d7f71a85"},
{file = "cryptography-46.0.7-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a1529d614f44b863a7b480c6d000fe93b59acee9c82ffa027cfadc77521a9f5e"},
{file = "cryptography-46.0.7-cp38-abi3-win32.whl", hash = "sha256:f247c8c1a1fb45e12586afbb436ef21ff1e80670b2861a90353d9b025583d246"},
{file = "cryptography-46.0.7-cp38-abi3-win_amd64.whl", hash = "sha256:506c4ff91eff4f82bdac7633318a526b1d1309fc07ca76a3ad182cb5b686d6d3"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:fc9ab8856ae6cf7c9358430e49b368f3108f050031442eaeb6b9d87e4dcf4e4f"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d3b99c535a9de0adced13d159c5a9cf65c325601aa30f4be08afd680643e9c15"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d02c738dacda7dc2a74d1b2b3177042009d5cab7c7079db74afc19e56ca1b455"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:04959522f938493042d595a736e7dbdff6eb6cc2339c11465b3ff89343b65f65"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:3986ac1dee6def53797289999eabe84798ad7817f3e97779b5061a95b0ee4968"},
{file = "cryptography-46.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:258514877e15963bd43b558917bc9f54cf7cf866c38aa576ebf47a77ddbc43a4"},
{file = "cryptography-46.0.7.tar.gz", hash = "sha256:e4cfd68c5f3e0bfdad0d38e023239b96a2fe84146481852dffbcca442c245aa5"},
]
[package.dependencies]
cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""}
cffi = {version = ">=2.0.0", markers = "python_full_version >= \"3.9.0\" and platform_python_implementation != \"PyPy\""}
[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=3.0.0) ; python_version >= \"3.8\""]
docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs", "sphinx-rtd-theme (>=3.0.0)"]
docstest = ["pyenchant (>=3)", "readme-renderer (>=30.0)", "sphinxcontrib-spelling (>=7.3.1)"]
nox = ["nox (>=2024.4.15)", "nox[uv] (>=2024.3.2) ; python_version >= \"3.8\""]
pep8test = ["check-sdist ; python_version >= \"3.8\"", "click (>=8.0.1)", "mypy (>=1.4)", "ruff (>=0.3.6)"]
nox = ["nox[uv] (>=2024.4.15)"]
pep8test = ["check-sdist", "click (>=8.0.1)", "mypy (>=1.14)", "ruff (>=0.11.11)"]
sdist = ["build (>=1.0.0)"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi (>=2024)", "cryptography-vectors (==44.0.3)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test = ["certifi (>=2024)", "cryptography-vectors (==46.0.7)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test-randomorder = ["pytest-randomly"]
[[package]]
@@ -3740,19 +3753,19 @@ urllib3 = ["packaging", "urllib3"]
[[package]]
name = "google-auth-httplib2"
version = "0.2.1"
version = "0.2.0"
description = "Google Authentication Library: httplib2 transport"
optional = false
python-versions = ">=3.7"
python-versions = "*"
groups = ["main"]
files = [
{file = "google_auth_httplib2-0.2.1-py3-none-any.whl", hash = "sha256:1be94c611db91c01f9703e7f62b0a59bbd5587a95571c7b6fade510d648bc08b"},
{file = "google_auth_httplib2-0.2.1.tar.gz", hash = "sha256:5ef03be3927423c87fb69607b42df23a444e434ddb2555b73b3679793187b7de"},
{file = "google-auth-httplib2-0.2.0.tar.gz", hash = "sha256:38aa7badf48f974f1eb9861794e9c0cb2a0511a4ec0679b1f886d108f5640e05"},
{file = "google_auth_httplib2-0.2.0-py2.py3-none-any.whl", hash = "sha256:b65a0a2123300dd71281a7bf6e64d65a0759287df52729bdd1ae2e47dc311a3d"},
]
[package.dependencies]
google-auth = ">=1.32.0,<3.0.0"
httplib2 = ">=0.19.0,<1.0.0"
google-auth = "*"
httplib2 = ">=0.19.0"
[[package]]
name = "google-cloud-access-context-manager"
@@ -5181,24 +5194,16 @@ files = [
[[package]]
name = "marshmallow"
version = "3.26.2"
version = "4.3.0"
description = "A lightweight library for converting complex datatypes to and from native Python datatypes."
optional = false
python-versions = ">=3.9"
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "marshmallow-3.26.2-py3-none-any.whl", hash = "sha256:013fa8a3c4c276c24d26d84ce934dc964e2aa794345a0f8c7e5a7191482c8a73"},
{file = "marshmallow-3.26.2.tar.gz", hash = "sha256:bbe2adb5a03e6e3571b573f42527c6fe926e17467833660bebd11593ab8dfd57"},
{file = "marshmallow-4.3.0-py3-none-any.whl", hash = "sha256:46c4fe6984707e3cbd485dfebbf0a59874f58d695aad05c1668d15e8c6e13b46"},
{file = "marshmallow-4.3.0.tar.gz", hash = "sha256:fb43c53b3fe240b8f6af37223d6ef1636f927ad9bea8ab323afad95dff090880"},
]
[package.dependencies]
packaging = ">=17.0"
[package.extras]
dev = ["marshmallow[tests]", "pre-commit (>=3.5,<5.0)", "tox"]
docs = ["autodocsumm (==0.2.14)", "furo (==2024.8.6)", "sphinx (==8.1.3)", "sphinx-copybutton (==0.5.2)", "sphinx-issues (==5.0.0)", "sphinxext-opengraph (==0.9.1)"]
tests = ["pytest", "simplejson"]
[[package]]
name = "matplotlib"
version = "3.10.8"
@@ -5492,14 +5497,14 @@ dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
[[package]]
name = "msgraph-sdk"
version = "1.23.0"
version = "1.55.0"
description = "The Microsoft Graph Python SDK"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "msgraph_sdk-1.23.0-py3-none-any.whl", hash = "sha256:58e0047b4ca59fd82022c02cd73fec0170a3d84f3b76721e3db2a0314df9a58a"},
{file = "msgraph_sdk-1.23.0.tar.gz", hash = "sha256:6dd1ba9a46f5f0ce8599fd9610133adbd9d1493941438b5d3632fce9e55ed607"},
{file = "msgraph_sdk-1.55.0-py3-none-any.whl", hash = "sha256:c8e68ebc4b88af5111de312e7fa910a4e76ddf48a4534feadb1fb8a411c48cfc"},
{file = "msgraph_sdk-1.55.0.tar.gz", hash = "sha256:6df691a31954a050d26b8a678968017e157d940fb377f2a8a4e17a9741b98756"},
]
[package.dependencies]
@@ -5925,23 +5930,24 @@ signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
[[package]]
name = "oci"
version = "2.160.3"
version = "2.169.0"
description = "Oracle Cloud Infrastructure Python SDK"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "oci-2.160.3-py3-none-any.whl", hash = "sha256:858bff3e697098bdda44833d2476bfb4632126f0182178e7dbde4dbd156d71f0"},
{file = "oci-2.160.3.tar.gz", hash = "sha256:57514889be3b713a8385d86e3ba8a33cf46e3563c2a7e29a93027fb30b8a2537"},
{file = "oci-2.169.0-py3-none-any.whl", hash = "sha256:c71bb5143f307791082b3e33cc1545c2490a518cfed85ab1948ef5107c36d30b"},
{file = "oci-2.169.0.tar.gz", hash = "sha256:f3c5fff00b01783b5325ea7b13bf140053ec1e9f41da20bfb9c8a349ee7662fa"},
]
[package.dependencies]
certifi = "*"
circuitbreaker = {version = ">=1.3.1,<3.0.0", markers = "python_version >= \"3.7\""}
cryptography = ">=3.2.1,<46.0.0"
pyOpenSSL = ">=17.5.0,<25.0.0"
cryptography = ">=3.2.1,<47.0.0"
pyOpenSSL = ">=17.5.0,<27.0.0"
python-dateutil = ">=2.5.3,<3.0.0"
pytz = ">=2016.10"
urllib3 = {version = ">=2.6.3", markers = "python_version >= \"3.10.0\""}
[package.extras]
adk = ["docstring-parser (>=0.16) ; python_version >= \"3.10\" and python_version < \"4\"", "mcp (>=1.6.0) ; python_version >= \"3.10\" and python_version < \"4\"", "pydantic (>=2.10.6) ; python_version >= \"3.10\" and python_version < \"4\"", "rich (>=13.9.4) ; python_version >= \"3.10\" and python_version < \"4\""]
@@ -6445,6 +6451,33 @@ docs = ["sphinx (>=1.7.1)"]
redis = ["redis"]
tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "pytest-timeout (>=2.1.0)", "redis", "sphinx (>=6.0.0)", "types-redis"]
[[package]]
name = "prek"
version = "0.3.9"
description = "A Git hook manager written in Rust, designed as a drop-in alternative to pre-commit."
optional = false
python-versions = ">=3.8"
groups = ["dev"]
files = [
{file = "prek-0.3.9-py3-none-linux_armv6l.whl", hash = "sha256:3ed793d51bfaa27bddb64d525d7acb77a7c8644f549412d82252e3eb0b88aad8"},
{file = "prek-0.3.9-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:399c58400c0bd0b82a93a3c09dc1bfd88d8d0cfb242d414d2ed247187b06ead1"},
{file = "prek-0.3.9-py3-none-macosx_11_0_arm64.whl", hash = "sha256:e2ea1ffb124e92f081b8e2ca5b5a623a733efb3be0c5b1f4b7ffe2ee17d1f20c"},
{file = "prek-0.3.9-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:aaf639f95b7301639298311d8d44aad0d0b4864e9736083ad3c71ce9765d37ab"},
{file = "prek-0.3.9-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff104863b187fa443ea8451ca55d51e2c6e94f99f00d88784b5c3c4c623f1ebe"},
{file = "prek-0.3.9-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:039ecaf87c63a3e67cca645ebd5bc5eb6aafa6c9d929e9a27b2921e7849d7ef9"},
{file = "prek-0.3.9-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3bde2a3d045705095983c7f78ba04f72a7565fe1c2b4e85f5628502a254754ff"},
{file = "prek-0.3.9-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28a0960a21543563e2c8e19aaad176cc8423a87aac3c914d0f313030d7a9244a"},
{file = "prek-0.3.9-py3-none-manylinux_2_28_aarch64.whl", hash = "sha256:0dfb5d5171d7523271909246ee306b4dc3d5b63752e7dd7c7e8a8908fc9490d1"},
{file = "prek-0.3.9-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:82b791bd36c1430c84d3ae7220a85152babc7eaf00f70adcb961bd594e756ba3"},
{file = "prek-0.3.9-py3-none-musllinux_1_1_armv7l.whl", hash = "sha256:6eac6d2f736b041118f053a1487abed468a70dd85a8688eaf87bb42d3dcecf20"},
{file = "prek-0.3.9-py3-none-musllinux_1_1_i686.whl", hash = "sha256:5517e46e761367a3759b3168eabc120840ffbca9dfbc53187167298a98f87dc4"},
{file = "prek-0.3.9-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:92024778cf78683ca32687bb249ab6a7d5c33887b5ee1d1a9f6d0c14228f4cf3"},
{file = "prek-0.3.9-py3-none-win32.whl", hash = "sha256:7f89c55e5f480f5d073769e319924ad69d4bf9f98c5cb46a83082e26e634c958"},
{file = "prek-0.3.9-py3-none-win_amd64.whl", hash = "sha256:7722f3372eaa83b147e70a43cb7b9fe2128c13d0c78d8a1cdbf2a8ec2ee071eb"},
{file = "prek-0.3.9-py3-none-win_arm64.whl", hash = "sha256:0bced6278d6cc8a4b46048979e36bc9da034611dc8facd77ab123177b833a929"},
{file = "prek-0.3.9.tar.gz", hash = "sha256:f82b92d81f42f1f90a47f5fbbf492373e25ef1f790080215b2722dd6da66510e"},
]
[[package]]
name = "prompt-toolkit"
version = "3.0.52"
@@ -6632,7 +6665,7 @@ files = [
[[package]]
name = "prowler"
version = "5.23.0"
version = "5.26.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">=3.10,<3.13"
@@ -6652,7 +6685,7 @@ alibabacloud-rds20140815 = "12.0.0"
alibabacloud_sas20181203 = "6.1.0"
alibabacloud-sls20201230 = "5.9.0"
alibabacloud_sts20150401 = "1.1.6"
alibabacloud_tea_openapi = "0.4.1"
alibabacloud_tea_openapi = "0.4.4"
alibabacloud_vpc20160428 = "6.13.0"
alive-progress = "3.3.0"
awsipranges = "0.3.3"
@@ -6674,7 +6707,7 @@ azure-mgmt-postgresqlflexibleservers = "1.1.0"
azure-mgmt-rdbms = "10.1.0"
azure-mgmt-recoveryservices = "3.1.0"
azure-mgmt-recoveryservicesbackup = "9.2.0"
azure-mgmt-resource = "23.3.0"
azure-mgmt-resource = "24.0.0"
azure-mgmt-search = "9.1.0"
azure-mgmt-security = "7.0.0"
azure-mgmt-sql = "3.0.1"
@@ -6687,29 +6720,29 @@ boto3 = "1.40.61"
botocore = "1.40.61"
cloudflare = "4.3.1"
colorama = "0.4.6"
cryptography = "44.0.3"
cryptography = "46.0.7"
dash = "3.1.1"
dash-bootstrap-components = "2.0.3"
defusedxml = ">=0.7.1"
defusedxml = "0.7.1"
detect-secrets = "1.5.0"
dulwich = "0.23.0"
google-api-python-client = "2.163.0"
google-auth-httplib2 = ">=0.1,<0.3"
google-auth-httplib2 = "0.2.0"
h2 = "4.3.0"
jsonschema = "4.23.0"
kubernetes = "32.0.1"
markdown = "3.10.2"
microsoft-kiota-abstractions = "1.9.2"
msgraph-sdk = "1.23.0"
msgraph-sdk = "1.55.0"
numpy = "2.0.2"
oci = "2.160.3"
oci = "2.169.0"
openstacksdk = "4.2.0"
pandas = "2.2.3"
py-iam-expand = "0.1.0"
py-ocsf-models = "0.8.1"
pydantic = ">=2.0,<3.0"
pydantic = "2.12.5"
pygithub = "2.8.0"
python-dateutil = ">=2.9.0.post0,<3.0.0"
python-dateutil = "2.9.0.post0"
pytz = "2025.1"
schema = "0.7.5"
shodan = "1.31.0"
@@ -6721,8 +6754,8 @@ uuid6 = "2024.7.10"
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "master"
resolved_reference = "6ac90eb1b58590b6f2f51645dbef17b9231053f4"
reference = "eb1b4190ab2d9c265b46c9ede0298b81bdcf35a8"
resolved_reference = "eb1b4190ab2d9c265b46c9ede0298b81bdcf35a8"
[[package]]
name = "psutil"
@@ -6887,14 +6920,14 @@ pydantic = ">=2.12.0,<3.0.0"
[[package]]
name = "pyasn1"
version = "0.6.2"
version = "0.6.3"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyasn1-0.6.2-py3-none-any.whl", hash = "sha256:1eb26d860996a18e9b6ed05e7aae0e9fc21619fcee6af91cca9bad4fbea224bf"},
{file = "pyasn1-0.6.2.tar.gz", hash = "sha256:9b59a2b25ba7e4f8197db7686c09fb33e658b98339fadb826e9512629017833b"},
{file = "pyasn1-0.6.3-py3-none-any.whl", hash = "sha256:a80184d120f0864a52a073acc6fc642847d0be408e7c7252f31390c0f4eadcde"},
{file = "pyasn1-0.6.3.tar.gz", hash = "sha256:697a8ecd6d98891189184ca1fa05d1bb00e2f84b5977c481452050549c8a72cf"},
]
[[package]]
@@ -7114,14 +7147,14 @@ urllib3 = ">=1.26.0"
[[package]]
name = "pygments"
version = "2.19.2"
version = "2.20.0"
description = "Pygments is a syntax highlighting package written in Python."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"},
{file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"},
{file = "pygments-2.20.0-py3-none-any.whl", hash = "sha256:81a9e26dd42fd28a23a2d169d86d7ac03b46e2f8b59ed4698fb4785f946d0176"},
{file = "pygments-2.20.0.tar.gz", hash = "sha256:6757cd03768053ff99f3039c1a36d6c0aa0b263438fcab17520b30a303a82b5f"},
]
[package.extras]
@@ -7129,14 +7162,14 @@ windows-terminal = ["colorama (>=0.4.6)"]
[[package]]
name = "pyjwt"
version = "2.11.0"
version = "2.12.1"
description = "JSON Web Token implementation in Python"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pyjwt-2.11.0-py3-none-any.whl", hash = "sha256:94a6bde30eb5c8e04fee991062b534071fd1439ef58d2adc9ccb823e7bcd0469"},
{file = "pyjwt-2.11.0.tar.gz", hash = "sha256:35f95c1f0fbe5d5ba6e43f00271c275f7a1a4db1dab27bf708073b75318ea623"},
{file = "pyjwt-2.12.1-py3-none-any.whl", hash = "sha256:28ca37c070cad8ba8cd9790cd940535d40274d22f80ab87f3ac6a713e6e8454c"},
{file = "pyjwt-2.12.1.tar.gz", hash = "sha256:c74a7a2adf861c04d002db713dd85f84beb242228e671280bf709d765b03672b"},
]
[package.dependencies]
@@ -7183,7 +7216,7 @@ description = "The MSALRuntime Python Interop Package"
optional = false
python-versions = ">=3.6"
groups = ["main"]
markers = "(platform_system == \"Windows\" or platform_system == \"Darwin\" or platform_system == \"Linux\") and sys_platform == \"win32\""
markers = "sys_platform == \"win32\" and (platform_system == \"Windows\" or platform_system == \"Darwin\" or platform_system == \"Linux\")"
files = [
{file = "pymsalruntime-0.18.1-cp310-cp310-macosx_14_0_arm64.whl", hash = "sha256:0c22e2e83faa10de422bbfaacc1bb2887c9025ee8a53f0fc2e4f7db01c4a7b66"},
{file = "pymsalruntime-0.18.1-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:8ce2944a0f944833d047bb121396091e00287e2b6373716106da86ea99abf379"},
@@ -7261,18 +7294,19 @@ tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "
[[package]]
name = "pyopenssl"
version = "24.3.0"
version = "26.0.0"
description = "Python wrapper module around the OpenSSL library"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyOpenSSL-24.3.0-py3-none-any.whl", hash = "sha256:e474f5a473cd7f92221cc04976e48f4d11502804657a08a989fb3be5514c904a"},
{file = "pyopenssl-24.3.0.tar.gz", hash = "sha256:49f7a019577d834746bc55c5fce6ecbcec0f2b4ec5ce1cf43a9a173b8138bb36"},
{file = "pyopenssl-26.0.0-py3-none-any.whl", hash = "sha256:df94d28498848b98cc1c0ffb8ef1e71e40210d3b0a8064c9d29571ed2904bf81"},
{file = "pyopenssl-26.0.0.tar.gz", hash = "sha256:f293934e52936f2e3413b89c6ce36df66a0b34ae1ea3a053b8c5020ff2f513fc"},
]
[package.dependencies]
cryptography = ">=41.0.5,<45"
cryptography = ">=46.0.0,<47"
typing-extensions = {version = ">=4.9", markers = "python_version < \"3.13\" and python_version >= \"3.8\""}
[package.extras]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx_rtd_theme"]
@@ -7324,24 +7358,25 @@ files = [
[[package]]
name = "pytest"
version = "8.2.2"
version = "9.0.3"
description = "pytest: simple powerful testing with Python"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "pytest-8.2.2-py3-none-any.whl", hash = "sha256:c434598117762e2bd304e526244f67bf66bbd7b5d6cf22138be51ff661980343"},
{file = "pytest-8.2.2.tar.gz", hash = "sha256:de4bb8104e201939ccdc688b27a89a7be2079b22e2bd2b07f806b6ba71117977"},
{file = "pytest-9.0.3-py3-none-any.whl", hash = "sha256:2c5efc453d45394fdd706ade797c0a81091eccd1d6e4bccfcd476e2b8e0ab5d9"},
{file = "pytest-9.0.3.tar.gz", hash = "sha256:b86ada508af81d19edeb213c681b1d48246c1a91d304c6c81a427674c17eb91c"},
]
[package.dependencies]
colorama = {version = "*", markers = "sys_platform == \"win32\""}
iniconfig = "*"
packaging = "*"
pluggy = ">=1.5,<2.0"
colorama = {version = ">=0.4", markers = "sys_platform == \"win32\""}
iniconfig = ">=1.0.1"
packaging = ">=22"
pluggy = ">=1.5,<2"
pygments = ">=2.7.2"
[package.extras]
dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"]
dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "requests", "setuptools", "xmlschema"]
[[package]]
name = "pytest-celery"
@@ -7877,26 +7912,26 @@ shaping = ["uharfbuzz"]
[[package]]
name = "requests"
version = "2.32.5"
version = "2.33.1"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.9"
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6"},
{file = "requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf"},
{file = "requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a"},
{file = "requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517"},
]
[package.dependencies]
certifi = ">=2017.4.17"
certifi = ">=2023.5.7"
charset_normalizer = ">=2,<4"
idna = ">=2.5,<4"
PySocks = {version = ">=1.5.6,<1.5.7 || >1.5.7", optional = true, markers = "extra == \"socks\""}
urllib3 = ">=1.21.1,<3"
urllib3 = ">=1.26,<3"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<8)"]
[[package]]
name = "requests-file"
@@ -8779,6 +8814,23 @@ markupsafe = ">=2.1.1"
[package.extras]
watchdog = ["watchdog (>=2.3)"]
[[package]]
name = "workos"
version = "6.0.4"
description = "WorkOS Python Client"
optional = false
python-versions = ">=3.10"
groups = ["main"]
files = [
{file = "workos-6.0.4-py3-none-any.whl", hash = "sha256:548668b3702673536f853ba72a7b5bbbc269e467aaf9ac4f477b6e0177df5e21"},
{file = "workos-6.0.4.tar.gz", hash = "sha256:b0bfe8fd212b8567422c4ea3732eb33608794033eb3a69900c6b04db183c32d6"},
]
[package.dependencies]
cryptography = ">=46.0,<47.0"
httpx = ">=0.28,<1.0"
pyjwt = ">=2.12,<3.0"
[[package]]
name = "wrapt"
version = "1.17.3"
@@ -9372,4 +9424,4 @@ files = [
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
content-hash = "167d4549788b8bc8bb7772b9a81ade1eab73d8f354251a8d6af4901223cc7f67"
content-hash = "df8a20081fe91c40d071e508dbe19590c8b7ffb5dcc61e71cf30ed016bad5a34"
+5 -5
@@ -25,7 +25,7 @@ dependencies = [
"defusedxml==0.7.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@eb1b4190ab2d9c265b46c9ede0298b81bdcf35a8",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (==1.3.0)",
"sentry-sdk[django] (==2.56.0)",
@@ -38,7 +38,7 @@ dependencies = [
"matplotlib (==3.10.8)",
"reportlab (==4.4.10)",
"neo4j (==6.1.0)",
"cartography (==0.132.0)",
"cartography (==0.135.0)",
"gevent (==25.9.1)",
"werkzeug (==3.1.7)",
"sqlparse (==0.5.5)",
@@ -50,7 +50,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.24.0"
version = "1.27.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -62,10 +62,9 @@ django-silk = "5.3.2"
docker = "7.1.0"
filelock = "3.20.3"
freezegun = "1.5.1"
marshmallow = "==3.26.2"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"
pytest = "9.0.3"
pytest-cov = "5.0.0"
pytest-django = "4.8.0"
pytest-env = "1.1.3"
@@ -75,3 +74,4 @@ ruff = "0.5.0"
safety = "3.7.0"
tqdm = "4.67.1"
vulture = "2.14"
prek = "0.3.9"
+2 -2
@@ -52,7 +52,7 @@ class ApiConfig(AppConfig):
"check_and_fix_socialaccount_sites_migration",
]
# Skip Neo4j initialization during tests, some Django commands, and Celery
# Skip eager Neo4j init for tests, some Django commands, and Celery (prefork pool: driver must stay lazy, no post_fork hook)
if getattr(settings, "TESTING", False) or (
len(sys.argv) > 1
and (
@@ -64,7 +64,7 @@ class ApiConfig(AppConfig):
)
):
logger.info(
"Skipping Neo4j initialization because tests, some Django commands or Celery"
"Skipping eager Neo4j init: tests, some Django commands, or Celery prefork pool (driver stays lazy)"
)
else:
+2 -1
@@ -28,6 +28,7 @@ READ_QUERY_TIMEOUT_SECONDS = env.int(
"ATTACK_PATHS_READ_QUERY_TIMEOUT_SECONDS", default=30
)
MAX_CUSTOM_QUERY_NODES = env.int("ATTACK_PATHS_MAX_CUSTOM_QUERY_NODES", default=250)
CONN_ACQUISITION_TIMEOUT = env.int("NEO4J_CONN_ACQUISITION_TIMEOUT", default=15)
READ_EXCEPTION_CODES = [
"Neo.ClientError.Statement.AccessMode",
"Neo.ClientError.Procedure.ProcedureNotFound",
@@ -62,7 +63,7 @@ def init_driver() -> neo4j.Driver:
auth=(config["USER"], config["PASSWORD"]),
keep_alive=True,
max_connection_lifetime=7200,
connection_acquisition_timeout=120,
connection_acquisition_timeout=CONN_ACQUISITION_TIMEOUT,
max_connection_pool_size=50,
)
_driver.verify_connectivity()
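Note on the hunk above: the acquisition timeout drops from a hard-coded 120 seconds to an env-driven default of 15. A minimal sketch of the effect under a saturated pool; the endpoint and credentials are placeholders, not values from this diff:

import neo4j

# With all 50 pooled connections checked out, a further session checkout
# now fails with a driver-side acquisition error after ~15s instead of
# blocking for the old hard-coded 120s.
driver = neo4j.GraphDatabase.driver(
    "bolt://localhost:7687",            # placeholder endpoint
    auth=("neo4j", "password"),         # placeholder credentials
    connection_acquisition_timeout=15,  # NEO4J_CONN_ACQUISITION_TIMEOUT
    max_connection_pool_size=50,
)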
+7 -8
@@ -1,7 +1,6 @@
from collections.abc import Iterable, Mapping
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
@@ -95,12 +94,12 @@ PROWLER_CHECKS = LazyChecksMapping()
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
"""List compliance frameworks the API can load for `provider_type`.
This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
provider will return the cached result.
The list is sourced from `Compliance.get_bulk` so that the names
returned here are guaranteed to be loadable by the bulk loader. This
prevents downstream key mismatches (e.g. CSV report generation iterating
framework names and looking them up in the bulk dict).
Args:
provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
@@ -112,8 +111,8 @@ def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[s
"""
global AVAILABLE_COMPLIANCE_FRAMEWORKS
if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
get_available_compliance_frameworks(provider_type)
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = list(
Compliance.get_bulk(provider_type).keys()
)
return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]
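A sketch of the invariant this rewrite enforces; the loop is illustrative, not the real CSV generator:

from api.compliance import get_compliance_frameworks
from api.models import Provider
from prowler.lib.check.compliance_models import Compliance

bulk = Compliance.get_bulk(Provider.ProviderChoices.AWS)

# The listing is now derived from the same bulk loader, so every returned
# name is a key of `bulk`; the old SDK-listing path could diverge and make
# this lookup raise KeyError (e.g. 'csa_ccm_4.0').
for framework in get_compliance_frameworks(Provider.ProviderChoices.AWS):
    compliance = bulk[framework]  # guaranteed to resolve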
+4 -2
@@ -330,6 +330,7 @@ class MembershipFilter(FilterSet):
model = Membership
fields = {
"tenant": ["exact"],
"user": ["exact"],
"role": ["exact"],
"date_joined": ["date", "gte", "lte"],
}
@@ -1115,13 +1116,14 @@ class FindingGroupAggregatedComputedFilter(FilterSet):
STATUS_CHOICES = (
("FAIL", "Fail"),
("PASS", "Pass"),
("MUTED", "Muted"),
("MANUAL", "Manual"),
)
status = ChoiceFilter(method="filter_status", choices=STATUS_CHOICES)
status__in = CharInFilter(method="filter_status_in", lookup_expr="in")
severity = ChoiceFilter(method="filter_severity", choices=SeverityChoices)
severity__in = CharInFilter(method="filter_severity_in", lookup_expr="in")
muted = BooleanFilter(field_name="muted")
include_muted = BooleanFilter(method="filter_include_muted")
def filter_status(self, queryset, name, value):
@@ -1198,7 +1200,7 @@ class FindingGroupAggregatedComputedFilter(FilterSet):
if value is True:
return queryset
# include_muted=false: exclude fully-muted groups
return queryset.exclude(fail_count=0, pass_count=0, muted_count__gt=0)
return queryset.exclude(muted=True)
class ProviderSecretFilter(FilterSet):
@@ -0,0 +1,95 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0087_vercel_provider"),
]
operations = [
migrations.AddField(
model_name="findinggroupdailysummary",
name="manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="manual_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="muted",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_fail_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_pass_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_manual_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_fail_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_pass_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_manual_muted_count",
field=models.IntegerField(default=0),
),
]
@@ -0,0 +1,31 @@
from django.db import migrations
from tasks.tasks import backfill_finding_group_summaries_task
from api.db_router import MainRouter
from api.rls import Tenant
def trigger_backfill_task(apps, schema_editor):
"""
Re-dispatch the finding-group backfill task for every tenant so the new
`manual_count` and `muted` columns added in 0088 get populated from the
last 10 days of completed scans.
The aggregator (`aggregate_finding_group_summaries`) recomputes every
column on each call, so it back-populates the new fields without touching
the existing ones beyond a normal upsert.
"""
tenant_ids = Tenant.objects.using(MainRouter.admin_db).values_list("id", flat=True)
for tenant_id in tenant_ids:
backfill_finding_group_summaries_task.delay(tenant_id=str(tenant_id), days=10)
class Migration(migrations.Migration):
dependencies = [
("api", "0088_finding_group_status_muted_fields"),
]
operations = [
migrations.RunPython(trigger_backfill_task, migrations.RunPython.noop),
]
@@ -0,0 +1,23 @@
from django.db import migrations
TASK_NAME = "attack-paths-cleanup-stale-scans"
def set_cleanup_priority(apps, schema_editor):
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
PeriodicTask.objects.filter(name=TASK_NAME).update(priority=0)
def unset_cleanup_priority(apps, schema_editor):
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
PeriodicTask.objects.filter(name=TASK_NAME).update(priority=None)
class Migration(migrations.Migration):
dependencies = [
("api", "0089_backfill_finding_group_status_muted"),
]
operations = [
migrations.RunPython(set_cleanup_priority, unset_cleanup_priority),
]
+80 -2
@@ -595,10 +595,40 @@ class Scan(RowLevelSecurityProtectedModel):
objects = ActiveProviderManager()
all_objects = models.Manager()
_SCOPING_SCANNER_ARG_KEYS_CACHE: tuple[str, ...] | None = None
@classmethod
def get_scoping_scanner_arg_keys(cls) -> tuple[str, ...]:
"""Return the scanner_args keys that mark a scan as scoped.
Derived from ``prowler.lib.scan.scan.Scan.__init__`` so the API stays
in sync with whatever the SDK actually accepts as filters. Cached at
class level — the signature is stable for the process lifetime.
"""
if cls._SCOPING_SCANNER_ARG_KEYS_CACHE is None:
import inspect
from prowler.lib.scan.scan import Scan as ProwlerScan
params = inspect.signature(ProwlerScan.__init__).parameters
cls._SCOPING_SCANNER_ARG_KEYS_CACHE = tuple(
name for name in params if name not in ("self", "provider")
)
return cls._SCOPING_SCANNER_ARG_KEYS_CACHE
class TriggerChoices(models.TextChoices):
SCHEDULED = "scheduled", _("Scheduled")
MANUAL = "manual", _("Manual")
# Trigger values for scans that ran the SDK end-to-end. Imported scans (or
# any future trigger) are intentionally NOT in this set — they may carry
# only a partial slice of resources, so post-scan logic that depends on a
# full-scope sweep (e.g. resetting ephemeral resource findings) must skip
# them by default.
LIVE_SCAN_TRIGGERS = frozenset(
(TriggerChoices.SCHEDULED.value, TriggerChoices.MANUAL.value)
)
id = models.UUIDField(primary_key=True, default=uuid7, editable=False)
name = models.CharField(
blank=True, null=True, max_length=100, validators=[MinLengthValidator(3)]
@@ -681,6 +711,24 @@ class Scan(RowLevelSecurityProtectedModel):
class JSONAPIMeta:
resource_name = "scans"
def is_full_scope(self) -> bool:
"""Return True if this scan ran with no scoping filters at all.
Used to gate post-scan operations (such as resetting the
failed_findings_count of resources missing from the scan) that are only
safe when the scan covered every check, service, and category. Imported
scans are NOT full-scope by definition — they may carry only a partial
slice of resources, so they're rejected via ``trigger`` even before the
scanner_args check.
"""
if self.trigger not in self.LIVE_SCAN_TRIGGERS:
return False
scanner_args = self.scanner_args or {}
for key in self.get_scoping_scanner_arg_keys():
if scanner_args.get(key):
return False
return True
class AttackPathsScan(RowLevelSecurityProtectedModel):
objects = ActiveProviderManager()
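A hypothetical caller, to show what the new helper gates; reset_failed_findings_for_missing_resources is an illustrative name, not an API from this diff:

def finalize_scan(scan: Scan) -> None:
    # Imported scans fail the trigger check, and live scans launched with
    # scoping scanner_args (checks, services, categories, ...) fail the
    # second check. Only an unscoped live scan proves that a resource
    # absent from the results is actually gone.
    if scan.is_full_scope():
        reset_failed_findings_for_missing_resources(scan)  # hypothetical helper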
@@ -1748,15 +1796,45 @@ class FindingGroupDailySummary(RowLevelSecurityProtectedModel):
# Severity stored as integer for MAX aggregation (5=critical, 4=high, etc.)
severity_order = models.SmallIntegerField(default=1)
# Finding counts
# Finding counts (inclusive of muted findings; use the `muted` flag to
# tell whether the group has any actionable findings).
pass_count = models.IntegerField(default=0)
fail_count = models.IntegerField(default=0)
manual_count = models.IntegerField(default=0)
muted_count = models.IntegerField(default=0)
# Delta counts
# Status counts restricted to muted findings, so clients can isolate the
# muted half of each status (e.g. `pass_count - pass_muted_count` gives the
# actionable PASS findings).
pass_muted_count = models.IntegerField(default=0)
fail_muted_count = models.IntegerField(default=0)
manual_muted_count = models.IntegerField(default=0)
# Whether every finding for this (provider, check, day) is muted.
muted = models.BooleanField(default=False)
# Delta counts (non-muted, kept for convenience and as a "total" view).
new_count = models.IntegerField(default=0)
changed_count = models.IntegerField(default=0)
# Delta breakdown by (status, muted) so clients can answer questions like
# "how many new failing findings appeared in this scan?" without scanning
# the underlying findings table. Mirrors the existing pass/fail/manual
# naming, with `_muted_count` siblings tracking the muted half of each
# bucket explicitly.
new_fail_count = models.IntegerField(default=0)
new_fail_muted_count = models.IntegerField(default=0)
new_pass_count = models.IntegerField(default=0)
new_pass_muted_count = models.IntegerField(default=0)
new_manual_count = models.IntegerField(default=0)
new_manual_muted_count = models.IntegerField(default=0)
changed_fail_count = models.IntegerField(default=0)
changed_fail_muted_count = models.IntegerField(default=0)
changed_pass_count = models.IntegerField(default=0)
changed_pass_muted_count = models.IntegerField(default=0)
changed_manual_count = models.IntegerField(default=0)
changed_manual_muted_count = models.IntegerField(default=0)
# Resource counts
resources_fail = models.IntegerField(default=0)
resources_total = models.IntegerField(default=0)
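How the new columns compose, per the comments above; the row lookup is illustrative:

summary = FindingGroupDailySummary.objects.first()  # any summary row

# Status counts include muted findings, so subtracting the *_muted_count
# sibling isolates the actionable slice of each status.
actionable_fail = summary.fail_count - summary.fail_muted_count
actionable_pass = summary.pass_count - summary.pass_muted_count

# The boolean flag marks groups with no actionable findings left at all.
fully_muted = summary.muted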
File diff suppressed because it is too large
@@ -12,6 +12,8 @@ from unittest.mock import MagicMock, patch
import neo4j
import pytest
import api.attack_paths.database as db_module
class TestLazyInitialization:
"""Test that Neo4j driver is initialized lazily on first use."""
@@ -19,8 +21,6 @@ class TestLazyInitialization:
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
@@ -31,8 +31,6 @@ class TestLazyInitialization:
def test_driver_not_initialized_at_import(self):
"""Driver should be None after module import (no eager connection)."""
import api.attack_paths.database as db_module
assert db_module._driver is None
@patch("api.attack_paths.database.settings")
@@ -41,8 +39,6 @@ class TestLazyInitialization:
self, mock_driver_factory, mock_settings
):
"""init_driver() should create connection only when called."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
@@ -69,8 +65,6 @@ class TestLazyInitialization:
self, mock_driver_factory, mock_settings
):
"""Subsequent calls should return cached driver without reconnecting."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
@@ -99,8 +93,6 @@ class TestLazyInitialization:
self, mock_driver_factory, mock_settings
):
"""get_driver() should use init_driver() for lazy initialization."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
@@ -118,14 +110,50 @@ class TestLazyInitialization:
mock_driver_factory.assert_called_once()
class TestConnectionAcquisitionTimeout:
"""Test that the connection acquisition timeout is configurable."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
original_driver = db_module._driver
original_timeout = db_module.CONN_ACQUISITION_TIMEOUT
db_module._driver = None
yield
db_module._driver = original_driver
db_module.CONN_ACQUISITION_TIMEOUT = original_timeout
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_driver_receives_configured_timeout(
self, mock_driver_factory, mock_settings
):
"""init_driver() should pass CONN_ACQUISITION_TIMEOUT to the neo4j driver."""
mock_driver_factory.return_value = MagicMock()
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
db_module.CONN_ACQUISITION_TIMEOUT = 42
db_module.init_driver()
_, kwargs = mock_driver_factory.call_args
assert kwargs["connection_acquisition_timeout"] == 42
class TestAtexitRegistration:
"""Test that atexit cleanup handler is registered correctly."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
@@ -141,8 +169,6 @@ class TestAtexitRegistration:
self, mock_driver_factory, mock_atexit_register, mock_settings
):
"""atexit.register should be called on first initialization."""
import api.attack_paths.database as db_module
mock_driver_factory.return_value = MagicMock()
mock_settings.DATABASES = {
"neo4j": {
@@ -168,8 +194,6 @@ class TestAtexitRegistration:
The double-checked locking on _driver ensures the atexit registration
block only executes once (when _driver is first created).
"""
import api.attack_paths.database as db_module
mock_driver_factory.return_value = MagicMock()
mock_settings.DATABASES = {
"neo4j": {
@@ -194,8 +218,6 @@ class TestCloseDriver:
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
@@ -206,8 +228,6 @@ class TestCloseDriver:
def test_close_driver_closes_and_clears_driver(self):
"""close_driver() should close the driver and set it to None."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
db_module._driver = mock_driver
@@ -218,8 +238,6 @@ class TestCloseDriver:
def test_close_driver_handles_none_driver(self):
"""close_driver() should handle case where driver is None."""
import api.attack_paths.database as db_module
db_module._driver = None
# Should not raise
@@ -229,8 +247,6 @@ class TestCloseDriver:
def test_close_driver_clears_driver_even_on_close_error(self):
"""Driver should be cleared even if close() raises an exception."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver.close.side_effect = Exception("Connection error")
db_module._driver = mock_driver
@@ -246,8 +262,6 @@ class TestExecuteReadQuery:
"""Test read query execution helper."""
def test_execute_read_query_calls_read_session_and_returns_result(self):
import api.attack_paths.database as db_module
tx = MagicMock()
expected_graph = MagicMock()
run_result = MagicMock()
@@ -289,8 +303,6 @@ class TestExecuteReadQuery:
assert result is expected_graph
def test_execute_read_query_defaults_parameters_to_empty_dict(self):
import api.attack_paths.database as db_module
tx = MagicMock()
run_result = MagicMock()
run_result.graph.return_value = MagicMock()
@@ -325,8 +337,6 @@ class TestGetSessionReadOnly:
@pytest.fixture(autouse=True)
def reset_module_state(self):
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
yield
@@ -341,8 +351,6 @@ class TestGetSessionReadOnly:
)
def test_get_session_raises_write_query_not_allowed(self, neo4j_code):
"""Read-mode Neo4j errors should raise `WriteQueryNotAllowedException`."""
import api.attack_paths.database as db_module
mock_session = MagicMock()
neo4j_error = neo4j.exceptions.Neo4jError._hydrate_neo4j(
code=neo4j_code,
@@ -362,8 +370,6 @@ class TestGetSessionReadOnly:
def test_get_session_raises_generic_exception_for_other_errors(self):
"""Non-read-mode Neo4j errors should raise GraphDatabaseQueryException."""
import api.attack_paths.database as db_module
mock_session = MagicMock()
neo4j_error = neo4j.exceptions.Neo4jError._hydrate_neo4j(
code="Neo.ClientError.Statement.SyntaxError",
@@ -388,8 +394,6 @@ class TestThreadSafety:
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
@@ -404,8 +408,6 @@ class TestThreadSafety:
self, mock_driver_factory, mock_settings
):
"""Multiple threads calling init_driver() should create only one driver."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
@@ -448,8 +450,6 @@ class TestHasProviderData:
"""Test has_provider_data helper for checking provider nodes in Neo4j."""
def test_returns_true_when_nodes_exist(self):
import api.attack_paths.database as db_module
mock_session = MagicMock()
mock_result = MagicMock()
mock_result.single.return_value = MagicMock() # non-None record
@@ -468,8 +468,6 @@ class TestHasProviderData:
mock_session.run.assert_called_once()
def test_returns_false_when_no_nodes(self):
import api.attack_paths.database as db_module
mock_session = MagicMock()
mock_result = MagicMock()
mock_result.single.return_value = None
@@ -486,8 +484,6 @@ class TestHasProviderData:
assert db_module.has_provider_data("db-tenant-abc", "provider-123") is False
def test_returns_false_when_database_not_found(self):
import api.attack_paths.database as db_module
session_ctx = MagicMock()
session_ctx.__enter__.side_effect = db_module.GraphDatabaseQueryException(
message="Database does not exist",
@@ -503,8 +499,6 @@ class TestHasProviderData:
)
def test_raises_on_other_errors(self):
import api.attack_paths.database as db_module
session_ctx = MagicMock()
session_ctx.__enter__.side_effect = db_module.GraphDatabaseQueryException(
message="Connection refused",
@@ -1,13 +1,18 @@
from unittest.mock import MagicMock, patch
import pytest
from api import compliance as compliance_module
from api.compliance import (
generate_compliance_overview_template,
generate_scan_compliance,
get_compliance_frameworks,
get_prowler_provider_checks,
get_prowler_provider_compliance,
load_prowler_checks,
)
from api.models import Provider
from prowler.lib.check.compliance_models import Compliance
class TestCompliance:
@@ -250,3 +255,58 @@ class TestCompliance:
}
assert template == expected_template
@pytest.fixture
def reset_compliance_cache():
"""Reset the module-level cache so each test starts cold."""
previous = dict(compliance_module.AVAILABLE_COMPLIANCE_FRAMEWORKS)
compliance_module.AVAILABLE_COMPLIANCE_FRAMEWORKS.clear()
try:
yield
finally:
compliance_module.AVAILABLE_COMPLIANCE_FRAMEWORKS.clear()
compliance_module.AVAILABLE_COMPLIANCE_FRAMEWORKS.update(previous)
class TestGetComplianceFrameworks:
def test_returns_keys_from_compliance_get_bulk(self, reset_compliance_cache):
with patch("api.compliance.Compliance") as mock_compliance:
mock_compliance.get_bulk.return_value = {
"cis_1.4_aws": MagicMock(),
"mitre_attack_aws": MagicMock(),
}
result = get_compliance_frameworks(Provider.ProviderChoices.AWS)
assert sorted(result) == ["cis_1.4_aws", "mitre_attack_aws"]
mock_compliance.get_bulk.assert_called_once_with(Provider.ProviderChoices.AWS)
def test_caches_result_per_provider(self, reset_compliance_cache):
with patch("api.compliance.Compliance") as mock_compliance:
mock_compliance.get_bulk.return_value = {"cis_1.4_aws": MagicMock()}
get_compliance_frameworks(Provider.ProviderChoices.AWS)
get_compliance_frameworks(Provider.ProviderChoices.AWS)
# Cached after first call.
assert mock_compliance.get_bulk.call_count == 1
@pytest.mark.parametrize(
"provider_type",
[choice.value for choice in Provider.ProviderChoices],
)
def test_listing_is_subset_of_bulk(self, reset_compliance_cache, provider_type):
"""Regression for CLOUD-API-40S: every name returned by
``get_compliance_frameworks`` must be loadable via ``Compliance.get_bulk``.
A divergence here is what produced ``KeyError: 'csa_ccm_4.0'`` in
``generate_outputs_task`` after universal/multi-provider compliance
JSONs were introduced at the top-level ``prowler/compliance/`` path.
"""
bulk_keys = set(Compliance.get_bulk(provider_type).keys())
listed = set(get_compliance_frameworks(provider_type))
missing = listed - bulk_keys
assert not missing, (
f"get_compliance_frameworks({provider_type!r}) returned names not "
f"loadable by Compliance.get_bulk: {sorted(missing)}"
)
File diff suppressed because it is too large
+23 -2
@@ -4185,6 +4185,7 @@ class FindingGroupSerializer(BaseSerializerV1):
check_description = serializers.CharField(required=False, allow_null=True)
severity = serializers.CharField()
status = serializers.CharField()
muted = serializers.BooleanField()
impacted_providers = serializers.ListField(
child=serializers.CharField(), required=False
)
@@ -4192,9 +4193,25 @@ class FindingGroupSerializer(BaseSerializerV1):
resources_total = serializers.IntegerField()
pass_count = serializers.IntegerField()
fail_count = serializers.IntegerField()
manual_count = serializers.IntegerField()
pass_muted_count = serializers.IntegerField()
fail_muted_count = serializers.IntegerField()
manual_muted_count = serializers.IntegerField()
muted_count = serializers.IntegerField()
new_count = serializers.IntegerField()
changed_count = serializers.IntegerField()
new_fail_count = serializers.IntegerField()
new_fail_muted_count = serializers.IntegerField()
new_pass_count = serializers.IntegerField()
new_pass_muted_count = serializers.IntegerField()
new_manual_count = serializers.IntegerField()
new_manual_muted_count = serializers.IntegerField()
changed_fail_count = serializers.IntegerField()
changed_fail_muted_count = serializers.IntegerField()
changed_pass_count = serializers.IntegerField()
changed_pass_muted_count = serializers.IntegerField()
changed_manual_count = serializers.IntegerField()
changed_manual_muted_count = serializers.IntegerField()
first_seen_at = serializers.DateTimeField(required=False, allow_null=True)
last_seen_at = serializers.DateTimeField(required=False, allow_null=True)
failing_since = serializers.DateTimeField(required=False, allow_null=True)
@@ -4208,14 +4225,18 @@ class FindingGroupResourceSerializer(BaseSerializerV1):
Serializer for Finding Group Resources - resources within a finding group.
Returns individual resources with their current status, severity,
and timing information.
and timing information. Orphan findings (without any resource) expose the
finding id as `id` so the row stays identifiable in the UI.
"""
id = serializers.UUIDField(source="resource_id")
id = serializers.UUIDField(source="row_id")
resource = serializers.SerializerMethodField()
provider = serializers.SerializerMethodField()
finding_id = serializers.UUIDField()
status = serializers.CharField()
severity = serializers.CharField()
muted = serializers.BooleanField()
delta = serializers.CharField(required=False, allow_null=True)
first_seen_at = serializers.DateTimeField(required=False, allow_null=True)
last_seen_at = serializers.DateTimeField(required=False, allow_null=True)
muted_reason = serializers.CharField(required=False, allow_null=True)
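The switch from `resource_id` to `row_id` implies the view annotates a coalesced identifier. A hypothetical sketch of such an annotation follows; the real queryset change is in the suppressed diff, so treat every name here as an assumption:

from django.db.models import F
from django.db.models.functions import Coalesce

def annotate_row_id(queryset):
    # Hypothetical: prefer the resource id, fall back to the finding id so
    # orphan findings (no resource) still get a stable, unique row identifier.
    return queryset.annotate(row_id=Coalesce(F("resource_id"), F("finding_id")))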
File diff suppressed because it is too large
+3 -1
@@ -17,8 +17,10 @@ celery_app.config_from_object("django.conf:settings", namespace="CELERY")
celery_app.conf.update(result_extended=True, result_expires=None)
celery_app.conf.broker_transport_options = {
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT,
"queue_order_strategy": "priority",
}
celery_app.conf.task_default_priority = 6
celery_app.conf.result_backend_transport_options = {
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT
}
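A producer-side sketch of what this configuration enables: individual tasks can override `task_default_priority` at dispatch time. The task name below is a placeholder, and note as an assumption that kombu's Redis transport consumes lower-numbered priorities first (the reverse of AMQP), so verify the direction against your broker:

# Hypothetical dispatch; `perform_scan_task`, `tenant_id` and `scan_id` are
# placeholder names, not identifiers from this PR.
perform_scan_task.apply_async(
    kwargs={"tenant_id": tenant_id, "scan_id": scan_id},
    priority=0,  # ahead of the default (6) under queue_order_strategy="priority"
)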
+1 -1
@@ -15,7 +15,7 @@ from config.django.production import LOGGING as DJANGO_LOGGERS, DEBUG # noqa: E
from config.custom_logging import BackendLogger # noqa: E402
BIND_ADDRESS = env("DJANGO_BIND_ADDRESS", default="127.0.0.1")
PORT = env("DJANGO_PORT", default=8000)
PORT = env("DJANGO_PORT", default=8080)
# Server settings
bind = f"{BIND_ADDRESS}:{PORT}"
@@ -120,6 +120,7 @@ sentry_sdk.init(
# see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
before_send=before_send,
send_default_pii=True,
traces_sample_rate=env.float("DJANGO_SENTRY_TRACES_SAMPLE_RATE", default=0.02),
_experiments={
# Set continuous_profiling_auto_start to True
# to automatically start the profiler on when
+4 -4
@@ -14,8 +14,8 @@ from rest_framework import status
from rest_framework.test import APIClient
from tasks.jobs.backfill import (
backfill_resource_scan_summaries,
backfill_scan_category_summaries,
backfill_scan_resource_group_summaries,
aggregate_scan_category_summaries,
aggregate_scan_resource_group_summaries,
)
from api.attack_paths import (
@@ -1445,8 +1445,8 @@ def latest_scan_finding_with_categories(
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
backfill_scan_category_summaries(tenant_id, str(scan.id))
backfill_scan_resource_group_summaries(tenant_id, str(scan.id))
aggregate_scan_category_summaries(tenant_id, str(scan.id))
aggregate_scan_resource_group_summaries(tenant_id, str(scan.id))
return finding
Binary file not shown (image diff; After: 131 KiB).
+102 -11
@@ -1,6 +1,8 @@
# Portions of this file are based on code from the Cartography project
# (https://github.com/cartography-cncf/cartography), which is licensed under the Apache 2.0 License.
import time
from typing import Any
import aioboto3
@@ -33,7 +35,7 @@ def start_aws_ingestion(
For the scan progress updates:
- The caller of this function (`tasks.jobs.attack_paths.scan.run`) has set it to 2.
- When the control returns to the caller, it will be set to 95.
- When the control returns to the caller, it will be set to 93.
"""
# Initialize variables common to all jobs
@@ -47,7 +49,7 @@ def start_aws_ingestion(
}
boto3_session = get_boto3_session(prowler_api_provider, prowler_sdk_provider)
regions: list[str] = list(prowler_sdk_provider._enabled_regions)
regions: list[str] = resolve_aws_regions(prowler_api_provider, prowler_sdk_provider)
requested_syncs = list(cartography_aws.RESOURCE_FUNCTIONS.keys())
sync_args = cartography_aws._build_aws_sync_kwargs(
@@ -89,34 +91,50 @@ def start_aws_ingestion(
logger.info(
f"Syncing function permission_relationships for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.RESOURCE_FUNCTIONS["permission_relationships"](**sync_args)
logger.info(
f"Synced function permission_relationships for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 88)
if "resourcegroupstaggingapi" in requested_syncs:
logger.info(
f"Syncing function resourcegroupstaggingapi for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.RESOURCE_FUNCTIONS["resourcegroupstaggingapi"](**sync_args)
logger.info(
f"Synced function resourcegroupstaggingapi for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 89)
logger.info(
f"Syncing ec2_iaminstanceprofile scoped analysis for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.run_scoped_analysis_job(
"aws_ec2_iaminstanceprofile.json",
neo4j_session,
common_job_parameters,
)
logger.info(
f"Synced ec2_iaminstanceprofile scoped analysis for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 90)
logger.info(
f"Syncing lambda_ecr analysis for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.run_analysis_job(
"aws_lambda_ecr.json",
neo4j_session,
common_job_parameters,
)
logger.info(
f"Synced lambda_ecr analysis for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
if all(
s in requested_syncs
@@ -125,25 +143,34 @@ def start_aws_ingestion(
logger.info(
f"Syncing lb_container_exposure scoped analysis for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.run_scoped_analysis_job(
"aws_lb_container_exposure.json",
neo4j_session,
common_job_parameters,
)
logger.info(
f"Synced lb_container_exposure scoped analysis for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
if all(s in requested_syncs for s in ["ec2:network_acls", "ec2:load_balancer_v2"]):
logger.info(
f"Syncing lb_nacl_direct scoped analysis for AWS account {prowler_api_provider.uid}"
)
t0 = time.perf_counter()
cartography_aws.run_scoped_analysis_job(
"aws_lb_nacl_direct.json",
neo4j_session,
common_job_parameters,
)
logger.info(
f"Synced lb_nacl_direct scoped analysis for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 91)
logger.info(f"Syncing metadata for AWS account {prowler_api_provider.uid}")
t0 = time.perf_counter()
cartography_aws.merge_module_sync_metadata(
neo4j_session,
group_type="AWSAccount",
@@ -152,24 +179,23 @@ def start_aws_ingestion(
update_tag=cartography_config.update_tag,
stat_handler=cartography_aws.stat_handler,
)
logger.info(
f"Synced metadata for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 92)
# Removing the added extra field
del common_job_parameters["AWS_ID"]
logger.info(f"Syncing cleanup_job for AWS account {prowler_api_provider.uid}")
cartography_aws.run_cleanup_job(
"aws_post_ingestion_principals_cleanup.json",
neo4j_session,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 93)
logger.info(f"Syncing analysis for AWS account {prowler_api_provider.uid}")
t0 = time.perf_counter()
cartography_aws._perform_aws_analysis(
requested_syncs, neo4j_session, common_job_parameters
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 94)
logger.info(
f"Synced analysis for AWS account {prowler_api_provider.uid} in {time.perf_counter() - t0:.3f}s"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 93)
return failed_syncs
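The `t0 = time.perf_counter()` / paired-log pattern repeated through this hunk could be factored into a context manager; a small sketch of that idiom (not code from this PR):

import time
from contextlib import contextmanager

@contextmanager
def log_duration(logger, label: str):
    """Log `label` plus the elapsed wall-clock time when the block exits."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        logger.info(f"{label} in {time.perf_counter() - t0:.3f}s")

# Usage sketch:
# with log_duration(logger, f"Synced analysis for AWS account {uid}"):
#     cartography_aws._perform_aws_analysis(...)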
@@ -200,6 +226,48 @@ def get_boto3_session(
return boto3_session
def resolve_aws_regions(
prowler_api_provider: ProwlerAPIProvider,
prowler_sdk_provider: ProwlerSDKProvider,
) -> list[str]:
"""Resolve the regions to scan, falling back when `_enabled_regions` is `None`.
The SDK silently sets `_enabled_regions` to `None` when `ec2:DescribeRegions`
fails (missing IAM permission, transient error). Without a fallback the
Cartography ingestion crashes with a non-actionable `TypeError`. Try the
user's `audited_regions` next, then the partition's static region list.
Excluded regions are honored on every branch.
"""
if prowler_sdk_provider._enabled_regions is not None:
regions = set(prowler_sdk_provider._enabled_regions)
elif prowler_sdk_provider.identity.audited_regions:
regions = set(prowler_sdk_provider.identity.audited_regions)
else:
partition = prowler_sdk_provider.identity.partition
try:
regions = prowler_sdk_provider.get_available_aws_service_regions(
"ec2", partition
)
except KeyError:
raise RuntimeError(
f"No region data available for partition {partition!r}; "
f"cannot determine regions to scan for "
f"{prowler_api_provider.uid}"
)
logger.warning(
f"Could not enumerate enabled regions for AWS account "
f"{prowler_api_provider.uid}; falling back to all regions in "
f"partition {partition!r}"
)
excluded = set(getattr(prowler_sdk_provider, "_excluded_regions", None) or ())
return sorted(regions - excluded)
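A stub-based illustration of the precedence (`_enabled_regions`, then `audited_regions`, then static partition data) with exclusions applied; the objects below are stand-ins, not the real provider classes:

from types import SimpleNamespace

sdk = SimpleNamespace(
    _enabled_regions=None,  # ec2:DescribeRegions failed, so this is None
    identity=SimpleNamespace(
        audited_regions=["eu-west-1", "us-east-1"],
        partition="aws",
    ),
    _excluded_regions={"us-east-1"},
)
api = SimpleNamespace(uid="123456789012")  # placeholder account id

# Falls through to audited_regions and honors the exclusion.
assert resolve_aws_regions(api, sdk) == ["eu-west-1"]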
def get_aioboto3_session(boto3_session: boto3.Session) -> aioboto3.Session:
return aioboto3.Session(botocore_session=boto3_session._session)
@@ -234,6 +302,8 @@ def sync_aws_account(
)
try:
func_t0 = time.perf_counter()
# `ecr:image_layers` uses `aioboto3_session` instead of `boto3_session`
if func_name == "ecr:image_layers":
cartography_aws.RESOURCE_FUNCTIONS[func_name](
@@ -257,7 +327,15 @@ def sync_aws_account(
else:
cartography_aws.RESOURCE_FUNCTIONS[func_name](**sync_args)
logger.info(
f"Synced function {func_name} for AWS account {prowler_api_provider.uid} in {time.perf_counter() - func_t0:.3f}s"
)
except Exception as e:
logger.info(
f"Synced function {func_name} for AWS account {prowler_api_provider.uid} in {time.perf_counter() - func_t0:.3f}s (FAILED)"
)
exception_message = utils.stringify_exception(
e, f"Exception for AWS sync function: {func_name}"
)
@@ -277,3 +355,16 @@ def sync_aws_account(
)
return failed_syncs
def extract_short_uid(uid: str) -> str:
"""Return the short identifier from an AWS ARN or resource ID.
Supported inputs end in one of:
- `<type>/<id>` (e.g. `instance/i-xxx`)
- `<type>:<id>` (e.g. `function:name`)
- `<id>` (e.g. `bucket-name` or `i-xxx`)
If `uid` is already a short resource ID, it is returned unchanged.
"""
return uid.rsplit("/", 1)[-1].rsplit(":", 1)[-1]
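A few concrete inputs showing what that double `rsplit` yields (the ARNs are illustrative values):

# Slash-delimited ARNs keep the final path segment.
assert extract_short_uid(
    "arn:aws:ec2:us-east-1:123456789012:instance/i-0abc123"
) == "i-0abc123"
# Colon-delimited ARNs keep the final component.
assert extract_short_uid(
    "arn:aws:lambda:us-east-1:123456789012:function:my-fn"
) == "my-fn"
# Already-short identifiers pass through unchanged.
assert extract_short_uid("my-bucket") == "my-bucket"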
@@ -18,28 +18,45 @@ logger = get_task_logger(__name__)
def cleanup_stale_attack_paths_scans() -> dict:
"""
Find `EXECUTING` `AttackPathsScan` scans whose workers are dead or that have
exceeded the stale threshold, and mark them as `FAILED`.
Mark stale `AttackPathsScan` rows as `FAILED`.
Two-pass detection:
Covers two stuck-state scenarios:
1. `EXECUTING` scans whose workers are dead, or that have exceeded the
stale threshold while alive.
2. `SCHEDULED` scans that never made it to a worker (parent scan
crashed before dispatch, broker lost the message, etc.). Detected by
age plus the parent `Scan` no longer being in flight.
"""
threshold = timedelta(minutes=ATTACK_PATHS_SCAN_STALE_THRESHOLD_MINUTES)
now = datetime.now(tz=timezone.utc)
cutoff = now - threshold
cleaned_up: list[str] = []
cleaned_up.extend(_cleanup_stale_executing_scans(cutoff))
cleaned_up.extend(_cleanup_stale_scheduled_scans(cutoff))
logger.info(
f"Stale `AttackPathsScan` cleanup: {len(cleaned_up)} scan(s) cleaned up"
)
return {"cleaned_up_count": len(cleaned_up), "scan_ids": cleaned_up}
def _cleanup_stale_executing_scans(cutoff: datetime) -> list[str]:
"""
Two-pass detection for `EXECUTING` scans:
1. If `TaskResult.worker` exists, ping the worker.
- Dead worker: cleanup immediately (any age).
- Alive + past threshold: revoke the task, then cleanup.
- Alive + within threshold: skip.
2. If no worker field: fall back to time-based heuristic only.
"""
threshold = timedelta(minutes=ATTACK_PATHS_SCAN_STALE_THRESHOLD_MINUTES)
now = datetime.now(tz=timezone.utc)
cutoff = now - threshold
executing_scans = (
executing_scans = list(
AttackPathsScan.all_objects.using(MainRouter.admin_db)
.filter(state=StateChoices.EXECUTING)
.select_related("task__task_runner_task")
)
# Cache worker liveness so each worker is pinged at most once
executing_scans = list(executing_scans)
workers = {
tr.worker
for scan in executing_scans
@@ -48,7 +65,7 @@ def cleanup_stale_attack_paths_scans() -> dict:
}
worker_alive = {w: _is_worker_alive(w) for w in workers}
cleaned_up = []
cleaned_up: list[str] = []
for scan in executing_scans:
task_result = (
@@ -65,9 +82,7 @@ def cleanup_stale_attack_paths_scans() -> dict:
# Alive but stale — revoke before cleanup
_revoke_task(task_result)
reason = (
"Scan exceeded stale threshold — " "cleaned up by periodic task"
)
reason = "Scan exceeded stale threshold — cleaned up by periodic task"
else:
reason = "Worker dead — cleaned up by periodic task"
else:
@@ -82,10 +97,57 @@ def cleanup_stale_attack_paths_scans() -> dict:
if _cleanup_scan(scan, task_result, reason):
cleaned_up.append(str(scan.id))
logger.info(
f"Stale `AttackPathsScan` cleanup: {len(cleaned_up)} scan(s) cleaned up"
return cleaned_up
def _cleanup_stale_scheduled_scans(cutoff: datetime) -> list[str]:
"""
Cleanup `SCHEDULED` scans that never reached a worker.
Detection:
- `state == SCHEDULED`
- `started_at < cutoff`
- parent `Scan` is no longer in flight (terminal state or missing). This
avoids cleaning up rows whose parent Prowler scan is legitimately still
running.
For each match: revoke the queued task (best-effort; harmless if already
consumed), atomically flip to `FAILED`, and mark the `TaskResult`. The
temp Neo4j database is never created while `SCHEDULED`, so no drop is
needed.
"""
scheduled_scans = list(
AttackPathsScan.all_objects.using(MainRouter.admin_db)
.filter(
state=StateChoices.SCHEDULED,
started_at__lt=cutoff,
)
.select_related("task__task_runner_task", "scan")
)
return {"cleaned_up_count": len(cleaned_up), "scan_ids": cleaned_up}
cleaned_up: list[str] = []
parent_terminal = (
StateChoices.COMPLETED,
StateChoices.FAILED,
StateChoices.CANCELLED,
)
for scan in scheduled_scans:
parent_scan = scan.scan
if parent_scan is not None and parent_scan.state not in parent_terminal:
continue
task_result = (
getattr(scan.task, "task_runner_task", None) if scan.task else None
)
if task_result:
_revoke_task(task_result, terminate=False)
reason = "Scan never started — cleaned up by periodic task"
if _cleanup_scheduled_scan(scan, task_result, reason):
cleaned_up.append(str(scan.id))
return cleaned_up
def _is_worker_alive(worker: str) -> bool:
@@ -98,12 +160,17 @@ def _is_worker_alive(worker: str) -> bool:
return True
def _revoke_task(task_result) -> None:
"""Send `SIGTERM` to a hung Celery task. Non-fatal on failure."""
def _revoke_task(task_result, terminate: bool = True) -> None:
"""Revoke a Celery task. Non-fatal on failure.
`terminate=True` SIGTERMs the worker if the task is mid-execution; use
for EXECUTING cleanup. `terminate=False` only marks the task id revoked
across workers, so any worker pulling the queued message discards it;
use for SCHEDULED cleanup where the task hasn't run yet.
"""
try:
current_app.control.revoke(
task_result.task_id, terminate=True, signal="SIGTERM"
)
kwargs = {"terminate": True, "signal": "SIGTERM"} if terminate else {}
current_app.control.revoke(task_result.task_id, **kwargs)
logger.info(f"Revoked task {task_result.task_id}")
except Exception:
logger.exception(f"Failed to revoke task {task_result.task_id}")
@@ -125,28 +192,64 @@ def _cleanup_scan(scan, task_result, reason: str) -> bool:
except Exception:
logger.exception(f"Failed to drop temp database {tmp_db_name}")
# 2. Lock row, verify still EXECUTING, mark FAILED — all atomic
with rls_transaction(str(scan.tenant_id)):
try:
fresh_scan = AttackPathsScan.objects.select_for_update().get(id=scan.id)
except AttackPathsScan.DoesNotExist:
logger.warning(f"Scan {scan_id_str} no longer exists, skipping")
return False
fresh_scan = _finalize_failed_scan(scan, StateChoices.EXECUTING, reason)
if fresh_scan is None:
return False
if fresh_scan.state != StateChoices.EXECUTING:
logger.info(f"Scan {scan_id_str} is now {fresh_scan.state}, skipping")
return False
_mark_scan_finished(fresh_scan, StateChoices.FAILED, {"global_error": reason})
# 3. Mark `TaskResult` as `FAILURE` (not RLS-protected, outside lock)
# Mark `TaskResult` as `FAILURE` (not RLS-protected, outside lock)
if task_result:
task_result.status = states.FAILURE
task_result.date_done = datetime.now(tz=timezone.utc)
task_result.save(update_fields=["status", "date_done"])
# 4. Recover graph_data_ready if provider data still exists
recover_graph_data_ready(fresh_scan)
logger.info(f"Cleaned up stale scan {scan_id_str}: {reason}")
return True
def _cleanup_scheduled_scan(scan, task_result, reason: str) -> bool:
"""
Clean up a `SCHEDULED` scan that never reached a worker.
Skips the temp Neo4j drop: the database is only created once the worker
enters `EXECUTING`, so dropping it here just produces noisy log output.
Returns `True` if the scan was actually cleaned up, `False` if skipped.
"""
scan_id_str = str(scan.id)
fresh_scan = _finalize_failed_scan(scan, StateChoices.SCHEDULED, reason)
if fresh_scan is None:
return False
if task_result:
task_result.status = states.FAILURE
task_result.date_done = datetime.now(tz=timezone.utc)
task_result.save(update_fields=["status", "date_done"])
logger.info(f"Cleaned up scheduled scan {scan_id_str}: {reason}")
return True
def _finalize_failed_scan(scan, expected_state: str, reason: str):
"""
Atomically lock the row, verify it's still in `expected_state`, and
mark it `FAILED`. Returns the locked row on success, `None` if the
row is gone or has already moved on.
"""
scan_id_str = str(scan.id)
with rls_transaction(str(scan.tenant_id)):
try:
fresh_scan = AttackPathsScan.objects.select_for_update().get(id=scan.id)
except AttackPathsScan.DoesNotExist:
logger.warning(f"Scan {scan_id_str} no longer exists, skipping")
return None
if fresh_scan.state != expected_state:
logger.info(f"Scan {scan_id_str} is now {fresh_scan.state}, skipping")
return None
_mark_scan_finished(fresh_scan, StateChoices.FAILED, {"global_error": reason})
return fresh_scan
@@ -8,9 +8,9 @@ from tasks.jobs.attack_paths import aws
# Batch size for Neo4j write operations (resource labeling, cleanup)
BATCH_SIZE = env.int("ATTACK_PATHS_BATCH_SIZE", 1000)
# Batch size for Postgres findings fetch (keyset pagination page size)
FINDINGS_BATCH_SIZE = env.int("ATTACK_PATHS_FINDINGS_BATCH_SIZE", 500)
FINDINGS_BATCH_SIZE = env.int("ATTACK_PATHS_FINDINGS_BATCH_SIZE", 1000)
# Batch size for temp-to-tenant graph sync (nodes and relationships per cursor page)
SYNC_BATCH_SIZE = env.int("ATTACK_PATHS_SYNC_BATCH_SIZE", 250)
SYNC_BATCH_SIZE = env.int("ATTACK_PATHS_SYNC_BATCH_SIZE", 1000)
# Neo4j internal labels (Prowler-specific, not provider-specific)
# - `Internet`: Singleton node representing external internet access for exposed-resource queries
@@ -37,6 +37,8 @@ class ProviderConfig:
# Label for resources connected to the account node, enabling indexed finding lookups.
resource_label: str # e.g., "_AWSResource"
ingestion_function: Callable
# Maps a Postgres resource UID (e.g. full ARN) to the short-id form Cartography stores on some node types (e.g. `i-xxx` for EC2Instance).
short_uid_extractor: Callable[[str], str]
# Provider Configurations
@@ -48,6 +50,7 @@ AWS_CONFIG = ProviderConfig(
uid_field="arn",
resource_label="_AWSResource",
ingestion_function=aws.start_aws_ingestion,
short_uid_extractor=aws.extract_short_uid,
)
PROVIDER_CONFIGS: dict[str, ProviderConfig] = {
@@ -116,6 +119,21 @@ def get_provider_resource_label(provider_type: str) -> str:
return config.resource_label if config else "_UnknownProviderResource"
def _identity_short_uid(uid: str) -> str:
"""Fallback short-uid extractor for providers without a custom mapping."""
return uid
def get_short_uid_extractor(provider_type: str) -> Callable[[str], str]:
"""Get the short-uid extractor for a provider type.
Returns an identity function when the provider is unknown, so callers can
rely on a callable always being returned.
"""
config = PROVIDER_CONFIGS.get(provider_type)
return config.short_uid_extractor if config else _identity_short_uid
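Usage is uniform across providers, and unknown types degrade to a pass-through. This assumes `PROVIDER_CONFIGS` registers `AWS_CONFIG` under the key "aws":

extractor = get_short_uid_extractor("aws")
assert extractor(
    "arn:aws:ec2:us-east-1:123456789012:instance/i-0abc123"
) == "i-0abc123"
# Unknown providers get the identity fallback instead of None.
assert get_short_uid_extractor("not-a-provider")("anything") == "anything"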
# Dynamic Isolation Label Helpers
# --------------------------------
@@ -67,25 +67,52 @@ def retrieve_attack_paths_scan(
return None
def set_attack_paths_scan_task_id(
tenant_id: str,
scan_pk: str,
task_id: str,
) -> None:
"""Persist the Celery `task_id` on the `AttackPathsScan` row.
Called at dispatch time (when `apply_async` returns) so the row carries
the task id even while still `SCHEDULED`. This lets the periodic
cleanup revoke queued messages for scans that never reached a worker.
"""
with rls_transaction(tenant_id):
ProwlerAPIAttackPathsScan.objects.filter(id=scan_pk).update(task_id=task_id)
def starting_attack_paths_scan(
attack_paths_scan: ProwlerAPIAttackPathsScan,
task_id: str,
cartography_config: CartographyConfig,
) -> None:
with rls_transaction(attack_paths_scan.tenant_id):
attack_paths_scan.task_id = task_id
attack_paths_scan.state = StateChoices.EXECUTING
attack_paths_scan.started_at = datetime.now(tz=timezone.utc)
attack_paths_scan.update_tag = cartography_config.update_tag
) -> bool:
"""Flip the row from `SCHEDULED` to `EXECUTING` atomically.
attack_paths_scan.save(
update_fields=[
"task_id",
"state",
"started_at",
"update_tag",
]
)
Returns `False` if the row is gone or has already moved past
`SCHEDULED` (e.g., periodic cleanup raced ahead and marked it
`FAILED` while the worker message was still in flight).
"""
with rls_transaction(attack_paths_scan.tenant_id):
try:
locked = ProwlerAPIAttackPathsScan.objects.select_for_update().get(
id=attack_paths_scan.id
)
except ProwlerAPIAttackPathsScan.DoesNotExist:
return False
if locked.state != StateChoices.SCHEDULED:
return False
locked.state = StateChoices.EXECUTING
locked.started_at = datetime.now(tz=timezone.utc)
locked.update_tag = cartography_config.update_tag
locked.save(update_fields=["state", "started_at", "update_tag"])
# Keep the in-memory object the caller is holding in sync.
attack_paths_scan.state = locked.state
attack_paths_scan.started_at = locked.started_at
attack_paths_scan.update_tag = locked.update_tag
return True
def _mark_scan_finished(
@@ -5,14 +5,14 @@ This module handles:
- Adding resource labels to Cartography nodes for efficient lookups
- Loading Prowler findings into the graph
- Linking findings to resources
- Cleaning up stale findings
"""
from collections import defaultdict
from typing import Any, Generator
from typing import Any, Callable, Generator
from uuid import UUID
import neo4j
from cartography.config import Config as CartographyConfig
from celery.utils.log import get_task_logger
from tasks.jobs.attack_paths.config import (
@@ -21,10 +21,10 @@ from tasks.jobs.attack_paths.config import (
get_node_uid_field,
get_provider_resource_label,
get_root_node_label,
get_short_uid_extractor,
)
from tasks.jobs.attack_paths.queries import (
ADD_RESOURCE_LABEL_TEMPLATE,
CLEANUP_FINDINGS_TEMPLATE,
INSERT_FINDING_TEMPLATE,
render_cypher_template,
)
@@ -58,7 +58,9 @@ _DB_QUERY_FIELDS = [
]
def _to_neo4j_dict(record: dict[str, Any], resource_uid: str) -> dict[str, Any]:
def _to_neo4j_dict(
record: dict[str, Any], resource_uid: str, resource_short_uid: str
) -> dict[str, Any]:
"""Transform a Django `.values()` record into a `dict` ready for Neo4j ingestion."""
return {
"id": str(record["id"]),
@@ -76,6 +78,7 @@ def _to_neo4j_dict(record: dict[str, Any], resource_uid: str) -> dict[str, Any]:
"muted": record["muted"],
"muted_reason": record["muted_reason"],
"resource_uid": resource_uid,
"resource_short_uid": resource_short_uid,
}
@@ -88,18 +91,21 @@ def analysis(
prowler_api_provider: Provider,
scan_id: str,
config: CartographyConfig,
) -> None:
) -> tuple[int, int]:
"""
Main entry point for Prowler findings analysis.
Adds resource labels, loads findings, and cleans up stale data.
Adds resource labels and loads findings.
Returns (labeled_nodes, findings_loaded).
"""
add_resource_label(
total_labeled = add_resource_label(
neo4j_session, prowler_api_provider.provider, str(prowler_api_provider.uid)
)
findings_data = stream_findings_with_resources(prowler_api_provider, scan_id)
load_findings(neo4j_session, findings_data, prowler_api_provider, config)
cleanup_findings(neo4j_session, prowler_api_provider, config)
total_loaded = load_findings(
neo4j_session, findings_data, prowler_api_provider, config
)
return total_labeled, total_loaded
def add_resource_label(
@@ -149,12 +155,11 @@ def load_findings(
findings_batches: Generator[list[dict[str, Any]], None, None],
prowler_api_provider: Provider,
config: CartographyConfig,
) -> None:
) -> int:
"""Load Prowler findings into the graph, linking them to resources."""
query = render_cypher_template(
INSERT_FINDING_TEMPLATE,
{
"__ROOT_NODE_LABEL__": get_root_node_label(prowler_api_provider.provider),
"__NODE_UID_FIELD__": get_node_uid_field(prowler_api_provider.provider),
"__RESOURCE_LABEL__": get_provider_resource_label(
prowler_api_provider.provider
@@ -163,13 +168,14 @@ def load_findings(
)
parameters = {
"provider_uid": str(prowler_api_provider.uid),
"last_updated": config.update_tag,
"prowler_version": ProwlerConfig.prowler_version,
}
batch_num = 0
total_records = 0
edges_merged = 0
edges_dropped = 0
for batch in findings_batches:
batch_num += 1
batch_size = len(batch)
@@ -178,31 +184,16 @@ def load_findings(
parameters["findings_data"] = batch
logger.info(f"Loading findings batch {batch_num} ({batch_size} records)")
neo4j_session.run(query, parameters)
summary = neo4j_session.run(query, parameters).single()
if summary is not None:
edges_merged += summary.get("merged_count", 0)
edges_dropped += summary.get("dropped_count", 0)
logger.info(f"Finished loading {total_records} records in {batch_num} batches")
def cleanup_findings(
neo4j_session: neo4j.Session,
prowler_api_provider: Provider,
config: CartographyConfig,
) -> None:
"""Remove stale findings (classic Cartography behaviour)."""
parameters = {
"last_updated": config.update_tag,
"batch_size": BATCH_SIZE,
}
batch = 1
deleted_count = 1
while deleted_count > 0:
logger.info(f"Cleaning findings batch {batch}")
result = neo4j_session.run(CLEANUP_FINDINGS_TEMPLATE, parameters)
deleted_count = result.single().get("deleted_findings_count", 0)
batch += 1
logger.info(
f"Finished loading {total_records} records in {batch_num} batches "
f"(edges_merged={edges_merged}, edges_dropped={edges_dropped})"
)
return total_records
# Findings Streaming (Generator-based)
@@ -226,8 +217,9 @@ def stream_findings_with_resources(
)
tenant_id = prowler_api_provider.tenant_id
short_uid_extractor = get_short_uid_extractor(prowler_api_provider.provider)
for batch in _paginate_findings(tenant_id, scan_id):
enriched = _enrich_batch_with_resources(batch, tenant_id)
enriched = _enrich_batch_with_resources(batch, tenant_id, short_uid_extractor)
if enriched:
yield enriched
@@ -273,7 +265,9 @@ def _fetch_findings_batch(
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
# Use `all_objects` to get `Findings` even on soft-deleted `Providers`
# but the provider has already been validated as active in this context
qs = FindingModel.all_objects.filter(scan_id=scan_id).order_by("id")
qs = FindingModel.all_objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).order_by("id")
if after_id is not None:
qs = qs.filter(id__gt=after_id)
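The fetch above is one page of a keyset (cursor) pagination loop. A generic sketch of the driving generator, assuming a `fetch_page(after_id, limit)` callable shaped like `_fetch_findings_batch`:

from typing import Any, Callable, Generator

def paginate_by_id(
    fetch_page: Callable[[Any, int], list[dict]],
    page_size: int,
) -> Generator[list[dict], None, None]:
    """Yield batches ordered by `id`, resuming each page after the last id seen."""
    after_id = None
    while True:
        batch = fetch_page(after_id, page_size)
        if not batch:
            return
        yield batch
        # Keyset cursor: next page starts strictly after this page's last id.
        after_id = batch[-1]["id"]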
@@ -288,6 +282,7 @@ def _fetch_findings_batch(
def _enrich_batch_with_resources(
findings_batch: list[dict[str, Any]],
tenant_id: str,
short_uid_extractor: Callable[[str], str],
) -> list[dict[str, Any]]:
"""
Enrich findings with their resource UIDs.
@@ -299,7 +294,7 @@ def _enrich_batch_with_resources(
resource_map = _build_finding_resource_map(finding_ids, tenant_id)
return [
_to_neo4j_dict(finding, resource_uid)
_to_neo4j_dict(finding, resource_uid, short_uid_extractor(resource_uid))
for finding in findings_batch
for resource_uid in resource_map.get(finding["id"], [])
]
@@ -13,14 +13,13 @@ from tasks.jobs.attack_paths.config import (
logger = get_task_logger(__name__)
# Indexes for Prowler findings and resource lookups
# Indexes for Prowler Findings and resource lookups
FINDINGS_INDEX_STATEMENTS = [
# Resource indexes for Prowler Finding lookups
"CREATE INDEX aws_resource_arn IF NOT EXISTS FOR (n:_AWSResource) ON (n.arn);",
"CREATE INDEX aws_resource_id IF NOT EXISTS FOR (n:_AWSResource) ON (n.id);",
# Prowler Finding indexes
f"CREATE INDEX prowler_finding_id IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.id);",
f"CREATE INDEX prowler_finding_lastupdated IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.lastupdated);",
f"CREATE INDEX prowler_finding_status IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.status);",
# Internet node index for MERGE lookups
f"CREATE INDEX internet_id IF NOT EXISTS FOR (n:{INTERNET_NODE_LABEL}) ON (n.id);",
@@ -32,63 +32,59 @@ ADD_RESOURCE_LABEL_TEMPLATE = """
"""
INSERT_FINDING_TEMPLATE = f"""
MATCH (account:__ROOT_NODE_LABEL__ {{id: $provider_uid}})
UNWIND $findings_data AS finding_data
OPTIONAL MATCH (account)-->(resource_by_uid:__RESOURCE_LABEL__)
WHERE resource_by_uid.__NODE_UID_FIELD__ = finding_data.resource_uid
WITH account, finding_data, resource_by_uid
OPTIONAL MATCH (account)-->(resource_by_id:__RESOURCE_LABEL__)
OPTIONAL MATCH (resource_by_uid:__RESOURCE_LABEL__ {{__NODE_UID_FIELD__: finding_data.resource_uid}})
OPTIONAL MATCH (resource_by_id:__RESOURCE_LABEL__ {{id: finding_data.resource_uid}})
WHERE resource_by_uid IS NULL
AND resource_by_id.id = finding_data.resource_uid
WITH account, finding_data, COALESCE(resource_by_uid, resource_by_id) AS resource
WHERE resource IS NOT NULL
OPTIONAL MATCH (resource_by_short:__RESOURCE_LABEL__ {{id: finding_data.resource_short_uid}})
WHERE resource_by_uid IS NULL AND resource_by_id IS NULL
WITH finding_data,
resource_by_uid,
resource_by_id,
head(collect(resource_by_short)) AS resource_by_short
WITH finding_data,
COALESCE(resource_by_uid, resource_by_id, resource_by_short) AS resource
MERGE (finding:{PROWLER_FINDING_LABEL} {{id: finding_data.id}})
ON CREATE SET
finding.id = finding_data.id,
finding.uid = finding_data.uid,
finding.inserted_at = finding_data.inserted_at,
finding.updated_at = finding_data.updated_at,
finding.first_seen_at = finding_data.first_seen_at,
finding.scan_id = finding_data.scan_id,
finding.delta = finding_data.delta,
finding.status = finding_data.status,
finding.status_extended = finding_data.status_extended,
finding.severity = finding_data.severity,
finding.check_id = finding_data.check_id,
finding.check_title = finding_data.check_title,
finding.muted = finding_data.muted,
finding.muted_reason = finding_data.muted_reason,
finding.firstseen = timestamp(),
finding.lastupdated = $last_updated,
finding._module_name = 'cartography:prowler',
finding._module_version = $prowler_version
ON MATCH SET
finding.status = finding_data.status,
finding.status_extended = finding_data.status_extended,
finding.lastupdated = $last_updated
FOREACH (_ IN CASE WHEN resource IS NOT NULL THEN [1] ELSE [] END |
MERGE (finding:{PROWLER_FINDING_LABEL} {{id: finding_data.id}})
ON CREATE SET
finding.id = finding_data.id,
finding.uid = finding_data.uid,
finding.inserted_at = finding_data.inserted_at,
finding.updated_at = finding_data.updated_at,
finding.first_seen_at = finding_data.first_seen_at,
finding.scan_id = finding_data.scan_id,
finding.delta = finding_data.delta,
finding.status = finding_data.status,
finding.status_extended = finding_data.status_extended,
finding.severity = finding_data.severity,
finding.check_id = finding_data.check_id,
finding.check_title = finding_data.check_title,
finding.muted = finding_data.muted,
finding.muted_reason = finding_data.muted_reason,
finding.firstseen = timestamp(),
finding.lastupdated = $last_updated,
finding._module_name = 'cartography:prowler',
finding._module_version = $prowler_version
ON MATCH SET
finding.status = finding_data.status,
finding.status_extended = finding_data.status_extended,
finding.lastupdated = $last_updated
MERGE (resource)-[rel:HAS_FINDING]->(finding)
ON CREATE SET
rel.firstseen = timestamp(),
rel.lastupdated = $last_updated,
rel._module_name = 'cartography:prowler',
rel._module_version = $prowler_version
ON MATCH SET
rel.lastupdated = $last_updated
)
MERGE (resource)-[rel:HAS_FINDING]->(finding)
ON CREATE SET
rel.firstseen = timestamp(),
rel.lastupdated = $last_updated,
rel._module_name = 'cartography:prowler',
rel._module_version = $prowler_version
ON MATCH SET
rel.lastupdated = $last_updated
"""
WITH sum(CASE WHEN resource IS NOT NULL THEN 1 ELSE 0 END) AS merged_count,
sum(CASE WHEN resource IS NULL THEN 1 ELSE 0 END) AS dropped_count
CLEANUP_FINDINGS_TEMPLATE = f"""
MATCH (finding:{PROWLER_FINDING_LABEL})
WHERE finding.lastupdated < $last_updated
WITH finding LIMIT $batch_size
DETACH DELETE finding
RETURN COUNT(finding) AS deleted_findings_count
RETURN merged_count, dropped_count
"""
# Internet queries (used by internet.py)
+62 -12
@@ -55,6 +55,7 @@ exception propagates to Celery.
import logging
import time
from typing import Any
from cartography.config import Config as CartographyConfig
@@ -96,6 +97,19 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
)
attack_paths_scan = db_utils.retrieve_attack_paths_scan(tenant_id, scan_id)
# Idempotency guard: cleanup may have flipped this row to a terminal state
# while the message was still in flight. Bail out before touching state.
if attack_paths_scan and attack_paths_scan.state in (
StateChoices.FAILED,
StateChoices.COMPLETED,
StateChoices.CANCELLED,
):
logger.warning(
f"Attack Paths scan {attack_paths_scan.id} already in terminal "
f"state {attack_paths_scan.state}; skipping execution"
)
return {}
# Checks before starting the scan
if not cartography_ingestion_function:
ingestion_exceptions = {
@@ -113,12 +127,17 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
else:
if not attack_paths_scan:
# Safety net for in-flight messages or direct task invocations; dispatcher normally pre-creates the row.
logger.warning(
f"No Attack Paths Scan found for scan {scan_id} and tenant {tenant_id}, let's create it then"
)
attack_paths_scan = db_utils.create_attack_paths_scan(
tenant_id, scan_id, prowler_api_provider.id
)
if attack_paths_scan and task_id:
db_utils.set_attack_paths_scan_task_id(
tenant_id, attack_paths_scan.id, task_id
)
tmp_database_name = graph_database.get_database_name(
attack_paths_scan.id, temporary=True
@@ -140,8 +159,18 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
)
# Starting the Attack Paths scan
db_utils.starting_attack_paths_scan(
attack_paths_scan, task_id, tenant_cartography_config
if not db_utils.starting_attack_paths_scan(
attack_paths_scan, tenant_cartography_config
):
logger.warning(
f"Attack Paths scan {attack_paths_scan.id} no longer in SCHEDULED state; cleanup likely raced ahead"
)
return {}
scan_t0 = time.perf_counter()
logger.info(
f"Starting Attack Paths scan ({attack_paths_scan.id}) for "
f"{prowler_api_provider.provider.upper()} provider {prowler_api_provider.id}"
)
subgraph_dropped = False
@@ -169,6 +198,7 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 2)
# The real scan, where iterates over cloud services
t0 = time.perf_counter()
ingestion_exceptions = utils.call_within_event_loop(
cartography_ingestion_function,
tmp_neo4j_session,
@@ -177,19 +207,23 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
prowler_sdk_provider,
attack_paths_scan,
)
logger.info(
f"Cartography ingestion completed in {time.perf_counter() - t0:.3f}s "
f"(failed_syncs={len(ingestion_exceptions)})"
)
# Post-processing: Just keeping it to be more Cartography compliant
logger.info(
f"Syncing Cartography ontology for AWS account {prowler_api_provider.uid}"
)
cartography_ontology.run(tmp_neo4j_session, tmp_cartography_config)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 95)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 94)
logger.info(
f"Syncing Cartography analysis for AWS account {prowler_api_provider.uid}"
)
cartography_analysis.run(tmp_neo4j_session, tmp_cartography_config)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 96)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 95)
# Creating Internet node and CAN_ACCESS relationships
logger.info(
@@ -198,14 +232,20 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
internet.analysis(
tmp_neo4j_session, prowler_api_provider, tmp_cartography_config
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 96)
# Adding Prowler Finding nodes and relationships
logger.info(
f"Syncing Prowler analysis for AWS account {prowler_api_provider.uid}"
)
findings.analysis(
t0 = time.perf_counter()
labeled_nodes, findings_loaded = findings.analysis(
tmp_neo4j_session, prowler_api_provider, scan_id, tmp_cartography_config
)
logger.info(
f"Prowler analysis completed in {time.perf_counter() - t0:.3f}s "
f"(findings={findings_loaded}, labeled_nodes={labeled_nodes})"
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 97)
logger.info(
@@ -227,22 +267,33 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
logger.info(f"Deleting existing provider graph in {tenant_database_name}")
db_utils.set_provider_graph_data_ready(attack_paths_scan, False)
provider_gated = True
graph_database.drop_subgraph(
t0 = time.perf_counter()
deleted_nodes = graph_database.drop_subgraph(
database=tenant_database_name,
provider_id=str(prowler_api_provider.id),
)
logger.info(
f"Deleted existing provider graph in {time.perf_counter() - t0:.3f}s "
f"(deleted_nodes={deleted_nodes})"
)
subgraph_dropped = True
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 98)
logger.info(
f"Syncing graph from {tmp_database_name} into {tenant_database_name}"
)
sync.sync_graph(
t0 = time.perf_counter()
sync_result = sync.sync_graph(
source_database=tmp_database_name,
target_database=tenant_database_name,
tenant_id=str(prowler_api_provider.tenant_id),
provider_id=str(prowler_api_provider.id),
)
logger.info(
f"Synced graph in {time.perf_counter() - t0:.3f}s "
f"(nodes={sync_result['nodes']}, relationships={sync_result['relationships']})"
)
sync_completed = True
db_utils.set_graph_data_ready(attack_paths_scan, True)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 99)
@@ -250,17 +301,16 @@ def run(tenant_id: str, scan_id: str, task_id: str) -> dict[str, Any]:
logger.info(f"Clearing Neo4j cache for database {tenant_database_name}")
graph_database.clear_cache(tenant_database_name)
logger.info(
f"Completed Cartography ({attack_paths_scan.id}) for "
f"{prowler_api_provider.provider.upper()} provider {prowler_api_provider.id}"
)
logger.info(f"Dropping temporary Neo4j database {tmp_database_name}")
graph_database.drop_database(tmp_database_name)
db_utils.finish_attack_paths_scan(
attack_paths_scan, StateChoices.COMPLETED, ingestion_exceptions
)
logger.info(
f"Attack Paths scan completed in {time.perf_counter() - scan_t0:.3f}s "
f"(state=completed, failed_syncs={len(ingestion_exceptions)})"
)
return ingestion_exceptions
except Exception as e:
@@ -5,6 +5,8 @@ This module handles syncing graph data from temporary scan databases
to the tenant database, adding provider isolation labels and properties.
"""
import time
from collections import defaultdict
from typing import Any
@@ -81,6 +83,7 @@ def sync_nodes(
Source and target sessions are opened sequentially per batch to avoid
holding two Bolt connections simultaneously for the entire sync duration.
"""
t0 = time.perf_counter()
last_id = -1
total_synced = 0
@@ -117,7 +120,7 @@ def sync_nodes(
total_synced += batch_count
logger.info(
f"Synced {total_synced} nodes from {source_database} to {target_database}"
f"Synced {total_synced} nodes from {source_database} to {target_database} in {time.perf_counter() - t0:.3f}s"
)
return total_synced
@@ -136,6 +139,7 @@ def sync_relationships(
Source and target sessions are opened sequentially per batch to avoid
holding two Bolt connections simultaneously for the entire sync duration.
"""
t0 = time.perf_counter()
last_id = -1
total_synced = 0
@@ -166,7 +170,7 @@ def sync_relationships(
total_synced += batch_count
logger.info(
f"Synced {total_synced} relationships from {source_database} to {target_database}"
f"Synced {total_synced} relationships from {source_database} to {target_database} in {time.perf_counter() - t0:.3f}s"
)
return total_synced
+46 -26
@@ -297,12 +297,15 @@ def backfill_daily_severity_summaries(tenant_id: str, days: int = None):
}
def backfill_scan_category_summaries(tenant_id: str, scan_id: str):
def aggregate_scan_category_summaries(tenant_id: str, scan_id: str):
"""
Backfill ScanCategorySummary for a completed scan.
Aggregates category counts from all findings in the scan and creates
one ScanCategorySummary row per (category, severity) combination.
Idempotent: re-runs replace the scan's existing rows so counts stay in
sync with `Finding.muted` updates triggered outside scan completion
(e.g. mute rules).
Args:
tenant_id: Target tenant UUID
@@ -312,11 +315,6 @@ def backfill_scan_category_summaries(tenant_id: str, scan_id: str):
dict: Status indicating whether backfill was performed
"""
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
if ScanCategorySummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).exists():
return {"status": "already backfilled"}
if not Scan.objects.filter(
tenant_id=tenant_id,
id=scan_id,
@@ -337,9 +335,6 @@ def backfill_scan_category_summaries(tenant_id: str, scan_id: str):
cache=category_counts,
)
if not category_counts:
return {"status": "no categories to backfill"}
category_summaries = [
ScanCategorySummary(
tenant_id=tenant_id,
@@ -353,20 +348,38 @@ def backfill_scan_category_summaries(tenant_id: str, scan_id: str):
for (category, severity), counts in category_counts.items()
]
with rls_transaction(tenant_id):
ScanCategorySummary.objects.bulk_create(
category_summaries, batch_size=500, ignore_conflicts=True
)
if category_summaries:
with rls_transaction(tenant_id):
# Upsert so re-runs (post-mute reaggregation) don't trip
# `unique_category_severity_per_scan`; race-safe under concurrent writers.
ScanCategorySummary.objects.bulk_create(
category_summaries,
batch_size=500,
update_conflicts=True,
unique_fields=["tenant_id", "scan_id", "category", "severity"],
update_fields=[
"total_findings",
"failed_findings",
"new_failed_findings",
],
)
if not category_counts:
return {"status": "no categories to backfill"}
return {"status": "backfilled", "categories_count": len(category_counts)}
def backfill_scan_resource_group_summaries(tenant_id: str, scan_id: str):
def aggregate_scan_resource_group_summaries(tenant_id: str, scan_id: str):
"""
Backfill ScanGroupSummary for a completed scan.
Aggregates resource group counts from all findings in the scan and creates
one ScanGroupSummary row per (resource_group, severity) combination.
Idempotent: re-runs replace the scan's existing rows so counts stay in
sync with `Finding.muted` updates triggered outside scan completion
(e.g. mute rules) and with resource-inventory views reading from this
table.
Args:
tenant_id: Target tenant UUID
@@ -376,11 +389,6 @@ def backfill_scan_resource_group_summaries(tenant_id: str, scan_id: str):
dict: Status indicating whether backfill was performed
"""
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
if ScanGroupSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).exists():
return {"status": "already backfilled"}
if not Scan.objects.filter(
tenant_id=tenant_id,
id=scan_id,
@@ -418,9 +426,6 @@ def backfill_scan_resource_group_summaries(tenant_id: str, scan_id: str):
group_resources_cache=group_resources_cache,
)
if not resource_group_counts:
return {"status": "no resource groups to backfill"}
# Compute group-level resource counts (same value for all severity rows in a group)
group_resource_counts = {
grp: len(uids) for grp, uids in group_resources_cache.items()
@@ -439,10 +444,25 @@ def backfill_scan_resource_group_summaries(tenant_id: str, scan_id: str):
for (grp, severity), counts in resource_group_counts.items()
]
with rls_transaction(tenant_id):
ScanGroupSummary.objects.bulk_create(
resource_group_summaries, batch_size=500, ignore_conflicts=True
)
if resource_group_summaries:
with rls_transaction(tenant_id):
# Upsert so re-runs (post-mute reaggregation) don't trip
# `unique_resource_group_severity_per_scan`; race-safe under concurrent writers.
ScanGroupSummary.objects.bulk_create(
resource_group_summaries,
batch_size=500,
update_conflicts=True,
unique_fields=["tenant_id", "scan_id", "resource_group", "severity"],
update_fields=[
"total_findings",
"failed_findings",
"new_failed_findings",
"resources_count",
],
)
if not resource_group_counts:
return {"status": "no resource groups to backfill"}
return {"status": "backfilled", "resource_groups_count": len(resource_group_counts)}
+11 -3
@@ -32,9 +32,13 @@ from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_github import GithubCIS
from prowler.lib.outputs.compliance.cis.cis_googleworkspace import GoogleWorkspaceCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.cis.cis_oraclecloud import OracleCloudCIS
from prowler.lib.outputs.compliance.cisa_scuba.cisa_scuba_googleworkspace import (
GoogleWorkspaceCISASCuBA,
)
from prowler.lib.outputs.compliance.csa.csa_alibabacloud import AlibabaCloudCSA
from prowler.lib.outputs.compliance.csa.csa_aws import AWSCSA
from prowler.lib.outputs.compliance.csa.csa_azure import AzureCSA
@@ -93,7 +97,7 @@ COMPLIANCE_CLASS_MAP = {
(lambda name: name.startswith("iso27001_"), AWSISO27001),
(lambda name: name.startswith("kisa"), AWSKISAISMSP),
(lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
(lambda name: name == "ccc_aws", CCC_AWS),
(lambda name: name.startswith("ccc_"), CCC_AWS),
(lambda name: name.startswith("c5_"), AWSC5),
(lambda name: name.startswith("csa_"), AWSCSA),
],
@@ -102,7 +106,7 @@ COMPLIANCE_CLASS_MAP = {
(lambda name: name == "mitre_attack_azure", AzureMitreAttack),
(lambda name: name.startswith("ens_"), AzureENS),
(lambda name: name.startswith("iso27001_"), AzureISO27001),
(lambda name: name == "ccc_azure", CCC_Azure),
(lambda name: name.startswith("ccc_"), CCC_Azure),
(lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
(lambda name: name == "c5_azure", AzureC5),
(lambda name: name.startswith("csa_"), AzureCSA),
@@ -113,7 +117,7 @@ COMPLIANCE_CLASS_MAP = {
(lambda name: name.startswith("ens_"), GCPENS),
(lambda name: name.startswith("iso27001_"), GCPISO27001),
(lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
(lambda name: name == "ccc_gcp", CCC_GCP),
(lambda name: name.startswith("ccc_"), CCC_GCP),
(lambda name: name == "c5_gcp", GCPC5),
(lambda name: name.startswith("csa_"), GCPCSA),
],
@@ -133,6 +137,10 @@ COMPLIANCE_CLASS_MAP = {
"github": [
(lambda name: name.startswith("cis_"), GithubCIS),
],
"googleworkspace": [
(lambda name: name.startswith("cis_"), GoogleWorkspaceCIS),
(lambda name: name.startswith("cisa_scuba_"), GoogleWorkspaceCISASCuBA),
],
"iac": [
# IaC provider doesn't have specific compliance frameworks yet
# Trivy handles its own compliance checks
+636 -42
@@ -1,11 +1,19 @@
import gc
import os
import re
import time
from collections.abc import Iterable
from pathlib import Path
from shutil import rmtree
from uuid import UUID
import fcntl
from celery.utils.log import get_task_logger
from config.django.base import DJANGO_TMP_OUTPUT_DIRECTORY
from tasks.jobs.export import _generate_compliance_output_directory, _upload_to_s3
from tasks.jobs.reports import (
FRAMEWORK_REGISTRY,
CISReportGenerator,
CSAReportGenerator,
ENSReportGenerator,
NIS2ReportGenerator,
@@ -14,12 +22,398 @@ from tasks.jobs.reports import (
from tasks.jobs.threatscore import compute_threatscore_metrics
from tasks.jobs.threatscore_utils import _aggregate_requirement_statistics_from_database
from api.db_router import READ_REPLICA_ALIAS
from api.db_router import READ_REPLICA_ALIAS, MainRouter
from api.db_utils import rls_transaction
from api.models import Provider, ScanSummary, ThreatScoreSnapshot
from api.models import Provider, Scan, ScanSummary, StateChoices, ThreatScoreSnapshot
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
STALE_TMP_OUTPUT_MAX_AGE_HOURS = 48
STALE_TMP_OUTPUT_MAX_DELETIONS_PER_RUN = 50
STALE_TMP_OUTPUT_THROTTLE_SECONDS = 60 * 60
STALE_TMP_OUTPUT_LOCK_FILE_NAME = ".stale_tmp_cleanup.lock"
# Refuse to ever run rmtree against shared system roots; the configured
# DJANGO_TMP_OUTPUT_DIRECTORY must be a dedicated subdirectory.
_FORBIDDEN_CLEANUP_ROOTS = frozenset(
Path(p).resolve()
for p in ("/", "/tmp", "/var", "/var/tmp", "/home", "/root", "/etc", "/usr")
)
def _resolve_stale_tmp_safe_root() -> Path | None:
"""Resolve the configured tmp output directory, rejecting unsafe roots."""
try:
configured_root = Path(DJANGO_TMP_OUTPUT_DIRECTORY).resolve()
except OSError:
return None
if configured_root in _FORBIDDEN_CLEANUP_ROOTS:
return None
return configured_root
STALE_TMP_OUTPUT_SAFE_ROOT = _resolve_stale_tmp_safe_root()
# Matches CIS compliance_ids like "cis_1.4_aws", "cis_5.0_azure",
# "cis_1.10_kubernetes", "cis_3.0.1_aws". Requires at least one dotted
# component so malformed inputs like "cis_._aws" or "cis_5._aws" are rejected
# at the regex stage, rather than by a later ValueError fallback.
_CIS_VARIANT_RE = re.compile(r"^cis_(?P<version>\d+(?:\.\d+)+)_(?P<provider>.+)$")
def _pick_latest_cis_variant(compliance_ids: Iterable[str]) -> str | None:
"""Return the CIS compliance_id with the highest semantic version.
CIS ships many variants per provider (e.g. cis_1.4_aws, ..., cis_6.0_aws).
A lexicographic sort is incorrect for version strings like ``1.10`` vs
``1.2``; this helper parses the version into a tuple of ints so ``1.10``
is correctly ordered after ``1.2``. Malformed names are skipped so a
broken JSON cannot crash the whole CIS pipeline.
Args:
compliance_ids: Iterable of CIS compliance identifiers. Expected to
belong to a single provider (callers should pass the already
filtered keys from ``Compliance.get_bulk(provider_type)``).
Returns:
The compliance_id with the highest parsed version, or ``None`` if no
well-formed CIS identifier was found.
"""
best_key: tuple[int, ...] | None = None
best_name: str | None = None
for name in compliance_ids:
match = _CIS_VARIANT_RE.match(name)
if not match:
continue
try:
key = tuple(int(part) for part in match.group("version").split("."))
except ValueError:
# Defensive: the regex already guarantees numeric chunks, but we
# keep the guard so a future regex change cannot crash callers.
continue
if best_key is None or key > best_key:
best_key = key
best_name = name
return best_name
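Concrete behaviour, including the `1.10` vs `1.2` ordering the docstring calls out:

assert _pick_latest_cis_variant(
    ["cis_1.2_aws", "cis_1.10_aws", "cis_6.0_aws"]
) == "cis_6.0_aws"
# Numeric tuple ordering, not lexicographic: 1.10 beats 1.2.
assert _pick_latest_cis_variant(["cis_1.2_aws", "cis_1.10_aws"]) == "cis_1.10_aws"
# Malformed ids are skipped rather than raising.
assert _pick_latest_cis_variant(["cis_._aws"]) is None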
def _should_run_stale_cleanup(
root_path: Path,
throttle_seconds: int = STALE_TMP_OUTPUT_THROTTLE_SECONDS,
) -> bool:
"""Throttle stale cleanup to at most once per hour per host."""
lock_file_path = root_path / STALE_TMP_OUTPUT_LOCK_FILE_NAME
now_timestamp = int(time.time())
try:
with lock_file_path.open("a+", encoding="ascii") as lock_file:
try:
fcntl.flock(lock_file.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
except BlockingIOError:
return False
lock_file.seek(0)
previous_value = lock_file.read().strip()
try:
last_run_timestamp = int(previous_value) if previous_value else 0
except ValueError:
last_run_timestamp = 0
if now_timestamp - last_run_timestamp < throttle_seconds:
return False
lock_file.seek(0)
lock_file.truncate()
lock_file.write(str(now_timestamp))
lock_file.flush()
os.fsync(lock_file.fileno())
except OSError as error:
logger.warning("Skipping stale tmp cleanup: lock file error (%s)", error)
return False
return True
def _is_scan_metadata_protected(
scan_path: Path,
scan_state: str | None,
output_location: str | None,
) -> bool:
"""
Return True when metadata indicates the directory must not be deleted.
Protected cases:
- Scan is still EXECUTING.
- Scan has a local output artifact path (non-S3) under this scan directory.
"""
if scan_state == StateChoices.EXECUTING.value:
return True
output_location = output_location or ""
if output_location and not output_location.startswith("s3://"):
try:
resolved_output_location = Path(output_location).resolve()
except OSError:
# Conservative fallback: if we cannot resolve a local output path,
# keep the directory to avoid deleting potentially needed artifacts.
return True
if (
resolved_output_location == scan_path
or scan_path in resolved_output_location.parents
):
return True
return False
def _is_scan_directory_protected(
tenant_id: str,
scan_id: str,
scan_path: Path,
) -> bool:
"""
DB-backed wrapper used when batch metadata is not already available.
"""
try:
scan_uuid = UUID(scan_id)
except ValueError:
return False
try:
scan = (
Scan.all_objects.using(MainRouter.admin_db)
.filter(tenant_id=tenant_id, id=scan_uuid)
.only("state", "output_location")
.first()
)
except Exception as error:
logger.warning(
"Skipping stale tmp cleanup for %s/%s due to scan lookup error: %s",
tenant_id,
scan_id,
error,
)
return True
if not scan:
return False
return _is_scan_metadata_protected(
scan_path=scan_path,
scan_state=scan.state,
output_location=scan.output_location,
)
def _cleanup_stale_tmp_output_directories(
tmp_output_root: str,
max_age_hours: int = STALE_TMP_OUTPUT_MAX_AGE_HOURS,
exclude_scan: tuple[str, str] | None = None,
max_deletions_per_run: int = STALE_TMP_OUTPUT_MAX_DELETIONS_PER_RUN,
) -> int:
"""
Opportunistically delete stale scan directories under the tmp output root.
Expected directory layout:
<tmp_output_root>/<tenant_id>/<scan_id>/...
Each run that wins the per-host throttle sweeps every tenant directory so
leftover artifacts cannot pile up for tenants whose own tasks happen to
lose the throttle race.
Args:
tmp_output_root: Base tmp output path.
max_age_hours: Directory max age before deletion.
exclude_scan: Optional (tenant_id, scan_id) that must never be deleted.
max_deletions_per_run: Max number of scan directories deleted per run.
Returns:
Number of deleted scan directories.
"""
try:
if max_age_hours <= 0:
return 0
try:
root_path = Path(tmp_output_root).resolve()
except OSError as error:
logger.warning(
"Skipping stale tmp cleanup: unable to resolve %s (%s)",
tmp_output_root,
error,
)
return 0
if (
STALE_TMP_OUTPUT_SAFE_ROOT is None
or root_path != STALE_TMP_OUTPUT_SAFE_ROOT
):
logger.warning(
"Skipping stale tmp cleanup: unsupported root %s (allowed: %s)",
root_path,
STALE_TMP_OUTPUT_SAFE_ROOT,
)
return 0
if not root_path.exists() or not root_path.is_dir():
return 0
if max_deletions_per_run <= 0:
return 0
if not _should_run_stale_cleanup(root_path):
return 0
cutoff_timestamp = time.time() - (max_age_hours * 60 * 60)
deleted_scan_dirs = 0
try:
tenant_dirs = list(root_path.iterdir())
except OSError as error:
logger.warning(
"Skipping stale tmp cleanup: unable to list %s (%s)",
root_path,
error,
)
return 0
for tenant_dir in tenant_dirs:
if deleted_scan_dirs >= max_deletions_per_run:
break
if not tenant_dir.is_dir() or tenant_dir.is_symlink():
continue
try:
scan_dirs = list(tenant_dir.iterdir())
except OSError:
continue
stale_candidates: list[tuple[str, Path, UUID | None]] = []
for scan_dir in scan_dirs:
if not scan_dir.is_dir() or scan_dir.is_symlink():
continue
if exclude_scan and (
tenant_dir.name == exclude_scan[0]
and scan_dir.name == exclude_scan[1]
):
continue
try:
if scan_dir.stat().st_mtime >= cutoff_timestamp:
continue
except OSError:
continue
try:
resolved_scan_dir = scan_dir.resolve()
except OSError:
continue
if root_path not in resolved_scan_dir.parents:
logger.warning(
"Skipping stale tmp cleanup for path outside root: %s",
resolved_scan_dir,
)
continue
try:
scan_uuid: UUID | None = UUID(scan_dir.name)
except ValueError:
scan_uuid = None
stale_candidates.append((scan_dir.name, resolved_scan_dir, scan_uuid))
if not stale_candidates:
continue
scan_metadata_by_id: dict[UUID, tuple[str | None, str | None]] = {}
metadata_preload_succeeded = False
candidate_scan_ids = [
candidate[2] for candidate in stale_candidates if candidate[2]
]
if candidate_scan_ids:
try:
scan_rows = (
Scan.all_objects.using(MainRouter.admin_db)
.filter(
tenant_id=tenant_dir.name,
id__in=candidate_scan_ids,
)
.values_list("id", "state", "output_location")
)
scan_metadata_by_id = {
scan_id: (scan_state, output_location)
for scan_id, scan_state, output_location in scan_rows
}
metadata_preload_succeeded = True
except Exception as error:
logger.warning(
"Skipping stale tmp cleanup metadata preload for tenant %s: %s",
tenant_dir.name,
error,
)
else:
metadata_preload_succeeded = True
for scan_name, resolved_scan_dir, scan_uuid in stale_candidates:
if deleted_scan_dirs >= max_deletions_per_run:
break
should_check_scan_fallback = True
if scan_uuid and metadata_preload_succeeded:
should_check_scan_fallback = False
scan_metadata = scan_metadata_by_id.get(scan_uuid)
if scan_metadata:
scan_state, output_location = scan_metadata
if _is_scan_metadata_protected(
scan_path=resolved_scan_dir,
scan_state=scan_state,
output_location=output_location,
):
continue
if should_check_scan_fallback and _is_scan_directory_protected(
tenant_id=tenant_dir.name,
scan_id=scan_name,
scan_path=resolved_scan_dir,
):
continue
try:
rmtree(resolved_scan_dir, ignore_errors=True)
deleted_scan_dirs += 1
except Exception as error:
logger.warning(
"Error cleaning stale tmp directory %s: %s",
resolved_scan_dir,
error,
)
if deleted_scan_dirs:
logger.info(
"Deleted %s stale tmp output directories older than %sh from %s",
deleted_scan_dirs,
max_age_hours,
root_path,
)
if deleted_scan_dirs >= max_deletions_per_run:
logger.info(
"Stale tmp cleanup hit deletion limit (%s) for root %s",
max_deletions_per_run,
root_path,
)
return deleted_scan_dirs
except Exception as error:
logger.warning(
"Skipping stale tmp cleanup due to unexpected error: %s",
error,
exc_info=True,
)
return 0
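
The age test is a plain mtime comparison against a cutoff timestamp, so it can be exercised without the DB or Celery. A self-contained sketch (directory names and the age value are invented, not the production constants):

import os
import tempfile
import time
from pathlib import Path

root = Path(tempfile.mkdtemp(prefix="tmp_output_"))
fresh = root / "tenant-a" / "scan-fresh"
stale = root / "tenant-a" / "scan-stale"
fresh.mkdir(parents=True)
stale.mkdir(parents=True)

max_age_hours = 6  # illustrative value
cutoff_timestamp = time.time() - (max_age_hours * 60 * 60)

# Backdate one directory past the cutoff, as a crashed scan would appear.
backdated = time.time() - (7 * 60 * 60)
os.utime(stale, (backdated, backdated))

for scan_dir in (fresh, stale):
    is_stale = scan_dir.stat().st_mtime < cutoff_timestamp
    print(scan_dir.name, "-> stale" if is_stale else "-> kept")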
def generate_threatscore_report(
@@ -191,6 +585,53 @@ def generate_csa_report(
)
def generate_cis_report(
tenant_id: str,
scan_id: str,
compliance_id: str,
output_path: str,
provider_id: str,
only_failed: bool = True,
include_manual: bool = False,
provider_obj: Provider | None = None,
requirement_statistics: dict[str, dict[str, int]] | None = None,
findings_cache: dict[str, list[FindingOutput]] | None = None,
) -> None:
"""
Generate a PDF compliance report for a specific CIS Benchmark variant.
Unlike single-version frameworks (ENS, NIS2, CSA), CIS has multiple
variants per provider (e.g., cis_1.4_aws, cis_5.0_aws, cis_6.0_aws). This
wrapper is called once per variant, receiving the specific compliance_id.
Args:
tenant_id: The tenant ID for Row-Level Security context.
scan_id: ID of the scan executed by Prowler.
compliance_id: ID of the specific CIS variant (e.g., "cis_5.0_aws").
output_path: Output PDF file path.
provider_id: Provider ID for the scan.
only_failed: If True, only include failed requirements in detailed section.
include_manual: If True, include manual requirements in detailed section.
provider_obj: Pre-fetched Provider object to avoid duplicate queries.
requirement_statistics: Pre-aggregated requirement statistics.
findings_cache: Cache of already loaded findings to avoid duplicate queries.
"""
generator = CISReportGenerator(FRAMEWORK_REGISTRY["cis"])
generator.generate(
tenant_id=tenant_id,
scan_id=scan_id,
compliance_id=compliance_id,
output_path=output_path,
provider_id=provider_id,
provider_obj=provider_obj,
requirement_statistics=requirement_statistics,
findings_cache=findings_cache,
only_failed=only_failed,
include_manual=include_manual,
)
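
For a sense of the call shape, a hypothetical single-variant invocation; every identifier below is a placeholder:

generate_cis_report(
    tenant_id="00000000-0000-0000-0000-000000000000",
    scan_id="11111111-1111-1111-1111-111111111111",
    compliance_id="cis_5.0_aws",
    output_path="/tmp/out/example_cis_report.pdf",
    provider_id="22222222-2222-2222-2222-222222222222",
    only_failed=True,
    include_manual=False,
)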
def generate_compliance_reports(
tenant_id: str,
scan_id: str,
@@ -199,6 +640,7 @@ def generate_compliance_reports(
generate_ens: bool = True,
generate_nis2: bool = True,
generate_csa: bool = True,
generate_cis: bool = True,
only_failed_threatscore: bool = True,
min_risk_level_threatscore: int = 4,
include_manual_ens: bool = True,
@@ -206,6 +648,8 @@ def generate_compliance_reports(
only_failed_nis2: bool = True,
only_failed_csa: bool = True,
include_manual_csa: bool = False,
only_failed_cis: bool = True,
include_manual_cis: bool = False,
) -> dict[str, dict[str, bool | str]]:
"""
Generate multiple compliance reports with shared database queries.
@@ -215,6 +659,13 @@ def generate_compliance_reports(
- Aggregating requirement statistics once (shared across all reports)
- Reusing compliance framework data when possible
For CIS a single PDF is produced per run: the one matching the highest
available CIS version for the scan's provider (picked dynamically from
``Compliance.get_bulk`` via :func:`_pick_latest_cis_variant`). The
returned ``results["cis"]`` entry has the same flat shape as the other
single-version frameworks; the picked variant is an internal detail,
not surfaced in the result.
Args:
tenant_id: The tenant ID for Row-Level Security context.
scan_id: The ID of the scan to generate reports for.
@@ -223,6 +674,8 @@ def generate_compliance_reports(
generate_ens: Whether to generate ENS report.
generate_nis2: Whether to generate NIS2 report.
generate_csa: Whether to generate CSA CCM report.
generate_cis: Whether to generate a CIS Benchmark report for the
latest CIS version available for the provider.
only_failed_threatscore: For ThreatScore, only include failed requirements.
min_risk_level_threatscore: Minimum risk level for ThreatScore critical requirements.
include_manual_ens: For ENS, include manual requirements.
@@ -230,22 +683,39 @@ def generate_compliance_reports(
only_failed_nis2: For NIS2, only include failed requirements.
only_failed_csa: For CSA CCM, only include failed requirements.
include_manual_csa: For CSA CCM, include manual requirements.
only_failed_cis: For CIS, only include failed requirements in detailed section.
include_manual_cis: For CIS, include manual requirements in detailed section.
Returns:
Dictionary with results for each report type. Every value has the
same flat shape: ``{"upload": bool, "path": str, "error"?: str}``.
"""
logger.info(
"Generating compliance reports for scan %s with provider %s"
" (ThreatScore: %s, ENS: %s, NIS2: %s, CSA: %s)",
" (ThreatScore: %s, ENS: %s, NIS2: %s, CSA: %s, CIS: %s)",
scan_id,
provider_id,
generate_threatscore,
generate_ens,
generate_nis2,
generate_csa,
generate_cis,
)
try:
_cleanup_stale_tmp_output_directories(
DJANGO_TMP_OUTPUT_DIRECTORY,
max_age_hours=STALE_TMP_OUTPUT_MAX_AGE_HOURS,
exclude_scan=(tenant_id, scan_id),
)
except Exception as error:
logger.warning(
"Skipping stale tmp cleanup before compliance reports for scan %s: %s",
scan_id,
error,
)
results: dict = {}
# Validate that the scan has findings and get provider info
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
@@ -259,6 +729,8 @@ def generate_compliance_reports(
results["nis2"] = {"upload": False, "path": ""}
if generate_csa:
results["csa"] = {"upload": False, "path": ""}
if generate_cis:
results["cis"] = {"upload": False, "path": ""}
return results
provider_obj = Provider.objects.get(id=provider_id)
@@ -299,11 +771,39 @@ def generate_compliance_reports(
results["csa"] = {"upload": False, "path": ""}
generate_csa = False
# For CIS we do NOT pre-check the provider against a hard-coded whitelist
# (that list drifts the moment a new CIS JSON ships). Instead, we inspect
# the dynamically loaded framework map and pick the latest available CIS
# version, if any.
latest_cis: str | None = None
if generate_cis:
try:
frameworks_bulk = Compliance.get_bulk(provider_type)
latest_cis = _pick_latest_cis_variant(
name for name in frameworks_bulk.keys() if name.startswith("cis_")
)
except Exception as e:
logger.error("Error discovering CIS variants for %s: %s", provider_type, e)
results["cis"] = {"upload": False, "path": "", "error": str(e)}
generate_cis = False
else:
if latest_cis is None:
logger.info("No CIS variants available for provider %s", provider_type)
results["cis"] = {"upload": False, "path": ""}
generate_cis = False
else:
logger.info(
"Selected latest CIS variant for provider %s: %s",
provider_type,
latest_cis,
)
if (
not generate_threatscore
and not generate_ens
and not generate_nis2
and not generate_csa
and not generate_cis
):
return results
@@ -319,38 +819,56 @@ def generate_compliance_reports(
findings_cache = {}
logger.info("Created shared findings cache for all reports")
generated_report_keys: list[str] = []
output_paths: dict[str, str] = {}
out_dir: str | None = None
# Generate output directories only for enabled and supported report types.
try:
logger.info("Generating output directories")
if generate_threatscore:
output_paths["threatscore"] = _generate_compliance_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY,
provider_uid,
tenant_id,
scan_id,
compliance_framework="threatscore",
)
if generate_ens:
output_paths["ens"] = _generate_compliance_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY,
provider_uid,
tenant_id,
scan_id,
compliance_framework="ens",
)
if generate_nis2:
output_paths["nis2"] = _generate_compliance_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY,
provider_uid,
tenant_id,
scan_id,
compliance_framework="nis2",
)
if generate_csa:
output_paths["csa"] = _generate_compliance_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY,
provider_uid,
tenant_id,
scan_id,
compliance_framework="csa",
)
if generate_cis and latest_cis:
output_paths["cis"] = _generate_compliance_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY,
provider_uid,
tenant_id,
scan_id,
compliance_framework="cis",
)
if output_paths:
first_output_path = next(iter(output_paths.values()))
out_dir = str(Path(first_output_path).parent.parent)
except Exception as e:
logger.error("Error generating output directory: %s", e)
error_dict = {"error": str(e), "upload": False, "path": ""}
@@ -362,10 +880,14 @@ def generate_compliance_reports(
results["nis2"] = error_dict.copy()
if generate_csa:
results["csa"] = error_dict.copy()
if generate_cis:
results["cis"] = error_dict.copy()
return results
# Generate ThreatScore report
if generate_threatscore:
generated_report_keys.append("threatscore")
threatscore_path = output_paths["threatscore"]
compliance_id_threatscore = f"prowler_threatscore_{provider_type}"
pdf_path_threatscore = f"{threatscore_path}_threatscore_report.pdf"
logger.info(
@@ -467,6 +989,8 @@ def generate_compliance_reports(
# Generate ENS report
if generate_ens:
generated_report_keys.append("ens")
ens_path = output_paths["ens"]
compliance_id_ens = f"ens_rd2022_{provider_type}"
pdf_path_ens = f"{ens_path}_ens_report.pdf"
logger.info("Generating ENS report with compliance %s", compliance_id_ens)
@@ -501,6 +1025,8 @@ def generate_compliance_reports(
# Generate NIS2 report
if generate_nis2:
generated_report_keys.append("nis2")
nis2_path = output_paths["nis2"]
compliance_id_nis2 = f"nis2_{provider_type}"
pdf_path_nis2 = f"{nis2_path}_nis2_report.pdf"
logger.info("Generating NIS2 report with compliance %s", compliance_id_nis2)
@@ -536,6 +1062,8 @@ def generate_compliance_reports(
# Generate CSA CCM report
if generate_csa:
generated_report_keys.append("csa")
csa_path = output_paths["csa"]
compliance_id_csa = f"csa_ccm_4.0_{provider_type}"
pdf_path_csa = f"{csa_path}_csa_report.pdf"
logger.info("Generating CSA CCM report with compliance %s", compliance_id_csa)
@@ -569,14 +1097,75 @@ def generate_compliance_reports(
logger.error("Error generating CSA CCM report: %s", e)
results["csa"] = {"upload": False, "path": "", "error": str(e)}
# Generate CIS Benchmark report for the latest available version only.
# CIS ships multiple versions per provider (e.g. cis_1.4_aws, cis_5.0_aws,
# cis_6.0_aws); we dynamically pick the highest semantic version at run
# time rather than hard-coding a per-provider mapping.
if generate_cis and latest_cis:
generated_report_keys.append("cis")
cis_path = output_paths["cis"]
if out_dir is None:
out_dir = str(Path(cis_path).parent.parent)
pdf_path_cis = f"{cis_path}_cis_report.pdf"
try:
generate_cis_report(
tenant_id=tenant_id,
scan_id=scan_id,
compliance_id=latest_cis,
output_path=pdf_path_cis,
provider_id=provider_id,
only_failed=only_failed_cis,
include_manual=include_manual_cis,
provider_obj=provider_obj,
requirement_statistics=requirement_statistics,
findings_cache=findings_cache,
)
upload_uri_cis = _upload_to_s3(
tenant_id,
scan_id,
pdf_path_cis,
f"cis/{Path(pdf_path_cis).name}",
)
if upload_uri_cis:
results["cis"] = {
"upload": True,
"path": upload_uri_cis,
}
logger.info(
"CIS report %s uploaded to %s",
latest_cis,
upload_uri_cis,
)
else:
results["cis"] = {"upload": False, "path": out_dir}
logger.warning(
"CIS report %s saved locally at %s",
latest_cis,
out_dir,
)
except Exception as e:
logger.error("Error generating CIS report %s: %s", latest_cis, e)
results["cis"] = {
"upload": False,
"path": "",
"error": str(e),
}
finally:
# Free ReportLab/matplotlib memory before moving on.
gc.collect()
# Clean up temporary files only if all generated reports were
# uploaded successfully. Reports skipped for provider incompatibility
# or missing CIS variants must not block cleanup.
all_uploaded = bool(generated_report_keys) and all(
results.get(report_key, {}).get("upload", False)
for report_key in generated_report_keys
)
if all_uploaded and out_dir:
try:
rmtree(Path(out_dir), ignore_errors=True)
logger.info("Cleaned up temporary files at %s", out_dir)
@@ -595,6 +1184,7 @@ def generate_compliance_reports_job(
generate_ens: bool = True,
generate_nis2: bool = True,
generate_csa: bool = True,
generate_cis: bool = True,
) -> dict[str, dict[str, bool | str]]:
"""
Celery task wrapper for generate_compliance_reports.
@@ -607,9 +1197,12 @@ def generate_compliance_reports_job(
generate_ens: Whether to generate ENS report.
generate_nis2: Whether to generate NIS2 report.
generate_csa: Whether to generate CSA CCM report.
generate_cis: Whether to generate the CIS Benchmark report for the
latest CIS version available for the provider.
Returns:
Dictionary with results for each report type. Every entry shares the
same flat ``{"upload", "path", "error"?}`` shape.
"""
return generate_compliance_reports(
tenant_id=tenant_id,
@@ -619,4 +1212,5 @@ def generate_compliance_reports_job(
generate_ens=generate_ens,
generate_nis2=generate_nis2,
generate_csa=generate_csa,
generate_cis=generate_cis,
)
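
The documented return contract means callers can treat every framework uniformly. An illustrative payload (values invented):

results = {
    "threatscore": {"upload": True, "path": "s3://bucket/output/threatscore.pdf"},
    "ens": {"upload": True, "path": "s3://bucket/output/ens.pdf"},
    "cis": {"upload": False, "path": "", "error": "No CIS variants available"},
}
failed = {name for name, entry in results.items() if not entry["upload"]}
print(failed)  # {'cis'}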
@@ -17,6 +17,9 @@ from .charts import (
get_chart_color_for_percentage,
)
# Framework-specific generators
from .cis import CISReportGenerator
# Reusable components: Color helpers, Badge components, Risk component,
# Table components, Section components
@@ -31,10 +34,12 @@ from .components import (
create_section_header,
create_status_badge,
create_summary_table,
escape_html,
get_color_for_compliance,
get_color_for_risk_level,
get_color_for_weight,
get_status_color,
truncate_text,
)
# Framework configuration: Main configuration, Color constants, ENS colors,
@@ -90,8 +95,6 @@ from .config import (
FrameworkConfig,
get_framework_config,
)
# Framework-specific generators
from .csa import CSAReportGenerator
from .ens import ENSReportGenerator
from .nis2 import NIS2ReportGenerator
@@ -109,6 +112,7 @@ __all__ = [
"ENSReportGenerator",
"NIS2ReportGenerator",
"CSAReportGenerator",
"CISReportGenerator",
# Configuration
"FrameworkConfig",
"FRAMEWORK_REGISTRY",
@@ -182,6 +186,9 @@ __all__ = [
# Section components
"create_section_header",
"create_summary_table",
# Text helpers
"truncate_text",
"escape_html",
# Chart functions
"get_chart_color_for_percentage",
"create_vertical_bar_chart",
@@ -0,0 +1,755 @@
import os
import re
from collections import defaultdict
from typing import Any
from reportlab.lib.units import inch
from reportlab.platypus import Image, PageBreak, Paragraph, Spacer, Table, TableStyle
from api.models import StatusChoices
from .base import (
BaseComplianceReportGenerator,
ComplianceData,
RequirementData,
get_requirement_metadata,
)
from .charts import (
create_horizontal_bar_chart,
create_pie_chart,
create_stacked_bar_chart,
get_chart_color_for_percentage,
)
from .components import ColumnConfig, create_data_table, escape_html, truncate_text
from .config import (
CHART_COLOR_GREEN_1,
CHART_COLOR_RED,
CHART_COLOR_YELLOW,
COLOR_BG_BLUE,
COLOR_BLUE,
COLOR_BORDER_GRAY,
COLOR_DARK_GRAY,
COLOR_GRAY,
COLOR_GRID_GRAY,
COLOR_HIGH_RISK,
COLOR_LIGHT_BLUE,
COLOR_SAFE,
COLOR_WHITE,
)
# Ordered buckets used both in the executive summary tables and the charts
# section. Exposed as module constants so the two call sites never drift.
_PROFILE_BUCKET_ORDER: tuple[str, ...] = ("L1", "L2", "Other")
_ASSESSMENT_BUCKET_ORDER: tuple[str, ...] = ("Automated", "Manual")
# Anchored matchers for profile normalization — substring checks on "L1"/"L2"
# would happily match unrelated tokens like "CL2 Worker" or "HL2" coming from
# future CIS profile enum values.
_LEVEL_2_RE = re.compile(r"(?:\bLevel\s*2\b|\bL2\b|Level_2)")
_LEVEL_1_RE = re.compile(r"(?:\bLevel\s*1\b|\bL1\b|Level_1)")
def _normalize_profile(profile: Any) -> str:
"""Bucket a CIS Profile enum/string into one of: ``L1``, ``L2``, ``Other``.
The ``CIS_Requirement_Attribute_Profile`` enum has values like
``"Level 1"``, ``"Level 2"``, ``"E3 Level 1"``, ``"E5 Level 2"``. We
collapse them into three buckets to keep charts and badges readable
across CIS variants, using anchored regex matches so that future enum
values cannot accidentally promote e.g. ``"CL2 Worker"`` into ``L2``.
Args:
profile: The profile value (enum member, string, or ``None``).
Returns:
One of ``"L1"``, ``"L2"``, ``"Other"``.
"""
if profile is None:
return "Other"
value = getattr(profile, "value", None) or str(profile)
if _LEVEL_2_RE.search(value):
return "L2"
if _LEVEL_1_RE.search(value):
return "L1"
return "Other"
def _profile_badge_text(bucket: str) -> str:
"""Map a normalized profile bucket (L1/L2/Other) to a short badge label."""
return {"L1": "Level 1", "L2": "Level 2"}.get(bucket, "Other")
# =============================================================================
# CIS Report Generator
# =============================================================================
class CISReportGenerator(BaseComplianceReportGenerator):
"""
PDF report generator for CIS (Center for Internet Security) Benchmarks.
CIS differs from single-version frameworks (ENS, NIS2, CSA) in that:
- Each provider has multiple CIS versions (e.g. AWS: 1.4, 1.5, ..., 6.0).
- Section names differ across versions and providers and MUST be derived
at runtime from the loaded compliance data.
- Requirements carry Profile (Level 1/Level 2) and AssessmentStatus
(Automated/Manual) attributes that drive the executive summary and
charts.
This generator produces:
- Cover page with Prowler logo and dynamic CIS version/provider metadata
- Executive summary with overall compliance score, counts, and breakdowns
by Profile and AssessmentStatus
- Charts: overall status pie, pass rate by section (horizontal bar),
Level 1 vs Level 2 pass/fail distribution (stacked bar)
- Requirements index grouped by dynamic section
- Detailed findings for FAIL requirements with CIS-specific audit /
remediation / rationale details
"""
# Per-run memoization cache for ``_compute_statistics``. ``generate()``
# is the public entry point and is called once per PDF, so scoping the
# cache to the last seen ComplianceData instance is enough to avoid the
# double computation between executive summary and charts section.
_stats_cache_key: int | None = None
_stats_cache_value: dict | None = None
# Body section ordering — ensure every top-level section starts on its
# own clean page. The base class only puts a PageBreak AFTER Charts and
# Requirements Index, so Executive Summary and Charts end up sharing a
# page. This override prepends a PageBreak so Compliance Analysis always
# begins on a fresh page.
def _build_body_sections(self, data: ComplianceData) -> list:
return [PageBreak(), *super()._build_body_sections(data)]
# -------------------------------------------------------------------------
# Cover page override — shows dynamic CIS version + provider in the title
# -------------------------------------------------------------------------
def create_cover_page(self, data: ComplianceData) -> list:
"""Create the CIS report cover page with Prowler + CIS logos side by side."""
elements = []
# Create logos side by side (same pattern as NIS2 / ENS)
prowler_logo_path = os.path.join(
os.path.dirname(__file__), "../../assets/img/prowler_logo.png"
)
cis_logo_path = os.path.join(
os.path.dirname(__file__), "../../assets/img/cis_logo.png"
)
if os.path.exists(cis_logo_path):
prowler_logo = Image(prowler_logo_path, width=3.5 * inch, height=0.7 * inch)
cis_logo = Image(cis_logo_path, width=2.3 * inch, height=1.1 * inch)
logos_table = Table(
[[prowler_logo, cis_logo]], colWidths=[4 * inch, 2.5 * inch]
)
logos_table.setStyle(
TableStyle(
[
("ALIGN", (0, 0), (0, 0), "LEFT"),
("ALIGN", (1, 0), (1, 0), "RIGHT"),
("VALIGN", (0, 0), (0, 0), "MIDDLE"),
("VALIGN", (1, 0), (1, 0), "MIDDLE"),
]
)
)
elements.append(logos_table)
elif os.path.exists(prowler_logo_path):
# Fallback: only the Prowler logo if the CIS asset is missing
elements.append(Image(prowler_logo_path, width=5 * inch, height=1 * inch))
elements.append(Spacer(1, 0.5 * inch))
# Dynamic title: "CIS Benchmark v5.0 — AWS Compliance Report"
provider_label = ""
if data.provider_obj:
provider_label = f"{data.provider_obj.provider.upper()}"
title_text = (
f"CIS Benchmark v{data.version}{provider_label}<br/>Compliance Report"
)
elements.append(Paragraph(title_text, self.styles["title"]))
elements.append(Spacer(1, 0.5 * inch))
# Metadata table via base class helper
info_rows = self._build_info_rows(data, language=self.config.language)
metadata_data = []
for label, value in info_rows:
if label in ("Name:", "Description:") and value:
metadata_data.append(
[label, Paragraph(str(value), self.styles["normal_center"])]
)
else:
metadata_data.append([label, value])
metadata_table = Table(metadata_data, colWidths=[2 * inch, 4 * inch])
metadata_table.setStyle(
TableStyle(
[
("BACKGROUND", (0, 0), (0, -1), COLOR_BLUE),
("TEXTCOLOR", (0, 0), (0, -1), COLOR_WHITE),
("FONTNAME", (0, 0), (0, -1), "FiraCode"),
("BACKGROUND", (1, 0), (1, -1), COLOR_BG_BLUE),
("TEXTCOLOR", (1, 0), (1, -1), COLOR_GRAY),
("FONTNAME", (1, 0), (1, -1), "PlusJakartaSans"),
("ALIGN", (0, 0), (-1, -1), "LEFT"),
("VALIGN", (0, 0), (-1, -1), "TOP"),
("FONTSIZE", (0, 0), (-1, -1), 11),
("GRID", (0, 0), (-1, -1), 1, COLOR_BORDER_GRAY),
("LEFTPADDING", (0, 0), (-1, -1), 10),
("RIGHTPADDING", (0, 0), (-1, -1), 10),
("TOPPADDING", (0, 0), (-1, -1), 8),
("BOTTOMPADDING", (0, 0), (-1, -1), 8),
]
)
)
elements.append(metadata_table)
return elements
# -------------------------------------------------------------------------
# Executive Summary
# -------------------------------------------------------------------------
def create_executive_summary(self, data: ComplianceData) -> list:
"""Create the CIS executive summary section."""
elements = []
elements.append(Paragraph("Executive Summary", self.styles["h1"]))
elements.append(Spacer(1, 0.1 * inch))
stats = self._compute_statistics(data)
# --- Summary metrics table ---
summary_data = [
["Metric", "Value"],
["Total Requirements", str(stats["total"])],
["Passed", str(stats["passed"])],
["Failed", str(stats["failed"])],
["Manual", str(stats["manual"])],
["Overall Compliance", f"{stats['overall_compliance']:.1f}%"],
]
summary_table = Table(summary_data, colWidths=[3 * inch, 2 * inch])
summary_table.setStyle(
TableStyle(
[
("BACKGROUND", (0, 0), (-1, 0), COLOR_BLUE),
("TEXTCOLOR", (0, 0), (-1, 0), COLOR_WHITE),
("BACKGROUND", (0, 2), (0, 2), COLOR_SAFE),
("TEXTCOLOR", (0, 2), (0, 2), COLOR_WHITE),
("BACKGROUND", (0, 3), (0, 3), COLOR_HIGH_RISK),
("TEXTCOLOR", (0, 3), (0, 3), COLOR_WHITE),
("BACKGROUND", (0, 4), (0, 4), COLOR_DARK_GRAY),
("TEXTCOLOR", (0, 4), (0, 4), COLOR_WHITE),
("FONTNAME", (0, 0), (-1, 0), "PlusJakartaSans"),
("FONTSIZE", (0, 0), (-1, 0), 12),
("FONTSIZE", (0, 1), (-1, -1), 10),
("ALIGN", (0, 0), (-1, -1), "CENTER"),
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("GRID", (0, 0), (-1, -1), 0.5, COLOR_BORDER_GRAY),
("BOTTOMPADDING", (0, 0), (-1, 0), 10),
(
"ROWBACKGROUNDS",
(1, 1),
(1, -1),
[COLOR_WHITE, COLOR_BG_BLUE],
),
]
)
)
elements.append(summary_table)
elements.append(Spacer(1, 0.25 * inch))
# --- Profile breakdown table ---
elements.append(Paragraph("Breakdown by Profile", self.styles["h2"]))
elements.append(Spacer(1, 0.1 * inch))
profile_counts = stats["profile_counts"]
profile_table_data = [["Profile", "Passed", "Failed", "Manual", "Total"]]
for bucket in _PROFILE_BUCKET_ORDER:
counts = profile_counts.get(bucket, {"passed": 0, "failed": 0, "manual": 0})
total = counts["passed"] + counts["failed"] + counts["manual"]
if total == 0:
continue
profile_table_data.append(
[
_profile_badge_text(bucket),
str(counts["passed"]),
str(counts["failed"]),
str(counts["manual"]),
str(total),
]
)
profile_table = Table(
profile_table_data,
colWidths=[1.5 * inch, 1 * inch, 1 * inch, 1 * inch, 1 * inch],
)
profile_table.setStyle(
TableStyle(
[
("BACKGROUND", (0, 0), (-1, 0), COLOR_BLUE),
("TEXTCOLOR", (0, 0), (-1, 0), COLOR_WHITE),
("FONTNAME", (0, 0), (-1, 0), "FiraCode"),
("FONTSIZE", (0, 0), (-1, 0), 10),
("ALIGN", (0, 0), (-1, -1), "CENTER"),
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("FONTSIZE", (0, 1), (-1, -1), 9),
("GRID", (0, 0), (-1, -1), 0.5, COLOR_GRID_GRAY),
(
"ROWBACKGROUNDS",
(0, 1),
(-1, -1),
[COLOR_WHITE, COLOR_BG_BLUE],
),
]
)
)
elements.append(profile_table)
elements.append(Spacer(1, 0.25 * inch))
# --- Assessment status breakdown ---
elements.append(Paragraph("Breakdown by Assessment Status", self.styles["h2"]))
elements.append(Spacer(1, 0.1 * inch))
assessment_counts = stats["assessment_counts"]
assessment_table_data = [["Assessment", "Passed", "Failed", "Manual", "Total"]]
for bucket in _ASSESSMENT_BUCKET_ORDER:
counts = assessment_counts.get(
bucket, {"passed": 0, "failed": 0, "manual": 0}
)
total = counts["passed"] + counts["failed"] + counts["manual"]
if total == 0:
continue
assessment_table_data.append(
[
bucket,
str(counts["passed"]),
str(counts["failed"]),
str(counts["manual"]),
str(total),
]
)
assessment_table = Table(
assessment_table_data,
colWidths=[1.5 * inch, 1 * inch, 1 * inch, 1 * inch, 1 * inch],
)
assessment_table.setStyle(
TableStyle(
[
("BACKGROUND", (0, 0), (-1, 0), COLOR_LIGHT_BLUE),
("TEXTCOLOR", (0, 0), (-1, 0), COLOR_WHITE),
("FONTNAME", (0, 0), (-1, 0), "FiraCode"),
("FONTSIZE", (0, 0), (-1, 0), 10),
("ALIGN", (0, 0), (-1, -1), "CENTER"),
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("FONTSIZE", (0, 1), (-1, -1), 9),
("GRID", (0, 0), (-1, -1), 0.5, COLOR_GRID_GRAY),
(
"ROWBACKGROUNDS",
(0, 1),
(-1, -1),
[COLOR_WHITE, COLOR_BG_BLUE],
),
]
)
)
elements.append(assessment_table)
elements.append(Spacer(1, 0.25 * inch))
# --- Top 5 failing sections ---
top_failing = stats["top_failing_sections"]
if top_failing:
elements.append(
Paragraph("Top Sections with Lowest Compliance", self.styles["h2"])
)
elements.append(Spacer(1, 0.1 * inch))
top_table_data = [["Section", "Passed", "Failed", "Compliance"]]
for section_label, section_stats in top_failing:
passed = section_stats["passed"]
failed = section_stats["failed"]
total = passed + failed
pct = (passed / total * 100) if total > 0 else 100
top_table_data.append(
[
truncate_text(section_label, 55),
str(passed),
str(failed),
f"{pct:.1f}%",
]
)
top_table = Table(
top_table_data,
colWidths=[3.5 * inch, 0.9 * inch, 0.9 * inch, 1.2 * inch],
)
top_table.setStyle(
TableStyle(
[
("BACKGROUND", (0, 0), (-1, 0), COLOR_HIGH_RISK),
("TEXTCOLOR", (0, 0), (-1, 0), COLOR_WHITE),
("FONTNAME", (0, 0), (-1, 0), "FiraCode"),
("FONTSIZE", (0, 0), (-1, 0), 10),
("ALIGN", (0, 0), (-1, -1), "CENTER"),
("VALIGN", (0, 0), (-1, -1), "MIDDLE"),
("FONTSIZE", (0, 1), (-1, -1), 9),
("GRID", (0, 0), (-1, -1), 0.5, COLOR_GRID_GRAY),
(
"ROWBACKGROUNDS",
(0, 1),
(-1, -1),
[COLOR_WHITE, COLOR_BG_BLUE],
),
]
)
)
elements.append(top_table)
return elements
# -------------------------------------------------------------------------
# Charts section
# -------------------------------------------------------------------------
def create_charts_section(self, data: ComplianceData) -> list:
"""Create the CIS charts section."""
elements = []
elements.append(Paragraph("Compliance Analysis", self.styles["h1"]))
elements.append(Spacer(1, 0.1 * inch))
# --- Pie chart: overall Pass / Fail / Manual ---
stats = self._compute_statistics(data)
pie_labels = []
pie_values = []
pie_colors = []
if stats["passed"] > 0:
pie_labels.append(f"Pass ({stats['passed']})")
pie_values.append(stats["passed"])
pie_colors.append(CHART_COLOR_GREEN_1)
if stats["failed"] > 0:
pie_labels.append(f"Fail ({stats['failed']})")
pie_values.append(stats["failed"])
pie_colors.append(CHART_COLOR_RED)
if stats["manual"] > 0:
pie_labels.append(f"Manual ({stats['manual']})")
pie_values.append(stats["manual"])
pie_colors.append(CHART_COLOR_YELLOW)
if pie_values:
elements.append(Paragraph("Overall Status Distribution", self.styles["h2"]))
elements.append(Spacer(1, 0.1 * inch))
pie_buffer = create_pie_chart(
labels=pie_labels,
values=pie_values,
colors=pie_colors,
)
pie_buffer.seek(0)
elements.append(Image(pie_buffer, width=4.5 * inch, height=4.5 * inch))
elements.append(Spacer(1, 0.2 * inch))
# --- Horizontal bar: pass rate by section ---
section_stats = stats["section_stats"]
if section_stats:
elements.append(PageBreak())
elements.append(Paragraph("Compliance by Section", self.styles["h1"]))
elements.append(Spacer(1, 0.1 * inch))
elements.append(
Paragraph(
"The following chart shows compliance percentage for each CIS "
"section based on automated checks:",
self.styles["normal_center"],
)
)
elements.append(Spacer(1, 0.1 * inch))
# Sort sections by pass rate descending for readability
sorted_sections = sorted(
section_stats.items(),
key=lambda item: (
(item[1]["passed"] / (item[1]["passed"] + item[1]["failed"]) * 100)
if (item[1]["passed"] + item[1]["failed"]) > 0
else 100
),
reverse=True,
)
bar_labels = []
bar_values = []
for section_label, section_data in sorted_sections:
total = section_data["passed"] + section_data["failed"]
if total == 0:
continue
pct = (section_data["passed"] / total) * 100
bar_labels.append(truncate_text(section_label, 60))
bar_values.append(pct)
if bar_values:
bar_buffer = create_horizontal_bar_chart(
labels=bar_labels,
values=bar_values,
xlabel="Compliance (%)",
color_func=get_chart_color_for_percentage,
label_fontsize=9,
)
bar_buffer.seek(0)
elements.append(Image(bar_buffer, width=6.5 * inch, height=5 * inch))
# --- Stacked bar: Level 1 vs Level 2 pass/fail ---
profile_counts = stats["profile_counts"]
has_profile_data = any(
(counts["passed"] + counts["failed"]) > 0
for counts in profile_counts.values()
)
if has_profile_data:
elements.append(PageBreak())
elements.append(Paragraph("Profile Breakdown", self.styles["h1"]))
elements.append(Spacer(1, 0.1 * inch))
elements.append(
Paragraph(
"Distribution of Pass / Fail / Manual across CIS profile levels.",
self.styles["normal_center"],
)
)
elements.append(Spacer(1, 0.1 * inch))
profile_labels = []
pass_series = []
fail_series = []
manual_series = []
for bucket in _PROFILE_BUCKET_ORDER:
counts = profile_counts.get(bucket)
if not counts:
continue
total = counts["passed"] + counts["failed"] + counts["manual"]
if total == 0:
continue
profile_labels.append(_profile_badge_text(bucket))
pass_series.append(counts["passed"])
fail_series.append(counts["failed"])
manual_series.append(counts["manual"])
if profile_labels:
stacked_buffer = create_stacked_bar_chart(
labels=profile_labels,
data_series={
"Pass": pass_series,
"Fail": fail_series,
"Manual": manual_series,
},
xlabel="Profile",
ylabel="Requirements",
)
stacked_buffer.seek(0)
elements.append(Image(stacked_buffer, width=6 * inch, height=4 * inch))
return elements
# -------------------------------------------------------------------------
# Requirements Index
# -------------------------------------------------------------------------
def create_requirements_index(self, data: ComplianceData) -> list:
"""Create the CIS requirements index grouped by dynamic section."""
elements = []
elements.append(Paragraph("Requirements Index", self.styles["h1"]))
elements.append(Spacer(1, 0.1 * inch))
sections = self._derive_sections(data)
by_section: dict[str, list[dict]] = defaultdict(list)
for req in data.requirements:
meta = get_requirement_metadata(req.id, data.attributes_by_requirement_id)
section = "Other"
profile_bucket = "Other"
assessment = ""
if meta:
section = getattr(meta, "Section", "Other") or "Other"
profile_bucket = _normalize_profile(getattr(meta, "Profile", None))
assessment_enum = getattr(meta, "AssessmentStatus", None)
assessment = getattr(assessment_enum, "value", None) or str(
assessment_enum or ""
)
by_section[section].append(
{
"id": req.id,
"description": truncate_text(req.description, 80),
"profile": _profile_badge_text(profile_bucket),
"assessment": assessment or "-",
"status": (req.status or "").upper(),
}
)
columns = [
ColumnConfig("ID", 0.9 * inch, "id", align="LEFT"),
ColumnConfig("Description", 3.0 * inch, "description", align="LEFT"),
ColumnConfig("Profile", 0.9 * inch, "profile"),
ColumnConfig("Assessment", 1 * inch, "assessment"),
ColumnConfig("Status", 0.9 * inch, "status"),
]
for section in sections:
rows = by_section.get(section, [])
if not rows:
continue
elements.append(Paragraph(truncate_text(section, 90), self.styles["h2"]))
elements.append(Spacer(1, 0.05 * inch))
table = create_data_table(
data=rows,
columns=columns,
header_color=self.config.primary_color,
normal_style=self.styles["normal_center"],
)
elements.append(table)
elements.append(Spacer(1, 0.15 * inch))
return elements
# -------------------------------------------------------------------------
# Detailed findings hook — inject CIS-specific rationale / audit content
# -------------------------------------------------------------------------
def _render_requirement_detail_extras(
self, req: RequirementData, data: ComplianceData
) -> list:
"""Render CIS rationale, impact, audit, remediation and references."""
extras = []
meta = get_requirement_metadata(req.id, data.attributes_by_requirement_id)
if meta is None:
return extras
field_map = [
("Rationale", "RationaleStatement"),
("Impact", "ImpactStatement"),
("Audit Procedure", "AuditProcedure"),
("Remediation", "RemediationProcedure"),
("References", "References"),
]
for label, attr_name in field_map:
value = getattr(meta, attr_name, None)
if not value:
continue
text = str(value).strip()
if not text:
continue
extras.append(Paragraph(f"<b>{label}:</b>", self.styles["h3"]))
extras.append(Paragraph(escape_html(text), self.styles["normal"]))
extras.append(Spacer(1, 0.08 * inch))
return extras
# -------------------------------------------------------------------------
# Private helpers
# -------------------------------------------------------------------------
def _derive_sections(self, data: ComplianceData) -> list[str]:
"""Extract ordered unique Section names from loaded compliance data."""
seen: dict[str, bool] = {}
for req in data.requirements:
meta = get_requirement_metadata(req.id, data.attributes_by_requirement_id)
if meta is None:
continue
section = getattr(meta, "Section", None) or "Other"
if section not in seen:
seen[section] = True
return list(seen.keys())
def _compute_statistics(self, data: ComplianceData) -> dict:
"""Aggregate all statistics needed for summary and charts.
Memoized per-``ComplianceData`` instance via ``_stats_cache_*``: the
executive summary and the charts section both need the same numbers,
so without the cache the requirement list would be iterated twice. We key on
``id(data)`` because ``ComplianceData`` is a dataclass and its
instances are not hashable.
Returns a dict with:
- total, passed, failed, manual: int
- overall_compliance: float (percentage)
- profile_counts: {"L1": {"passed", "failed", "manual"}, ...}
- assessment_counts: {"Automated": {...}, "Manual": {...}}
- section_stats: {section_name: {"passed", "failed", "manual"}, ...}
- top_failing_sections: list[(section_name, stats)] (up to 5)
"""
cache_key = id(data)
if self._stats_cache_key == cache_key and self._stats_cache_value is not None:
return self._stats_cache_value
stats = self._compute_statistics_uncached(data)
self._stats_cache_key = cache_key
self._stats_cache_value = stats
return stats
def _compute_statistics_uncached(self, data: ComplianceData) -> dict:
"""Actual aggregation kernel; call ``_compute_statistics`` instead."""
total = len(data.requirements)
passed = sum(1 for r in data.requirements if r.status == StatusChoices.PASS)
failed = sum(1 for r in data.requirements if r.status == StatusChoices.FAIL)
manual = sum(1 for r in data.requirements if r.status == StatusChoices.MANUAL)
evaluated = passed + failed
overall_compliance = (passed / evaluated * 100) if evaluated > 0 else 100.0
profile_counts: dict[str, dict[str, int]] = {
"L1": {"passed": 0, "failed": 0, "manual": 0},
"L2": {"passed": 0, "failed": 0, "manual": 0},
"Other": {"passed": 0, "failed": 0, "manual": 0},
}
assessment_counts: dict[str, dict[str, int]] = {
"Automated": {"passed": 0, "failed": 0, "manual": 0},
"Manual": {"passed": 0, "failed": 0, "manual": 0},
}
section_stats: dict[str, dict[str, int]] = defaultdict(
lambda: {"passed": 0, "failed": 0, "manual": 0}
)
for req in data.requirements:
meta = get_requirement_metadata(req.id, data.attributes_by_requirement_id)
if meta is None:
continue
profile_bucket = _normalize_profile(getattr(meta, "Profile", None))
assessment_enum = getattr(meta, "AssessmentStatus", None)
assessment_value = getattr(assessment_enum, "value", None) or str(
assessment_enum or ""
)
assessment_bucket = (
"Automated" if assessment_value == "Automated" else "Manual"
)
section = getattr(meta, "Section", None) or "Other"
status_key = {
StatusChoices.PASS: "passed",
StatusChoices.FAIL: "failed",
StatusChoices.MANUAL: "manual",
}.get(req.status)
if status_key is None:
continue
profile_counts[profile_bucket][status_key] += 1
assessment_counts[assessment_bucket][status_key] += 1
section_stats[section][status_key] += 1
# Top 5 sections with lowest pass rate (only sections with evaluated reqs)
def _section_rate(item):
_, stats_ = item
evaluated_ = stats_["passed"] + stats_["failed"]
if evaluated_ == 0:
return 101 # sort evaluated=0 to the bottom
return stats_["passed"] / evaluated_ * 100
top_failing_sections = sorted(
(
item
for item in section_stats.items()
if (item[1]["passed"] + item[1]["failed"]) > 0
),
key=_section_rate,
)[:5]
return {
"total": total,
"passed": passed,
"failed": failed,
"manual": manual,
"overall_compliance": overall_compliance,
"profile_counts": profile_counts,
"assessment_counts": assessment_counts,
"section_stats": dict(section_stats),
"top_failing_sections": top_failing_sections,
}
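
The id()-keyed, single-slot memoization used by _compute_statistics generalizes to any unhashable payload that lives for the duration of one generate() call. A standalone sketch (all names hypothetical):

class _SingleSlotMemo:
    _key: int | None = None
    _value: dict | None = None

    def compute(self, data: list) -> dict:
        # One-entry cache keyed on object identity: safe because the
        # payload outlives every lookup within a single run.
        if self._key == id(data) and self._value is not None:
            return self._value
        value = {"total": len(data)}  # stand-in for the real aggregation
        self._key, self._value = id(data), value
        return value

memo = _SingleSlotMemo()
payload = [1, 2, 3]
assert memo.compute(payload) is memo.compute(payload)  # second call hits cache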
