Compare commits

221 Commits

Author SHA1 Message Date
Prowler Bot 2cb8179477 chore: review changelog for v5.24.1 (#10792)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-20 14:10:04 +02:00
Prowler Bot c9bbe7033b fix(ui): sorting and filtering for findings (#10790)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-20 13:46:36 +02:00
Prowler Bot 76ecb30968 fix(api): detect silent failures in ResourceFindingMapping (#10781)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-20 09:15:49 +02:00
Prowler Bot 84a60fe06b fix(ui): correct IaC findings counters (#10773)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-17 13:55:17 +02:00
Prowler Bot f71743b95b fix(cloudflare): guard validate_credentials against paginator infinite loops (#10772)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-04-17 11:38:12 +02:00
Prowler Bot 68dcc5a75c fix(ui): exclude muted findings and polish filter selectors (#10770)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2026-04-17 11:16:41 +02:00
Prowler Bot 407ae24f04 perf(attack-paths): cleanup task prioritization, restore default batch sizes to 1000, upgrade Cartography to 0.135.0 (#10768)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-04-17 11:01:19 +02:00
Prowler Bot 17c4a286af chore(deps): bump msgraph-sdk to 1.55.0 and azure-mgmt-resource to 24.0.0, remove marshmallow (#10766)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-04-17 10:22:17 +02:00
Prowler Bot 69ee2cdcef fix(googleworkspace): treat secure Google defaults as PASS for Drive checks (#10765)
Co-authored-by: lydiavilchez <114735608+lydiavilchez@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-17 09:12:57 +02:00
Prowler Bot 3544ff5e75 fix: CHANGELOG minor issue (#10759)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2026-04-16 17:10:44 +02:00
Prowler Bot 69287dc3a1 fix(api): exclude muted findings from pass_count, fail_count and manual_count (#10755) 2026-04-16 16:16:25 +02:00
Prowler Bot cf5848d11d fix(ui): upgrade React 19.2.5 and Next.js 16.2.3 to mitigate CVE-2026-23869 (#10754)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2026-04-16 15:39:30 +02:00
Prowler Bot 8ead3fa6bb fix(api): add fallback handling for missing resources in findings (#10751)
Co-authored-by: Adrián Peña <adrianjpr@gmail.com>
2026-04-16 14:54:27 +02:00
Prowler Bot 21483cc12f fix(googleworkspace): treat secure Google defaults as PASS for Calendar checks (#10735)
Co-authored-by: lydiavilchez <114735608+lydiavilchez@users.noreply.github.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-04-16 13:36:14 +02:00
Prowler Bot 628de4bd06 fix(image): --registry-list crashes with AttributeError on global_provider (#10730)
Co-authored-by: Erich Blume <725328+eblume@users.noreply.github.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-16 13:31:08 +02:00
Prowler Bot 043f1ef138 fix(sdk): allow account-scoped tokens in Cloudflare connection test (#10731)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-04-16 13:25:09 +02:00
Prowler Bot a120da9409 fix(db): add missing tenant_id filter in queries (#10725)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-16 12:11:28 +02:00
Prowler Bot d5b71c6436 chore(ui): Bump version to v5.24.1 (#10713)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:14:37 +02:00
Prowler Bot 9114d09ba5 docs: Update version to v5.24.0 (#10716)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:14:27 +02:00
Prowler Bot d2b1224a30 chore(release): Bump version to v5.24.1 (#10712)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:13:54 +02:00
Prowler Bot 54b54e25e2 chore(api): Bump version to v1.25.1 (#10717)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 20:13:43 +02:00
Prowler Bot 1b45724ca8 chore(api): Update prowler dependency to v5.24 for release 5.24.0 (#10709)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-15 18:57:37 +02:00
Pepe Fagoaga ba5b23245f chore: review changelog for v5.24 (#10707) 2026-04-15 18:05:55 +02:00
Daniel Barranquero 43913b1592 feat(aws): support excluding regions from scans via CLI, env var, and config (#10688) 2026-04-15 17:59:46 +02:00
Alan Buscaglia 9e31160887 fix(ui): improve attack paths scan table UX and fix info banner variant (#10704)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-15 17:33:29 +02:00
Pepe Fagoaga 9a0c73256e chore: delete .opencode (#10702) 2026-04-15 15:10:40 +02:00
Alejandro Bailo 2a160a10df refactor(ui): remove legacy side drawers and clean code (#10692) 2026-04-15 13:55:57 +02:00
Alan Buscaglia 8d8bee165b feat(ui): improve attack paths scan selection UX (#10685) 2026-04-15 13:54:25 +02:00
Alan Buscaglia 606efec9f8 fix(ui): keep update credentials wizard open (#10675) 2026-04-15 13:50:20 +02:00
Alan Buscaglia d5354e8b1d feat(ui): add syntax highlighting to finding groups remediation code (#10698) 2026-04-15 12:58:35 +02:00
Rubén De la Torre Vico a96e5890dc docs: replace Excalidraw diagrams with Mermaid and fix architecture connections (#10697) 2026-04-15 12:51:29 +02:00
Pepe Fagoaga bb81c5dd2d docs: add contextual menu for copy and issue/feat (#10699) 2026-04-15 12:50:29 +02:00
Daniel Barranquero c3acb818d9 fix(vercel): handle team-scoped firewall config responses (#10695) 2026-04-15 11:59:20 +02:00
Andoni Alonso e6fc59267b docs: add Finding Groups documentation page (#10696)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-15 11:58:39 +02:00
Josema Camacho 62f114f5d0 refactor(api): remove dead cleanup_findings no-op from attack-paths module (#10684) 2026-04-15 09:16:38 +02:00
Pepe Fagoaga 392ffd5a60 fix(beat): make it dependant from API service (#10603)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-04-14 18:35:26 +02:00
Alejandro Bailo 507b0882d5 fix(ui): fix findings group resource filters and mute modal migration (#10662) 2026-04-14 18:01:45 +02:00
Alejandro Bailo 89d72cf8fd feat(ui): new resources side drawer with redesigned detail panel (#10673) 2026-04-14 17:20:19 +02:00
Rubén De la Torre Vico f3a042933f chore(deps): replace pre-commit and husky with prek (#10601) 2026-04-14 16:34:54 +02:00
stepsecurity-app[bot] 96e7d6cb3a feat(security): security best practices from StepSecurity (#10682)
Signed-off-by: StepSecurity Bot <bot@stepsecurity.io>
Co-authored-by: stepsecurity-app[bot] <188008098+stepsecurity-app[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-04-14 15:13:12 +02:00
Hugo Pereira Brito a82eaa885d refactor(m365): normalize CA platforms at model level (#10635)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 15:00:23 +02:00
Hugo Pereira Brito 90a619a8b4 feat(m365): add entra_conditional_access_policy_block_unknown_device_platforms security check (#10615)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:32:37 +02:00
Hugo Pereira Brito 638bf62d76 feat(entra): directory sync account exclusion (#10620)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:16:32 +02:00
Pablo Fernandez Guerra (PFE) 962615ca1f chore(ui): bump serialize-javascript override from 7.0.3 to 7.0.5 (#10653)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 14:11:59 +02:00
Hugo Pereira Brito 5610f5ad90 feat(m365): add entra_conditional_access_policy_corporate_device_sign_in_frequency_enforced security check (#10618)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 14:10:00 +02:00
Pepe Fagoaga be6fe1db04 chore(security): bump pytest to 9.0.3 (#10678) 2026-04-14 13:59:30 +02:00
Hugo Pereira Brito 92b838866a feat(m365): add entra_conditional_access_policy_mfa_enforced_for_guest_users security check (#10616)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 13:45:12 +02:00
Josema Camacho 51591cb8cd build: bump poetry to 2.3.4 and consolidate SDK workflows (#10681) 2026-04-14 13:32:46 +02:00
Hugo Pereira Brito e24e1ab771 feat(m365): add exchange_organization_delicensing_resiliency_enabled security check (#10608) 2026-04-14 13:30:45 +02:00
Hugo Pereira Brito bc3fd79457 feat(intune): add device compliance policy marks noncompliant check (#10599) 2026-04-14 13:01:47 +02:00
Hugo Pereira Brito 4941ed5797 feat(entra): add new check entra_conditional_access_policy_all_apps_all_users (#10619)
Co-authored-by: Hugo P.Brito <hugopbrito@Mac.home>
2026-04-14 12:47:57 +02:00
Daniel Barranquero 0f4d8ff891 feat(aws): add bedrock_vpc_endpoints_configured security check (#10591) 2026-04-14 12:22:22 +02:00
Daniel Barranquero d1ab8b8ae5 feat(aws): add iam_policy_no_wildcard_marketplace_subscribe and iam_inline_policy_no_wildcard_marketplace_subscribe checks (#10525)
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-04-14 12:08:40 +02:00
Daniel Barranquero 65e9593b41 feat(aws): add bedrock_access_not_stale security check (#10536) 2026-04-14 11:20:40 +02:00
Daniel Barranquero 131112398b feat(aws): add bedrock_full_access_policy_attached security check (#10577) 2026-04-14 11:00:40 +02:00
Pedro Martín c952ea018e fix(ui): reflect actual provider in compliance detail header (#10674)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-04-14 10:22:42 +02:00
Pedro Martín 31b645ee53 chore(github): allow GitHub release CDN in trivy scan allowlist (#10679) 2026-04-14 10:09:54 +02:00
harshadkhetpal 0123e603d8 fix: replace bare except with except Exception in prowler-wrapper (#10499) 2026-04-14 08:11:53 +02:00
Prowler Bot b65265da4b feat(aws): Update regions for AWS services (#10659)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-14 08:03:14 +02:00
Prowler Bot 1335332fe9 chore(api): Bump version to v1.25.0 (#10668)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:59 +02:00
Prowler Bot f37a2a1efe chore(release): Bump version to v5.24.0 (#10666)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:54 +02:00
Prowler Bot 3e0e1398c4 docs: Update version to v5.23.0 (#10667)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:18:13 +02:00
Prowler Bot a4ad9ba01f chore(ui): Bump version to v5.24.0 (#10665)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-04-13 22:17:44 +02:00
Adrián Peña c6d5f44c5e chore: update pyjwt (#10661) 2026-04-13 14:09:37 +02:00
Adrián Peña 5d24a41625 feat(api): add sort support for all finding group counter fields (#10655) 2026-04-13 13:34:35 +02:00
lydiavilchez e33825747f fix(googleworkspace): apply customer-level policy filter to Calendar service (#10658) 2026-04-13 11:26:35 +02:00
lydiavilchez d919d979dd feat(googleworkspace): add Drive and Docs service checks using Cloud Identity Policy API (#10648) 2026-04-13 10:48:24 +02:00
Pepe Fagoaga 6534faf678 chore: review changelog for v5.23 (#10631) 2026-04-13 08:59:07 +02:00
Alan Buscaglia 1aa91cf60f fix(ui): exclude service filter from finding group resources endpoint (#10652) 2026-04-10 14:06:47 +02:00
Josema Camacho dad84f0ee2 docs(attack-paths): replace basic query examples with graph traversal patterns (#10649) 2026-04-10 12:23:02 +02:00
Alejandro Bailo 0d7c5f6ac5 feat(ui): make finding group delta indicator status-filter aware (#10647) 2026-04-10 11:29:11 +02:00
Hugo Pereira Brito 431776bcfd docs(attack-paths): link custom queries to Prowler docs (#10640) 2026-04-10 10:17:45 +01:00
Alejandro Bailo 0e8080f09c fix(ui): findings groups fixes (#10633) 2026-04-10 10:44:10 +02:00
Adrián Peña e4b2950436 refactor(api): split finding-groups status from muted state (#10630) 2026-04-09 18:07:43 +02:00
Pablo Fernandez Guerra (PFE) 63174caf98 docs: add multi-tenant (organizations) management guide (#10638)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: David <david.copo@gmail.com>
2026-04-09 17:51:54 +02:00
Alejandro Bailo 4e508b69c9 fix(vercel): use canonical Hub URLs in check metadata (#10636) 2026-04-09 16:23:50 +02:00
Andoni Alonso 18cfb191f5 docs: rename Prowler App to Prowler Cloud in provider headers (#10634) 2026-04-09 15:58:35 +02:00
Avula Jeevan Yadav b898f257f1 feat(stepfunctions): add check for secrets in state machine definition (#10570)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-09 15:56:29 +02:00
Hugo Pereira Brito cccb3a4b94 chore(sdk,mcp): pin direct dependencies to exact versions (#10593)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-04-09 14:42:49 +01:00
Daniel Barranquero ca50b24d77 docs: add Vercel Cloud getting started (#10609) 2026-04-09 15:40:44 +02:00
mintlify[bot] 7eb204fff0 docs: classify supported providers by category on main page (#10621)
Co-authored-by: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-04-09 15:39:43 +02:00
Pedro Martín 56c370d3a4 chore(ccc): update with latest version and improve mapping (#10625) 2026-04-09 15:27:18 +02:00
Pedro Martín b0d8534907 feat(api): add needed changes for GoogleWorkspace compliance (#10629) 2026-04-09 14:36:55 +02:00
dependabot[bot] ad36938717 chore(deps): bump actions/download-artifact from 6.0.0 to 8.0.1 (#10541)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:25:14 +02:00
dependabot[bot] 10dd9460e9 chore(deps): bump azure/setup-helm from 4.3.0 to 5.0.0 (#10543)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:24:42 +02:00
dependabot[bot] c8d41745dd chore(deps): bump softprops/action-gh-release from 2.5.0 to 2.6.1 (#10544)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:23:44 +02:00
dependabot[bot] c6c000a369 chore(deps): bump actions/setup-node from 6.2.0 to 6.3.0 (#10545)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:23:18 +02:00
dependabot[bot] a2b083e8c8 chore(deps): bump actions/cache from 5.0.3 to 5.0.4 (#10546)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:22:58 +02:00
dependabot[bot] d2f7169537 chore(deps): bump actions/checkout from 6.0.1 to 6.0.2 (#10548)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:22:26 +02:00
dependabot[bot] 632f2633c1 chore(deps): bump zizmorcore/zizmor-action from 0.5.0 to 0.5.2 (#10550)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:20:34 +02:00
dependabot[bot] 82d487a1e7 chore(deps): bump sorenlouv/backport-github-action from 10.2.0 to 11.0.0 (#10540)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:20:11 +02:00
dependabot[bot] 9a6a43637d chore(deps): bump pnpm/action-setup from 4.2.0 to 5.0.0 (#10551)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:50 +02:00
dependabot[bot] c21cf0ac20 chore(deps): bump tj-actions/changed-files from 47.0.4 to 47.0.5 (#10552)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:28 +02:00
dependabot[bot] f3b142c0cf chore(deps): bump docker/login-action from 3.7.0 to 4.0.0 (#10554)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:19:00 +02:00
dependabot[bot] eda90c4673 chore(deps): bump actions/upload-artifact from 6.0.0 to 7.0.0 (#10555)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:18:16 +02:00
dependabot[bot] def59a8cc2 chore(deps): bump docker/setup-buildx-action from 3.12.0 to 4.0.0 (#10556)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:16:00 +02:00
dependabot[bot] 1bfed74db5 chore(deps): bump docker/build-push-action from 6.19.2 to 7.0.0 (#10557)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 10:14:27 +02:00
Davidm4r baf1194824 feat(ui): invitation flow smart routing (#10589)
Co-authored-by: Pablo Fernandez Guerra (PFE) <148432447+pfe-nazaries@users.noreply.github.com>
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 10:11:52 +02:00
Alejandro Bailo b9270df3e6 feat(ui): improvements over findings groups feature (#10590) 2026-04-09 09:39:52 +02:00
dependabot[bot] 379df7800d chore(deps): bump aiohttp from 3.13.3 to 3.13.5 in /api (#10538)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-09 09:27:55 +02:00
dependabot[bot] fcabe1f99e chore(deps): bump aiohttp from 3.13.3 to 3.13.5 (#10537)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-09 08:57:16 +02:00
Davidm4r ad7a56d010 fix(ui): show active organization ID in profile page (#10617) 2026-04-09 08:51:39 +02:00
Pablo Fernandez Guerra (PFE) 406eedd68a chore(ui): unset GIT_WORK_TREE in pre-commit hook (#10574)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-08 14:27:12 +02:00
lydiavilchez bc38104903 feat(googleworkspace): add calendar service checks using Cloud Identity Policy API (#10597) 2026-04-08 13:26:56 +02:00
Andoni Alonso 9290d7e105 feat(sdk): warn when sensitive CLI flags receive explicit values (#10532) 2026-04-08 13:15:05 +02:00
lydiavilchez 72e8f09c07 feat(googleworkspace): add directory check for CIS 1.1.3 - super admin only admin roles (#10488)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-08 12:05:15 +02:00
Pepe Fagoaga 1d43885230 docs: update architecture diagram (#10604) 2026-04-08 11:05:28 +02:00
Adrián Peña e6aedcb207 feat(api): support sort by delta on finding-groups endpoints (#10606) 2026-04-08 11:04:57 +02:00
Kay Agahd 89fe867944 fix(aws): recognize service-specific condition keys as restrictive in is_policy_public (#10600)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-08 10:55:55 +02:00
Pepe Fagoaga 2be2753c55 fix(codeartifact): only retrieve the latest version from a package (#10243)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-04-08 09:21:19 +02:00
Josema Camacho 283259f34c fix(sdk): resolve empty-set bug in _enabled_regions causing 36-region client creation and CI timeouts (#10598) 2026-04-08 08:40:58 +02:00
Adrián Peña abaacd7dbf feat(api): finding group first_seen_at semantics and resource delta (#10595) 2026-04-07 16:41:08 +02:00
rchotacode 5e1e4bd8e4 fix(oci): Mutelist support (#10566)
Co-authored-by: Ronan Chota <ronan.chota@saic.com>
Co-authored-by: Hugo P.Brito <hugopbrito@users.noreply.github.com>
2026-04-07 13:23:51 +01:00
Davidm4r 33efd72b97 chore(deps): bump authlib from 1.6.5 to 1.6.9 (#10579)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-07 13:31:59 +02:00
Pepe Fagoaga b2788df8cc chore(issues): automate conversation lock on issue close (#10596) 2026-04-07 13:07:02 +02:00
Andoni Alonso b1b361af8b chore(ci): update Pablo user for labeling purposes (#10594) 2026-04-07 12:54:04 +02:00
Josema Camacho 8bc03f8d04 fix(api): remove clear_cache from attack paths read-only query endpoints (#10586) 2026-04-07 12:46:51 +02:00
Andoni Alonso ca03d9c0a9 docs: add Google Workspace SAML SSO configuration guide (#10564)
Co-authored-by: Alan Buscaglia <Alan-TheGentleman@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-04-07 12:03:21 +02:00
Kay Agahd 8985280621 fix(azure): create distinct report per key/secret in keyvault checks (#10332)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Hugo P.Brito <hugopbrit@gmail.com>
2026-04-07 09:36:48 +01:00
Pepe Fagoaga b7ee2b9690 chore: rename UI tab regarding the environment (#10588) 2026-04-07 10:30:01 +02:00
Alejandro Bailo 6b2d9b5580 feat(ui): add Vercel provider (#10191)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-07 10:13:18 +02:00
kaiisfree c99ed991b7 fix: show all checks including threat-detection in --list-checks (#10578)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: kaiisfree <kai@users.noreply.github.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-04-06 16:55:15 +01:00
Hugo Pereira Brito 7c0034524a fix(sdk): add missing __init__.py for codebuild GitHub orgs check (#10584) 2026-04-06 16:40:04 +01:00
Josema Camacho 749110de75 chore(sdk): bump cryptography to 46.0.6, oci to 2.169.0, and alibabacloud-tea-openapi to 0.4.4 (#10535) 2026-04-06 15:09:33 +02:00
Adrián Peña 5fff3b920d fix(api): exclude spurious retrieve from Jira docs and add known limitations (#10580) 2026-04-06 14:30:38 +02:00
Pablo Fernandez Guerra (PFE) 961f9c86da feat(ui): Add tenant management (#10491)
Co-authored-by: Pablo Fernandez <pfe@NB0240.local>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: David <david.copo@gmail.com>
2026-04-06 10:31:30 +02:00
Andoni Alonso 0f1da703d1 docs(image): add Prowler App documentation and authentication guide (#10527)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-06 09:59:56 +02:00
Pepe Fagoaga 07f3416493 feat(mcp): Add resource events tool (#10412) 2026-04-06 08:42:04 +02:00
Alan Buscaglia 509ec74c3d fix(ui): findings groups improvements — security fixes, code quality, and UX feedback (#10513)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-01 15:54:46 +02:00
Adrián Peña ab8e83da3f fix(api,ui): dynamically fetch Jira issue types instead of hardcoding "Task" (#10534)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-04-01 14:37:49 +02:00
Pablo Fernandez Guerra (PFE) 6ac90eb1b5 chore(ui): add pnpm supply chain security protections (#10471)
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
Co-authored-by: César Arroba <cesar@prowler.com>
2026-04-01 14:10:01 +02:00
Alejandro Bailo af6198e6c2 feat(api): integrate Vercel provider into API layer (#10190)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-04-01 13:20:49 +02:00
Josema Camacho dfe06a1077 fix(ui): allow selecting failed scans when graph data is available (#10531) 2026-04-01 11:08:34 +02:00
Alejandro Bailo 4f86667433 feat(sdk): add Vercel provider with 30 security checks (#10189)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-03-31 16:21:22 +02:00
Andoni Alonso 4bb1e5cff7 fix(sdk): redact sensitive CLI flags in HTML output (#10518) 2026-03-31 15:01:09 +02:00
Pedro Martín 99b80ebbd9 chore(actions): add pr-check-compliance-mapping action (#10526) 2026-03-31 13:38:20 +02:00
rchotacode d18c5a8974 fix(oci): fix identity clients (#10520)
Co-authored-by: Ronan Chota <ronan.chota@saic.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-31 09:42:19 +02:00
Hugo Pereira Brito ab00c2dce1 feat(m365): add entra_conditional_access_policy_block_elevated_insider_risk security check (#10234)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-30 17:27:00 +02:00
Pablo Fernandez Guerra (PFE) 765f9c72f2 docs: add missing pre-commit hooks setup for TruffleHog, Safety and Hadolint (#10448)
Co-authored-by: Pablo Fernandez <pfe@NB0240.local>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-authored-by: Pablo F.G <pablo.fernandez@prowler.com>
2026-03-30 16:43:32 +02:00
Erich Blume de5bb94ff6 fix(image): pass registry arguments through init_global_provider (#10470)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-03-30 15:19:01 +02:00
lydiavilchez c009a2128a feat(google-workspace): add CISA SCuBA Baselines compliance (#10466)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2026-03-30 14:33:38 +02:00
Alejandro Bailo 50556df713 feat(ui): add findings grouped view (#10425)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-03-30 14:17:36 +02:00
Hugo Pereira Brito 3b875484b0 feat(m365): add device registration MFA and harden Intune enrollment CA check (#10222)
Co-authored-by: Hugo Brito <hugopbrito@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-30 13:36:05 +02:00
lydiavilchez 442b379777 feat(google-workspace): add CIS Foundations Benchmark v1.3.0 compliance (#10462)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2026-03-30 12:57:12 +02:00
Adrián Peña 2a8b6261e1 fix(api): false 404 and sorting on finding group resources endpoints (#10510) 2026-03-30 12:47:16 +02:00
Alan Buscaglia 6df74529d6 refactor(ui): remove "Clear all" button from filter pills strip (#10481) 2026-03-30 12:26:01 +02:00
César Arroba 6f6d62f51f fix(ci): remove DOCKER_HUB_REPOSITORY secret and add toniblyx mirror push (#10512) 2026-03-30 11:53:04 +02:00
Hugo Pereira Brito 7148086410 feat(m365): add entra_conditional_access_policy_block_o365_elevated_insider_risk security check (#10232)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-30 11:49:29 +02:00
Alan Buscaglia 4ef0b1bf2c fix(ui): fix pre-commit hook skipping lint, tests, and build (#10494) 2026-03-30 10:44:59 +02:00
César Arroba de492a770c fix(ci): remove DOCKER_HUB_REPOSITORY secret from sdk container workflow (#10509) 2026-03-30 10:20:38 +02:00
César Arroba e9009f783b fix(ci): remove setup-buildx-action from create-manifest jobs (#10508) 2026-03-30 10:01:32 +02:00
Raajhesh Kannaa Chidambaram db1edf5ca7 feat(aws): add internet-exposed category to 13 checks (#10502)
Co-authored-by: Raajhesh Kannaa Chidambaram <495042+raajheshkannaa@users.noreply.github.com>
2026-03-30 08:59:29 +02:00
rchotacode 82d3ccec18 fix(oci): Add multi region filtering argument support (#10473)
Co-authored-by: Ronan Chota <ronan.chota@saic.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-30 08:45:16 +02:00
rchotacode ff46281f64 fix(oci): Fix service region support (#10472)
Co-authored-by: Ronan Chota <ronan.chota@saic.com>
2026-03-30 08:19:32 +02:00
Josema Camacho 94e234cefb fix(api): use raw FK ids in membership post_delete signal to avoid cascade lookup failures (#10497) 2026-03-27 16:16:28 +01:00
Pepe Fagoaga 8267fc4813 fix(step_security): keep notify in audit mode (#10496) 2026-03-27 16:01:24 +01:00
Josema Camacho 8bfeee238b feat(api): replace _provider_id property with label-based isolation and regex injection for custom queries (#10402) 2026-03-27 14:31:56 +01:00
Josema Camacho cc197ea901 feat(api): add periodic cleanup of stale Attack Paths scans with dead-worker detection (#10387) 2026-03-27 14:17:22 +01:00
Pepe Fagoaga 2b5d015e09 feat(security): add missing endpoints to allowlist (#10495)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-03-27 13:53:52 +01:00
Adrián Peña 73e0ac6892 chore: update dependencies (#10492) 2026-03-27 13:13:47 +01:00
Adrián Peña 700b51ddad chore: update Python version references from 3.9 to 3.10 (#10493) 2026-03-27 13:13:36 +01:00
Pepe Fagoaga 417be55604 feat(security): block mode for hardened runners (#10482) 2026-03-27 13:08:59 +01:00
Hugo Pereira Brito f75ce7b4dd feat(ui): add OpenCypher query editor (#10445) 2026-03-27 10:58:48 +00:00
Hugo Pereira Brito 269d9dfe41 feat(cli): add --resource-group flag to filter checks by resource group (#10479)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-27 11:55:28 +01:00
Apoorv Darshan 7b0ce7842b fix: remove return statements from finally blocks (#10102)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-03-27 09:23:15 +01:00
Terry Franklin 0a11ca4a68 feat(celery): VALKEY_SCHEME environment variable (#10420)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-03-27 09:22:35 +01:00
Adrián Peña c953fa7e67 fix(api): resolve check_title filter to check_id for consistent finding-group counts (#10486) 2026-03-27 09:05:02 +01:00
Pepe Fagoaga 73907db856 fix(trivy-scan): don't comment if PR from fork (#10490) 2026-03-27 08:37:19 +01:00
Raajhesh Kannaa Chidambaram 041f95b3df feat(ec2): add check for SG ingress from public IPs to any port (#10335)
Co-authored-by: Raajhesh Kannaa Chidambaram <495042+raajheshkannaa@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-26 17:21:16 +01:00
stepsecurity-app[bot] 716c130140 feat(security): security best practices from StepSecurity (#10480)
Signed-off-by: StepSecurity Bot <bot@stepsecurity.io>
Co-authored-by: stepsecurity-app[bot] <188008098+stepsecurity-app[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-03-26 13:58:19 +01:00
Hugo Pereira Brito c651f60e3a feat(m365): add entra_conditional_access_policy_mdm_compliant_device_required check (#10220)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-26 11:36:30 +01:00
Adrián Peña dd00d71a07 fix(api): fix finding groups muted filter, counters and reaggregation (#10477) 2026-03-26 10:35:21 +01:00
Sandiyo Christan 834d1bca49 feat(awslambda): enrich Function model with inventory fields and add 3 security checks (#10381)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-26 10:33:39 +01:00
Davidm4r 2cf45c72b6 fix(api): remove MANAGE_ACCOUNT permission requirement for listing or create a tenant (#10468) 2026-03-26 09:41:16 +01:00
Pepe Fagoaga 213e18724d fix: Prowler's changelog (#10475) 2026-03-25 16:07:45 +01:00
Pepe Fagoaga 571141f57c fix(aws): set partition's region for global services (#10458) 2026-03-25 15:47:51 +01:00
Adrián Peña 45f0909c3e chore(api): pin all unpinned dependencies to exact versions (#10469) 2026-03-25 13:27:04 +01:00
Alan Buscaglia b01fcc6cb2 fix(ui): refine filter clear and undo behavior in Findings page (#10446) 2026-03-25 13:20:10 +01:00
Adrián Peña 2ddd5b3091 chore: bump minimum Python to 3.10 and pin SDK dependencies (#10464) 2026-03-25 12:32:28 +01:00
Raajhesh Kannaa Chidambaram 6100932c60 feat(glue): add check for plaintext secrets in ETL job arguments (#10368)
Co-authored-by: Raajhesh Kannaa Chidambaram <495042+raajheshkannaa@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-25 12:25:36 +01:00
lydiavilchez 1c2b146e6e fix(docs): replace Google Workspace customer ID image with English version (#10467) 2026-03-25 11:49:30 +01:00
McRolly NWANGWU 833f3779ef feat(cloudfront): detect Standard Logging v2 via CloudWatch Log Delivery (#10090)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-03-25 10:09:21 +00:00
Daniel Barranquero c752811666 fix(oci): false positive for kms key rotation check (#10450) 2026-03-25 11:09:02 +01:00
Daniel Barranquero 4d1f7626f9 fix(oci): false positive for password policies (#10453) 2026-03-25 10:52:31 +01:00
Davidm4r 9bf2a13177 fix: resolve 403 error for admin users listing tenants (#10460) 2026-03-25 10:13:54 +01:00
Josema Camacho d15e67e2e5 fix(api): filter neo4j.io defunct connection logs in Sentry before_send (#10452) 2026-03-25 09:55:12 +01:00
Pepe Fagoaga 20cf5562b8 chore: update org members (#10461) 2026-03-25 09:36:10 +01:00
Pepe Fagoaga 36279f694c chore(gha): ignore zizmor rules and fix version comment (#10459) 2026-03-25 09:09:36 +01:00
César Arroba c991a1d0e8 chore: fix UI bump version (#10451) 2026-03-24 17:39:49 +01:00
Adrián Peña aa3641718b fix(api): populate compliance data in check_metadata for findings (#10449) 2026-03-24 17:19:53 +01:00
Adrián Peña bb80797392 fix(api): support finding-group aggregated filters (#10428) 2026-03-24 16:39:26 +01:00
Hugo Pereira Brito 435624fcd4 fix(sdk): support renamed OCI IdP mapping events (#10416) 2026-03-24 13:18:16 +00:00
Felix Dreissig 9e67f31913 feat(gcp): Add checks for GCP Gemini (Generative Language) API (#10280)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-03-24 14:11:27 +01:00
Prowler Bot 0984cfd75b chore(api): Bump version to v1.24.0 (#10440)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-24 14:05:48 +01:00
Prowler Bot c1044ef491 chore(release): Bump version to v5.23.0 (#10439)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-24 14:05:05 +01:00
Prowler Bot 19c4c9251c docs: Update version to v5.22.0 (#10441)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-24 14:03:47 +01:00
Josema Camacho 55ed7a0663 docs(CHANGELOG): cutting for 5.22.0 (#10437) 2026-03-24 12:15:44 +01:00
Alan Buscaglia 0599040d4e feat(ui): add batch apply pattern to Findings filters (#10388) 2026-03-24 11:09:11 +01:00
lydiavilchez 737d20d2c1 docs(googleworkspace): add Cloud/App documentation (#10421)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-03-24 09:48:01 +01:00
Josema Camacho 844efbd046 perf(api): deduplicate nodes before ProwlerFinding lookup in Attack Paths queries (#10424) 2026-03-23 17:16:15 +01:00
Josema Camacho d60b4f0f52 fix(api): Update Flask and Werkzeug to address vulnerabilities (#10430) 2026-03-23 16:59:03 +01:00
Hugo Pereira Brito 49ba25ba07 feat(ui): add custom attack paths queries (#10397) 2026-03-23 15:36:37 +00:00
Daniel Barranquero 41629137ef docs: remove cookbook from k8s section (#10427) 2026-03-23 16:22:54 +01:00
Hugo Pereira Brito 114e86c0dc fix(sdk): ignore disabled users in Entra MFA check (#10426) 2026-03-23 15:21:31 +00:00
Prowler Bot 1015f1379f feat(aws): Update regions for AWS services (#10413)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-23 15:28:51 +01:00
Prowler Bot c62ac6c71b feat(aws): Update regions for AWS services (#10076)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-23 15:26:29 +01:00
Daniel Barranquero 14356e3187 docs: add cookbooks section (#10410)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-03-23 13:51:07 +01:00
Adrián Peña 591f5a8603 fix(api): align finding-group latest aggregation (#10419) 2026-03-23 12:43:45 +01:00
mintlify[bot] 93b8a7c74c docs(attack-paths): Lighthouse AI support and supported queries to Attack Paths (#10409)
Co-authored-by: mintlify[bot] <109931778+mintlify[bot]@users.noreply.github.com>
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-03-23 11:12:26 +01:00
Hugo Pereira Brito 7df73a9d4f fix(sdk): use case-insensitive comparison for Azure MySQL flexible server checks (#10396) 2026-03-23 09:59:14 +00:00
Hugo Pereira Brito 1eda94140d fix(sdk): use case-insensitive comparison for Azure VM backup checks (#10395) 2026-03-23 09:45:08 +00:00
Adrián Peña ad6368a446 chore: add defusedxml as api dependency (#10401) 2026-03-19 18:26:55 +01:00
Adrián Peña 3361393b7d chore: update changelog (#10400) 2026-03-19 17:55:18 +01:00
Sandiyo Christan 0b7a21a70c fix(api): [security] use defusedxml to prevent XML bomb DoS in SAML metadata parsing (#10165)
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: Adrián Peña <adrianjpr@gmail.com>
2026-03-19 17:44:52 +01:00
Josema Camacho 872e6e239c perf(api): replace JOINs with pre-check in threat score aggregation query (#10394) 2026-03-19 17:30:06 +01:00
Adrián Peña 2fe92cfce3 feat(api): add check title search for finding groups (#10377) 2026-03-19 16:48:26 +01:00
César Arroba cece2cb87e chore: pin Prowler version to latest master commit on push (#10384) 2026-03-19 14:32:38 +01:00
Adrián Peña ab266080d0 perf(api): add trigram indexes for finding groups (#10378) 2026-03-19 13:54:50 +01:00
Prowler Bot 4638b39ed4 chore(api): Bump version to v1.23.0 (#10393)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-19 13:42:46 +01:00
Prowler Bot 997f9bf64a docs: Update version to v5.21.0 (#10391)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-19 13:40:33 +01:00
Prowler Bot aecc234f78 chore(release): Bump version to v5.22.0 (#10389)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-03-19 13:40:22 +01:00
1058 changed files with 94278 additions and 23788 deletions
+4 -1
@@ -78,6 +78,9 @@ TASK_RETRY_ATTEMPTS=5
# Valkey settings
# If running Valkey and celery on host, use localhost, else use 'valkey'
VALKEY_SCHEME=redis
VALKEY_USERNAME=
VALKEY_PASSWORD=
VALKEY_HOST=valkey
VALKEY_PORT=6379
VALKEY_DB=0
@@ -142,7 +145,7 @@ SENTRY_RELEASE=local
NEXT_PUBLIC_SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT}
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.16.0
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.24.1
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
@@ -13,11 +13,15 @@ inputs:
poetry-version:
description: 'Poetry version to install'
required: false
default: '2.1.1'
default: '2.3.4'
install-dependencies:
description: 'Install Python dependencies with Poetry'
required: false
default: 'true'
update-lock:
description: 'Run `poetry lock` during setup. Only enable when a prior step mutates pyproject.toml (e.g. API `@master` VCS rewrite). Default: false.'
required: false
default: 'false'
runs:
using: 'composite'
@@ -74,7 +78,7 @@ runs:
grep -A2 -B2 "resolved_reference" poetry.lock
- name: Update poetry.lock (prowler repo only)
if: github.repository == 'prowler-cloud/prowler'
if: github.repository == 'prowler-cloud/prowler' && inputs.update-lock == 'true'
shell: bash
working-directory: ${{ inputs.working-directory }}
run: poetry lock
+4 -1
@@ -117,7 +117,10 @@ runs:
INPUTS_IMAGE_TAG: ${{ inputs.image-tag }}
- name: Comment scan results on PR
if: inputs.create-pr-comment == 'true' && github.event_name == 'pull_request'
if: >-
inputs.create-pr-comment == 'true'
&& github.event_name == 'pull_request'
&& github.event.pull_request.head.repo.full_name == github.repository
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
env:
IMAGE_NAME: ${{ inputs.image-name }}
+7
@@ -67,6 +67,11 @@ provider/googleworkspace:
- any-glob-to-any-file: "prowler/providers/googleworkspace/**"
- any-glob-to-any-file: "tests/providers/googleworkspace/**"
provider/vercel:
- changed-files:
- any-glob-to-any-file: "prowler/providers/vercel/**"
- any-glob-to-any-file: "tests/providers/vercel/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
@@ -102,6 +107,8 @@ mutelist:
- any-glob-to-any-file: "tests/providers/openstack/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/googleworkspace/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/googleworkspace/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/vercel/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/vercel/lib/mutelist/**"
integration/s3:
- changed-files:
+8
@@ -177,6 +177,14 @@ modules:
- tests/providers/llm/**
e2e: []
- name: sdk-vercel
match:
- prowler/providers/vercel/**
- prowler/compliance/vercel/**
tests:
- tests/providers/vercel/**
e2e: []
# ============================================
# SDK - Lib modules
# ============================================
+17
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
@@ -27,6 +29,11 @@ jobs:
patch_version: ${{ steps.detect.outputs.patch_version }}
current_api_version: ${{ steps.get_api_version.outputs.current_api_version }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -79,6 +86,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -204,6 +216,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+14 -1
@@ -17,6 +17,8 @@ concurrency:
env:
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-code-quality:
runs-on: ubuntu-latest
@@ -32,6 +34,16 @@ jobs:
working-directory: ./api
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
api.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -40,7 +52,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -57,6 +69,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Poetry check
if: steps.check-changes.outputs.any_changed == 'true'
+14
@@ -24,6 +24,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
api-analyze:
name: CodeQL Security Analysis
@@ -41,6 +43,18 @@ jobs:
- 'python'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
release-assets.githubusercontent.com:443
uploads.github.com:443
objects.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+72 -10
@@ -18,9 +18,6 @@ on:
required: true
type: string
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -36,6 +33,8 @@ env:
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-api
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -43,7 +42,14 @@ jobs:
timeout-minutes: 5
outputs:
short-sha: ${{ steps.set-short-sha.outputs.short-sha }}
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
- name: Calculate short SHA
id: set-short-sha
run: echo "short-sha=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
@@ -55,7 +61,14 @@ jobs:
timeout-minutes: 5
outputs:
message-ts: ${{ steps.slack-notification.outputs.ts }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -94,24 +107,50 @@ jobs:
packages: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
_http._tcp.deb.debian.org:443
aka.ms:443
auth.docker.io:443
cdn.powershellgallery.com:443
dc.services.visualstudio.com:443
debian.map.fastlydns.net:80
files.pythonhosted.org:443
github.com:443
powershellinfraartifacts-gkhedzdeaghdezhr.z01.azurefd.net:443
production.cloudflare.docker.com:443
pypi.org:443
registry-1.docker.io:443
release-assets.githubusercontent.com:443
www.powershellgallery.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Pin prowler SDK to latest master commit
if: github.event_name == 'push'
run: |
LATEST_SHA=$(git ls-remote https://github.com/prowler-cloud/prowler.git refs/heads/master | cut -f1)
sed -i "s|prowler-cloud/prowler.git@master|prowler-cloud/prowler.git@${LATEST_SHA}|" api/pyproject.toml
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push API container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
@@ -126,17 +165,26 @@ jobs:
needs: [setup, container-build-push]
if: always() && needs.setup.result == 'success' && needs.container-build-push.result == 'success'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
release-assets.githubusercontent.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
run: |
@@ -178,7 +226,14 @@ jobs:
needs: [setup, notify-release-started, container-build-push, create-manifest]
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -221,6 +276,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Trigger API deployment
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4.0.1
with:
+37 -4
@@ -18,6 +18,8 @@ env:
API_WORKING_DIR: ./api
IMAGE_NAME: prowler-api
permissions: {}
jobs:
api-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -27,6 +29,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -35,7 +44,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: api/Dockerfile
@@ -65,6 +74,30 @@ jobs:
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
mirror.gcr.io:443
check.trivy.dev:443
github.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
debian.map.fastlydns.net:80
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
pypi.org:443
files.pythonhosted.org:443
www.powershellgallery.com:443
aka.ms:443
cdn.powershellgallery.com:443
_http._tcp.deb.debian.org:443
powershellinfraartifacts-gkhedzdeaghdezhr.z01.azurefd.net:443
get.trivy.dev:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -73,7 +106,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: api/**
files_ignore: |
@@ -84,11 +117,11 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
+19 -2
@@ -17,6 +17,8 @@ concurrency:
env:
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-security-scans:
runs-on: ubuntu-latest
@@ -32,6 +34,19 @@ jobs:
working-directory: ./api
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
pypi.org:443
files.pythonhosted.org:443
github.com:443
auth.safetycli.com:443
pyup.io:443
data.safetycli.com:443
api.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -40,7 +55,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -57,6 +72,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Bandit
if: steps.check-changes.outputs.any_changed == 'true'
@@ -64,9 +80,10 @@ jobs:
- name: Safety
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run safety check --ignore 79023,79027,86217
run: poetry run safety check --ignore 79023,79027,86217,71600
# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
# TODO: 86217 because `alibabacloud-tea-openapi == 0.4.3` don't let us upgrade `cryptography >= 46.0.0`
# TODO: 71600 CVE-2024-1135 false positive - fixed in gunicorn 22.0.0, project uses 23.0.0
- name: Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+23 -1
@@ -22,11 +22,16 @@ env:
POSTGRES_USER: prowler_user
POSTGRES_PASSWORD: prowler
POSTGRES_DB: postgres-db
VALKEY_SCHEME: redis
VALKEY_USERNAME: ""
VALKEY_PASSWORD: ""
VALKEY_HOST: localhost
VALKEY_PORT: 6379
VALKEY_DB: 0
API_WORKING_DIR: ./api
permissions: {}
jobs:
api-tests:
runs-on: ubuntu-latest
@@ -72,6 +77,22 @@ jobs:
--health-retries 5
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
cli.codecov.io:443
keybase.io:443
ingest.codecov.io:443
storage.googleapis.com:443
o26192.ingest.us.sentry.io:443
api.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -80,7 +101,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -97,6 +118,7 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
working-directory: ./api
update-lock: 'true'
- name: Run tests with pytest
if: steps.check-changes.outputs.any_changed == 'true'
+10 -1
@@ -17,6 +17,8 @@ env:
BACKPORT_LABEL_PREFIX: backport-to-
BACKPORT_LABEL_IGNORE: was-backported
permissions: {}
jobs:
backport:
if: github.event.pull_request.merged == true && !(contains(github.event.pull_request.labels.*.name, 'backport')) && !(contains(github.event.pull_request.labels.*.name, 'was-backported'))
@@ -27,6 +29,13 @@ jobs:
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Check labels
id: label_check
uses: agilepathway/label-checker@c3d16ad512e7cea5961df85ff2486bb774caf3c5 # v1.6.65
@@ -39,7 +48,7 @@ jobs:
- name: Backport PR
if: steps.label_check.outputs.label_check == 'success'
uses: sorenlouv/backport-github-action@516854e7c9f962b9939085c9a92ea28411d1ae90 # v10.2.0
uses: sorenlouv/backport-github-action@9460b7102fea25466026ce806c9ebf873ac48721 # v11.0.0
with:
github_token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
auto_backport_label_prefix: ${{ env.BACKPORT_LABEL_PREFIX }}
+13 -1
@@ -21,6 +21,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
zizmor:
if: github.repository == 'prowler-cloud/prowler'
@@ -33,12 +35,22 @@ jobs:
actions: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
ghcr.io:443
pkg-containers.githubusercontent.com:443
api.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Run zizmor
uses: zizmorcore/zizmor-action@0dce2577a4760a2749d8cfb7a84b7d5585ebcb7d # v0.5.0
uses: zizmorcore/zizmor-action@71321a20a9ded102f6e9ce5718a2fcec2c4f70d8 # v0.5.2
with:
token: ${{ github.token }}
@@ -9,6 +9,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.issue.number }}
cancel-in-progress: false
permissions: {}
jobs:
update-labels:
if: contains(github.event.issue.labels.*.name, 'status/awaiting-response')
@@ -19,6 +21,11 @@ jobs:
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Remove 'status/awaiting-response' label
env:
GH_TOKEN: ${{ github.token }}
@@ -16,6 +16,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
conventional-commit-check:
runs-on: ubuntu-latest
@@ -25,6 +27,11 @@ jobs:
pull-requests: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Check PR title format
uses: agenthunt/conventional-commit-checker-action@f1823f632e95a64547566dcd2c7da920e67117ad # v2.0.1
with:
@@ -13,6 +13,8 @@ env:
BACKPORT_LABEL_PREFIX: backport-to-
BACKPORT_LABEL_COLOR: B60205
permissions: {}
jobs:
create-label:
runs-on: ubuntu-latest
@@ -22,6 +24,11 @@ jobs:
issues: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Create backport label for minor releases
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+17
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
@@ -27,6 +29,11 @@ jobs:
patch_version: ${{ steps.detect.outputs.patch_version }}
current_docs_version: ${{ steps.get_docs_version.outputs.current_docs_version }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -79,6 +86,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -204,6 +216,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+11
@@ -14,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
scan-secrets:
runs-on: ubuntu-latest
@@ -22,6 +24,15 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
ghcr.io:443
pkg-containers.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+9 -2
@@ -21,6 +21,8 @@ concurrency:
env:
CHART_PATH: contrib/k8s/helm/prowler-app
permissions: {}
jobs:
helm-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -30,13 +32,18 @@ jobs:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Helm
uses: azure/setup-helm@1a275c3b69536ee54be43f2070a358922e12c8d4 # v4.3.1
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
- name: Update chart dependencies
run: helm dependency update ${{ env.CHART_PATH }}
+9 -2
@@ -13,6 +13,8 @@ concurrency:
env:
CHART_PATH: contrib/k8s/helm/prowler-app
permissions: {}
jobs:
release-helm-chart:
if: github.repository == 'prowler-cloud/prowler'
@@ -23,13 +25,18 @@ jobs:
packages: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Helm
uses: azure/setup-helm@b9e51907a09c216f16ebe8536097933489208112 # v4.3.0
uses: azure/setup-helm@dda3372f752e03dde6b3237bc9431cdc2f7a02a2 # v5.0.0
- name: Set appVersion from release tag
run: |
+53
@@ -0,0 +1,53 @@
name: 'Tools: Lock Issue on Close'
on:
issues:
types:
- closed
concurrency:
group: ${{ github.workflow }}-${{ github.event.issue.number }}
cancel-in-progress: false
permissions: {}
jobs:
lock:
if: |
github.repository == 'prowler-cloud/prowler' &&
github.event.issue.locked == false
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
issues: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Comment and lock issue
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const { owner, repo } = context.repo;
const issue_number = context.payload.issue.number;
try {
await github.rest.issues.createComment({
owner,
repo,
issue_number,
body: 'This issue is now locked as it has been closed. If you are still hitting a related problem, please open a new issue and link back to this one for context. Thanks!'
});
} catch (error) {
core.warning(`Failed to post lock comment on issue #${issue_number}: ${error.message}`);
}
const lockParams = { owner, repo, issue_number };
if (context.payload.issue.state_reason === 'completed') {
lockParams.lock_reason = 'resolved';
}
await github.rest.issues.lock(lockParams);
+39 -9
@@ -65,6 +65,11 @@ jobs:
text: ${{ steps.compute-text.outputs.text }}
title: ${{ steps.compute-text.outputs.title }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
@@ -129,6 +134,11 @@ jobs:
output_types: ${{ steps.collect_output.outputs.output_types }}
secret_verification_result: ${{ steps.validate-secret.outputs.verification_result }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
@@ -762,7 +772,7 @@ jobs:
SECRET_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload Safe Outputs
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: safe-output
path: ${{ env.GH_AW_SAFE_OUTPUTS }}
@@ -783,13 +793,13 @@ jobs:
await main();
- name: Upload sanitized agent output
if: always() && env.GH_AW_AGENT_OUTPUT
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent-output
path: ${{ env.GH_AW_AGENT_OUTPUT }}
if-no-files-found: warn
- name: Upload engine output files
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent_outputs
path: |
@@ -829,7 +839,7 @@ jobs:
- name: Upload agent artifacts
if: always()
continue-on-error: true
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: agent-artifacts
path: |
@@ -859,13 +869,18 @@ jobs:
tools_reported: ${{ steps.missing_tool.outputs.tools_reported }}
total_count: ${{ steps.missing_tool.outputs.total_count }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
destination: /opt/gh-aw/actions
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/safeoutputs/
@@ -966,19 +981,24 @@ jobs:
outputs:
success: ${{ steps.parse_results.outputs.success }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
destination: /opt/gh-aw/actions
- name: Download agent artifacts
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-artifacts
path: /tmp/gh-aw/threat-detection/
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/threat-detection/
@@ -1051,7 +1071,7 @@ jobs:
await main();
- name: Upload threat detection log
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: threat-detection.log
path: /tmp/gh-aw/threat-detection/detection.log
@@ -1070,6 +1090,11 @@ jobs:
outputs:
activated: ${{ (steps.check_membership.outputs.is_team_member == 'true') && (steps.check_rate_limit.outputs.rate_limit_ok == 'true') }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
@@ -1138,13 +1163,18 @@ jobs:
process_safe_outputs_processed_count: ${{ steps.process_safe_outputs.outputs.processed_count }}
process_safe_outputs_temporary_id_map: ${{ steps.process_safe_outputs.outputs.temporary_id_map }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Setup Scripts
uses: github/gh-aw/actions/setup@9382be3ca9ac18917e111a99d4e6bbff58d0dccc # v0.43.23
with:
destination: /opt/gh-aw/actions
- name: Download agent output artifact
continue-on-error: true
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
uses: actions/download-artifact@3e5f45b2cfb9172054b4087a40e8e0b5a5461e7c # v8.0.1
with:
name: agent-output
path: /tmp/gh-aw/safeoutputs/
+14 -1
@@ -15,6 +15,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
labeler:
runs-on: ubuntu-latest
@@ -24,6 +26,11 @@ jobs:
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Apply labels to PR
uses: actions/labeler@634933edcd8ababfe52f92936142cc22ac488b1b # v6.0.1
with:
@@ -38,6 +45,11 @@ jobs:
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Check if author is org member
id: check_membership
env:
@@ -65,7 +77,8 @@ jobs:
"RosaRivasProwler"
"StylusFrost"
"toniblyx"
"vicferpoy"
"davidm4r"
"pfe-nazaries"
)
echo "Checking if $AUTHOR is a member of prowler-cloud organization"
+61 -10
@@ -17,9 +17,6 @@ on:
required: true
type: string
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -35,6 +32,8 @@ env:
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-mcp
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -42,7 +41,14 @@ jobs:
timeout-minutes: 5
outputs:
short-sha: ${{ steps.set-short-sha.outputs.short-sha }}
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
- name: Calculate short SHA
id: set-short-sha
run: echo "short-sha=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
@@ -54,7 +60,14 @@ jobs:
timeout-minutes: 5
outputs:
message-ts: ${{ steps.slack-notification.outputs.ts }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -92,24 +105,38 @@ jobs:
contents: read
packages: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
ghcr.io:443
pkg-containers.githubusercontent.com:443
files.pythonhosted.org:443
pypi.org:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push MCP container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
@@ -132,17 +159,27 @@ jobs:
needs: [setup, container-build-push]
if: always() && needs.setup.result == 'success' && needs.container-build-push.result == 'success'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
github.com:443
release-assets.githubusercontent.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
run: |
@@ -184,7 +221,14 @@ jobs:
needs: [setup, notify-release-started, container-build-push, create-manifest]
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -227,6 +271,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Trigger MCP deployment
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4.0.1
with:
+33 -4
@@ -18,6 +18,8 @@ env:
MCP_WORKING_DIR: ./mcp_server
IMAGE_NAME: prowler-mcp
permissions: {}
jobs:
mcp-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -27,6 +29,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -35,7 +44,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: mcp_server/Dockerfile
@@ -64,6 +73,26 @@ jobs:
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
ghcr.io:443
pkg-containers.githubusercontent.com:443
files.pythonhosted.org:443
pypi.org:443
api.github.com:443
mirror.gcr.io:443
check.trivy.dev:443
get.trivy.dev:443
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -72,7 +101,7 @@ jobs:
- name: Check for MCP changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: mcp_server/**
files_ignore: |
@@ -81,11 +110,11 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build MCP container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.MCP_WORKING_DIR }}
push: false
+13 -1
@@ -14,6 +14,8 @@ env:
PYTHON_VERSION: "3.12"
WORKING_DIRECTORY: ./mcp_server
permissions: {}
jobs:
validate-release:
if: github.repository == 'prowler-cloud/prowler'
@@ -26,6 +28,11 @@ jobs:
major_version: ${{ steps.parse-version.outputs.major }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Parse and validate version
id: parse-version
run: |
@@ -59,13 +66,18 @@ jobs:
url: https://pypi.org/project/prowler-mcp/
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install uv
uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7
uses: astral-sh/setup-uv@5a095e7a2014a4212f075830d4f7277575a9d098 # v7.3.1
with:
enable-cache: false
+11 -1
@@ -16,6 +16,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-changelog:
if: contains(github.event.pull_request.labels.*.name, 'no-changelog') == false
@@ -28,6 +30,14 @@ jobs:
MONITORED_FOLDERS: 'api ui prowler mcp_server'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -37,7 +47,7 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
api/**
@@ -0,0 +1,182 @@
name: 'Tools: Check Compliance Mapping'
on:
pull_request:
types:
- 'opened'
- 'synchronize'
- 'reopened'
- 'labeled'
- 'unlabeled'
branches:
- 'master'
- 'v5.*'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-compliance-mapping:
if: contains(github.event.pull_request.labels.*.name, 'no-compliance-check') == false
runs-on: ubuntu-latest
timeout-minutes: 15
permissions:
contents: read
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
# zizmor: ignore[artipacked]
persist-credentials: true # Required by tj-actions/changed-files to fetch PR branch
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
prowler/providers/**/services/**/*.metadata.json
prowler/compliance/**/*.json
- name: Check if new checks are mapped in compliance
id: compliance-check
run: |
ADDED_METADATA="${STEPS_CHANGED_FILES_OUTPUTS_ADDED_FILES}"
ALL_CHANGED="${STEPS_CHANGED_FILES_OUTPUTS_ALL_CHANGED_FILES}"
# Filter only new metadata files (new checks)
new_checks=""
for f in $ADDED_METADATA; do
case "$f" in *.metadata.json) new_checks="$new_checks $f" ;; esac
done
if [ -z "$(echo "$new_checks" | tr -d ' ')" ]; then
echo "No new checks detected."
echo "has_new_checks=false" >> "$GITHUB_OUTPUT"
exit 0
fi
# Collect compliance files changed in this PR
changed_compliance=""
for f in $ALL_CHANGED; do
case "$f" in prowler/compliance/*.json) changed_compliance="$changed_compliance $f" ;; esac
done
UNMAPPED=""
MAPPED=""
for metadata_file in $new_checks; do
check_dir=$(dirname "$metadata_file")
check_id=$(basename "$check_dir")
provider=$(echo "$metadata_file" | cut -d'/' -f3)
# Read CheckID from the metadata JSON for accuracy
if [ -f "$metadata_file" ]; then
json_check_id=$(python3 -c "import json; print(json.load(open('$metadata_file')).get('CheckID', ''))" 2>/dev/null || echo "")
if [ -n "$json_check_id" ]; then
check_id="$json_check_id"
fi
fi
# Search for the check ID in compliance files changed in this PR
found_in=""
for comp_file in $changed_compliance; do
if grep -q "\"${check_id}\"" "$comp_file" 2>/dev/null; then
found_in="${found_in}$(basename "$comp_file" .json), "
fi
done
if [ -n "$found_in" ]; then
found_in=$(echo "$found_in" | sed 's/, $//')
MAPPED="${MAPPED}- \`${check_id}\` (\`${provider}\`): ${found_in}"$'\n'
else
UNMAPPED="${UNMAPPED}- \`${check_id}\` (\`${provider}\`)"$'\n'
fi
done
echo "has_new_checks=true" >> "$GITHUB_OUTPUT"
if [ -n "$UNMAPPED" ]; then
echo "has_unmapped=true" >> "$GITHUB_OUTPUT"
else
echo "has_unmapped=false" >> "$GITHUB_OUTPUT"
fi
{
echo "unmapped<<EOF"
echo -e "${UNMAPPED}"
echo "EOF"
} >> "$GITHUB_OUTPUT"
{
echo "mapped<<EOF"
echo -e "${MAPPED}"
echo "EOF"
} >> "$GITHUB_OUTPUT"
env:
STEPS_CHANGED_FILES_OUTPUTS_ADDED_FILES: ${{ steps.changed-files.outputs.added_files }}
STEPS_CHANGED_FILES_OUTPUTS_ALL_CHANGED_FILES: ${{ steps.changed-files.outputs.all_changed_files }}
- name: Manage compliance review label
if: steps.compliance-check.outputs.has_new_checks == 'true'
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
HAS_UNMAPPED: ${{ steps.compliance-check.outputs.has_unmapped }}
run: |
LABEL_NAME="needs-compliance-review"
if [ "$HAS_UNMAPPED" = "true" ]; then
echo "Adding compliance review label to PR #${PR_NUMBER}..."
gh pr edit "$PR_NUMBER" --add-label "$LABEL_NAME" --repo "${{ github.repository }}" || true
else
echo "Removing compliance review label from PR #${PR_NUMBER}..."
gh pr edit "$PR_NUMBER" --remove-label "$LABEL_NAME" --repo "${{ github.repository }}" || true
fi
- name: Find existing compliance comment
if: steps.compliance-check.outputs.has_new_checks == 'true' && github.event.pull_request.head.repo.full_name == github.repository
id: find-comment
uses: peter-evans/find-comment@b30e6a3c0ed37e7c023ccd3f1db5c6c0b0c23aad # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-includes: '<!-- compliance-mapping-check -->'
- name: Create or update compliance comment
if: steps.compliance-check.outputs.has_new_checks == 'true' && github.event.pull_request.head.repo.full_name == github.repository
uses: peter-evans/create-or-update-comment@e8674b075228eee787fea43ef493e45ece1004c9 # v5.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find-comment.outputs.comment-id }}
edit-mode: replace
body: |
<!-- compliance-mapping-check -->
## Compliance Mapping Review
This PR adds new checks. Please verify that they have been mapped to the relevant compliance framework requirements.
${{ steps.compliance-check.outputs.unmapped != '' && format('### New checks not mapped to any compliance framework in this PR
{0}
> Please review whether these checks should be added to compliance framework requirements in `prowler/compliance/<provider>/`. Each compliance JSON has a `Checks` array inside each requirement — add the check ID there if it satisfies that requirement.', steps.compliance-check.outputs.unmapped) || '' }}
${{ steps.compliance-check.outputs.mapped != '' && format('### New checks already mapped in this PR
{0}', steps.compliance-check.outputs.mapped) || '' }}
Use the `no-compliance-check` label to skip this check.
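The shell step in this workflow classifies each new check as mapped or unmapped by grepping for its CheckID in the compliance JSON files changed in the same PR. A minimal Python sketch of that classification, working on in-memory data instead of files (the function name `classify_new_checks` is hypothetical):

```python
# Hypothetical sketch of the compliance-mapping classification above.
# new_checks: list of CheckIDs added in the PR.
# compliance_files: mapping of compliance framework name -> set of
# CheckIDs referenced in that framework's JSON (as changed in the PR).
def classify_new_checks(new_checks, compliance_files):
    mapped, unmapped = {}, []
    for check_id in new_checks:
        # A check is "mapped" if any changed compliance file references it.
        hits = [name for name, ids in compliance_files.items() if check_id in ids]
        if hits:
            mapped[check_id] = hits
        else:
            unmapped.append(check_id)
    return mapped, unmapped
```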
+8 -1
@@ -15,6 +15,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions: {}
jobs:
check-conflicts:
runs-on: ubuntu-latest
@@ -25,6 +27,11 @@ jobs:
issues: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout PR head
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -34,7 +41,7 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: '**'
+9
@@ -12,6 +12,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
cancel-in-progress: false
permissions: {}
jobs:
trigger-cloud-pull-request:
if: |
@@ -23,6 +25,13 @@ jobs:
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Calculate short commit SHA
id: vars
run: |
+11 -8
@@ -17,6 +17,8 @@ concurrency:
env:
PROWLER_VERSION: ${{ inputs.prowler_version }}
permissions: {}
jobs:
prepare-release:
if: github.event_name == 'workflow_dispatch' && github.repository == 'prowler-cloud/prowler'
@@ -26,6 +28,11 @@ jobs:
contents: write
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -33,15 +40,11 @@ jobs:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: '3.12'
- name: Install Poetry
run: |
python3 -m pip install --user poetry==2.1.1
echo "$HOME/.local/bin" >> $GITHUB_PATH
install-dependencies: 'false'
- name: Configure Git
run: |
@@ -375,7 +378,7 @@ jobs:
no-changelog
- name: Create draft release
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b # v2.5.0
uses: softprops/action-gh-release@153bb8e04406b158c6c84fc1615b65b24149a1fe # v2.6.1
with:
tag_name: ${{ env.PROWLER_VERSION }}
name: Prowler ${{ env.PROWLER_VERSION }}
+17
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
@@ -26,6 +28,11 @@ jobs:
minor_version: ${{ steps.detect.outputs.minor_version }}
patch_version: ${{ steps.detect.outputs.patch_version }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Detect release type and parse version
id: detect
run: |
@@ -66,6 +73,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -175,6 +187,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -10,6 +10,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
check-duplicate-test-names:
if: github.repository == 'prowler-cloud/prowler'
@@ -19,6 +21,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+14 -15
@@ -14,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-code-quality:
if: github.repository == 'prowler-cloud/prowler'
@@ -24,12 +26,20 @@ jobs:
strategy:
matrix:
python-version:
- '3.9'
- '3.10'
- '3.11'
- '3.12'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -38,7 +48,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -61,22 +71,11 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: |
poetry install --no-root
poetry run pip list
- name: Check Poetry lock file
if: steps.check-changes.outputs.any_changed == 'true'
+12
@@ -30,6 +30,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-analyze:
if: github.repository == 'prowler-cloud/prowler'
@@ -48,6 +50,16 @@ jobs:
- 'python'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
release-assets.githubusercontent.com:443
uploads.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+116 -23
@@ -23,9 +23,6 @@ on:
required: true
type: string
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -45,10 +42,13 @@ env:
# Container registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler
TONIBLYX_DOCKERHUB_REPOSITORY: toniblyx
# AWS configuration (for ECR)
AWS_REGION: us-east-1
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -59,21 +59,31 @@ jobs:
prowler_version_major: ${{ steps.get-prowler-version.outputs.prowler_version_major }}
latest_tag: ${{ steps.get-prowler-version.outputs.latest_tag }}
stable_tag: ${{ steps.get-prowler-version.outputs.stable_tag }}
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
- name: Install Poetry
run: |
pipx install poetry==2.1.1
pipx inject poetry poetry-bumpversion
- name: Inject poetry-bumpversion plugin
run: pipx inject poetry poetry-bumpversion
- name: Get Prowler version and set tags
id: get-prowler-version
@@ -115,7 +125,14 @@ jobs:
timeout-minutes: 5
outputs:
message-ts: ${{ steps.slack-notification.outputs.ts }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -154,19 +171,40 @@ jobs:
packages: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.ecr-public.us-east-1.amazonaws.com:443
public.ecr.aws:443
registry-1.docker.io:443
production.cloudflare.docker.com:443
auth.docker.io:443
debian.map.fastlydns.net:80
github.com:443
release-assets.githubusercontent.com:443
pypi.org:443
files.pythonhosted.org:443
www.powershellgallery.com:443
aka.ms:443
cdn.powershellgallery.com:443
_http._tcp.deb.debian.org:443
powershellinfraartifacts-gkhedzdeaghdezhr.z01.azurefd.net:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -175,12 +213,12 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push SDK container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: .
file: ${{ env.DOCKERFILE_PATH }}
@@ -196,16 +234,32 @@ jobs:
needs: [setup, container-build-push]
if: always() && needs.setup.result == 'success' && needs.container-build-push.result == 'success'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
registry-1.docker.io:443
auth.docker.io:443
public.ecr.aws:443
production.cloudflare.docker.com:443
github.com:443
release-assets.githubusercontent.com:443
api.ecr-public.us-east-1.amazonaws.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -213,15 +267,11 @@ jobs:
env:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
run: |
docker buildx imagetools create \
-t ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG} \
-t ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG} \
-t ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG} \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG}-amd64 \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG}-arm64
@@ -232,12 +282,10 @@ jobs:
if: github.event_name == 'release' || github.event_name == 'workflow_dispatch'
run: |
docker buildx imagetools create \
-t ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_PROWLER_VERSION} \
-t ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_STABLE_TAG} \
-t ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_PROWLER_VERSION} \
-t ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_STABLE_TAG} \
-t ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_PROWLER_VERSION} \
-t ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_STABLE_TAG} \
-t ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_PROWLER_VERSION} \
-t ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${NEEDS_SETUP_OUTPUTS_STABLE_TAG} \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG}-amd64 \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_LATEST_TAG}-arm64
env:
@@ -245,6 +293,39 @@ jobs:
NEEDS_SETUP_OUTPUTS_STABLE_TAG: ${{ needs.setup.outputs.stable_tag }}
NEEDS_SETUP_OUTPUTS_LATEST_TAG: ${{ needs.setup.outputs.latest_tag }}
# Push to toniblyx/prowler only for current version (latest/stable/release tags)
- name: Login to DockerHub (toniblyx)
if: needs.setup.outputs.latest_tag == 'latest'
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.TONIBLYX_DOCKERHUB_USERNAME }}
password: ${{ secrets.TONIBLYX_DOCKERHUB_PASSWORD }}
- name: Push manifests to toniblyx for push event
if: needs.setup.outputs.latest_tag == 'latest' && github.event_name == 'push'
run: |
docker buildx imagetools create \
-t ${{ env.TONIBLYX_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:latest \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:latest
- name: Push manifests to toniblyx for release event
if: needs.setup.outputs.latest_tag == 'latest' && (github.event_name == 'release' || github.event_name == 'workflow_dispatch')
run: |
docker buildx imagetools create \
-t ${{ env.TONIBLYX_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${NEEDS_SETUP_OUTPUTS_PROWLER_VERSION} \
-t ${{ env.TONIBLYX_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:stable \
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:stable
env:
NEEDS_SETUP_OUTPUTS_PROWLER_VERSION: ${{ needs.setup.outputs.prowler_version }}
# Re-login as prowlercloud for cleanup of intermediate tags
- name: Login to DockerHub (prowlercloud)
if: always()
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Install regctl
if: always()
uses: regclient/actions/regctl-installer@da9319db8e44e8b062b3a147e1dfb2f574d41a03 # main
@@ -264,7 +345,14 @@ jobs:
needs: [setup, notify-release-started, container-build-push, create-manifest]
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -307,6 +395,11 @@ jobs:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Calculate short SHA
id: short-sha
run: echo "short_sha=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
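The short-SHA step above relies on Bash substring expansion rather than an external tool; a minimal sketch outside of Actions (the SHA value here is a made-up sample, not a real commit):

```shell
#!/usr/bin/env bash
# Bash substring expansion is ${var:offset:length}; with a zero offset it
# abbreviates to ${var::7}, which is exactly what the workflow step uses.
GITHUB_SHA="2cb8179477aa0dd47ef0c1a91e3bc736ec3a2b26"  # sample value
short_sha="${GITHUB_SHA::7}"
echo "short_sha=${short_sha}"
```

In a real job the result is appended to `$GITHUB_OUTPUT` so later steps can reference it as `steps.<id>.outputs.short_sha`.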
+37 -4
@@ -17,6 +17,8 @@ concurrency:
env:
IMAGE_NAME: prowler
permissions: {}
jobs:
sdk-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -26,6 +28,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -34,7 +43,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: Dockerfile
@@ -64,6 +73,30 @@ jobs:
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
api.github.com:443
mirror.gcr.io:443
check.trivy.dev:443
debian.map.fastlydns.net:80
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
pypi.org:443
files.pythonhosted.org:443
www.powershellgallery.com:443
aka.ms:443
cdn.powershellgallery.com:443
_http._tcp.deb.debian.org:443
powershellinfraartifacts-gkhedzdeaghdezhr.z01.azurefd.net:443
get.trivy.dev:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -72,7 +105,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -97,11 +130,11 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build SDK container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: .
push: false
+23 -10
@@ -13,6 +13,8 @@ env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: '3.12'
permissions: {}
jobs:
validate-release:
if: github.repository == 'prowler-cloud/prowler'
@@ -25,6 +27,11 @@ jobs:
major_version: ${{ steps.parse-version.outputs.major }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Parse and validate version
id: parse-version
run: |
@@ -58,18 +65,21 @@ jobs:
url: https://pypi.org/project/prowler/${{ needs.validate-release.outputs.prowler_version }}/
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install Poetry
run: pipx install poetry==2.1.1
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
- name: Build Prowler package
run: poetry build
@@ -91,18 +101,21 @@ jobs:
url: https://pypi.org/project/prowler-cloud/${{ needs.validate-release.outputs.prowler_version }}/
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install Poetry
run: pipx install poetry==2.1.1
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
- name: Setup Python with Poetry
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ env.PYTHON_VERSION }}
install-dependencies: 'false'
- name: Install toml package
run: pip install toml
@@ -13,6 +13,8 @@ env:
PYTHON_VERSION: '3.12'
AWS_REGION: 'us-east-1'
permissions: {}
jobs:
refresh-aws-regions:
if: github.repository == 'prowler-cloud/prowler'
@@ -24,6 +26,11 @@ jobs:
contents: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -12,6 +12,8 @@ concurrency:
env:
PYTHON_VERSION: '3.12'
permissions: {}
jobs:
refresh-oci-regions:
if: github.repository == 'prowler-cloud/prowler'
@@ -22,6 +24,11 @@ jobs:
contents: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+18 -12
@@ -14,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-security-scans:
if: github.repository == 'prowler-cloud/prowler'
@@ -23,6 +25,19 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
pypi.org:443
files.pythonhosted.org:443
github.com:443
auth.safetycli.com:443
pyup.io:443
data.safetycli.com:443
api.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -31,7 +46,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files:
./**
@@ -56,20 +71,11 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python 3.12
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: '3.12'
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry install --no-root
- name: Security scan with Bandit
if: steps.check-changes.outputs.any_changed == 'true'
+75 -29
@@ -14,6 +14,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
sdk-tests:
if: github.repository == 'prowler-cloud/prowler'
@@ -24,12 +26,41 @@ jobs:
strategy:
matrix:
python-version:
- '3.9'
- '3.10'
- '3.11'
- '3.12'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
api.github.com:443
release-assets.githubusercontent.com:443
*.amazonaws.com:443
*.googleapis.com:443
schema.ocsf.io:443
registry-1.docker.io:443
production.cloudflare.docker.com:443
powershellinfraartifacts-gkhedzdeaghdezhr.z01.azurefd.net:443
o26192.ingest.us.sentry.io:443
management.azure.com:443
login.microsoftonline.com:443
keybase.io:443
ingest.codecov.io:443
graph.microsoft.com:443
dc.services.visualstudio.com:443
cloud.mongodb.com:443
cli.codecov.io:443
auth.docker.io:443
api.vercel.com:443
api.atlassian.com:443
aka.ms:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -38,7 +69,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ./**
files_ignore: |
@@ -61,26 +92,17 @@ jobs:
contrib/**
**/AGENTS.md
- name: Install Poetry
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
run: pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
uses: ./.github/actions/setup-python-poetry
with:
python-version: ${{ matrix.python-version }}
cache: 'poetry'
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry install --no-root
# AWS Provider
- name: Check if AWS files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-aws
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/aws/**
@@ -187,11 +209,11 @@ jobs:
echo "AWS service_paths='${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}'"
if [ "${STEPS_AWS_SERVICES_OUTPUTS_RUN_ALL}" = "true" ]; then
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
poetry run pytest -p no:randomly -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
elif [ -z "${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}" ]; then
echo "No AWS service paths detected; skipping AWS tests."
else
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml ${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}
poetry run pytest -p no:randomly -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml ${STEPS_AWS_SERVICES_OUTPUTS_SERVICE_PATHS}
fi
env:
STEPS_AWS_SERVICES_OUTPUTS_RUN_ALL: ${{ steps.aws-services.outputs.run_all }}
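The test-selection branch above keys off a derived list of per-service test paths. A rough sketch of how such paths could be derived from a changed-file list — the file names and the mapping rule here are illustrative, not Prowler's actual logic:

```shell
#!/usr/bin/env bash
# Given a newline-separated list of changed files, keep only AWS service
# code, map each file to its tests/ directory, and de-duplicate.
changed_files='prowler/providers/aws/services/ec2/ec2_service.py
prowler/providers/aws/services/s3/s3_service.py
prowler/providers/aws/services/ec2/ec2_client.py
README.md'

service_paths=$(printf '%s\n' "$changed_files" \
  | sed -n 's|^prowler/providers/aws/services/\([^/]*\)/.*|tests/providers/aws/services/\1|p' \
  | sort -u)

if [ -z "$service_paths" ]; then
  echo "No AWS service paths detected; skipping AWS tests."
else
  echo "$service_paths"
fi
```

With a non-empty result, the workflow passes the paths straight to `pytest` so only the affected services are tested.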
@@ -210,7 +232,7 @@ jobs:
- name: Check if Azure files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-azure
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/azure/**
@@ -234,7 +256,7 @@ jobs:
- name: Check if GCP files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-gcp
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/gcp/**
@@ -258,7 +280,7 @@ jobs:
- name: Check if Kubernetes files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-kubernetes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/kubernetes/**
@@ -282,7 +304,7 @@ jobs:
- name: Check if GitHub files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-github
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/github/**
@@ -306,7 +328,7 @@ jobs:
- name: Check if NHN files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-nhn
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/nhn/**
@@ -330,7 +352,7 @@ jobs:
- name: Check if M365 files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-m365
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/m365/**
@@ -354,7 +376,7 @@ jobs:
- name: Check if IaC files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-iac
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/iac/**
@@ -378,7 +400,7 @@ jobs:
- name: Check if MongoDB Atlas files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-mongodbatlas
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/mongodbatlas/**
@@ -402,7 +424,7 @@ jobs:
- name: Check if OCI files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-oraclecloud
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/oraclecloud/**
@@ -426,7 +448,7 @@ jobs:
- name: Check if OpenStack files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-openstack
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/openstack/**
@@ -450,7 +472,7 @@ jobs:
- name: Check if Google Workspace files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-googleworkspace
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/googleworkspace/**
@@ -470,11 +492,35 @@ jobs:
flags: prowler-py${{ matrix.python-version }}-googleworkspace
files: ./googleworkspace_coverage.xml
# Vercel Provider
- name: Check if Vercel files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-vercel
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/**/vercel/**
./tests/**/vercel/**
./poetry.lock
- name: Run Vercel tests
if: steps.changed-vercel.outputs.any_changed == 'true'
run: poetry run pytest -n auto --cov=./prowler/providers/vercel --cov-report=xml:vercel_coverage.xml tests/providers/vercel
- name: Upload Vercel coverage to Codecov
if: steps.changed-vercel.outputs.any_changed == 'true'
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler-py${{ matrix.python-version }}-vercel
files: ./vercel_coverage.xml
# Lib
- name: Check if Lib files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-lib
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/lib/**
@@ -498,7 +544,7 @@ jobs:
- name: Check if Config files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-config
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
./prowler/config/**
+14 -1
@@ -31,6 +31,8 @@ on:
description: "Whether there are UI E2E tests to run"
value: ${{ jobs.analyze.outputs.has-ui-e2e }}
permissions: {}
jobs:
analyze:
runs-on: ubuntu-latest
@@ -45,8 +47,19 @@ jobs:
has-sdk-tests: ${{ steps.set-flags.outputs.has-sdk-tests }}
has-api-tests: ${{ steps.set-flags.outputs.has-api-tests }}
has-ui-e2e: ${{ steps.set-flags.outputs.has-ui-e2e }}
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
pypi.org:443
files.pythonhosted.org:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -55,7 +68,7 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
- name: Setup Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
+20 -3
@@ -13,6 +13,8 @@ env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
permissions: {}
jobs:
detect-release-type:
runs-on: ubuntu-latest
@@ -26,6 +28,11 @@ jobs:
minor_version: ${{ steps.detect.outputs.minor_version }}
patch_version: ${{ steps.detect.outputs.patch_version }}
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Detect release type and parse version
id: detect
run: |
@@ -66,6 +73,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -89,7 +101,7 @@ jobs:
run: |
set -e
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${NEXT_MINOR_VERSION}|" .env
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=.*|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${NEXT_MINOR_VERSION}|" .env
echo "Files modified:"
git --no-pager diff
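The version-bump steps rewrite the UI `.env` in place with `sed -i`; a standalone sketch of the generalized pattern (file contents and version numbers are invented for illustration):

```shell
#!/usr/bin/env bash
set -e
# Create a throwaway .env to exercise the substitution on.
env_file=$(mktemp)
echo 'NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.24.0' > "$env_file"

NEXT_MINOR_VERSION="5.25.0"
# Matching `=.*` instead of the exact old version makes the bump robust:
# it succeeds no matter which version is currently pinned in the file.
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=.*|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${NEXT_MINOR_VERSION}|" "$env_file"
cat "$env_file"
```

This is presumably why the workflow moved from anchoring on `v${PROWLER_VERSION}` to the catch-all `=.*` form: the old pattern silently matched nothing when the pinned version drifted.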
@@ -143,7 +155,7 @@ jobs:
run: |
set -e
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${FIRST_PATCH_VERSION}|" .env
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=.*|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${FIRST_PATCH_VERSION}|" .env
echo "Files modified:"
git --no-pager diff
@@ -179,6 +191,11 @@ jobs:
contents: read
pull-requests: write
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -208,7 +225,7 @@ jobs:
run: |
set -e
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${NEXT_PATCH_VERSION}|" .env
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=.*|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${NEXT_PATCH_VERSION}|" .env
echo "Files modified:"
git --no-pager diff
+12
@@ -26,6 +26,8 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions: {}
jobs:
ui-analyze:
if: github.repository == 'prowler-cloud/prowler'
@@ -44,6 +46,16 @@ jobs:
- 'javascript-typescript'
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
github.com:443
release-assets.githubusercontent.com:443
uploads.github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
+61 -10
@@ -17,9 +17,6 @@ on:
required: true
type: string
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: false
@@ -38,6 +35,8 @@ env:
# Build args
NEXT_PUBLIC_API_BASE_URL: http://prowler-api:8080/api/v1
permissions: {}
jobs:
setup:
if: github.repository == 'prowler-cloud/prowler'
@@ -45,7 +44,14 @@ jobs:
timeout-minutes: 5
outputs:
short-sha: ${{ steps.set-short-sha.outputs.short-sha }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Calculate short SHA
id: set-short-sha
run: echo "short-sha=${GITHUB_SHA::7}" >> $GITHUB_OUTPUT
@@ -57,7 +63,14 @@ jobs:
timeout-minutes: 5
outputs:
message-ts: ${{ steps.slack-notification.outputs.ts }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -96,24 +109,38 @@ jobs:
packages: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
registry-1.docker.io:443
production.cloudflare.docker.com:443
auth.docker.io:443
registry.npmjs.org:443
dl-cdn.alpinelinux.org:443
fonts.googleapis.com:443
fonts.gstatic.com:443
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build and push UI container for ${{ matrix.arch }}
id: container-push
if: github.event_name == 'push' || github.event_name == 'release' || github.event_name == 'workflow_dispatch'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
@@ -131,17 +158,27 @@ jobs:
needs: [setup, container-build-push]
if: always() && needs.setup.result == 'success' && needs.container-build-push.result == 'success'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
release-assets.githubusercontent.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
- name: Login to DockerHub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3.7.0
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
run: |
@@ -183,7 +220,14 @@ jobs:
needs: [setup, notify-release-started, container-build-push, create-manifest]
runs-on: ubuntu-latest
timeout-minutes: 5
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -226,6 +270,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
api.github.com:443
- name: Trigger UI deployment
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4.0.1
with:
+33 -4
@@ -18,6 +18,8 @@ env:
UI_WORKING_DIR: ./ui
IMAGE_NAME: prowler-ui
permissions: {}
jobs:
ui-dockerfile-lint:
if: github.repository == 'prowler-cloud/prowler'
@@ -27,6 +29,13 @@ jobs:
contents: read
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -35,7 +44,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ui/Dockerfile
@@ -65,6 +74,26 @@ jobs:
pull-requests: write
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
registry-1.docker.io:443
auth.docker.io:443
production.cloudflare.docker.com:443
registry.npmjs.org:443
dl-cdn.alpinelinux.org:443
fonts.googleapis.com:443
fonts.gstatic.com:443
api.github.com:443
mirror.gcr.io:443
check.trivy.dev:443
get.trivy.dev:443
release-assets.githubusercontent.com:443
objects.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -73,7 +102,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: ui/**
files_ignore: |
@@ -83,11 +112,11 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build UI container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6.19.2
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: ${{ env.UI_WORKING_DIR }}
target: prod
+23 -8
@@ -15,13 +15,14 @@ on:
- 'ui/**'
- 'api/**' # API changes can affect UI E2E
permissions:
contents: read
permissions: {}
jobs:
# First, analyze which tests need to run
impact-analysis:
if: github.repository == 'prowler-cloud/prowler'
permissions:
contents: read
uses: ./.github/workflows/test-impact-analysis.yml
# Run E2E tests based on impact analysis
@@ -75,8 +76,15 @@ jobs:
# Pass E2E paths from impact analysis
E2E_TEST_PATHS: ${{ needs.impact-analysis.outputs.ui-e2e }}
RUN_ALL_TESTS: ${{ needs.impact-analysis.outputs.run-all }}
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -152,21 +160,21 @@ jobs:
'
- name: Setup Node.js
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: '24.13.0'
- name: Setup pnpm
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
uses: pnpm/action-setup@fc06bc1257f339d1d5d8b3a19a8cae5388b55320 # v5.0.0
with:
version: 10
package_json_file: ui/package.json
run_install: false
- name: Get pnpm store directory
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm and Next.js cache
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: |
${{ env.STORE_PATH }}
@@ -186,7 +194,7 @@ jobs:
run: pnpm run build
- name: Cache Playwright browsers
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
id: playwright-cache
with:
path: ~/.cache/ms-playwright
@@ -253,7 +261,7 @@ jobs:
fi
- name: Upload test reports
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
if: failure()
with:
name: playwright-report
@@ -273,7 +281,14 @@ jobs:
needs.impact-analysis.outputs.has-ui-e2e != 'true' &&
needs.impact-analysis.outputs.run-all != 'true'
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Harden the runner (Audit all outbound calls)
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: audit
- name: No E2E tests needed
run: |
echo "## E2E Tests Skipped" >> $GITHUB_STEP_SUMMARY
+21 -7
@@ -18,6 +18,8 @@ env:
UI_WORKING_DIR: ./ui
NODE_VERSION: '24.13.0'
permissions: {}
jobs:
ui-tests:
runs-on: ubuntu-latest
@@ -29,6 +31,18 @@ jobs:
working-directory: ./ui
steps:
- name: Harden Runner
uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
with:
egress-policy: block
allowed-endpoints: >
github.com:443
registry.npmjs.org:443
fonts.googleapis.com:443
fonts.gstatic.com:443
api.github.com:443
release-assets.githubusercontent.com:443
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
@@ -37,7 +51,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/**
@@ -50,7 +64,7 @@ jobs:
- name: Get changed source files for targeted tests
id: changed-source
if: steps.check-changes.outputs.any_changed == 'true'
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/**/*.ts
@@ -66,7 +80,7 @@ jobs:
- name: Check for critical path changes (run all tests)
id: critical-changes
if: steps.check-changes.outputs.any_changed == 'true'
uses: tj-actions/changed-files@7dee1b0c1557f278e5c7dc244927139d78c0e22a # v47.0.4
uses: tj-actions/changed-files@22103cc46bda19c2b464ffe86db46df6922fd323 # v47.0.5
with:
files: |
ui/lib/**
@@ -78,15 +92,15 @@ jobs:
- name: Setup Node.js ${{ env.NODE_VERSION }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: ${{ env.NODE_VERSION }}
- name: Setup pnpm
if: steps.check-changes.outputs.any_changed == 'true'
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
uses: pnpm/action-setup@fc06bc1257f339d1d5d8b3a19a8cae5388b55320 # v5.0.0
with:
version: 10
package_json_file: ui/package.json
run_install: false
- name: Get pnpm store directory
@@ -96,7 +110,7 @@ jobs:
- name: Setup pnpm and Next.js cache
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v5.0.3
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5.0.4
with:
path: |
${{ env.STORE_PATH }}
+25
@@ -0,0 +1,25 @@
rules:
secrets-outside-env:
ignore:
- api-bump-version.yml
- api-container-build-push.yml
- api-tests.yml
- backport.yml
- docs-bump-version.yml
- issue-triage.lock.yml
- mcp-container-build-push.yml
- pr-merged.yml
- prepare-release.yml
- sdk-bump-version.yml
- sdk-container-build-push.yml
- sdk-refresh-aws-services-regions.yml
- sdk-refresh-oci-regions.yml
- sdk-tests.yml
- ui-bump-version.yml
- ui-container-build-push.yml
- ui-e2e-tests-v2.yml
superfluous-actions:
ignore:
- pr-check-changelog.yml
- pr-conflict-checker.yml
- prepare-release.yml
+2
@@ -60,6 +60,7 @@ htmlcov/
**/mcp-config.json
**/mcpServers.json
.mcp/
.mcp.json
# AI Coding Assistants - Cursor
.cursorignore
@@ -83,6 +84,7 @@ continue.json
.continuerc.json
# AI Coding Assistants - OpenCode
.opencode/
opencode.json
# AI Coding Assistants - GitHub Copilot
+3 -2
@@ -70,7 +70,7 @@ repos:
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
rev: 2.1.1
rev: 2.3.4
hooks:
- id: poetry-check
name: API - poetry-check
@@ -128,7 +128,8 @@ repos:
# TODO: Botocore needs urllib3 1.X so we need to ignore these vulnerabilities 77744,77745. Remove this once we upgrade to urllib3 2.X
# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
# TODO: 86217 because `alibabacloud-tea-openapi == 0.4.3` don't let us upgrade `cryptography >= 46.0.0`
entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353,77744,77745,79023,79027,86217'
# TODO: 71600 CVE-2024-1135 false positive - fixed in gunicorn 22.0.0, project uses 23.0.0
entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353,77744,77745,79023,79027,86217,71600'
language: system
- id: vulture
+1 -1
@@ -13,7 +13,7 @@ build:
post_create_environment:
# Install poetry
# https://python-poetry.org/docs/#installing-manually
- python -m pip install poetry
- python -m pip install poetry==2.3.4
post_install:
# Install dependencies with 'docs' dependency group
# https://python-poetry.org/docs/managing-dependencies/#dependency-groups
+3 -3
@@ -140,7 +140,7 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
| Component | Location | Tech Stack |
|-----------|----------|------------|
| SDK | `prowler/` | Python 3.9+, Poetry |
| SDK | `prowler/` | Python 3.10+, Poetry 2.3+ |
| API | `api/` | Django 5.1, DRF, Celery |
| UI | `ui/` | Next.js 15, React 19, Tailwind 4 |
| MCP Server | `mcp_server/` | FastMCP, Python 3.12+ |
@@ -153,12 +153,12 @@ Prowler is an open-source cloud security assessment tool supporting AWS, Azure,
```bash
# Setup
poetry install --with dev
poetry run pre-commit install
poetry run prek install
# Code quality
poetry run make lint
poetry run make format
poetry run pre-commit run --all-files
poetry run prek run --all-files
```
---
+2 -2
@@ -1,4 +1,4 @@
FROM python:3.12.11-slim-bookworm AS build
FROM python:3.12.11-slim-bookworm@sha256:519591d6871b7bc437060736b9f7456b8731f1499a57e22e6c285135ae657bf7 AS build
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
@@ -68,7 +68,7 @@ ENV HOME='/home/prowler'
ENV PATH="${HOME}/.local/bin:${PATH}"
#hadolint ignore=DL3013
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
pip install --no-cache-dir poetry==2.3.4
RUN poetry install --compile && \
rm -rf ~/.cache/pip
+17 -5
@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler</b> is the Open Cloud Security platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
<b><i>Prowler</b> is the Open Cloud Security Platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
</p>
<p align="center">
<b>Secure ANY cloud at AI Speed at <a href="https://prowler.com">prowler.com</i></b>
@@ -41,7 +41,7 @@
# Description
**Prowler** is the worlds most widely used _open-source cloud security platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
**Prowler** is the worlds most widely used _Open-Source Cloud Security Platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY Cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:
@@ -119,6 +119,7 @@ Every AWS provider scan will enqueue an Attack Paths ingestion job automatically
| Image | N/A | N/A | N/A | N/A | Official | CLI, API |
| Google Workspace | 1 | 1 | 0 | 1 | Official | CLI |
| OpenStack | 27 | 4 | 0 | 8 | Official | UI, API, CLI |
| Vercel | 30 | 6 | 0 | 5 | Official | CLI |
| NHN | 6 | 2 | 1 | 0 | Unofficial | CLI |
> [!Note]
@@ -239,9 +240,17 @@ pnpm start
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
**Pre-commit Hooks Setup**
Some pre-commit hooks require tools installed on your system:
1. **Install [TruffleHog](https://github.com/trufflesecurity/trufflehog#install)** (secret scanning) — see the [official installation options](https://github.com/trufflesecurity/trufflehog#install).
2. **Install [Hadolint](https://github.com/hadolint/hadolint#install)** (Dockerfile linting) — see the [official installation options](https://github.com/hadolint/hadolint#install).
## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >3.9.1, <3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >=3.10, <3.13:
```console
pip install prowler
@@ -273,7 +282,7 @@ The container images are available here:
### From GitHub
Python >3.9.1, <3.13 is required with pip and Poetry:
Python >=3.10, <3.13 is required with pip and Poetry:
``` console
git clone https://github.com/prowler-cloud/prowler
@@ -301,7 +310,10 @@ python prowler-cli.py -v
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
- **Prowler MCP Server**: A Model Context Protocol server that provides AI tools for Lighthouse, the AI-powered security assistant. This is a critical dependency for Lighthouse functionality.
![Prowler App Architecture](docs/products/img/prowler-app-architecture.png)
![Prowler App Architecture](docs/images/products/prowler-app-architecture.png)
<!-- Diagram source: docs/images/products/prowler-app-architecture.mmd — edit there, re-render at https://mermaid.live, and replace the PNG. -->
## Prowler CLI
+103
@@ -2,6 +2,108 @@
All notable changes to the **Prowler API** are documented in this file.
## [1.25.1] (Prowler v5.24.1)
### 🔄 Changed
- Attack Paths: Restore `SYNC_BATCH_SIZE` and `FINDINGS_BATCH_SIZE` defaults to 1000, upgrade Cartography to 0.135.0, enable Celery queue priority for cleanup task, rewrite Finding insertion, remove AWS graph cleanup and add timing logs [(#10729)](https://github.com/prowler-cloud/prowler/pull/10729)
### 🐞 Fixed
- Finding group resources endpoints now include findings without associated resources (orphaned IaC findings) as simulated resource rows, and return one row per finding when multiple findings share a resource [(#10708)](https://github.com/prowler-cloud/prowler/pull/10708)
- Attack Paths: Missing `tenant_id` filter while getting related findings after scan completes [(#10722)](https://github.com/prowler-cloud/prowler/pull/10722)
- Finding group counters `pass_count`, `fail_count` and `manual_count` now exclude muted findings [(#10753)](https://github.com/prowler-cloud/prowler/pull/10753)
- Silent data loss in `ResourceFindingMapping` bulk insert that left findings orphaned when `INSERT ... ON CONFLICT DO NOTHING` dropped rows without raising; added explicit `unique_fields` [(#10724)](https://github.com/prowler-cloud/prowler/pull/10724)
---
## [1.25.0] (Prowler v5.24.0)
### 🔄 Changed
- Bump Poetry to `2.3.4` in Dockerfile and pre-commit hooks. Regenerate `api/poetry.lock` [(#10681)](https://github.com/prowler-cloud/prowler/pull/10681)
- Attack Paths: Remove dead `cleanup_findings` no-op and its supporting `prowler_finding_lastupdated` index [(#10684)](https://github.com/prowler-cloud/prowler/pull/10684)
### 🐞 Fixed
- Worker-beat race condition on cold start: replaced `sleep 15` with API service healthcheck dependency (Docker Compose) and init containers (Helm), aligned Gunicorn default port to `8080` [(#10603)](https://github.com/prowler-cloud/prowler/pull/10603)
- API container startup crash on Linux due to root-owned bind-mount preventing JWT key generation [(#10646)](https://github.com/prowler-cloud/prowler/pull/10646)
### 🔐 Security
- `pytest` from 8.2.2 to 9.0.3 to fix CVE-2025-71176 [(#10678)](https://github.com/prowler-cloud/prowler/pull/10678)
---
## [1.24.0] (Prowler v5.23.0)
### 🚀 Added
- RBAC role lookup filtered by `tenant_id` to prevent cross-tenant privilege leak [(#10491)](https://github.com/prowler-cloud/prowler/pull/10491)
- `VALKEY_SCHEME`, `VALKEY_USERNAME`, and `VALKEY_PASSWORD` environment variables to configure Celery broker TLS/auth connection details for Valkey/ElastiCache [(#10420)](https://github.com/prowler-cloud/prowler/pull/10420)
- `Vercel` provider support [(#10190)](https://github.com/prowler-cloud/prowler/pull/10190)
- Finding groups list and latest endpoints support `sort=delta`, ordering by `new_count` then `changed_count` so groups with the most new findings rank highest [(#10606)](https://github.com/prowler-cloud/prowler/pull/10606)
- Finding group resources endpoints (`/finding-groups/{check_id}/resources` and `/finding-groups/latest/{check_id}/resources`) now expose `finding_id` per row, pointing to the most recent matching Finding for each resource. UUIDv7 ordering guarantees `Max(finding__id)` resolves to the latest snapshot [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Handle CIS and CISA SCuBA compliance framework from google workspace [(#10629)](https://github.com/prowler-cloud/prowler/pull/10629)
- Sort support for all finding group counter fields: `pass_muted_count`, `fail_muted_count`, `manual_muted_count`, and all `new_*`/`changed_*` status-mute breakdown counters [(#10655)](https://github.com/prowler-cloud/prowler/pull/10655)
### 🔄 Changed
- Finding groups list/latest/resources now expose `status` in `{FAIL, PASS, MANUAL}` and `muted: bool` as orthogonal fields. The aggregated `status` reflects the underlying check outcome regardless of mute state, and `muted=true` signals that every finding in the group/resource is muted. New `manual_count` is exposed alongside `pass_count`/`fail_count`, plus `pass_muted_count`/`fail_muted_count`/`manual_muted_count` siblings so clients can isolate the muted half of each status. The `new_*`/`changed_*` deltas are now broken down by status and mute state via 12 new counters (`new_fail_count`, `new_fail_muted_count`, `new_pass_count`, `new_pass_muted_count`, `new_manual_count`, `new_manual_muted_count` and the matching `changed_*` set). New `filter[muted]=true|false` and `sort=status` (FAIL > PASS > MANUAL) / `sort=muted` are supported. `filter[status]=MUTED` is no longer accepted [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Attack Paths: Periodic cleanup of stale scans with dead-worker detection via Celery inspect, marking orphaned `EXECUTING` scans as `FAILED` and recovering `graph_data_ready` [(#10387)](https://github.com/prowler-cloud/prowler/pull/10387)
- Attack Paths: Replace `_provider_id` property with `_Provider_{uuid}` label for provider isolation, add regex-based label injection for custom queries [(#10402)](https://github.com/prowler-cloud/prowler/pull/10402)
### 🐞 Fixed
- `reaggregate_all_finding_group_summaries_task` now refreshes finding group daily summaries for every `(provider, day)` combination instead of only the latest scan per provider, matching the unbounded scope of `mute_historical_findings_task`. Mute rule operations no longer leave older daily summaries drifting from the underlying muted findings [(#10630)](https://github.com/prowler-cloud/prowler/pull/10630)
- Finding groups list/latest now apply computed status/severity filters and finding-level prefilters (delta, region, service, category, resource group, scan, resource type), plus `check_title` support for sort/filter consistency [(#10428)](https://github.com/prowler-cloud/prowler/pull/10428)
- Populate compliance data inside `check_metadata` for findings, which was always returned as `null` [(#10449)](https://github.com/prowler-cloud/prowler/pull/10449)
- 403 error for admin users listing tenants due to roles query not using the admin database connection [(#10460)](https://github.com/prowler-cloud/prowler/pull/10460)
- Filter transient Neo4j defunct connection logs in Sentry `before_send` to suppress false-positive alerts handled by `RetryableSession` retries [(#10452)](https://github.com/prowler-cloud/prowler/pull/10452)
- `MANAGE_ACCOUNT` permission no longer required for listing and creating tenants [(#10468)](https://github.com/prowler-cloud/prowler/pull/10468)
- Finding groups muted filter, counters, metadata extraction and mute reaggregation [(#10477)](https://github.com/prowler-cloud/prowler/pull/10477)
- Finding groups `check_title__icontains` resolution, `name__icontains` resource filter and `resource_group` field in `/resources` response [(#10486)](https://github.com/prowler-cloud/prowler/pull/10486)
- Membership `post_delete` signal using raw FK ids to avoid `DoesNotExist` during cascade deletions [(#10497)](https://github.com/prowler-cloud/prowler/pull/10497)
- Finding group resources endpoints returning false 404 when filters match no results, and `sort` parameter being ignored [(#10510)](https://github.com/prowler-cloud/prowler/pull/10510)
- Jira integration failing with `JiraInvalidIssueTypeError` on non-English Jira instances due to hardcoded `"Task"` issue type; now dynamically fetches available issue types per project [(#10534)](https://github.com/prowler-cloud/prowler/pull/10534)
- Finding group `first_seen_at` now reflects when a new finding appeared in the scan instead of the oldest carry-forward date across all unchanged findings [(#10595)](https://github.com/prowler-cloud/prowler/pull/10595)
- Attack Paths: Remove `clear_cache` call from read-only query endpoints; cache clearing belongs to the scan/ingestion flow, not API queries [(#10586)](https://github.com/prowler-cloud/prowler/pull/10586)
### 🔐 Security
- Pin all unpinned dependencies to exact versions to prevent supply chain attacks and ensure reproducible builds [(#10469)](https://github.com/prowler-cloud/prowler/pull/10469)
- `authlib` bumped from 1.6.6 to 1.6.9 to fix CVE-2026-28802 (JWT `alg: none` validation bypass) [(#10579)](https://github.com/prowler-cloud/prowler/pull/10579)
- `aiohttp` bumped from 3.13.3 to 3.13.5 to fix CVE-2026-34520 (the C parser accepted null bytes and control characters in response headers) [(#10538)](https://github.com/prowler-cloud/prowler/pull/10538)
---
## [1.23.0] (Prowler v5.22.0)
### 🚀 Added
- Finding groups support `check_title` substring filtering [(#10377)](https://github.com/prowler-cloud/prowler/pull/10377)
### 🐞 Fixed
- Finding groups latest endpoint now aggregates the latest snapshot per provider before check-level totals, keeping impacted resources aligned across providers [(#10419)](https://github.com/prowler-cloud/prowler/pull/10419)
- Mute rule creation now triggers finding-group summary re-aggregation after historical muting, keeping stats in sync after mute operations [(#10419)](https://github.com/prowler-cloud/prowler/pull/10419)
- Attack Paths: Deduplicate nodes before ProwlerFinding lookup in Attack Paths Cypher queries, reducing execution time [(#10424)](https://github.com/prowler-cloud/prowler/pull/10424)
### 🔐 Security
- Replace stdlib XML parser with `defusedxml` in SAML metadata parsing to prevent XML bomb (billion laughs) DoS attacks [(#10165)](https://github.com/prowler-cloud/prowler/pull/10165)
- Bump `flask` to 3.1.3 (CVE-2026-27205) and `werkzeug` to 3.1.6 (CVE-2026-27199) [(#10430)](https://github.com/prowler-cloud/prowler/pull/10430)
---
## [1.22.1] (Prowler v5.21.1)
### 🐞 Fixed
- Threat score aggregation query to eliminate unnecessary JOINs and `COUNT(DISTINCT)` overhead [(#10394)](https://github.com/prowler-cloud/prowler/pull/10394)
---
## [1.22.0] (Prowler v5.21.0)
### 🚀 Added
@@ -21,6 +123,7 @@ All notable changes to the **Prowler API** are documented in this file.
### 🔐 Security
- Use `psycopg2.sql` to safely compose DDL in `PostgresEnumMigration`, preventing SQL injection via f-string interpolation [(#10166)](https://github.com/prowler-cloud/prowler/pull/10166)
- Replace stdlib XML parser with `defusedxml` in SAML metadata parsing to prevent XML bomb (billion laughs) DoS attacks [(#10165)](https://github.com/prowler-cloud/prowler/pull/10165)
---
+2 -2
@@ -1,4 +1,4 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12.10-slim-bookworm@sha256:fd95fa221297a88e1cf49c55ec1828edd7c5a428187e67b5d1805692d11588db AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
@@ -71,7 +71,7 @@ RUN mkdir -p /tmp/prowler_api_output
COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
pip install --no-cache-dir poetry==2.3.4
ENV PATH="/home/prowler/.local/bin:$PATH"
+20 -2
@@ -30,14 +30,32 @@ start_prod_server() {
poetry run gunicorn -c config/guniconf.py config.wsgi:application
}
resolve_worker_hostname() {
TASK_ID=""
if [ -n "$ECS_CONTAINER_METADATA_URI_V4" ]; then
TASK_ID=$(wget -qO- --timeout=2 "${ECS_CONTAINER_METADATA_URI_V4}/task" | \
python3 -c "import sys,json; print(json.load(sys.stdin)['TaskARN'].split('/')[-1])" 2>/dev/null)
fi
if [ -z "$TASK_ID" ]; then
TASK_ID=$(python3 -c "import uuid; print(uuid.uuid4().hex)")
fi
echo "${TASK_ID}@$(hostname)"
}
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill,overview,integrations,compliance,attack-paths-scans -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker \
-n "$(resolve_worker_hostname)" \
-l "${DJANGO_LOGGING_LEVEL:-info}" \
-Q celery,scans,scan-reports,deletion,backfill,overview,integrations,compliance,attack-paths-scans \
-E --max-tasks-per-child 1
}
start_worker_beat() {
echo "Starting the worker-beat..."
sleep 15
poetry run python -m celery -A config.celery beat -l "${DJANGO_LOGGING_LEVEL:-info}" --scheduler django_celery_beat.schedulers:DatabaseScheduler
}
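The ECS task-ID lookup in `resolve_worker_hostname` above can be sketched in Python. The payload below is a hypothetical sample of the ECS task metadata (v4 endpoint) response; only the `TaskARN` field matters, and the task ID is the last path segment of the ARN:

```python
import json

# Hypothetical sample of the ECS task metadata v4 payload; the real
# response contains many more fields, but only TaskARN is used here.
payload = '{"TaskARN": "arn:aws:ecs:us-east-1:123456789012:task/cluster/abcdef1234567890"}'

# Mirror of the one-liner in the entrypoint: take the last '/'-segment.
task_id = json.loads(payload)["TaskARN"].split("/")[-1]
print(task_id)  # → abcdef1234567890
```

Falling back to a random UUID when the metadata endpoint is unavailable keeps worker hostnames unique outside ECS, which is the point of the change.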
+419 -377
File diff suppressed because it is too large
+24 -23
@@ -5,43 +5,44 @@ requires = ["poetry-core"]
[project]
authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery (>=5.4.0,<6.0.0)",
"celery (==5.6.2)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django (==5.1.15)",
"django-allauth[saml] (>=65.13.0,<66.0.0)",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-allauth[saml] (==65.15.0)",
"django-celery-beat (==2.9.0)",
"django-celery-results (==2.6.0)",
"django-cors-headers==4.4.0",
"django-environ==0.11.2",
"django-filter==24.3",
"django-guid==3.5.0",
"django-postgres-extra (>=2.0.8,<3.0.0)",
"django-postgres-extra (==2.0.9)",
"djangorestframework==3.15.2",
"djangorestframework-jsonapi==7.0.2",
"djangorestframework-simplejwt (>=5.3.1,<6.0.0)",
"drf-nested-routers (>=0.94.1,<1.0.0)",
"djangorestframework-simplejwt (==5.5.1)",
"drf-nested-routers (==0.95.0)",
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"defusedxml==0.7.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.24",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"pytest-celery[redis] (==1.3.0)",
"sentry-sdk[django] (==2.56.0)",
"uuid6==2024.7.10",
"openai (>=1.82.0,<2.0.0)",
"openai (==1.109.1)",
"xmlsec==1.3.14",
"h2 (==4.3.0)",
"markdown (>=3.9,<4.0)",
"markdown (==3.10.2)",
"drf-simple-apikey (==2.2.1)",
"matplotlib (>=3.10.6,<4.0.0)",
"reportlab (>=4.4.4,<5.0.0)",
"neo4j (>=6.0.0,<7.0.0)",
"cartography (==0.132.0)",
"gevent (>=25.9.1,<26.0.0)",
"werkzeug (>=3.1.4)",
"sqlparse (>=0.5.4)",
"fonttools (>=4.60.2)"
"matplotlib (==3.10.8)",
"reportlab (==4.4.10)",
"neo4j (==6.1.0)",
"cartography (==0.135.0)",
"gevent (==25.9.1)",
"werkzeug (==3.1.7)",
"sqlparse (==0.5.5)",
"fonttools (==4.62.1)"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -49,7 +50,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.22.0"
version = "1.25.1"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -61,10 +62,9 @@ django-silk = "5.3.2"
docker = "7.1.0"
filelock = "3.20.3"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"
pytest = "9.0.3"
pytest-cov = "5.0.0"
pytest-django = "4.8.0"
pytest-env = "1.1.3"
@@ -74,3 +74,4 @@ ruff = "0.5.0"
safety = "3.7.0"
tqdm = "4.67.1"
vulture = "2.14"
prek = "0.3.9"
@@ -0,0 +1,170 @@
"""
Cypher sanitizer for custom (user-supplied) Attack Paths queries.
Two responsibilities:
1. **Validation** - reject queries containing SSRF or dangerous procedure
patterns (defense-in-depth; the primary control is ``neo4j.READ_ACCESS``).
2. **Provider-scoped label injection** - inject a dynamic
``_Provider_{uuid}`` label into every node pattern so the database can
use its native label index for provider isolation.
Label-injection pipeline:
1. **Protect** string literals and line comments (placeholder replacement).
2. **Split** by top-level clause keywords to track clause context.
3. **Pass A** - inject into *labeled* node patterns in ALL segments.
4. **Pass B** - inject into *bare* node patterns in MATCH segments only.
5. **Restore** protected regions.
"""
import re
from rest_framework.exceptions import ValidationError
from tasks.jobs.attack_paths.config import get_provider_label
# Step 1 - String / comment protection
# Single combined regex: strings first, then line comments.
# The regex engine finds the leftmost match, so a string like 'https://prowler.com'
# is consumed as a string before the // inside it can match as a comment.
_PROTECTED_RE = re.compile(r"'(?:[^'\\]|\\.)*'|\"(?:[^\"\\]|\\.)*\"|//[^\n]*")
# Step 2 - Clause splitting
# OPTIONAL MATCH must come before MATCH to avoid partial matching.
_CLAUSE_RE = re.compile(
r"\b(OPTIONAL\s+MATCH|MATCH|WHERE|RETURN|WITH|ORDER\s+BY"
r"|SKIP|LIMIT|UNION|UNWIND|CALL)\b",
re.IGNORECASE,
)
# Pass A - Labeled node patterns (all segments)
# Matches node patterns that have at least one :Label.
# (?<!\w)\( - open paren NOT preceded by a word char (excludes function calls).
# Group 1: optional variable + one or more :Label
# Group 2: optional {properties} + closing paren
_LABELED_NODE_RE = re.compile(
r"(?<!\w)\("
r"("
r"\s*(?:[a-zA-Z_]\w*)?"
r"(?:\s*:\s*(?:`[^`]*`|[a-zA-Z_]\w*))+"
r")"
r"("
r"\s*(?:\{[^}]*\})?"
r"\s*\)"
r")"
)
# Pass B - Bare node patterns (MATCH segments only)
# Matches (identifier) or (identifier {properties}) without any :Label.
# Only applied in MATCH/OPTIONAL MATCH segments.
_BARE_NODE_RE = re.compile(
r"(?<!\w)\(" r"(\s*[a-zA-Z_]\w*)" r"(\s*(?:\{[^}]*\})?)" r"\s*\)"
)
_MATCH_CLAUSES = frozenset({"MATCH", "OPTIONAL MATCH"})
def _inject_labeled(segment: str, label: str) -> str:
"""Inject provider label into all node patterns that have existing labels."""
return _LABELED_NODE_RE.sub(rf"(\1:{label}\2", segment)
def _inject_bare(segment: str, label: str) -> str:
"""Inject provider label into bare `(identifier)` node patterns."""
def _replace(match):
var = match.group(1)
props = match.group(2).strip()
if props:
return f"({var}:{label} {props})"
return f"({var}:{label})"
return _BARE_NODE_RE.sub(_replace, segment)
def inject_provider_label(cypher: str, provider_id: str) -> str:
"""Rewrite a Cypher query to scope every node pattern to a provider.
Args:
cypher: The original Cypher query string.
provider_id: The provider UUID (will be converted to a label via
`get_provider_label`).
Returns:
The rewritten Cypher with `:_Provider_{uuid}` appended to every
node pattern.
"""
label = get_provider_label(provider_id)
# Step 1: Protect strings and comments (single pass, leftmost-first)
protected: list[str] = []
def _save(match):
protected.append(match.group(0))
return f"\x00P{len(protected) - 1}\x00"
work = _PROTECTED_RE.sub(_save, cypher)
# Step 2: Split by clause keywords
parts = _CLAUSE_RE.split(work)
# Steps 3-4: Apply injection passes per segment
result: list[str] = []
current_clause: str | None = None
for i, part in enumerate(parts):
if i % 2 == 1:
# Keyword token - normalize for clause tracking
current_clause = re.sub(r"\s+", " ", part.strip()).upper()
result.append(part)
else:
# Content segment - apply injection based on clause context
part = _inject_labeled(part, label)
if current_clause in _MATCH_CLAUSES:
part = _inject_bare(part, label)
result.append(part)
work = "".join(result)
# Step 5: Restore protected regions
for i, original in enumerate(protected):
work = work.replace(f"\x00P{i}\x00", original)
return work
# ---------------------------------------------------------------------------
# Validation
# ---------------------------------------------------------------------------
# Patterns that indicate SSRF or dangerous procedure calls
# Defense-in-depth layer - the primary control is `neo4j.READ_ACCESS`
_BLOCKED_PATTERNS = [
re.compile(r"\bLOAD\s+CSV\b", re.IGNORECASE),
re.compile(r"\bapoc\.load\b", re.IGNORECASE),
re.compile(r"\bapoc\.import\b", re.IGNORECASE),
re.compile(r"\bapoc\.export\b", re.IGNORECASE),
re.compile(r"\bapoc\.cypher\b", re.IGNORECASE),
re.compile(r"\bapoc\.systemdb\b", re.IGNORECASE),
re.compile(r"\bapoc\.config\b", re.IGNORECASE),
re.compile(r"\bapoc\.periodic\b", re.IGNORECASE),
re.compile(r"\bapoc\.do\b", re.IGNORECASE),
re.compile(r"\bapoc\.trigger\b", re.IGNORECASE),
re.compile(r"\bapoc\.custom\b", re.IGNORECASE),
]
def validate_custom_query(cypher: str) -> None:
"""Reject queries containing known SSRF or dangerous procedure patterns.
Raises ValidationError if a blocked pattern is found.
String literals and comments are stripped before matching to avoid
false positives.
"""
stripped = _PROTECTED_RE.sub("", cypher)
for pattern in _BLOCKED_PATTERNS:
if pattern.search(stripped):
raise ValidationError({"query": "Query contains a blocked operation"})
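The protect/inject/restore pipeline described in the module docstring above can be illustrated with a minimal standalone sketch. This is a simplified re-implementation for illustration only (it covers just Pass A, labeled node patterns, and omits clause splitting and Pass B):

```python
import re

# Same shape as the regexes in the diff: protect string literals and
# line comments, then match node patterns that carry at least one :Label.
_PROTECTED_RE = re.compile(r"'(?:[^'\\]|\\.)*'|\"(?:[^\"\\]|\\.)*\"|//[^\n]*")
_LABELED_NODE_RE = re.compile(
    r"(?<!\w)\("
    r"(\s*(?:[a-zA-Z_]\w*)?(?:\s*:\s*(?:`[^`]*`|[a-zA-Z_]\w*))+)"
    r"(\s*(?:\{[^}]*\})?\s*\))"
)

def inject(cypher: str, label: str) -> str:
    protected: list[str] = []

    def _save(m: re.Match) -> str:
        # Step 1: replace literals/comments with opaque placeholders so
        # a '//' inside a string is never treated as a comment.
        protected.append(m.group(0))
        return f"\x00P{len(protected) - 1}\x00"

    work = _PROTECTED_RE.sub(_save, cypher)
    # Step 2 (Pass A): append the provider label to every labeled node.
    work = _LABELED_NODE_RE.sub(rf"(\1:{label}\2", work)
    # Step 3: restore the protected regions verbatim.
    for i, original in enumerate(protected):
        work = work.replace(f"\x00P{i}\x00", original)
    return work

print(inject("MATCH (u:User {name: 'a//b'}) RETURN u", "_Provider_abc"))
# → MATCH (u:User:_Provider_abc {name: 'a//b'}) RETURN u
```

Note how the string literal containing `//` survives untouched: protecting literals before injection is what makes the regex-based rewrite safe on realistic queries.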
+7 -13
@@ -11,8 +11,8 @@ from config.env import env
from django.conf import settings
from tasks.jobs.attack_paths.config import (
BATCH_SIZE,
PROVIDER_ID_PROPERTY,
PROVIDER_RESOURCE_LABEL,
get_provider_label,
)
from api.attack_paths.retryable_session import RetryableSession
@@ -163,11 +163,8 @@ def drop_subgraph(database: str, provider_id: str) -> int:
Uses batched deletion to avoid memory issues with large graphs.
Silently returns 0 if the database doesn't exist.
"""
provider_label = get_provider_label(provider_id)
deleted_nodes = 0
parameters = {
"provider_id": provider_id,
"batch_size": BATCH_SIZE,
}
try:
with get_session(database) as session:
@@ -175,12 +172,12 @@ def drop_subgraph(database: str, provider_id: str) -> int:
while deleted_count > 0:
result = session.run(
f"""
MATCH (n:{PROVIDER_RESOURCE_LABEL} {{{PROVIDER_ID_PROPERTY}: $provider_id}})
MATCH (n:{PROVIDER_RESOURCE_LABEL}:`{provider_label}`)
WITH n LIMIT $batch_size
DETACH DELETE n
RETURN COUNT(n) AS deleted_nodes_count
""",
parameters,
{"batch_size": BATCH_SIZE},
)
deleted_count = result.single().get("deleted_nodes_count", 0)
deleted_nodes += deleted_count
@@ -199,15 +196,12 @@ def has_provider_data(database: str, provider_id: str) -> bool:
Returns `False` if the database doesn't exist.
"""
query = (
f"MATCH (n:{PROVIDER_RESOURCE_LABEL} "
f"{{{PROVIDER_ID_PROPERTY}: $provider_id}}) "
"RETURN 1 LIMIT 1"
)
provider_label = get_provider_label(provider_id)
query = f"MATCH (n:{PROVIDER_RESOURCE_LABEL}:`{provider_label}`) RETURN 1 LIMIT 1"
try:
with get_session(database, default_access_mode=neo4j.READ_ACCESS) as session:
result = session.run(query, {"provider_id": provider_id})
result = session.run(query)
return result.single() is not None
except GraphDatabaseQueryException as exc:
File diff suppressed because it is too large
@@ -1,13 +1,18 @@
from tasks.jobs.attack_paths.config import PROVIDER_ID_PROPERTY, PROVIDER_RESOURCE_LABEL
from tasks.jobs.attack_paths.config import PROVIDER_RESOURCE_LABEL, get_provider_label
def get_cartography_schema_query(provider_id: str) -> str:
"""Build the Cartography schema metadata query scoped to a provider label."""
provider_label = get_provider_label(provider_id)
return f"""
MATCH (n:{PROVIDER_RESOURCE_LABEL}:`{provider_label}`)
WHERE n._module_name STARTS WITH 'cartography:'
AND NOT n._module_name IN ['cartography:ontology', 'cartography:prowler']
AND n._module_version IS NOT NULL
RETURN n._module_name AS module_name, n._module_version AS module_version
LIMIT 1
"""
CARTOGRAPHY_SCHEMA_METADATA = f"""
MATCH (n:{PROVIDER_RESOURCE_LABEL} {{{PROVIDER_ID_PROPERTY}: $provider_id}})
WHERE n._module_name STARTS WITH 'cartography:'
AND NOT n._module_name IN ['cartography:ontology', 'cartography:prowler']
AND n._module_version IS NOT NULL
RETURN n._module_name AS module_name, n._module_version AS module_version
LIMIT 1
"""
GITHUB_SCHEMA_URL = (
"https://github.com/cartography-cncf/cartography/blob/"
@@ -1,22 +1,26 @@
import logging
import re
from typing import Any, Iterable
import neo4j
from rest_framework.exceptions import APIException, PermissionDenied, ValidationError
from api.attack_paths import database as graph_database, AttackPathsQueryDefinition
from api.attack_paths.cypher_sanitizer import (
inject_provider_label,
validate_custom_query,
)
from api.attack_paths.queries.schema import (
CARTOGRAPHY_SCHEMA_METADATA,
GITHUB_SCHEMA_URL,
RAW_SCHEMA_URL,
get_cartography_schema_query,
)
from config.custom_logging import BackendLogger
from tasks.jobs.attack_paths.config import (
INTERNAL_LABELS,
INTERNAL_PROPERTIES,
PROVIDER_ID_PROPERTY,
get_provider_label,
is_dynamic_isolation_label,
)
@@ -72,7 +76,6 @@ def prepare_parameters(
clean_parameters = {
"provider_uid": str(provider_uid),
"provider_id": str(provider_id),
}
for definition_parameter in definition.parameters:
@@ -123,38 +126,6 @@ def execute_query(
# Custom query helpers
# Patterns that indicate SSRF or dangerous procedure calls
# Defense-in-depth layer - the primary control is `neo4j.READ_ACCESS`
_BLOCKED_PATTERNS = [
re.compile(r"\bLOAD\s+CSV\b", re.IGNORECASE),
re.compile(r"\bapoc\.load\b", re.IGNORECASE),
re.compile(r"\bapoc\.import\b", re.IGNORECASE),
re.compile(r"\bapoc\.export\b", re.IGNORECASE),
re.compile(r"\bapoc\.cypher\b", re.IGNORECASE),
re.compile(r"\bapoc\.systemdb\b", re.IGNORECASE),
re.compile(r"\bapoc\.config\b", re.IGNORECASE),
re.compile(r"\bapoc\.periodic\b", re.IGNORECASE),
re.compile(r"\bapoc\.do\b", re.IGNORECASE),
re.compile(r"\bapoc\.trigger\b", re.IGNORECASE),
re.compile(r"\bapoc\.custom\b", re.IGNORECASE),
]
# Strip string literals so patterns inside quotes don't cause false positives
# Handles escaped quotes (\' and \") inside strings
_STRING_LITERALS = re.compile(r"'(?:[^'\\]|\\.)*'|\"(?:[^\"\\]|\\.)*\"")
def validate_custom_query(cypher: str) -> None:
"""Reject queries containing known SSRF or dangerous procedure patterns.
Raises ValidationError if a blocked pattern is found.
String literals are stripped before matching to avoid false positives.
"""
stripped = _STRING_LITERALS.sub("", cypher)
for pattern in _BLOCKED_PATTERNS:
if pattern.search(stripped):
raise ValidationError({"query": "Query contains a blocked operation"})
def normalize_custom_query_payload(raw_data):
if not isinstance(raw_data, dict):
@@ -173,7 +144,15 @@ def execute_custom_query(
cypher: str,
provider_id: str,
) -> dict[str, Any]:
# Defense-in-depth for custom queries:
# 1. neo4j.READ_ACCESS — prevents mutations at the driver level
# 2. inject_provider_label() — regex-based label injection scopes node patterns
# 3. _serialize_graph() — post-query filter drops nodes without the provider label
#
# Layer 2 is best-effort (regex can't fully parse Cypher);
# layer 3 is the safety net that guarantees provider isolation.
validate_custom_query(cypher)
cypher = inject_provider_label(cypher, provider_id)
try:
graph = graph_database.execute_read_query(
@@ -208,10 +187,7 @@ def get_cartography_schema(
with graph_database.get_session(
database_name, default_access_mode=neo4j.READ_ACCESS
) as session:
result = session.run(
CARTOGRAPHY_SCHEMA_METADATA,
{"provider_id": provider_id},
)
result = session.run(get_cartography_schema_query(provider_id))
record = result.single()
except graph_database.GraphDatabaseQueryException as exc:
logger.error(f"Cartography schema query failed: {exc}")
@@ -255,10 +231,12 @@ def _truncate_graph(graph: dict[str, Any]) -> dict[str, Any]:
def _serialize_graph(graph, provider_id: str) -> dict[str, Any]:
provider_label = get_provider_label(provider_id)
nodes = []
kept_node_ids = set()
for node in graph.nodes:
if node._properties.get(PROVIDER_ID_PROPERTY) != provider_id:
if provider_label not in node.labels:
continue
kept_node_ids.add(node.element_id)
@@ -273,14 +251,11 @@ def _serialize_graph(graph, provider_id: str) -> dict[str, Any]:
filtered_count = len(graph.nodes) - len(nodes)
if filtered_count > 0:
logger.debug(
f"Filtered {filtered_count} nodes without matching provider_id={provider_id}"
f"Filtered {filtered_count} nodes without provider label {provider_label}"
)
relationships = []
for relationship in graph.relationships:
if relationship._properties.get(PROVIDER_ID_PROPERTY) != provider_id:
continue
if (
relationship.start_node.element_id not in kept_node_ids
or relationship.end_node.element_id not in kept_node_ids
+17 -26
@@ -1,10 +1,10 @@
from django.conf import settings
from django.core.exceptions import ObjectDoesNotExist
from django.db import transaction
from rest_framework import permissions
from rest_framework.exceptions import NotAuthenticated
from rest_framework.filters import SearchFilter
from rest_framework.permissions import SAFE_METHODS
from rest_framework.response import Response
from rest_framework_json_api import filters
from rest_framework_json_api.views import ModelViewSet
@@ -12,7 +12,7 @@ from api.authentication import CombinedJWTOrAPIKeyAuthentication
from api.db_router import MainRouter, reset_read_db_alias, set_read_db_alias
from api.db_utils import POSTGRES_USER_VAR, rls_transaction
from api.filters import CustomDjangoFilterBackend
from api.models import Role, Tenant
from api.models import Role, UserRoleRelationship
from api.rbac.permissions import HasPermissions
@@ -113,27 +113,22 @@ class BaseTenantViewset(BaseViewSet):
if request is not None:
request.db_alias = self.db_alias
with transaction.atomic(using=self.db_alias):
tenant = super().dispatch(request, *args, **kwargs)
try:
# If the request is a POST, create the admin role
if request.method == "POST":
isinstance(tenant, dict) and self._create_admin_role(
tenant.data["id"]
)
except Exception as e:
self._handle_creation_error(e, tenant)
raise
return tenant
if request.method == "POST":
with transaction.atomic(using=MainRouter.admin_db):
tenant = super().dispatch(request, *args, **kwargs)
if isinstance(tenant, Response) and tenant.status_code == 201:
self._create_admin_role(tenant.data["id"])
return tenant
else:
with transaction.atomic(using=self.db_alias):
return super().dispatch(request, *args, **kwargs)
finally:
if alias_token is not None:
reset_read_db_alias(alias_token)
self.db_alias = MainRouter.default_db
def _create_admin_role(self, tenant_id):
Role.objects.using(MainRouter.admin_db).create(
admin_role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant_id,
manage_users=True,
@@ -144,15 +139,11 @@ class BaseTenantViewset(BaseViewSet):
manage_scans=True,
unlimited_visibility=True,
)
def _handle_creation_error(self, error, tenant):
if tenant.data.get("id"):
try:
Tenant.objects.using(MainRouter.admin_db).filter(
id=tenant.data["id"]
).delete()
except ObjectDoesNotExist:
pass # Tenant might not exist, handle gracefully
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=self.request.user,
role=admin_role,
tenant_id=tenant_id,
)
def initial(self, request, *args, **kwargs):
if request.auth is None:
+172 -12
@@ -15,6 +15,7 @@ from django_filters.rest_framework import (
from rest_framework_json_api.django_filters.backends import DjangoFilterBackend
from rest_framework_json_api.serializers import ValidationError
from api.constants import SEVERITY_ORDER
from api.db_utils import (
FindingDeltaEnumField,
InvitationStateEnumField,
@@ -43,6 +44,7 @@ from api.models import (
ProviderGroup,
ProviderSecret,
Resource,
ResourceFindingMapping,
ResourceTag,
Role,
Scan,
@@ -196,17 +198,13 @@ class CommonFindingFilters(FilterSet):
field_name="resource_services", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_uid = CharFilter(method="filter_resource_uid")
resource_uid__in = CharInFilter(method="filter_resource_uid_in")
resource_uid__icontains = CharFilter(method="filter_resource_uid_icontains")
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_name = CharFilter(method="filter_resource_name")
resource_name__in = CharInFilter(method="filter_resource_name_in")
resource_name__icontains = CharFilter(method="filter_resource_name_icontains")
resource_type = CharFilter(method="filter_resource_type")
resource_type__in = CharInFilter(field_name="resource_types", lookup_expr="overlap")
@@ -264,6 +262,52 @@ class CommonFindingFilters(FilterSet):
)
return queryset.filter(overall_query).distinct()
def filter_check_title_icontains(self, queryset, name, value):
# Resolve from the summary table (has check_title column + trigram
# GIN index) instead of scanning JSON in the findings table.
matching_check_ids = (
FindingGroupDailySummary.objects.filter(
check_title__icontains=value,
)
.values_list("check_id", flat=True)
.distinct()
)
return queryset.filter(check_id__in=matching_check_ids)
# --- Resource subquery filters ---
# Resolve resource → RFM → finding_ids first, then filter findings
# by id__in. This avoids a 3-way JOIN driven from the (huge)
# findings side and lets PostgreSQL start from the resources
# unique-constraint index instead.
@staticmethod
def _finding_ids_for_resources(**lookup):
return ResourceFindingMapping.objects.filter(
resource__in=Resource.objects.filter(**lookup).values("id")
).values("finding_id")
def filter_resource_uid(self, queryset, name, value):
return queryset.filter(id__in=self._finding_ids_for_resources(uid=value))
def filter_resource_uid_in(self, queryset, name, value):
return queryset.filter(id__in=self._finding_ids_for_resources(uid__in=value))
def filter_resource_uid_icontains(self, queryset, name, value):
return queryset.filter(
id__in=self._finding_ids_for_resources(uid__icontains=value)
)
def filter_resource_name(self, queryset, name, value):
return queryset.filter(id__in=self._finding_ids_for_resources(name=value))
def filter_resource_name_in(self, queryset, name, value):
return queryset.filter(id__in=self._finding_ids_for_resources(name__in=value))
def filter_resource_name_icontains(self, queryset, name, value):
return queryset.filter(
id__in=self._finding_ids_for_resources(name__icontains=value)
)
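The comment block above describes the join-avoidance strategy: resolve resources first, map them to finding ids through the mapping table, then filter findings with `id__in`. A pure-Python simulation of those three steps, with toy dictionaries standing in for the `Resource`, `ResourceFindingMapping`, and `Finding` tables (names and data are illustrative only):

```python
# Toy rows standing in for Resource (id -> uid), ResourceFindingMapping
# (resource_id, finding_id), and Finding (id -> status).
resources = {"r1": "arn:aws:s3:::logs", "r2": "arn:aws:ec2:::web"}
mapping = [("r1", "f1"), ("r1", "f2"), ("r2", "f3")]
findings = {"f1": "FAIL", "f2": "PASS", "f3": "FAIL", "f4": "FAIL"}

def finding_ids_for_resources(predicate):
    # Step 1: resolve matching resource ids (the small, indexed side).
    resource_ids = {rid for rid, uid in resources.items() if predicate(uid)}
    # Step 2: map them to finding ids through the mapping table.
    return {fid for rid, fid in mapping if rid in resource_ids}

# Step 3: filter findings by id__in rather than joining outward from
# the (huge) findings table.
ids = finding_ids_for_resources(lambda uid: "s3" in uid)
matched = {fid: status for fid, status in findings.items() if fid in ids}
```

Driving the lookup from the resources side is the whole point: the database can start from a small, selective index instead of scanning findings.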
class TenantFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
@@ -390,6 +434,7 @@ class ScanFilter(ProviderRelationshipFilterSet):
class Meta:
model = Scan
fields = {
"id": ["exact", "in"],
"provider": ["exact", "in"],
"name": ["exact", "icontains"],
"started_at": ["gte", "lte"],
@@ -803,11 +848,15 @@ class FindingGroupFilter(CommonFindingFilters):
check_id = CharFilter(field_name="check_id", lookup_expr="exact")
check_id__in = CharInFilter(field_name="check_id", lookup_expr="in")
check_id__icontains = CharFilter(field_name="check_id", lookup_expr="icontains")
check_title__icontains = CharFilter(method="filter_check_title_icontains")
scan = UUIDFilter(field_name="scan_id", lookup_expr="exact")
scan__in = UUIDInFilter(field_name="scan_id", lookup_expr="in")
class Meta:
model = Finding
fields = {
"check_id": ["exact", "in", "icontains"],
"scan": ["exact", "in"],
}
def filter_queryset(self, queryset):
@@ -895,15 +944,31 @@ class LatestFindingGroupFilter(CommonFindingFilters):
check_id = CharFilter(field_name="check_id", lookup_expr="exact")
check_id__in = CharInFilter(field_name="check_id", lookup_expr="in")
check_id__icontains = CharFilter(field_name="check_id", lookup_expr="icontains")
check_title__icontains = CharFilter(method="filter_check_title_icontains")
scan = UUIDFilter(field_name="scan_id", lookup_expr="exact")
scan__in = UUIDInFilter(field_name="scan_id", lookup_expr="in")
class Meta:
model = Finding
fields = {
"check_id": ["exact", "in", "icontains"],
"scan": ["exact", "in"],
}
class FindingGroupSummaryFilter(FilterSet):
class _CheckTitleToCheckIdMixin:
"""Resolve check_title search to check_ids so all provider rows are kept."""
def filter_check_title_to_check_ids(self, queryset, name, value):
matching_check_ids = (
queryset.filter(check_title__icontains=value)
.values_list("check_id", flat=True)
.distinct()
)
return queryset.filter(check_id__in=matching_check_ids)
class FindingGroupSummaryFilter(_CheckTitleToCheckIdMixin, FilterSet):
"""
Filter for FindingGroupDailySummary queries.
@@ -926,6 +991,7 @@ class FindingGroupSummaryFilter(FilterSet):
check_id = CharFilter(field_name="check_id", lookup_expr="exact")
check_id__in = CharInFilter(field_name="check_id", lookup_expr="in")
check_id__icontains = CharFilter(field_name="check_id", lookup_expr="icontains")
check_title__icontains = CharFilter(method="filter_check_title_to_check_ids")
# Provider filters
provider_id = UUIDFilter(field_name="provider_id", lookup_expr="exact")
@@ -1013,7 +1079,7 @@ class FindingGroupSummaryFilter(FilterSet):
return dt
class LatestFindingGroupSummaryFilter(FilterSet):
class LatestFindingGroupSummaryFilter(_CheckTitleToCheckIdMixin, FilterSet):
"""
Filter for FindingGroupDailySummary /latest endpoint.
@@ -1025,6 +1091,7 @@ class LatestFindingGroupSummaryFilter(FilterSet):
check_id = CharFilter(field_name="check_id", lookup_expr="exact")
check_id__in = CharInFilter(field_name="check_id", lookup_expr="in")
check_id__icontains = CharFilter(field_name="check_id", lookup_expr="icontains")
check_title__icontains = CharFilter(method="filter_check_title_to_check_ids")
# Provider filters
provider_id = UUIDFilter(field_name="provider_id", lookup_expr="exact")
@@ -1042,6 +1109,99 @@ class LatestFindingGroupSummaryFilter(FilterSet):
}
class FindingGroupAggregatedComputedFilter(FilterSet):
"""Filter aggregated finding-group rows by computed status/severity/muted."""
STATUS_CHOICES = (
("FAIL", "Fail"),
("PASS", "Pass"),
("MANUAL", "Manual"),
)
status = ChoiceFilter(method="filter_status", choices=STATUS_CHOICES)
status__in = CharInFilter(method="filter_status_in", lookup_expr="in")
severity = ChoiceFilter(method="filter_severity", choices=SeverityChoices)
severity__in = CharInFilter(method="filter_severity_in", lookup_expr="in")
muted = BooleanFilter(field_name="muted")
include_muted = BooleanFilter(method="filter_include_muted")
def filter_status(self, queryset, name, value):
return queryset.filter(aggregated_status=value)
def filter_status_in(self, queryset, name, value):
values = value
if isinstance(value, str):
values = [part.strip() for part in value.split(",") if part.strip()]
allowed = {choice[0] for choice in self.STATUS_CHOICES}
invalid = [
status_value for status_value in values if status_value not in allowed
]
if invalid:
raise ValidationError(
[
{
"detail": f"invalid status filter: {invalid[0]}",
"status": "400",
"source": {"pointer": "/data"},
"code": "invalid",
}
]
)
if not values:
return queryset
return queryset.filter(aggregated_status__in=values)
def filter_severity(self, queryset, name, value):
severity_order = SEVERITY_ORDER.get(value)
if severity_order is None:
raise ValidationError(
[
{
"detail": f"invalid severity filter: {value}",
"status": "400",
"source": {"pointer": "/data"},
"code": "invalid",
}
]
)
return queryset.filter(severity_order=severity_order)
def filter_severity_in(self, queryset, name, value):
values = value
if isinstance(value, str):
values = [part.strip() for part in value.split(",") if part.strip()]
orders = []
for severity_value in values:
severity_order = SEVERITY_ORDER.get(severity_value)
if severity_order is None:
raise ValidationError(
[
{
"detail": f"invalid severity filter: {severity_value}",
"status": "400",
"source": {"pointer": "/data"},
"code": "invalid",
}
]
)
orders.append(severity_order)
if not orders:
return queryset
return queryset.filter(severity_order__in=orders)
def filter_include_muted(self, queryset, name, value):
if value is True:
return queryset
# include_muted=false: exclude fully-muted groups
return queryset.exclude(muted=True)
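The `filter_status_in` method above accepts either a list or a comma-separated string, trims blanks, and rejects any value outside the allowed set before touching the queryset. A standalone sketch of that parse-and-validate step, with `ValueError` standing in for DRF's `ValidationError` (an assumption for self-containment):

```python
ALLOWED_STATUSES = {"FAIL", "PASS", "MANUAL"}

def parse_status_in(value):
    """Split a comma-separated filter value and reject unknown statuses,
    mirroring filter_status_in above (ValueError stands in for DRF's
    ValidationError)."""
    values = value
    if isinstance(value, str):
        values = [part.strip() for part in value.split(",") if part.strip()]
    invalid = [v for v in values if v not in ALLOWED_STATUSES]
    if invalid:
        raise ValueError(f"invalid status filter: {invalid[0]}")
    return values

ok = parse_status_in("FAIL, PASS")
try:
    parse_status_in("FAIL,BOGUS")
    raised = False
except ValueError:
    raised = True
```

Validating before filtering means a bad value produces a 400 response instead of silently matching zero rows.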
class ProviderSecretFilter(FilterSet):
inserted_at = DateFilter(
field_name="inserted_at",
@@ -0,0 +1,31 @@
# Generated by Django 5.1.15 on 2026-03-18
from django.contrib.postgres.indexes import GinIndex, OpClass
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations
from django.db.models.functions import Upper
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0084_googleworkspace_provider"),
]
operations = [
AddIndexConcurrently(
model_name="findinggroupdailysummary",
index=GinIndex(
OpClass(Upper("check_id"), name="gin_trgm_ops"),
name="fgds_check_id_trgm_idx",
),
),
AddIndexConcurrently(
model_name="findinggroupdailysummary",
index=GinIndex(
OpClass(Upper("check_title"), name="gin_trgm_ops"),
name="fgds_check_title_trgm_idx",
),
),
]
@@ -0,0 +1,49 @@
from django.db import migrations
TASK_NAME = "attack-paths-cleanup-stale-scans"
INTERVAL_HOURS = 1
def create_periodic_task(apps, schema_editor):
IntervalSchedule = apps.get_model("django_celery_beat", "IntervalSchedule")
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
schedule, _ = IntervalSchedule.objects.get_or_create(
every=INTERVAL_HOURS,
period="hours",
)
PeriodicTask.objects.update_or_create(
name=TASK_NAME,
defaults={
"task": TASK_NAME,
"interval": schedule,
"enabled": True,
},
)
def delete_periodic_task(apps, schema_editor):
IntervalSchedule = apps.get_model("django_celery_beat", "IntervalSchedule")
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
PeriodicTask.objects.filter(name=TASK_NAME).delete()
# Clean up the schedule if no other task references it
IntervalSchedule.objects.filter(
every=INTERVAL_HOURS,
period="hours",
periodictask__isnull=True,
).delete()
class Migration(migrations.Migration):
dependencies = [
("api", "0085_finding_group_daily_summary_trgm_indexes"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.RunPython(create_periodic_task, delete_periodic_task),
]
@@ -0,0 +1,40 @@
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0086_attack_paths_cleanup_periodic_task"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
("github", "GitHub"),
("mongodbatlas", "MongoDB Atlas"),
("iac", "IaC"),
("oraclecloud", "Oracle Cloud Infrastructure"),
("alibabacloud", "Alibaba Cloud"),
("cloudflare", "Cloudflare"),
("openstack", "OpenStack"),
("image", "Image"),
("googleworkspace", "Google Workspace"),
("vercel", "Vercel"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'vercel';",
reverse_sql=migrations.RunSQL.noop,
),
]
@@ -0,0 +1,95 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0087_vercel_provider"),
]
operations = [
migrations.AddField(
model_name="findinggroupdailysummary",
name="manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="manual_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="muted",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_fail_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_pass_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="new_manual_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_fail_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_fail_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_pass_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_pass_muted_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_manual_count",
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name="findinggroupdailysummary",
name="changed_manual_muted_count",
field=models.IntegerField(default=0),
),
]
@@ -0,0 +1,31 @@
from django.db import migrations
from tasks.tasks import backfill_finding_group_summaries_task
from api.db_router import MainRouter
from api.rls import Tenant
def trigger_backfill_task(apps, schema_editor):
"""
Re-dispatch the finding-group backfill task for every tenant so the new
`manual_count` and `muted` columns added in 0088 get populated from the
last 10 days of completed scans.
The aggregator (`aggregate_finding_group_summaries`) recomputes every
column on each call, so it back-populates the new fields without touching
the existing ones beyond a normal upsert.
"""
tenant_ids = Tenant.objects.using(MainRouter.admin_db).values_list("id", flat=True)
for tenant_id in tenant_ids:
backfill_finding_group_summaries_task.delay(tenant_id=str(tenant_id), days=10)
class Migration(migrations.Migration):
dependencies = [
("api", "0088_finding_group_status_muted_fields"),
]
operations = [
migrations.RunPython(trigger_backfill_task, migrations.RunPython.noop),
]
@@ -0,0 +1,23 @@
from django.db import migrations
TASK_NAME = "attack-paths-cleanup-stale-scans"
def set_cleanup_priority(apps, schema_editor):
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
PeriodicTask.objects.filter(name=TASK_NAME).update(priority=0)
def unset_cleanup_priority(apps, schema_editor):
PeriodicTask = apps.get_model("django_celery_beat", "PeriodicTask")
PeriodicTask.objects.filter(name=TASK_NAME).update(priority=None)
class Migration(migrations.Migration):
dependencies = [
("api", "0089_backfill_finding_group_status_muted"),
]
operations = [
migrations.RunPython(set_cleanup_priority, unset_cleanup_priority),
]
+55 -3
@@ -1,14 +1,15 @@
import json
import logging
import re
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone
from uuid import UUID, uuid4
import defusedxml
from allauth.socialaccount.models import SocialApp
from config.custom_logging import BackendLogger
from config.settings.social_login import SOCIALACCOUNT_PROVIDERS
from cryptography.fernet import Fernet, InvalidToken
from defusedxml import ElementTree as ET
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
@@ -294,6 +295,7 @@ class Provider(RowLevelSecurityProtectedModel):
OPENSTACK = "openstack", _("OpenStack")
IMAGE = "image", _("Image")
GOOGLEWORKSPACE = "googleworkspace", _("Google Workspace")
VERCEL = "vercel", _("Vercel")
@staticmethod
def validate_aws_uid(value):
@@ -437,6 +439,15 @@ class Provider(RowLevelSecurityProtectedModel):
pointer="/data/attributes/uid",
)
@staticmethod
def validate_vercel_uid(value):
if not re.match(r"^team_[a-zA-Z0-9]{16,32}$", value):
raise ModelValidationError(
detail="Vercel provider ID must be a valid Vercel Team ID (e.g., team_xxxxxxxxxxxxxxxxxxxxxxxx).",
code="vercel-uid",
pointer="/data/attributes/uid",
)
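The `validate_vercel_uid` pattern above can be exercised directly: it requires the literal `team_` prefix followed by 16 to 32 alphanumeric characters. A quick runnable check of the same regex:

```python
import re

# Same pattern as validate_vercel_uid: "team_" + 16-32 alphanumerics.
VERCEL_TEAM_ID = re.compile(r"^team_[a-zA-Z0-9]{16,32}$")

valid = bool(VERCEL_TEAM_ID.match("team_" + "a1B2" * 6))       # 24 chars: ok
too_short = bool(VERCEL_TEAM_ID.match("team_" + "abc123"))     # 6 chars: too short
wrong_prefix = bool(VERCEL_TEAM_ID.match("user_" + "a" * 20))  # wrong prefix
```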
@staticmethod
def validate_image_uid(value):
if not re.match(r"^[a-zA-Z0-9][a-zA-Z0-9._/:@-]{2,249}$", value):
@@ -1737,15 +1748,45 @@ class FindingGroupDailySummary(RowLevelSecurityProtectedModel):
# Severity stored as integer for MAX aggregation (5=critical, 4=high, etc.)
severity_order = models.SmallIntegerField(default=1)
# Finding counts
# Finding counts (inclusive of muted findings; use the `muted` flag to
# tell whether the group has any actionable findings).
pass_count = models.IntegerField(default=0)
fail_count = models.IntegerField(default=0)
manual_count = models.IntegerField(default=0)
muted_count = models.IntegerField(default=0)
# Delta counts
# Status counts restricted to muted findings, so clients can isolate the
# muted half of each status (e.g. `pass_count - pass_muted_count` gives the
# actionable PASS findings).
pass_muted_count = models.IntegerField(default=0)
fail_muted_count = models.IntegerField(default=0)
manual_muted_count = models.IntegerField(default=0)
# Whether every finding for this (provider, check, day) is muted.
muted = models.BooleanField(default=False)
# Delta counts (non-muted, kept for convenience and as a "total" view).
new_count = models.IntegerField(default=0)
changed_count = models.IntegerField(default=0)
# Delta breakdown by (status, muted) so clients can answer questions like
# "how many new failing findings appeared in this scan?" without scanning
# the underlying findings table. Mirrors the existing pass/fail/manual
# naming, with `_muted_count` siblings tracking the muted half of each
# bucket explicitly.
new_fail_count = models.IntegerField(default=0)
new_fail_muted_count = models.IntegerField(default=0)
new_pass_count = models.IntegerField(default=0)
new_pass_muted_count = models.IntegerField(default=0)
new_manual_count = models.IntegerField(default=0)
new_manual_muted_count = models.IntegerField(default=0)
changed_fail_count = models.IntegerField(default=0)
changed_fail_muted_count = models.IntegerField(default=0)
changed_pass_count = models.IntegerField(default=0)
changed_pass_muted_count = models.IntegerField(default=0)
changed_manual_count = models.IntegerField(default=0)
changed_manual_muted_count = models.IntegerField(default=0)
# Resource counts
resources_fail = models.IntegerField(default=0)
resources_total = models.IntegerField(default=0)
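The field comments above spell out the intended arithmetic: each `*_muted_count` is the muted half of its status, so `pass_count - pass_muted_count` is the actionable PASS total, and the `muted` flag is equivalent to every status being fully muted. A hypothetical summary row illustrating that consumption pattern (values are made up):

```python
# Hypothetical FindingGroupDailySummary row showing how the muted-sibling
# columns are meant to be consumed: actionable = total minus its muted half.
row = {
    "pass_count": 12, "pass_muted_count": 2,
    "fail_count": 7,  "fail_muted_count": 7,
    "manual_count": 1, "manual_muted_count": 0,
}

actionable = {
    status: row[f"{status}_count"] - row[f"{status}_muted_count"]
    for status in ("pass", "fail", "manual")
}

# Every FAIL finding is muted here, but PASS/MANUAL are not, so the
# group-level `muted` flag would be False.
fully_muted = all(
    row[f"{s}_count"] == row[f"{s}_muted_count"]
    for s in ("pass", "fail", "manual")
)
```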
@@ -1783,6 +1824,15 @@ class FindingGroupDailySummary(RowLevelSecurityProtectedModel):
fields=["tenant_id", "provider", "inserted_at"],
name="fgds_tenant_prov_ins_idx",
),
# Trigram indexes for case-insensitive search
GinIndex(
OpClass(Upper("check_id"), name="gin_trgm_ops"),
name="fgds_check_id_trgm_idx",
),
GinIndex(
OpClass(Upper("check_title"), name="gin_trgm_ops"),
name="fgds_check_title_trgm_idx",
),
]
class JSONAPIMeta:
@@ -2058,6 +2108,8 @@ class SAMLConfiguration(RowLevelSecurityProtectedModel):
root = ET.fromstring(self.metadata_xml)
except ET.ParseError as e:
raise ValidationError({"metadata_xml": f"Invalid XML: {e}"})
except defusedxml.DefusedXmlException as e:
raise ValidationError({"metadata_xml": f"Unsafe XML content rejected: {e}"})
# Entity ID
entity_id = root.attrib.get("entityID")
+19 -7
@@ -1,7 +1,7 @@
from enum import Enum
from typing import Optional
from django.db.models import QuerySet
from rest_framework.exceptions import PermissionDenied
from rest_framework.permissions import BasePermission
from api.db_router import MainRouter
@@ -29,8 +29,17 @@ class HasPermissions(BasePermission):
if not required_permissions:
return True
tenant_id = getattr(request, "tenant_id", None)
if not tenant_id:
tenant_id = request.auth.get("tenant_id") if request.auth else None
if not tenant_id:
return False
user_roles = (
User.objects.using(MainRouter.admin_db).get(id=request.user.id).roles.all()
User.objects.using(MainRouter.admin_db)
.get(id=request.user.id)
.roles.using(MainRouter.admin_db)
.filter(tenant_id=tenant_id)
)
if not user_roles:
return False
@@ -42,14 +51,17 @@ class HasPermissions(BasePermission):
return True
def get_role(user: User) -> Optional[Role]:
def get_role(user: User, tenant_id: str) -> Role:
"""
Retrieve the first role assigned to the given user.
Retrieve the role assigned to the given user in the specified tenant.
Returns:
The user's first Role instance if the user has any roles, otherwise None.
Raises:
PermissionDenied: If the user has no role in the given tenant.
"""
return user.roles.first()
role = user.roles.using(MainRouter.admin_db).filter(tenant_id=tenant_id).first()
if role is None:
raise PermissionDenied("User has no role in this tenant.")
return role
def get_providers(role: Role) -> QuerySet[Provider]:
+1 -1
@@ -61,7 +61,7 @@ def revoke_membership_api_keys(sender, instance, **kwargs): # noqa: F841
in that tenant should be revoked to prevent further access.
"""
TenantAPIKey.objects.filter(
entity=instance.user, tenant_id=instance.tenant.id
entity_id=instance.user_id, tenant_id=instance.tenant_id
).update(revoked=True)
+133 -1
@@ -1,7 +1,7 @@
openapi: 3.0.3
info:
title: Prowler API
version: 1.22.0
version: 1.25.1
description: |-
Prowler API specification.
@@ -372,6 +372,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -387,6 +388,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -409,6 +411,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -426,6 +429,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -1351,6 +1355,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -1366,6 +1371,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -1827,6 +1833,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -1842,6 +1849,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -1864,6 +1872,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -1881,6 +1890,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -2429,6 +2439,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -2444,6 +2455,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -2466,6 +2478,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -2483,6 +2496,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -2939,6 +2953,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -2954,6 +2969,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -2976,6 +2992,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -2993,6 +3010,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -3447,6 +3465,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -3462,6 +3481,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -3484,6 +3504,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -3501,6 +3522,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -3943,6 +3965,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -3958,6 +3981,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -3980,6 +4004,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -3997,6 +4022,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -5780,6 +5806,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -5795,6 +5822,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -5817,6 +5845,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -5834,6 +5863,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- name: filter[search]
@@ -5955,6 +5985,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -5970,6 +6001,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -5992,6 +6024,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -6009,6 +6042,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- name: filter[search]
@@ -6119,6 +6153,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -6134,6 +6169,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -6155,6 +6191,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -6172,6 +6209,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- name: filter[search]
@@ -6314,6 +6352,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -6329,6 +6368,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -6351,6 +6391,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -6368,6 +6409,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -6523,6 +6565,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -6538,6 +6581,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -6560,6 +6604,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -6577,6 +6622,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -6726,6 +6772,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -6741,6 +6788,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -6762,6 +6810,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -6779,6 +6828,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- name: filter[search]
@@ -6970,6 +7020,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -6985,6 +7036,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -7007,6 +7059,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -7024,6 +7077,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -7144,6 +7198,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -7159,6 +7214,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -7181,6 +7237,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -7198,6 +7255,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -7342,6 +7400,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -7357,6 +7416,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -7379,6 +7439,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -7396,6 +7457,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -8181,6 +8243,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -8196,6 +8259,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider__in]
schema:
@@ -8218,6 +8282,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -8235,6 +8300,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -8257,6 +8323,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -8272,6 +8339,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -8294,6 +8362,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -8311,6 +8380,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- name: filter[search]
@@ -8980,6 +9050,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -8995,6 +9066,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -9017,6 +9089,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -9034,6 +9107,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -9527,6 +9601,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -9542,6 +9617,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -9564,6 +9640,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -9581,6 +9658,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -9887,6 +9965,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -9902,6 +9981,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -9924,6 +10004,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -9941,6 +10022,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -10253,6 +10335,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -10268,6 +10351,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -10290,6 +10374,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -10307,6 +10392,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -11129,6 +11215,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
* `aws` - AWS
* `azure` - Azure
@@ -11144,6 +11231,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
- in: query
name: filter[provider_type__in]
schema:
@@ -11166,6 +11254,7 @@ paths:
- mongodbatlas
- openstack
- oraclecloud
- vercel
description: |-
Multiple values may be separated by commas.
@@ -11183,6 +11272,7 @@ paths:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
explode: false
style: form
- in: query
@@ -18463,6 +18553,15 @@ components:
required:
- clouds_yaml_content
- clouds_yaml_cloud
- type: object
title: Vercel API Token
properties:
api_token:
type: string
description: Vercel API token for authentication. Can be scoped
to a specific team.
required:
- api_token
writeOnly: true
required:
- secret
@@ -19465,6 +19564,7 @@ components:
- openstack
- image
- googleworkspace
- vercel
type: string
description: |-
* `aws` - AWS
@@ -19481,6 +19581,7 @@ components:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
x-spec-enum-id: c0d56cad8ab9abe5
uid:
type: string
@@ -19601,6 +19702,7 @@ components:
- openstack
- image
- googleworkspace
- vercel
type: string
x-spec-enum-id: c0d56cad8ab9abe5
description: |-
@@ -19620,6 +19722,7 @@ components:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
uid:
type: string
title: Unique identifier for the provider, set by the provider
@@ -19671,6 +19774,7 @@ components:
- openstack
- image
- googleworkspace
- vercel
type: string
x-spec-enum-id: c0d56cad8ab9abe5
description: |-
@@ -19690,6 +19794,7 @@ components:
* `openstack` - OpenStack
* `image` - Image
* `googleworkspace` - Google Workspace
* `vercel` - Vercel
uid:
type: string
minLength: 3
@@ -20539,6 +20644,15 @@ components:
required:
- clouds_yaml_content
- clouds_yaml_cloud
- type: object
title: Vercel API Token
properties:
api_token:
type: string
description: Vercel API token for authentication. Can be scoped
to a specific team.
required:
- api_token
writeOnly: true
required:
- secret_type
@@ -20955,6 +21069,15 @@ components:
required:
- clouds_yaml_content
- clouds_yaml_cloud
- type: object
title: Vercel API Token
properties:
api_token:
type: string
description: Vercel API token for authentication. Can be scoped
to a specific team.
required:
- api_token
writeOnly: true
required:
- secret_type
@@ -21381,6 +21504,15 @@ components:
required:
- clouds_yaml_content
- clouds_yaml_cloud
- type: object
title: Vercel API Token
properties:
api_token:
type: string
description: Vercel API token for authentication. Can be scoped
to a specific team.
required:
- api_token
writeOnly: true
required:
- secret
@@ -215,6 +215,21 @@ class TestTokenSwitchTenant:
tenant_id = tenants_fixture[0].id
user_instance = User.objects.get(email=test_user)
Membership.objects.create(user=user_instance, tenant_id=tenant_id)
# Assign an admin role in the target tenant so the user can access resources
target_role = Role.objects.create(
name="admin",
tenant_id=tenant_id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.create(
user=user_instance, role=target_role, tenant_id=tenant_id
)
# Check that using our new user's credentials we can authenticate and get the providers
access_token, _ = get_api_tokens(client, test_user, test_password)
@@ -301,7 +316,7 @@ class TestTokenSwitchTenant:
assert invalid_tenant_response.status_code == 400
assert invalid_tenant_response.json()["errors"][0]["code"] == "invalid"
assert invalid_tenant_response.json()["errors"][0]["detail"] == (
"Tenant does not exist or user is not a " "member."
"Tenant does not exist or user is not a member."
)
@@ -912,10 +927,9 @@ class TestAPIKeyLifecycle:
auth_response = client.get(reverse("provider-list"), headers=api_key_headers)
# Must return 401 Unauthorized, not 500 Internal Server Error
assert auth_response.status_code == 401, (
f"Expected 401 but got {auth_response.status_code}: "
f"{auth_response.json()}"
)
assert (
auth_response.status_code == 401
), f"Expected 401 but got {auth_response.status_code}: {auth_response.json()}"
# Verify error message is present
response_json = auth_response.json()
+17 -13
@@ -10,11 +10,11 @@ from django.conf import settings
import api
import api.apps as api_apps_module
from api.apps import (
ApiConfig,
PRIVATE_KEY_FILE,
PUBLIC_KEY_FILE,
SIGNING_KEY_ENV,
VERIFYING_KEY_ENV,
ApiConfig,
)
@@ -187,9 +187,10 @@ def test_ready_initializes_driver_for_api_process(monkeypatch):
_set_argv(monkeypatch, ["gunicorn"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
with (
patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None),
patch("api.attack_paths.database.init_driver") as init_driver,
):
config.ready()
init_driver.assert_called_once()
@@ -200,9 +201,10 @@ def test_ready_skips_driver_for_celery(monkeypatch):
_set_argv(monkeypatch, ["celery", "-A", "api"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
with (
patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None),
patch("api.attack_paths.database.init_driver") as init_driver,
):
config.ready()
init_driver.assert_not_called()
@@ -213,9 +215,10 @@ def test_ready_skips_driver_for_manage_py_skip_command(monkeypatch):
_set_argv(monkeypatch, ["manage.py", "migrate"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
with (
patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None),
patch("api.attack_paths.database.init_driver") as init_driver,
):
config.ready()
init_driver.assert_not_called()
@@ -226,9 +229,10 @@ def test_ready_skips_driver_when_testing(monkeypatch):
_set_argv(monkeypatch, ["gunicorn"])
_set_testing(monkeypatch, True)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
with (
patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None),
patch("api.attack_paths.database.init_driver") as init_driver,
):
config.ready()
init_driver.assert_not_called()
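The rewritten tests above swap stacked `patch(...)` calls for the parenthesized multi-context `with` form, which requires Python 3.10+. A standalone illustration of that syntax, using a made-up `tag` context manager rather than the real patches:

```python
from contextlib import contextmanager


@contextmanager
def tag(name):
    # Trivial context manager that just yields its name.
    yield name


# Python 3.10+ allows grouping several context managers in parentheses,
# one per line, with a trailing comma; before 3.10 this had to be a single
# `with a, b:` line or use backslash continuations.
with (
    tag("crypto") as a,
    tag("driver") as b,
):
    result = (a, b)
```

This is purely a readability change; entry and exit order of the context managers is identical to the one-line form.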
+24 -97
@@ -11,7 +11,7 @@ from api.attack_paths import database as graph_database
from api.attack_paths import views_helpers
from tasks.jobs.attack_paths.config import (
PROVIDER_ELEMENT_ID_PROPERTY,
PROVIDER_ID_PROPERTY,
get_provider_label,
)
@@ -53,7 +53,7 @@ def test_prepare_parameters_includes_provider_and_casts(
)
assert result["provider_uid"] == "123456789012"
assert result["provider_id"] == "test-provider-id"
assert "provider_id" not in result
assert result["limit"] == 5
@@ -107,12 +107,12 @@ def test_execute_query_serializes_graph(
parameters = {"provider_uid": "123"}
provider_id = "test-provider-123"
plabel = get_provider_label(provider_id)
node = attack_paths_graph_stub_classes.Node(
element_id="node-1",
labels=["AWSAccount"],
labels=["AWSAccount", plabel],
properties={
"name": "account",
PROVIDER_ID_PROPERTY: provider_id,
"complex": {
"items": [
attack_paths_graph_stub_classes.NativeValue("value"),
@@ -121,15 +121,13 @@ def test_execute_query_serializes_graph(
},
},
)
node_2 = attack_paths_graph_stub_classes.Node(
"node-2", ["RDSInstance"], {PROVIDER_ID_PROPERTY: provider_id}
)
node_2 = attack_paths_graph_stub_classes.Node("node-2", ["RDSInstance", plabel], {})
relationship = attack_paths_graph_stub_classes.Relationship(
element_id="rel-1",
rel_type="OWNS",
start_node=node,
end_node=node_2,
properties={"weight": 1, PROVIDER_ID_PROPERTY: provider_id},
properties={"weight": 1},
)
graph = SimpleNamespace(nodes=[node, node_2], relationships=[relationship])
@@ -213,29 +211,27 @@ def test_execute_query_raises_permission_denied_on_read_only(
)
def test_serialize_graph_filters_by_provider_id(attack_paths_graph_stub_classes):
def test_serialize_graph_filters_by_provider_label(attack_paths_graph_stub_classes):
provider_id = "provider-keep"
plabel = get_provider_label(provider_id)
other_label = get_provider_label("provider-other")
node_keep = attack_paths_graph_stub_classes.Node(
"n1", ["AWSAccount"], {PROVIDER_ID_PROPERTY: provider_id}
)
node_keep = attack_paths_graph_stub_classes.Node("n1", ["AWSAccount", plabel], {})
node_drop = attack_paths_graph_stub_classes.Node(
"n2", ["AWSAccount"], {PROVIDER_ID_PROPERTY: "provider-other"}
"n2", ["AWSAccount", other_label], {}
)
rel_keep = attack_paths_graph_stub_classes.Relationship(
"r1", "OWNS", node_keep, node_keep, {PROVIDER_ID_PROPERTY: provider_id}
)
rel_drop_by_provider = attack_paths_graph_stub_classes.Relationship(
"r2", "OWNS", node_keep, node_drop, {PROVIDER_ID_PROPERTY: "provider-other"}
"r1", "OWNS", node_keep, node_keep, {}
)
# Relationship connecting a kept node to a dropped node — filtered by endpoint check
rel_drop_orphaned = attack_paths_graph_stub_classes.Relationship(
"r3", "OWNS", node_keep, node_drop, {PROVIDER_ID_PROPERTY: provider_id}
"r2", "OWNS", node_keep, node_drop, {}
)
graph = SimpleNamespace(
nodes=[node_keep, node_drop],
relationships=[rel_keep, rel_drop_by_provider, rel_drop_orphaned],
relationships=[rel_keep, rel_drop_orphaned],
)
result = views_helpers._serialize_graph(graph, provider_id)
@@ -354,7 +350,6 @@ def test_serialize_properties_filters_internal_fields():
"_module_name": "cartography:aws",
"_module_version": "0.98.0",
# Provider isolation
PROVIDER_ID_PROPERTY: "42",
PROVIDER_ELEMENT_ID_PROPERTY: "42:abc123",
}
@@ -449,14 +444,11 @@ def test_execute_custom_query_serializes_graph(
attack_paths_graph_stub_classes,
):
provider_id = "test-provider-123"
node_1 = attack_paths_graph_stub_classes.Node(
"node-1", ["AWSAccount"], {PROVIDER_ID_PROPERTY: provider_id}
)
node_2 = attack_paths_graph_stub_classes.Node(
"node-2", ["RDSInstance"], {PROVIDER_ID_PROPERTY: provider_id}
)
plabel = get_provider_label(provider_id)
node_1 = attack_paths_graph_stub_classes.Node("node-1", ["AWSAccount", plabel], {})
node_2 = attack_paths_graph_stub_classes.Node("node-2", ["RDSInstance", plabel], {})
relationship = attack_paths_graph_stub_classes.Relationship(
"rel-1", "OWNS", node_1, node_2, {PROVIDER_ID_PROPERTY: provider_id}
"rel-1", "OWNS", node_1, node_2, {}
)
graph_result = MagicMock()
@@ -471,10 +463,11 @@ def test_execute_custom_query_serializes_graph(
"db-tenant-test", "MATCH (n) RETURN n", provider_id
)
mock_execute.assert_called_once_with(
database="db-tenant-test",
cypher="MATCH (n) RETURN n",
)
mock_execute.assert_called_once()
call_kwargs = mock_execute.call_args[1]
assert call_kwargs["database"] == "db-tenant-test"
# The cypher is rewritten with the provider label injection
assert plabel in call_kwargs["cypher"]
assert len(result["nodes"]) == 2
assert result["relationships"][0]["label"] == "OWNS"
assert result["truncated"] is False
@@ -511,72 +504,6 @@ def test_execute_custom_query_wraps_graph_errors():
mock_logger.error.assert_called_once()
# -- validate_custom_query ------------------------------------------------
@pytest.mark.parametrize(
"cypher",
[
"LOAD CSV FROM 'http://169.254.169.254/' AS x RETURN x",
"load csv from 'http://evil.com' as row return row",
"CALL apoc.load.json('http://evil.com/') YIELD value RETURN value",
"CALL apoc.load.csvParams('http://evil.com/', {}, null) YIELD list RETURN list",
"CALL apoc.import.csv([{fileName: 'f'}], [], {}) YIELD node RETURN node",
"CALL apoc.export.csv.all('file.csv', {})",
"CALL apoc.cypher.run('CREATE (n)', {}) YIELD value RETURN value",
"CALL apoc.systemdb.graph() YIELD nodes RETURN nodes",
"CALL apoc.config.list() YIELD key, value RETURN key, value",
"CALL apoc.periodic.iterate('MATCH (n) RETURN n', 'DELETE n', {batchSize: 100})",
"CALL apoc.do.when(true, 'CREATE (n) RETURN n', '', {}) YIELD value RETURN value",
"CALL apoc.trigger.add('t', 'RETURN 1', {phase: 'before'})",
"CALL apoc.custom.asProcedure('myProc', 'RETURN 1')",
],
ids=[
"LOAD_CSV",
"LOAD_CSV_lowercase",
"apoc.load.json",
"apoc.load.csvParams",
"apoc.import.csv",
"apoc.export.csv",
"apoc.cypher.run",
"apoc.systemdb.graph",
"apoc.config.list",
"apoc.periodic.iterate",
"apoc.do.when",
"apoc.trigger.add",
"apoc.custom.asProcedure",
],
)
def test_validate_custom_query_rejects_blocked_patterns(cypher):
with pytest.raises(ValidationError) as exc:
views_helpers.validate_custom_query(cypher)
assert "blocked operation" in str(exc.value.detail)
@pytest.mark.parametrize(
"cypher",
[
"MATCH (n:AWSAccount) RETURN n LIMIT 10",
"MATCH (a)-[r]->(b) RETURN a, r, b",
"MATCH (n) WHERE n.name CONTAINS 'load' RETURN n",
"CALL apoc.create.vNode(['Label'], {}) YIELD node RETURN node",
"MATCH (n) WHERE n.name = 'apoc.load.json' RETURN n",
'MATCH (n) WHERE n.description = "LOAD CSV is cool" RETURN n',
],
ids=[
"simple_match",
"traversal",
"contains_load_substring",
"apoc_virtual_node",
"apoc_load_inside_single_quotes",
"load_csv_inside_double_quotes",
],
)
def test_validate_custom_query_allows_clean_queries(cypher):
views_helpers.validate_custom_query(cypher)
# -- _truncate_graph ----------------------------------------------------------
@@ -0,0 +1,43 @@
import pytest
from config.settings.celery import _build_celery_broker_url
class TestBuildCeleryBrokerUrl:
def test_without_credentials(self):
broker_url = _build_celery_broker_url("redis", "", "", "valkey", "6379", "0")
assert broker_url == "redis://valkey:6379/0"
def test_with_password_only(self):
broker_url = _build_celery_broker_url(
"rediss", "", "secret", "cache.example.com", "6379", "0"
)
assert broker_url == "rediss://:secret@cache.example.com:6379/0"
def test_with_username_and_password(self):
broker_url = _build_celery_broker_url(
"rediss", "default", "secret", "cache.example.com", "6379", "0"
)
assert broker_url == "rediss://default:secret@cache.example.com:6379/0"
def test_with_username_only(self):
broker_url = _build_celery_broker_url(
"redis", "admin", "", "valkey", "6379", "0"
)
assert broker_url == "redis://admin@valkey:6379/0"
def test_url_encodes_credentials(self):
broker_url = _build_celery_broker_url(
"rediss", "user@name", "p@ss:word", "cache.example.com", "6379", "0"
)
assert (
broker_url == "rediss://user%40name:p%40ss%3Aword@cache.example.com:6379/0"
)
def test_invalid_scheme_raises_error(self):
with pytest.raises(ValueError, match="Invalid VALKEY_SCHEME 'http'"):
_build_celery_broker_url("http", "", "", "valkey", "6379", "0")
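The new tests above pin down the expected URL shapes, including percent-encoding of credentials. A sketch consistent with those assertions (the real helper is `_build_celery_broker_url` in `config.settings.celery`; this reimplementation is illustrative only):

```python
from urllib.parse import quote


def build_celery_broker_url(scheme, username, password, host, port, db):
    """Build a redis:// or rediss:// broker URL, percent-encoding credentials."""
    if scheme not in ("redis", "rediss"):
        raise ValueError(f"Invalid VALKEY_SCHEME {scheme!r}")
    # safe='' ensures '@' and ':' inside credentials are encoded, so they
    # cannot be confused with the URL's own delimiters.
    if username and password:
        credentials = f"{quote(username, safe='')}:{quote(password, safe='')}@"
    elif password:
        credentials = f":{quote(password, safe='')}@"
    elif username:
        credentials = f"{quote(username, safe='')}@"
    else:
        credentials = ""
    return f"{scheme}://{credentials}{host}:{port}/{db}"
```

Encoding with `safe=''` is what makes `user@name` / `p@ss:word` come out as `user%40name:p%40ss%3Aword` in the last test case.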
@@ -0,0 +1,429 @@
"""Unit tests for the Cypher sanitizer (validation + provider-label injection)."""
from unittest.mock import patch
import pytest
from rest_framework.exceptions import ValidationError
from api.attack_paths.cypher_sanitizer import (
inject_provider_label,
validate_custom_query,
)
PROVIDER_ID = "019c41ee-7df3-7dec-a684-d839f95619f8"
LABEL = "_Provider_019c41ee7df37deca684d839f95619f8"
def _inject(cypher: str) -> str:
"""Shortcut that patches `get_provider_label` to avoid config imports."""
with patch(
"api.attack_paths.cypher_sanitizer.get_provider_label", return_value=LABEL
):
return inject_provider_label(cypher, PROVIDER_ID)
# ---------------------------------------------------------------------------
# Pass A - Labeled node patterns (all clauses)
# ---------------------------------------------------------------------------
class TestLabeledNodes:
def test_single_label(self):
result = _inject("MATCH (n:AWSRole) RETURN n")
assert f"(n:AWSRole:{LABEL})" in result
def test_label_with_properties(self):
result = _inject("MATCH (n:AWSRole {name: 'admin'}) RETURN n")
assert f"(n:AWSRole:{LABEL} {{name: 'admin'}})" in result
def test_multiple_labels(self):
result = _inject("MATCH (n:AWSRole:AWSPrincipal) RETURN n")
assert f"(n:AWSRole:AWSPrincipal:{LABEL})" in result
def test_anonymous_labeled(self):
result = _inject(
"MATCH (:AWSPrincipal {arn: 'ecs-tasks.amazonaws.com'}) RETURN 1"
)
assert f"(:AWSPrincipal:{LABEL} {{arn: 'ecs-tasks.amazonaws.com'}})" in result
def test_backtick_label(self):
result = _inject("MATCH (n:`My Label`) RETURN n")
assert f"(n:`My Label`:{LABEL})" in result
def test_labeled_in_where_clause(self):
"""Labeled nodes in WHERE (pattern existence) still get the label."""
result = _inject(
"MATCH (n:AWSRole) WHERE EXISTS((n)-[:REL]->(:Target)) RETURN n"
)
assert f"(n:AWSRole:{LABEL})" in result
assert f"(:Target:{LABEL})" in result
def test_labeled_in_return_clause(self):
"""Labeled nodes in RETURN still get the label (they're always node patterns)."""
result = _inject("MATCH (n:AWSRole) RETURN (n:AWSRole)")
assert result.count(f":AWSRole:{LABEL}") == 2
def test_labeled_in_optional_match(self):
result = _inject(
"OPTIONAL MATCH (pf:ProwlerFinding {status: 'FAIL'}) RETURN pf"
)
assert f"(pf:ProwlerFinding:{LABEL} {{status: 'FAIL'}})" in result
# ---------------------------------------------------------------------------
# Pass B - Bare node patterns (MATCH/OPTIONAL MATCH only)
# ---------------------------------------------------------------------------
class TestBareNodes:
def test_bare_in_match(self):
result = _inject("MATCH (a)-[:HAS_POLICY]->(b) RETURN a, b")
assert f"(a:{LABEL})" in result
assert f"(b:{LABEL})" in result
def test_bare_with_properties_in_match(self):
result = _inject("MATCH (n {name: 'x'}) RETURN n")
assert f"(n:{LABEL} {{name: 'x'}})" in result
def test_bare_in_optional_match(self):
result = _inject("OPTIONAL MATCH (n)-[r]-(m) RETURN n")
assert f"(n:{LABEL})" in result
assert f"(m:{LABEL})" in result
def test_bare_not_injected_in_return(self):
"""Bare (identifier) in RETURN could be expression grouping."""
cypher = "MATCH (n:AWSRole) RETURN (n)"
result = _inject(cypher)
# The labeled (n:AWSRole) gets the label, but the bare (n) in RETURN should not
assert f"(n:AWSRole:{LABEL})" in result
# Count how many times the label appears - should be 1 (from MATCH only)
assert result.count(LABEL) == 1
def test_bare_not_injected_in_where(self):
cypher = "MATCH (n:AWSRole) WHERE (n.x > 1) RETURN n"
result = _inject(cypher)
# (n.x > 1) is an expression group, not a node pattern - should be untouched
assert "(n.x > 1)" in result
def test_bare_not_injected_in_with(self):
cypher = "MATCH (n:AWSRole) WITH (n) RETURN n"
result = _inject(cypher)
assert result.count(LABEL) == 1
def test_bare_not_injected_in_unwind(self):
cypher = "UNWIND nodes(path) as n OPTIONAL MATCH (n)-[r]-(m) RETURN n"
result = _inject(cypher)
# (n) and (m) in OPTIONAL MATCH get injected, but nodes(path) in UNWIND does not
assert f"(n:{LABEL})" in result
assert f"(m:{LABEL})" in result
# ---------------------------------------------------------------------------
# Function call exclusion
# ---------------------------------------------------------------------------
class TestFunctionCallExclusion:
@pytest.mark.parametrize(
"func_call",
[
"collect(DISTINCT pf)",
"any(x IN stmt.action WHERE toLower(x) = 'iam:*')",
"toLower(action)",
"nodes(path)",
"count(n)",
"apoc.create.vNode(labels)",
"EXISTS(n.prop)",
"size(n.list)",
],
)
def test_function_calls_not_injected(self, func_call):
cypher = f"MATCH (n:AWSRole) WHERE {func_call} RETURN n"
result = _inject(cypher)
# The function call should remain unchanged
assert func_call in result
# Only the MATCH labeled node should get the label
assert result.count(LABEL) == 1
# ---------------------------------------------------------------------------
# String and comment protection
# ---------------------------------------------------------------------------
class TestProtection:
def test_string_with_fake_node_pattern(self):
cypher = "MATCH (n:AWSRole) WHERE n.name = '(fake:Label)' RETURN n"
result = _inject(cypher)
assert "'(fake:Label)'" in result
assert result.count(LABEL) == 1
def test_double_quoted_string(self):
cypher = 'MATCH (n:AWSRole) WHERE n.name = "(fake:Label)" RETURN n'
result = _inject(cypher)
assert '"(fake:Label)"' in result
assert result.count(LABEL) == 1
def test_line_comment_with_node_pattern(self):
cypher = "// (n:Fake)\nMATCH (n:AWSRole) RETURN n"
result = _inject(cypher)
assert "// (n:Fake)" in result
assert result.count(LABEL) == 1
def test_string_containing_double_slash(self):
"""Strings with // inside should be consumed as strings, not comments."""
cypher = "MATCH (n:AWSRole {url: 'https://example.com'}) RETURN n"
result = _inject(cypher)
assert "'https://example.com'" in result
assert f"(n:AWSRole:{LABEL}" in result
def test_escaped_quotes_in_string(self):
cypher = r"MATCH (n:AWSRole) WHERE n.name = 'it\'s a test' RETURN n"
result = _inject(cypher)
assert result.count(LABEL) == 1
# ---------------------------------------------------------------------------
# Clause splitting
# ---------------------------------------------------------------------------
class TestClauseSplitting:
def test_case_insensitive_keywords(self):
cypher = "match (n:AWSRole) where n.x = 1 return n"
result = _inject(cypher)
assert f"(n:AWSRole:{LABEL})" in result
def test_optional_match_with_extra_whitespace(self):
cypher = "OPTIONAL MATCH (n:AWSRole) RETURN n"
result = _inject(cypher)
assert f"(n:AWSRole:{LABEL})" in result
def test_multiple_match_clauses(self):
cypher = (
"MATCH (a:AWSAccount)--(b:AWSRole) "
"MATCH (b)--(c:AWSPolicy) "
"RETURN a, b, c"
)
result = _inject(cypher)
assert f"(a:AWSAccount:{LABEL})" in result
assert f"(b:AWSRole:{LABEL})" in result
assert f"(c:AWSPolicy:{LABEL})" in result
# (b) in second MATCH is bare and gets injected
assert result.count(LABEL) == 4 # a, b (labeled), b (bare in 2nd MATCH), c
# ---------------------------------------------------------------------------
# Real-world query patterns from aws.py
# ---------------------------------------------------------------------------
class TestRealWorldQueries:
def test_basic_resource_query(self):
cypher = (
"MATCH path = (aws:AWSAccount {id: $provider_uid})--(rds:RDSInstance)\n"
"UNWIND nodes(path) as n\n"
"OPTIONAL MATCH (n)-[pfr]-(pf:ProwlerFinding {status: 'FAIL'})\n"
"RETURN path, collect(DISTINCT pf) as dpf"
)
result = _inject(cypher)
assert f"(aws:AWSAccount:{LABEL} {{id: $provider_uid}})" in result
assert f"(rds:RDSInstance:{LABEL})" in result
assert f"(n:{LABEL})" in result
assert f"(pf:ProwlerFinding:{LABEL} {{status: 'FAIL'}})" in result
assert "nodes(path)" in result # function call untouched
assert "collect(DISTINCT pf)" in result # function call untouched
def test_privilege_escalation_query(self):
cypher = (
"MATCH path_principal = (aws:AWSAccount {id: $uid})"
"--(principal:AWSPrincipal)--(pol:AWSPolicy)\n"
"WHERE pol.effect = 'Allow'\n"
"MATCH (principal)--(cfn_policy:AWSPolicy)"
"--(stmt_cfn:AWSPolicyStatement)\n"
"WHERE any(action IN stmt_cfn.action WHERE toLower(action) = 'iam:passrole')\n"
"MATCH path_target = (aws)--(target_role:AWSRole)"
"-[:TRUSTS_AWS_PRINCIPAL]->(:AWSPrincipal {arn: 'cloudformation.amazonaws.com'})\n"
"RETURN path_principal, path_target"
)
result = _inject(cypher)
assert f"(aws:AWSAccount:{LABEL} {{id: $uid}})" in result
assert f"(principal:AWSPrincipal:{LABEL})" in result
assert f"(pol:AWSPolicy:{LABEL})" in result
assert f"(principal:{LABEL})" in result # bare in 2nd MATCH
assert f"(cfn_policy:AWSPolicy:{LABEL})" in result
assert f"(stmt_cfn:AWSPolicyStatement:{LABEL})" in result
assert f"(aws:{LABEL})" in result # bare in 3rd MATCH
assert f"(target_role:AWSRole:{LABEL})" in result
assert (
f"(:AWSPrincipal:{LABEL} {{arn: 'cloudformation.amazonaws.com'}})" in result
)
# Function calls in WHERE untouched
assert "any(action IN" in result
assert "toLower(action)" in result
def test_custom_bare_query(self):
cypher = (
"MATCH (a)-[:HAS_POLICY]->(b)\n"
"WHERE a.name CONTAINS 'admin'\n"
"RETURN a, b"
)
result = _inject(cypher)
assert f"(a:{LABEL})" in result
assert f"(b:{LABEL})" in result
assert result.count(LABEL) == 2
def test_internet_via_path_connectivity(self):
"""Post-refactor pattern: Internet reached via CAN_ACCESS, not standalone."""
cypher = (
"MATCH path = (aws:AWSAccount {id: $provider_uid})--(ec2:EC2Instance)\n"
"WHERE ec2.exposed_internet = true\n"
"OPTIONAL MATCH (internet:Internet)-[can_access:CAN_ACCESS]->(ec2)\n"
"RETURN path, internet, can_access"
)
result = _inject(cypher)
assert f"(aws:AWSAccount:{LABEL}" in result
assert f"(ec2:EC2Instance:{LABEL})" in result
assert f"(internet:Internet:{LABEL})" in result
        # ec2 in OPTIONAL MATCH is bare: Pass A (labeled patterns) skips it
        # because it carries no label, so Pass B (bare identifiers) injects.
assert f"(ec2:{LABEL})" in result
# ---------------------------------------------------------------------------
# Edge cases
# ---------------------------------------------------------------------------
class TestEdgeCases:
def test_empty_query(self):
assert _inject("") == ""
def test_no_node_patterns(self):
cypher = "RETURN 1 + 2"
assert _inject(cypher) == cypher
def test_anonymous_empty_parens_not_injected(self):
"""Empty () in MATCH is extremely rare but should not be injected."""
cypher = "MATCH ()--(m:AWSRole) RETURN m"
result = _inject(cypher)
assert "()" in result # empty parens untouched
assert f"(m:AWSRole:{LABEL})" in result
def test_fully_anonymous_query_bypasses_injection(self):
"""All-anonymous patterns bypass injection entirely.
MATCH ()--()--() has no labels and no variables, so neither Pass A
(labeled) nor Pass B (bare identifier) can inject the provider label.
This is safe because _serialize_graph() (Layer 3) filters every
returned node by provider label, dropping cross-provider data before
it reaches the user.
"""
cypher = "MATCH ()--()--() RETURN *"
result = _inject(cypher)
assert result == cypher # completely unmodified
assert LABEL not in result
def test_relationship_patterns_untouched(self):
cypher = "MATCH (a:X)-[r:REL_TYPE {x: 1}]->(b:Y) RETURN a"
result = _inject(cypher)
assert "[r:REL_TYPE {x: 1}]" in result # relationship untouched
assert f"(a:X:{LABEL})" in result
assert f"(b:Y:{LABEL})" in result
def test_call_subquery(self):
cypher = (
"CALL {\n"
" MATCH (inner:AWSRole) RETURN inner\n"
"}\n"
"MATCH (outer:AWSAccount) RETURN outer, inner"
)
result = _inject(cypher)
assert f"(inner:AWSRole:{LABEL})" in result
assert f"(outer:AWSAccount:{LABEL})" in result
def test_multiple_protected_regions(self):
cypher = (
"MATCH (n:X {a: 'hello'}) " 'WHERE n.b = "world" ' "// comment\n" "RETURN n"
)
result = _inject(cypher)
assert "'hello'" in result
assert '"world"' in result
assert "// comment" in result
assert f"(n:X:{LABEL}" in result
def test_idempotent_on_already_injected(self):
"""Running injection twice should add the label twice (not ideal, but predictable)."""
first = _inject("MATCH (n:AWSRole) RETURN n")
second = _inject(first)
# The label appears twice (stacked)
assert second.count(LABEL) == 2
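The two-pass behavior these edge cases exercise can be sketched in miniature. This is a simplified illustration, not the production injector: `ProviderScope` is a hypothetical label, and only Pass A (appending to explicitly labeled node patterns) is shown; the real code additionally protects strings and comments, skips function calls, and runs Pass B for bare identifiers.

```python
import re

LABEL = "ProviderScope"  # hypothetical; the real label encodes the provider

# Pass A only: append LABEL to explicitly labeled node patterns like (n:AWSRole).
_LABELED_NODE = re.compile(r"\((\w*)((?::\w+)+)")

def inject_labeled(cypher: str, label: str = LABEL) -> str:
    """Simplified sketch: does not handle strings, comments, or bare (n)."""
    return _LABELED_NODE.sub(lambda m: f"({m.group(1)}{m.group(2)}:{label}", cypher)

print(inject_labeled("MATCH (n:AWSRole) RETURN n"))
# MATCH (n:AWSRole:ProviderScope) RETURN n
```

Even this reduced form preserves property maps, since the match stops before the `{`: `(n:AWSRole {id: $x})` becomes `(n:AWSRole:ProviderScope {id: $x})`.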
# ---------------------------------------------------------------------------
# Validation
# ---------------------------------------------------------------------------
class TestValidation:
@pytest.mark.parametrize(
"cypher",
[
"LOAD CSV FROM 'http://169.254.169.254/' AS x RETURN x",
"load csv from 'http://evil.com' as row return row",
"CALL apoc.load.json('http://evil.com/') YIELD value RETURN value",
"CALL apoc.load.csvParams('http://evil.com/', {}, null) YIELD list RETURN list",
"CALL apoc.import.csv([{fileName: 'f'}], [], {}) YIELD node RETURN node",
"CALL apoc.export.csv.all('file.csv', {})",
"CALL apoc.cypher.run('CREATE (n)', {}) YIELD value RETURN value",
"CALL apoc.systemdb.graph() YIELD nodes RETURN nodes",
"CALL apoc.config.list() YIELD key, value RETURN key, value",
"CALL apoc.periodic.iterate('MATCH (n) RETURN n', 'DELETE n', {batchSize: 100})",
"CALL apoc.do.when(true, 'CREATE (n) RETURN n', '', {}) YIELD value RETURN value",
"CALL apoc.trigger.add('t', 'RETURN 1', {phase: 'before'})",
"CALL apoc.custom.asProcedure('myProc', 'RETURN 1')",
],
ids=[
"LOAD_CSV",
"LOAD_CSV_lowercase",
"apoc.load.json",
"apoc.load.csvParams",
"apoc.import.csv",
"apoc.export.csv",
"apoc.cypher.run",
"apoc.systemdb.graph",
"apoc.config.list",
"apoc.periodic.iterate",
"apoc.do.when",
"apoc.trigger.add",
"apoc.custom.asProcedure",
],
)
def test_rejects_blocked_patterns(self, cypher):
with pytest.raises(ValidationError) as exc:
validate_custom_query(cypher)
assert "blocked operation" in str(exc.value.detail)
@pytest.mark.parametrize(
"cypher",
[
"MATCH (n:AWSAccount) RETURN n LIMIT 10",
"MATCH (a)-[r]->(b) RETURN a, r, b",
"MATCH (n) WHERE n.name CONTAINS 'load' RETURN n",
"CALL apoc.create.vNode(['Label'], {}) YIELD node RETURN node",
"MATCH (n) WHERE n.name = 'apoc.load.json' RETURN n",
'MATCH (n) WHERE n.description = "LOAD CSV is cool" RETURN n',
],
ids=[
"simple_match",
"traversal",
"contains_load_substring",
"apoc_virtual_node",
"apoc_load_inside_single_quotes",
"load_csv_inside_double_quotes",
],
)
def test_allows_clean_queries(self, cypher):
validate_custom_query(cypher)
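The validation strategy these cases imply can be sketched as: blank out quoted literals first, so a blocked name inside a string is not flagged, then match a blocklist over what remains. The blocklist below is illustrative, reconstructed from the parametrized cases; it is not the actual `validate_custom_query` implementation.

```python
import re

# Illustrative blocklist reconstructed from the parametrized cases above.
BLOCKED = re.compile(
    r"\b(?:LOAD\s+CSV"
    r"|apoc\.(?:load|import|export|cypher|systemdb|config|periodic|do|trigger|custom)\.)",
    re.IGNORECASE,
)

_STRINGS = re.compile(r"'(?:\\.|[^'\\])*'|\"(?:\\.|[^\"\\])*\"")

def strip_strings(cypher: str) -> str:
    """Blank out quoted literals so 'apoc.load.json' in a string is not flagged."""
    return _STRINGS.sub("''", cypher)

def validate(cypher: str) -> None:
    """Raise ValueError on blocked operations appearing outside string literals."""
    if BLOCKED.search(strip_strings(cypher)):
        raise ValueError("blocked operation")
```

Note how this shape reproduces the allow cases: `apoc.create.vNode` is absent from the alternation, and `'apoc.load.json'` survives because it is stripped before matching.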
+33
@@ -243,6 +243,39 @@ class TestSAMLConfigurationModel:
assert "Invalid XML" in errors["metadata_xml"][0]
assert "not well-formed" in errors["metadata_xml"][0]
def test_xml_bomb_rejected(self, tenants_fixture):
"""
Regression test: a 'billion laughs' XML bomb in the SAML metadata field
must be rejected and not allowed to exhaust server memory / CPU.
Before the fix, xml.etree.ElementTree was used directly, which does not
protect against entity-expansion attacks. The fix switches to defusedxml
which raises an exception for any XML containing entity definitions.
"""
tenant = tenants_fixture[0]
xml_bomb = (
"<?xml version='1.0'?>"
"<!DOCTYPE bomb ["
" <!ENTITY a 'aaaaaaaaaa'>"
" <!ENTITY b '&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;'>"
" <!ENTITY c '&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;'>"
" <!ENTITY d '&c;&c;&c;&c;&c;&c;&c;&c;&c;&c;'>"
"]>"
"<md:EntityDescriptor entityID='&d;' "
"xmlns:md='urn:oasis:names:tc:SAML:2.0:metadata'/>"
)
config = SAMLConfiguration(
email_domain="xmlbomb.com",
metadata_xml=xml_bomb,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
def test_metadata_missing_sso_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
xml = """<md:EntityDescriptor entityID="x" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
+64 -1
@@ -2,7 +2,7 @@ import json
from unittest.mock import ANY, Mock, patch
import pytest
from conftest import TODAY
from conftest import TEST_PASSWORD, TODAY
from django.urls import reverse
from rest_framework import status
@@ -830,3 +830,66 @@ class TestUserRoleLinkPermissions:
)
assert response.status_code == status.HTTP_403_FORBIDDEN
@pytest.mark.django_db
class TestCrossTenantRoleLeak:
"""Regression tests for get_role() cross-tenant privilege leak.
get_role() must query admin_db (bypassing RLS) so that a user with a role
in tenant A cannot accidentally pass role checks when authenticated against
tenant B where they have no role.
"""
def test_user_with_role_in_tenant_a_denied_in_tenant_b(self, tenants_fixture):
"""User has admin role in tenant A, membership in tenant B but no role.
Hitting an RBAC-protected endpoint with a tenant-B token must return 403."""
from rest_framework.test import APIClient
tenant_a = tenants_fixture[0]
tenant_b = tenants_fixture[1]
user = User.objects.create_user(
name="cross_tenant_user",
email="cross_tenant@test.com",
password=TEST_PASSWORD,
)
Membership.objects.create(
user=user, tenant=tenant_a, role=Membership.RoleChoices.OWNER
)
Membership.objects.create(
user=user, tenant=tenant_b, role=Membership.RoleChoices.OWNER
)
# Role only in tenant A
role = Role.objects.create(
name="admin",
tenant_id=tenant_a.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.create(user=user, role=role, tenant_id=tenant_a.id)
# Mint token scoped to tenant B (where user has NO role)
serializer = TokenSerializer(
data={
"type": "tokens",
"email": "cross_tenant@test.com",
"password": TEST_PASSWORD,
"tenant_id": tenant_b.id,
}
)
serializer.is_valid(raise_exception=True)
access_token = serializer.validated_data["access"]
client = APIClient()
client.defaults["HTTP_AUTHORIZATION"] = f"Bearer {access_token}"
# user-list requires manage_users permission via HasPermissions
response = client.get(reverse("user-list"))
assert response.status_code == status.HTTP_403_FORBIDDEN
+58 -12
@@ -4,14 +4,25 @@ from unittest.mock import MagicMock
from config.settings.sentry import before_send
def _make_log_record(msg, level=logging.ERROR, name="test", args=None):
"""Build a real LogRecord so getMessage() works like in production."""
record = logging.LogRecord(
name=name,
level=level,
pathname="",
lineno=0,
msg=msg,
args=args,
exc_info=None,
)
return record
def test_before_send_ignores_log_with_ignored_exception():
"""Test that before_send ignores logs containing ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.ERROR # 40
log_record = _make_log_record("Provider kubernetes is not connected")
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
@@ -36,12 +47,9 @@ def test_before_send_ignores_exception_with_ignored_exception():
def test_before_send_passes_through_non_ignored_log():
"""Test that before_send passes through logs that don't contain ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Some other error message"
log_record.levelno = logging.ERROR # 40
log_record = _make_log_record("Some other error message")
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
@@ -66,15 +74,53 @@ def test_before_send_passes_through_non_ignored_exception():
def test_before_send_handles_warning_level():
"""Test that before_send handles warning level logs."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.WARNING # 30
log_record = _make_log_record(
"Provider kubernetes is not connected", level=logging.WARNING
)
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_ignores_neo4j_defunct_connection():
"""Test that before_send drops neo4j.io defunct connection logs.
The Neo4j driver logs transient connection errors at ERROR level
before RetryableSession retries them. These are noise.
The driver uses %s formatting, so "defunct" is in the args, not
in the template. This test mirrors the real LogRecord structure.
"""
log_record = _make_log_record(
msg="[#%04X] _: <CONNECTION> error: %s: %r",
name="neo4j.io",
args=(
0xE5CC,
"Failed to read from defunct connection "
"IPv4Address(('cloud-neo4j.prowler.com', 7687))",
ConnectionResetError(104, "Connection reset by peer"),
),
)
hint = {"log_record": log_record}
event = MagicMock()
assert before_send(event, hint) is None
def test_before_send_passes_non_defunct_neo4j_log():
"""Test that before_send passes through neo4j.io logs that are not about defunct connections."""
log_record = _make_log_record(
msg="Some other neo4j transport error",
name="neo4j.io",
)
hint = {"log_record": log_record}
event = MagicMock()
assert before_send(event, hint) == event
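The `_make_log_record` helper matters because `LogRecord` defers %-formatting: `record.msg` is only the template, while the interesting text ("defunct") lives in `args`. A minimal demonstration of the difference the fix depends on:

```python
import logging

record = logging.LogRecord(
    name="neo4j.io",
    level=logging.ERROR,
    pathname="",
    lineno=0,
    msg="[#%04X] error: %s",
    args=(0xE5CC, "Failed to read from defunct connection"),
    exc_info=None,
)
# record.msg is only the %-template; "defunct" is not in it.
assert "defunct" not in record.msg
# getMessage() performs the deferred formatting, like a handler would.
print(record.getMessage())  # [#E5CC] error: Failed to read from defunct connection
```

This is why `before_send` must call `getMessage()` rather than read `.msg` directly, and why the tests build real `LogRecord` objects instead of `MagicMock`s.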
+50 -5
@@ -33,6 +33,7 @@ from prowler.providers.m365.m365_provider import M365Provider
from prowler.providers.mongodbatlas.mongodbatlas_provider import MongodbatlasProvider
from prowler.providers.openstack.openstack_provider import OpenstackProvider
from prowler.providers.oraclecloud.oraclecloud_provider import OraclecloudProvider
from prowler.providers.vercel.vercel_provider import VercelProvider
class TestMergeDicts:
@@ -128,6 +129,7 @@ class TestReturnProwlerProvider:
(Provider.ProviderChoices.CLOUDFLARE.value, CloudflareProvider),
(Provider.ProviderChoices.OPENSTACK.value, OpenstackProvider),
(Provider.ProviderChoices.IMAGE.value, ImageProvider),
(Provider.ProviderChoices.VERCEL.value, VercelProvider),
],
)
def test_return_prowler_provider(self, provider_type, expected_provider):
@@ -218,6 +220,24 @@ class TestProwlerProviderConnectionTest:
registry_token="tok123",
)
@patch("api.utils.return_prowler_provider")
def test_prowler_provider_connection_test_vercel_provider(
self, mock_return_prowler_provider
):
"""Test connection test for Vercel provider passes team_id."""
provider = MagicMock()
provider.uid = "team_abcdef1234567890"
provider.provider = Provider.ProviderChoices.VERCEL.value
provider.secret.secret = {"api_token": "vercel_token_123"}
mock_return_prowler_provider.return_value = MagicMock()
prowler_provider_connection_test(provider)
mock_return_prowler_provider.return_value.test_connection.assert_called_once_with(
api_token="vercel_token_123",
team_id="team_abcdef1234567890",
raise_on_exception=False,
)
@patch("api.utils.return_prowler_provider")
def test_prowler_provider_connection_test_image_provider_no_creds(
self, mock_return_prowler_provider
@@ -284,6 +304,10 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.OPENSTACK.value,
{},
),
(
Provider.ProviderChoices.VERCEL.value,
{"team_id": "provider_uid"},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):
@@ -782,11 +806,15 @@ class TestProwlerIntegrationConnectionTest:
}
integration.configuration = {}
# Mock successful JIRA connection with projects
# Mock successful JIRA connection with projects and issue types
mock_connection = MagicMock()
mock_connection.is_connected = True
mock_connection.error = None
mock_connection.projects = {"PROJ1": "Project 1", "PROJ2": "Project 2"}
mock_connection.issue_types = {
"PROJ1": ["Task", "Bug"],
"PROJ2": ["Task", "Story"],
}
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
@@ -815,6 +843,12 @@ class TestProwlerIntegrationConnectionTest:
"PROJ2": "Project 2",
}
# Verify issue types were saved to integration configuration
assert integration.configuration["issue_types"] == {
"PROJ1": ["Task", "Bug"],
"PROJ2": ["Task", "Story"],
}
# Verify integration.save() was called
integration.save.assert_called_once()
@@ -838,6 +872,7 @@ class TestProwlerIntegrationConnectionTest:
mock_connection.is_connected = False
mock_connection.error = Exception("Authentication failed: Invalid credentials")
mock_connection.projects = {} # Empty projects when connection fails
mock_connection.issue_types = {} # Empty issue types when connection fails
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
@@ -863,6 +898,9 @@ class TestProwlerIntegrationConnectionTest:
# Verify empty projects dict was saved to integration configuration
assert integration.configuration["projects"] == {}
# Verify empty issue types dict was saved to integration configuration
assert integration.configuration["issue_types"] == {}
# Verify integration.save() was called even on connection failure
integration.save.assert_called_once()
@@ -881,11 +919,11 @@ class TestProwlerIntegrationConnectionTest:
"domain": "example.atlassian.net",
}
integration.configuration = {
"issue_types": ["Task"], # Existing configuration
"issue_types": {"OLD_PROJ": ["Task"]}, # Existing configuration
"projects": {"OLD_PROJ": "Old Project"}, # Will be overwritten
}
# Mock successful JIRA connection with new projects
# Mock successful JIRA connection with new projects and issue types
mock_connection = MagicMock()
mock_connection.is_connected = True
mock_connection.error = None
@@ -893,6 +931,10 @@ class TestProwlerIntegrationConnectionTest:
"NEW_PROJ1": "New Project 1",
"NEW_PROJ2": "New Project 2",
}
mock_connection.issue_types = {
"NEW_PROJ1": ["Task", "Bug"],
"NEW_PROJ2": ["Story"],
}
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
@@ -910,8 +952,11 @@ class TestProwlerIntegrationConnectionTest:
"NEW_PROJ2": "New Project 2",
}
# Verify other configuration fields were preserved
assert integration.configuration["issue_types"] == ["Task"]
# Verify issue types were also updated
assert integration.configuration["issue_types"] == {
"NEW_PROJ1": ["Task", "Bug"],
"NEW_PROJ2": ["Story"],
}
# Verify integration.save() was called
integration.save.assert_called_once()
File diff suppressed because it is too large
+23
@@ -39,6 +39,7 @@ if TYPE_CHECKING:
)
from prowler.providers.openstack.openstack_provider import OpenstackProvider
from prowler.providers.oraclecloud.oraclecloud_provider import OraclecloudProvider
from prowler.providers.vercel.vercel_provider import VercelProvider
class CustomOAuth2Client(OAuth2Client):
@@ -94,6 +95,7 @@ def return_prowler_provider(
| MongodbatlasProvider
| OpenstackProvider
| OraclecloudProvider
| VercelProvider
):
"""Return the Prowler provider class based on the given provider type.
@@ -175,6 +177,10 @@ def return_prowler_provider(
from prowler.providers.image.image_provider import ImageProvider
prowler_provider = ImageProvider
case Provider.ProviderChoices.VERCEL.value:
from prowler.providers.vercel.vercel_provider import VercelProvider
prowler_provider = VercelProvider
case _:
raise ValueError(f"Provider type {provider.provider} not supported")
return prowler_provider
@@ -235,6 +241,11 @@ def get_prowler_provider_kwargs(
# clouds_yaml_content, clouds_yaml_cloud and provider_id are validated
# in the provider itself, so it's not needed here.
pass
elif provider.provider == Provider.ProviderChoices.VERCEL.value:
prowler_provider_kwargs = {
**prowler_provider_kwargs,
"team_id": provider.uid,
}
elif provider.provider == Provider.ProviderChoices.IMAGE.value:
# Detect whether uid is a registry URL (e.g. "docker.io/andoniaf") or
# a concrete image reference (e.g. "docker.io/andoniaf/myimage:latest").
@@ -281,6 +292,7 @@ def initialize_prowler_provider(
| MongodbatlasProvider
| OpenstackProvider
| OraclecloudProvider
| VercelProvider
):
"""Initialize a Prowler provider instance based on the given provider type.
@@ -332,6 +344,13 @@ def prowler_provider_connection_test(provider: Provider) -> Connection:
"raise_on_exception": False,
}
return prowler_provider.test_connection(**openstack_kwargs)
elif provider.provider == Provider.ProviderChoices.VERCEL.value:
vercel_kwargs = {
**prowler_provider_kwargs,
"team_id": provider.uid,
"raise_on_exception": False,
}
return prowler_provider.test_connection(**vercel_kwargs)
elif provider.provider == Provider.ProviderChoices.IMAGE.value:
image_kwargs = {
"image": provider.uid,
@@ -415,8 +434,12 @@ def prowler_integration_connection_test(integration: Integration) -> Connection:
raise_on_exception=False,
)
project_keys = jira_connection.projects if jira_connection.is_connected else {}
issue_types = (
jira_connection.issue_types if jira_connection.is_connected else {}
)
with rls_transaction(str(integration.tenant_id)):
integration.configuration["projects"] = project_keys
integration.configuration["issue_types"] = issue_types
integration.save()
return jira_connection
elif integration.integration_type == Integration.IntegrationChoices.SLACK:
@@ -69,8 +69,10 @@ class SecurityHubConfigSerializer(BaseValidateSerializer):
class JiraConfigSerializer(BaseValidateSerializer):
domain = serializers.CharField(read_only=True)
issue_types = serializers.ListField(
read_only=True, child=serializers.CharField(), default=["Task"]
issue_types = serializers.DictField(
read_only=True,
child=serializers.ListField(child=serializers.CharField()),
default={},
)
projects = serializers.DictField(read_only=True)
@@ -404,6 +404,17 @@ from rest_framework_json_api import serializers
},
"required": ["clouds_yaml_content", "clouds_yaml_cloud"],
},
{
"type": "object",
"title": "Vercel API Token",
"properties": {
"api_token": {
"type": "string",
"description": "Vercel API token for authentication. Can be scoped to a specific team.",
},
},
"required": ["api_token"],
},
]
}
)
+67 -5
@@ -1573,6 +1573,8 @@ class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
serializer = OpenStackCloudsYamlProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.IMAGE.value:
serializer = ImageProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.VERCEL.value:
serializer = VercelProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"provider": f"Provider type not supported {provider_type}"}
@@ -1779,6 +1781,13 @@ class ImageProviderSecret(serializers.Serializer):
return attrs
class VercelProviderSecret(serializers.Serializer):
api_token = serializers.CharField()
class Meta:
resource_name = "provider-secrets"
class AlibabaCloudProviderSecret(serializers.Serializer):
access_key_id = serializers.CharField()
access_key_secret = serializers.CharField()
@@ -2713,11 +2722,11 @@ class BaseWriteIntegrationSerializer(BaseWriteSerializer):
)
config_serializer = JiraConfigSerializer
# Create non-editable configuration for JIRA integration
default_jira_issue_types = ["Task"]
# issue_types will be populated per project when connection is tested
configuration.update(
{
"projects": {},
"issue_types": default_jira_issue_types,
"issue_types": {},
"domain": credentials.get("domain"),
}
)
@@ -2932,13 +2941,25 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
return representation
class IntegrationJiraIssueTypesSerializer(BaseSerializerV1):
"""
Serializer for Jira issue types response.
"""
project_key = serializers.CharField(read_only=True)
issue_types = serializers.ListField(child=serializers.CharField(), read_only=True)
class JSONAPIMeta:
resource_name = "jira-issue-types"
class IntegrationJiraDispatchSerializer(BaseSerializerV1):
"""
Serializer for dispatching findings to JIRA integration.
"""
project_key = serializers.CharField(required=True)
issue_type = serializers.ChoiceField(required=True, choices=["Task"])
issue_type = serializers.CharField(required=True)
class JSONAPIMeta:
resource_name = "integrations-jira-dispatches"
@@ -2967,6 +2988,23 @@ class IntegrationJiraDispatchSerializer(BaseSerializerV1):
}
)
issue_type = attrs.get("issue_type")
available_issue_types = integration_instance.configuration.get(
"issue_types", {}
)
# Handle old format where issue_types was a flat list (e.g., ["Task"])
if not isinstance(available_issue_types, dict):
available_issue_types = {}
project_issue_types = available_issue_types.get(project_key, [])
if project_issue_types and issue_type not in project_issue_types:
raise ValidationError(
{
"issue_type": f"The issue type '{issue_type}' is not available for project '{project_key}'. "
f"Available types: {', '.join(project_issue_types)}. "
"Refresh the connection if this is an error."
}
)
return validated_attrs
@@ -4147,6 +4185,7 @@ class FindingGroupSerializer(BaseSerializerV1):
check_description = serializers.CharField(required=False, allow_null=True)
severity = serializers.CharField()
status = serializers.CharField()
muted = serializers.BooleanField()
impacted_providers = serializers.ListField(
child=serializers.CharField(), required=False
)
@@ -4154,9 +4193,25 @@ class FindingGroupSerializer(BaseSerializerV1):
resources_total = serializers.IntegerField()
pass_count = serializers.IntegerField()
fail_count = serializers.IntegerField()
manual_count = serializers.IntegerField()
pass_muted_count = serializers.IntegerField()
fail_muted_count = serializers.IntegerField()
manual_muted_count = serializers.IntegerField()
muted_count = serializers.IntegerField()
new_count = serializers.IntegerField()
changed_count = serializers.IntegerField()
new_fail_count = serializers.IntegerField()
new_fail_muted_count = serializers.IntegerField()
new_pass_count = serializers.IntegerField()
new_pass_muted_count = serializers.IntegerField()
new_manual_count = serializers.IntegerField()
new_manual_muted_count = serializers.IntegerField()
changed_fail_count = serializers.IntegerField()
changed_fail_muted_count = serializers.IntegerField()
changed_pass_count = serializers.IntegerField()
changed_pass_muted_count = serializers.IntegerField()
changed_manual_count = serializers.IntegerField()
changed_manual_muted_count = serializers.IntegerField()
first_seen_at = serializers.DateTimeField(required=False, allow_null=True)
last_seen_at = serializers.DateTimeField(required=False, allow_null=True)
failing_since = serializers.DateTimeField(required=False, allow_null=True)
@@ -4170,16 +4225,21 @@ class FindingGroupResourceSerializer(BaseSerializerV1):
Serializer for Finding Group Resources - resources within a finding group.
Returns individual resources with their current status, severity,
and timing information.
and timing information. Orphan findings (without any resource) expose the
finding id as `id` so the row stays identifiable in the UI.
"""
id = serializers.UUIDField(source="resource_id")
id = serializers.UUIDField(source="row_id")
resource = serializers.SerializerMethodField()
provider = serializers.SerializerMethodField()
finding_id = serializers.UUIDField()
status = serializers.CharField()
severity = serializers.CharField()
muted = serializers.BooleanField()
delta = serializers.CharField(required=False, allow_null=True)
first_seen_at = serializers.DateTimeField(required=False, allow_null=True)
last_seen_at = serializers.DateTimeField(required=False, allow_null=True)
muted_reason = serializers.CharField(required=False, allow_null=True)
class JSONAPIMeta:
resource_name = "finding-group-resources"
@@ -4193,6 +4253,7 @@ class FindingGroupResourceSerializer(BaseSerializerV1):
"service": {"type": "string"},
"region": {"type": "string"},
"type": {"type": "string"},
"resource_group": {"type": "string"},
},
}
)
@@ -4204,6 +4265,7 @@ class FindingGroupResourceSerializer(BaseSerializerV1):
"service": obj.get("resource_service", ""),
"region": obj.get("resource_region", ""),
"type": obj.get("resource_type", ""),
"resource_group": obj.get("resource_group", ""),
}
@extend_schema_field(
File diff suppressed because it is too large
+3 -1
@@ -17,8 +17,10 @@ celery_app.config_from_object("django.conf:settings", namespace="CELERY")
celery_app.conf.update(result_extended=True, result_expires=None)
celery_app.conf.broker_transport_options = {
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT,
"queue_order_strategy": "priority",
}
celery_app.conf.task_default_priority = 6
celery_app.conf.result_backend_transport_options = {
"visibility_timeout": BROKER_VISIBILITY_TIMEOUT
}
+5
@@ -299,3 +299,8 @@ DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# SAML requirement
CSRF_COOKIE_SECURE = True
SESSION_COOKIE_SECURE = True
# Attack Paths
ATTACK_PATHS_SCAN_STALE_THRESHOLD_MINUTES = env.int(
"ATTACK_PATHS_SCAN_STALE_THRESHOLD_MINUTES", 2880
) # 48h
+1 -1
@@ -15,7 +15,7 @@ from config.django.production import LOGGING as DJANGO_LOGGERS, DEBUG # noqa: E
from config.custom_logging import BackendLogger # noqa: E402
BIND_ADDRESS = env("DJANGO_BIND_ADDRESS", default="127.0.0.1")
PORT = env("DJANGO_PORT", default=8000)
PORT = env("DJANGO_PORT", default=8080)
# Server settings
bind = f"{BIND_ADDRESS}:{PORT}"
+43 -1
@@ -1,10 +1,52 @@
+from urllib.parse import quote
+
 from config.env import env

+_VALID_SCHEMES = {"redis", "rediss"}
+
+
+def _build_celery_broker_url(
+    scheme: str,
+    username: str,
+    password: str,
+    host: str,
+    port: str,
+    db: str,
+) -> str:
+    if scheme not in _VALID_SCHEMES:
+        raise ValueError(
+            f"Invalid VALKEY_SCHEME '{scheme}'. Must be one of: {', '.join(sorted(_VALID_SCHEMES))}"
+        )
+
+    encoded_username = quote(username, safe="") if username else ""
+    encoded_password = quote(password, safe="") if password else ""
+
+    auth = ""
+    if encoded_username and encoded_password:
+        auth = f"{encoded_username}:{encoded_password}@"
+    elif encoded_password:
+        auth = f":{encoded_password}@"
+    elif encoded_username:
+        auth = f"{encoded_username}@"
+
+    return f"{scheme}://{auth}{host}:{port}/{db}"
+
+
+VALKEY_SCHEME = env("VALKEY_SCHEME", default="redis")
+VALKEY_USERNAME = env("VALKEY_USERNAME", default="")
+VALKEY_PASSWORD = env("VALKEY_PASSWORD", default="")
 VALKEY_HOST = env("VALKEY_HOST", default="valkey")
 VALKEY_PORT = env("VALKEY_PORT", default="6379")
 VALKEY_DB = env("VALKEY_DB", default="0")

-CELERY_BROKER_URL = f"redis://{VALKEY_HOST}:{VALKEY_PORT}/{VALKEY_DB}"
+CELERY_BROKER_URL = _build_celery_broker_url(
+    VALKEY_SCHEME,
+    VALKEY_USERNAME,
+    VALKEY_PASSWORD,
+    VALKEY_HOST,
+    VALKEY_PORT,
+    VALKEY_DB,
+)

 CELERY_RESULT_BACKEND = "django-db"
 CELERY_TASK_TRACK_STARTED = True
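The point of the new helper is that credentials are percent-encoded before being embedded in the URL, so characters like `@` or `/` in a password cannot corrupt the `scheme://user:pass@host:port/db` structure. A standalone sketch of the same logic (names mirror the helper but this is not the project's code):

```python
from urllib.parse import quote

VALID_SCHEMES = {"redis", "rediss"}


def build_broker_url(scheme, username, password, host, port, db):
    # Reject anything that is not a Redis-style scheme, as the helper does.
    if scheme not in VALID_SCHEMES:
        raise ValueError(f"Invalid scheme '{scheme}'")
    # Percent-encode credentials so '@' or '/' in a password cannot
    # break the URL into the wrong host/path components.
    user = quote(username, safe="") if username else ""
    pwd = quote(password, safe="") if password else ""
    auth = ""
    if user and pwd:
        auth = f"{user}:{pwd}@"
    elif pwd:
        auth = f":{pwd}@"
    elif user:
        auth = f"{user}@"
    return f"{scheme}://{auth}{host}:{port}/{db}"


# A password containing '@' and '/' stays intact after encoding.
print(build_broker_url("rediss", "celery", "p@ss/w0rd", "valkey", "6379", "0"))
# → rediss://celery:p%40ss%2Fw0rd@valkey:6379/0
```

Without credentials the output degrades to the old unauthenticated form, `redis://valkey:6379/0`, so existing deployments keep working.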
+15 -2
@@ -1,4 +1,5 @@
 import sentry_sdk

+from config.env import env

 IGNORED_EXCEPTIONS = [
@@ -85,8 +86,20 @@ def before_send(event, hint):
     # Ignore logs with the ignored_exceptions
     # https://docs.python.org/3/library/logging.html#logrecord-objects
     if "log_record" in hint:
-        log_msg = hint["log_record"].msg
-        log_lvl = hint["log_record"].levelno
+        log_record = hint["log_record"]
+        log_msg = log_record.getMessage()
+        log_lvl = log_record.levelno
+
+        # The Neo4j driver logs transient connection errors (defunct
+        # connections, resets) at ERROR level via the `neo4j.io` logger.
+        # `RetryableSession` handles these with retries. If all retries
+        # are exhausted, the exception propagates and Sentry captures
+        # it as a normal exception event.
+        if (
+            getattr(log_record, "name", "").startswith("neo4j.io")
+            and "defunct" in log_msg
+        ):
+            return None

         # Handle Error and Critical events and discard the rest
         if log_lvl <= 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
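The switch from `.msg` to `getMessage()` matters for the substring checks that follow: `.msg` is the raw `%`-style format string, while `getMessage()` interpolates the record's arguments, so words carried in the arguments become visible to `in` checks. A standalone illustration (the message text is invented for the example):

```python
import logging

# Build a LogRecord the way the logging module does for
# logger.error("Failed to read from %s", "defunct connection bolt-1")
record = logging.LogRecord(
    name="neo4j.io",
    level=logging.ERROR,
    pathname="io.py",
    lineno=0,
    msg="Failed to read from %s",
    args=("defunct connection bolt-1",),
    exc_info=None,
)

print(record.msg)           # raw format string: no 'defunct' here
print(record.getMessage())  # arguments interpolated: 'defunct' visible
```

Matching on `.msg` would have silently missed any error whose distinguishing text arrives via the logging arguments.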
+110 -2
@@ -111,8 +111,9 @@ def disable_logging():
     logging.disable(logging.CRITICAL)


-@pytest.fixture(scope="session", autouse=True)
-def create_test_user(django_db_setup, django_db_blocker):
+@pytest.fixture(scope="session")
+def _session_test_user(django_db_setup, django_db_blocker):
+    """Create the test user once per session. Internal; use create_test_user instead."""
     with django_db_blocker.unblock():
         user = User.objects.create_user(
             name="testing",
@@ -122,6 +123,21 @@ def create_test_user(django_db_setup, django_db_blocker):
     return user


+@pytest.fixture(autouse=True)
+def create_test_user(_session_test_user, django_db_blocker):
+    """Re-create the session-scoped test user when a TransactionTestCase
+    has truncated the users table."""
+    with django_db_blocker.unblock():
+        if not User.objects.filter(pk=_session_test_user.pk).exists():
+            User.objects.create_user(
+                id=_session_test_user.pk,
+                name="testing",
+                email=TEST_USER,
+                password=TEST_PASSWORD,
+            )
+    return _session_test_user
+
+
 @pytest.fixture(scope="function")
 def create_test_user_rbac(django_db_setup, django_db_blocker, tenants_fixture):
     with django_db_blocker.unblock():
@@ -549,6 +565,12 @@ def providers_fixture(tenants_fixture):
         alias="googleworkspace_testing",
         tenant_id=tenant.id,
     )
+    provider13 = Provider.objects.create(
+        provider="vercel",
+        uid="team_abcdef1234567890ab",
+        alias="vercel_testing",
+        tenant_id=tenant.id,
+    )
     return (
         provider1,
@@ -563,6 +585,7 @@ def providers_fixture(tenants_fixture):
         provider10,
         provider11,
         provider12,
+        provider13,
     )
@@ -2012,6 +2035,7 @@ def finding_groups_fixture(
             "CheckId": "s3_bucket_public_access",
             "checktitle": "Ensure S3 buckets do not allow public access",
             "Description": "S3 buckets should be configured to restrict public access.",
+            "resourcegroup": "storage",
         },
         first_seen_at="2024-01-02T00:00:00Z",
         muted=False,
@@ -2036,6 +2060,7 @@
             "CheckId": "s3_bucket_public_access",
             "checktitle": "Ensure S3 buckets do not allow public access",
             "Description": "S3 buckets should be configured to restrict public access.",
+            "resourcegroup": "storage",
         },
         first_seen_at="2024-01-03T00:00:00Z",
         muted=False,
@@ -2234,6 +2259,89 @@
     return findings


+@pytest.fixture
+def finding_groups_title_variants_fixture(
+    tenants_fixture, providers_fixture, scans_fixture, resources_fixture
+):
+    """
+    Two providers report the same check_id with different checktitle values.
+    Simulates a Prowler version upgrade where the check title changed but the
+    check_id stayed the same. Used to verify that check_title__icontains
+    resolves to check_id first, so results include all providers regardless
+    of which title variant matches the search term.
+    """
+    tenant = tenants_fixture[0]
+    provider1, provider2, *_ = providers_fixture
+    scan1, scan2, *_ = scans_fixture
+    resource1, resource2, *_ = resources_fixture
+
+    findings = []
+
+    # Provider 1 — OLD title variant
+    finding_old = Finding.objects.create(
+        tenant_id=tenant.id,
+        uid="fg_title_variant_old",
+        scan=scan1,
+        delta="new",
+        status=Status.FAIL,
+        status_extended="Secret scanning not enabled",
+        impact=Severity.high,
+        impact_extended="High risk",
+        severity=Severity.high,
+        raw_result={"status": Status.FAIL, "severity": Severity.high},
+        tags={},
+        check_id="github_secret_scanning_enabled",
+        check_metadata={
+            "CheckId": "github_secret_scanning_enabled",
+            "checktitle": "Ensure repository has secret scanning enabled",
+            "Description": "Checks if secret scanning is enabled.",
+        },
+        first_seen_at="2024-01-01T00:00:00Z",
+        muted=False,
+    )
+    finding_old.add_resources([resource1])
+    findings.append(finding_old)
+
+    # Provider 2 — NEW title variant (same check_id, different checktitle)
+    finding_new = Finding.objects.create(
+        tenant_id=tenant.id,
+        uid="fg_title_variant_new",
+        scan=scan2,
+        delta="new",
+        status=Status.FAIL,
+        status_extended="Secret scanning not enabled on repo",
+        impact=Severity.high,
+        impact_extended="High risk",
+        severity=Severity.high,
+        raw_result={"status": Status.FAIL, "severity": Severity.high},
+        tags={},
+        check_id="github_secret_scanning_enabled",
+        check_metadata={
+            "CheckId": "github_secret_scanning_enabled",
+            "checktitle": "Check if secret scanning is enabled in GitHub",
+            "Description": "Checks if secret scanning is enabled.",
+        },
+        first_seen_at="2024-01-02T00:00:00Z",
+        muted=False,
+    )
+    finding_new.add_resources([resource2])
+    findings.append(finding_new)
+
+    from tasks.jobs.scan import aggregate_finding_group_summaries
+
+    aggregate_finding_group_summaries(
+        tenant_id=str(tenant.id),
+        scan_id=str(scan1.id),
+    )
+    aggregate_finding_group_summaries(
+        tenant_id=str(tenant.id),
+        scan_id=str(scan2.id),
+    )
+
+    return findings
+
+
 def pytest_collection_modifyitems(items):
     """Ensure test_rbac.py is executed first."""
     items.sort(key=lambda item: 0 if "test_rbac.py" in item.nodeid else 1)
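The behaviour the title-variants fixture exercises can be sketched outside Django (a hypothetical dict-based model, not the project's queryset code): a title search first resolves matching `check_id`s, then returns every finding sharing those ids, so both title variants surface even when only one title matches the search term.

```python
findings = [
    {"check_id": "github_secret_scanning_enabled",
     "check_title": "Ensure repository has secret scanning enabled",
     "provider": "p1"},
    {"check_id": "github_secret_scanning_enabled",
     "check_title": "Check if secret scanning is enabled in GitHub",
     "provider": "p2"},
]


def search_by_title(findings, term):
    term = term.lower()
    # Step 1: resolve the search term to check_ids via an
    # icontains-style match on the titles.
    ids = {f["check_id"] for f in findings if term in f["check_title"].lower()}
    # Step 2: return all findings sharing those check_ids, regardless
    # of which title variant actually matched.
    return [f for f in findings if f["check_id"] in ids]


# "repository" only appears in the OLD title, yet both providers match.
results = search_by_title(findings, "repository")
print([f["provider"] for f in results])
```

A naive filter on the title alone would have returned only `p1`, hiding the second provider after a title rename.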

Some files were not shown because too many files have changed in this diff