Compare commits


258 Commits

Author SHA1 Message Date
Prowler Bot df8a7220ff feat(oraclecloud): Update commercial regions (#10082)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-02-16 14:23:28 +01:00
Daniel Barranquero a106cdf4c9 fix: oci regions actions labels (#10083) 2026-02-16 14:23:17 +01:00
Daniel Barranquero a86f0b95bc fix(oci): update regions script to handle raw credentials (#10081) 2026-02-16 14:03:27 +01:00
Josema Camacho bb34f6cc3d refactor(api): remove graph_database and is_graph_database_deleted from AttackPathsScan (#10077) 2026-02-16 12:46:49 +01:00
Daniel Barranquero be516f1dfc feat(openstack): Add 7 New Compute Security Checks (#9944) 2026-02-16 11:46:48 +01:00
Copilot 90e317d39f fix(kms): detect public access for any KMS action, not just kms:* (#10071)
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jfagoagas <16007882+jfagoagas@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-02-16 10:12:29 +01:00
Pedro Martín 21bdbacdfb chore(readme): update and add skill (#10067)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-02-16 09:31:21 +01:00
Rubén De la Torre Vico 75ee07c6e1 chore(gcp): enhance metadata for logging service (#9648)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-02-13 16:37:07 +01:00
Rubén De la Torre Vico ddc5d879e0 chore(gcp): enhance metadata for kms service (#9647)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-02-13 16:32:26 +01:00
Rubén De la Torre Vico 006c2dc754 chore(gcp): enhance metadata for iam service (#9646)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-02-13 16:24:52 +01:00
Rubén De la Torre Vico 4981d3fc38 chore(gcp): enhance metadata for gke service (#9645)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 16:14:14 +01:00
Rubén De la Torre Vico cceaf1ea54 chore(gcp): enhance metadata for gcr service (#9644)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-02-13 15:55:00 +01:00
Rubén De la Torre Vico b436da27c8 chore(gcp): enhance metadata for dns service (#9643)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 15:47:30 +01:00
Rubén De la Torre Vico 82be83c668 chore(gcp): enhance metadata for dataproc service (#9642)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 14:57:33 +01:00
Andoni Alonso 4f18bfc33c feat(iam): add ECS Exec privilege escalation detection (ECS-006) (#10066) 2026-02-13 14:45:33 +01:00
Rubén De la Torre Vico 941f9b7e0b chore(gcp): enhance metadata for compute service (#9641)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 14:29:38 +01:00
kushpatel321 9da0b0c0b1 feat(github): add organization domain verification check (#10033)
Co-authored-by: Kush321 <kushp2018@gmail.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-02-13 13:41:17 +01:00
Rubén De la Torre Vico 8c1da0732d chore(gcp): enhance metadata for cloudsql service (#9639)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 13:35:34 +01:00
Josema Camacho 02b58d8a31 fix(api): mark attack paths scan as failed when celery task fails (#10065) 2026-02-13 13:20:38 +01:00
Rubén De la Torre Vico 3defbcd386 chore(gcp): enhance metadata for cloudstorage service (#9640)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 13:17:58 +01:00
Josema Camacho ceb4691c36 build(deps): bump cryptography to 44.0.3 and py-ocsf-models to 0.8.1 (#10059) 2026-02-13 12:36:38 +01:00
Pepe Fagoaga 4be8831ee1 docs: add proxy/load balancer UI rebuild requirements (#10064) 2026-02-13 11:11:05 +01:00
Andoni Alonso da23d62e6a docs(image): add Image provider CLI documentation (#9986) 2026-02-13 11:00:03 +01:00
Rubén De la Torre Vico 222db94a48 chore(gcp): enhance metadata for bigquery service (#9638)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 10:57:31 +01:00
Hugo Pereira Brito c33565a127 fix(sdk): update openstacksdk to fix pip install on systems without C compiler (#10055) 2026-02-13 10:49:01 +01:00
Pedro Martín 961b247d36 feat(compliance): add csa ccm for the alibabacloud provider (#10061) 2026-02-13 10:36:29 +01:00
Rubén De la Torre Vico 6abd5186aa chore(gcp): enhance metadata for apikeys service (#9637)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-13 10:35:05 +01:00
Pedro Martín 627088e214 feat(compliance): add csa ccm for the oraclecloud provider (#10057) 2026-02-12 18:06:51 +01:00
Josema Camacho 93ac38ca90 feat(attack-pahts--aws-queries): The rest of Path Finding paths queries (#10008) 2026-02-12 17:09:08 +01:00
Andoni Alonso aa7490aab4 feat(image): add container image provider for CLI scanning (#9984) 2026-02-12 16:36:48 +01:00
Daniel Barranquero b94c8a5e5e feat(api): add OpenStack provider support (#10003) 2026-02-12 14:40:19 +01:00
Daniel Barranquero e6bea9f25a feat(oraclecloud): add automated OCI regions updater script and CI workflow (#10020)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-02-12 14:35:43 +01:00
dependabot[bot] 1f4e308374 build(deps): bump pillow from 12.1.0 to 12.1.1 in /api (#10027)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-02-12 14:26:03 +01:00
Pedro Martín 4d569d5b79 feat(compliance): add csa ccm for the gcp provider (#10042) 2026-02-12 14:13:24 +01:00
Alejandro Bailo 5b038e631a refactor(ui): centralize provider type filter sanitization in server actions (#10043) 2026-02-12 14:12:37 +01:00
Alejandro Bailo c5707ae9f1 chore(ui): update npm dependencies to fix security vulnerabilities (#10052) 2026-02-12 14:02:05 +01:00
Pedro Martín 29090adb03 feat(compliance): add csa ccm for the azure provider (#10039) 2026-02-12 13:35:22 +01:00
Hugo Pereira Brito 78bd9adeed chore(cloudflare): parallelize zone API calls with threading (#9982)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-02-12 13:15:51 +01:00
Pedro Martín f55983a77d feat(compliance): add csa ccm 4.0 for the aws provider (#10018) 2026-02-12 13:10:59 +01:00
Hugo Pereira Brito 52f98f1704 chore(ci): update org members list in PR labeler (#10053) 2026-02-12 13:04:35 +01:00
Andoni Alonso 3afa98084f chore(ci): update Josema user for labeling purposes (#10041) 2026-02-12 11:46:14 +01:00
Alejandro Bailo b0ee914825 chore(ui): improve changelog wording for v1.18.2 bug fixes (#10044) 2026-02-12 11:30:56 +01:00
Andoni Alonso eabe488437 feat(aws): update privilege escalation check with pathfinding.cloud patterns (#9922) 2026-02-12 09:39:39 +01:00
Alejandro Bailo 8104382cc1 fix(ui): reapply filter transition opacity overlay on filter changes (#10036) 2026-02-11 22:13:33 +01:00
Alejandro Bailo 592c7bac81 fix(ui): move default muted filter from middleware to client-side hook (#10034) 2026-02-11 20:58:58 +01:00
Alejandro Bailo 3aefde14aa revert: re-integrate signalFilterChange into useUrlFilters (#10028) (#10032) 2026-02-11 20:21:58 +01:00
Alejandro Bailo 02f3e77eaf fix(ui): re-integrate signalFilterChange into useUrlFilters and always reset page on filter change (#10028) 2026-02-11 20:06:26 +01:00
Alejandro Bailo bcd7b2d723 fix(ui): remove useTransition and shared context from useUrlFilters (#10025) 2026-02-11 18:57:48 +01:00
Alejandro Bailo 86946f3a84 fix(ui): fix findings filter silent reverts by replacing useRelatedFilters effect with pure derivation (#10021)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 17:57:38 +01:00
Andoni Alonso fce1e4f3d2 feat(m365): add defender_safe_attachments_policy_enabled security check (#9833)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-11 15:42:11 +01:00
Andoni Alonso 5d490fa185 feat(m365): add defender_atp_safe_attachments_and_docs_configured security check (#9837)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-11 15:21:06 +01:00
Alejandro Bailo ea847d8824 fix(ui): use local transitions for filter navigation to prevent silent reverts (#10017) 2026-02-11 14:41:03 +01:00
Andoni Alonso c5f7e80b20 feat(m365): add defender_safelinks_policy_enabled security check (#9832)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-11 13:03:32 +01:00
Alejandro Bailo f5345a3982 fix(ui): fix filter navigation and pagination bugs in findings and scans pages (#10013) 2026-02-11 11:18:29 +01:00
Adrián Peña b539514d8d docs: restructure SAML SSO guide for Okta App Catalog (#10012) 2026-02-11 11:15:59 +01:00
Hugo Pereira Brito 9acef41f96 fix(sdk): mute HPACK library logs to prevent token leakage (#10010) 2026-02-11 10:59:15 +01:00
Pedro Martín c40adce2ff feat(oraclecloud): add CIS 3.1 compliance framework (#9971) 2026-02-11 10:39:16 +01:00
Adrián Peña 378c2ff7f6 fix(saml): prevent SAML role mapping from removing last manage-account user (#10007) 2026-02-10 15:57:34 +01:00
Alejandro Bailo d54095abde feat(ui): add expandable row support to DataTable (#9940) 2026-02-10 15:51:55 +01:00
Alejandro Bailo a12cb5b6d6 feat(ui): add TreeView component for hierarchical data (#9911) 2026-02-10 15:26:07 +01:00
Andoni Alonso dde42b6a84 fix(github): combine --repository and --organization flags for scan scoping (#10001) 2026-02-10 14:34:59 +01:00
Prowler Bot 3316ec8d23 feat(aws): Update regions for AWS services (#9989)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-02-10 12:02:09 +01:00
Alejandro Bailo 71220b2696 fix(ui): replace HeroUI dropdowns with Radix ActionDropdown to fix overlay conflict (#9996) 2026-02-10 10:28:03 +01:00
Utwo dd730eec94 feat(app): Helm chart for deploying prowler in k8s (#9835)
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-09 16:43:12 +01:00
Alejandro Bailo afe2e0a09e fix(ui): guard against unknown provider types in ProviderTypeSelector (#9991) 2026-02-09 15:18:50 +01:00
Alejandro Bailo 507d163a50 docs(ui): mark changelog v1.18.1 as released with Prowler v5.18.1 (#9993) 2026-02-09 13:16:44 +01:00
Josema Camacho 530fef5106 chore(attack-pahts): Internet node is now created while Attack Paths scan (#9992) 2026-02-09 12:17:51 +01:00
Josema Camacho 5cbbceb3be chore(attack-pahts): improve attack paths queries attribution (#9983) 2026-02-09 11:07:12 +01:00
Daniel Barranquero fa189e7eb9 docs(openstack): add provider to introduction table (#9990) 2026-02-09 10:33:10 +01:00
Pedro Martín fb966213cc test(e2e): add e2e tests for alibabacloud provider (#9729) 2026-02-09 10:25:26 +01:00
Rubén De la Torre Vico 097a60ebc9 chore(azure): enhance metadata for monitor service (#9622)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-09 10:12:57 +01:00
Pedro Martín db03556ef6 chore(readme): update content (#9972) 2026-02-09 09:09:46 +01:00
Josema Camacho ecc8eaf366 feat(skills): create new Attack Packs queries in openCypher (#9975) 2026-02-06 11:57:33 +01:00
Alan Buscaglia 619d1ffc62 chore(ci): remove legacy E2E workflow superseded by optimized v2 (#9977) 2026-02-06 11:20:10 +01:00
Alan Buscaglia 9e20cb2e5a fix(ui): optimize scans page polling to avoid redundant API calls (#9974)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2026-02-06 10:49:15 +01:00
Prowler Bot cb76e77851 chore(api): Bump version to v1.20.0 (#9968)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-02-05 22:18:33 +01:00
Prowler Bot a24f818547 chore(release): Bump version to v5.19.0 (#9964)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-02-05 22:17:38 +01:00
Prowler Bot e07687ce67 docs: Update version to v5.18.0 (#9965)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-02-05 22:16:42 +01:00
Josema Camacho d016039b18 chore(ui): prepare changelog for v5.18.0 release (#9962) 2026-02-05 13:07:51 +01:00
Daniel Barranquero ac013ec6fc feat(docs): permission error while deploying docker (#9954) 2026-02-05 11:44:22 +01:00
Josema Camacho 4ebded6ab1 chore(attack-paths): A Neo4j database per tenant (#9955) 2026-02-05 10:29:37 +01:00
Alan Buscaglia 770269772a test(ui): stabilize auth and provider e2e flows (#9945) 2026-02-05 09:56:49 +01:00
Josema Camacho ab18ddb81a chore(api): prepare changelog for 5.18.0 release (#9960) 2026-02-05 09:34:54 +01:00
Pedro Martín cda7f89091 feat(azure): add HIPAA compliance framework (#9957) 2026-02-05 08:45:52 +01:00
Josema Camacho 658ae755ae chore(attack-paths): pin cartography to 0.126.1 (#9893)
Co-authored-by: César Arroba <cesar@prowler.com>
2026-02-04 19:20:15 +01:00
Daniel Barranquero 486719737b chore(sdk): prepare changelog for v5.18.0 (#9958) 2026-02-04 19:16:19 +01:00
Hugo Pereira Brito cb9ab03778 feat(aws): revert Adding check that AWS Auto Scaling group has deletion protection (#9956)
Co-authored-by: Josema Camacho <hello@josema.xyz>
2026-02-04 16:53:08 +01:00
Rubén De la Torre Vico 96a2262730 chore(azure): enhance metadata for storage service (#9628)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-04 16:40:47 +01:00
Serhii Sokolov 69818abdd0 feat(aws): Adding check that AWS Auto Scaling group has deletion protection (#9928)
Co-authored-by: Serhii Sokolov <serhii.sokolov@automat-it.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-02-04 13:17:13 +01:00
Rubén De la Torre Vico d447bdfe54 chore(azure): enhance metadata for network service (#9624)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-04 11:56:25 +01:00
Rubén De la Torre Vico b5095f5dc7 chore(azure): enhance metadata for sqlserver service (#9627)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-04 08:03:20 +01:00
Pawan Gambhir 9fe71d1046 fix(dashboard): resolve CSV/XLSX download failure with filters (#9946)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-02-03 18:47:42 +01:00
Hugo Pereira Brito 547c53e07c ci: add duplicate test name checker across providers (#9949) 2026-02-03 12:00:41 +01:00
Víctor Fernández Poyatos e1900fc776 fix(api): bump outdated versions (#9950) 2026-02-03 11:03:11 +01:00
Víctor Fernández Poyatos 3c0cb3cd58 chore: update poetry lock for SDK and API (#9941) 2026-02-03 09:44:02 +01:00
Daniel Barranquero e66c9864f5 fix: modify tests files name (#9942) 2026-02-03 08:05:27 +01:00
Hugo Pereira Brito b1f9971617 feat(api): add Cloudflare provider support (#9907) 2026-02-02 14:08:33 +01:00
Alex Baker d01f399cb2 docs(SECURITY.md): Update Link to Security (#9927) 2026-02-02 13:27:12 +01:00
Hugo Pereira Brito 2535b55951 fix(jira): truncate summary to 255 characters to prevent INVALID_INPUT error (#9926) 2026-02-02 12:11:03 +01:00
Rubén De la Torre Vico 0f55d6e21d chore(azure): enhance metadata for postgresql service (#9626)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-30 14:09:11 +01:00
Alan Buscaglia afb666e0da feat(ci): add test impact analysis for selective test execution (#9844) 2026-01-29 17:51:25 +01:00
Andoni Alonso 13cd882ed2 docs(developer-guide): add AI Skills reference to introduction (#9924) 2026-01-29 16:55:15 +01:00
Daniel Barranquero f65879346b feat(docs): add openstack cli first version (#9848)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-01-29 14:24:44 +01:00
Alejandro Bailo 013f2e5d32 fix(ui): resource drawer duplicates and performance optimization (#9921) 2026-01-29 14:15:05 +01:00
RosaRivas bcaa95f973 docs: replace membership by organization as it appears in prowler app (#9918) 2026-01-29 13:59:48 +01:00
Andoni Alonso 625dd37fd4 fix(docs): standardize authentication page titles across providers (#9920) 2026-01-29 13:56:03 +01:00
Alejandro Bailo fee2f84b89 fix(ui): patch React Server Components DoS vulnerability (GHSA-83fc-fqcc-2hmg) (#9917)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-29 13:37:19 +01:00
Daniel Barranquero 08730b4eb5 feat(openstack): add Openstack provider (#9811) 2026-01-29 12:54:18 +01:00
Hugo Pereira Brito c183a2a89a fix(azure): remove duplicated findings in entra_user_with_vm_access_has_mfa (#9914) 2026-01-29 12:20:15 +01:00
mohd4adil e97e31c7ca chore(aws): add support for trusted aws accounts in cross account checks for s3, eventbridge bus, eventbridge schema and dynamodb (#9692)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-29 09:13:34 +01:00
Rubén De la Torre Vico ad7be95dc3 chore(azure): enhance metadata for defender service (#9618)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-28 17:41:19 +01:00
Kay Agahd 04e2d15dd2 feat(aws): add check rds_instance_extended_support (#9865)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2026-01-28 16:49:35 +01:00
Hugo Pereira Brito 143d4b7c29 fix(docs): azure auth permissions and broken image (#9906) 2026-01-28 14:55:16 +01:00
Alejandro Bailo 0c5778d4a1 feat: resource view re-styling with new components (#9864) 2026-01-28 14:07:01 +01:00
Víctor Fernández Poyatos c77d9dd3a9 fix(api): enable autocommit for concurrent index migrations (#9905) 2026-01-28 13:26:16 +01:00
Víctor Fernández Poyatos 8783e963d3 feat(api): remove unused database indexes and improve new failed findings index (#9904) 2026-01-28 12:35:36 +01:00
Rubén De la Torre Vico 5407f3c68e chore(azure): enhance metadata for mysql service (#9623)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-28 11:05:01 +01:00
Alejandro Bailo 83ec3fa458 chore(ui): update CHANGELOG.md (#9901) 2026-01-28 09:21:24 +01:00
dependabot[bot] ac32f03de3 build(deps): bump azure-core from 1.35.0 to 1.38.0 in /api (#9790)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-27 17:17:33 +01:00
dependabot[bot] 7b11a716b9 build(deps): bump azure-core from 1.35.0 to 1.38.0 (#9791)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-27 17:11:10 +01:00
Pepe Fagoaga b2c18b69ee fix(api): handle AccessDenied during AssumeRole in events endpoint (#9899) 2026-01-27 15:32:51 +01:00
Andoni Alonso 727fafb147 fix(attack-paths): correct aws-security-groups-open-internet-facing query (#9892) 2026-01-27 14:20:05 +01:00
Hugo Pereira Brito 80c94faff9 feat(cloudflare): --account-id filter support (#9894)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-27 14:18:55 +01:00
Alejandro Bailo 065827cd38 feat: upgrade to Next.js 16.1.3 (#9826) 2026-01-27 14:02:31 +01:00
Hugo Pereira Brito 6bb8dc6168 feat(cloudflare): extend dns and zone services check coverage (#9426)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-27 13:48:26 +01:00
Sergio Garcia 9e7ecb39fa feat(aws): CloudTrail timeline for findings (#9101)
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-27 13:00:46 +01:00
Alan Buscaglia 255ce0e866 test(ui-e2e): reorganize auth tests and add documentation (#9788)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2026-01-27 12:53:24 +01:00
Pedro Martín dce406b39b feat(report): improve the way of reporting and adding reports (#9444) 2026-01-27 11:40:36 +01:00
Andoni Alonso 28c36cc5fc feat(attack-paths): add Bedrock and AttachRolePolicy privilege escalation queries (#9885) 2026-01-27 09:35:48 +01:00
Pedro Martín 8242b21f34 docs(providers): update check, compliance, and category counts (#9886) 2026-01-27 08:55:06 +01:00
Pepe Fagoaga 1897e38c6b chore(skill): add changelog entries at the bottom (#9890) 2026-01-27 07:46:50 +01:00
Andoni Alonso 3d6aa6c650 feat(m365): add defender_zap_for_teams_enabled security check (#9838)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-01-26 17:34:10 +01:00
Alejandro Bailo ee93ad6cbc chore(ui): bump changelog version to 1.18.0 (#9884)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-26 16:26:11 +01:00
Andoni Alonso 7f4c02c738 feat(m365): add exchange_shared_mailbox_sign_in_disabled check (#9828) 2026-01-26 16:00:28 +01:00
Hugo Pereira Brito d386730770 fix(ui): fetch all providers in scan page dropdown (#9781)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 15:14:22 +01:00
Hugo Pereira Brito 5784592437 chore(azure): add vault parallelization in keyvault service (#9876) 2026-01-26 13:39:54 +01:00
Víctor Fernández Poyatos 35f263dea6 fix(scans): scheduled scans duplicates (#9829) 2026-01-26 13:20:48 +01:00
Josema Camacho a1637ec46b fix(attack-paths): clear Neo4j database cache after scan and queries (#9877) 2026-01-23 16:06:10 +01:00
Rubén De la Torre Vico 6c6a6c55cf chore(azure): enhance metadata for policy service (#9625)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-23 14:40:09 +01:00
Rubén De la Torre Vico 31b53f091b chore(azure): enhance metadata for iam service (#9620)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-23 14:22:07 +01:00
Rubén De la Torre Vico f7a16fff99 chore(azure): enhance metadata for databricks service (#9617)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-23 13:47:45 +01:00
Josema Camacho cb5c9ea1c5 fix(attack-paths): improve findings ingestion cypher query (#9874) 2026-01-23 13:28:38 +01:00
Josema Camacho cb367da97d fix(attack-paths): Start Neo4j at startup for API only (#9872)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-23 10:52:22 +01:00
Adrián Peña be2a58dc82 refactor(api): lazy load providers and compliance (#9857) 2026-01-23 10:14:35 +01:00
Pepe Fagoaga 29133f2d7e fix(neo4j): lazy load driver (#9868)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-01-23 06:36:47 +01:00
Pepe Fagoaga babf18ffea fix(attack-paths): Use Findings.all_objects to avoid the custom manager (#9869) 2026-01-23 06:17:57 +01:00
Rubén De la Torre Vico b6a34d2220 chore(azure): enhance metadata for cosmosdb service (#9616)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-22 19:53:15 +01:00
Rubén De la Torre Vico 77dc79df32 chore(azure): enhance metadata for containerregistry service (#9615)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-22 19:28:31 +01:00
Pepe Fagoaga 91e3c01f51 fix(attack-paths): load findings in batches into Neo4j (#9862)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-01-22 18:17:50 +01:00
Andoni Alonso 6cb0edf3e1 feat(aws/codebuild): add check for CodeBreach webhook filter vulnerability (#9840)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-01-22 15:12:24 +01:00
Josema Camacho 7dfafb9337 fix(attack-paths): read findings using replica DB and add more logs (#9861) 2026-01-22 14:51:22 +01:00
Pepe Fagoaga dce05295ef chore(skills): Improve Django and DRF skills (#9831)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-01-22 13:54:06 +01:00
Josema Camacho 03d4c19ed5 fix: remove None databases name for removing provider Neo4j databases (#9858) 2026-01-22 13:45:35 +01:00
lydiavilchez 963ece9a0b feat(gcp): add check to detect persistent disks on suspended VM instances (#9747)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-22 13:38:30 +01:00
Rubén De la Torre Vico a32eff6946 chore(azure): enhance metadata for appinsights service (#9614)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-22 13:26:42 +01:00
Rubén De la Torre Vico 3bb326133a chore(azure): enhance metadata for app service (#9613)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-22 13:07:24 +01:00
Josema Camacho 799826758e fix: improve API startup process manage.py detection (#9856) 2026-01-22 12:34:18 +01:00
Prowler Bot 1208005a94 chore(api): Bump version to v1.19.0 (#9853)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-01-22 11:33:24 +01:00
Prowler Bot ecdece9f1e chore(release): Bump version to v5.18.0 (#9850)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-01-22 11:32:56 +01:00
Prowler Bot 9c2c555628 docs: Update version to v5.17.0 (#9852)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-01-22 11:32:03 +01:00
Hugo Pereira Brito ca2f3ccc1c fix(skills): avoid sdk test __init__ file creation (#9845) 2026-01-21 15:31:57 +01:00
César Arroba 9ffa0043ab chore: add release version to changelogs (#9846) 2026-01-21 15:19:31 +01:00
lydiavilchez e76ecfdd4d feat(gcp): add check for OS Login 2FA enabled at project level (#9839) 2026-01-21 15:12:01 +01:00
Pepe Fagoaga f11f71bc42 chore(changelog): make all consistent and product-focused (#9808) 2026-01-21 13:36:36 +01:00
Alan Buscaglia 607cfd61ef perf(ui): optimize CI cache for pnpm and Next.js builds (#9843) 2026-01-21 13:18:31 +01:00
Josema Camacho 9c76dafaa4 chore(attack-paths): adding stability to Neo4j driver and session (#9842) 2026-01-21 12:44:31 +01:00
lydiavilchez 7b839d9f9e feat(gcp): add check to enforce On Host Maintenance set to MIGRATE (#9834) 2026-01-21 09:37:21 +01:00
Pepe Fagoaga f39a82fdf4 docs(security): restructure security page into dedicated sections (#9836) 2026-01-20 15:27:29 +01:00
Josema Camacho d1a7eed5fa chore(security): update filelock dep to solve vulnerability 82754 (#9816) 2026-01-20 13:26:59 +01:00
César Arroba 5be4ec511f fix(api): handle Neo4j unavailability during app initialization (#9827)
Co-authored-by: Josema Camacho <josema@prowler.com>
2026-01-20 12:22:41 +01:00
dependabot[bot] a0166aede7 build(deps): bump django-allauth from 65.11.0 to 65.13.0 in /api (#9575)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-01-20 11:54:21 +01:00
Alan Buscaglia 1a2a2ea3cc fix(ui): make attack paths graph edges theme-aware (#9821) 2026-01-19 18:04:23 +01:00
Rubén De la Torre Vico e61d1401b9 chore(azure): enhance metadata for apim service (#9612)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-19 17:42:09 +01:00
Rubén De la Torre Vico a2789b7fc6 chore(azure): enhance metadata for aks service (#9611)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-19 17:25:10 +01:00
Rubén De la Torre Vico 34217492d0 chore(azure): enhance metadata for aisearch service (#9087)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-19 16:57:22 +01:00
dependabot[bot] ed50ed1e6d build(deps): bump pyasn1 from 0.6.1 to 0.6.2 (#9817)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-19 16:55:04 +01:00
Pepe Fagoaga 186977f81c docs: new support page (#9824) 2026-01-19 15:55:27 +01:00
Pepe Fagoaga c33f20ad72 chore: lint AWS IAM simulator (#9825) 2026-01-19 15:03:21 +01:00
dependabot[bot] d0b0c66ef0 build(deps): bump pyasn1 from 0.6.1 to 0.6.2 in /api (#9818)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-19 15:03:08 +01:00
Pepe Fagoaga e849959fd5 chore(changelog): run check for root dependency files (#9823) 2026-01-19 15:02:46 +01:00
bota4go 7c090a6a07 fix(aws): simulator code path (#9822)
Co-authored-by: Your Name <you@example.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-19 13:34:23 +01:00
Alejandro Bailo bc4484f269 feat(ui): add resource group label formatter to resources view (#9820) 2026-01-19 11:13:48 +01:00
bota4go 7601142e42 feat(aws-simulator): IAM policy simulator (#9252) 2026-01-19 09:40:16 +01:00
Alejandro Bailo f47310bceb feat(ui): add resource groups filter to findings view (#9812) 2026-01-16 13:58:36 +01:00
Josema Camacho 032499c29a feat(attack-paths): The complete Attack Paths feature (#9805)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: Chandrapal Badshah <Chan9390@users.noreply.github.com>
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Adrián Peña <adrianjpr@gmail.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: KonstGolfi <73020281+KonstGolfi@users.noreply.github.com>
Co-authored-by: lydiavilchez <114735608+lydiavilchez@users.noreply.github.com>
Co-authored-by: Prowler Bot <bot@prowler.com>
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
Co-authored-by: StylusFrost <43682773+StylusFrost@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: bota4go <108249054+bota4go@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: mchennai <50082780+mchennai@users.noreply.github.com>
Co-authored-by: Ryan Nolette <sonofagl1tch@users.noreply.github.com>
Co-authored-by: Ulissis Correa <123517149+ulissisc@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Lee Trout <ltrout@watchpointlabs.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: Alan-TheGentleman <alan@thegentleman.dev>
2026-01-16 13:37:09 +01:00
Pepe Fagoaga d7af97b30a chore(skills): add Prowler Changelog skill (#9806) 2026-01-16 13:31:34 +01:00
Hugo Pereira Brito aa24034ca7 feat(cloudflare): Add bot protection and configuration checks for zones (#9425)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-16 12:06:52 +01:00
Alejandro Bailo ec4eb70539 refactor(ui): improve layouts and styles (#9807) 2026-01-16 12:00:01 +01:00
RoseSecurity 76a8610121 fix(pre-commit): update isort repo URL to pycqa (#9785) 2026-01-15 18:33:25 +01:00
Alejandro Bailo d5e2c930a9 feat(ui): add Resources Inventory feature (#9492)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-01-15 16:25:12 +01:00
Josema Camacho 2c4f866e42 feat(attack-paths): update slack-sdk for cartography compatibility (#9801) 2026-01-15 14:30:33 +01:00
Rubén De la Torre Vico 31845df1a7 refactor(ui): change Lighthouse AI MCP tool filtering from blacklist to whitelist (#9802) 2026-01-15 13:53:05 +01:00
Adrián Peña d8c1273a57 feat(api): add resource group overview endpoint and filtering (#9694)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2026-01-15 13:05:25 +01:00
Rubén De la Torre Vico 3317c0a5e0 chore(aws): enhance metadata for ec2 service (#9549)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-15 13:01:21 +01:00
Josema Camacho 847645543a feat(attack-paths): update boto dependencies for catrography compatibility (#9798)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-01-15 13:00:54 +01:00
Alejandro Bailo 76aa65cb61 chore(ui): CHANGELOG.md updated (#9800) 2026-01-15 12:55:13 +01:00
Alejandro Bailo 484a1d1fef chore: upgrade Node.js to 24.13.0 LTS (#9797) 2026-01-15 12:46:42 +01:00
Alejandro Bailo c8bc0576ea feat: implement compliance watchlist (#9786) 2026-01-15 12:37:16 +01:00
Alejandro Bailo 76cda6d777 feat(ui): new findings view (#9794) 2026-01-15 12:15:06 +01:00
Andoni Alonso 28978f6db6 fix(oci): pass provider UID to update credentials forms (#9746) 2026-01-15 11:29:23 +01:00
Hugo Pereira Brito d4bc6d7531 feat(cloudflare): Add TLS/SSL, records and email security checks for zones (#9424)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-15 09:31:27 +01:00
Hugo Pereira Brito 1bf49747ad chore(entra): enhance performance for user_registration_details and user mfa evaluation (#9236) 2026-01-14 14:01:51 +01:00
lydiavilchez 2cde4c939d feat(gcp): add compute_snapshot_not_outdated check (#9774) 2026-01-14 12:35:29 +01:00
Hugo Pereira Brito 9844379d30 chore(cloudflare): rename zones service to zone (#9792) 2026-01-14 11:00:51 +01:00
Pedro Martín 211b1b67f9 feat(ui): improve threatscore visualization per pillar (#9773)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2026-01-14 09:05:54 +01:00
Rubén De la Torre Vico 864b2099c3 chore(aws): enhance metadata for cognito service (#8853)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-13 14:01:37 +01:00
Andoni Alonso 270266c906 fix(skills): formatting file (#9783) 2026-01-13 12:38:32 +01:00
Alan Buscaglia c8fab497fd feat(skills): sync AGENTS.md to AI-specific formats (#9751)
Co-authored-by: Alan-TheGentleman <alan@thegentleman.dev>
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-01-13 11:44:44 +01:00
Hugo Pereira Brito b0eea61468 feat(cloudflare): Add Cloudflare provider with zones service and critical security checks (#9423) 2026-01-13 11:09:54 +01:00
Rubén De la Torre Vico 463fc32fca chore(aws): enhance metadata for iam service (#9550)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-13 11:06:09 +01:00
Pedro Martín 17f5633a8d feat(compliance): add CIS 1.12 for Kubernetes (#9778) 2026-01-13 10:16:28 +01:00
Pedro Martín 48274f1d54 feat(compliance): add CIS 6.0 for M365 (#9779) 2026-01-13 10:07:12 +01:00
Pedro Martín 9719f9ee86 feat(compliance): add CIS 5.0 for Azure (#9777) 2026-01-13 09:39:24 +01:00
Alejandro Bailo d38be934a3 feat(ui): add new findings table (#9699) 2026-01-12 15:44:25 +01:00
Rubén De la Torre Vico 0472eb74d2 chore(aws): enhance metadata for bedrock service (#8827)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-12 14:26:37 +01:00
Rubén De la Torre Vico e5b86da6e5 chore(aws): enhance metadata for rds service (#9551)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-12 13:52:29 +01:00
Lee Trout 429c591819 chore(aws): fixup AWS EC2 SG lib (#9216)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-01-12 13:47:37 +01:00
Prowler Bot 87c0747174 feat(aws): Update regions for AWS services (#9771)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-01-12 13:00:39 +01:00
lydiavilchez 62a8540169 feat(gcp): add check to detect Compute Engine configuration changes (#9698)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-01-12 12:22:15 +01:00
Pepe Fagoaga 9ee77c2b97 chore(security): Remove safety check ignores as they are fixed (#9752) 2026-01-12 12:02:22 +01:00
Víctor Fernández Poyatos 5f2cb614ad feat(overviews): Compliance watchlist endpoint (#9596)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-01-12 11:40:36 +01:00
Chandrapal Badshah 6c01151d78 docs(lighthouse): update lighthouse architecture docs (#9576)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-12 10:18:58 +01:00
mchennai 05466cff22 test: Add edge case test for s3_bucket_server_access_logging_enabled (#9725)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-12 10:06:34 +01:00
Rubén De la Torre Vico a57b6d78bf docs: add audit scope column to supported providers table (#9750) 2026-01-12 09:19:29 +01:00
Adrián Peña d3eb30c066 chore: update API PR template (#9749) 2026-01-09 15:13:48 +01:00
Alan Buscaglia 7f2fa275c6 feat: add AI skills pack for Claude Code and OpenCode (#9728)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-09 15:01:18 +01:00
Pepe Fagoaga 42ae5b6e3e chore(template): PR Community Checklist (#9748) 2026-01-09 14:42:07 +01:00
Pepe Fagoaga 7c1bcfc781 fix: typo in subscription error (#9745)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-01-09 11:32:10 +01:00
dependabot[bot] 68684b107a build(deps-dev): bump authlib from 1.6.5 to 1.6.6 in /api (#9742)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:25:25 +01:00
dependabot[bot] d04716ea95 build(deps): bump werkzeug from 3.1.4 to 3.1.5 in /api (#9743)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:23:58 +01:00
dependabot[bot] 8d8b7aad15 build(deps): bump werkzeug from 3.1.4 to 3.1.5 (#9744)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:22:37 +01:00
Pepe Fagoaga f3ba70dd6b docs: add warning about changes not complying with roadmap (#9741) 2026-01-08 17:03:38 +01:00
Andoni Alonso 27492cbd42 fix(oci): validate credentials before scanning (#9738) 2026-01-08 15:47:26 +01:00
dependabot[bot] 795220e290 build(deps): bump werkzeug from 3.1.3 to 3.1.4 (#9399)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 15:41:48 +01:00
dependabot[bot] 64ab8e64b0 build(deps): bump urllib3 from 1.26.20 to 2.6.3 (#9734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 15:41:39 +01:00
dependabot[bot] a0f9df07bd build(deps): bump pynacl from 1.5.0 to 1.6.2 (#9726)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 15:40:55 +01:00
dependabot[bot] 3d16c62f30 build(deps): bump fastmcp from 2.13.1 to 2.14.0 in /mcp_server (#9696)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 15:04:53 +01:00
dependabot[bot] fa2deef241 build(deps): bump aiohttp from 3.12.15 to 3.13.3 in /api (#9723)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 14:12:54 +01:00
dependabot[bot] 211639d849 build(deps-dev): bump marshmallow from 3.26.1 to 3.26.2 in /api (#9651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:52:58 +01:00
dependabot[bot] 25c90f9f63 build(deps): bump urllib3 from 2.5.0 to 2.6.3 in /api (#9735)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:45:58 +01:00
dependabot[bot] bbdb230bb2 build(deps): bump filelock from 3.12.4 to 3.20.1 in /api (#9594)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:45:14 +01:00
dependabot[bot] 6e2ba66a5a build(deps): bump pynacl from 1.5.0 to 1.6.2 in /api (#9739)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:44:13 +01:00
dependabot[bot] 3332e5b891 build(deps): bump aiohttp from 3.12.14 to 3.13.3 (#9722)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:38:35 +01:00
dependabot[bot] 44d791dfe9 build(deps-dev): bump marshmallow from 3.26.1 to 3.26.2 (#9652)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:37:20 +01:00
dependabot[bot] 73375ee289 build(deps): bump tj-actions/changed-files from 47.0.0 to 47.0.1 (#9711)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 13:30:41 +01:00
Rubén De la Torre Vico 503b56188b chore(aws): enhance metadata for datasync service (#8854)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-08 13:22:59 +01:00
dependabot[bot] 7c9dd8fe89 build(deps): bump peter-evans/create-pull-request from 7.0.8 to 8.0.0 (#9705)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:19:35 +01:00
dependabot[bot] f407a24022 build(deps): bump actions/upload-artifact from 4.6.2 to 6.0.0 (#9712)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:16:15 +01:00
dependabot[bot] 8f5c43744f build(deps): bump softprops/action-gh-release from 2.4.1 to 2.5.0 (#9389)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:15:24 +01:00
Rubén De la Torre Vico 8d78831d29 chore(aws): enhance metadata for s3 service (#9552)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-08 13:13:32 +01:00
dependabot[bot] 858446c740 build(deps): bump actions/setup-node from 6.0.0 to 6.1.0 (#9707)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:00:44 +01:00
dependabot[bot] e9ca8bfda6 build(deps): bump trufflesecurity/trufflehog from 3.91.1 to 3.92.4 (#9710)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 12:56:15 +01:00
dependabot[bot] 5cd446c446 build(deps): bump codecov/codecov-action from 5.5.1 to 5.5.2 (#9708)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:56:04 +01:00
dependabot[bot] 319f5b6c38 build(deps): bump actions/cache from 4.3.0 to 5.0.1 (#9706)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:54:40 +01:00
dependabot[bot] 64c9dd4947 build(deps): bump docker/login-action from 3.4.0 to 3.6.0 (#9396)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:54:03 +01:00
dependabot[bot] 8b2dea52fa build(deps): bump docker/setup-buildx-action from 3.11.1 to 3.12.0 (#9709)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:52:42 +01:00
Andoni Alonso da567138fa docs(developer-guide): add missing compliance framework link (#9736) 2026-01-08 10:19:16 +01:00
Sergio Garcia 5b59986ae7 docs(azure): enhance Managed Identity authentication documentation (#9012)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-01-08 09:04:04 +01:00
1450 changed files with 161190 additions and 24388 deletions
+20 -1
@@ -48,6 +48,26 @@ POSTGRES_DB=prowler_db
# POSTGRES_REPLICA_MAX_ATTEMPTS=3
# POSTGRES_REPLICA_RETRY_BASE_DELAY=0.5
# Neo4j auth
NEO4J_HOST=neo4j
NEO4J_PORT=7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=neo4j_password
# Neo4j settings
NEO4J_DBMS_MAX__DATABASES=1000
NEO4J_SERVER_MEMORY_PAGECACHE_SIZE=1G
NEO4J_SERVER_MEMORY_HEAP_INITIAL__SIZE=1G
NEO4J_SERVER_MEMORY_HEAP_MAX__SIZE=1G
NEO4J_APOC_EXPORT_FILE_ENABLED=true
NEO4J_APOC_IMPORT_FILE_ENABLED=true
NEO4J_APOC_IMPORT_FILE_USE_NEO4J_CONFIG=true
NEO4J_PLUGINS=["apoc"]
NEO4J_DBMS_SECURITY_PROCEDURES_ALLOWLIST=apoc.*
NEO4J_DBMS_SECURITY_PROCEDURES_UNRESTRICTED=apoc.*
NEO4J_DBMS_CONNECTOR_BOLT_LISTEN_ADDRESS=0.0.0.0:7687
# Neo4j Prowler settings
ATTACK_PATHS_BATCH_SIZE=1000
# Celery-Prowler task settings
TASK_RETRY_DELAY_SECONDS=0.1
TASK_RETRY_ATTEMPTS=5
@@ -117,7 +137,6 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
NEXT_PUBLIC_SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT}
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.16.0
@@ -29,7 +29,7 @@ runs:
run: |
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
-sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
+sed -i "s|\(git+https://github.com/prowler-cloud/prowler[^@]*\)@master|\1@$BRANCH_NAME|g" pyproject.toml
- name: Install poetry
shell: bash
+20 -2
@@ -46,12 +46,22 @@ provider/oci:
- changed-files:
- any-glob-to-any-file: "prowler/providers/oraclecloud/**"
- any-glob-to-any-file: "tests/providers/oraclecloud/**"
provider/alibabacloud:
- changed-files:
- any-glob-to-any-file: "prowler/providers/alibabacloud/**"
- any-glob-to-any-file: "tests/providers/alibabacloud/**"
provider/cloudflare:
- changed-files:
- any-glob-to-any-file: "prowler/providers/cloudflare/**"
- any-glob-to-any-file: "tests/providers/cloudflare/**"
provider/openstack:
- changed-files:
- any-glob-to-any-file: "prowler/providers/openstack/**"
- any-glob-to-any-file: "tests/providers/openstack/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
@@ -67,15 +77,23 @@ mutelist:
- any-glob-to-any-file: "prowler/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/m365/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/mongodbatlas/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/oraclecloud/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/alibabacloud/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/cloudflare/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/openstack/lib/mutelist/**"
- any-glob-to-any-file: "tests/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/m365/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/mongodbatlas/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/oci/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/oraclecloud/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/alibabacloud/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/cloudflare/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/openstack/lib/mutelist/**"
integration/s3:
- changed-files:
+19 -2
@@ -14,14 +14,26 @@ Please add a detailed description of how to review this PR.
### Checklist
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
<details>
<summary><b>Community Checklist</b></summary>
- [ ] This feature/issue is listed [here](https://github.com/prowler-cloud/prowler/issues?q=sort%3Aupdated-desc+is%3Aissue+is%3Aopen) or at roadmap.prowler.com
- [ ] It is assigned to me; if not, request it via the issue/feature [here](https://github.com/prowler-cloud/prowler/issues?q=sort%3Aupdated-desc+is%3Aissue+is%3Aopen) or in the [Prowler Community Slack](goto.prowler.com/slack)
</details>
- [ ] Review if the code is being covered by tests.
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if changes to the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md) are needed
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
#### SDK/CLI
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
#### UI
- [ ] All issue/task requirements work as expected on the UI
- [ ] Screenshots/Video of the functionality flow (if applicable) - Mobile (X < 640px)
@@ -30,6 +42,11 @@ Please add a detailed description of how to review this PR.
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/ui/CHANGELOG.md), if applicable.
#### API
- [ ] All issue/task requirements work as expected on the API
- [ ] Endpoint response output (if applicable)
- [ ] EXPLAIN ANALYZE output for new/modified queries or indexes (if applicable)
- [ ] Performance test results (if applicable)
- [ ] Any other relevant evidence of the implementation (if applicable)
- [ ] Verify if API specs need to be regenerated.
- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.
+257
@@ -0,0 +1,257 @@
#!/usr/bin/env python3
"""
Test Impact Analysis Script
Analyzes changed files and determines which tests need to run.
Outputs GitHub Actions compatible outputs.
Usage:
python test-impact.py <changed_files...>
python test-impact.py --from-stdin # Read files from stdin (one per line)
Outputs (for GitHub Actions):
- run-all: "true" if critical paths changed
- sdk-tests: Space-separated list of SDK test paths
- api-tests: Space-separated list of API test paths
- ui-e2e: Space-separated list of UI E2E test paths
- modules: Comma-separated list of affected module names
"""
import fnmatch
import os
import sys
from pathlib import Path
import yaml
def load_config() -> dict:
"""Load test-impact.yml configuration."""
config_path = Path(__file__).parent.parent / "test-impact.yml"
with open(config_path) as f:
return yaml.safe_load(f)
def matches_pattern(file_path: str, pattern: str) -> bool:
"""Check if file path matches a glob pattern."""
# Normalize paths
file_path = file_path.strip("/")
pattern = pattern.strip("/")
# Handle ** patterns
if "**" in pattern:
# Convert glob pattern to work with fnmatch
# e.g., "prowler/lib/**" matches "prowler/lib/check/foo.py"
base = pattern.replace("/**", "")
if file_path.startswith(base):
return True
# Also try standard fnmatch
return fnmatch.fnmatch(file_path, pattern)
return fnmatch.fnmatch(file_path, pattern)
def filter_ignored_files(
changed_files: list[str], ignored_paths: list[str]
) -> list[str]:
"""Filter out files that match ignored patterns."""
filtered = []
for file_path in changed_files:
is_ignored = False
for pattern in ignored_paths:
if matches_pattern(file_path, pattern):
print(f" [IGNORED] {file_path} matches {pattern}", file=sys.stderr)
is_ignored = True
break
if not is_ignored:
filtered.append(file_path)
return filtered
def check_critical_paths(changed_files: list[str], critical_paths: list[str]) -> bool:
"""Check if any changed file matches critical paths."""
for file_path in changed_files:
for pattern in critical_paths:
if matches_pattern(file_path, pattern):
print(f" [CRITICAL] {file_path} matches {pattern}", file=sys.stderr)
return True
return False
def find_affected_modules(
changed_files: list[str], modules: list[dict]
) -> dict[str, dict]:
"""Find which modules are affected by changed files."""
affected = {}
for file_path in changed_files:
for module in modules:
module_name = module["name"]
match_patterns = module.get("match", [])
for pattern in match_patterns:
if matches_pattern(file_path, pattern):
if module_name not in affected:
affected[module_name] = {
"tests": set(),
"e2e": set(),
"matched_files": [],
}
affected[module_name]["matched_files"].append(file_path)
# Add test patterns
for test_pattern in module.get("tests", []):
affected[module_name]["tests"].add(test_pattern)
# Add E2E patterns
for e2e_pattern in module.get("e2e", []):
affected[module_name]["e2e"].add(e2e_pattern)
break # File matched this module, move to next file
return affected
def categorize_tests(
affected_modules: dict[str, dict],
) -> tuple[set[str], set[str], set[str]]:
"""Categorize tests into SDK, API, and UI E2E."""
sdk_tests = set()
api_tests = set()
ui_e2e = set()
for module_name, data in affected_modules.items():
for test_path in data["tests"]:
if test_path.startswith("tests/"):
sdk_tests.add(test_path)
elif test_path.startswith("api/"):
api_tests.add(test_path)
for e2e_path in data["e2e"]:
ui_e2e.add(e2e_path)
return sdk_tests, api_tests, ui_e2e
def set_github_output(name: str, value: str):
"""Set GitHub Actions output."""
github_output = os.environ.get("GITHUB_OUTPUT")
if github_output:
with open(github_output, "a") as f:
# Handle multiline values
if "\n" in value:
import uuid
delimiter = uuid.uuid4().hex
f.write(f"{name}<<{delimiter}\n{value}\n{delimiter}\n")
else:
f.write(f"{name}={value}\n")
# Print for debugging (without deprecated format)
print(f" {name}={value}", file=sys.stderr)
def main():
# Parse arguments
if "--from-stdin" in sys.argv:
changed_files = [line.strip() for line in sys.stdin if line.strip()]
else:
changed_files = [f for f in sys.argv[1:] if f and not f.startswith("-")]
if not changed_files:
print("No changed files provided", file=sys.stderr)
set_github_output("run-all", "false")
set_github_output("sdk-tests", "")
set_github_output("api-tests", "")
set_github_output("ui-e2e", "")
set_github_output("modules", "")
set_github_output("has-tests", "false")
return
print(f"Analyzing {len(changed_files)} changed files...", file=sys.stderr)
for f in changed_files[:10]: # Show first 10
print(f" - {f}", file=sys.stderr)
if len(changed_files) > 10:
print(f" ... and {len(changed_files) - 10} more", file=sys.stderr)
# Load configuration
config = load_config()
# Filter out ignored files (docs, configs, etc.)
ignored_paths = config.get("ignored", {}).get("paths", [])
changed_files = filter_ignored_files(changed_files, ignored_paths)
if not changed_files:
print("\nAll changed files are ignored (docs, configs, etc.)", file=sys.stderr)
print("No tests needed.", file=sys.stderr)
set_github_output("run-all", "false")
set_github_output("sdk-tests", "")
set_github_output("api-tests", "")
set_github_output("ui-e2e", "")
set_github_output("modules", "none-ignored")
set_github_output("has-tests", "false")
return
print(
f"\n{len(changed_files)} files remain after filtering ignored paths",
file=sys.stderr,
)
# Check critical paths
critical_paths = config.get("critical", {}).get("paths", [])
if check_critical_paths(changed_files, critical_paths):
print("\nCritical path changed - running ALL tests", file=sys.stderr)
set_github_output("run-all", "true")
set_github_output("sdk-tests", "tests/")
set_github_output("api-tests", "api/src/backend/")
set_github_output("ui-e2e", "ui/tests/")
set_github_output("modules", "all")
set_github_output("has-tests", "true")
return
# Find affected modules
modules = config.get("modules", [])
affected = find_affected_modules(changed_files, modules)
if not affected:
print("\nNo test-mapped modules affected", file=sys.stderr)
set_github_output("run-all", "false")
set_github_output("sdk-tests", "")
set_github_output("api-tests", "")
set_github_output("ui-e2e", "")
set_github_output("modules", "")
set_github_output("has-tests", "false")
return
# Report affected modules
print(f"\nAffected modules: {len(affected)}", file=sys.stderr)
for module_name, data in affected.items():
print(f" [{module_name}]", file=sys.stderr)
for f in data["matched_files"][:3]:
print(f" - {f}", file=sys.stderr)
if len(data["matched_files"]) > 3:
print(
f" ... and {len(data['matched_files']) - 3} more files",
file=sys.stderr,
)
# Categorize tests
sdk_tests, api_tests, ui_e2e = categorize_tests(affected)
# Output results
print("\nTest paths to run:", file=sys.stderr)
print(f" SDK: {sdk_tests or 'none'}", file=sys.stderr)
print(f" API: {api_tests or 'none'}", file=sys.stderr)
print(f" E2E: {ui_e2e or 'none'}", file=sys.stderr)
set_github_output("run-all", "false")
set_github_output("sdk-tests", " ".join(sorted(sdk_tests)))
set_github_output("api-tests", " ".join(sorted(api_tests)))
set_github_output("ui-e2e", " ".join(sorted(ui_e2e)))
set_github_output("modules", ",".join(sorted(affected.keys())))
set_github_output(
"has-tests", "true" if (sdk_tests or api_tests or ui_e2e) else "false"
)
if __name__ == "__main__":
main()
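As a quick sanity check of the matching logic in the script above, the glob handling can be exercised standalone. This is a minimal sketch that reproduces `matches_pattern` as committed; note two behaviors worth knowing when writing patterns for `test-impact.yml`: the `**` branch is a plain prefix check (so it is slightly broader than a strict directory glob), and `fnmatch`'s `*` crosses `/` boundaries.

```python
import fnmatch


def matches_pattern(file_path: str, pattern: str) -> bool:
    """Mirror of the script's glob matching: '**' patterns fall back to a prefix check."""
    file_path = file_path.strip("/")
    pattern = pattern.strip("/")
    if "**" in pattern:
        base = pattern.replace("/**", "")
        if file_path.startswith(base):
            return True
        return fnmatch.fnmatch(file_path, pattern)
    return fnmatch.fnmatch(file_path, pattern)


# '**' patterns match anything under the base path...
assert matches_pattern("prowler/lib/check/foo.py", "prowler/lib/**")
# ...but the prefix check also accepts sibling paths sharing the prefix:
assert matches_pattern("prowler/libextra/x.py", "prowler/lib/**")
# fnmatch's '*' matches across '/' too, so '*.md' already covers nested docs:
assert matches_pattern("docs/index.md", "*.md")
```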
+402
@@ -0,0 +1,402 @@
# Test Impact Analysis Configuration
# Defines which tests to run based on changed files
#
# Usage: Changes to paths in 'critical' always run all tests.
# Changes to paths in 'modules' run only the mapped tests.
# Changes to paths in 'ignored' don't trigger any tests.
# Ignored paths - changes here don't trigger any tests
# Documentation, configs, and other non-code files
ignored:
paths:
# Documentation
- docs/**
- "*.md"
- "**/*.md"
- mkdocs.yml
# Config files that don't affect runtime
- .gitignore
- .gitattributes
- .editorconfig
- .pre-commit-config.yaml
- .backportrc.json
- CODEOWNERS
- LICENSE
# IDE/Editor configs
- .vscode/**
- .idea/**
# Examples and contrib (not production code)
- examples/**
- contrib/**
# Skills (AI agent configs, not runtime)
- skills/**
# E2E setup helpers (not runnable tests)
- ui/tests/setups/**
# Permissions docs
- permissions/**
# Critical paths - changes here run ALL tests
# These are foundational/shared code that can affect anything
critical:
paths:
# SDK Core
- prowler/lib/**
- prowler/config/**
- prowler/exceptions/**
- prowler/providers/common/**
# API Core
- api/src/backend/api/models.py
- api/src/backend/config/**
- api/src/backend/conftest.py
# UI Core
- ui/lib/**
- ui/types/**
- ui/config/**
- ui/middleware.ts
# CI/CD changes
- .github/workflows/**
- .github/test-impact.yml
# Module mappings - path patterns to test patterns
modules:
# ============================================
# SDK - Providers (each provider is isolated)
# ============================================
- name: sdk-aws
match:
- prowler/providers/aws/**
- prowler/compliance/aws/**
tests:
- tests/providers/aws/**
e2e: []
- name: sdk-azure
match:
- prowler/providers/azure/**
- prowler/compliance/azure/**
tests:
- tests/providers/azure/**
e2e: []
- name: sdk-gcp
match:
- prowler/providers/gcp/**
- prowler/compliance/gcp/**
tests:
- tests/providers/gcp/**
e2e: []
- name: sdk-kubernetes
match:
- prowler/providers/kubernetes/**
- prowler/compliance/kubernetes/**
tests:
- tests/providers/kubernetes/**
e2e: []
- name: sdk-github
match:
- prowler/providers/github/**
- prowler/compliance/github/**
tests:
- tests/providers/github/**
e2e: []
- name: sdk-m365
match:
- prowler/providers/m365/**
- prowler/compliance/m365/**
tests:
- tests/providers/m365/**
e2e: []
- name: sdk-alibabacloud
match:
- prowler/providers/alibabacloud/**
- prowler/compliance/alibabacloud/**
tests:
- tests/providers/alibabacloud/**
e2e: []
- name: sdk-cloudflare
match:
- prowler/providers/cloudflare/**
- prowler/compliance/cloudflare/**
tests:
- tests/providers/cloudflare/**
e2e: []
- name: sdk-oraclecloud
match:
- prowler/providers/oraclecloud/**
- prowler/compliance/oraclecloud/**
tests:
- tests/providers/oraclecloud/**
e2e: []
- name: sdk-mongodbatlas
match:
- prowler/providers/mongodbatlas/**
- prowler/compliance/mongodbatlas/**
tests:
- tests/providers/mongodbatlas/**
e2e: []
- name: sdk-nhn
match:
- prowler/providers/nhn/**
- prowler/compliance/nhn/**
tests:
- tests/providers/nhn/**
e2e: []
- name: sdk-iac
match:
- prowler/providers/iac/**
- prowler/compliance/iac/**
tests:
- tests/providers/iac/**
e2e: []
- name: sdk-llm
match:
- prowler/providers/llm/**
- prowler/compliance/llm/**
tests:
- tests/providers/llm/**
e2e: []
# ============================================
# SDK - Lib modules
# ============================================
- name: sdk-lib-check
match:
- prowler/lib/check/**
tests:
- tests/lib/check/**
e2e: []
- name: sdk-lib-outputs
match:
- prowler/lib/outputs/**
tests:
- tests/lib/outputs/**
e2e: []
- name: sdk-lib-scan
match:
- prowler/lib/scan/**
tests:
- tests/lib/scan/**
e2e: []
- name: sdk-lib-cli
match:
- prowler/lib/cli/**
tests:
- tests/lib/cli/**
e2e: []
- name: sdk-lib-mutelist
match:
- prowler/lib/mutelist/**
tests:
- tests/lib/mutelist/**
e2e: []
# ============================================
# API - Views, Serializers, Tasks
# ============================================
- name: api-views
match:
- api/src/backend/api/v1/views.py
tests:
- api/src/backend/api/tests/test_views.py
e2e:
# API view changes can break UI
- ui/tests/**
- name: api-serializers
match:
- api/src/backend/api/v1/serializers.py
- api/src/backend/api/v1/serializer_utils/**
tests:
- api/src/backend/api/tests/**
e2e:
# Serializer changes affect API responses → UI
- ui/tests/**
- name: api-filters
match:
- api/src/backend/api/filters.py
tests:
- api/src/backend/api/tests/**
e2e: []
- name: api-rbac
match:
- api/src/backend/api/rbac/**
tests:
- api/src/backend/api/tests/**
e2e:
- ui/tests/roles/**
- name: api-tasks
match:
- api/src/backend/tasks/**
tests:
- api/src/backend/tasks/tests/**
e2e: []
- name: api-attack-paths
match:
- api/src/backend/api/attack_paths/**
tests:
- api/src/backend/api/tests/test_attack_paths.py
e2e: []
# ============================================
# UI - Components and Features
# ============================================
- name: ui-providers
match:
- ui/components/providers/**
- ui/actions/providers/**
- ui/app/**/providers/**
tests: []
e2e:
- ui/tests/providers/**
- name: ui-findings
match:
- ui/components/findings/**
- ui/actions/findings/**
- ui/app/**/findings/**
tests: []
e2e:
- ui/tests/findings/**
- name: ui-scans
match:
- ui/components/scans/**
- ui/actions/scans/**
- ui/app/**/scans/**
tests: []
e2e:
- ui/tests/scans/**
- name: ui-compliance
match:
- ui/components/compliance/**
- ui/actions/compliances/**
- ui/app/**/compliance/**
tests: []
e2e:
- ui/tests/compliance/**
- name: ui-auth
match:
- ui/components/auth/**
- ui/actions/auth/**
- ui/app/(auth)/**
tests: []
e2e:
- ui/tests/sign-in/**
- ui/tests/sign-up/**
- name: ui-invitations
match:
- ui/components/invitations/**
- ui/actions/invitations/**
- ui/app/**/invitations/**
tests: []
e2e:
- ui/tests/invitations/**
- name: ui-roles
match:
- ui/components/roles/**
- ui/actions/roles/**
- ui/app/**/roles/**
tests: []
e2e:
- ui/tests/roles/**
- name: ui-users
match:
- ui/components/users/**
- ui/actions/users/**
- ui/app/**/users/**
tests: []
e2e:
- ui/tests/users/**
- name: ui-integrations
match:
- ui/components/integrations/**
- ui/actions/integrations/**
- ui/app/**/integrations/**
tests: []
e2e:
- ui/tests/integrations/**
- name: ui-resources
match:
- ui/components/resources/**
- ui/actions/resources/**
- ui/app/**/resources/**
tests: []
e2e:
- ui/tests/resources/**
- name: ui-profile
match:
- ui/app/**/profile/**
tests: []
e2e:
- ui/tests/profile/**
- name: ui-lighthouse
match:
- ui/components/lighthouse/**
- ui/actions/lighthouse/**
- ui/app/**/lighthouse/**
- ui/lib/lighthouse/**
tests: []
e2e:
- ui/tests/lighthouse/**
- name: ui-overview
match:
- ui/components/overview/**
- ui/actions/overview/**
tests: []
e2e:
- ui/tests/home/**
- name: ui-shadcn
match:
- ui/components/shadcn/**
- ui/components/ui/**
tests: []
e2e:
# Shared components can affect any E2E
- ui/tests/**
- name: ui-attack-paths
match:
- ui/components/attack-paths/**
- ui/actions/attack-paths/**
- ui/app/**/attack-paths/**
tests: []
e2e:
- ui/tests/attack-paths/**
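The mapping above pairs source-path globs (`match`) with the unit-test and E2E globs they should trigger. As a rough illustration of how such a config could be resolved against a changed-file list — `resolve_suites` is a hypothetical helper, not part of the repository, and `fnmatch` only approximates `**` semantics — a sketch might look like:

```python
from fnmatch import fnmatch

def resolve_suites(mapping, changed_files):
    """Collect the tests/e2e globs of every entry whose 'match' patterns
    cover at least one changed file. Illustrative only; fnmatch treats
    '*' (and hence '**') as matching across path separators."""
    tests, e2e = set(), set()
    for entry in mapping:
        if any(fnmatch(f, pat) for f in changed_files for pat in entry["match"]):
            tests.update(entry.get("tests", []))
            e2e.update(entry.get("e2e", []))
    return sorted(tests), sorted(e2e)

# One entry transcribed from the config above.
mapping = [
    {"name": "ui-roles",
     "match": ["ui/components/roles/**", "ui/actions/roles/**"],
     "tests": [],
     "e2e": ["ui/tests/roles/**"]},
]
print(resolve_suites(mapping, ["ui/components/roles/table.tsx"]))
```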
+3 -3
@@ -110,7 +110,7 @@ jobs:
git --no-pager diff
- name: Create PR for next API minor version to master
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -164,7 +164,7 @@ jobs:
git --no-pager diff
- name: Create PR for first API patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -235,7 +235,7 @@ jobs:
git --no-pager diff
- name: Create PR for next API patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+2 -1
@@ -37,7 +37,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -46,6 +46,7 @@ jobs:
api/docs/**
api/README.md
api/CHANGELOG.md
api/AGENTS.md
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -102,7 +102,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push API container for ${{ matrix.arch }}
id: container-push
@@ -125,13 +125,13 @@ jobs:
steps:
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+4 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: api/Dockerfile
@@ -67,17 +67,18 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: api/**
files_ignore: |
api/docs/**
api/README.md
api/CHANGELOG.md
api/AGENTS.md
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
+11 -10
@@ -1,14 +1,14 @@
name: 'API: Security'
name: "API: Security"
on:
push:
branches:
- 'master'
- 'v5.*'
- "master"
- "v5.*"
pull_request:
branches:
- 'master'
- 'v5.*'
- "master"
- "v5.*"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -26,7 +26,7 @@ jobs:
strategy:
matrix:
python-version:
- '3.12'
- "3.12"
defaults:
run:
working-directory: ./api
@@ -37,7 +37,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -46,6 +46,7 @@ jobs:
api/docs/**
api/README.md
api/CHANGELOG.md
api/AGENTS.md
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -60,9 +61,9 @@ jobs:
- name: Safety
if: steps.check-changes.outputs.any_changed == 'true'
# 76352, 76353, 77323 come from SDK, but they cannot upgrade it yet. It does not affect API
# TODO: Botocore needs urllib3 1.X so we need to ignore these vulnerabilities 77744,77745. Remove this once we upgrade to urllib3 2.X
run: poetry run safety check --ignore 70612,66963,74429,76352,76353,77323,77744,77745
run: poetry run safety check --ignore 79023,79027,84420
# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
# TODO: 84420 from `azure-core`, that we need fix alltogether with `azure-cli-core` and `knack`
- name: Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+3 -2
@@ -77,7 +77,7 @@ jobs:
- name: Check for API changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -86,6 +86,7 @@ jobs:
api/docs/**
api/README.md
api/CHANGELOG.md
api/AGENTS.md
- name: Setup Python with Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -100,7 +101,7 @@ jobs:
- name: Upload coverage reports to Codecov
if: steps.check-changes.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
+3 -3
@@ -106,7 +106,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to master
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -161,7 +161,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -225,7 +225,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+1 -1
@@ -28,6 +28,6 @@ jobs:
fetch-depth: 0
- name: Scan for secrets with TruffleHog
uses: trufflesecurity/trufflehog@aade3bff5594fe8808578dd4db3dfeae9bf2abdc # v3.91.1
uses: trufflesecurity/trufflehog@ef6e76c3c4023279497fab4721ffa071a722fd05 # v3.92.4
with:
extra_args: '--results=verified,unknown'
+2 -4
@@ -51,18 +51,16 @@ jobs:
"amitsharm"
"andoniaf"
"cesararroba"
"Chan9390"
"danibarranqueroo"
"HugoPBrito"
"jfagoagas"
"josemazo"
"josema-xyz"
"lydiavilchez"
"mmuller88"
"MrCloudSec"
# "MrCloudSec"
"pedrooot"
"prowler-bot"
"puchy22"
"rakan-pro"
"RosaRivasProwler"
"StylusFrost"
"toniblyx"
@@ -100,7 +100,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push MCP container for ${{ matrix.arch }}
id: container-push
@@ -131,13 +131,13 @@ jobs:
steps:
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+3 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: mcp_server/Dockerfile
@@ -66,7 +66,7 @@ jobs:
- name: Check for MCP changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: mcp_server/**
files_ignore: |
@@ -75,7 +75,7 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build MCP container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
+20 -2
@@ -35,21 +35,23 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
ui/**
prowler/**
mcp_server/**
poetry.lock
pyproject.toml
- name: Check for folder changes and changelog presence
id: check-folders
run: |
missing_changelogs=""
# Check api folder
if [[ "${{ steps.changed-files.outputs.any_changed }}" == "true" ]]; then
# Check monitored folders
for folder in $MONITORED_FOLDERS; do
# Get files changed in this folder
changed_in_folder=$(echo "${{ steps.changed-files.outputs.all_changed_files }}" | tr ' ' '\n' | grep "^${folder}/" || true)
@@ -64,6 +66,22 @@ jobs:
fi
fi
done
# Check root-level dependency files (poetry.lock, pyproject.toml)
# These are associated with the prowler folder changelog
root_deps_changed=$(echo "${{ steps.changed-files.outputs.all_changed_files }}" | tr ' ' '\n' | grep -E "^(poetry\.lock|pyproject\.toml)$" || true)
if [ -n "$root_deps_changed" ]; then
echo "Detected changes in root dependency files: $root_deps_changed"
# Check if prowler/CHANGELOG.md was already updated (might have been caught above)
prowler_changelog_updated=$(echo "${{ steps.changed-files.outputs.all_changed_files }}" | tr ' ' '\n' | grep "^prowler/CHANGELOG.md$" || true)
if [ -z "$prowler_changelog_updated" ]; then
# Only add if prowler wasn't already flagged
if ! echo "$missing_changelogs" | grep -q "prowler"; then
echo "No changelog update found for root dependency changes"
missing_changelogs="${missing_changelogs}- \`prowler\` (root dependency files changed)"$'\n'
fi
fi
fi
fi
{
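The shell logic above flags each monitored folder whose files changed without a matching `CHANGELOG.md` update, and attributes root dependency files (`poetry.lock`, `pyproject.toml`) to the `prowler` changelog. Restated as a standalone function for clarity (names are illustrative, not taken from the workflow):

```python
def missing_changelogs(changed_files, monitored=("api", "ui", "prowler", "mcp_server")):
    """Return monitored folders that changed without their CHANGELOG.md.

    Root dependency files count toward the prowler changelog, mirroring
    the workflow step above. Sketch only; folder list is assumed.
    """
    missing = []
    changed = set(changed_files)
    for folder in monitored:
        in_folder = [f for f in changed if f.startswith(f"{folder}/")]
        if in_folder and f"{folder}/CHANGELOG.md" not in changed:
            missing.append(folder)
    # poetry.lock / pyproject.toml changes require a prowler changelog entry
    root_deps = changed & {"poetry.lock", "pyproject.toml"}
    if root_deps and "prowler/CHANGELOG.md" not in changed and "prowler" not in missing:
        missing.append("prowler")
    return missing

print(missing_changelogs(["api/views.py", "poetry.lock"]))
```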
+1 -1
@@ -32,7 +32,7 @@ jobs:
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: '**'
+2 -2
@@ -344,7 +344,7 @@ jobs:
- name: Create PR for API dependency update
if: ${{ env.PATCH_VERSION == '0' }}
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
commit-message: 'chore(api): update prowler dependency to ${{ env.BRANCH_NAME }} for release ${{ env.PROWLER_VERSION }}'
@@ -374,7 +374,7 @@ jobs:
no-changelog
- name: Create draft release
uses: softprops/action-gh-release@6da8fa9354ddfdc4aeace5fc48d7f679b5214090 # v2.4.1
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b # v2.5.0
with:
tag_name: ${{ env.PROWLER_VERSION }}
name: Prowler ${{ env.PROWLER_VERSION }}
+3 -3
@@ -91,7 +91,7 @@ jobs:
git --no-pager diff
- name: Create PR for next minor version to master
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -139,7 +139,7 @@ jobs:
git --no-pager diff
- name: Create PR for first patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -196,7 +196,7 @@ jobs:
git --no-pager diff
- name: Create PR for next patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -0,0 +1,91 @@
name: 'SDK: Check Duplicate Test Names'
on:
pull_request:
branches:
- 'master'
- 'v5.*'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
check-duplicate-test-names:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
timeout-minutes: 10
permissions:
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Check for duplicate test names across providers
run: |
python3 << 'EOF'
import sys
from collections import defaultdict
from pathlib import Path
def find_duplicate_test_names():
"""Find test files with the same name across different providers."""
tests_dir = Path("tests/providers")
if not tests_dir.exists():
print("tests/providers directory not found")
sys.exit(0)
# Dictionary: filename -> list of (provider, full_path)
test_files = defaultdict(list)
# Find all *_test.py files
for test_file in tests_dir.rglob("*_test.py"):
relative_path = test_file.relative_to(tests_dir)
provider = relative_path.parts[0]
filename = test_file.name
test_files[filename].append((provider, str(test_file)))
# Find duplicates (files appearing in multiple providers)
duplicates = {
filename: locations
for filename, locations in test_files.items()
if len(set(loc[0] for loc in locations)) > 1
}
if not duplicates:
print("No duplicate test file names found across providers.")
print("All test names are unique within the repository.")
sys.exit(0)
# Report duplicates
print("::error::Duplicate test file names found across providers!")
print()
print("=" * 70)
print("DUPLICATE TEST NAMES DETECTED")
print("=" * 70)
print()
print("The following test files have the same name in multiple providers.")
print("Please rename YOUR new test file by adding the provider prefix.")
print()
print("Example: 'kms_service_test.py' -> 'oraclecloud_kms_service_test.py'")
print()
for filename, locations in sorted(duplicates.items()):
print(f"### {filename}")
print(f" Found in {len(locations)} providers:")
for provider, path in sorted(locations):
print(f" - {provider}: {path}")
print()
print(f"  Suggested fix: Rename your new file to '<provider>_{filename}'")
print()
print("=" * 70)
print()
print("See: tests/providers/TESTING.md for naming conventions.")
sys.exit(1)
if __name__ == "__main__":
find_duplicate_test_names()
EOF
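The rule the step above enforces — a test file name appearing under more than one provider subtree — can be restated as a small pure function (illustrative only, independent of the inline script):

```python
from collections import defaultdict

def duplicate_test_names(paths):
    """Group 'provider/.../x_test.py' paths by file name and keep the
    names that occur under more than one provider."""
    by_name = defaultdict(set)
    for p in paths:
        provider = p.split("/", 1)[0]
        filename = p.rsplit("/", 1)[-1]
        by_name[filename].add(provider)
    return {name: sorted(providers)
            for name, providers in by_name.items() if len(providers) > 1}

print(duplicate_test_names([
    "aws/kms/kms_service_test.py",
    "oraclecloud/kms/kms_service_test.py",
    "gcp/iam/iam_service_test.py",
]))
```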
+5 -3
@@ -35,7 +35,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -47,6 +47,7 @@ jobs:
ui/**
dashboard/**
mcp_server/**
skills/**
README.md
mkdocs.yml
.backportrc.json
@@ -55,6 +56,7 @@ jobs:
examples/**
.gitignore
contrib/**
**/AGENTS.md
- name: Install Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -79,11 +81,11 @@ jobs:
- name: Lint with flake8
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api
run: poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api,skills
- name: Check format with black
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run black --exclude api ui --check .
run: poetry run black --exclude "api|ui|skills" --check .
- name: Lint with pylint
if: steps.check-changes.outputs.any_changed == 'true'
@@ -169,7 +169,7 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push SDK container for ${{ matrix.arch }}
id: container-push
@@ -193,13 +193,13 @@ jobs:
steps:
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -208,7 +208,7 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+5 -3
@@ -31,7 +31,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: Dockerfile
@@ -66,7 +66,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -78,6 +78,7 @@ jobs:
ui/**
dashboard/**
mcp_server/**
skills/**
README.md
mkdocs.yml
.backportrc.json
@@ -86,10 +87,11 @@ jobs:
examples/**
.gitignore
contrib/**
**/AGENTS.md
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build SDK container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
@@ -50,7 +50,7 @@ jobs:
- name: Create pull request
id: create-pr
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
author: 'prowler-bot <179230569+prowler-bot@users.noreply.github.com>'
@@ -0,0 +1,93 @@
name: 'SDK: Refresh OCI Regions'
on:
schedule:
- cron: '0 9 * * 1' # Every Monday at 09:00 UTC
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}
cancel-in-progress: false
env:
PYTHON_VERSION: '3.12'
jobs:
refresh-oci-regions:
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
timeout-minutes: 15
permissions:
pull-requests: write
contents: write
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
ref: 'master'
- name: Set up Python ${{ env.PYTHON_VERSION }}
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
- name: Install dependencies
run: pip install oci
- name: Update OCI regions
env:
OCI_CLI_USER: ${{ secrets.E2E_OCI_USER_ID }}
OCI_CLI_FINGERPRINT: ${{ secrets.E2E_OCI_FINGERPRINT }}
OCI_CLI_TENANCY: ${{ secrets.E2E_OCI_TENANCY_ID }}
OCI_CLI_KEY_CONTENT: ${{ secrets.E2E_OCI_KEY_CONTENT }}
OCI_CLI_REGION: ${{ secrets.E2E_OCI_REGION }}
run: python util/update_oci_regions.py
- name: Create pull request
id: create-pr
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
author: 'prowler-bot <179230569+prowler-bot@users.noreply.github.com>'
committer: 'github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>'
commit-message: 'feat(oraclecloud): update commercial regions'
branch: 'oci-regions-update-${{ github.run_number }}'
title: 'feat(oraclecloud): Update commercial regions'
labels: |
status/waiting-for-revision
no-changelog
body: |
### Description
Automated update of OCI commercial regions from the official Oracle Cloud Infrastructure Identity service.
**Trigger:** ${{ github.event_name == 'schedule' && 'Scheduled (weekly)' || github.event_name == 'workflow_dispatch' && 'Manual' || 'Workflow update' }}
**Run:** [#${{ github.run_number }}](${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})
### Changes
This PR updates the `OCI_COMMERCIAL_REGIONS` dictionary in `prowler/providers/oraclecloud/config.py` with the latest regions fetched from the OCI Identity API (`list_regions()`).
- Government regions (`OCI_GOVERNMENT_REGIONS`) are preserved unchanged
- Region display names are mapped from Oracle's official documentation
### Checklist
- [x] This is an automated update from OCI official sources
- [x] Government regions (us-langley-1, us-luke-1) preserved
- [x] No manual review of region data required
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: PR creation result
run: |
if [[ "${{ steps.create-pr.outputs.pull-request-number }}" ]]; then
echo "✓ Pull request #${{ steps.create-pr.outputs.pull-request-number }} created successfully"
echo "URL: ${{ steps.create-pr.outputs.pull-request-url }}"
else
echo "✓ No changes detected - OCI regions are up to date"
fi
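The workflow's `util/update_oci_regions.py` is not shown in this diff. A minimal sketch of the fetch-and-map step the PR body describes — region keys and names via the OCI SDK's `IdentityClient.list_regions()`; the dictionary shape and credential handling are assumptions, and the real script presumably builds its config from the `OCI_CLI_*` environment variables set above:

```python
def regions_to_dict(regions):
    """Map region records ({'name', 'key'}) to a {identifier: key} dict,
    sorted by identifier. The actual OCI_COMMERCIAL_REGIONS schema in
    prowler/providers/oraclecloud/config.py may differ."""
    return {r["name"]: r["key"] for r in sorted(regions, key=lambda r: r["name"])}

def fetch_commercial_regions():
    # Requires OCI credentials; sketch only, not the repository's script.
    import oci
    client = oci.identity.IdentityClient(oci.config.from_file())
    return regions_to_dict(
        [{"name": r.name, "key": r.key} for r in client.list_regions().data]
    )

print(regions_to_dict([{"name": "eu-madrid-1", "key": "MAD"},
                       {"name": "us-ashburn-1", "key": "IAD"}]))
```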
+7 -3
@@ -28,9 +28,11 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files:
./**
.github/workflows/sdk-security.yml
files_ignore: |
.github/**
prowler/CHANGELOG.md
@@ -40,6 +42,7 @@ jobs:
ui/**
dashboard/**
mcp_server/**
skills/**
README.md
mkdocs.yml
.backportrc.json
@@ -48,6 +51,7 @@ jobs:
examples/**
.gitignore
contrib/**
**/AGENTS.md
- name: Install Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -70,7 +74,7 @@ jobs:
- name: Security scan with Safety
if: steps.check-changes.outputs.any_changed == 'true'
run: poetry run safety check --ignore 70612 -r pyproject.toml
run: poetry run safety check -r pyproject.toml
- name: Dead code detection with Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+51 -25
@@ -35,7 +35,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -47,6 +47,7 @@ jobs:
ui/**
dashboard/**
mcp_server/**
skills/**
README.md
mkdocs.yml
.backportrc.json
@@ -55,6 +56,7 @@ jobs:
examples/**
.gitignore
contrib/**
**/AGENTS.md
- name: Install Poetry
if: steps.check-changes.outputs.any_changed == 'true'
@@ -75,7 +77,7 @@ jobs:
- name: Check if AWS files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-aws
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/aws/**
@@ -189,7 +191,7 @@ jobs:
- name: Upload AWS coverage to Codecov
if: steps.changed-aws.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -200,7 +202,7 @@ jobs:
- name: Check if Azure files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-azure
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/azure/**
@@ -213,7 +215,7 @@ jobs:
- name: Upload Azure coverage to Codecov
if: steps.changed-azure.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -224,7 +226,7 @@ jobs:
- name: Check if GCP files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-gcp
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/gcp/**
@@ -237,7 +239,7 @@ jobs:
- name: Upload GCP coverage to Codecov
if: steps.changed-gcp.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -248,7 +250,7 @@ jobs:
- name: Check if Kubernetes files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-kubernetes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/kubernetes/**
@@ -261,7 +263,7 @@ jobs:
- name: Upload Kubernetes coverage to Codecov
if: steps.changed-kubernetes.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -272,7 +274,7 @@ jobs:
- name: Check if GitHub files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-github
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/github/**
@@ -285,7 +287,7 @@ jobs:
- name: Upload GitHub coverage to Codecov
if: steps.changed-github.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -296,7 +298,7 @@ jobs:
- name: Check if NHN files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-nhn
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/nhn/**
@@ -309,7 +311,7 @@ jobs:
- name: Upload NHN coverage to Codecov
if: steps.changed-nhn.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -320,7 +322,7 @@ jobs:
- name: Check if M365 files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-m365
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/m365/**
@@ -333,7 +335,7 @@ jobs:
- name: Upload M365 coverage to Codecov
if: steps.changed-m365.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -344,7 +346,7 @@ jobs:
- name: Check if IaC files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-iac
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/iac/**
@@ -357,7 +359,7 @@ jobs:
- name: Upload IaC coverage to Codecov
if: steps.changed-iac.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -368,7 +370,7 @@ jobs:
- name: Check if MongoDB Atlas files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-mongodbatlas
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/mongodbatlas/**
@@ -381,7 +383,7 @@ jobs:
- name: Upload MongoDB Atlas coverage to Codecov
if: steps.changed-mongodbatlas.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -392,7 +394,7 @@ jobs:
- name: Check if OCI files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-oraclecloud
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/oraclecloud/**
@@ -405,18 +407,42 @@ jobs:
- name: Upload OCI coverage to Codecov
if: steps.changed-oraclecloud.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler-py${{ matrix.python-version }}-oraclecloud
files: ./oraclecloud_coverage.xml
# OpenStack Provider
- name: Check if OpenStack files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-openstack
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/openstack/**
./tests/**/openstack/**
./poetry.lock
- name: Run OpenStack tests
if: steps.changed-openstack.outputs.any_changed == 'true'
run: poetry run pytest -n auto --cov=./prowler/providers/openstack --cov-report=xml:openstack_coverage.xml tests/providers/openstack
- name: Upload OpenStack coverage to Codecov
if: steps.changed-openstack.outputs.any_changed == 'true'
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler-py${{ matrix.python-version }}-openstack
files: ./openstack_coverage.xml
# Lib
- name: Check if Lib files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-lib
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/lib/**
@@ -429,7 +455,7 @@ jobs:
- name: Upload Lib coverage to Codecov
if: steps.changed-lib.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -440,7 +466,7 @@ jobs:
- name: Check if Config files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-config
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/config/**
@@ -453,7 +479,7 @@ jobs:
- name: Upload Config coverage to Codecov
if: steps.changed-config.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
+112
@@ -0,0 +1,112 @@
name: Test Impact Analysis
on:
workflow_call:
outputs:
run-all:
description: "Whether to run all tests (critical path changed)"
value: ${{ jobs.analyze.outputs.run-all }}
sdk-tests:
description: "SDK test paths to run"
value: ${{ jobs.analyze.outputs.sdk-tests }}
api-tests:
description: "API test paths to run"
value: ${{ jobs.analyze.outputs.api-tests }}
ui-e2e:
description: "UI E2E test paths to run"
value: ${{ jobs.analyze.outputs.ui-e2e }}
modules:
description: "Comma-separated list of affected modules"
value: ${{ jobs.analyze.outputs.modules }}
has-tests:
description: "Whether there are any tests to run"
value: ${{ jobs.analyze.outputs.has-tests }}
has-sdk-tests:
description: "Whether there are SDK tests to run"
value: ${{ jobs.analyze.outputs.has-sdk-tests }}
has-api-tests:
description: "Whether there are API tests to run"
value: ${{ jobs.analyze.outputs.has-api-tests }}
has-ui-e2e:
description: "Whether there are UI E2E tests to run"
value: ${{ jobs.analyze.outputs.has-ui-e2e }}
jobs:
analyze:
runs-on: ubuntu-latest
timeout-minutes: 5
outputs:
run-all: ${{ steps.impact.outputs.run-all }}
sdk-tests: ${{ steps.impact.outputs.sdk-tests }}
api-tests: ${{ steps.impact.outputs.api-tests }}
ui-e2e: ${{ steps.impact.outputs.ui-e2e }}
modules: ${{ steps.impact.outputs.modules }}
has-tests: ${{ steps.impact.outputs.has-tests }}
has-sdk-tests: ${{ steps.set-flags.outputs.has-sdk-tests }}
has-api-tests: ${{ steps.set-flags.outputs.has-api-tests }}
has-ui-e2e: ${{ steps.set-flags.outputs.has-ui-e2e }}
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
- name: Setup Python
uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b # v5.3.0
with:
python-version: '3.12'
- name: Install PyYAML
run: pip install pyyaml
- name: Analyze test impact
id: impact
run: |
echo "Changed files:"
echo "${{ steps.changed-files.outputs.all_changed_files }}" | tr ' ' '\n'
echo ""
python .github/scripts/test-impact.py ${{ steps.changed-files.outputs.all_changed_files }}
- name: Set convenience flags
id: set-flags
run: |
if [[ -n "${{ steps.impact.outputs.sdk-tests }}" ]]; then
echo "has-sdk-tests=true" >> $GITHUB_OUTPUT
else
echo "has-sdk-tests=false" >> $GITHUB_OUTPUT
fi
if [[ -n "${{ steps.impact.outputs.api-tests }}" ]]; then
echo "has-api-tests=true" >> $GITHUB_OUTPUT
else
echo "has-api-tests=false" >> $GITHUB_OUTPUT
fi
if [[ -n "${{ steps.impact.outputs.ui-e2e }}" ]]; then
echo "has-ui-e2e=true" >> $GITHUB_OUTPUT
else
echo "has-ui-e2e=false" >> $GITHUB_OUTPUT
fi
- name: Summary
run: |
echo "## Test Impact Analysis" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
if [[ "${{ steps.impact.outputs.run-all }}" == "true" ]]; then
echo "🚨 **Critical path changed - running ALL tests**" >> $GITHUB_STEP_SUMMARY
else
echo "### Affected Modules" >> $GITHUB_STEP_SUMMARY
echo "\`${{ steps.impact.outputs.modules }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "### Tests to Run" >> $GITHUB_STEP_SUMMARY
echo "| Category | Paths |" >> $GITHUB_STEP_SUMMARY
echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
echo "| SDK Tests | \`${{ steps.impact.outputs.sdk-tests || 'none' }}\` |" >> $GITHUB_STEP_SUMMARY
echo "| API Tests | \`${{ steps.impact.outputs.api-tests || 'none' }}\` |" >> $GITHUB_STEP_SUMMARY
echo "| UI E2E | \`${{ steps.impact.outputs.ui-e2e || 'none' }}\` |" >> $GITHUB_STEP_SUMMARY
fi
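The "Set convenience flags" step in this new workflow maps possibly-empty path outputs to explicit booleans so downstream jobs can use simple string comparisons. A minimal local sketch of that pattern (the `emit_flag` helper and the sample values are illustrative, not part of the workflow):

```shell
# Hypothetical helper mirroring the workflow's flag logic:
# a non-empty value yields "<name>=true", an empty one "<name>=false".
emit_flag() {
  local name="$1" value="$2"
  if [[ -n "$value" ]]; then
    echo "${name}=true"
  else
    echo "${name}=false"
  fi
}

emit_flag has-sdk-tests "tests/providers/aws"  # prints has-sdk-tests=true
emit_flag has-ui-e2e ""                        # prints has-ui-e2e=false
```

In the workflow the same `name=value` lines are appended to `$GITHUB_OUTPUT` rather than printed to stdout.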
+3 -3
@@ -90,7 +90,7 @@ jobs:
git --no-pager diff
- name: Create PR for next minor version to master
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -140,7 +140,7 @@ jobs:
git --no-pager diff
- name: Create PR for first patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -199,7 +199,7 @@ jobs:
git --no-pager diff
- name: Create PR for next patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -104,7 +104,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push UI container for ${{ matrix.arch }}
id: container-push
@@ -130,13 +130,13 @@ jobs:
steps:
- name: Login to DockerHub
-uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+4 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ui/Dockerfile
@@ -67,16 +67,17 @@ jobs:
- name: Check for UI changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ui/**
files_ignore: |
ui/CHANGELOG.md
ui/README.md
ui/AGENTS.md
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build UI container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
@@ -1,4 +1,8 @@
-name: UI - E2E Tests
+name: UI - E2E Tests (Optimized)
+# This is an optimized version that runs only relevant E2E tests
+# based on changed files. Falls back to running all tests if
+# critical paths are changed or if impact analysis fails.
on:
pull_request:
@@ -6,13 +10,23 @@ on:
- master
- "v5.*"
paths:
-- '.github/workflows/ui-e2e-tests.yml'
+- '.github/workflows/ui-e2e-tests-v2.yml'
+- '.github/test-impact.yml'
- 'ui/**'
+- 'api/**' # API changes can affect UI E2E
jobs:
-e2e-tests:
+# First, analyze which tests need to run
+impact-analysis:
+if: github.repository == 'prowler-cloud/prowler'
+uses: ./.github/workflows/test-impact-analysis.yml
+# Run E2E tests based on impact analysis
+e2e-tests:
+needs: impact-analysis
+if: |
+github.repository == 'prowler-cloud/prowler' &&
+(needs.impact-analysis.outputs.has-ui-e2e == 'true' || needs.impact-analysis.outputs.run-all == 'true')
runs-on: ubuntu-latest
env:
AUTH_SECRET: 'fallback-ci-secret-for-testing'
@@ -51,118 +65,185 @@ jobs:
E2E_OCI_KEY_CONTENT: ${{ secrets.E2E_OCI_KEY_CONTENT }}
E2E_OCI_REGION: ${{ secrets.E2E_OCI_REGION }}
E2E_NEW_USER_PASSWORD: ${{ secrets.E2E_NEW_USER_PASSWORD }}
E2E_ALIBABACLOUD_ACCOUNT_ID: ${{ secrets.E2E_ALIBABACLOUD_ACCOUNT_ID }}
E2E_ALIBABACLOUD_ACCESS_KEY_ID: ${{ secrets.E2E_ALIBABACLOUD_ACCESS_KEY_ID }}
E2E_ALIBABACLOUD_ACCESS_KEY_SECRET: ${{ secrets.E2E_ALIBABACLOUD_ACCESS_KEY_SECRET }}
E2E_ALIBABACLOUD_ROLE_ARN: ${{ secrets.E2E_ALIBABACLOUD_ROLE_ARN }}
# Pass E2E paths from impact analysis
E2E_TEST_PATHS: ${{ needs.impact-analysis.outputs.ui-e2e }}
RUN_ALL_TESTS: ${{ needs.impact-analysis.outputs.run-all }}
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Show test scope
run: |
echo "## E2E Test Scope" >> $GITHUB_STEP_SUMMARY
if [[ "${{ env.RUN_ALL_TESTS }}" == "true" ]]; then
echo "Running **ALL** E2E tests (critical path changed)" >> $GITHUB_STEP_SUMMARY
else
echo "Running tests matching: \`${{ env.E2E_TEST_PATHS }}\`" >> $GITHUB_STEP_SUMMARY
fi
echo ""
echo "Affected modules: \`${{ needs.impact-analysis.outputs.modules }}\`" >> $GITHUB_STEP_SUMMARY
- name: Create k8s Kind Cluster
uses: helm/kind-action@v1
with:
cluster_name: kind
- name: Modify kubeconfig
run: |
# Point the kubeconfig server at https://kind-control-plane:6443 so the
# worker service in docker-compose.yml can reach the kind cluster
kubectl config set-cluster kind-kind --server=https://kind-control-plane:6443
kubectl config view
kubectl config set-cluster kind-kind --server=https://kind-control-plane:6443
kubectl config view
- name: Add network kind to docker compose
run: |
# Add the network kind to the docker compose to interconnect to kind cluster
yq -i '.networks.kind.external = true' docker-compose.yml
# Add network kind to worker service and default network too
yq -i '.services.worker.networks = ["kind","default"]' docker-compose.yml
- name: Fix API data directory permissions
run: docker run --rm -v $(pwd)/_data/api:/data alpine chown -R 1000:1000 /data
-- name: Add AWS credentials for testing AWS SDK Default Adding Provider
+- name: Add AWS credentials for testing
run: |
echo "Adding AWS credentials for testing AWS SDK Default Adding Provider..."
echo "AWS_ACCESS_KEY_ID=${{ secrets.E2E_AWS_PROVIDER_ACCESS_KEY }}" >> .env
echo "AWS_SECRET_ACCESS_KEY=${{ secrets.E2E_AWS_PROVIDER_SECRET_KEY }}" >> .env
- name: Start API services
run: |
# Override docker-compose image tag to use latest instead of stable
# This overrides any PROWLER_API_VERSION set in .env file
export PROWLER_API_VERSION=latest
echo "Using PROWLER_API_VERSION=${PROWLER_API_VERSION}"
docker compose up -d api worker worker-beat
- name: Wait for API to be ready
run: |
echo "Waiting for prowler-api..."
-timeout=150 # 5 minutes max
+timeout=150
elapsed=0
while [ $elapsed -lt $timeout ]; do
if curl -s ${NEXT_PUBLIC_API_BASE_URL}/docs >/dev/null 2>&1; then
echo "Prowler API is ready!"
exit 0
fi
-echo "Waiting for prowler-api... (${elapsed}s elapsed)"
+echo "Waiting... (${elapsed}s elapsed)"
sleep 5
elapsed=$((elapsed + 5))
done
-echo "Timeout waiting for prowler-api to start"
+echo "Timeout waiting for prowler-api"
exit 1
-- name: Load database fixtures for E2E tests
+- name: Load database fixtures
run: |
docker compose exec -T api sh -c '
echo "Loading all fixtures from api/fixtures/dev/..."
for fixture in api/fixtures/dev/*.json; do
if [ -f "$fixture" ]; then
echo "Loading $fixture"
poetry run python manage.py loaddata "$fixture" --database admin
fi
done
echo "All database fixtures loaded successfully!"
'
-- name: Setup Node.js environment
-uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
+- name: Setup Node.js
+uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
-node-version: '20.x'
+node-version: '24.13.0'
- name: Setup pnpm
uses: pnpm/action-setup@v4
with:
version: 10
run_install: false
- name: Get pnpm store directory
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
-- name: Setup pnpm cache
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+- name: Setup pnpm and Next.js cache
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
-path: ${{ env.STORE_PATH }}
-key: ${{ runner.os }}-pnpm-store-${{ hashFiles('ui/pnpm-lock.yaml') }}
+path: |
+${{ env.STORE_PATH }}
+./ui/node_modules
+./ui/.next/cache
+key: ${{ runner.os }}-pnpm-nextjs-${{ hashFiles('ui/pnpm-lock.yaml') }}-${{ hashFiles('ui/**/*.ts', 'ui/**/*.tsx', 'ui/**/*.js', 'ui/**/*.jsx') }}
restore-keys: |
-${{ runner.os }}-pnpm-store-
+${{ runner.os }}-pnpm-nextjs-${{ hashFiles('ui/pnpm-lock.yaml') }}-
+${{ runner.os }}-pnpm-nextjs-
- name: Install UI dependencies
working-directory: ./ui
-run: pnpm install --frozen-lockfile
+run: pnpm install --frozen-lockfile --prefer-offline
- name: Build UI application
working-directory: ./ui
run: pnpm run build
- name: Cache Playwright browsers
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
id: playwright-cache
with:
path: ~/.cache/ms-playwright
key: ${{ runner.os }}-playwright-${{ hashFiles('ui/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-playwright-
- name: Install Playwright browsers
working-directory: ./ui
if: steps.playwright-cache.outputs.cache-hit != 'true'
run: pnpm run test:e2e:install
- name: Run E2E tests
working-directory: ./ui
-run: pnpm run test:e2e
+run: |
if [[ "${{ env.RUN_ALL_TESTS }}" == "true" ]]; then
echo "Running ALL E2E tests..."
pnpm run test:e2e
else
echo "Running targeted E2E tests: ${{ env.E2E_TEST_PATHS }}"
# Convert glob patterns to playwright test paths
# e.g., "ui/tests/providers/**" -> "tests/providers"
TEST_PATHS="${{ env.E2E_TEST_PATHS }}"
# Remove ui/ prefix and convert ** to empty (playwright handles recursion)
TEST_PATHS=$(echo "$TEST_PATHS" | sed 's|ui/||g' | sed 's|\*\*||g' | tr ' ' '\n' | sort -u)
# Drop auth setup helpers (not runnable test suites)
TEST_PATHS=$(echo "$TEST_PATHS" | grep -v '^tests/setups/')
if [[ -z "$TEST_PATHS" ]]; then
echo "No runnable E2E test paths after filtering setups"
exit 0
fi
TEST_PATHS=$(echo "$TEST_PATHS" | tr '\n' ' ')
echo "Resolved test paths: $TEST_PATHS"
pnpm exec playwright test $TEST_PATHS
fi
- name: Upload test reports
-uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
if: failure()
with:
name: playwright-report
path: ui/playwright-report/
retention-days: 30
- name: Cleanup services
if: always()
run: |
echo "Shutting down services..."
docker compose down -v || true
echo "Cleanup completed"
# Skip job - provides clear feedback when no E2E tests needed
skip-e2e:
needs: impact-analysis
if: |
github.repository == 'prowler-cloud/prowler' &&
needs.impact-analysis.outputs.has-ui-e2e != 'true' &&
needs.impact-analysis.outputs.run-all != 'true'
runs-on: ubuntu-latest
steps:
- name: No E2E tests needed
run: |
echo "## E2E Tests Skipped" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "No UI E2E tests needed for this change." >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "Affected modules: \`${{ needs.impact-analysis.outputs.modules }}\`" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "To run all tests, modify a file in a critical path (e.g., \`ui/lib/**\`)." >> $GITHUB_STEP_SUMMARY
+14 -9
@@ -16,7 +16,7 @@ concurrency:
env:
UI_WORKING_DIR: ./ui
-NODE_VERSION: '20.x'
+NODE_VERSION: '24.13.0'
jobs:
ui-tests:
@@ -34,7 +34,7 @@ jobs:
- name: Check for UI changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
ui/**
@@ -42,10 +42,11 @@ jobs:
files_ignore: |
ui/CHANGELOG.md
ui/README.md
ui/AGENTS.md
- name: Setup Node.js ${{ env.NODE_VERSION }}
if: steps.check-changes.outputs.any_changed == 'true'
-uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
+uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: ${{ env.NODE_VERSION }}
@@ -61,18 +62,22 @@ jobs:
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
-- name: Setup pnpm cache
+- name: Setup pnpm and Next.js cache
if: steps.check-changes.outputs.any_changed == 'true'
-uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
-path: ${{ env.STORE_PATH }}
-key: ${{ runner.os }}-pnpm-store-${{ hashFiles('ui/pnpm-lock.yaml') }}
+path: |
+${{ env.STORE_PATH }}
+${{ env.UI_WORKING_DIR }}/node_modules
+${{ env.UI_WORKING_DIR }}/.next/cache
+key: ${{ runner.os }}-pnpm-nextjs-${{ hashFiles('ui/pnpm-lock.yaml') }}-${{ hashFiles('ui/**/*.ts', 'ui/**/*.tsx', 'ui/**/*.js', 'ui/**/*.jsx') }}
restore-keys: |
-${{ runner.os }}-pnpm-store-
+${{ runner.os }}-pnpm-nextjs-${{ hashFiles('ui/pnpm-lock.yaml') }}-
+${{ runner.os }}-pnpm-nextjs-
- name: Install dependencies
if: steps.check-changes.outputs.any_changed == 'true'
-run: pnpm install --frozen-lockfile
+run: pnpm install --frozen-lockfile --prefer-offline
- name: Run healthcheck
if: steps.check-changes.outputs.any_changed == 'true'
+12 -1
@@ -82,6 +82,9 @@ continue.json
.continuerc
.continuerc.json
# AI Coding Assistants - OpenCode
opencode.json
# AI Coding Assistants - GitHub Copilot
.copilot/
.github/copilot/
@@ -147,8 +150,16 @@ node_modules
# Persistent data
_data/
-# Claude
+# AI Instructions (generated by skills/setup.sh from AGENTS.md)
CLAUDE.md
GEMINI.md
.github/copilot-instructions.md
# Compliance report
*.pdf
# AI Skills symlinks (generated by skills/setup.sh)
.claude/skills
.codex/skills
.github/skills
.gemini/skills
+10 -6
@@ -34,6 +34,7 @@ repos:
rev: v2.3.1
hooks:
- id: autoflake
exclude: ^skills/
args:
[
"--in-place",
@@ -41,22 +42,24 @@ repos:
"--remove-unused-variable",
]
-- repo: https://github.com/timothycrosley/isort
+- repo: https://github.com/pycqa/isort
rev: 5.13.2
hooks:
- id: isort
exclude: ^skills/
args: ["--profile", "black"]
- repo: https://github.com/psf/black
rev: 24.4.2
hooks:
- id: black
exclude: ^skills/
- repo: https://github.com/pycqa/flake8
rev: 7.0.0
hooks:
- id: flake8
-exclude: contrib
+exclude: (contrib|^skills/)
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
@@ -82,7 +85,6 @@ repos:
args: ["--directory=./"]
pass_filenames: false
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
hooks:
@@ -109,7 +111,7 @@ repos:
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
-entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
+entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/,./skills/' -r .'
language: system
files: '.*\.py'
@@ -117,13 +119,15 @@ repos:
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
# TODO: Botocore needs urllib3 1.X so we need to ignore these vulnerabilities 77744,77745. Remove this once we upgrade to urllib3 2.X
-entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353,77744,77745'
+# TODO: 79023 & 79027 knack ReDoS until `azure-cli-core` (via `cartography`) allows `knack` >=0.13.0
+# TODO: 84420 from `azure-core`, which we need to fix together with `azure-cli-core` and `knack`
+entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353,77744,77745,79023,79027,84420'
language: system
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
-entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/" --min-confidence 100 .'
+entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/,skills/" --min-confidence 100 .'
language: system
files: '.*\.py'
+132 -88
@@ -2,109 +2,153 @@
## How to Use This Guide
-- Start here for cross-project norms, Prowler is a monorepo with several components. Every component should have an `AGENTS.md` file that contains the guidelines for the agents in that component. The file is located beside the code you are touching (e.g. `api/AGENTS.md`, `ui/AGENTS.md`, `prowler/AGENTS.md`).
-- Follow the stricter rule when guidance conflicts; component docs override this file for their scope.
-- Keep instructions synchronized. When you add new workflows or scripts, update both the relevant component `AGENTS.md` and this file if they apply broadly.
+- Start here for cross-project norms. Prowler is a monorepo with several components.
+- Each component has an `AGENTS.md` file with specific guidelines (e.g., `api/AGENTS.md`, `ui/AGENTS.md`).
+- Component docs override this file when guidance conflicts.
## Available Skills
Use these skills for detailed patterns on-demand:
### Generic Skills (Any Project)
| Skill | Description | URL |
|-------|-------------|-----|
| `typescript` | Const types, flat interfaces, utility types | [SKILL.md](skills/typescript/SKILL.md) |
| `react-19` | No useMemo/useCallback, React Compiler | [SKILL.md](skills/react-19/SKILL.md) |
| `nextjs-15` | App Router, Server Actions, streaming | [SKILL.md](skills/nextjs-15/SKILL.md) |
| `tailwind-4` | cn() utility, no var() in className | [SKILL.md](skills/tailwind-4/SKILL.md) |
| `playwright` | Page Object Model, MCP workflow, selectors | [SKILL.md](skills/playwright/SKILL.md) |
| `pytest` | Fixtures, mocking, markers, parametrize | [SKILL.md](skills/pytest/SKILL.md) |
| `django-drf` | ViewSets, Serializers, Filters | [SKILL.md](skills/django-drf/SKILL.md) |
| `jsonapi` | Strict JSON:API v1.1 spec compliance | [SKILL.md](skills/jsonapi/SKILL.md) |
| `zod-4` | New API (z.email(), z.uuid()) | [SKILL.md](skills/zod-4/SKILL.md) |
| `zustand-5` | Persist, selectors, slices | [SKILL.md](skills/zustand-5/SKILL.md) |
| `ai-sdk-5` | UIMessage, streaming, LangChain | [SKILL.md](skills/ai-sdk-5/SKILL.md) |
### Prowler-Specific Skills
| Skill | Description | URL |
|-------|-------------|-----|
| `prowler` | Project overview, component navigation | [SKILL.md](skills/prowler/SKILL.md) |
| `prowler-api` | Django + RLS + JSON:API patterns | [SKILL.md](skills/prowler-api/SKILL.md) |
| `prowler-ui` | Next.js + shadcn conventions | [SKILL.md](skills/prowler-ui/SKILL.md) |
| `prowler-sdk-check` | Create new security checks | [SKILL.md](skills/prowler-sdk-check/SKILL.md) |
| `prowler-mcp` | MCP server tools and models | [SKILL.md](skills/prowler-mcp/SKILL.md) |
| `prowler-test-sdk` | SDK testing (pytest + moto) | [SKILL.md](skills/prowler-test-sdk/SKILL.md) |
| `prowler-test-api` | API testing (pytest-django + RLS) | [SKILL.md](skills/prowler-test-api/SKILL.md) |
| `prowler-test-ui` | E2E testing (Playwright) | [SKILL.md](skills/prowler-test-ui/SKILL.md) |
| `prowler-compliance` | Compliance framework structure | [SKILL.md](skills/prowler-compliance/SKILL.md) |
| `prowler-compliance-review` | Review compliance framework PRs | [SKILL.md](skills/prowler-compliance-review/SKILL.md) |
| `prowler-provider` | Add new cloud providers | [SKILL.md](skills/prowler-provider/SKILL.md) |
| `prowler-changelog` | Changelog entries (keepachangelog.com) | [SKILL.md](skills/prowler-changelog/SKILL.md) |
| `prowler-ci` | CI checks and PR gates (GitHub Actions) | [SKILL.md](skills/prowler-ci/SKILL.md) |
| `prowler-commit` | Professional commits (conventional-commits) | [SKILL.md](skills/prowler-commit/SKILL.md) |
| `prowler-pr` | Pull request conventions | [SKILL.md](skills/prowler-pr/SKILL.md) |
| `prowler-docs` | Documentation style guide | [SKILL.md](skills/prowler-docs/SKILL.md) |
| `prowler-attack-paths-query` | Create Attack Paths openCypher queries | [SKILL.md](skills/prowler-attack-paths-query/SKILL.md) |
| `skill-creator` | Create new AI agent skills | [SKILL.md](skills/skill-creator/SKILL.md) |
### Auto-invoke Skills
When performing these actions, ALWAYS invoke the corresponding skill FIRST:
| Action | Skill |
|--------|-------|
| Add changelog entry for a PR or feature | `prowler-changelog` |
| Adding DRF pagination or permissions | `django-drf` |
| Adding new providers | `prowler-provider` |
| Adding services to existing providers | `prowler-provider` |
| Adding privilege escalation detection queries | `prowler-attack-paths-query` |
| After creating/modifying a skill | `skill-sync` |
| App Router / Server Actions | `nextjs-15` |
| Building AI chat features | `ai-sdk-5` |
| Committing changes | `prowler-commit` |
| Create PR that requires changelog entry | `prowler-changelog` |
| Create a PR with gh pr create | `prowler-pr` |
| Creating API endpoints | `jsonapi` |
| Creating Attack Paths queries | `prowler-attack-paths-query` |
| Creating ViewSets, serializers, or filters in api/ | `django-drf` |
| Creating Zod schemas | `zod-4` |
| Creating a git commit | `prowler-commit` |
| Creating new checks | `prowler-sdk-check` |
| Creating new skills | `skill-creator` |
| Creating/modifying Prowler UI components | `prowler-ui` |
| Creating/modifying models, views, serializers | `prowler-api` |
| Creating/updating compliance frameworks | `prowler-compliance` |
| Debug why a GitHub Actions job is failing | `prowler-ci` |
| Fill .github/pull_request_template.md (Context/Description/Steps to review/Checklist) | `prowler-pr` |
| General Prowler development questions | `prowler` |
| Implementing JSON:API endpoints | `django-drf` |
| Inspect PR CI checks and gates (.github/workflows/*) | `prowler-ci` |
| Inspect PR CI workflows (.github/workflows/*): conventional-commit, pr-check-changelog, pr-conflict-checker, labeler | `prowler-pr` |
| Mapping checks to compliance controls | `prowler-compliance` |
| Mocking AWS with moto in tests | `prowler-test-sdk` |
| Modifying API responses | `jsonapi` |
| Regenerate AGENTS.md Auto-invoke tables (sync.sh) | `skill-sync` |
| Review PR requirements: template, title conventions, changelog gate | `prowler-pr` |
| Review changelog format and conventions | `prowler-changelog` |
| Reviewing JSON:API compliance | `jsonapi` |
| Reviewing compliance framework PRs | `prowler-compliance-review` |
| Testing RLS tenant isolation | `prowler-test-api` |
| Troubleshoot why a skill is missing from AGENTS.md auto-invoke | `skill-sync` |
| Understand CODEOWNERS/labeler-based automation | `prowler-ci` |
| Understand PR title conventional-commit validation | `prowler-ci` |
| Understand changelog gate and no-changelog label behavior | `prowler-ci` |
| Understand review ownership with CODEOWNERS | `prowler-pr` |
| Update CHANGELOG.md in any component | `prowler-changelog` |
| Updating existing Attack Paths queries | `prowler-attack-paths-query` |
| Updating existing checks and metadata | `prowler-sdk-check` |
| Using Zustand stores | `zustand-5` |
| Working on MCP server tools | `prowler-mcp` |
| Working on Prowler UI structure (actions/adapters/types/hooks) | `prowler-ui` |
| Working with Prowler UI test helpers/pages | `prowler-test-ui` |
| Working with Tailwind classes | `tailwind-4` |
| Writing Playwright E2E tests | `playwright` |
| Writing Prowler API tests | `prowler-test-api` |
| Writing Prowler SDK tests | `prowler-test-sdk` |
| Writing Prowler UI E2E tests | `prowler-test-ui` |
| Writing Python tests with pytest | `pytest` |
| Writing React components | `react-19` |
| Writing TypeScript types/interfaces | `typescript` |
| Writing documentation | `prowler-docs` |
---
## Project Overview
-Prowler is an open-source cloud security assessment tool that supports multiple cloud providers (AWS, Azure, GCP, Kubernetes, GitHub, M365, etc.). The project is a monorepo with the following main components:
+Prowler is an open-source cloud security assessment tool supporting AWS, Azure, GCP, Kubernetes, GitHub, M365, and more.
-- **Prowler SDK**: Python SDK, includes the Prowler CLI, providers, services, checks, compliances, config, etc. (`prowler/`)
-- **Prowler API**: Django-based REST API backend (`api/`)
-- **Prowler UI**: Next.js frontend application (`ui/`)
-- **Prowler MCP Server**: Model Context Protocol server that gives access to the entire Prowler ecosystem for LLMs (`mcp_server/`)
-- **Prowler Dashboard**: Prowler CLI feature to visualize scan results in a simple dashboard (`dashboard/`)
+| Component | Location | Tech Stack |
+|-----------|----------|------------|
+| SDK | `prowler/` | Python 3.9+, Poetry |
+| API | `api/` | Django 5.1, DRF, Celery |
+| UI | `ui/` | Next.js 15, React 19, Tailwind 4 |
+| MCP Server | `mcp_server/` | FastMCP, Python 3.12+ |
+| Dashboard | `dashboard/` | Dash, Plotly |
### Project Structure (Key Folders & Files)
- `prowler/`: Main source code for Prowler SDK (CLI, providers, services, checks, compliances, config, etc.)
- `api/`: Django-based REST API backend components
- `ui/`: Next.js frontend application
- `mcp_server/`: Model Context Protocol server that gives access to the entire Prowler ecosystem for LLMs
- `dashboard/`: Prowler CLI feature to visualize scan results in a simple dashboard
- `docs/`: Documentation
- `examples/`: Example output formats for providers and scripts
- `permissions/`: Permission-related files and policies
- `contrib/`: Community-contributed scripts or modules
- `tests/`: Prowler SDK test suite
- `docker-compose.yml`: Docker compose file to run the Prowler App (API + UI) production environment
- `docker-compose-dev.yml`: Docker compose file to run the Prowler App (API + UI) development environment
- `pyproject.toml`: Poetry Prowler SDK project file
- `.pre-commit-config.yaml`: Pre-commit hooks configuration
- `Makefile`: Makefile to run the project
- `LICENSE`: License file
- `README.md`: README file
- `CONTRIBUTING.md`: Contributing guide
---
## Python Development
Most of the codebase is Python, so the root-level project files are geared toward Python development.
### Poetry Dev Environment
For Python development we recommend `poetry` (minimum version `2.1.1`) to manage dependencies, and running all commands via `poetry run ...`.
Install the development dependencies with `poetry install --with dev`.
### Pre-commit hooks
The project has pre-commit hooks to lint and format the code. They are installed by running `poetry run pre-commit install`.
When committing a change, the hooks run automatically. Some of them are:
- Code formatting (black, isort)
- Linting (flake8, pylint)
- Security checks (bandit, safety, trufflehog)
- YAML/JSON validation
- Poetry lock file validation
### Linting and Formatting
We use the following tools to lint and format the code:
- `flake8`: for linting the code
- `black`: for formatting the code
- `pylint`: for linting the code
You can run all using the `make` command:
```bash
# Setup
poetry install --with dev
poetry run pre-commit install
# Code quality
poetry run make lint
poetry run make format
poetry run pre-commit run --all-files
```
Or they will be run automatically when you commit your changes using pre-commit hooks.
---
## Commit & Pull Request Guidelines
Commit messages and pull request titles follow the conventional-commit style: `<type>[scope]: <description>`
Before creating a pull request, complete the checklist in `.github/pull_request_template.md`. Summaries should explain deployment impact, highlight review steps, and note changelog or permission updates. Run all relevant tests and linters before requesting review, and link screenshots for UI or dashboard changes.
**Types:** `feat`, `fix`, `docs`, `chore`, `perf`, `refactor`, `style`, `test`
### Conventional Commit Style
The Conventional Commits specification is a lightweight convention on top of commit messages. It provides an easy set of rules for creating an explicit commit history, which makes it easier to build automated tooling on top of it.
The commit message should be structured as follows:
```
<type>[optional scope]: <description>
<BLANK LINE>
[optional body]
<BLANK LINE>
[optional footer(s)]
```
No line of the commit message may be longer than 100 characters. This keeps the message easy to read on GitHub as well as in various git tools.
#### Commit Types
- **feat**: a code change that introduces new functionality to the application
- **fix**: a code change that solves a bug in the codebase
- **docs**: documentation only changes
- **chore**: changes to the build process or auxiliary tools and libraries that do not affect the application's functionality
- **perf**: code change that improves performance
- **refactor**: code change that neither fixes a bug nor adds a feature
- **style**: changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
- **test**: adding missing tests or correcting existing tests
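A hypothetical commit message following this convention (the scope and description here are invented for illustration):

```
fix(api): return 404 for missing provider secrets

Previously the endpoint raised an unhandled exception when the
secret had already been deleted.
```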
Before creating a PR:
1. Complete checklist in `.github/pull_request_template.md`
2. Run all relevant tests and linters
3. Link screenshots for UI changes
@@ -80,6 +80,23 @@ prowler dashboard
```
![Prowler Dashboard](docs/images/products/dashboard.png)
## Attack Paths
Attack Paths automatically extends every completed AWS scan with a Neo4j graph that combines Cartography's cloud inventory with Prowler findings. The feature runs in the API worker after each scan and therefore requires:
- An accessible Neo4j instance (the Docker Compose files already ship a `neo4j` service).
- The following environment variables so Django and Celery can connect:
| Variable | Description | Default |
| --- | --- | --- |
| `NEO4J_HOST` | Hostname used by the API containers. | `neo4j` |
| `NEO4J_PORT` | Bolt port exposed by Neo4j. | `7687` |
| `NEO4J_USER` / `NEO4J_PASSWORD` | Credentials with rights to create per-tenant databases. | `neo4j` / `neo4j_password` |
Every AWS provider scan will enqueue an Attack Paths ingestion job automatically. Other cloud providers will be added in future iterations.
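As a minimal sketch, the variables can be exported before starting the API containers (values are the defaults from the table above; override them for real deployments):

```shell
# Default values from the table above; override for real deployments.
export NEO4J_HOST=neo4j
export NEO4J_PORT=7687
export NEO4J_USER=neo4j
export NEO4J_PASSWORD=neo4j_password
```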
# Prowler at a Glance
> [!Tip]
> For the most accurate and up-to-date information about checks, services, frameworks, and categories, visit [**Prowler Hub**](https://hub.prowler.com).
@@ -87,17 +104,19 @@ prowler dashboard
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) | Support | Interface |
|---|---|---|---|---|---|---|
| AWS | 572 | 83 | 41 | 17 | Official | UI, API, CLI |
| Azure | 165 | 20 | 18 | 13 | Official | UI, API, CLI |
| GCP | 100 | 13 | 15 | 11 | Official | UI, API, CLI |
| Kubernetes | 83 | 7 | 7 | 9 | Official | UI, API, CLI |
| GitHub | 21 | 2 | 1 | 2 | Official | UI, API, CLI |
| M365 | 75 | 7 | 4 | 4 | Official | UI, API, CLI |
| OCI | 51 | 13 | 3 | 12 | Official | UI, API, CLI |
| Alibaba Cloud | 61 | 9 | 3 | 9 | Official | UI, API, CLI |
| Cloudflare | 29 | 2 | 0 | 5 | Official | CLI, API |
| IaC | [See `trivy` docs.](https://trivy.dev/latest/docs/coverage/iac/) | N/A | N/A | N/A | Official | UI, API, CLI |
| MongoDB Atlas | 10 | 3 | 0 | 3 | Official | UI, API, CLI |
| LLM | [See `promptfoo` docs.](https://www.promptfoo.dev/docs/red-team/plugins/) | N/A | N/A | N/A | Official | CLI |
| OpenStack | 1 | 1 | 0 | 2 | Official | CLI |
| NHN | 6 | 2 | 1 | 0 | Unofficial | CLI |
> [!Note]
@@ -310,6 +329,45 @@ And many more environments.
![Architecture](docs/img/architecture.png)
# 🤖 AI Skills for Development
Prowler includes a comprehensive set of **AI Skills** that help AI coding assistants understand Prowler's codebase patterns and conventions.
## What are AI Skills?
Skills are structured instructions that give AI assistants the context they need to write code that follows Prowler's standards. They include:
- **Coding patterns** for each component (SDK, API, UI, MCP Server)
- **Testing conventions** (pytest, Playwright)
- **Architecture guidelines** (Clean Architecture, RLS patterns)
- **Framework-specific rules** (React 19, Next.js 15, Django DRF, Tailwind 4)
## Available Skills
| Category | Skills |
|----------|--------|
| **Generic** | `typescript`, `react-19`, `nextjs-15`, `tailwind-4`, `playwright`, `pytest`, `django-drf`, `zod-4`, `zustand-5`, `ai-sdk-5` |
| **Prowler** | `prowler`, `prowler-api`, `prowler-ui`, `prowler-mcp`, `prowler-sdk-check`, `prowler-test-ui`, `prowler-test-api`, `prowler-test-sdk`, `prowler-compliance`, `prowler-provider`, `prowler-pr`, `prowler-docs` |
## Setup
```bash
./skills/setup.sh
```
This configures skills for AI coding assistants that follow the [agentskills.io](https://agentskills.io) standard:
| Tool | Configuration |
|------|---------------|
| **Claude Code** | `.claude/skills/` (symlink) |
| **OpenCode** | `.claude/skills/` (symlink) |
| **Codex (OpenAI)** | `.codex/skills/` (symlink) |
| **GitHub Copilot** | `.github/skills/` (symlink) |
| **Gemini CLI** | `.gemini/skills/` (symlink) |
> **Note:** Restart your AI coding assistant after running setup to load the skills.
> Gemini CLI requires `experimental.skills` enabled in settings.
# 📖 Documentation
For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
@@ -62,4 +62,4 @@ We strive to resolve all problems as quickly as possible, and we would like to p
---
For more information about our security policies, please refer to our [Security](https://docs.prowler.com/security) section in our documentation.
@@ -0,0 +1,167 @@
# Prowler API - AI Agent Ruleset
> **Skills Reference**: For detailed patterns, use these skills:
> - [`prowler-api`](../skills/prowler-api/SKILL.md) - Models, Serializers, Views, RLS patterns
> - [`prowler-test-api`](../skills/prowler-test-api/SKILL.md) - Testing patterns (pytest-django)
> - [`prowler-attack-paths-query`](../skills/prowler-attack-paths-query/SKILL.md) - Attack Paths openCypher queries
> - [`django-drf`](../skills/django-drf/SKILL.md) - Generic DRF patterns
> - [`jsonapi`](../skills/jsonapi/SKILL.md) - Strict JSON:API v1.1 spec compliance
> - [`pytest`](../skills/pytest/SKILL.md) - Generic pytest patterns
### Auto-invoke Skills
When performing these actions, ALWAYS invoke the corresponding skill FIRST:
| Action | Skill |
|--------|-------|
| Add changelog entry for a PR or feature | `prowler-changelog` |
| Adding DRF pagination or permissions | `django-drf` |
| Adding privilege escalation detection queries | `prowler-attack-paths-query` |
| Committing changes | `prowler-commit` |
| Create PR that requires changelog entry | `prowler-changelog` |
| Creating API endpoints | `jsonapi` |
| Creating Attack Paths queries | `prowler-attack-paths-query` |
| Creating ViewSets, serializers, or filters in api/ | `django-drf` |
| Creating a git commit | `prowler-commit` |
| Creating/modifying models, views, serializers | `prowler-api` |
| Implementing JSON:API endpoints | `django-drf` |
| Modifying API responses | `jsonapi` |
| Review changelog format and conventions | `prowler-changelog` |
| Reviewing JSON:API compliance | `jsonapi` |
| Testing RLS tenant isolation | `prowler-test-api` |
| Update CHANGELOG.md in any component | `prowler-changelog` |
| Updating existing Attack Paths queries | `prowler-attack-paths-query` |
| Writing Prowler API tests | `prowler-test-api` |
| Writing Python tests with pytest | `pytest` |
---
## CRITICAL RULES - NON-NEGOTIABLE
### Models
- ALWAYS: UUIDv4 PKs, `inserted_at`/`updated_at` timestamps, `JSONAPIMeta` class
- ALWAYS: Inherit from `RowLevelSecurityProtectedModel` for tenant-scoped data
- NEVER: Auto-increment integer PKs, models without tenant isolation
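A framework-free sketch of these model rules (dataclasses stand in for Django fields and `RowLevelSecurityProtectedModel`; the class name is illustrative):

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone


def _utcnow() -> datetime:
    return datetime.now(timezone.utc)


@dataclass
class ProviderRecord:
    # Tenant isolation is mandatory for tenant-scoped data.
    tenant_id: uuid.UUID
    # UUIDv4 primary key -- never an auto-increment integer.
    id: uuid.UUID = field(default_factory=uuid.uuid4)
    inserted_at: datetime = field(default_factory=_utcnow)
    updated_at: datetime = field(default_factory=_utcnow)
```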
### Serializers
- ALWAYS: Separate serializers for Create/Update operations
- ALWAYS: Inherit from `RLSSerializer` for tenant-scoped models
- NEVER: Write logic in serializers (use services/utils)
### Views
- ALWAYS: Inherit from `BaseRLSViewSet` for tenant-scoped resources
- ALWAYS: Define `filterset_class`, use `@extend_schema` for OpenAPI
- NEVER: Raw SQL queries, business logic in views
### Row-Level Security (RLS)
- ALWAYS: Use `rls_transaction(tenant_id)` context manager
- NEVER: Query across tenants, trust client-provided tenant_id
### Celery Tasks
- ALWAYS: `@shared_task` with `name`, `queue`, `RLSTask` base class
- NEVER: Long-running ops in views, request context in tasks
---
## DECISION TREES
### Serializer Selection
```
Read → <Model>Serializer
Create → <Model>CreateSerializer
Update → <Model>UpdateSerializer
Nested read → <Model>IncludeSerializer
```
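The tree can be restated as a lookup (a hypothetical sketch: real ViewSets override DRF's `get_serializer_class`, and the serializer classes here are empty stand-ins):

```python
class ProviderSerializer: ...
class ProviderCreateSerializer: ...
class ProviderUpdateSerializer: ...


# DRF action name -> serializer; reads fall back to the default.
_ACTION_SERIALIZERS = {
    "create": ProviderCreateSerializer,
    "update": ProviderUpdateSerializer,
    "partial_update": ProviderUpdateSerializer,
}


def get_serializer_class(action: str):
    return _ACTION_SERIALIZERS.get(action, ProviderSerializer)
```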
### Task vs View
```
< 100ms → View
> 100ms or external API → Celery task
Needs retry → Celery task
```
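The same rule of thumb as a function (illustrative only; the 100 ms threshold is guidance, not enforced anywhere in the codebase):

```python
def execution_target(expected_ms: float,
                     calls_external_api: bool = False,
                     needs_retry: bool = False) -> str:
    # Anything slow, external, or retryable belongs in a Celery task.
    if expected_ms > 100 or calls_external_api or needs_retry:
        return "celery task"
    return "view"
```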
---
## TECH STACK
Django 5.1.x | DRF 3.15.x | djangorestframework-jsonapi 7.x | Celery 5.4.x | PostgreSQL 16 | pytest 8.x
---
## PROJECT STRUCTURE
```
api/src/backend/
├── api/ # Main Django app
│ ├── v1/ # API version 1 (views, serializers, urls)
│ ├── models.py # Django models
│ ├── filters.py # FilterSet classes
│ ├── base_views.py # Base ViewSet classes
│ ├── rls.py # Row-Level Security
│ └── tests/ # Unit tests
├── config/ # Django configuration
└── tasks/ # Celery tasks
```
---
## COMMANDS
```bash
# Development
poetry run python src/backend/manage.py runserver
poetry run celery -A config.celery worker -l INFO
# Database
poetry run python src/backend/manage.py makemigrations
poetry run python src/backend/manage.py migrate
# Testing & Linting
poetry run pytest -x --tb=short
poetry run make lint
```
---
## QA CHECKLIST
- [ ] `poetry run pytest` passes
- [ ] `poetry run make lint` passes
- [ ] Migrations created if models changed
- [ ] New endpoints have `@extend_schema` decorators
- [ ] RLS properly applied for tenant data
- [ ] Tests cover success and error cases
---
## NAMING CONVENTIONS
| Entity | Pattern | Example |
|--------|---------|---------|
| Serializer (read) | `<Model>Serializer` | `ProviderSerializer` |
| Serializer (create) | `<Model>CreateSerializer` | `ProviderCreateSerializer` |
| Serializer (update) | `<Model>UpdateSerializer` | `ProviderUpdateSerializer` |
| Filter | `<Model>Filter` | `ProviderFilter` |
| ViewSet | `<Model>ViewSet` | `ProviderViewSet` |
| Task | `<action>_<entity>_task` | `sync_provider_resources_task` |
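The task-name pattern can be checked mechanically (a hypothetical helper, not part of the codebase):

```python
import re

# <action>_<entity>_task -- lowercase words joined by underscores,
# requiring at least an action and an entity before the suffix.
_TASK_NAME_RE = re.compile(r"^[a-z]+(?:_[a-z]+)+_task$")


def is_valid_task_name(name: str) -> bool:
    return _TASK_NAME_RE.fullmatch(name) is not None
```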
---
## API CONVENTIONS (JSON:API)
```json
{
"data": {
"type": "providers",
"id": "uuid",
"attributes": { "name": "value" },
"relationships": { "tenant": { "data": { "type": "tenants", "id": "uuid" } } }
}
}
```
- Content-Type: `application/vnd.api+json`
- Pagination: `?page[number]=1&page[size]=20`
- Filtering: `?filter[field]=value`, `?filter[field__in]=val1,val2`
- Sorting: `?sort=field`, `?sort=-field`
- Including: `?include=provider,findings`
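These conventions can be assembled into a query string with a small helper (illustrative only, not part of the API client):

```python
from urllib.parse import urlencode


def jsonapi_query(filters=None, sort=None, include=None,
                  page_number=None, page_size=None) -> str:
    # Build the bracketed JSON:API parameter names shown above.
    params = {}
    for field, value in (filters or {}).items():
        params[f"filter[{field}]"] = value
    if sort:
        params["sort"] = ",".join(sort)
    if include:
        params["include"] = ",".join(include)
    if page_number is not None:
        params["page[number]"] = page_number
    if page_size is not None:
        params["page[size]"] = page_size
    # Keep brackets and commas readable instead of percent-encoding them.
    return urlencode(params, safe="[],-")


print(jsonapi_query(filters={"field__in": "val1,val2"},
                    sort=["-inserted_at"], page_number=1, page_size=20))
# → filter[field__in]=val1,val2&sort=-inserted_at&page[number]=1&page[size]=20
```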
@@ -2,43 +2,128 @@
All notable changes to the **Prowler API** are documented in this file.
## [1.20.0] (Prowler UNRELEASED)
### 🚀 Added
- OpenStack provider support [(#10003)](https://github.com/prowler-cloud/prowler/pull/10003)
### 🔄 Changed
- Attack Paths: Queries definition now has short description and attribution [(#9983)](https://github.com/prowler-cloud/prowler/pull/9983)
- Attack Paths: Internet node is now created during the scan [(#9992)](https://github.com/prowler-cloud/prowler/pull/9992)
- Attack Paths: Add full paths set from [pathfinding.cloud](https://pathfinding.cloud/) [(#10008)](https://github.com/prowler-cloud/prowler/pull/10008)
- Support CSA CCM 4.0 for the AWS provider [(#10018)](https://github.com/prowler-cloud/prowler/pull/10018)
- Support CSA CCM 4.0 for the GCP provider [(#10042)](https://github.com/prowler-cloud/prowler/pull/10042)
- Support CSA CCM 4.0 for the Azure provider [(#10039)](https://github.com/prowler-cloud/prowler/pull/10039)
- Support CSA CCM 4.0 for the Oracle Cloud provider [(#10057)](https://github.com/prowler-cloud/prowler/pull/10057)
- Support CSA CCM 4.0 for the Alibaba Cloud provider [(#10061)](https://github.com/prowler-cloud/prowler/pull/10061)
- Attack Paths: Mark Attack Paths scan as failed when Celery task fails outside job error handling [(#10065)](https://github.com/prowler-cloud/prowler/pull/10065)
- Attack Paths: Remove legacy per-scan `graph_database` and `is_graph_database_deleted` fields from AttackPathsScan model [(#10077)](https://github.com/prowler-cloud/prowler/pull/10077)
### 🔐 Security
- Bump `Pillow` to 12.1.1 (CVE-2021-25289) [(#10027)](https://github.com/prowler-cloud/prowler/pull/10027)
---
## [1.19.2] (Prowler v5.18.2)
### 🐞 Fixed
- SAML role mapping now prevents removing the last MANAGE_ACCOUNT user [(#10007)](https://github.com/prowler-cloud/prowler/pull/10007)
---
## [1.19.0] (Prowler v5.18.0)
### 🚀 Added
- Cloudflare provider support [(#9907)](https://github.com/prowler-cloud/prowler/pull/9907)
- Attack Paths: Bedrock Code Interpreter and AttachRolePolicy privilege escalation queries [(#9885)](https://github.com/prowler-cloud/prowler/pull/9885)
- `provider_id` and `provider_id__in` filters for resources endpoints (`GET /resources` and `GET /resources/metadata/latest`) [(#9864)](https://github.com/prowler-cloud/prowler/pull/9864)
- Added memory optimizations for large compliance report generation [(#9444)](https://github.com/prowler-cloud/prowler/pull/9444)
- `GET /api/v1/resources/{id}/events` endpoint to retrieve AWS resource modification history from CloudTrail [(#9101)](https://github.com/prowler-cloud/prowler/pull/9101)
- Partial index on findings to speed up new failed findings queries [(#9904)](https://github.com/prowler-cloud/prowler/pull/9904)
### 🔄 Changed
- Lazy-load providers and compliance data to reduce API/worker startup memory and time [(#9857)](https://github.com/prowler-cloud/prowler/pull/9857)
- Attack Paths: Pinned Cartography to version `0.126.1`, adding AWS scans for SageMaker, CloudFront and Bedrock [(#9893)](https://github.com/prowler-cloud/prowler/issues/9893)
- Remove unused indexes [(#9904)](https://github.com/prowler-cloud/prowler/pull/9904)
- Attack Paths: Modified the behaviour of the Cartography scans to use the same Neo4j database per tenant, instead of individual databases per scans [(#9955)](https://github.com/prowler-cloud/prowler/pull/9955)
### 🐞 Fixed
- Attack Paths: `aws-security-groups-open-internet-facing` query returning no results due to incorrect relationship matching [(#9892)](https://github.com/prowler-cloud/prowler/pull/9892)
---
## [1.18.1] (Prowler v5.17.1)
### 🐞 Fixed
- Improve API startup process by `manage.py` argument detection [(#9856)](https://github.com/prowler-cloud/prowler/pull/9856)
- Deleting providers don't try to delete a `None` Neo4j database when an Attack Paths scan is scheduled [(#9858)](https://github.com/prowler-cloud/prowler/pull/9858)
- Use replica database for reading Findings to add them to the Attack Paths graph [(#9861)](https://github.com/prowler-cloud/prowler/pull/9861)
- Attack paths findings loading query to use streaming generator for O(batch_size) memory instead of O(total_findings) [(#9862)](https://github.com/prowler-cloud/prowler/pull/9862)
- Lazy load Neo4j driver [(#9868)](https://github.com/prowler-cloud/prowler/pull/9868)
- Use `Findings.all_objects` to avoid the `ActiveProviderPartitionedManager` [(#9869)](https://github.com/prowler-cloud/prowler/pull/9869)
- Lazy load Neo4j driver for workers only [(#9872)](https://github.com/prowler-cloud/prowler/pull/9872)
- Improve Cypher query for inserting Findings into Attack Paths scan graphs [(#9874)](https://github.com/prowler-cloud/prowler/pull/9874)
- Clear Neo4j database cache after Attack Paths scan and each API query [(#9877)](https://github.com/prowler-cloud/prowler/pull/9877)
- Deduplicated scheduled scans for long-running providers [(#9829)](https://github.com/prowler-cloud/prowler/pull/9829)
---
## [1.18.0] (Prowler v5.17.0)
### 🚀 Added
- `/api/v1/overviews/compliance-watchlist` endpoint to retrieve the compliance watchlist [(#9596)](https://github.com/prowler-cloud/prowler/pull/9596)
- AlibabaCloud provider support [(#9485)](https://github.com/prowler-cloud/prowler/pull/9485)
- `/api/v1/overviews/resource-groups` endpoint to retrieve an overview of resource groups based on finding severities [(#9694)](https://github.com/prowler-cloud/prowler/pull/9694)
- `group` filter for `GET /findings` and `GET /findings/metadata/latest` endpoints [(#9694)](https://github.com/prowler-cloud/prowler/pull/9694)
- `provider_id` and `provider_id__in` filter aliases for findings endpoints to enable consistent frontend parameter naming [(#9701)](https://github.com/prowler-cloud/prowler/pull/9701)
- Attack Paths: `/api/v1/attack-paths-scans` for AWS providers backed by Neo4j [(#9805)](https://github.com/prowler-cloud/prowler/pull/9805)
### 🔐 Security
- Django 5.1.15 (CVE-2025-64460, CVE-2025-13372), Werkzeug 3.1.4 (CVE-2025-66221), sqlparse 0.5.5 (PVE-2025-82038), fonttools 4.60.2 (CVE-2025-66034) [(#9730)](https://github.com/prowler-cloud/prowler/pull/9730)
- `safety` to `3.7.0` and `filelock` to `3.20.3` due to [Safety vulnerability 82754 (CVE-2025-68146)](https://data.safetycli.com/v/82754/97c/) [(#9816)](https://github.com/prowler-cloud/prowler/pull/9816)
- `pyasn1` to v0.6.2 to address [CVE-2026-23490](https://nvd.nist.gov/vuln/detail/CVE-2026-23490) [(#9818)](https://github.com/prowler-cloud/prowler/pull/9818)
- `django-allauth[saml]` to v65.13.0 to address [CVE-2025-65431](https://nvd.nist.gov/vuln/detail/CVE-2025-65431) [(#9575)](https://github.com/prowler-cloud/prowler/pull/9575)
---
## [1.17.1] (Prowler v5.16.1)
### 🔄 Changed
- Security Hub integration error when no regions [(#9635)](https://github.com/prowler-cloud/prowler/pull/9635)
### 🐞 Fixed
- Orphan scheduled scans caused by transaction isolation during provider creation [(#9633)](https://github.com/prowler-cloud/prowler/pull/9633)
---
## [1.17.0] (Prowler v5.16.0)
### 🚀 Added
- New endpoint to retrieve an overview of the categories based on finding severities [(#9529)](https://github.com/prowler-cloud/prowler/pull/9529)
- Endpoints `GET /findings` and `GET /findings/latests` can now use the category filter [(#9529)](https://github.com/prowler-cloud/prowler/pull/9529)
- Account id, alias and provider name to PDF reporting table [(#9574)](https://github.com/prowler-cloud/prowler/pull/9574)
### 🔄 Changed
- Endpoint `GET /overviews/attack-surfaces` no longer returns the related check IDs [(#9529)](https://github.com/prowler-cloud/prowler/pull/9529)
- OpenAI provider to only load chat-compatible models with tool calling support [(#9523)](https://github.com/prowler-cloud/prowler/pull/9523)
- Increased execution delay for the first scheduled scan tasks to 5 seconds [(#9558)](https://github.com/prowler-cloud/prowler/pull/9558)
### 🐞 Fixed
- Made `scan_id` a required filter in the compliance overview endpoint [(#9560)](https://github.com/prowler-cloud/prowler/pull/9560)
- Reduced unnecessary UPDATE resources operations by only saving when tag mappings change, lowering write load during scans [(#9569)](https://github.com/prowler-cloud/prowler/pull/9569)
@@ -46,19 +131,22 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.16.1] (Prowler v5.15.1)
### 🐞 Fixed
- Race condition in scheduled scan creation by adding countdown to task [(#9516)](https://github.com/prowler-cloud/prowler/pull/9516)
## [1.16.0] (Prowler v5.15.0)
### 🚀 Added
- New endpoint to retrieve an overview of the attack surfaces [(#9309)](https://github.com/prowler-cloud/prowler/pull/9309)
- New endpoint `GET /api/v1/overviews/findings_severity/timeseries` to retrieve daily aggregated findings by severity level [(#9363)](https://github.com/prowler-cloud/prowler/pull/9363)
- Lighthouse AI support for Amazon Bedrock API key [(#9343)](https://github.com/prowler-cloud/prowler/pull/9343)
- Exception handler for provider deletions during scans [(#9414)](https://github.com/prowler-cloud/prowler/pull/9414)
- Support to use admin credentials through the read replica database [(#9440)](https://github.com/prowler-cloud/prowler/pull/9440)
### 🔄 Changed
- Error messages from Lighthouse celery tasks [(#9165)](https://github.com/prowler-cloud/prowler/pull/9165)
- Restore the compliance overview endpoint's mandatory filters [(#9338)](https://github.com/prowler-cloud/prowler/pull/9338)
@@ -66,7 +154,8 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.15.2] (Prowler v5.14.2)
### 🐞 Fixed
- Unique constraint violation during compliance overviews task [(#9436)](https://github.com/prowler-cloud/prowler/pull/9436)
- Division by zero error in ENS PDF report when all requirements are manual [(#9443)](https://github.com/prowler-cloud/prowler/pull/9443)
@@ -74,7 +163,8 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.15.1] (Prowler v5.14.1)
### 🐞 Fixed
- Fix typo in PDF reporting [(#9345)](https://github.com/prowler-cloud/prowler/pull/9345)
- Fix IaC provider initialization failure when mutelist processor is configured [(#9331)](https://github.com/prowler-cloud/prowler/pull/9331)
- Match logic for ThreatScore when counting findings [(#9348)](https://github.com/prowler-cloud/prowler/pull/9348)
@@ -83,7 +173,8 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.15.0] (Prowler v5.14.0)
### 🚀 Added
- IaC (Infrastructure as Code) provider support for remote repositories [(#8751)](https://github.com/prowler-cloud/prowler/pull/8751)
- Extend `GET /api/v1/providers` with provider-type filters and optional pagination disable to support the new Overview filters [(#8975)](https://github.com/prowler-cloud/prowler/pull/8975)
- New endpoint to retrieve the number of providers grouped by provider type [(#8975)](https://github.com/prowler-cloud/prowler/pull/8975)
@@ -102,11 +193,13 @@ All notable changes to the **Prowler API** are documented in this file.
- Enhanced compliance overview endpoint with provider filtering and latest scan aggregation [(#9244)](https://github.com/prowler-cloud/prowler/pull/9244)
- New endpoint `GET /api/v1/overview/regions` to retrieve aggregated findings data by region [(#9273)](https://github.com/prowler-cloud/prowler/pull/9273)
### 🔄 Changed
- Optimized database write queries for scan related tasks [(#9190)](https://github.com/prowler-cloud/prowler/pull/9190)
- Date filters are now optional for `GET /api/v1/overviews/services` endpoint; returns latest scan data by default [(#9248)](https://github.com/prowler-cloud/prowler/pull/9248)
### 🐞 Fixed
- Scans no longer fail when findings have UIDs exceeding 300 characters; such findings are now skipped with detailed logging [(#9246)](https://github.com/prowler-cloud/prowler/pull/9246)
- Updated unique constraint for `Provider` model to exclude soft-deleted entries, resolving duplicate errors when re-deleting providers [(#9054)](https://github.com/prowler-cloud/prowler/pull/9054)
- Removed compliance generation for providers without compliance frameworks [(#9208)](https://github.com/prowler-cloud/prowler/pull/9208)
@@ -114,14 +207,16 @@ All notable changes to the **Prowler API** are documented in this file.
- Severity overview endpoint now ignores muted findings as expected [(#9283)](https://github.com/prowler-cloud/prowler/pull/9283)
- Fixed discrepancy between ThreatScore PDF report values and database calculations [(#9296)](https://github.com/prowler-cloud/prowler/pull/9296)
### 🔐 Security
- Django updated to the latest 5.1 security release, 5.1.14, due to problems with potential [SQL injection](https://github.com/prowler-cloud/prowler/security/dependabot/113) and [denial-of-service vulnerability](https://github.com/prowler-cloud/prowler/security/dependabot/114) [(#9176)](https://github.com/prowler-cloud/prowler/pull/9176)
---
## [1.14.1] (Prowler v5.13.1)
### 🐞 Fixed
- `/api/v1/overviews/providers` collapses data by provider type so the UI receives a single aggregated record per cloud family even when multiple accounts exist [(#9053)](https://github.com/prowler-cloud/prowler/pull/9053)
- Added retry logic to database transactions to handle Aurora read replica connection failures during scale-down events [(#9064)](https://github.com/prowler-cloud/prowler/pull/9064)
- Security Hub integrations stop failing when they read relationships via the replica by allowing replica relations and saving updates through the primary [(#9080)](https://github.com/prowler-cloud/prowler/pull/9080)
@@ -130,7 +225,8 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.14.0] (Prowler v5.13.0)
### 🚀 Added
- Default JWT keys are generated and stored if they are missing from configuration [(#8655)](https://github.com/prowler-cloud/prowler/pull/8655)
- `compliance_name` for each compliance [(#7920)](https://github.com/prowler-cloud/prowler/pull/7920)
- Support C5 compliance framework for the AWS provider [(#8830)](https://github.com/prowler-cloud/prowler/pull/8830)
@@ -143,35 +239,41 @@ All notable changes to the **Prowler API** are documented in this file.
- Support Common Cloud Controls for AWS, Azure and GCP [(#8000)](https://github.com/prowler-cloud/prowler/pull/8000)
- Add `provider_id__in` filter support to findings and findings severity overview endpoints [(#8951)](https://github.com/prowler-cloud/prowler/pull/8951)
### 🔄 Changed
- Now the MANAGE_ACCOUNT permission is required to modify or read user permissions instead of MANAGE_USERS [(#8281)](https://github.com/prowler-cloud/prowler/pull/8281)
- Now at least one user with MANAGE_ACCOUNT permission is required in the tenant [(#8729)](https://github.com/prowler-cloud/prowler/pull/8729)
### 🔐 Security
- Django updated to the latest 5.1 security release, 5.1.13, due to problems with potential [SQL injection](https://github.com/prowler-cloud/prowler/security/dependabot/104) and [directory traversals](https://github.com/prowler-cloud/prowler/security/dependabot/103) [(#8842)](https://github.com/prowler-cloud/prowler/pull/8842)
---
## [1.13.2] (Prowler v5.12.3)
### 🐞 Fixed
- 500 error when deleting user [(#8731)](https://github.com/prowler-cloud/prowler/pull/8731)
---
## [1.13.1] (Prowler v5.12.2)
### 🔄 Changed
- Renamed compliance overview task queue to `compliance` [(#8755)](https://github.com/prowler-cloud/prowler/pull/8755)
### 🔐 Security
- Django updated to the latest 5.1 security release, 5.1.12, due to [problems](https://www.djangoproject.com/weblog/2025/sep/03/security-releases/) with potential SQL injection in FilteredRelation column aliases [(#8693)](https://github.com/prowler-cloud/prowler/pull/8693)
---
## [1.13.0] (Prowler v5.12.0)
### 🚀 Added
- Integration with JIRA, enabling sending findings to a JIRA project [(#8622)](https://github.com/prowler-cloud/prowler/pull/8622), [(#8637)](https://github.com/prowler-cloud/prowler/pull/8637)
- `GET /overviews/findings_severity` now supports `filter[status]` and `filter[status__in]` to aggregate by specific statuses (`FAIL`, `PASS`)[(#8186)](https://github.com/prowler-cloud/prowler/pull/8186)
- Throttling options for `/api/v1/tokens` using the `DJANGO_THROTTLE_TOKEN_OBTAIN` environment variable [(#8647)](https://github.com/prowler-cloud/prowler/pull/8647)
@@ -180,101 +282,120 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.12.0] (Prowler v5.11.0)
### 🚀 Added
- Lighthouse support for OpenAI GPT-5 [(#8527)](https://github.com/prowler-cloud/prowler/pull/8527)
- Integration with Amazon Security Hub, enabling sending findings to Security Hub [(#8365)](https://github.com/prowler-cloud/prowler/pull/8365)
- Generate ASFF output for AWS providers with SecurityHub integration enabled [(#8569)](https://github.com/prowler-cloud/prowler/pull/8569)
### 🐞 Fixed
- GitHub provider always scans user instead of organization when using provider UID [(#8587)](https://github.com/prowler-cloud/prowler/pull/8587)
---
## [1.11.0] (Prowler v5.10.0)
### 🚀 Added
- Github provider support [(#8271)](https://github.com/prowler-cloud/prowler/pull/8271)
- Integration with Amazon S3, enabling storage and retrieval of scan data via S3 buckets [(#8056)](https://github.com/prowler-cloud/prowler/pull/8056)
### 🐞 Fixed
- Avoid sending errors to Sentry in M365 provider when user authentication fails [(#8420)](https://github.com/prowler-cloud/prowler/pull/8420)
---
## [1.10.2] (Prowler v5.9.2)
### 🔄 Changed
- Optimized queries for resources views [(#8336)](https://github.com/prowler-cloud/prowler/pull/8336)
---
## [v1.10.1] (Prowler v5.9.1)
### 🐞 Fixed
- Calculate failed findings during scans to prevent heavy database queries [(#8322)](https://github.com/prowler-cloud/prowler/pull/8322)
---
## [v1.10.0] (Prowler v5.9.0)
### 🚀 Added
- SSO with SAML support [(#8175)](https://github.com/prowler-cloud/prowler/pull/8175)
- `GET /resources/metadata`, `GET /resources/metadata/latest` and `GET /resources/latest` to expose resource metadata and latest scan results [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
### 🔄 Changed
- `/processors` endpoints to post-process findings. Currently, only the Mutelist processor is supported, allowing findings to be muted.
- Optimized the underlying queries for resources endpoints [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
- Optimized include parameters for resources view [(#8229)](https://github.com/prowler-cloud/prowler/pull/8229)
- Optimized overview background tasks [(#8300)](https://github.com/prowler-cloud/prowler/pull/8300)
### 🐞 Fixed
- Search filter for findings and resources [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
- RBAC is now applied to `GET /overviews/providers` [(#8277)](https://github.com/prowler-cloud/prowler/pull/8277)
### 🔄 Changed
- `POST /schedules/daily` returns a `409 CONFLICT` if already created [(#8258)](https://github.com/prowler-cloud/prowler/pull/8258)
### 🔐 Security
- Enhanced password validation to enforce 12+ character passwords with special characters, uppercase, lowercase, and numbers [(#8225)](https://github.com/prowler-cloud/prowler/pull/8225)
---
## [v1.9.1] (Prowler v5.8.1)
### 🚀 Added
- Custom exception for provider connection errors during scans [(#8234)](https://github.com/prowler-cloud/prowler/pull/8234)
### 🔄 Changed
- Summary and overview tasks now use a dedicated queue and no longer propagate errors to compliance tasks [(#8214)](https://github.com/prowler-cloud/prowler/pull/8214)
### 🐞 Fixed
- Scan with no resources will not trigger legacy code for findings metadata [(#8183)](https://github.com/prowler-cloud/prowler/pull/8183)
- Invitation email comparison case-insensitive [(#8206)](https://github.com/prowler-cloud/prowler/pull/8206)
### Removed
- Validation of the provider's secret type during updates [(#8197)](https://github.com/prowler-cloud/prowler/pull/8197)
---
## [v1.9.0] (Prowler v5.8.0)
### 🚀 Added
- Support GCP Service Account key [(#7824)](https://github.com/prowler-cloud/prowler/pull/7824)
- `GET /compliance-overviews` endpoints to retrieve compliance metadata and specific requirements statuses [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Lighthouse configuration support [(#7848)](https://github.com/prowler-cloud/prowler/pull/7848)
### 🔄 Changed
- Reworked `GET /compliance-overviews` to return proper requirement metrics [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Optional `user` and `password` for M365 provider [(#7992)](https://github.com/prowler-cloud/prowler/pull/7992)
### 🐞 Fixed
- Scheduled scans are no longer deleted when their daily schedule run is disabled [(#8082)](https://github.com/prowler-cloud/prowler/pull/8082)
---
## [v1.8.5] (Prowler v5.7.5)
### 🐞 Fixed
- Normalize provider UID to ensure safe and unique export directory paths [(#8007)](https://github.com/prowler-cloud/prowler/pull/8007).
- Blank resource types in `/metadata` endpoints [(#8027)](https://github.com/prowler-cloud/prowler/pull/8027)
@@ -282,20 +403,24 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.8.4] (Prowler v5.7.4)
### Removed
- Reverted RLS transaction handling and DB custom backend [(#7994)](https://github.com/prowler-cloud/prowler/pull/7994)
---
## [v1.8.3] (Prowler v5.7.3)
### 🚀 Added
- Database backend to handle already closed connections [(#7935)](https://github.com/prowler-cloud/prowler/pull/7935)
### 🔄 Changed
- Renamed field encrypted_password to password for M365 provider [(#7784)](https://github.com/prowler-cloud/prowler/pull/7784)
### 🐞 Fixed
- Transaction persistence with RLS operations [(#7916)](https://github.com/prowler-cloud/prowler/pull/7916)
- Reverted the change `get_with_retry` to use the original `get` method for retrieving tasks [(#7932)](https://github.com/prowler-cloud/prowler/pull/7932)
@@ -303,7 +428,8 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.8.2] (Prowler v5.7.2)
### 🐞 Fixed
- Task lookup to use task_kwargs instead of task_args for scan report resolution [(#7830)](https://github.com/prowler-cloud/prowler/pull/7830)
- Kubernetes UID validation to allow valid context names [(#7871)](https://github.com/prowler-cloud/prowler/pull/7871)
- Connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
@@ -314,14 +440,16 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.8.1] (Prowler v5.7.1)
### 🐞 Fixed
- Added database index to improve performance on finding lookup [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800)
---
## [v1.8.0] (Prowler v5.7.0)
### 🚀 Added
- Huge improvements to `/findings/metadata` and resource related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
@@ -332,7 +460,7 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.7.0] (Prowler v5.6.0)
### 🚀 Added
- M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563)
- `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653)
@@ -342,7 +470,7 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.6.0] (Prowler v5.5.0)
### 🚀 Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167)
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289)
@@ -354,14 +482,16 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.5.4] (Prowler v5.4.4)
### 🐞 Fixed
- Bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466)
---
## [v1.5.3] (Prowler v5.4.3)
### 🐞 Fixed
- Duplicated scheduled scans handling [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401)
- Environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423)
@@ -369,14 +499,16 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.5.2] (Prowler v5.4.2)
### 🔄 Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349)
---
## [v1.5.1] (Prowler v5.4.1)
### 🐞 Fixed
- Handle response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183)
- Race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172)
- Handle exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283)
@@ -385,19 +517,22 @@ All notable changes to the **Prowler API** are documented in this file.
## [v1.5.0] (Prowler v5.4.0)
### 🚀 Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906)
- API scan report system: all scans launched from the API now generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878)
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874)
### 🔄 Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019)
---
## [v1.4.0] (Prowler v5.3.0)
### 🔄 Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700)
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800)
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863)
+1 -1
@@ -32,7 +32,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill,overview,integrations,compliance,attack-paths-scans -E --max-tasks-per-child 1
}
start_worker_beat() {
+3687 -2154
File diff suppressed because it is too large.
+7 -4
@@ -5,10 +5,10 @@ requires = ["poetry-core"]
[project]
authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django (==5.1.15)",
"django-allauth[saml] (>=65.13.0,<66.0.0)",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
@@ -36,6 +36,8 @@ dependencies = [
"drf-simple-apikey (==2.2.1)",
"matplotlib (>=3.10.6,<4.0.0)",
"reportlab (>=4.4.4,<5.0.0)",
"neo4j (<6.0.0)",
"cartography @ git+https://github.com/prowler-cloud/cartography@0.126.1",
"gevent (>=25.9.1,<26.0.0)",
"werkzeug (>=3.1.4)",
"sqlparse (>=0.5.4)",
@@ -47,7 +49,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.20.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -57,6 +59,7 @@ bandit = "1.7.9"
coverage = "7.5.4"
django-silk = "5.3.2"
docker = "7.1.0"
filelock = "3.20.3"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
@@ -68,6 +71,6 @@ pytest-env = "1.1.3"
pytest-randomly = "3.15.0"
pytest-xdist = "3.6.1"
ruff = "0.5.0"
safety = "3.7.0"
tqdm = "4.67.1"
vulture = "2.14"
+37 -5
@@ -30,16 +30,48 @@ class ApiConfig(AppConfig):
def ready(self):
from api import schema_extensions # noqa: F401
from api import signals # noqa: F401
from api.compliance import load_prowler_compliance
from api.attack_paths import database as graph_database
# Generate required cryptographic keys if not present, but only if:
# `"manage.py" not in sys.argv[0]`: If an external server (e.g., Gunicorn) is running the app
# `os.environ.get("RUN_MAIN")`: If it's not a Django command or using `runserver`,
# only the main process will do it
if (len(sys.argv) >= 1 and "manage.py" not in sys.argv[0]) or os.environ.get(
"RUN_MAIN"
):
self._ensure_crypto_keys()
load_prowler_compliance()
# Commands that don't need Neo4j
SKIP_NEO4J_DJANGO_COMMANDS = [
"makemigrations",
"migrate",
"pgpartition",
"check",
"help",
"showmigrations",
"check_and_fix_socialaccount_sites_migration",
]
# Skip Neo4j initialization during tests, some Django commands, and Celery
if getattr(settings, "TESTING", False) or (
len(sys.argv) > 1
and (
(
"manage.py" in sys.argv[0]
and sys.argv[1] in SKIP_NEO4J_DJANGO_COMMANDS
)
or "celery" in sys.argv[0]
)
):
logger.info(
"Skipping Neo4j initialization for tests, selected Django commands or Celery"
)
else:
graph_database.init_driver()
# Neo4j driver is initialized at API startup (see api.attack_paths.database)
# It remains lazy for Celery workers and selected Django commands
def _ensure_crypto_keys(self):
"""
@@ -54,7 +86,7 @@ class ApiConfig(AppConfig):
global _keys_initialized
# Skip key generation if running tests
if getattr(settings, "TESTING", False):
return
# Skip if already initialized in this process
@@ -0,0 +1,14 @@
from api.attack_paths.queries import (
AttackPathsQueryDefinition,
AttackPathsQueryParameterDefinition,
get_queries_for_provider,
get_query_by_id,
)
__all__ = [
"AttackPathsQueryDefinition",
"AttackPathsQueryParameterDefinition",
"get_queries_for_provider",
"get_query_by_id",
]
@@ -0,0 +1,181 @@
import atexit
import logging
import threading
from contextlib import contextmanager
from typing import Iterator
from uuid import UUID
import neo4j
import neo4j.exceptions
from django.conf import settings
from api.attack_paths.retryable_session import RetryableSession
from tasks.jobs.attack_paths.config import BATCH_SIZE, PROVIDER_RESOURCE_LABEL
# Without this, Celery output is flooded with Neo4j logging
logging.getLogger("neo4j").setLevel(logging.ERROR)
logging.getLogger("neo4j").propagate = False
SERVICE_UNAVAILABLE_MAX_RETRIES = 3
# Module-level process-wide driver singleton
_driver: neo4j.Driver | None = None
_lock = threading.Lock()
# Base Neo4j functions
def get_uri() -> str:
host = settings.DATABASES["neo4j"]["HOST"]
port = settings.DATABASES["neo4j"]["PORT"]
return f"bolt://{host}:{port}"
def init_driver() -> neo4j.Driver:
global _driver
if _driver is not None:
return _driver
with _lock:
if _driver is None:
uri = get_uri()
config = settings.DATABASES["neo4j"]
_driver = neo4j.GraphDatabase.driver(
uri,
auth=(config["USER"], config["PASSWORD"]),
keep_alive=True,
max_connection_lifetime=7200,
connection_acquisition_timeout=120,
max_connection_pool_size=50,
)
_driver.verify_connectivity()
# Register cleanup handler (only runs once since we're inside the _driver is None block)
atexit.register(close_driver)
return _driver
def get_driver() -> neo4j.Driver:
return init_driver()
def close_driver() -> None:  # Registered via atexit in init_driver()
global _driver
with _lock:
if _driver is not None:
try:
_driver.close()
finally:
_driver = None
@contextmanager
def get_session(database: str | None = None) -> Iterator[RetryableSession]:
session_wrapper: RetryableSession | None = None
try:
session_wrapper = RetryableSession(
session_factory=lambda: get_driver().session(database=database),
max_retries=SERVICE_UNAVAILABLE_MAX_RETRIES,
)
yield session_wrapper
except neo4j.exceptions.Neo4jError as exc:
message = exc.message if exc.message is not None else str(exc)
raise GraphDatabaseQueryException(message=message, code=exc.code)
finally:
if session_wrapper is not None:
session_wrapper.close()
def create_database(database: str) -> None:
query = "CREATE DATABASE $database IF NOT EXISTS"
parameters = {"database": database}
with get_session() as session:
session.run(query, parameters)
def drop_database(database: str) -> None:
query = f"DROP DATABASE `{database}` IF EXISTS DESTROY DATA"
with get_session() as session:
session.run(query)
def drop_subgraph(database: str, provider_id: str) -> int:
"""
Delete all nodes for a provider from the tenant database.
Uses batched deletion to avoid memory issues with large graphs.
Silently returns 0 if the database doesn't exist.
"""
deleted_nodes = 0
parameters = {
"provider_id": provider_id,
"batch_size": BATCH_SIZE,
}
try:
with get_session(database) as session:
deleted_count = 1
while deleted_count > 0:
result = session.run(
f"""
MATCH (n:{PROVIDER_RESOURCE_LABEL} {{provider_id: $provider_id}})
WITH n LIMIT $batch_size
DETACH DELETE n
RETURN COUNT(n) AS deleted_nodes_count
""",
parameters,
)
deleted_count = result.single().get("deleted_nodes_count", 0)
deleted_nodes += deleted_count
except GraphDatabaseQueryException as exc:
if exc.code == "Neo.ClientError.Database.DatabaseNotFound":
return 0
raise
return deleted_nodes
def clear_cache(database: str) -> None:
query = "CALL db.clearQueryCaches()"
try:
with get_session(database) as session:
session.run(query)
except GraphDatabaseQueryException as exc:
logging.warning(f"Failed to clear query cache for database `{database}`: {exc}")
# Neo4j functions related to Prowler + Cartography
def get_database_name(entity_id: str | UUID, temporary: bool = False) -> str:
prefix = "tmp-scan" if temporary else "tenant"
return f"db-{prefix}-{str(entity_id).lower()}"
# Exceptions
class GraphDatabaseQueryException(Exception):
def __init__(self, message: str, code: str | None = None) -> None:
super().__init__(message)
self.message = message
self.code = code
def __str__(self) -> str:
if self.code:
return f"{self.code}: {self.message}"
return self.message
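The per-tenant database naming in `get_database_name` above is easy to verify standalone; this minimal sketch mirrors that helper (the UUID is an arbitrary example, not a real tenant):

```python
from uuid import UUID

def get_database_name(entity_id, temporary=False):
    # Mirrors api.attack_paths.database.get_database_name:
    # "tmp-scan" databases hold in-progress scan graphs, "tenant" the final ones.
    prefix = "tmp-scan" if temporary else "tenant"
    return f"db-{prefix}-{str(entity_id).lower()}"

tenant_db = get_database_name(UUID("a1b2c3d4-0000-0000-0000-000000000001"))
tmp_db = get_database_name("A1B2C3D4-0000-0000-0000-000000000001", temporary=True)
# tenant_db == "db-tenant-a1b2c3d4-0000-0000-0000-000000000001"
# tmp_db    == "db-tmp-scan-a1b2c3d4-0000-0000-0000-000000000001"
```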
@@ -0,0 +1,16 @@
from api.attack_paths.queries.types import (
AttackPathsQueryDefinition,
AttackPathsQueryParameterDefinition,
)
from api.attack_paths.queries.registry import (
get_queries_for_provider,
get_query_by_id,
)
__all__ = [
"AttackPathsQueryDefinition",
"AttackPathsQueryParameterDefinition",
"get_queries_for_provider",
"get_query_by_id",
]
File diff suppressed because it is too large.
@@ -0,0 +1,25 @@
from api.attack_paths.queries.types import AttackPathsQueryDefinition
from api.attack_paths.queries.aws import AWS_QUERIES
# Query definitions organized by provider
_QUERY_DEFINITIONS: dict[str, list[AttackPathsQueryDefinition]] = {
"aws": AWS_QUERIES,
}
# Flat lookup by query ID for O(1) access
_QUERIES_BY_ID: dict[str, AttackPathsQueryDefinition] = {
definition.id: definition
for definitions in _QUERY_DEFINITIONS.values()
for definition in definitions
}
def get_queries_for_provider(provider: str) -> list[AttackPathsQueryDefinition]:
"""Get all attack path queries for a specific provider."""
return _QUERY_DEFINITIONS.get(provider, [])
def get_query_by_id(query_id: str) -> AttackPathsQueryDefinition | None:
"""Get a specific attack path query by its ID."""
return _QUERIES_BY_ID.get(query_id)
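The registry above builds one flat ID index over the per-provider lists. A self-contained sketch of the same pattern, using stand-in dataclasses and a hypothetical query ID (not taken from the real `AWS_QUERIES`):

```python
from dataclasses import dataclass

# Minimal stand-in for api.attack_paths.queries.types.AttackPathsQueryDefinition
@dataclass
class AttackPathsQueryDefinition:
    id: str
    name: str
    provider: str
    cypher: str

AWS_QUERIES = [
    AttackPathsQueryDefinition(
        id="aws-public-ec2",  # hypothetical ID for illustration only
        name="Public EC2 instances",
        provider="aws",
        cypher="MATCH (i:EC2Instance {exposed_internet: true}) RETURN i",
    )
]

_QUERY_DEFINITIONS = {"aws": AWS_QUERIES}
# Flat lookup by query ID for O(1) access, as in the registry module
_QUERIES_BY_ID = {d.id: d for defs in _QUERY_DEFINITIONS.values() for d in defs}

def get_queries_for_provider(provider):
    return _QUERY_DEFINITIONS.get(provider, [])

def get_query_by_id(query_id):
    return _QUERIES_BY_ID.get(query_id)
```

Unknown providers and IDs fall through to an empty list and `None` respectively, so callers can distinguish "no queries" from an error.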
@@ -0,0 +1,39 @@
from dataclasses import dataclass, field
@dataclass
class AttackPathsQueryAttribution:
"""Source attribution for an Attack Path query."""
text: str
link: str
@dataclass
class AttackPathsQueryParameterDefinition:
"""
Metadata describing a parameter that must be provided to an Attack Paths query.
"""
name: str
label: str
data_type: str = "string"
cast: type = str
description: str | None = None
placeholder: str | None = None
@dataclass
class AttackPathsQueryDefinition:
"""
Immutable representation of an Attack Path query.
"""
id: str
name: str
short_description: str
description: str
provider: str
cypher: str
attribution: AttackPathsQueryAttribution | None = None
parameters: list[AttackPathsQueryParameterDefinition] = field(default_factory=list)
@@ -0,0 +1,92 @@
import logging
from collections.abc import Callable
from typing import Any
import neo4j
import neo4j.exceptions
logger = logging.getLogger(__name__)
class RetryableSession:
"""
Wrapper around `neo4j.Session` that retries `neo4j.exceptions.ServiceUnavailable` errors.
"""
def __init__(
self,
session_factory: Callable[[], neo4j.Session],
max_retries: int,
) -> None:
self._session_factory = session_factory
self._max_retries = max(0, max_retries)
self._session = self._session_factory()
def close(self) -> None:
if self._session is not None:
self._session.close()
self._session = None
def __enter__(self) -> "RetryableSession":
return self
def __exit__(
self, _: Any, __: Any, ___: Any
) -> None: # Unused args: exc_type, exc, exc_tb
self.close()
def run(self, *args: Any, **kwargs: Any) -> Any:
return self._call_with_retry("run", *args, **kwargs)
def write_transaction(self, *args: Any, **kwargs: Any) -> Any:
return self._call_with_retry("write_transaction", *args, **kwargs)
def read_transaction(self, *args: Any, **kwargs: Any) -> Any:
return self._call_with_retry("read_transaction", *args, **kwargs)
def execute_write(self, *args: Any, **kwargs: Any) -> Any:
return self._call_with_retry("execute_write", *args, **kwargs)
def execute_read(self, *args: Any, **kwargs: Any) -> Any:
return self._call_with_retry("execute_read", *args, **kwargs)
def __getattr__(self, item: str) -> Any:
return getattr(self._session, item)
def _call_with_retry(self, method_name: str, *args: Any, **kwargs: Any) -> Any:
attempt = 0
last_exc: Exception | None = None
while attempt <= self._max_retries:
try:
method = getattr(self._session, method_name)
return method(*args, **kwargs)
except (
BrokenPipeError,
ConnectionResetError,
neo4j.exceptions.ServiceUnavailable,
) as exc: # pragma: no cover - depends on infra
last_exc = exc
attempt += 1
if attempt > self._max_retries:
raise
logger.warning(
f"Neo4j session {method_name} failed with {type(exc).__name__} ({attempt}/{self._max_retries} attempts). Retrying..."
)
self._refresh_session()
raise last_exc if last_exc else RuntimeError("Unexpected retry loop exit")
def _refresh_session(self) -> None:
if self._session is not None:
try:
self._session.close()
except Exception:
# Best-effort close; failures just mean we open a new session below
pass
self._session = self._session_factory()
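The retry loop above closes the broken session and opens a fresh one from the factory before retrying. A toy demonstration of that behavior, with a fake flaky session standing in for `neo4j.Session` (names and exception class are illustrative, not the real driver):

```python
# Stand-in for neo4j.exceptions.ServiceUnavailable
class ServiceUnavailable(Exception):
    pass

class FlakySession:
    calls = 0  # class-level so the count survives session refreshes
    def run(self, query):
        FlakySession.calls += 1
        if FlakySession.calls < 3:
            raise ServiceUnavailable("connection lost")
        return f"ok: {query}"
    def close(self):
        pass

class RetryableSession:
    """Simplified version of the retry wrapper: rebuild the session on failure."""
    def __init__(self, session_factory, max_retries):
        self._session_factory = session_factory
        self._max_retries = max(0, max_retries)
        self._session = session_factory()
    def run(self, *args, **kwargs):
        attempt = 0
        while True:
            try:
                return self._session.run(*args, **kwargs)
            except ServiceUnavailable:
                attempt += 1
                if attempt > self._max_retries:
                    raise
                self._session.close()
                self._session = self._session_factory()  # refresh and retry

session = RetryableSession(FlakySession, max_retries=3)
result = session.run("RETURN 1")  # succeeds on the third attempt
```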
@@ -0,0 +1,147 @@
import logging
from typing import Any, Iterable
from rest_framework.exceptions import APIException, ValidationError
from api.attack_paths import database as graph_database, AttackPathsQueryDefinition
from config.custom_logging import BackendLogger
from tasks.jobs.attack_paths.config import INTERNAL_LABELS
logger = logging.getLogger(BackendLogger.API)
def normalize_run_payload(raw_data):
if not isinstance(raw_data, dict): # Let the serializer handle this
return raw_data
if "data" in raw_data and isinstance(raw_data.get("data"), dict):
data_section = raw_data.get("data") or {}
attributes = data_section.get("attributes") or {}
payload = {
"id": attributes.get("id", data_section.get("id")),
"parameters": attributes.get("parameters"),
}
# Remove `None` parameters to allow defaults downstream
if payload.get("parameters") is None:
payload.pop("parameters")
return payload
return raw_data
def prepare_query_parameters(
definition: AttackPathsQueryDefinition,
provided_parameters: dict[str, Any],
provider_uid: str,
) -> dict[str, Any]:
parameters = dict(provided_parameters or {})
expected_names = {parameter.name for parameter in definition.parameters}
provided_names = set(parameters.keys())
unexpected = provided_names - expected_names
if unexpected:
raise ValidationError(
{"parameters": f"Unknown parameter(s): {', '.join(sorted(unexpected))}"}
)
missing = expected_names - provided_names
if missing:
raise ValidationError(
{
"parameters": f"Missing required parameter(s): {', '.join(sorted(missing))}"
}
)
clean_parameters = {
"provider_uid": str(provider_uid),
}
for definition_parameter in definition.parameters:
raw_value = provided_parameters[definition_parameter.name]
try:
casted_value = definition_parameter.cast(raw_value)
except (ValueError, TypeError) as exc:
raise ValidationError(
{
"parameters": (
f"Invalid value for parameter `{definition_parameter.name}`: {str(exc)}"
)
}
)
clean_parameters[definition_parameter.name] = casted_value
return clean_parameters
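The validation above rejects unknown names, reports missing ones, then casts each value with the definition's `cast` callable. The same flow in a standalone sketch (plain dicts and `ValueError` stand in for the definition objects and DRF's `ValidationError`):

```python
def prepare(expected, provided, provider_uid):
    # expected: mapping of parameter name -> cast callable, e.g. {"max_hops": int}
    unexpected = set(provided) - set(expected)
    if unexpected:
        raise ValueError(f"Unknown parameter(s): {', '.join(sorted(unexpected))}")
    missing = set(expected) - set(provided)
    if missing:
        raise ValueError(f"Missing required parameter(s): {', '.join(sorted(missing))}")
    # provider_uid is always injected alongside the user-supplied parameters
    clean = {"provider_uid": str(provider_uid)}
    for name, cast in expected.items():
        clean[name] = cast(provided[name])
    return clean

clean = prepare({"max_hops": int}, {"max_hops": "3"}, "123456789012")
# clean == {"provider_uid": "123456789012", "max_hops": 3}
```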
def execute_attack_paths_query(
database_name: str,
definition: AttackPathsQueryDefinition,
parameters: dict[str, Any],
) -> dict[str, Any]:
try:
with graph_database.get_session(database_name) as session:
result = session.run(definition.cypher, parameters)
return _serialize_graph(result.graph())
except graph_database.GraphDatabaseQueryException as exc:
logger.error(f"Query failed for Attack Paths query `{definition.id}`: {exc}")
raise APIException(
"Attack Paths query execution failed due to a database error"
)
def _serialize_graph(graph):
nodes = []
for node in graph.nodes:
nodes.append(
{
"id": node.element_id,
"labels": _filter_labels(node.labels),
"properties": _serialize_properties(node._properties),
},
)
relationships = []
for relationship in graph.relationships:
relationships.append(
{
"id": relationship.element_id,
"label": relationship.type,
"source": relationship.start_node.element_id,
"target": relationship.end_node.element_id,
"properties": _serialize_properties(relationship._properties),
},
)
return {
"nodes": nodes,
"relationships": relationships,
}
def _filter_labels(labels: Iterable[str]) -> list[str]:
return [label for label in labels if label not in INTERNAL_LABELS]
def _serialize_properties(properties: dict[str, Any]) -> dict[str, Any]:
"""Convert Neo4j property values into JSON-serializable primitives."""
def _serialize_value(value: Any) -> Any:
# Neo4j temporal and spatial values expose `to_native` returning Python primitives
if hasattr(value, "to_native") and callable(value.to_native):
return _serialize_value(value.to_native())
if isinstance(value, (list, tuple)):
return [_serialize_value(item) for item in value]
if isinstance(value, dict):
return {key: _serialize_value(val) for key, val in value.items()}
return value
return {key: _serialize_value(val) for key, val in properties.items()}
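The recursive value conversion above unwraps anything exposing `to_native` and recurses into containers. A self-contained sketch with a fake temporal type (the real Neo4j `Date.to_native` returns a `datetime.date`; the fake here returns an ISO string for brevity):

```python
from datetime import date

class FakeNeo4jDate:
    """Stand-in for a Neo4j temporal value exposing to_native()."""
    def __init__(self, d):
        self._d = d
    def to_native(self):
        return self._d.isoformat()

def serialize_value(value):
    # Unwrap Neo4j-style wrappers, then recurse into lists/tuples/dicts
    if hasattr(value, "to_native") and callable(value.to_native):
        return serialize_value(value.to_native())
    if isinstance(value, (list, tuple)):
        return [serialize_value(item) for item in value]
    if isinstance(value, dict):
        return {key: serialize_value(val) for key, val in value.items()}
    return value

props = {"created": FakeNeo4jDate(date(2026, 2, 16)), "tags": ("a", "b")}
serialized = {key: serialize_value(val) for key, val in props.items()}
# serialized == {"created": "2026-02-16", "tags": ["a", "b"]}
```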
+133 -32
@@ -1,15 +1,99 @@
from types import MappingProxyType
from collections.abc import Iterable, Mapping
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}
class LazyComplianceTemplate(Mapping):
"""Lazy-load compliance templates per provider on first access."""
def __init__(self, provider_types: Iterable[str] | None = None) -> None:
if provider_types is None:
provider_types = Provider.ProviderChoices.values
self._provider_types = tuple(provider_types)
self._provider_types_set = set(self._provider_types)
self._cache: dict[str, dict] = {}
def _load_provider(self, provider_type: str) -> dict:
if provider_type not in self._provider_types_set:
raise KeyError(provider_type)
cached = self._cache.get(provider_type)
if cached is not None:
return cached
_ensure_provider_loaded(provider_type)
return self._cache[provider_type]
def __getitem__(self, key: str) -> dict:
return self._load_provider(key)
def __iter__(self):
return iter(self._provider_types)
def __len__(self) -> int:
return len(self._provider_types)
def __contains__(self, key: object) -> bool:
return key in self._provider_types_set
def get(self, key: str, default=None):
if key not in self._provider_types_set:
return default
return self._load_provider(key)
def __repr__(self) -> str: # pragma: no cover - debugging helper
loaded = ", ".join(sorted(self._cache))
return f"{self.__class__.__name__}(loaded=[{loaded}])"
class LazyChecksMapping(Mapping):
"""Lazy-load checks mapping per provider on first access."""
def __init__(self, provider_types: Iterable[str] | None = None) -> None:
if provider_types is None:
provider_types = Provider.ProviderChoices.values
self._provider_types = tuple(provider_types)
self._provider_types_set = set(self._provider_types)
self._cache: dict[str, dict] = {}
def _load_provider(self, provider_type: str) -> dict:
if provider_type not in self._provider_types_set:
raise KeyError(provider_type)
cached = self._cache.get(provider_type)
if cached is not None:
return cached
_ensure_provider_loaded(provider_type)
return self._cache[provider_type]
def __getitem__(self, key: str) -> dict:
return self._load_provider(key)
def __iter__(self):
return iter(self._provider_types)
def __len__(self) -> int:
return len(self._provider_types)
def __contains__(self, key: object) -> bool:
return key in self._provider_types_set
def get(self, key: str, default=None):
if key not in self._provider_types_set:
return default
return self._load_provider(key)
def __repr__(self) -> str: # pragma: no cover - debugging helper
loaded = ", ".join(sorted(self._cache))
return f"{self.__class__.__name__}(loaded=[{loaded}])"
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = LazyComplianceTemplate()
PROWLER_CHECKS = LazyChecksMapping()
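The two lazy mappings above share one idea: implement `collections.abc.Mapping`, pay the per-provider load cost only on first access, and cache the result. A minimal sketch of that pattern with a placeholder loader (the payload and provider names are illustrative):

```python
from collections.abc import Mapping

LOAD_CALLS = []

def expensive_load(provider):
    # Stand-in for loading compliance frameworks/checks for one provider
    LOAD_CALLS.append(provider)
    return {"frameworks": [f"{provider}-framework"]}

class LazyPerProvider(Mapping):
    def __init__(self, providers):
        self._providers = tuple(providers)
        self._cache = {}
    def __getitem__(self, key):
        if key not in self._providers:
            raise KeyError(key)
        if key not in self._cache:
            self._cache[key] = expensive_load(key)  # load once, then cache
        return self._cache[key]
    def __iter__(self):
        return iter(self._providers)
    def __len__(self):
        return len(self._providers)

templates = LazyPerProvider(["aws", "gcp"])
first = templates["aws"]   # triggers the load
second = templates["aws"]  # served from cache
```

Because `Mapping` derives `items()`, `keys()`, and `values()` from these three methods, existing dict-style callers keep working unchanged.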
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
@@ -70,28 +154,35 @@ def get_prowler_provider_compliance(provider_type: Provider.ProviderChoices) ->
return Compliance.get_bulk(provider_type)
def load_prowler_compliance():
"""
Load and initialize the Prowler compliance data and checks for all provider types.
This function retrieves compliance data for all supported provider types,
generates a compliance overview template, and populates the global variables
`PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE` and `PROWLER_CHECKS` with read-only mappings
of the compliance templates and checks, respectively.
"""
global PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE
global PROWLER_CHECKS
prowler_compliance = {
provider_type: get_prowler_provider_compliance(provider_type)
for provider_type in Provider.ProviderChoices.values
}
template = generate_compliance_overview_template(prowler_compliance)
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = MappingProxyType(template)
PROWLER_CHECKS = MappingProxyType(load_prowler_checks(prowler_compliance))
def _load_provider_assets(provider_type: Provider.ProviderChoices) -> tuple[dict, dict]:
prowler_compliance = {provider_type: get_prowler_provider_compliance(provider_type)}
template = generate_compliance_overview_template(
prowler_compliance, provider_types=[provider_type]
)
checks = load_prowler_checks(prowler_compliance, provider_types=[provider_type])
return template.get(provider_type, {}), checks.get(provider_type, {})
def _ensure_provider_loaded(provider_type: Provider.ProviderChoices) -> None:
if (
provider_type in PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE._cache
and provider_type in PROWLER_CHECKS._cache
):
return
template_cached = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE._cache.get(provider_type)
checks_cached = PROWLER_CHECKS._cache.get(provider_type)
if template_cached is not None and checks_cached is not None:
return
template, checks = _load_provider_assets(provider_type)
if template_cached is None:
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE._cache[provider_type] = template
if checks_cached is None:
PROWLER_CHECKS._cache[provider_type] = checks
def load_prowler_checks(
prowler_compliance, provider_types: Iterable[str] | None = None
):
"""
Generate a mapping of checks to the compliance frameworks that include them.
@@ -100,21 +191,25 @@ def load_prowler_checks(prowler_compliance):
of compliance names that include that check.
Args:
prowler_compliance (dict): The compliance data for provider types,
as returned by `get_prowler_provider_compliance`.
provider_types (Iterable[str] | None): Optional subset of provider types to
process. Defaults to all providers.
Returns:
dict: A nested dictionary where the first-level keys are provider types,
and the values are dictionaries mapping check IDs to sets of compliance names.
"""
checks = {}
if provider_types is None:
provider_types = Provider.ProviderChoices.values
for provider_type in provider_types:
checks[provider_type] = {
check_id: set() for check_id in get_prowler_provider_checks(provider_type)
}
for compliance_name, compliance_data in prowler_compliance.get(
provider_type, {}
).items():
for requirement in compliance_data.Requirements:
for check in requirement.Checks:
try:
@@ -163,7 +258,9 @@ def generate_scan_compliance(
] += 1
def generate_compliance_overview_template(
prowler_compliance: dict, provider_types: Iterable[str] | None = None
):
"""
Generate a compliance overview template for all provider types.
@@ -173,17 +270,21 @@ def generate_compliance_overview_template(prowler_compliance: dict):
counts for requirements status.
Args:
prowler_compliance (dict): The compliance data for provider types,
as returned by `get_prowler_provider_compliance`.
provider_types (Iterable[str] | None): Optional subset of provider types to
process. Defaults to all providers.
Returns:
dict: A nested dictionary representing the compliance overview template,
structured by provider type and compliance framework.
"""
template = {}
for provider_type in Provider.ProviderChoices.values:
if provider_types is None:
provider_types = Provider.ProviderChoices.values
for provider_type in provider_types:
provider_compliance = template.setdefault(provider_type, {})
compliance_data_dict = prowler_compliance[provider_type]
compliance_data_dict = prowler_compliance.get(provider_type, {})
for compliance_name, compliance_data in compliance_data_dict.items():
compliance_requirements = {}
@@ -12,7 +12,6 @@ from django.contrib.auth.models import BaseUserManager
from django.db import (
DEFAULT_DB_ALIAS,
OperationalError,
connection,
connections,
models,
transaction,
@@ -450,7 +449,7 @@ def create_index_on_partitions(
all_partitions=True
)
"""
with connection.cursor() as cursor:
with schema_editor.connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
@@ -462,6 +461,7 @@ def create_index_on_partitions(
partitions = [row[0] for row in cursor.fetchall()]
where_sql = f" WHERE {where}" if where else ""
conn = schema_editor.connection
for partition in partitions:
if _should_create_index_on_partition(partition, all_partitions):
idx_name = f"{partition.replace('.', '_')}_{index_name}"
@@ -470,7 +470,12 @@ def create_index_on_partitions(
f"ON {partition} USING {method} ({columns})"
f"{where_sql};"
)
schema_editor.execute(sql)
old_autocommit = conn.connection.autocommit
conn.connection.autocommit = True
try:
schema_editor.execute(sql)
finally:
conn.connection.autocommit = old_autocommit
def drop_index_on_partitions(
@@ -486,7 +491,8 @@ def drop_index_on_partitions(
parent_table: The name of the root table (e.g. "findings").
index_name: The same short name used when creating them.
"""
with connection.cursor() as cursor:
conn = schema_editor.connection
with conn.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
@@ -500,7 +506,12 @@ def drop_index_on_partitions(
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {idx_name};"
schema_editor.execute(sql)
old_autocommit = conn.connection.autocommit
conn.connection.autocommit = True
try:
schema_editor.execute(sql)
finally:
conn.connection.autocommit = old_autocommit
def generate_api_key_prefix():
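The autocommit save/restore wrapping added around `schema_editor.execute(sql)` above exists because PostgreSQL refuses to run `CREATE INDEX CONCURRENTLY` / `DROP INDEX CONCURRENTLY` inside a transaction block. A dependency-free sketch of the pattern, using a hypothetical stand-in connection rather than Django's:

```python
class FakeConnection:
    """Stand-in for a DB-API connection exposing an autocommit flag."""

    def __init__(self):
        self.autocommit = False
        self.executed = []

    def execute(self, sql):
        # Mimic PostgreSQL: CONCURRENTLY statements fail in a transaction.
        if not self.autocommit:
            raise RuntimeError("CONCURRENTLY requires autocommit")
        self.executed.append(sql)


def run_concurrently(conn, sql):
    # Save the current autocommit setting, force it on for the duration
    # of the statement, and always restore it afterwards (try/finally),
    # mirroring the old_autocommit handling in the diff above.
    old_autocommit = conn.autocommit
    conn.autocommit = True
    try:
        conn.execute(sql)
    finally:
        conn.autocommit = old_autocommit


conn = FakeConnection()
run_concurrently(conn, "CREATE INDEX CONCURRENTLY idx ON t (c);")
assert conn.executed == ["CREATE INDEX CONCURRENTLY idx ON t (c);"]
assert conn.autocommit is False  # restored after the statement
```

The `finally` clause matters: even if the index build fails, the connection is returned to its original transactional mode.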
@@ -107,3 +107,105 @@ class ConflictException(APIException):
error_detail["source"] = {"pointer": pointer}
super().__init__(detail=[error_detail])
# Upstream Provider Errors (for external API calls like CloudTrail)
# These indicate issues with the provider, not with the user's API authentication
class UpstreamAuthenticationError(APIException):
"""Provider credentials are invalid or expired (502 Bad Gateway).
Used when AWS/Azure/GCP credentials fail to authenticate with the upstream
provider. This is NOT the user's API authentication failing.
"""
status_code = status.HTTP_502_BAD_GATEWAY
default_detail = (
"Provider credentials are invalid or expired. Please reconnect the provider."
)
default_code = "upstream_auth_failed"
def __init__(self, detail=None):
super().__init__(
detail=[
{
"detail": detail or self.default_detail,
"status": str(self.status_code),
"code": self.default_code,
}
]
)
class UpstreamAccessDeniedError(APIException):
"""Provider credentials lack required permissions (502 Bad Gateway).
Used when credentials are valid but don't have the IAM permissions
needed for the requested operation (e.g., cloudtrail:LookupEvents).
This is 502 (not 403) because it's an upstream/gateway error - the USER
authenticated fine, but the PROVIDER's credentials are misconfigured.
"""
status_code = status.HTTP_502_BAD_GATEWAY
default_detail = (
"Access denied. The provider credentials do not have the required permissions."
)
default_code = "upstream_access_denied"
def __init__(self, detail=None):
super().__init__(
detail=[
{
"detail": detail or self.default_detail,
"status": str(self.status_code),
"code": self.default_code,
}
]
)
class UpstreamServiceUnavailableError(APIException):
"""Provider service is unavailable (503 Service Unavailable).
Used when the upstream provider API returns an error or is unreachable.
"""
status_code = status.HTTP_503_SERVICE_UNAVAILABLE
default_detail = "Unable to communicate with the provider. Please try again later."
default_code = "service_unavailable"
def __init__(self, detail=None):
super().__init__(
detail=[
{
"detail": detail or self.default_detail,
"status": str(self.status_code),
"code": self.default_code,
}
]
)
class UpstreamInternalError(APIException):
"""Unexpected error communicating with provider (500 Internal Server Error).
Used as a catch-all for unexpected errors during provider communication.
"""
status_code = status.HTTP_500_INTERNAL_SERVER_ERROR
default_detail = (
"An unexpected error occurred while communicating with the provider."
)
default_code = "internal_error"
def __init__(self, detail=None):
super().__init__(
detail=[
{
"detail": detail or self.default_detail,
"status": str(self.status_code),
"code": self.default_code,
}
]
)
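All four upstream exception classes above serialize the same shape: a one-element list of JSON:API-style error objects. A minimal sketch of that payload, using a plain `Exception` subclass as a dependency-free stand-in for DRF's `APIException`:

```python
class UpstreamError(Exception):
    """Stand-in mirroring the detail structure built in __init__ above."""

    status_code = 502
    default_detail = "Provider credentials are invalid or expired."
    default_code = "upstream_auth_failed"

    def __init__(self, detail=None):
        # One-element list of JSON:API-style error objects, as in the diff.
        self.detail = [
            {
                "detail": detail or self.default_detail,
                "status": str(self.status_code),
                "code": self.default_code,
            }
        ]


err = UpstreamError()
assert err.detail[0]["status"] == "502"
assert err.detail[0]["code"] == "upstream_auth_failed"
```

The 502 status signals a gateway problem: the caller's API authentication succeeded, but the stored provider credentials failed upstream.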
@@ -29,6 +29,7 @@ from api.models import (
Finding,
Integration,
Invitation,
AttackPathsScan,
LighthouseProviderConfiguration,
LighthouseProviderModels,
Membership,
@@ -37,6 +38,7 @@ from api.models import (
PermissionChoices,
Processor,
Provider,
ProviderComplianceScore,
ProviderGroup,
ProviderSecret,
Resource,
@@ -44,6 +46,7 @@ from api.models import (
Role,
Scan,
ScanCategorySummary,
ScanGroupSummary,
ScanSummary,
SeverityChoices,
StateChoices,
@@ -92,10 +95,62 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class BaseProviderFilter(FilterSet):
"""
Abstract base filter for models with direct FK to Provider.
Provides standard provider_id and provider_type filters.
Subclasses must define Meta.model.
"""
provider_id = UUIDFilter(field_name="provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
abstract = True
fields = {}
class BaseScanProviderFilter(FilterSet):
"""
Abstract base filter for models with FK to Scan (and Scan has FK to Provider).
Provides standard provider_id and provider_type filters via scan relationship.
Subclasses must define Meta.model.
"""
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
abstract = True
fields = {}
class CommonFindingFilters(FilterSet):
# We filter providers from the scan in findings
# Both 'provider' and 'provider_id' parameters are supported for API consistency
# Frontend uses 'provider_id' uniformly across all endpoints
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
@@ -161,6 +216,9 @@ class CommonFindingFilters(FilterSet):
category = CharFilter(method="filter_category")
category__in = CharInFilter(field_name="categories", lookup_expr="overlap")
resource_groups = CharFilter(field_name="resource_groups", lookup_expr="exact")
resource_groups__in = CharInFilter(field_name="resource_groups", lookup_expr="in")
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
@@ -339,6 +397,23 @@ class ScanFilter(ProviderRelationshipFilterSet):
}
class AttackPathsScanFilter(ProviderRelationshipFilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
completed_at = DateFilter(field_name="completed_at", lookup_expr="date")
started_at = DateFilter(field_name="started_at", lookup_expr="date")
state = ChoiceFilter(choices=StateChoices.choices)
state__in = ChoiceInFilter(
field_name="state", choices=StateChoices.choices, lookup_expr="in"
)
class Meta:
model = AttackPathsScan
fields = {
"provider": ["exact", "in"],
"scan": ["exact", "in"],
}
class TaskFilter(FilterSet):
name = CharFilter(field_name="task_runner_task__task_name", lookup_expr="exact")
name__icontains = CharFilter(
@@ -378,6 +453,8 @@ class ResourceTagFilter(FilterSet):
class ResourceFilter(ProviderRelationshipFilterSet):
provider_id = UUIDFilter(field_name="provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="provider__id", lookup_expr="in")
tag_key = CharFilter(method="filter_tag_key")
tag_value = CharFilter(method="filter_tag_value")
tag = CharFilter(method="filter_tag")
@@ -386,6 +463,8 @@ class ResourceFilter(ProviderRelationshipFilterSet):
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
scan = UUIDFilter(field_name="provider__scan", lookup_expr="exact")
scan__in = UUIDInFilter(field_name="provider__scan", lookup_expr="in")
groups = CharFilter(method="filter_groups")
groups__in = CharInFilter(field_name="groups", lookup_expr="overlap")
class Meta:
model = Resource
@@ -400,6 +479,9 @@ class ResourceFilter(ProviderRelationshipFilterSet):
"updated_at": ["gte", "lte"],
}
def filter_groups(self, queryset, name, value):
return queryset.filter(groups__contains=[value])
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("updated_at")
@@ -460,10 +542,14 @@ class ResourceFilter(ProviderRelationshipFilterSet):
class LatestResourceFilter(ProviderRelationshipFilterSet):
provider_id = UUIDFilter(field_name="provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="provider__id", lookup_expr="in")
tag_key = CharFilter(method="filter_tag_key")
tag_value = CharFilter(method="filter_tag_value")
tag = CharFilter(method="filter_tag")
tags = CharFilter(method="filter_tag")
groups = CharFilter(method="filter_groups")
groups__in = CharInFilter(field_name="groups", lookup_expr="overlap")
class Meta:
model = Resource
@@ -476,6 +562,9 @@ class LatestResourceFilter(ProviderRelationshipFilterSet):
"type": ["exact", "icontains", "in"],
}
def filter_groups(self, queryset, name, value):
return queryset.filter(groups__contains=[value])
def filter_tag_key(self, queryset, name, value):
return queryset.filter(Q(tags__key=value) | Q(tags__key__icontains=value))
@@ -1086,39 +1175,45 @@ class ThreatScoreSnapshotFilter(FilterSet):
}
class AttackSurfaceOverviewFilter(FilterSet):
class AttackSurfaceOverviewFilter(BaseScanProviderFilter):
"""Filter for attack surface overview aggregations by provider."""
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
class Meta(BaseScanProviderFilter.Meta):
model = AttackSurfaceOverview
fields = {}
class CategoryOverviewFilter(FilterSet):
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class CategoryOverviewFilter(BaseScanProviderFilter):
"""Filter for category overview aggregations by provider."""
category = CharFilter(field_name="category", lookup_expr="exact")
category__in = CharInFilter(field_name="category", lookup_expr="in")
class Meta:
class Meta(BaseScanProviderFilter.Meta):
model = ScanCategorySummary
fields = {}
class ResourceGroupOverviewFilter(FilterSet):
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
resource_group = CharFilter(field_name="resource_group", lookup_expr="exact")
resource_group__in = CharInFilter(field_name="resource_group", lookup_expr="in")
class Meta:
model = ScanGroupSummary
fields = {}
class ComplianceWatchlistFilter(BaseProviderFilter):
"""Filter for compliance watchlist overview by provider."""
class Meta(BaseProviderFilter.Meta):
model = ProviderComplianceScore
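The two abstract bases (`BaseProviderFilter`, `BaseScanProviderFilter`) centralize what each concrete filter previously declared by hand: a mapping from API query parameters to ORM lookup paths, with the scan-based variant routing the same parameters through the `scan` relationship. A dependency-free sketch of that mapping (the resolver dicts are illustrative, not part of the codebase):

```python
# Parameters on models with a direct FK to Provider.
DIRECT_PROVIDER_FILTERS = {
    "provider_id": "provider__id__exact",
    "provider_id__in": "provider__id__in",
    "provider_type": "provider__provider__exact",
    "provider_type__in": "provider__provider__in",
}

# Same parameters on models that reach Provider via Scan: the lookup
# path is simply prefixed with "scan__".
SCAN_PROVIDER_FILTERS = {
    param: "scan__" + lookup
    for param, lookup in DIRECT_PROVIDER_FILTERS.items()
}

assert SCAN_PROVIDER_FILTERS["provider_type"] == "scan__provider__provider__exact"
assert SCAN_PROVIDER_FILTERS["provider_id__in"] == "scan__provider__id__in"
```

This is why `AttackSurfaceOverviewFilter`, `CategoryOverviewFilter`, and `ComplianceWatchlistFilter` in the diff shrink to just their model-specific fields.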
@@ -0,0 +1,37 @@
[
{
"model": "api.attackpathsscan",
"pk": "a7f0f6de-6f8e-4b3a-8cbe-3f6dd9012345",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "b85601a8-4b45-4194-8135-03fb980ef428",
"scan": "01920573-aa9c-73c9-bcda-f2e35c9b19d2",
"state": "completed",
"progress": 100,
"update_tag": 1693586667,
"task": null,
"inserted_at": "2024-09-01T17:24:37Z",
"updated_at": "2024-09-01T17:44:37Z",
"started_at": "2024-09-01T17:34:37Z",
"completed_at": "2024-09-01T17:44:37Z",
"duration": 269,
"ingestion_exceptions": {}
}
},
{
"model": "api.attackpathsscan",
"pk": "4a2fb2af-8a60-4d7d-9cae-4ca65e098765",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"provider": "15fce1fa-ecaa-433f-a9dc-62553f3a2555",
"scan": "01929f3b-ed2e-7623-ad63-7c37cd37828f",
"state": "executing",
"progress": 48,
"update_tag": 1697625000,
"task": null,
"inserted_at": "2024-10-18T10:55:57Z",
"updated_at": "2024-10-18T10:56:15Z",
"started_at": "2024-10-18T10:56:05Z"
}
}
]
@@ -0,0 +1,94 @@
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0065_alibabacloud_provider"),
]
operations = [
migrations.CreateModel(
name="ProviderComplianceScore",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("compliance_id", models.TextField()),
("requirement_id", models.TextField()),
(
"requirement_status",
api.db_utils.StatusEnumField(
choices=[
("FAIL", "Fail"),
("PASS", "Pass"),
("MANUAL", "Manual"),
]
),
),
("scan_completed_at", models.DateTimeField()),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
to="api.provider",
),
),
(
"scan",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
to="api.scan",
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.tenant",
),
),
],
options={
"db_table": "provider_compliance_scores",
"abstract": False,
},
),
migrations.AddConstraint(
model_name="providercompliancescore",
constraint=models.UniqueConstraint(
fields=("tenant_id", "provider_id", "compliance_id", "requirement_id"),
name="unique_provider_compliance_req",
),
),
migrations.AddConstraint(
model_name="providercompliancescore",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_providercompliancescore",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddIndex(
model_name="providercompliancescore",
index=models.Index(
fields=["tenant_id", "provider_id", "compliance_id"],
name="pcs_tenant_prov_comp_idx",
),
),
]
@@ -0,0 +1,61 @@
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0066_provider_compliance_score"),
]
operations = [
migrations.CreateModel(
name="TenantComplianceSummary",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("compliance_id", models.TextField()),
("requirements_passed", models.IntegerField(default=0)),
("requirements_failed", models.IntegerField(default=0)),
("requirements_manual", models.IntegerField(default=0)),
("total_requirements", models.IntegerField(default=0)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.tenant",
),
),
],
options={
"db_table": "tenant_compliance_summaries",
"abstract": False,
},
),
migrations.AddConstraint(
model_name="tenantcompliancesummary",
constraint=models.UniqueConstraint(
fields=("tenant_id", "compliance_id"),
name="unique_tenant_compliance_summary",
),
),
migrations.AddConstraint(
model_name="tenantcompliancesummary",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_tenantcompliancesummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -0,0 +1,126 @@
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0067_tenant_compliance_summary"),
]
operations = [
migrations.AddField(
model_name="finding",
name="resource_groups",
field=models.TextField(
blank=True,
help_text="Resource group from check metadata for efficient filtering",
null=True,
),
),
migrations.CreateModel(
name="ScanGroupSummary",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.tenant",
),
),
(
"inserted_at",
models.DateTimeField(auto_now_add=True),
),
(
"scan",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="resource_group_summaries",
related_query_name="resource_group_summary",
to="api.scan",
),
),
(
"resource_group",
models.CharField(max_length=50),
),
(
"severity",
api.db_utils.SeverityEnumField(
choices=[
("critical", "Critical"),
("high", "High"),
("medium", "Medium"),
("low", "Low"),
("informational", "Informational"),
],
),
),
(
"total_findings",
models.IntegerField(
default=0, help_text="Non-muted findings (PASS + FAIL)"
),
),
(
"failed_findings",
models.IntegerField(
default=0,
help_text="Non-muted FAIL findings (subset of total_findings)",
),
),
(
"new_failed_findings",
models.IntegerField(
default=0,
help_text="Non-muted FAIL with delta='new' (subset of failed_findings)",
),
),
(
"resources_count",
models.IntegerField(
default=0, help_text="Count of distinct resource_uid values"
),
),
],
options={
"db_table": "scan_resource_group_summaries",
"abstract": False,
},
),
migrations.AddIndex(
model_name="scangroupsummary",
index=models.Index(
fields=["tenant_id", "scan"], name="srgs_tenant_scan_idx"
),
),
migrations.AddConstraint(
model_name="scangroupsummary",
constraint=models.UniqueConstraint(
fields=("tenant_id", "scan_id", "resource_group", "severity"),
name="unique_resource_group_severity_per_scan",
),
),
migrations.AddConstraint(
model_name="scangroupsummary",
constraint=api.rls.RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_scangroupsummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -0,0 +1,21 @@
from django.contrib.postgres.fields import ArrayField
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0068_finding_resource_group_scangroupsummary"),
]
operations = [
migrations.AddField(
model_name="resource",
name="groups",
field=ArrayField(
models.CharField(max_length=100),
blank=True,
help_text="Groups for categorization (e.g., compute, storage, IAM)",
null=True,
),
),
]
@@ -0,0 +1,154 @@
# Generated by Django 5.1.13 on 2025-11-06 16:20
import django.db.models.deletion
from django.db import migrations, models
from uuid6 import uuid7
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0069_resource_resource_group"),
]
operations = [
migrations.CreateModel(
name="AttackPathsScan",
fields=[
(
"id",
models.UUIDField(
default=uuid7,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"state",
api.db_utils.StateEnumField(
choices=[
("available", "Available"),
("scheduled", "Scheduled"),
("executing", "Executing"),
("completed", "Completed"),
("failed", "Failed"),
("cancelled", "Cancelled"),
],
default="available",
),
),
("progress", models.IntegerField(default=0)),
("started_at", models.DateTimeField(blank=True, null=True)),
("completed_at", models.DateTimeField(blank=True, null=True)),
(
"duration",
models.IntegerField(
blank=True, help_text="Duration in seconds", null=True
),
),
(
"update_tag",
models.BigIntegerField(
blank=True,
help_text="Cartography update tag (epoch)",
null=True,
),
),
(
"graph_database",
models.CharField(blank=True, max_length=63, null=True),
),
(
"is_graph_database_deleted",
models.BooleanField(default=False),
),
(
"ingestion_exceptions",
models.JSONField(blank=True, default=dict, null=True),
),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
to="api.provider",
),
),
(
"scan",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
to="api.scan",
),
),
(
"task",
models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
to="api.task",
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "attack_paths_scans",
"abstract": False,
"indexes": [
models.Index(
fields=["tenant_id", "provider_id", "-inserted_at"],
name="aps_prov_ins_desc_idx",
),
models.Index(
fields=["tenant_id", "state", "-inserted_at"],
name="aps_state_ins_desc_idx",
),
models.Index(
fields=["tenant_id", "scan_id"],
name="aps_scan_lookup_idx",
),
models.Index(
fields=["tenant_id", "provider_id"],
name="aps_active_graph_idx",
include=["graph_database", "id"],
condition=models.Q(("is_graph_database_deleted", False)),
),
models.Index(
fields=["tenant_id", "provider_id", "-completed_at"],
name="aps_completed_graph_idx",
include=["graph_database", "id"],
condition=models.Q(
("state", "completed"),
("is_graph_database_deleted", False),
),
),
],
},
),
migrations.AddConstraint(
model_name="attackpathsscan",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_attackpathsscan",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -0,0 +1,41 @@
from django.db import migrations
class Migration(migrations.Migration):
"""
Drop unused indexes on partitioned tables (findings, resource_finding_mappings).
NOTE: RemoveIndexConcurrently cannot be used on partitioned tables in PostgreSQL.
Standard RemoveIndex drops the parent index, which cascades to all partitions.
"""
dependencies = [
("api", "0070_attack_paths_scan"),
]
operations = [
migrations.RemoveIndex(
model_name="finding",
name="gin_findings_search_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="gin_find_service_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="gin_find_region_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="gin_find_rtype_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="find_delta_new_idx",
),
migrations.RemoveIndex(
model_name="resourcefindingmapping",
name="rfm_tenant_finding_idx",
),
]
@@ -0,0 +1,91 @@
"""
Drop unused indexes on non-partitioned tables.
These tables are not partitioned, so RemoveIndexConcurrently can be used safely.
"""
from uuid import uuid4
from django.contrib.postgres.operations import RemoveIndexConcurrently
from django.db import migrations, models
def drop_resource_scan_summary_resource_id_index(apps, schema_editor):
with schema_editor.connection.cursor() as cursor:
cursor.execute(
"""
SELECT idx_ns.nspname, idx.relname
FROM pg_class tbl
JOIN pg_namespace tbl_ns ON tbl_ns.oid = tbl.relnamespace
JOIN pg_index i ON i.indrelid = tbl.oid
JOIN pg_class idx ON idx.oid = i.indexrelid
JOIN pg_namespace idx_ns ON idx_ns.oid = idx.relnamespace
JOIN pg_attribute a
ON a.attrelid = tbl.oid
AND a.attnum = (i.indkey::int[])[0]
WHERE tbl_ns.nspname = ANY (current_schemas(false))
AND tbl.relname = %s
AND i.indnatts = 1
AND a.attname = %s
""",
["resource_scan_summaries", "resource_id"],
)
row = cursor.fetchone()
if not row:
return
schema_name, index_name = row
quote_name = schema_editor.connection.ops.quote_name
qualified_name = f"{quote_name(schema_name)}.{quote_name(index_name)}"
schema_editor.execute(f"DROP INDEX CONCURRENTLY IF EXISTS {qualified_name};")
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0071_drop_partitioned_indexes"),
]
operations = [
RemoveIndexConcurrently(
model_name="resource",
name="gin_resources_search_idx",
),
RemoveIndexConcurrently(
model_name="resourcetag",
name="gin_resource_tags_search_idx",
),
RemoveIndexConcurrently(
model_name="scansummary",
name="ss_tenant_scan_service_idx",
),
RemoveIndexConcurrently(
model_name="complianceoverview",
name="comp_ov_cp_id_idx",
),
RemoveIndexConcurrently(
model_name="complianceoverview",
name="comp_ov_req_fail_idx",
),
RemoveIndexConcurrently(
model_name="complianceoverview",
name="comp_ov_cp_id_req_fail_idx",
),
migrations.SeparateDatabaseAndState(
database_operations=[
migrations.RunPython(
drop_resource_scan_summary_resource_id_index,
reverse_code=migrations.RunPython.noop,
),
],
state_operations=[
migrations.AlterField(
model_name="resourcescansummary",
name="resource_id",
field=models.UUIDField(default=uuid4),
),
],
),
]
@@ -0,0 +1,31 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0072_drop_unused_indexes"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_fail_new_idx",
columns="tenant_id, scan_id",
where="status = 'FAIL' AND delta = 'new'",
all_partitions=True,
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_fail_new_idx",
),
)
]
@@ -0,0 +1,54 @@
from django.db import migrations, models
INDEX_NAME = "find_tenant_scan_fail_new_idx"
PARENT_TABLE = "findings"
def create_parent_and_attach(apps, schema_editor):
with schema_editor.connection.cursor() as cursor:
cursor.execute(
f"CREATE INDEX {INDEX_NAME} ON ONLY {PARENT_TABLE} "
f"USING btree (tenant_id, scan_id) "
f"WHERE status = 'FAIL' AND delta = 'new'"
)
cursor.execute(
"SELECT inhrelid::regclass::text "
"FROM pg_inherits "
"WHERE inhparent = %s::regclass",
[PARENT_TABLE],
)
for (partition,) in cursor.fetchall():
child_idx = f"{partition.replace('.', '_')}_{INDEX_NAME}"
cursor.execute(f"ALTER INDEX {INDEX_NAME} ATTACH PARTITION {child_idx}")
def drop_parent_index(apps, schema_editor):
with schema_editor.connection.cursor() as cursor:
cursor.execute(f"DROP INDEX IF EXISTS {INDEX_NAME}")
class Migration(migrations.Migration):
dependencies = [
("api", "0073_findings_fail_new_index_partitions"),
]
operations = [
migrations.SeparateDatabaseAndState(
state_operations=[
migrations.AddIndex(
model_name="finding",
index=models.Index(
condition=models.Q(status="FAIL", delta="new"),
fields=["tenant_id", "scan_id"],
name=INDEX_NAME,
),
),
],
database_operations=[
migrations.RunPython(
create_parent_and_attach,
reverse_code=drop_parent_index,
),
],
),
]
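Both the per-partition helper (`create_index_on_partitions`) and the `ATTACH PARTITION` migration above rely on the same naming convention for child indexes: the partition's qualified table name with `.` replaced by `_`, joined to the shared short index name. A small sketch of that convention:

```python
def child_index_name(partition: str, index_name: str) -> str:
    """Derive the per-partition index name used throughout the diff:
    the qualified partition name with '.' -> '_' plus the short name."""
    return f"{partition.replace('.', '_')}_{index_name}"


assert (
    child_index_name("public.findings_2024_09", "find_tenant_scan_fail_new_idx")
    == "public_findings_2024_09_find_tenant_scan_fail_new_idx"
)
```

Keeping the convention in one deterministic form is what lets the `ALTER INDEX ... ATTACH PARTITION` step find each child index created by the earlier migration.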
@@ -0,0 +1,38 @@
# Generated by Django migration for Cloudflare provider support
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0074_findings_fail_new_index_parent"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
("github", "GitHub"),
("mongodbatlas", "MongoDB Atlas"),
("iac", "IaC"),
("oraclecloud", "Oracle Cloud Infrastructure"),
("alibabacloud", "Alibaba Cloud"),
("cloudflare", "Cloudflare"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'cloudflare';",
reverse_sql=migrations.RunSQL.noop,
),
]
@@ -0,0 +1,39 @@
# Generated by Django migration for OpenStack provider support
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0075_cloudflare_provider"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
("github", "GitHub"),
("mongodbatlas", "MongoDB Atlas"),
("iac", "IaC"),
("oraclecloud", "Oracle Cloud Infrastructure"),
("alibabacloud", "Alibaba Cloud"),
("cloudflare", "Cloudflare"),
("openstack", "OpenStack"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'openstack';",
reverse_sql=migrations.RunSQL.noop,
),
]
@@ -0,0 +1,23 @@
# Generated by Django 5.1.15 on 2026-02-16 09:24
from django.contrib.postgres.operations import RemoveIndexConcurrently
from django.db import migrations
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0076_openstack_provider"),
]
operations = [
RemoveIndexConcurrently(
model_name="attackpathsscan",
name="aps_active_graph_idx",
),
RemoveIndexConcurrently(
model_name="attackpathsscan",
name="aps_completed_graph_idx",
),
]
@@ -0,0 +1,20 @@
# Generated by Django 5.1.15 on 2026-02-16 09:24
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0077_remove_attackpathsscan_graph_database_indexes"),
]
operations = [
migrations.RemoveField(
model_name="attackpathsscan",
name="graph_database",
),
migrations.RemoveField(
model_name="attackpathsscan",
name="is_graph_database_deleted",
),
]
@@ -12,7 +12,6 @@ from cryptography.fernet import Fernet, InvalidToken
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.contrib.sites.models import Site
from django.core.exceptions import ValidationError
@@ -288,6 +287,8 @@ class Provider(RowLevelSecurityProtectedModel):
IAC = "iac", _("IaC")
ORACLECLOUD = "oraclecloud", _("Oracle Cloud Infrastructure")
ALIBABACLOUD = "alibabacloud", _("Alibaba Cloud")
CLOUDFLARE = "cloudflare", _("Cloudflare")
OPENSTACK = "openstack", _("OpenStack")
@staticmethod
def validate_aws_uid(value):
@@ -401,6 +402,24 @@ class Provider(RowLevelSecurityProtectedModel):
pointer="/data/attributes/uid",
)
@staticmethod
def validate_cloudflare_uid(value):
if not re.match(r"^[a-f0-9]{32}$", value):
raise ModelValidationError(
detail="Cloudflare Account ID must be a 32-character hexadecimal string.",
code="cloudflare-uid",
pointer="/data/attributes/uid",
)
@staticmethod
def validate_openstack_uid(value):
if not re.match(r"^[a-zA-Z0-9][a-zA-Z0-9._-]{0,254}$", value):
raise ModelValidationError(
detail="OpenStack provider ID must be a valid project ID (UUID or project name).",
code="openstack-uid",
pointer="/data/attributes/uid",
)
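The two UID validators above reduce to the regexes they compile. A quick sketch of what they accept and reject (the example IDs below are made up for illustration):

```python
import re

# Same patterns as validate_cloudflare_uid and validate_openstack_uid.
CLOUDFLARE_UID = re.compile(r"^[a-f0-9]{32}$")
OPENSTACK_UID = re.compile(r"^[a-zA-Z0-9][a-zA-Z0-9._-]{0,254}$")

# Cloudflare: exactly 32 lowercase hex characters.
assert CLOUDFLARE_UID.match("0123456789abcdef0123456789abcdef")
assert not CLOUDFLARE_UID.match("0123456789ABCDEF0123456789ABCDEF")  # uppercase rejected

# OpenStack: first char alphanumeric, then up to 254 of [a-zA-Z0-9._-].
assert OPENSTACK_UID.match("123e4567-e89b-12d3-a456-426614174000")  # project UUID
assert OPENSTACK_UID.match("my_project.prod")                        # project name
assert not OPENSTACK_UID.match("-leading-dash")                      # must start alphanumeric
```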
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
@@ -626,6 +645,84 @@ class Scan(RowLevelSecurityProtectedModel):
resource_name = "scans"
class AttackPathsScan(RowLevelSecurityProtectedModel):
objects = ActiveProviderManager()
all_objects = models.Manager()
id = models.UUIDField(primary_key=True, default=uuid7, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
state = StateEnumField(choices=StateChoices.choices, default=StateChoices.AVAILABLE)
progress = models.IntegerField(default=0)
# Timing
started_at = models.DateTimeField(null=True, blank=True)
completed_at = models.DateTimeField(null=True, blank=True)
duration = models.IntegerField(
null=True, blank=True, help_text="Duration in seconds"
)
# Relationship to the provider and optional prowler Scan and celery Task
provider = models.ForeignKey(
"Provider",
on_delete=models.CASCADE,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
)
scan = models.ForeignKey(
"Scan",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
)
task = models.ForeignKey(
"Task",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="attack_paths_scans",
related_query_name="attack_paths_scan",
)
# Cartography specific metadata
update_tag = models.BigIntegerField(
null=True, blank=True, help_text="Cartography update tag (epoch)"
)
ingestion_exceptions = models.JSONField(default=dict, null=True, blank=True)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "attack_paths_scans"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
indexes = [
models.Index(
fields=["tenant_id", "provider_id", "-inserted_at"],
name="aps_prov_ins_desc_idx",
),
models.Index(
fields=["tenant_id", "state", "-inserted_at"],
name="aps_state_ins_desc_idx",
),
models.Index(
fields=["tenant_id", "scan_id"],
name="aps_scan_lookup_idx",
),
]
class JSONAPIMeta:
resource_name = "attack-paths-scans"
class ResourceTag(RowLevelSecurityProtectedModel):
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
@@ -646,10 +743,6 @@ class ResourceTag(RowLevelSecurityProtectedModel):
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "resource_tags"
indexes = [
GinIndex(fields=["text_search"], name="gin_resource_tags_search_idx"),
]
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "key", "value"),
@@ -704,6 +797,12 @@ class Resource(RowLevelSecurityProtectedModel):
metadata = models.TextField(blank=True, null=True)
details = models.TextField(blank=True, null=True)
partition = models.TextField(blank=True, null=True)
groups = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
help_text="Groups for categorization (e.g., compute, storage, IAM)",
)
failed_findings_count = models.IntegerField(default=0)
@@ -752,7 +851,6 @@ class Resource(RowLevelSecurityProtectedModel):
fields=["tenant_id", "service", "region", "type"],
name="resource_tenant_metadata_idx",
),
GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
models.Index(
fields=["tenant_id", "provider_id"],
@@ -890,6 +988,11 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
null=True,
help_text="Categories from check metadata for efficient filtering",
)
resource_groups = models.TextField(
blank=True,
null=True,
help_text="Resource group from check metadata for efficient filtering",
)
# Relationships
scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -932,23 +1035,19 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
indexes = [
models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
models.Index(
fields=["tenant_id", "scan_id", "id"], name="find_tenant_scan_id_idx"
),
models.Index(
fields=["tenant_id", "scan_id"],
condition=models.Q(status=StatusChoices.FAIL, delta="new"),
name="find_tenant_scan_fail_new_idx",
),
models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
GinIndex(fields=["resource_services"], name="gin_find_service_idx"),
GinIndex(fields=["resource_regions"], name="gin_find_region_idx"),
GinIndex(fields=["resource_types"], name="gin_find_rtype_idx"),
models.Index(
fields=["tenant_id", "scan_id", "check_id"],
name="find_tenant_scan_check_idx",
@@ -1016,10 +1115,6 @@ class ResourceFindingMapping(PostgresPartitionedModel, RowLevelSecurityProtected
# - id
indexes = [
models.Index(
fields=["tenant_id", "finding_id"],
name="rfm_tenant_finding_idx",
),
models.Index(
fields=["tenant_id", "resource_id"],
name="rfm_tenant_resource_idx",
@@ -1336,14 +1431,6 @@ class ComplianceOverview(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "DELETE"],
),
]
indexes = [
models.Index(fields=["compliance_id"], name="comp_ov_cp_id_idx"),
models.Index(fields=["requirements_failed"], name="comp_ov_req_fail_idx"),
models.Index(
fields=["compliance_id", "requirements_failed"],
name="comp_ov_cp_id_req_fail_idx",
),
]
class JSONAPIMeta:
resource_name = "compliance-overviews"
@@ -1509,10 +1596,6 @@ class ScanSummary(RowLevelSecurityProtectedModel):
fields=["tenant_id", "scan_id"],
name="scan_summaries_tenant_scan_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
@@ -1927,7 +2010,7 @@ class SAMLConfiguration(RowLevelSecurityProtectedModel):
class ResourceScanSummary(RowLevelSecurityProtectedModel):
scan_id = models.UUIDField(default=uuid7, db_index=True)
resource_id = models.UUIDField(default=uuid4)
service = models.CharField(max_length=100)
region = models.CharField(max_length=100)
resource_type = models.CharField(max_length=100)
@@ -2032,6 +2115,67 @@ class ScanCategorySummary(RowLevelSecurityProtectedModel):
resource_name = "scan-category-summaries"
class ScanGroupSummary(RowLevelSecurityProtectedModel):
"""
Pre-aggregated resource group metrics per scan by severity.
Stores one row per (resource_group, severity) combination per scan for efficient
overview queries. Resource groups come from check_metadata.Group.
Count relationships (each is a subset of the previous):
- total_findings >= failed_findings >= new_failed_findings
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
scan = models.ForeignKey(
Scan,
on_delete=models.CASCADE,
related_name="resource_group_summaries",
related_query_name="resource_group_summary",
)
resource_group = models.CharField(max_length=50)
severity = SeverityEnumField(choices=SeverityChoices)
total_findings = models.IntegerField(
default=0, help_text="Non-muted findings (PASS + FAIL)"
)
failed_findings = models.IntegerField(
default=0, help_text="Non-muted FAIL findings (subset of total_findings)"
)
new_failed_findings = models.IntegerField(
default=0,
help_text="Non-muted FAIL with delta='new' (subset of failed_findings)",
)
resources_count = models.IntegerField(
default=0, help_text="Count of distinct resource_uid values"
)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "scan_resource_group_summaries"
indexes = [
models.Index(fields=["tenant_id", "scan"], name="srgs_tenant_scan_idx"),
]
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "scan_id", "resource_group", "severity"),
name="unique_resource_group_severity_per_scan",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class JSONAPIMeta:
resource_name = "scan-resource-group-summaries"
class LighthouseConfiguration(RowLevelSecurityProtectedModel):
"""
Stores configuration and API keys for LLM services.
@@ -2605,3 +2749,92 @@ class AttackSurfaceOverview(RowLevelSecurityProtectedModel):
class JSONAPIMeta:
resource_name = "attack-surface-overviews"
class ProviderComplianceScore(RowLevelSecurityProtectedModel):
"""
Compliance requirement status from latest completed scan per provider.
Used for efficient compliance watchlist queries with FAIL-dominant aggregation
across multiple providers. Updated via atomic upsert after each scan completion.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
scan = models.ForeignKey(
Scan,
on_delete=models.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
)
provider = models.ForeignKey(
Provider,
on_delete=models.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
)
compliance_id = models.TextField()
requirement_id = models.TextField()
requirement_status = StatusEnumField(choices=StatusChoices)
scan_completed_at = models.DateTimeField()
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "provider_compliance_scores"
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "provider_id", "compliance_id", "requirement_id"),
name="unique_provider_compliance_req",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
indexes = [
models.Index(
fields=["tenant_id", "provider_id", "compliance_id"],
name="pcs_tenant_prov_comp_idx",
),
]
class TenantComplianceSummary(RowLevelSecurityProtectedModel):
"""
Pre-aggregated compliance counts per tenant with FAIL-dominant logic applied.
One row per (tenant, compliance_id). Used for fast watchlist queries when
no provider filter is applied. Recalculated after each scan by aggregating
across all providers with FAIL-dominant logic at requirement level.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
compliance_id = models.TextField()
requirements_passed = models.IntegerField(default=0)
requirements_failed = models.IntegerField(default=0)
requirements_manual = models.IntegerField(default=0)
total_requirements = models.IntegerField(default=0)
updated_at = models.DateTimeField(auto_now=True)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "tenant_compliance_summaries"
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "compliance_id"),
name="unique_tenant_compliance_summary",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
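The FAIL-dominant aggregation these two models' docstrings describe — resolving each requirement across providers, then counting per compliance framework — can be sketched as below. The status precedence (FAIL over PASS over MANUAL) and the sample rows are assumptions for illustration, not the confirmed implementation:

```python
# Hypothetical per-provider requirement statuses for one compliance framework.
rows = [
    # (provider, requirement_id, status)
    ("prov-a", "req_1", "PASS"),
    ("prov-b", "req_1", "FAIL"),   # any FAIL makes the requirement FAIL
    ("prov-a", "req_2", "PASS"),
    ("prov-b", "req_2", "PASS"),
    ("prov-a", "req_3", "MANUAL"),
]

def fail_dominant(statuses):
    # FAIL wins over everything; PASS-over-MANUAL is an assumed tiebreak.
    if "FAIL" in statuses:
        return "FAIL"
    if "PASS" in statuses:
        return "PASS"
    return "MANUAL"

by_req = {}
for _, req, status in rows:
    by_req.setdefault(req, []).append(status)

resolved = {req: fail_dominant(statuses) for req, statuses in by_req.items()}
summary = {
    "requirements_failed": sum(s == "FAIL" for s in resolved.values()),
    "requirements_passed": sum(s == "PASS" for s in resolved.values()),
    "requirements_manual": sum(s == "MANUAL" for s in resolved.values()),
    "total_requirements": len(resolved),
}
assert summary["requirements_failed"] == 1  # req_1 fails despite prov-a passing
```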
@@ -1,10 +1,13 @@
import os
import sys
import types
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from django.conf import settings
import api
import api.apps as api_apps_module
from api.apps import (
ApiConfig,
@@ -150,3 +153,82 @@ def test_ensure_crypto_keys_skips_when_env_vars(monkeypatch, tmp_path):
# Assert: orchestrator did not trigger generation when env present
assert called["ensure"] is False
@pytest.fixture(autouse=True)
def stub_api_modules():
"""Provide dummy modules imported during ApiConfig.ready()."""
created = []
for name in ("api.schema_extensions", "api.signals"):
if name not in sys.modules:
sys.modules[name] = types.ModuleType(name)
created.append(name)
yield
for name in created:
sys.modules.pop(name, None)
def _set_argv(monkeypatch, argv):
monkeypatch.setattr(sys, "argv", argv, raising=False)
def _set_testing(monkeypatch, value):
monkeypatch.setattr(settings, "TESTING", value, raising=False)
def _make_app():
return ApiConfig("api", api)
def test_ready_initializes_driver_for_api_process(monkeypatch):
config = _make_app()
_set_argv(monkeypatch, ["gunicorn"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
config.ready()
init_driver.assert_called_once()
def test_ready_skips_driver_for_celery(monkeypatch):
config = _make_app()
_set_argv(monkeypatch, ["celery", "-A", "api"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
config.ready()
init_driver.assert_not_called()
def test_ready_skips_driver_for_manage_py_skip_command(monkeypatch):
config = _make_app()
_set_argv(monkeypatch, ["manage.py", "migrate"])
_set_testing(monkeypatch, False)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
config.ready()
init_driver.assert_not_called()
def test_ready_skips_driver_when_testing(monkeypatch):
config = _make_app()
_set_argv(monkeypatch, ["gunicorn"])
_set_testing(monkeypatch, True)
with patch.object(ApiConfig, "_ensure_crypto_keys", return_value=None), patch(
"api.attack_paths.database.init_driver"
) as init_driver:
config.ready()
init_driver.assert_not_called()
@@ -0,0 +1,175 @@
from types import SimpleNamespace
from unittest.mock import MagicMock, patch
import pytest
from rest_framework.exceptions import APIException, ValidationError
from api.attack_paths import database as graph_database
from api.attack_paths import views_helpers
def test_normalize_run_payload_extracts_attributes_section():
payload = {
"data": {
"id": "ignored",
"attributes": {
"id": "aws-rds",
"parameters": {"ip": "192.0.2.0"},
},
}
}
result = views_helpers.normalize_run_payload(payload)
assert result == {"id": "aws-rds", "parameters": {"ip": "192.0.2.0"}}
def test_normalize_run_payload_passthrough_for_non_dict():
sentinel = "not-a-dict"
assert views_helpers.normalize_run_payload(sentinel) is sentinel
def test_prepare_query_parameters_includes_provider_and_casts(
attack_paths_query_definition_factory,
):
definition = attack_paths_query_definition_factory(cast_type=int)
result = views_helpers.prepare_query_parameters(
definition,
{"limit": "5"},
provider_uid="123456789012",
)
assert result["provider_uid"] == "123456789012"
assert result["limit"] == 5
@pytest.mark.parametrize(
"provided,expected_message",
[
({}, "Missing required parameter"),
({"limit": 10, "extra": True}, "Unknown parameter"),
],
)
def test_prepare_query_parameters_validates_names(
attack_paths_query_definition_factory, provided, expected_message
):
definition = attack_paths_query_definition_factory()
with pytest.raises(ValidationError) as exc:
views_helpers.prepare_query_parameters(definition, provided, provider_uid="1")
assert expected_message in str(exc.value)
def test_prepare_query_parameters_validates_cast(
attack_paths_query_definition_factory,
):
definition = attack_paths_query_definition_factory(cast_type=int)
with pytest.raises(ValidationError) as exc:
views_helpers.prepare_query_parameters(
definition,
{"limit": "not-an-int"},
provider_uid="1",
)
assert "Invalid value" in str(exc.value)
def test_execute_attack_paths_query_serializes_graph(
attack_paths_query_definition_factory, attack_paths_graph_stub_classes
):
definition = attack_paths_query_definition_factory(
id="aws-rds",
name="RDS",
short_description="Short desc",
description="",
cypher="MATCH (n) RETURN n",
parameters=[],
)
parameters = {"provider_uid": "123"}
node = attack_paths_graph_stub_classes.Node(
element_id="node-1",
labels=["AWSAccount"],
properties={
"name": "account",
"complex": {
"items": [
attack_paths_graph_stub_classes.NativeValue("value"),
{"nested": 1},
]
},
},
)
relationship = attack_paths_graph_stub_classes.Relationship(
element_id="rel-1",
rel_type="OWNS",
start_node=node,
end_node=attack_paths_graph_stub_classes.Node("node-2", ["RDSInstance"], {}),
properties={"weight": 1},
)
graph = SimpleNamespace(nodes=[node], relationships=[relationship])
run_result = MagicMock()
run_result.graph.return_value = graph
session = MagicMock()
session.run.return_value = run_result
session_ctx = MagicMock()
session_ctx.__enter__.return_value = session
session_ctx.__exit__.return_value = False
database_name = "db-tenant-test-tenant-id"
with patch(
"api.attack_paths.views_helpers.graph_database.get_session",
return_value=session_ctx,
) as mock_get_session:
result = views_helpers.execute_attack_paths_query(
database_name, definition, parameters
)
mock_get_session.assert_called_once_with(database_name)
session.run.assert_called_once_with(definition.cypher, parameters)
assert result["nodes"][0]["id"] == "node-1"
assert result["nodes"][0]["properties"]["complex"]["items"][0] == "value"
assert result["relationships"][0]["label"] == "OWNS"
def test_execute_attack_paths_query_wraps_graph_errors(
attack_paths_query_definition_factory,
):
definition = attack_paths_query_definition_factory(
id="aws-rds",
name="RDS",
short_description="Short desc",
description="",
cypher="MATCH (n) RETURN n",
parameters=[],
)
database_name = "db-tenant-test-tenant-id"
parameters = {"provider_uid": "123"}
class ExplodingContext:
def __enter__(self):
raise graph_database.GraphDatabaseQueryException("boom")
def __exit__(self, exc_type, exc, tb):
return False
with (
patch(
"api.attack_paths.views_helpers.graph_database.get_session",
return_value=ExplodingContext(),
),
patch("api.attack_paths.views_helpers.logger") as mock_logger,
):
with pytest.raises(APIException):
views_helpers.execute_attack_paths_query(
database_name, definition, parameters
)
mock_logger.error.assert_called_once()
@@ -0,0 +1,303 @@
"""
Tests for Neo4j database lazy initialization.
The Neo4j driver connects on first use by default. API processes may
eagerly initialize the driver during app startup, while Celery workers
remain lazy. These tests validate the database module behavior itself.
"""
import threading
from unittest.mock import MagicMock, patch
import pytest
class TestLazyInitialization:
"""Test that Neo4j driver is initialized lazily on first use."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
yield
db_module._driver = original_driver
def test_driver_not_initialized_at_import(self):
"""Driver should be None after module import (no eager connection)."""
import api.attack_paths.database as db_module
assert db_module._driver is None
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_init_driver_creates_connection_on_first_call(
self, mock_driver_factory, mock_settings
):
"""init_driver() should create connection only when called."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
assert db_module._driver is None
result = db_module.init_driver()
mock_driver_factory.assert_called_once()
mock_driver.verify_connectivity.assert_called_once()
assert result is mock_driver
assert db_module._driver is mock_driver
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_init_driver_returns_cached_driver_on_subsequent_calls(
self, mock_driver_factory, mock_settings
):
"""Subsequent calls should return cached driver without reconnecting."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
first_result = db_module.init_driver()
second_result = db_module.init_driver()
third_result = db_module.init_driver()
# Only one connection attempt
assert mock_driver_factory.call_count == 1
assert mock_driver.verify_connectivity.call_count == 1
# All calls return same instance
assert first_result is second_result is third_result
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_get_driver_delegates_to_init_driver(
self, mock_driver_factory, mock_settings
):
"""get_driver() should use init_driver() for lazy initialization."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
result = db_module.get_driver()
assert result is mock_driver
mock_driver_factory.assert_called_once()
class TestAtexitRegistration:
"""Test that atexit cleanup handler is registered correctly."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
yield
db_module._driver = original_driver
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.atexit.register")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_atexit_registered_on_first_init(
self, mock_driver_factory, mock_atexit_register, mock_settings
):
"""atexit.register should be called on first initialization."""
import api.attack_paths.database as db_module
mock_driver_factory.return_value = MagicMock()
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
db_module.init_driver()
mock_atexit_register.assert_called_once_with(db_module.close_driver)
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.atexit.register")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_atexit_registered_only_once(
self, mock_driver_factory, mock_atexit_register, mock_settings
):
"""atexit.register should only be called once across multiple inits.
The double-checked locking on _driver ensures the atexit registration
block only executes once (when _driver is first created).
"""
import api.attack_paths.database as db_module
mock_driver_factory.return_value = MagicMock()
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
db_module.init_driver()
db_module.init_driver()
db_module.init_driver()
# Only registered once because subsequent calls hit the fast path
assert mock_atexit_register.call_count == 1
class TestCloseDriver:
"""Test driver cleanup functionality."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
yield
db_module._driver = original_driver
def test_close_driver_closes_and_clears_driver(self):
"""close_driver() should close the driver and set it to None."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
db_module._driver = mock_driver
db_module.close_driver()
mock_driver.close.assert_called_once()
assert db_module._driver is None
def test_close_driver_handles_none_driver(self):
"""close_driver() should handle case where driver is None."""
import api.attack_paths.database as db_module
db_module._driver = None
# Should not raise
db_module.close_driver()
assert db_module._driver is None
def test_close_driver_clears_driver_even_on_close_error(self):
"""Driver should be cleared even if close() raises an exception."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver.close.side_effect = Exception("Connection error")
db_module._driver = mock_driver
with pytest.raises(Exception, match="Connection error"):
db_module.close_driver()
# Driver should still be cleared
assert db_module._driver is None
class TestThreadSafety:
"""Test thread-safe initialization."""
@pytest.fixture(autouse=True)
def reset_module_state(self):
"""Reset module-level singleton state before each test."""
import api.attack_paths.database as db_module
original_driver = db_module._driver
db_module._driver = None
yield
db_module._driver = original_driver
@patch("api.attack_paths.database.settings")
@patch("api.attack_paths.database.neo4j.GraphDatabase.driver")
def test_concurrent_init_creates_single_driver(
self, mock_driver_factory, mock_settings
):
"""Multiple threads calling init_driver() should create only one driver."""
import api.attack_paths.database as db_module
mock_driver = MagicMock()
mock_driver_factory.return_value = mock_driver
mock_settings.DATABASES = {
"neo4j": {
"HOST": "localhost",
"PORT": 7687,
"USER": "neo4j",
"PASSWORD": "password",
}
}
results = []
errors = []
def call_init():
try:
result = db_module.init_driver()
results.append(result)
except Exception as e:
errors.append(e)
threads = [threading.Thread(target=call_init) for _ in range(10)]
for t in threads:
t.start()
for t in threads:
t.join()
assert not errors, f"Threads raised errors: {errors}"
# Only one driver created
assert mock_driver_factory.call_count == 1
# All threads got the same driver instance
assert all(r is mock_driver for r in results)
assert len(results) == 10
@@ -6,7 +6,6 @@ from api.compliance import (
get_prowler_provider_checks,
get_prowler_provider_compliance,
load_prowler_checks,
load_prowler_compliance,
)
from api.models import Provider
@@ -35,55 +34,6 @@ class TestCompliance:
assert compliance_data == mock_compliance.get_bulk.return_value
mock_compliance.get_bulk.assert_called_once_with(provider_type)
@patch("api.models.Provider.ProviderChoices")
@patch("api.compliance.get_prowler_provider_compliance")
@patch("api.compliance.generate_compliance_overview_template")
@patch("api.compliance.load_prowler_checks")
def test_load_prowler_compliance(
self,
mock_load_prowler_checks,
mock_generate_compliance_overview_template,
mock_get_prowler_provider_compliance,
mock_provider_choices,
):
mock_provider_choices.values = ["aws", "azure"]
compliance_data_aws = {"compliance_aws": MagicMock()}
compliance_data_azure = {"compliance_azure": MagicMock()}
compliance_data_dict = {
"aws": compliance_data_aws,
"azure": compliance_data_azure,
}
def mock_get_compliance(provider_type):
return compliance_data_dict[provider_type]
mock_get_prowler_provider_compliance.side_effect = mock_get_compliance
mock_generate_compliance_overview_template.return_value = {
"template_key": "template_value"
}
mock_load_prowler_checks.return_value = {"checks_key": "checks_value"}
load_prowler_compliance()
from api.compliance import PROWLER_CHECKS, PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE
assert PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE == {
"template_key": "template_value"
}
assert PROWLER_CHECKS == {"checks_key": "checks_value"}
expected_prowler_compliance = compliance_data_dict
mock_get_prowler_provider_compliance.assert_any_call("aws")
mock_get_prowler_provider_compliance.assert_any_call("azure")
mock_generate_compliance_overview_template.assert_called_once_with(
expected_prowler_compliance
)
mock_load_prowler_checks.assert_called_once_with(expected_prowler_compliance)
@patch("api.compliance.get_prowler_provider_checks")
@patch("api.models.Provider.ProviderChoices")
def test_load_prowler_checks(
@@ -1,9 +1,21 @@
from datetime import datetime, timezone
import pytest
from allauth.socialaccount.models import SocialApp
from django.core.exceptions import ValidationError
from django.db import IntegrityError
from api.db_router import MainRouter
from api.models import (
ProviderComplianceScore,
Resource,
ResourceTag,
SAMLConfiguration,
SAMLDomainIndex,
StateChoices,
StatusChoices,
TenantComplianceSummary,
)
@pytest.mark.django_db
@@ -324,3 +336,159 @@ class TestSAMLConfigurationModel:
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "There is a problem with your metadata." in errors["metadata_xml"][0]
@pytest.mark.django_db
class TestProviderComplianceScoreModel:
def test_create_provider_compliance_score(self, providers_fixture, scans_fixture):
provider = providers_fixture[0]
scan = scans_fixture[0]
scan.completed_at = datetime.now(timezone.utc)
scan.save()
score = ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan.completed_at,
)
assert score.compliance_id == "aws_cis_2.0"
assert score.requirement_id == "req_1"
assert score.requirement_status == StatusChoices.PASS
def test_unique_constraint_per_provider_compliance_requirement(
self, providers_fixture, scans_fixture
):
provider = providers_fixture[0]
scan = scans_fixture[0]
scan.completed_at = datetime.now(timezone.utc)
scan.save()
ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan.completed_at,
)
with pytest.raises(IntegrityError):
ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan.completed_at,
)
def test_different_providers_same_requirement_allowed(
self, providers_fixture, scans_fixture
):
provider1, provider2, *_ = providers_fixture
scan1 = scans_fixture[0]
scan1.completed_at = datetime.now(timezone.utc)
scan1.save()
scan2 = scans_fixture[2]
scan2.state = StateChoices.COMPLETED
scan2.completed_at = datetime.now(timezone.utc)
scan2.save()
score1 = ProviderComplianceScore.objects.create(
tenant_id=provider1.tenant_id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
)
score2 = ProviderComplianceScore.objects.create(
tenant_id=provider2.tenant_id,
provider=provider2,
scan=scan2,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan2.completed_at,
)
assert score1.id != score2.id
assert score1.requirement_status != score2.requirement_status
@pytest.mark.django_db
class TestTenantComplianceSummaryModel:
def test_create_tenant_compliance_summary(self, tenants_fixture):
tenant = tenants_fixture[0]
summary = TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
assert summary.compliance_id == "aws_cis_2.0"
assert summary.requirements_passed == 5
assert summary.requirements_failed == 2
assert summary.requirements_manual == 1
assert summary.total_requirements == 8
assert summary.updated_at is not None
def test_unique_constraint_per_tenant_compliance(self, tenants_fixture):
tenant = tenants_fixture[0]
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
with pytest.raises(IntegrityError):
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=3,
requirements_failed=4,
requirements_manual=1,
total_requirements=8,
)
def test_different_tenants_same_compliance_allowed(self, tenants_fixture):
tenant1, tenant2, *_ = tenants_fixture
summary1 = TenantComplianceSummary.objects.create(
tenant_id=tenant1.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
summary2 = TenantComplianceSummary.objects.create(
tenant_id=tenant2.id,
compliance_id="aws_cis_2.0",
requirements_passed=3,
requirements_failed=4,
requirements_manual=1,
total_requirements=8,
)
assert summary1.id != summary2.id
assert summary1.requirements_passed != summary2.requirements_passed
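The unique-constraint tests above imply that code refreshing a tenant's summary must upsert on (tenant_id, compliance_id) rather than blindly insert. A minimal in-memory sketch of that upsert logic — the dict-backed store is a stand-in for what would presumably be Django's `update_or_create`, not the actual API code:

```python
# In-memory stand-in for TenantComplianceSummary.objects.update_or_create:
# one summary row per (tenant_id, compliance_id), refreshed in place.
def upsert_summary(store, tenant_id, compliance_id, passed, failed, manual):
    key = (tenant_id, compliance_id)  # mirrors the DB unique constraint
    store[key] = {
        "requirements_passed": passed,
        "requirements_failed": failed,
        "requirements_manual": manual,
        "total_requirements": passed + failed + manual,
    }
    return store[key]

store = {}
upsert_summary(store, "t1", "aws_cis_2.0", 5, 2, 1)
# A second call with the same key updates in place instead of
# violating uniqueness the way a second objects.create() would.
row = upsert_summary(store, "t1", "aws_cis_2.0", 3, 4, 1)
```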
+12
@@ -20,12 +20,14 @@ from prowler.providers.alibabacloud.alibabacloud_provider import AlibabacloudPro
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHubConnection
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.cloudflare.cloudflare_provider import CloudflareProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.github.github_provider import GithubProvider
from prowler.providers.iac.iac_provider import IacProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
from prowler.providers.mongodbatlas.mongodbatlas_provider import MongodbatlasProvider
from prowler.providers.openstack.openstack_provider import OpenstackProvider
from prowler.providers.oraclecloud.oraclecloud_provider import OraclecloudProvider
@@ -118,6 +120,8 @@ class TestReturnProwlerProvider:
(Provider.ProviderChoices.ORACLECLOUD.value, OraclecloudProvider),
(Provider.ProviderChoices.IAC.value, IacProvider),
(Provider.ProviderChoices.ALIBABACLOUD.value, AlibabacloudProvider),
(Provider.ProviderChoices.CLOUDFLARE.value, CloudflareProvider),
(Provider.ProviderChoices.OPENSTACK.value, OpenstackProvider),
],
)
def test_return_prowler_provider(self, provider_type, expected_provider):
@@ -221,6 +225,14 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.MONGODBATLAS.value,
{"atlas_organization_id": "provider_uid"},
),
(
Provider.ProviderChoices.CLOUDFLARE.value,
{"filter_accounts": ["provider_uid"]},
),
(
Provider.ProviderChoices.OPENSTACK.value,
{},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):
File diff suppressed because it is too large
+93 -12
@@ -1,4 +1,7 @@
from __future__ import annotations
from datetime import datetime, timezone
from typing import TYPE_CHECKING
from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.contrib.postgres.aggregates import ArrayAgg
@@ -11,19 +14,27 @@ from api.exceptions import InvitationTokenExpiredException
from api.models import Integration, Invitation, Processor, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from prowler.lib.outputs.jira.jira import Jira, JiraBasicAuthError
from prowler.providers.alibabacloud.alibabacloud_provider import AlibabacloudProvider
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.github.github_provider import GithubProvider
from prowler.providers.iac.iac_provider import IacProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
from prowler.providers.mongodbatlas.mongodbatlas_provider import MongodbatlasProvider
from prowler.providers.oraclecloud.oraclecloud_provider import OraclecloudProvider
if TYPE_CHECKING:
from prowler.providers.alibabacloud.alibabacloud_provider import (
AlibabacloudProvider,
)
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.cloudflare.cloudflare_provider import CloudflareProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.github.github_provider import GithubProvider
from prowler.providers.iac.iac_provider import IacProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
from prowler.providers.mongodbatlas.mongodbatlas_provider import (
MongodbatlasProvider,
)
from prowler.providers.openstack.openstack_provider import OpenstackProvider
from prowler.providers.oraclecloud.oraclecloud_provider import OraclecloudProvider
class CustomOAuth2Client(OAuth2Client):
@@ -68,12 +79,14 @@ def return_prowler_provider(
AlibabacloudProvider
| AwsProvider
| AzureProvider
| CloudflareProvider
| GcpProvider
| GithubProvider
| IacProvider
| KubernetesProvider
| M365Provider
| MongodbatlasProvider
| OpenstackProvider
| OraclecloudProvider
):
"""Return the Prowler provider class based on the given provider type.
@@ -82,32 +95,70 @@ def return_prowler_provider(
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AlibabacloudProvider | AwsProvider | AzureProvider | GcpProvider | GithubProvider | IacProvider | KubernetesProvider | M365Provider | MongodbatlasProvider | OraclecloudProvider: The corresponding provider class.
AlibabacloudProvider | AwsProvider | AzureProvider | CloudflareProvider | GcpProvider | GithubProvider | IacProvider | KubernetesProvider | M365Provider | MongodbatlasProvider | OpenstackProvider | OraclecloudProvider: The corresponding provider class.
Raises:
ValueError: If the provider type specified in `provider.provider` is not supported.
"""
match provider.provider:
case Provider.ProviderChoices.AWS.value:
from prowler.providers.aws.aws_provider import AwsProvider
prowler_provider = AwsProvider
case Provider.ProviderChoices.GCP.value:
from prowler.providers.gcp.gcp_provider import GcpProvider
prowler_provider = GcpProvider
case Provider.ProviderChoices.AZURE.value:
from prowler.providers.azure.azure_provider import AzureProvider
prowler_provider = AzureProvider
case Provider.ProviderChoices.KUBERNETES.value:
from prowler.providers.kubernetes.kubernetes_provider import (
KubernetesProvider,
)
prowler_provider = KubernetesProvider
case Provider.ProviderChoices.M365.value:
from prowler.providers.m365.m365_provider import M365Provider
prowler_provider = M365Provider
case Provider.ProviderChoices.GITHUB.value:
from prowler.providers.github.github_provider import GithubProvider
prowler_provider = GithubProvider
case Provider.ProviderChoices.MONGODBATLAS.value:
from prowler.providers.mongodbatlas.mongodbatlas_provider import (
MongodbatlasProvider,
)
prowler_provider = MongodbatlasProvider
case Provider.ProviderChoices.IAC.value:
from prowler.providers.iac.iac_provider import IacProvider
prowler_provider = IacProvider
case Provider.ProviderChoices.ORACLECLOUD.value:
from prowler.providers.oraclecloud.oraclecloud_provider import (
OraclecloudProvider,
)
prowler_provider = OraclecloudProvider
case Provider.ProviderChoices.ALIBABACLOUD.value:
from prowler.providers.alibabacloud.alibabacloud_provider import (
AlibabacloudProvider,
)
prowler_provider = AlibabacloudProvider
case Provider.ProviderChoices.CLOUDFLARE.value:
from prowler.providers.cloudflare.cloudflare_provider import (
CloudflareProvider,
)
prowler_provider = CloudflareProvider
case Provider.ProviderChoices.OPENSTACK.value:
from prowler.providers.openstack.openstack_provider import OpenstackProvider
prowler_provider = OpenstackProvider
case _:
raise ValueError(f"Provider type {provider.provider} not supported")
return prowler_provider
@@ -159,6 +210,17 @@ def get_prowler_provider_kwargs(
**prowler_provider_kwargs,
"atlas_organization_id": provider.uid,
}
elif provider.provider == Provider.ProviderChoices.CLOUDFLARE.value:
prowler_provider_kwargs = {
**prowler_provider_kwargs,
"filter_accounts": [provider.uid],
}
elif provider.provider == Provider.ProviderChoices.OPENSTACK.value:
# No extra kwargs needed: clouds_yaml_content and clouds_yaml_cloud from the
# secret are sufficient. Validating project_id (provider.uid) against the
# clouds.yaml is not feasible because not all auth methods include it and the
# Keystone API is unavailable on public clouds.
pass
if mutelist_processor:
mutelist_content = mutelist_processor.configuration.get("Mutelist", {})
@@ -176,12 +238,14 @@ def initialize_prowler_provider(
AlibabacloudProvider
| AwsProvider
| AzureProvider
| CloudflareProvider
| GcpProvider
| GithubProvider
| IacProvider
| KubernetesProvider
| M365Provider
| MongodbatlasProvider
| OpenstackProvider
| OraclecloudProvider
):
"""Initialize a Prowler provider instance based on the given provider type.
@@ -191,7 +255,7 @@ def initialize_prowler_provider(
mutelist_processor (Processor): The mutelist processor object containing the mutelist configuration.
Returns:
AlibabacloudProvider | AwsProvider | AzureProvider | GcpProvider | GithubProvider | IacProvider | KubernetesProvider | M365Provider | MongodbatlasProvider | OraclecloudProvider: An instance of the corresponding provider class
AlibabacloudProvider | AwsProvider | AzureProvider | CloudflareProvider | GcpProvider | GithubProvider | IacProvider | KubernetesProvider | M365Provider | MongodbatlasProvider | OpenstackProvider | OraclecloudProvider: An instance of the corresponding provider class
initialized with the provider's secrets.
"""
prowler_provider = return_prowler_provider(provider)
@@ -226,6 +290,13 @@ def prowler_provider_connection_test(provider: Provider) -> Connection:
if "access_token" in prowler_provider_kwargs:
iac_test_kwargs["access_token"] = prowler_provider_kwargs["access_token"]
return prowler_provider.test_connection(**iac_test_kwargs)
elif provider.provider == Provider.ProviderChoices.OPENSTACK.value:
openstack_kwargs = {
"clouds_yaml_content": prowler_provider_kwargs["clouds_yaml_content"],
"clouds_yaml_cloud": prowler_provider_kwargs["clouds_yaml_cloud"],
"raise_on_exception": False,
}
return prowler_provider.test_connection(**openstack_kwargs)
else:
return prowler_provider.test_connection(
**prowler_provider_kwargs,
@@ -393,11 +464,21 @@ def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
categories_set.update(categories_list)
categories = sorted(categories_set)
# Aggregate groups from findings
groups = list(
filtered_queryset.exclude(resource_groups__isnull=True)
.exclude(resource_groups__exact="")
.values_list("resource_groups", flat=True)
.distinct()
.order_by("resource_groups")
)
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
"categories": categories,
"groups": groups,
}
serializer = FindingMetadataSerializer(data=result)
@@ -346,6 +346,48 @@ from rest_framework_json_api import serializers
},
"required": ["role_arn", "access_key_id", "access_key_secret"],
},
{
"type": "object",
"title": "Cloudflare API Token",
"properties": {
"api_token": {
"type": "string",
"description": "Cloudflare API Token for authentication (recommended).",
},
},
"required": ["api_token"],
},
{
"type": "object",
"title": "Cloudflare API Key + Email",
"properties": {
"api_key": {
"type": "string",
"description": "Cloudflare Global API Key for authentication (legacy).",
},
"api_email": {
"type": "string",
"format": "email",
"description": "Email address associated with the Cloudflare account.",
},
},
"required": ["api_key", "api_email"],
},
{
"type": "object",
"title": "OpenStack clouds.yaml Credentials",
"properties": {
"clouds_yaml_content": {
"type": "string",
"description": "The full content of a clouds.yaml configuration file.",
},
"clouds_yaml_cloud": {
"type": "string",
"description": "The name of the cloud to use from the clouds.yaml file.",
},
},
"required": ["clouds_yaml_content", "clouds_yaml_cloud"],
},
]
}
)
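The three schemas above describe mutually exclusive credential shapes, and the serializer layer has to pick one from the submitted field names. A plain-Python sketch of that shape detection — field names match the schemas, but the function itself is illustrative, not the API's serializer dispatch:

```python
def classify_secret(secret: dict) -> str:
    """Return which credential schema a submitted secret payload matches."""
    if "api_token" in secret:
        return "cloudflare_api_token"
    if "api_key" in secret and "api_email" in secret:
        return "cloudflare_api_key_email"
    if "clouds_yaml_content" in secret and "clouds_yaml_cloud" in secret:
        return "openstack_clouds_yaml"
    raise ValueError(
        "Credentials must include 'api_token', both 'api_key' and 'api_email', "
        "or both 'clouds_yaml_content' and 'clouds_yaml_cloud'."
    )
```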
+216
@@ -21,6 +21,7 @@ from rest_framework_simplejwt.tokens import RefreshToken
from api.db_router import MainRouter
from api.exceptions import ConflictException
from api.models import (
AttackPathsScan,
Finding,
Integration,
IntegrationProviderRelationship,
@@ -1132,6 +1133,119 @@ class ScanComplianceReportSerializer(BaseSerializerV1):
fields = ["id", "name"]
class AttackPathsScanSerializer(RLSSerializer):
state = StateEnumSerializerField(read_only=True)
provider_alias = serializers.SerializerMethodField(read_only=True)
provider_type = serializers.SerializerMethodField(read_only=True)
provider_uid = serializers.SerializerMethodField(read_only=True)
class Meta:
model = AttackPathsScan
fields = [
"id",
"state",
"progress",
"provider",
"provider_alias",
"provider_type",
"provider_uid",
"scan",
"task",
"inserted_at",
"started_at",
"completed_at",
"duration",
]
included_serializers = {
"provider": "api.v1.serializers.ProviderIncludeSerializer",
"scan": "api.v1.serializers.ScanIncludeSerializer",
"task": "api.v1.serializers.TaskSerializer",
}
def get_provider_alias(self, obj):
provider = getattr(obj, "provider", None)
return provider.alias if provider else None
def get_provider_type(self, obj):
provider = getattr(obj, "provider", None)
return provider.provider if provider else None
def get_provider_uid(self, obj):
provider = getattr(obj, "provider", None)
return provider.uid if provider else None
class AttackPathsQueryAttributionSerializer(BaseSerializerV1):
text = serializers.CharField()
link = serializers.CharField()
class JSONAPIMeta:
resource_name = "attack-paths-query-attributions"
class AttackPathsQueryParameterSerializer(BaseSerializerV1):
name = serializers.CharField()
label = serializers.CharField()
data_type = serializers.CharField(default="string")
description = serializers.CharField(allow_null=True, required=False)
placeholder = serializers.CharField(allow_null=True, required=False)
class JSONAPIMeta:
resource_name = "attack-paths-query-parameters"
class AttackPathsQuerySerializer(BaseSerializerV1):
id = serializers.CharField()
name = serializers.CharField()
short_description = serializers.CharField()
description = serializers.CharField()
attribution = AttackPathsQueryAttributionSerializer(allow_null=True, required=False)
provider = serializers.CharField()
parameters = AttackPathsQueryParameterSerializer(many=True)
class JSONAPIMeta:
resource_name = "attack-paths-queries"
class AttackPathsQueryRunRequestSerializer(BaseSerializerV1):
id = serializers.CharField()
parameters = serializers.DictField(
child=serializers.JSONField(), allow_empty=True, required=False
)
class JSONAPIMeta:
resource_name = "attack-paths-query-run-requests"
class AttackPathsNodeSerializer(BaseSerializerV1):
id = serializers.CharField()
labels = serializers.ListField(child=serializers.CharField())
properties = serializers.DictField(child=serializers.JSONField())
class JSONAPIMeta:
resource_name = "attack-paths-query-result-nodes"
class AttackPathsRelationshipSerializer(BaseSerializerV1):
id = serializers.CharField()
label = serializers.CharField()
source = serializers.CharField()
target = serializers.CharField()
properties = serializers.DictField(child=serializers.JSONField())
class JSONAPIMeta:
resource_name = "attack-paths-query-result-relationships"
class AttackPathsQueryResultSerializer(BaseSerializerV1):
nodes = AttackPathsNodeSerializer(many=True)
relationships = AttackPathsRelationshipSerializer(many=True)
class JSONAPIMeta:
resource_name = "attack-paths-query-results"
class ResourceTagSerializer(RLSSerializer):
"""
Serializer for the ResourceTag model
@@ -1175,6 +1289,7 @@ class ResourceSerializer(RLSSerializer):
"metadata",
"details",
"partition",
"groups",
]
extra_kwargs = {
"id": {"read_only": True},
@@ -1183,6 +1298,7 @@ class ResourceSerializer(RLSSerializer):
"metadata": {"read_only": True},
"details": {"read_only": True},
"partition": {"read_only": True},
"groups": {"read_only": True},
}
included_serializers = {
@@ -1276,6 +1392,7 @@ class ResourceMetadataSerializer(BaseSerializerV1):
services = serializers.ListField(child=serializers.CharField(), allow_empty=True)
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
types = serializers.ListField(child=serializers.CharField(), allow_empty=True)
groups = serializers.ListField(child=serializers.CharField(), allow_empty=True)
# Temporarily disabled until we implement tag filtering in the UI
# tags = serializers.JSONField(help_text="Tags are described as key-value pairs.")
@@ -1302,6 +1419,7 @@ class FindingSerializer(RLSSerializer):
"check_id",
"check_metadata",
"categories",
"resource_groups",
"raw_result",
"inserted_at",
"updated_at",
@@ -1358,6 +1476,9 @@ class FindingMetadataSerializer(BaseSerializerV1):
child=serializers.CharField(), allow_empty=True
)
categories = serializers.ListField(child=serializers.CharField(), allow_empty=True)
groups = serializers.ListField(
child=serializers.CharField(), allow_empty=True, required=False, default=list
)
# Temporarily disabled until we implement tag filtering in the UI
# tags = serializers.JSONField(help_text="Tags are described as key-value pairs.")
@@ -1392,6 +1513,20 @@ class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
serializer = MongoDBAtlasProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.ALIBABACLOUD.value:
serializer = AlibabaCloudProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.CLOUDFLARE.value:
if "api_token" in secret:
serializer = CloudflareTokenProviderSecret(data=secret)
elif "api_key" in secret and "api_email" in secret:
serializer = CloudflareApiKeyProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{
"secret": "Cloudflare credentials must include either 'api_token' "
"or both 'api_key' and 'api_email'."
}
)
elif provider_type == Provider.ProviderChoices.OPENSTACK.value:
serializer = OpenStackCloudsYamlProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"provider": f"Provider type not supported {provider_type}"}
@@ -1543,6 +1678,29 @@ class OracleCloudProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
class CloudflareTokenProviderSecret(serializers.Serializer):
api_token = serializers.CharField()
class Meta:
resource_name = "provider-secrets"
class CloudflareApiKeyProviderSecret(serializers.Serializer):
api_key = serializers.CharField()
api_email = serializers.EmailField()
class Meta:
resource_name = "provider-secrets"
class OpenStackCloudsYamlProviderSecret(serializers.Serializer):
clouds_yaml_content = serializers.CharField()
clouds_yaml_cloud = serializers.CharField()
class Meta:
resource_name = "provider-secrets"
class AlibabaCloudProviderSecret(serializers.Serializer):
access_key_id = serializers.CharField()
access_key_secret = serializers.CharField()
@@ -2303,6 +2461,36 @@ class CategoryOverviewSerializer(BaseSerializerV1):
resource_name = "category-overviews"
class ResourceGroupOverviewSerializer(BaseSerializerV1):
"""Serializer for resource group overview aggregations."""
id = serializers.CharField(source="resource_group")
total_findings = serializers.IntegerField()
failed_findings = serializers.IntegerField()
new_failed_findings = serializers.IntegerField()
resources_count = serializers.IntegerField()
severity = serializers.JSONField(
help_text="Severity breakdown: {informational, low, medium, high, critical}"
)
class JSONAPIMeta:
resource_name = "resource-group-overviews"
class ComplianceWatchlistOverviewSerializer(BaseSerializerV1):
"""Serializer for compliance watchlist overview with FAIL-dominant aggregation."""
id = serializers.CharField(source="compliance_id")
compliance_id = serializers.CharField()
requirements_passed = serializers.IntegerField()
requirements_failed = serializers.IntegerField()
requirements_manual = serializers.IntegerField()
total_requirements = serializers.IntegerField()
class JSONAPIMeta:
resource_name = "compliance-watchlist-overviews"
class OverviewRegionSerializer(serializers.Serializer):
id = serializers.SerializerMethodField()
provider_type = serializers.CharField()
@@ -3834,3 +4022,31 @@ class ThreatScoreSnapshotSerializer(RLSSerializer):
if getattr(obj, "_aggregated", False):
return "n/a"
return str(obj.id)
# Resource Events Serializers
class ResourceEventSerializer(BaseSerializerV1):
"""Serializer for resource events (CloudTrail modification history).
NOTE: drf-spectacular auto-generates a fields[resource-events] sparse-fieldsets
parameter in the OpenAPI schema. This endpoint does not support sparse fieldsets.
"""
id = serializers.CharField(source="event_id")
event_time = serializers.DateTimeField()
event_name = serializers.CharField()
event_source = serializers.CharField()
actor = serializers.CharField()
actor_uid = serializers.CharField(allow_null=True, required=False)

actor_type = serializers.CharField(allow_null=True, required=False)
source_ip_address = serializers.CharField(allow_null=True, required=False)
user_agent = serializers.CharField(allow_null=True, required=False)
request_data = serializers.JSONField(allow_null=True, required=False)
response_data = serializers.JSONField(allow_null=True, required=False)
error_code = serializers.CharField(allow_null=True, required=False)
error_message = serializers.CharField(allow_null=True, required=False)
class Meta:
resource_name = "resource-events"
+4
@@ -4,6 +4,7 @@ from drf_spectacular.views import SpectacularRedocView
from rest_framework_nested import routers
from api.v1.views import (
AttackPathsScanViewSet,
ComplianceOverviewViewSet,
CustomSAMLLoginView,
CustomTokenObtainView,
@@ -53,6 +54,9 @@ router.register(r"tenants", TenantViewSet, basename="tenant")
router.register(r"providers", ProviderViewSet, basename="provider")
router.register(r"provider-groups", ProviderGroupViewSet, basename="providergroup")
router.register(r"scans", ScanViewSet, basename="scan")
router.register(
r"attack-paths-scans", AttackPathsScanViewSet, basename="attack-paths-scans"
)
router.register(r"tasks", TaskViewSet, basename="task")
router.register(r"resources", ResourceViewSet, basename="resource")
router.register(r"findings", FindingViewSet, basename="finding")
File diff suppressed because it is too large
+1
@@ -1,6 +1,7 @@
import warnings
from celery import Celery, Task
from config.env import env
# Suppress specific warnings from django-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
+1 -1
@@ -276,7 +276,7 @@ FINDINGS_MAX_DAYS_IN_RANGE = env.int("DJANGO_FINDINGS_MAX_DAYS_IN_RANGE", 7)
DJANGO_TMP_OUTPUT_DIRECTORY = env.str(
"DJANGO_TMP_OUTPUT_DIRECTORY", "/tmp/prowler_api_output"
)
DJANGO_FINDINGS_BATCH_SIZE = env.str("DJANGO_FINDINGS_BATCH_SIZE", 1000)
DJANGO_FINDINGS_BATCH_SIZE = env.int("DJANGO_FINDINGS_BATCH_SIZE", 1000)
DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = env.str("DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID", "")
+6
@@ -44,6 +44,12 @@ DATABASES = {
"HOST": env("POSTGRES_REPLICA_HOST", default=default_db_host),
"PORT": env("POSTGRES_REPLICA_PORT", default=default_db_port),
},
"neo4j": {
"HOST": env.str("NEO4J_HOST", "neo4j"),
"PORT": env.str("NEO4J_PORT", "7687"),
"USER": env.str("NEO4J_USER", "neo4j"),
"PASSWORD": env.str("NEO4J_PASSWORD", "neo4j_password"),
},
}
DATABASES["default"] = DATABASES["prowler_user"]
@@ -45,6 +45,12 @@ DATABASES = {
"HOST": env("POSTGRES_REPLICA_HOST", default=default_db_host),
"PORT": env("POSTGRES_REPLICA_PORT", default=default_db_port),
},
"neo4j": {
"HOST": env.str("NEO4J_HOST"),
"PORT": env.str("NEO4J_PORT"),
"USER": env.str("NEO4J_USER"),
"PASSWORD": env.str("NEO4J_PASSWORD"),
},
}
DATABASES["default"] = DATABASES["prowler_user"]
+4
@@ -18,6 +18,10 @@ DATABASES = {
DATABASE_ROUTERS = []
TESTING = True
# Override the page size for testing to a value only slightly above the current fixture count.
# PAGE_SIZE is explicitly set to 15 (a round number just above the fixture total) so that an excessively high value does not mask pagination bugs.
# If you add more providers to the fixtures, check that their total still stays below this value and update it if needed.
REST_FRAMEWORK["PAGE_SIZE"] = 15 # noqa: F405
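The comment above is about keeping the test page size close to the fixture count so pagination stays observable. A tiny sketch of the intended balance — `paginate` is a stand-in for DRF's paginator, and 11 items stands in for the current provider fixtures:

```python
def paginate(items, page_size, page=1):
    # Stand-in for DRF page-number pagination: slice out one page.
    start = (page - 1) * page_size
    return items[start:start + page_size]

fixtures = list(range(11))  # e.g. the providers in providers_fixture

# With PAGE_SIZE only slightly above the fixture count, page 1 holds
# everything and page 2 is empty; a far larger PAGE_SIZE would behave
# identically for any realistic fixture growth, masking pagination bugs.
page1 = paginate(fixtures, 15)
page2 = paginate(fixtures, 15, page=2)
```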
SECRETS_ENCRYPTION_KEY = "ZMiYVo7m4Fbe2eXXPyrwxdJss2WSalXSv3xHBcJkPl0="
# DRF Simple API Key settings
+330 -7
@@ -1,5 +1,6 @@
import logging
from datetime import datetime, timedelta, timezone
from types import SimpleNamespace
from unittest.mock import MagicMock, patch
import pytest
@@ -14,10 +15,16 @@ from rest_framework.test import APIClient
from tasks.jobs.backfill import (
backfill_resource_scan_summaries,
backfill_scan_category_summaries,
backfill_scan_resource_group_summaries,
)
from api.attack_paths import (
AttackPathsQueryDefinition,
AttackPathsQueryParameterDefinition,
)
from api.db_utils import rls_transaction
from api.models import (
AttackPathsScan,
AttackSurfaceOverview,
ComplianceOverview,
ComplianceRequirementOverview,
@@ -30,6 +37,7 @@ from api.models import (
MuteRule,
Processor,
Provider,
ProviderComplianceScore,
ProviderGroup,
ProviderSecret,
Resource,
@@ -40,11 +48,13 @@ from api.models import (
SAMLDomainIndex,
Scan,
ScanCategorySummary,
ScanGroupSummary,
ScanSummary,
StateChoices,
StatusChoices,
Task,
TenantAPIKey,
TenantComplianceSummary,
User,
UserRoleRelationship,
)
@@ -164,22 +174,20 @@ def create_test_user_rbac_no_roles(django_db_setup, django_db_blocker, tenants_f
@pytest.fixture(scope="function")
def create_test_user_rbac_limited(django_db_setup, django_db_blocker):
def create_test_user_rbac_limited(django_db_setup, django_db_blocker, tenants_fixture):
with django_db_blocker.unblock():
user = User.objects.create_user(
name="testing_limited",
email="rbac_limited@rbac.com",
password=TEST_PASSWORD,
)
tenant = Tenant.objects.create(
name="Tenant Test",
)
tenant = tenants_fixture[0]
Membership.objects.create(
user=user,
tenant=tenant,
role=Membership.RoleChoices.OWNER,
)
Role.objects.create(
role = Role.objects.create(
name="limited",
tenant_id=tenant.id,
manage_users=False,
@@ -192,7 +200,7 @@ def create_test_user_rbac_limited(django_db_setup, django_db_blocker):
)
UserRoleRelationship.objects.create(
user=user,
role=Role.objects.get(name="limited"),
role=role,
tenant_id=tenant.id,
)
return user
@@ -523,6 +531,18 @@ def providers_fixture(tenants_fixture):
alias="alibabacloud_testing",
tenant_id=tenant.id,
)
provider10 = Provider.objects.create(
provider="cloudflare",
uid="a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4",
alias="cloudflare_testing",
tenant_id=tenant.id,
)
provider11 = Provider.objects.create(
provider="openstack",
uid="a1b2c3d4-e5f6-7890-abcd-ef1234567890",
alias="openstack_testing",
tenant_id=tenant.id,
)
return (
provider1,
@@ -534,6 +554,8 @@ def providers_fixture(tenants_fixture):
provider7,
provider8,
provider9,
provider10,
provider11,
)
@@ -737,6 +759,7 @@ def resources_fixture(providers_fixture):
region="us-east-1",
service="ec2",
type="prowler-test",
groups=["compute"],
)
resource1.upsert_or_delete_tags(tags)
@@ -749,6 +772,7 @@ def resources_fixture(providers_fixture):
region="eu-west-1",
service="s3",
type="prowler-test",
groups=["storage"],
)
resource2.upsert_or_delete_tags(tags)
@@ -760,6 +784,7 @@ def resources_fixture(providers_fixture):
region="us-east-1",
service="ec2",
type="test",
groups=["compute"],
)
tags = [
@@ -1232,7 +1257,7 @@ def lighthouse_config_fixture(authenticated_client, tenants_fixture):
return LighthouseConfiguration.objects.create(
tenant_id=tenants_fixture[0].id,
name="OpenAI",
api_key_decoded="sk-test1234567890T3BlbkFJtest1234567890",
api_key_decoded="sk-fake-test-key-for-unit-testing-only",
model="gpt-4o",
temperature=0,
max_tokens=4000,
@@ -1381,11 +1406,13 @@ def latest_scan_finding_with_categories(
check_id="genai_iam_check",
check_metadata={"CheckId": "genai_iam_check"},
categories=["gen-ai", "iam"],
resource_groups="ai_ml",
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
backfill_scan_category_summaries(tenant_id, str(scan.id))
backfill_scan_resource_group_summaries(tenant_id, str(scan.id))
return finding
@@ -1588,6 +1615,103 @@ def mute_rules_fixture(tenants_fixture, create_test_user, findings_fixture):
return mute_rule1, mute_rule2
@pytest.fixture
def create_attack_paths_scan():
"""Factory fixture to create Attack Paths scans for tests."""
def _create(
provider,
*,
scan=None,
state=StateChoices.COMPLETED,
progress=0,
**extra_fields,
):
scan_instance = scan or Scan.objects.create(
name=extra_fields.pop("scan_name", "Attack Paths Supporting Scan"),
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=extra_fields.pop("scan_state", StateChoices.COMPLETED),
tenant_id=provider.tenant_id,
)
payload = {
"tenant_id": provider.tenant_id,
"provider": provider,
"scan": scan_instance,
"state": state,
"progress": progress,
}
payload.update(extra_fields)
return AttackPathsScan.objects.create(**payload)
return _create
@pytest.fixture
def attack_paths_query_definition_factory():
"""Factory fixture for building Attack Paths query definitions."""
def _create(**overrides):
cast_type = overrides.pop("cast_type", str)
parameters = overrides.pop(
"parameters",
[
AttackPathsQueryParameterDefinition(
name="limit",
label="Limit",
cast=cast_type,
)
],
)
definition_payload = {
"id": "aws-test",
"name": "Attack Paths Test Query",
"short_description": "Synthetic short description for tests.",
"description": "Synthetic Attack Paths definition for tests.",
"provider": "aws",
"cypher": "RETURN 1",
"parameters": parameters,
}
definition_payload.update(overrides)
return AttackPathsQueryDefinition(**definition_payload)
return _create
@pytest.fixture
def attack_paths_graph_stub_classes():
"""Provide lightweight graph element stubs for Attack Paths serialization tests."""
class AttackPathsNativeValue:
def __init__(self, value):
self._value = value
def to_native(self):
return self._value
class AttackPathsNode:
def __init__(self, element_id, labels, properties):
self.element_id = element_id
self.labels = labels
self._properties = properties
class AttackPathsRelationship:
def __init__(self, element_id, rel_type, start_node, end_node, properties):
self.element_id = element_id
self.type = rel_type
self.start_node = start_node
self.end_node = end_node
self._properties = properties
return SimpleNamespace(
NativeValue=AttackPathsNativeValue,
Node=AttackPathsNode,
Relationship=AttackPathsRelationship,
)
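The stubs above mirror just enough of the neo4j driver's graph types (`element_id`, `labels`, `_properties`, and driver-native values exposing `to_native()`) for serialization tests. A sketch of the conversion such tests would exercise, turning a stub node into the payload shape `AttackPathsNodeSerializer` expects — the conversion function here is illustrative, not the API's actual serializer code:

```python
class NativeValue:
    """Stub for a driver-native value that must be unwrapped."""
    def __init__(self, value):
        self._value = value

    def to_native(self):
        return self._value


class Node:
    """Stub mirroring the neo4j Node surface used by the tests."""
    def __init__(self, element_id, labels, properties):
        self.element_id = element_id
        self.labels = labels
        self._properties = properties


def node_to_payload(node):
    # Unwrap any driver-native values into plain JSON-safe Python objects.
    props = {
        key: value.to_native() if hasattr(value, "to_native") else value
        for key, value in node._properties.items()
    }
    return {"id": node.element_id, "labels": list(node.labels), "properties": props}


node = Node("4:abc:0", ["Account"], {"arn": NativeValue("arn:aws:iam::123:root")})
payload = node_to_payload(node)
```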
@pytest.fixture
def create_attack_surface_overview():
def _create(tenant, scan, attack_surface_type, total=10, failed=5, muted_failed=2):
@@ -1627,10 +1751,209 @@ def create_scan_category_summary():
return _create
@pytest.fixture(scope="function")
def findings_with_group(scans_fixture, resources_fixture):
scan = scans_fixture[0]
resource = resources_fixture[0]
finding = Finding.objects.create(
tenant_id=scan.tenant_id,
uid="finding_with_group_1",
scan=scan,
delta=None,
status=Status.FAIL,
status_extended="test status",
impact=Severity.critical,
impact_extended="test impact",
severity=Severity.critical,
raw_result={"status": Status.FAIL},
check_id="storage_check",
check_metadata={"CheckId": "storage_check"},
resource_groups="storage",
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(str(scan.tenant_id), str(scan.id))
return finding
@pytest.fixture(scope="function")
def findings_with_multiple_groups(scans_fixture, resources_fixture):
scan = scans_fixture[0]
resource1, resource2 = resources_fixture[:2]
finding1 = Finding.objects.create(
tenant_id=scan.tenant_id,
uid="finding_multi_grp_1",
scan=scan,
delta=None,
status=Status.FAIL,
status_extended="test status",
impact=Severity.critical,
impact_extended="test impact",
severity=Severity.critical,
raw_result={"status": Status.FAIL},
check_id="storage_check",
check_metadata={"CheckId": "storage_check"},
resource_groups="storage",
first_seen_at="2024-01-02T00:00:00Z",
)
finding1.add_resources([resource1])
finding2 = Finding.objects.create(
tenant_id=scan.tenant_id,
uid="finding_multi_grp_2",
scan=scan,
delta=None,
status=Status.FAIL,
status_extended="test status 2",
impact=Severity.high,
impact_extended="test impact 2",
severity=Severity.high,
raw_result={"status": Status.FAIL},
check_id="security_check",
check_metadata={"CheckId": "security_check"},
resource_groups="security",
first_seen_at="2024-01-02T00:00:00Z",
)
finding2.add_resources([resource2])
backfill_resource_scan_summaries(str(scan.tenant_id), str(scan.id))
return finding1, finding2
@pytest.fixture
def create_scan_resource_group_summary():
def _create(
tenant,
scan,
resource_group,
severity,
total_findings=10,
failed_findings=5,
new_failed_findings=2,
resources_count=3,
):
return ScanGroupSummary.objects.create(
tenant=tenant,
scan=scan,
resource_group=resource_group,
severity=severity,
total_findings=total_findings,
failed_findings=failed_findings,
new_failed_findings=new_failed_findings,
resources_count=resources_count,
)
return _create
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}
@pytest.fixture
def provider_compliance_scores_fixture(
tenants_fixture, providers_fixture, scans_fixture
):
"""Create ProviderComplianceScore entries for compliance watchlist tests."""
tenant = tenants_fixture[0]
provider1, provider2, *_ = providers_fixture
scan1, _, scan3 = scans_fixture
scan1.completed_at = datetime.now(timezone.utc) - timedelta(hours=1)
scan1.save()
scan3.state = StateChoices.COMPLETED
scan3.completed_at = datetime.now(timezone.utc)
scan3.save()
scores = [
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_2",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_3",
requirement_status=StatusChoices.MANUAL,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider2,
scan=scan3,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan3.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider2,
scan=scan3,
compliance_id="aws_cis_2.0",
requirement_id="req_2",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan3.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="gdpr_aws",
requirement_id="gdpr_req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
),
]
return scores
@pytest.fixture
def tenant_compliance_summary_fixture(tenants_fixture):
"""Create TenantComplianceSummary entries for compliance watchlist tests."""
tenant = tenants_fixture[0]
summaries = [
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=1,
requirements_failed=2,
requirements_manual=1,
total_requirements=4,
),
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="gdpr_aws",
requirements_passed=5,
requirements_failed=0,
requirements_manual=2,
total_requirements=7,
),
]
return summaries
def pytest_collection_modifyitems(items):
"""Ensure test_rbac.py is executed first."""
items.sort(key=lambda item: 0 if "test_rbac.py" in item.nodeid else 1)
@@ -7,6 +7,7 @@ from tasks.tasks import perform_scheduled_scan_task
from api.db_utils import rls_transaction
from api.exceptions import ConflictException
from api.models import Provider, Scan, StateChoices
from tasks.jobs.attack_paths import db_utils as attack_paths_db_utils
def schedule_provider_scan(provider_instance: Provider):
@@ -39,6 +40,12 @@ def schedule_provider_scan(provider_instance: Provider):
scheduled_at=datetime.now(timezone.utc),
)
attack_paths_db_utils.create_attack_paths_scan(
tenant_id=tenant_id,
scan_id=str(scheduled_scan.id),
provider_id=provider_id,
)
# Schedule the task
periodic_task_instance = PeriodicTask.objects.create(
interval=schedule,
@@ -0,0 +1,7 @@
from tasks.jobs.attack_paths.db_utils import can_provider_run_attack_paths_scan
from tasks.jobs.attack_paths.scan import run as attack_paths_scan
__all__ = [
"attack_paths_scan",
"can_provider_run_attack_paths_scan",
]
@@ -0,0 +1,253 @@
# Portions of this file are based on code from the Cartography project
# (https://github.com/cartography-cncf/cartography), which is licensed under the Apache 2.0 License.
from typing import Any
import aioboto3
import boto3
import neo4j
from cartography.config import Config as CartographyConfig
from cartography.intel import aws as cartography_aws
from celery.utils.log import get_task_logger
from api.models import (
AttackPathsScan as ProwlerAPIAttackPathsScan,
Provider as ProwlerAPIProvider,
)
from prowler.providers.common.provider import Provider as ProwlerSDKProvider
from tasks.jobs.attack_paths import db_utils, utils
logger = get_task_logger(__name__)
def start_aws_ingestion(
neo4j_session: neo4j.Session,
cartography_config: CartographyConfig,
prowler_api_provider: ProwlerAPIProvider,
prowler_sdk_provider: ProwlerSDKProvider,
attack_paths_scan: ProwlerAPIAttackPathsScan,
) -> dict[str, dict[str, str]]:
"""
Based on Cartography code, specifically `cartography/intel/aws/__init__.py`.
For the scan progress updates:
- The caller of this function (`tasks.jobs.attack_paths.scan.run`) has set it to 2.
- When control returns to the caller, it will be set to 95.
"""
# Initialize variables common to all jobs
common_job_parameters = {
"UPDATE_TAG": cartography_config.update_tag,
"permission_relationships_file": cartography_config.permission_relationships_file,
"aws_guardduty_severity_threshold": cartography_config.aws_guardduty_severity_threshold,
"aws_cloudtrail_management_events_lookback_hours": cartography_config.aws_cloudtrail_management_events_lookback_hours,
"experimental_aws_inspector_batch": cartography_config.experimental_aws_inspector_batch,
}
boto3_session = get_boto3_session(prowler_api_provider, prowler_sdk_provider)
regions: list[str] = list(prowler_sdk_provider._enabled_regions)
requested_syncs = list(cartography_aws.RESOURCE_FUNCTIONS.keys())
sync_args = cartography_aws._build_aws_sync_kwargs(
neo4j_session,
boto3_session,
regions,
prowler_api_provider.uid,
cartography_config.update_tag,
common_job_parameters,
)
# Starting with sync functions
logger.info(f"Syncing organizations for AWS account {prowler_api_provider.uid}")
cartography_aws.organizations.sync(
neo4j_session,
{prowler_api_provider.alias: prowler_api_provider.uid},
cartography_config.update_tag,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 3)
# Adding an extra field
common_job_parameters["AWS_ID"] = prowler_api_provider.uid
cartography_aws._autodiscover_accounts(
neo4j_session,
boto3_session,
prowler_api_provider.uid,
cartography_config.update_tag,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 4)
failed_syncs = sync_aws_account(
prowler_api_provider, requested_syncs, sync_args, attack_paths_scan
)
if "permission_relationships" in requested_syncs:
logger.info(
f"Syncing function permission_relationships for AWS account {prowler_api_provider.uid}"
)
cartography_aws.RESOURCE_FUNCTIONS["permission_relationships"](**sync_args)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 88)
if "resourcegroupstaggingapi" in requested_syncs:
logger.info(
f"Syncing function resourcegroupstaggingapi for AWS account {prowler_api_provider.uid}"
)
cartography_aws.RESOURCE_FUNCTIONS["resourcegroupstaggingapi"](**sync_args)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 89)
logger.info(
f"Syncing ec2_iaminstanceprofile scoped analysis for AWS account {prowler_api_provider.uid}"
)
cartography_aws.run_scoped_analysis_job(
"aws_ec2_iaminstanceprofile.json",
neo4j_session,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 90)
logger.info(
f"Syncing lambda_ecr analysis for AWS account {prowler_api_provider.uid}"
)
cartography_aws.run_analysis_job(
"aws_lambda_ecr.json",
neo4j_session,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 91)
logger.info(f"Syncing metadata for AWS account {prowler_api_provider.uid}")
cartography_aws.merge_module_sync_metadata(
neo4j_session,
group_type="AWSAccount",
group_id=prowler_api_provider.uid,
synced_type="AWSAccount",
update_tag=cartography_config.update_tag,
stat_handler=cartography_aws.stat_handler,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 92)
# Removing the added extra field
del common_job_parameters["AWS_ID"]
logger.info(f"Syncing cleanup_job for AWS account {prowler_api_provider.uid}")
cartography_aws.run_cleanup_job(
"aws_post_ingestion_principals_cleanup.json",
neo4j_session,
common_job_parameters,
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 93)
logger.info(f"Syncing analysis for AWS account {prowler_api_provider.uid}")
cartography_aws._perform_aws_analysis(
requested_syncs, neo4j_session, common_job_parameters
)
db_utils.update_attack_paths_scan_progress(attack_paths_scan, 94)
return failed_syncs
def get_boto3_session(
prowler_api_provider: ProwlerAPIProvider, prowler_sdk_provider: ProwlerSDKProvider
) -> boto3.Session:
boto3_session = prowler_sdk_provider.session.current_session
aws_accounts_from_session = cartography_aws.organizations.get_aws_account_default(
boto3_session
)
if not aws_accounts_from_session:
raise Exception(
"No valid AWS credentials could be found. No AWS accounts can be synced."
)
aws_account_id_from_session = list(aws_accounts_from_session.values())[0]
if prowler_api_provider.uid != aws_account_id_from_session:
raise Exception(
f"Provider {prowler_api_provider.uid} doesn't match AWS account {aws_account_id_from_session}."
)
if boto3_session.region_name is None:
global_region = prowler_sdk_provider.get_global_region()
boto3_session._session.set_config_variable("region", global_region)
return boto3_session
def get_aioboto3_session(boto3_session: boto3.Session) -> aioboto3.Session:
return aioboto3.Session(botocore_session=boto3_session._session)
def sync_aws_account(
prowler_api_provider: ProwlerAPIProvider,
requested_syncs: list[str],
sync_args: dict[str, Any],
attack_paths_scan: ProwlerAPIAttackPathsScan,
) -> dict[str, str]:
current_progress = 4 # `cartography_aws._autodiscover_accounts`
max_progress = (
87 # `cartography_aws.RESOURCE_FUNCTIONS["permission_relationships"]` - 1
)
n_steps = (
len(requested_syncs) - 2
) # Excluding `permission_relationships` and `resourcegroupstaggingapi`
progress_step = (max_progress - current_progress) / n_steps
failed_syncs = {}
for func_name in requested_syncs:
if func_name in cartography_aws.RESOURCE_FUNCTIONS:
logger.info(
f"Syncing function {func_name} for AWS account {prowler_api_provider.uid}"
)
# Updating progress, not really the right place but good enough
current_progress += progress_step
db_utils.update_attack_paths_scan_progress(
attack_paths_scan, int(current_progress)
)
try:
# `ecr:image_layers` uses `aioboto3_session` instead of `boto3_session`
if func_name == "ecr:image_layers":
cartography_aws.RESOURCE_FUNCTIONS[func_name](
neo4j_session=sync_args.get("neo4j_session"),
aioboto3_session=get_aioboto3_session(
sync_args.get("boto3_session")
),
regions=sync_args.get("regions"),
current_aws_account_id=sync_args.get("current_aws_account_id"),
update_tag=sync_args.get("update_tag"),
common_job_parameters=sync_args.get("common_job_parameters"),
)
# Skip permission relationships and tags for now because they rely on data already being in the graph
elif func_name in [
"permission_relationships",
"resourcegroupstaggingapi",
]:
continue
else:
cartography_aws.RESOURCE_FUNCTIONS[func_name](**sync_args)
except Exception as e:
exception_message = utils.stringify_exception(
e, f"Exception for AWS sync function: {func_name}"
)
failed_syncs[func_name] = exception_message
logger.warning(
f"Caught exception syncing function {func_name} from AWS account {prowler_api_provider.uid}. We "
"are continuing on to the next AWS sync function.",
)
continue
else:
raise ValueError(
f'AWS sync function "{func_name}" was specified but does not exist. Did you misspell it?'
)
return failed_syncs
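The progress arithmetic above (an even spread between the fixed checkpoints, excluding the two deferred sync functions) can be exercised in isolation. `progress_steps` is a hypothetical helper mirroring the constants in `sync_aws_account`, not part of the diff:

```python
# Standalone sketch of the progress arithmetic in sync_aws_account.
# start=4 is the progress after _autodiscover_accounts; end=87 is one
# below the permission_relationships checkpoint.
def progress_steps(n_syncs: int, start: int = 4, end: int = 87) -> list[int]:
    # permission_relationships and resourcegroupstaggingapi run later,
    # so they are excluded from the evenly-spread steps.
    n_steps = n_syncs - 2
    step = (end - start) / n_steps
    progress: float = start
    updates = []
    for _ in range(n_steps):
        progress += step
        updates.append(int(progress))
    return updates
```

With 85 requested syncs the step is exactly 1, so the reported progress climbs from 5 to 87 one unit per sync function.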
@@ -0,0 +1,88 @@
from dataclasses import dataclass
from typing import Callable
from config.env import env
from tasks.jobs.attack_paths import aws
# Batch size for Neo4j operations
BATCH_SIZE = env.int("ATTACK_PATHS_BATCH_SIZE", 1000)
# Neo4j internal labels (Prowler-specific, not provider-specific)
# - `ProwlerFinding`: Label for finding nodes created by Prowler and linked to cloud resources.
# - `ProviderResource`: Added to ALL synced nodes for provider isolation and drop/query ops.
# - `Internet`: Singleton node representing external internet access for exposed-resource queries.
PROWLER_FINDING_LABEL = "ProwlerFinding"
PROVIDER_RESOURCE_LABEL = "ProviderResource"
INTERNET_NODE_LABEL = "Internet"
@dataclass(frozen=True)
class ProviderConfig:
"""Configuration for a cloud provider's Attack Paths integration."""
name: str
root_node_label: str # e.g., "AWSAccount"
uid_field: str # e.g., "arn"
# Label for resources connected to the account node, enabling indexed finding lookups.
resource_label: str # e.g., "AWSResource"
ingestion_function: Callable
# Provider Configurations
# -----------------------
AWS_CONFIG = ProviderConfig(
name="aws",
root_node_label="AWSAccount",
uid_field="arn",
resource_label="AWSResource",
ingestion_function=aws.start_aws_ingestion,
)
PROVIDER_CONFIGS: dict[str, ProviderConfig] = {
"aws": AWS_CONFIG,
}
# Labels added by Prowler that should be filtered from API responses
# Derived from provider configs + common internal labels
INTERNAL_LABELS: list[str] = [
"Tenant",
PROVIDER_RESOURCE_LABEL,
# Add all provider-specific resource labels
*[config.resource_label for config in PROVIDER_CONFIGS.values()],
]
# Provider Config Accessors
# -------------------------
def is_provider_available(provider_type: str) -> bool:
"""Check if a provider type is available for Attack Paths scans."""
return provider_type in PROVIDER_CONFIGS
def get_cartography_ingestion_function(provider_type: str) -> Callable | None:
"""Get the Cartography ingestion function for a provider type."""
config = PROVIDER_CONFIGS.get(provider_type)
return config.ingestion_function if config else None
def get_root_node_label(provider_type: str) -> str:
"""Get the root node label for a provider type (e.g., AWSAccount)."""
config = PROVIDER_CONFIGS.get(provider_type)
return config.root_node_label if config else "UnknownProviderAccount"
def get_node_uid_field(provider_type: str) -> str:
"""Get the UID field for a provider type (e.g., arn for AWS)."""
config = PROVIDER_CONFIGS.get(provider_type)
return config.uid_field if config else "UnknownProviderUID"
def get_provider_resource_label(provider_type: str) -> str:
"""Get the resource label for a provider type (e.g., `AWSResource`)."""
config = PROVIDER_CONFIGS.get(provider_type)
return config.resource_label if config else "UnknownProviderResource"
@@ -0,0 +1,137 @@
from datetime import datetime, timezone
from typing import Any
from cartography.config import Config as CartographyConfig
from api.db_utils import rls_transaction
from api.models import (
AttackPathsScan as ProwlerAPIAttackPathsScan,
Provider as ProwlerAPIProvider,
StateChoices,
)
from tasks.jobs.attack_paths.config import is_provider_available
def can_provider_run_attack_paths_scan(tenant_id: str, provider_id: int) -> bool:
with rls_transaction(tenant_id):
prowler_api_provider = ProwlerAPIProvider.objects.get(id=provider_id)
return is_provider_available(prowler_api_provider.provider)
def create_attack_paths_scan(
tenant_id: str,
scan_id: str,
provider_id: int,
) -> ProwlerAPIAttackPathsScan | None:
if not can_provider_run_attack_paths_scan(tenant_id, provider_id):
return None
with rls_transaction(tenant_id):
attack_paths_scan = ProwlerAPIAttackPathsScan.objects.create(
tenant_id=tenant_id,
provider_id=provider_id,
scan_id=scan_id,
state=StateChoices.SCHEDULED,
started_at=datetime.now(tz=timezone.utc),
)
attack_paths_scan.save()
return attack_paths_scan
def retrieve_attack_paths_scan(
tenant_id: str,
scan_id: str,
) -> ProwlerAPIAttackPathsScan | None:
try:
with rls_transaction(tenant_id):
attack_paths_scan = ProwlerAPIAttackPathsScan.objects.get(
scan_id=scan_id,
)
return attack_paths_scan
except ProwlerAPIAttackPathsScan.DoesNotExist:
return None
def starting_attack_paths_scan(
attack_paths_scan: ProwlerAPIAttackPathsScan,
task_id: str,
cartography_config: CartographyConfig,
) -> None:
with rls_transaction(attack_paths_scan.tenant_id):
attack_paths_scan.task_id = task_id
attack_paths_scan.state = StateChoices.EXECUTING
attack_paths_scan.started_at = datetime.now(tz=timezone.utc)
attack_paths_scan.update_tag = cartography_config.update_tag
attack_paths_scan.save(
update_fields=[
"task_id",
"state",
"started_at",
"update_tag",
]
)
def finish_attack_paths_scan(
attack_paths_scan: ProwlerAPIAttackPathsScan,
state: StateChoices,
ingestion_exceptions: dict[str, Any],
) -> None:
with rls_transaction(attack_paths_scan.tenant_id):
now = datetime.now(tz=timezone.utc)
duration = (
int((now - attack_paths_scan.started_at).total_seconds())
if attack_paths_scan.started_at
else 0
)
attack_paths_scan.state = state
attack_paths_scan.progress = 100
attack_paths_scan.completed_at = now
attack_paths_scan.duration = duration
attack_paths_scan.ingestion_exceptions = ingestion_exceptions
attack_paths_scan.save(
update_fields=[
"state",
"progress",
"completed_at",
"duration",
"ingestion_exceptions",
]
)
def update_attack_paths_scan_progress(
attack_paths_scan: ProwlerAPIAttackPathsScan,
progress: int,
) -> None:
with rls_transaction(attack_paths_scan.tenant_id):
attack_paths_scan.progress = progress
attack_paths_scan.save(update_fields=["progress"])
def fail_attack_paths_scan(
tenant_id: str,
scan_id: str,
error: str,
) -> None:
"""
Mark the `AttackPathsScan` row as `FAILED` unless it's already `COMPLETED` or `FAILED`.
Used as a safety net when the Celery task fails outside the job's own error handling.
"""
attack_paths_scan = retrieve_attack_paths_scan(tenant_id, scan_id)
if attack_paths_scan and attack_paths_scan.state not in (
StateChoices.COMPLETED,
StateChoices.FAILED,
):
finish_attack_paths_scan(
attack_paths_scan,
StateChoices.FAILED,
{"global_error": error},
)
@@ -0,0 +1,355 @@
"""
Prowler findings ingestion into Neo4j graph.
This module handles:
- Adding resource labels to Cartography nodes for efficient lookups
- Loading Prowler findings into the graph
- Linking findings to resources
- Cleaning up stale findings
"""
from collections import defaultdict
from dataclasses import asdict, dataclass, fields
from typing import Any, Generator
from uuid import UUID
import neo4j
from cartography.config import Config as CartographyConfig
from celery.utils.log import get_task_logger
from api.db_router import READ_REPLICA_ALIAS
from api.db_utils import rls_transaction
from api.models import Finding as FindingModel
from api.models import Provider, ResourceFindingMapping
from prowler.config import config as ProwlerConfig
from tasks.jobs.attack_paths.config import (
BATCH_SIZE,
get_node_uid_field,
get_provider_resource_label,
get_root_node_label,
)
from tasks.jobs.attack_paths.indexes import IndexType, create_indexes
from tasks.jobs.attack_paths.queries import (
ADD_RESOURCE_LABEL_TEMPLATE,
CLEANUP_FINDINGS_TEMPLATE,
INSERT_FINDING_TEMPLATE,
render_cypher_template,
)
logger = get_task_logger(__name__)
# Type Definitions
# -----------------
# Maps dataclass field names to Django ORM query field names
_DB_FIELD_MAP: dict[str, str] = {
"check_title": "check_metadata__checktitle",
}
@dataclass(slots=True)
class Finding:
"""
Finding data for Neo4j ingestion.
Can be created from a Django .values() query result using from_db_record().
"""
id: str
uid: str
inserted_at: str
updated_at: str
first_seen_at: str
scan_id: str
delta: str
status: str
status_extended: str
severity: str
check_id: str
check_title: str
muted: bool
muted_reason: str | None
resource_uid: str | None = None
@classmethod
def get_db_query_fields(cls) -> tuple[str, ...]:
"""Get field names for Django .values() query."""
return tuple(
_DB_FIELD_MAP.get(f.name, f.name)
for f in fields(cls)
if f.name != "resource_uid"
)
@classmethod
def from_db_record(cls, record: dict[str, Any], resource_uid: str) -> "Finding":
"""Create a Finding from a Django .values() query result."""
return cls(
id=str(record["id"]),
uid=record["uid"],
inserted_at=record["inserted_at"],
updated_at=record["updated_at"],
first_seen_at=record["first_seen_at"],
scan_id=str(record["scan_id"]),
delta=record["delta"],
status=record["status"],
status_extended=record["status_extended"],
severity=record["severity"],
check_id=str(record["check_id"]),
check_title=record["check_metadata__checktitle"],
muted=record["muted"],
muted_reason=record["muted_reason"],
resource_uid=resource_uid,
)
def to_dict(self) -> dict[str, Any]:
"""Convert to dict for Neo4j ingestion."""
return asdict(self)
# Public API
# ----------
def create_findings_indexes(neo4j_session: neo4j.Session) -> None:
"""Create indexes for Prowler findings and resource lookups."""
create_indexes(neo4j_session, IndexType.FINDINGS)
def analysis(
neo4j_session: neo4j.Session,
prowler_api_provider: Provider,
scan_id: str,
config: CartographyConfig,
) -> None:
"""
Main entry point for Prowler findings analysis.
Adds resource labels, loads findings, and cleans up stale data.
"""
add_resource_label(
neo4j_session, prowler_api_provider.provider, str(prowler_api_provider.uid)
)
findings_data = stream_findings_with_resources(prowler_api_provider, scan_id)
load_findings(neo4j_session, findings_data, prowler_api_provider, config)
cleanup_findings(neo4j_session, prowler_api_provider, config)
def add_resource_label(
neo4j_session: neo4j.Session, provider_type: str, provider_uid: str
) -> int:
"""
Add a common resource label to all nodes connected to the provider account.
This enables index usage for resource lookups in the findings query,
since Cartography nodes don't have a common parent label.
Returns the total number of nodes labeled.
"""
query = render_cypher_template(
ADD_RESOURCE_LABEL_TEMPLATE,
{
"__ROOT_LABEL__": get_root_node_label(provider_type),
"__RESOURCE_LABEL__": get_provider_resource_label(provider_type),
},
)
logger.info(
f"Adding {get_provider_resource_label(provider_type)} label to all resources for {provider_uid}"
)
total_labeled = 0
labeled_count = 1
while labeled_count > 0:
result = neo4j_session.run(
query,
{"provider_uid": provider_uid, "batch_size": BATCH_SIZE},
)
labeled_count = result.single().get("labeled_count", 0)
total_labeled += labeled_count
if labeled_count > 0:
logger.info(
f"Labeled {total_labeled} nodes with {get_provider_resource_label(provider_type)}"
)
return total_labeled
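The batch loop above terminates when a write reports zero labeled nodes. The control flow can be isolated with a stubbed batch function (`label_in_batches` and `fake_batch` are hypothetical stand-ins for the Cypher round trip):

```python
def label_in_batches(run_batch, batch_size: int = 1000) -> int:
    """Repeat a batched write until it affects zero rows,
    mirroring the labeled_count loop in add_resource_label."""
    total = 0
    affected = 1
    while affected > 0:
        affected = run_batch(batch_size)
        total += affected
    return total

# Stand-in for the Cypher batch: 2500 unlabeled nodes, 1000 per pass.
remaining = [2500]

def fake_batch(size: int) -> int:
    done = min(size, remaining[0])
    remaining[0] -= done
    return done
```

Three passes label 1000, 1000, and 500 nodes; the fourth returns 0 and ends the loop.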
def load_findings(
neo4j_session: neo4j.Session,
findings_batches: Generator[list[Finding], None, None],
prowler_api_provider: Provider,
config: CartographyConfig,
) -> None:
"""Load Prowler findings into the graph, linking them to resources."""
query = render_cypher_template(
INSERT_FINDING_TEMPLATE,
{
"__ROOT_NODE_LABEL__": get_root_node_label(prowler_api_provider.provider),
"__NODE_UID_FIELD__": get_node_uid_field(prowler_api_provider.provider),
"__RESOURCE_LABEL__": get_provider_resource_label(
prowler_api_provider.provider
),
},
)
parameters = {
"provider_uid": str(prowler_api_provider.uid),
"last_updated": config.update_tag,
"prowler_version": ProwlerConfig.prowler_version,
}
batch_num = 0
total_records = 0
for batch in findings_batches:
batch_num += 1
batch_size = len(batch)
total_records += batch_size
parameters["findings_data"] = [f.to_dict() for f in batch]
logger.info(f"Loading findings batch {batch_num} ({batch_size} records)")
neo4j_session.run(query, parameters)
logger.info(f"Finished loading {total_records} records in {batch_num} batches")
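`render_cypher_template` and the template bodies live in `queries.py`, which this diff does not show. Assuming plain placeholder substitution (Neo4j labels and property names cannot be bound as query parameters), a minimal sketch is:

```python
# Hypothetical template body; the real one lives in queries.py (not shown).
INSERT_FINDING_TEMPLATE = (
    "MATCH (a:__ROOT_NODE_LABEL__ {__NODE_UID_FIELD__: $provider_uid}) "
    "RETURN a"
)

def render_cypher_template(template: str, replacements: dict[str, str]) -> str:
    # Labels can't be Cypher parameters, so they are substituted
    # into the query text before execution; values like $provider_uid
    # remain ordinary bound parameters.
    for placeholder, value in replacements.items():
        template = template.replace(placeholder, value)
    return template
```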
def cleanup_findings(
neo4j_session: neo4j.Session,
prowler_api_provider: Provider,
config: CartographyConfig,
) -> None:
"""Remove stale findings (classic Cartography behaviour)."""
parameters = {
"provider_uid": str(prowler_api_provider.uid),
"last_updated": config.update_tag,
"batch_size": BATCH_SIZE,
}
batch = 1
deleted_count = 1
while deleted_count > 0:
logger.info(f"Cleaning findings batch {batch}")
result = neo4j_session.run(CLEANUP_FINDINGS_TEMPLATE, parameters)
deleted_count = result.single().get("deleted_findings_count", 0)
batch += 1
# Findings Streaming (Generator-based)
# -------------------------------------
def stream_findings_with_resources(
prowler_api_provider: Provider,
scan_id: str,
) -> Generator[list[Finding], None, None]:
"""
Stream findings with their associated resources in batches.
Uses keyset pagination for efficient traversal of large datasets.
Memory efficient: yields one batch at a time, never holds all findings in memory.
"""
logger.info(
f"Starting findings stream for scan {scan_id} "
f"(tenant {prowler_api_provider.tenant_id}) with batch size {BATCH_SIZE}"
)
tenant_id = prowler_api_provider.tenant_id
for batch in _paginate_findings(tenant_id, scan_id):
enriched = _enrich_batch_with_resources(batch, tenant_id)
if enriched:
yield enriched
logger.info(f"Finished streaming findings for scan {scan_id}")
def _paginate_findings(
tenant_id: str,
scan_id: str,
) -> Generator[list[dict[str, Any]], None, None]:
"""
Paginate through findings using keyset pagination.
Each iteration fetches one batch within its own RLS transaction,
preventing long-held database connections.
"""
last_id = None
iteration = 0
while True:
iteration += 1
batch = _fetch_findings_batch(tenant_id, scan_id, last_id)
logger.info(f"Iteration #{iteration}: fetched {len(batch)} findings")
if not batch:
break
last_id = batch[-1]["id"]
yield batch
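The keyset loop can be demonstrated against an in-memory, id-sorted table. This is a sketch of the same pattern, not the ORM query:

```python
def paginate(rows: list[dict], batch_size: int = 3):
    """Keyset pagination over id-sorted rows, as in _paginate_findings:
    each pass filters on id > last-seen id instead of using OFFSET."""
    last_id = None
    while True:
        batch = [
            r for r in rows if last_id is None or r["id"] > last_id
        ][:batch_size]
        if not batch:
            break
        last_id = batch[-1]["id"]
        yield batch

rows = [{"id": i} for i in range(1, 8)]
batches = list(paginate(rows))
```

Unlike OFFSET pagination, each batch starts from an indexed comparison on the last seen id, so cost stays flat as the table grows.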
def _fetch_findings_batch(
tenant_id: str,
scan_id: str,
after_id: UUID | None,
) -> list[dict[str, Any]]:
"""
Fetch a single batch of findings from the database.
Uses read replica and RLS-scoped transaction.
"""
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
# Use all_objects to avoid the ActiveProviderManager's implicit JOIN
# through Scan -> Provider (to check is_deleted=False).
# The provider is already validated as active in this context.
qs = FindingModel.all_objects.filter(scan_id=scan_id).order_by("id")
if after_id is not None:
qs = qs.filter(id__gt=after_id)
return list(qs.values(*Finding.get_db_query_fields())[:BATCH_SIZE])
# Batch Enrichment
# -----------------
def _enrich_batch_with_resources(
findings_batch: list[dict[str, Any]],
tenant_id: str,
) -> list[Finding]:
"""
Enrich findings with their resource UIDs.
One finding with N resources becomes N output records.
Findings without resources are skipped.
"""
finding_ids = [f["id"] for f in findings_batch]
resource_map = _build_finding_resource_map(finding_ids, tenant_id)
return [
Finding.from_db_record(finding, resource_uid)
for finding in findings_batch
for resource_uid in resource_map.get(finding["id"], [])
]
def _build_finding_resource_map(
finding_ids: list[UUID], tenant_id: str
) -> dict[UUID, list[str]]:
"""Build mapping from finding_id to list of resource UIDs."""
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
resource_mappings = ResourceFindingMapping.objects.filter(
finding_id__in=finding_ids
).values_list("finding_id", "resource__uid")
result = defaultdict(list)
for finding_id, resource_uid in resource_mappings:
result[finding_id].append(resource_uid)
return result
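The fan-out in `_enrich_batch_with_resources` (one finding with N resources becomes N records; findings with no mapped resources are dropped) reduces to the following sketch, with made-up ids and ARNs:

```python
from collections import defaultdict

# (finding_id, resource_uid) pairs, as returned by the mapping query.
mappings = [("f1", "arn:a"), ("f1", "arn:b"), ("f2", "arn:c")]

resource_map: dict[str, list[str]] = defaultdict(list)
for finding_id, resource_uid in mappings:
    resource_map[finding_id].append(resource_uid)

findings = [{"id": "f1"}, {"id": "f2"}, {"id": "f3"}]  # f3: no resources
enriched = [
    (f["id"], uid)
    for f in findings
    for uid in resource_map.get(f["id"], [])
]
```

`f1` expands into two records, `f2` into one, and `f3` vanishes, matching the docstring above.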
@@ -0,0 +1,67 @@
from enum import Enum
import neo4j
from cartography.client.core.tx import run_write_query
from celery.utils.log import get_task_logger
from tasks.jobs.attack_paths.config import (
INTERNET_NODE_LABEL,
PROWLER_FINDING_LABEL,
PROVIDER_RESOURCE_LABEL,
)
logger = get_task_logger(__name__)
class IndexType(Enum):
"""Types of indexes that can be created."""
FINDINGS = "findings"
SYNC = "sync"
# Indexes for Prowler findings and resource lookups
FINDINGS_INDEX_STATEMENTS = [
# Resources indexes for quick Prowler Finding lookups
"CREATE INDEX aws_resource_arn IF NOT EXISTS FOR (n:AWSResource) ON (n.arn);",
"CREATE INDEX aws_resource_id IF NOT EXISTS FOR (n:AWSResource) ON (n.id);",
# Prowler Finding indexes
f"CREATE INDEX prowler_finding_id IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.id);",
f"CREATE INDEX prowler_finding_provider_uid IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.provider_uid);",
f"CREATE INDEX prowler_finding_lastupdated IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.lastupdated);",
f"CREATE INDEX prowler_finding_status IF NOT EXISTS FOR (n:{PROWLER_FINDING_LABEL}) ON (n.status);",
# Internet node index for MERGE lookups
f"CREATE INDEX internet_id IF NOT EXISTS FOR (n:{INTERNET_NODE_LABEL}) ON (n.id);",
]
# Indexes for provider resource sync operations
SYNC_INDEX_STATEMENTS = [
f"CREATE INDEX provider_element_id IF NOT EXISTS FOR (n:{PROVIDER_RESOURCE_LABEL}) ON (n.provider_element_id);",
f"CREATE INDEX provider_resource_provider_id IF NOT EXISTS FOR (n:{PROVIDER_RESOURCE_LABEL}) ON (n.provider_id);",
]
def create_indexes(neo4j_session: neo4j.Session, index_type: IndexType) -> None:
"""
Create indexes for the specified type.
Args:
`neo4j_session`: The Neo4j session to use
`index_type`: The type of indexes to create (FINDINGS or SYNC)
"""
if index_type == IndexType.FINDINGS:
logger.info("Creating indexes for Prowler Findings node types")
for statement in FINDINGS_INDEX_STATEMENTS:
run_write_query(neo4j_session, statement)
elif index_type == IndexType.SYNC:
logger.info("Ensuring ProviderResource indexes exist")
for statement in SYNC_INDEX_STATEMENTS:
neo4j_session.run(statement)
def create_all_indexes(neo4j_session: neo4j.Session) -> None:
"""Create all indexes (both findings and sync)."""
create_indexes(neo4j_session, IndexType.FINDINGS)
create_indexes(neo4j_session, IndexType.SYNC)
@@ -0,0 +1,67 @@
"""
Internet node enrichment for Attack Paths graph.
Creates a real Internet node and CAN_ACCESS relationships to
internet-exposed resources (EC2Instance, LoadBalancer, LoadBalancerV2)
in the temporary scan database before sync.
"""
import neo4j
from cartography.config import Config as CartographyConfig
from celery.utils.log import get_task_logger
from api.models import Provider
from prowler.config import config as ProwlerConfig
from tasks.jobs.attack_paths.config import get_root_node_label
from tasks.jobs.attack_paths.queries import (
CREATE_CAN_ACCESS_RELATIONSHIPS_TEMPLATE,
CREATE_INTERNET_NODE,
render_cypher_template,
)
logger = get_task_logger(__name__)
def analysis(
neo4j_session: neo4j.Session,
prowler_api_provider: Provider,
config: CartographyConfig,
) -> int:
"""
Create Internet node and CAN_ACCESS relationships to exposed resources.
Args:
neo4j_session: Active Neo4j session (temp database).
prowler_api_provider: The Prowler API provider instance.
config: Cartography configuration with update_tag.
Returns:
Number of CAN_ACCESS relationships created.
"""
provider_uid = str(prowler_api_provider.uid)
parameters = {
"provider_uid": provider_uid,
"last_updated": config.update_tag,
"prowler_version": ProwlerConfig.prowler_version,
}
logger.info(f"Creating Internet node for provider {provider_uid}")
neo4j_session.run(CREATE_INTERNET_NODE, parameters)
query = render_cypher_template(
CREATE_CAN_ACCESS_RELATIONSHIPS_TEMPLATE,
{"__ROOT_LABEL__": get_root_node_label(prowler_api_provider.provider)},
)
logger.info(
f"Creating CAN_ACCESS relationships from Internet to exposed resources for {provider_uid}"
)
result = neo4j_session.run(query, parameters)
relationships_merged = result.single().get("relationships_merged", 0)
logger.info(
f"Created {relationships_merged} CAN_ACCESS relationships for provider {provider_uid}"
)
return relationships_merged
