Compare commits

...

629 Commits

Author SHA1 Message Date
Pepe Fagoaga 1c5b522eb9 fix: typo 2025-05-12 09:04:40 +02:00
Pepe Fagoaga 7290cb3c00 chore: setup docker compose 2025-05-12 09:02:04 +02:00
Pepe Fagoaga d586c950b3 chore(e2e): Setup E2E testing for the UI 2025-05-12 08:56:53 +02:00
Hugo Pereira Brito 3cab52772c feat(m365): add categories for tenant type e3 and e5 (#7691) 2025-05-09 08:11:44 +02:00
Pepe Fagoaga 81aa035451 chore(changelog): prepare for v5.6.0 (#7688) 2025-05-08 16:49:56 +05:45
Pedro Martín 899f31f1ee fix(prowler_threatscore): fine-tune LevelOfRisk (#7667) 2025-05-08 15:23:31 +05:45
Pedro Martín e142a9e0f4 fix(dashboard): drop duplicates for rows (#7686) 2025-05-08 14:20:19 +05:45
Sergio Garcia ed26c2c42c fix(mutelist): properly handle wildcards and regex (#7685) 2025-05-08 12:10:55 +05:45
Pedro Martín 1017510a67 fix(dashboard): remove muted findings on compliance page (#7683) 2025-05-07 13:52:14 -04:00
Adrián Jesús Peña Rodríguez bfa16607b0 feat: add compliance to API report files and its endpoint (#7653)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-05-07 20:44:58 +05:45
Hugo Pereira Brito 4c874b68f5 fix(metadata): typo in defender_chat_report_policy_configured (#7678) 2025-05-07 09:30:49 -04:00
Sergio Garcia 9458e2bbc4 fix(inspector2): handle error when getting active findings (#7670)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-07 14:39:34 +02:00
Alejandro Bailo 2da7b926ed feat: add DeltaIndicator in new findings (#7676) 2025-05-07 17:59:56 +05:45
Daniel Barranquero 8d4f0ab90a feat(docs): add snapshots to M365 docs (#7673) 2025-05-07 12:19:10 +02:00
Hugo Pereira Brito 83aefc42c1 fix(powershell): remove platform-specific execution (#7675) 2025-05-07 11:44:13 +02:00
Alejandro Bailo a6489f39fd refactor(finding-detail): remove "Next Scan" field (#7674) 2025-05-07 14:39:35 +05:45
Pablo Lara 15c34952cf docs: update changelog (#7672) 2025-05-07 09:43:17 +02:00
Alejandro Bailo d002f2f719 feat: differentiate provider actions depending on their secrets (#7669) 2025-05-07 09:35:53 +02:00
Sergio Garcia 8530676419 chore(actions): run tests in dependabot updates (#7671) 2025-05-07 11:43:01 +05:45
Pedro Martín fe5a78e4d4 feat(aws): add static credentials for S3 and SH (#7322) 2025-05-06 17:55:53 +02:00
Pablo Lara d823b2b9de chore: tweaks for m365 provider (#7668) 2025-05-06 17:06:44 +02:00
Alejandro Bailo 3b17eb024c feat: add delta attribute in findings detail view and finding id to the URL (#7654) 2025-05-06 16:52:15 +02:00
Pablo Lara 87951a8371 feat(compliance): add a button to download the report in compliance card (#7665) 2025-05-06 14:44:02 +02:00
Andoni Alonso e5ca51d1e7 feat(teams): add new checks teams_security_reporting_enabled and defender_chat_report_policy_configured (#7614)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-05-06 11:30:00 +02:00
Daniel Barranquero e2fd3fe36e feat(defender): add new check defender_malware_policy_comprehensive_attachments_filter_applied (#7661) 2025-05-06 10:29:36 +02:00
Daniel Barranquero 6b0d73d7f9 feat(exchange): make exchange_user_mailbox_auditing_enabled check configurable (#7662)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-05 15:16:41 -04:00
Hugo Pereira Brito 7eec60f4d9 feat(m365): ensure all forms of mail forwarding are blocked or disabled (#7658)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-05 11:21:14 -04:00
Daniel Barranquero 9d788af932 docs(m365): add documentation for m365 (#7622)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-05 16:46:32 +02:00
Pedro Martín bbc0388d4d chore(changelog): update with latest PR (#7628)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-05 10:40:59 -04:00
Pedro Martín 887db29d96 feat(dashboard): support m365 provider (#7633)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-05 10:38:06 -04:00
dependabot[bot] ae74cab70a chore(deps): bump docker/build-push-action from 6.15.0 to 6.16.0 (#7650)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-05 09:58:38 -04:00
Prowler Bot e6d48c1fa4 chore(regions_update): Changes in regions for AWS services (#7657)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-05 09:56:16 -04:00
dependabot[bot] d5ab72a97c chore(deps): bump github/codeql-action from 3.28.15 to 3.28.16 (#7649)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-05 09:54:34 -04:00
dependabot[bot] 473631f83b chore(deps): bump trufflesecurity/trufflehog from 3.88.23 to 3.88.26 (#7648)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-05 09:54:16 -04:00
drewadwade a580b1ee04 fix(azure): CIS v2.0 4.4.1 Uses Wrong Check (#7656)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-05-05 15:53:55 +02:00
dependabot[bot] 844dd5ba95 chore(deps): bump actions/setup-python from 5.5.0 to 5.6.0 (#7647)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-05 09:53:40 -04:00
sumit-tft 44f8e4c488 feat(ui): Page size for datatables (#7634) 2025-05-05 15:42:06 +02:00
Alejandro Bailo 180eb61fee fix: error about page number persistence when filters change (#7655) 2025-05-05 12:23:04 +02:00
Andoni Alonso 9828824b73 chore(sentry): attach stacktrace to logging events (#7598)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-05-05 10:38:57 +02:00
Daniel Barranquero c938a25693 feat(exchange): add new check exchange_organization_modern_authentication_enabled (#7636)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-02 12:44:39 +02:00
Daniel Barranquero cccd69f27c feat(exchange): add new check exchange_roles_assignment_policy_addins_disabled (#7644)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-02 11:58:56 +02:00
Daniel Barranquero 3949806b5d feat(exchange): add new check exchange_mailbox_properties_auditing_e3_enabled (#7642)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-02 10:48:30 +02:00
Daniel Barranquero e7d249784d feat(exchange): add new check exchange_transport_config_smtp_auth_disabled (#7640)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-02 09:05:53 +02:00
Daniel Barranquero 25b1efe532 feat(exchange): add new check exchange_organization_mailtips_enabled (#7637)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-02 08:46:14 +02:00
Adrián Jesús Peña Rodríguez c289ddacf2 feat: add m365 to API (#7563)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
2025-04-30 17:09:47 +02:00
Hugo Pereira Brito 3fd9c51086 feat(m365): automate PowerShell modules installation (#7618)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-04-30 16:41:59 +02:00
Pedro Martín de01087246 fix(s3): add ContentType in upload_file (#7635)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-30 19:48:23 +05:45
Pablo Lara fe42bb47f7 fix: set correct default value for session duration (#7639) 2025-04-30 13:00:45 +02:00
Víctor Fernández Poyatos c56bd519bb test(performance): Add base framework for API performance tests (#7632) 2025-04-30 12:36:25 +02:00
Daniel Barranquero 79b29d9437 feat(exchange): add new check exchange_mailbox_policy_additional_storage_restricted (#7638)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-04-30 12:05:41 +02:00
Pedro Martín 82eecec277 feat(sharepoint): add new check related to OneDrive Sync (#7589)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-04-30 11:43:41 +02:00
Pedro Martín ceacd077d2 fix(typos): remove unneeded files (#7627) 2025-04-29 13:24:24 +05:45
Pepe Fagoaga 5a0fb13ece fix(run-sh): Use poetry's env (#7621) 2025-04-29 13:01:12 +05:45
Erlend Ekern 78439b4c0c chore(dockerfile): add image source as docker label (#7617) 2025-04-29 13:00:47 +05:45
Pedro Martín 06f94f884f feat(compliance): add new Prowler Threat Score Compliance Framework (#7603)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-28 09:57:52 +02:00
dependabot[bot] b8836c6404 chore(deps): bump @babel/runtime from 7.24.7 to 7.27.0 in /ui (#7502)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-28 08:49:33 +02:00
Andoni Alonso ac79b86810 feat(teams): add new check teams_meeting_presenters_restricted (#7613)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-25 14:34:05 -04:00
Andoni Alonso 793c2ae947 feat(teams): add new check teams_meeting_recording_disabled (#7607)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-25 12:35:54 -04:00
Andoni Alonso cdcc5c6e35 feat(teams): add new check teams_meeting_external_chat_disabled (#7605)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-25 11:30:38 -04:00
Andoni Alonso 51db81aa5c feat(teams): add new check teams_meeting_external_control_disabled (#7604)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-25 10:59:36 -04:00
Hugo Pereira Brito a51a185f49 fix(powershell): handle m365 provider execution and logging (#7602)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-25 10:44:25 -04:00
Hugo Pereira Brito 90453fd07e feat(teams): add new check teams_meeting_chat_anonymous_users_disabled (#7579)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-25 09:29:24 -04:00
Pablo Lara d740bf84c3 feat: add new M365 to the provider overview table (#7615) 2025-04-25 15:24:47 +02:00
Pedro Martín d13d2677ea fix(compliance): improve compliance and dashboard (#7596)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-24 13:28:18 -04:00
dependabot[bot] b076c98ba1 chore(deps): bump h11 from 0.14.0 to 0.16.0 (#7609)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-24 13:19:11 -04:00
Hugo Pereira Brito d071dea7f7 feat(teams): add new check teams_meeting_dial_in_lobby_bypass_disabled (#7571)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-24 13:05:52 -04:00
Hugo Pereira Brito d9782c7b8a feat(teams): add new check teams_meeting_external_lobby_bypass_disabled (#7568)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-24 12:13:42 -04:00
Pedro Martín f85450d0b5 fix(html): remove first empty line (#7606)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-24 11:23:24 -04:00
Pepe Fagoaga b129326ed6 chore(actions): Bump Prowler version on release (#7560) 2025-04-24 10:25:36 -04:00
Hugo Pereira Brito eaf0d06b63 chore(m365): add test_connection function (#7541)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-24 10:20:58 -04:00
Pedro Martín 87f3e0a138 fix(nhn): remove unneeded parameter (#7600) 2025-04-24 13:21:52 +02:00
Daniel Barranquero 8e3c856a14 feat(exchange): add new check exchange_external_email_tagging_enabled (#7580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-23 14:11:39 -04:00
Daniel Barranquero 12c2439196 feat(exchange): add new check exchange_transport_rules_whitelist_disabled (#7569)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-23 13:47:51 -04:00
Daniel Barranquero deb1e0ff34 feat(defender): Add new check defender_antispam_policy_inbound_no_allowed_domains (#7500)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-23 13:29:24 -04:00
Hugo Pereira Brito 808e8297b0 feat(teams): add new check teams_meeting_anonymous_user_start_disabled (#7567) 2025-04-23 10:31:17 -04:00
Hugo Pereira Brito 738ce56955 fix(docs): overview m365 auth (#7588) 2025-04-23 09:58:32 -04:00
Sergio Garcia 190fd0b93c fix(scan): handle cloud provider errors and ignore expected sentry noise (#7582) 2025-04-23 09:58:04 -04:00
Pablo Lara ca6df26918 chore: remove deprecated launch scan page from old 4-step workflow (#7592) 2025-04-23 15:13:05 +02:00
Pablo Lara bcfeb97e4a feat(m365): add the new provider m365 - UI part (#7591) 2025-04-23 14:23:33 +02:00
Hugo Pereira Brito 0234957907 feat(teams): add new check teams_meeting_anonymous_user_join_disabled (#7565)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 16:02:16 -04:00
Hugo Pereira Brito 8713b74204 feat(teams): add new check teams_external_users_cannot_start_conversations (#7562)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 14:36:54 -04:00
Hugo Pereira Brito cbaddad358 feat(teams): add new check teams_unmanaged_communication_disabled (#7561)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 13:25:30 -04:00
Hugo Pereira Brito 2379544425 feat(teams): add new check teams_external_domains_restricted (#7557)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 13:04:51 -04:00
Hugo Pereira Brito 29fefba62e fix(teams): teams_email_sending_to_channel_disabled docstrings (#7559) 2025-04-22 12:57:18 -04:00
Daniel Barranquero 098382117e feat(defender): add new check defender_antispam_connection_filter_policy_safe_list_off (#7494)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 12:52:34 -04:00
Daniel Barranquero d816d73174 feat(defender): add new check defender_antispam_connection_filter_policy_empty_ip_allowlist (#7492)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 12:28:18 -04:00
Matt Keeler 30eb78c293 fix(aws): use correct ports in ec2_instance_port_cifs_exposed_to_internet recommendation (#7574)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 12:24:12 -04:00
Daniel Barranquero a671b092ee feat(defender): add new check defender_domain_dkim_enabled (#7485)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 11:15:33 -04:00
Pepe Fagoaga 0edf199282 fix(actions): Include files within providers for SDK tests (#7577) 2025-04-22 10:28:43 -04:00
Andoni Alonso 2478555f0e fix(aws): update bucket naming validation to accept dots (#7545)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 10:06:14 -04:00
Daniel Barranquero b07080245d feat(defender): add new check defender_antispam_outbound_policy_configured (#7480)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 09:58:07 -04:00
Pepe Fagoaga 2ebf217bb0 fix(k8s): Remove command as it is not needed (#7570) 2025-04-22 09:33:40 -04:00
Prowler Bot bb527024d9 chore(regions_update): Changes in regions for AWS services (#7550)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-22 09:32:22 -04:00
Sergio Garcia e897978c3e fix(azure): handle new FlowLog properties (#7546) 2025-04-22 09:21:17 -04:00
Pepe Fagoaga 00f1c02532 chore(tests): Split by provider in the SDK (#7564) 2025-04-22 16:46:15 +05:45
César Arroba 348d1a2fda chore: pass labels on PR merge trigger (#7558) 2025-04-21 16:43:40 +02:00
César Arroba f1df8ba458 chore: revert pass labels (#7556) 2025-04-21 12:46:42 +02:00
César Arroba b5ea418933 chore: pass labels as json is required (#7555) 2025-04-21 12:10:18 +02:00
César Arroba 734fa5a4e6 chore: fix merged PR action, incorrect order on payload (#7554) 2025-04-21 12:03:14 +02:00
César Arroba 08f6d4b69b chore: pass labels (#7553) 2025-04-21 11:57:50 +02:00
César Arroba 29d3bb9f9a chore: fix json body (#7552) 2025-04-21 15:01:03 +05:45
César Arroba 4d217e642b chore: fix trigger (#7551) 2025-04-21 14:56:17 +05:45
César Arroba bd56e03991 chore(gha): trigger cloud pull-request when a PR is merged (#7212) 2025-04-21 14:54:22 +05:45
Felix Dreissig 0b6aa0ddcd fix(aws): remove SHA-1 from ACM insecure key algorithms (#7547) 2025-04-18 16:25:44 -04:00
Daniel Barranquero 4f3496194d feat(defender): add new check defender_antiphishing_policy_configured (#7453) 2025-04-18 12:42:19 -04:00
Daniel Barranquero d09a680aaa feat(defender): add new check defender_malware_policy_notifications_internal_users_malware_enabled (#7435)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-18 11:08:05 -04:00
Daniel Barranquero 56d7431d56 feat(defender): add service and new check defender_malware_policy_common_attachments_filter_enabled (#7425)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-17 13:33:43 -04:00
Daniel Barranquero abae5f1626 feat(exchange): add new check exchange_mailbox_audit_bypass_disabled (#7418)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-16 14:06:32 -04:00
Daniel Barranquero 7d0e94eecb feat(exchange): add service and new check exchange_organization_mailbox_auditing_enabled (#7408)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-16 12:19:06 -04:00
Hugo Pereira Brito 23b65c7728 feat(teams): add new check teams_email_sending_to_channel_disabled (#7533)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-16 11:13:55 -04:00
Sergio Garcia aa3182ebc5 feat(gcp): support CLOUDSDK_AUTH_ACCESS_TOKEN (#7495) 2025-04-16 10:35:04 -04:00
Sergio Garcia 32d27df0ba chore(regions): change interval to weekly (#7539) 2025-04-16 09:35:30 -04:00
Prowler Bot 6439f0a5f3 chore(regions_update): Changes in regions for AWS services (#7538)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-16 09:25:29 -04:00
Sergio Garcia 19476632ff chore(dependabot): change settings (#7536) 2025-04-16 11:26:57 +05:45
Pedro Martín d4c12e4632 fix(iam): change some logger.info values (#7526)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-15 13:25:37 -04:00
Hugo Pereira Brito 52bd48168f feat: adapt Microsoft365 provider to use PowerShell (#7331)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-15 13:24:09 -04:00
Bogdan A c0d935e232 docs(gcp): update required permissions for GCP (#7488) 2025-04-15 10:23:45 -04:00
Pepe Fagoaga 24dfd47329 fix(pypi): package name location in pyproject.toml while replicating for prowler-cloud (#7531) 2025-04-15 20:01:27 +05:45
dependabot[bot] fbae338689 chore(deps): bump python from 3.12.9-alpine3.20 to 3.12.10-alpine3.20 (#7520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:26:04 -04:00
dependabot[bot] 186fd88f8c chore(deps): bump codecov/codecov-action from 5.4.0 to 5.4.2 (#7522)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:25:44 -04:00
dependabot[bot] 14ff34c00a chore(deps): bump actions/setup-node from 4.3.0 to 4.4.0 (#7521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:25:23 -04:00
Prowler Bot a66fa394d3 chore(regions_update): Changes in regions for AWS services (#7527)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-15 09:20:20 -04:00
Pepe Fagoaga 931766fe08 chore(action): Remove cache in PyPI release (#7532) 2025-04-15 18:58:26 +05:45
Pepe Fagoaga c134914896 revert: fix(findings): increase uid max length to 600 (#7528) 2025-04-15 15:54:32 +05:45
Pepe Fagoaga 25dac080a5 chore(changelog): prepare for 5.5.1 (#7523) 2025-04-15 11:46:20 +05:45
Sergio Garcia 910d39eee4 chore(sdk): update changelog (#7512) 2025-04-15 11:19:50 +05:45
Pepe Fagoaga d604ae5569 fix(pyproject): Restore packages location (#7510) 2025-04-14 16:50:50 -04:00
Bogdan A 42f46b0fb1 feat(gcp): add check for unused Service Accounts (#7419)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-14 11:53:54 -04:00
Pepe Fagoaga abb5864224 chore(release): bump for 5.6.0 (#7503) 2025-04-14 11:50:46 -04:00
Prowler Bot 2e2a2bd89a chore(regions_update): Changes in regions for AWS services (#7491)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 10:29:19 -04:00
Sergio Garcia f8ee841921 fix(gcp): handle projects without ID (#7496) 2025-04-14 10:25:54 -04:00
Pedro Martín ceda8c76d2 feat(azure): add SOC2 compliance framework (#7489) 2025-04-14 10:16:20 -04:00
Pedro Martín afe0b7443f fix(defender): add default name to contacts (#7483) 2025-04-14 10:16:07 -04:00
Prowler Bot 9b773897d2 chore(regions_update): Changes in regions for AWS services (#7487)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 09:53:40 -04:00
Pedro Martín d6ec4c2c96 feat(sdk): add changelog file (#7499) 2025-04-14 09:22:50 -04:00
Prowler Bot 14ef169e99 chore(regions_update): Changes in regions for AWS services (#7497)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 09:22:21 -04:00
Pepe Fagoaga 22141f9706 fix(findings): increase uid max length to 600 (#7498)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-14 17:46:13 +05:45
Pablo Lara a5c6fee5b4 fix: update redirect URL for SSO (#7493) 2025-04-11 18:25:28 +05:45
Pablo Lara d3a5a5c0a1 fix: resolve social login issue in AuthForm on sign-up page (#7490) 2025-04-11 09:59:10 +02:00
dependabot[bot] 5d81869de4 chore(deps): bump tj-actions/changed-files from 46.0.4 to 46.0.5 (#7486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 22:31:33 -04:00
Pepe Fagoaga 73ebf95d89 chore(changelog): Prepare for v5.5.0 (#7484) 2025-04-09 20:50:56 +05:45
Sergio Garcia 9f4574f4ff fix: handle errors in AWS and Azure (#7482) 2025-04-09 20:19:38 +05:45
Pedro Martín cb239b20ab fix(aws): add default session_duration (#7479) 2025-04-09 19:19:17 +05:45
eeche 3ef79588b4 feat(NHN): add NHN cloud provider with 6 checks (#6870)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 09:13:24 -04:00
Prowler Bot 61000e386b chore(regions_update): Changes in regions for AWS services (#7478)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 09:11:29 -04:00
Pablo Lara 53cb57901f fix: fix TS type for session duration (#7481) 2025-04-09 13:44:53 +02:00
Pedro Martín 993ff4d78e feat(gcp): add SOC2 compliance framework (#7476) 2025-04-08 15:04:08 -04:00
Drew Kerrigan 8fb10fbbf7 fix(ui): Remove UTC from timestamps in app (#7474) 2025-04-08 17:43:44 +02:00
Pablo Lara 11e834f639 feat: update the NextJS version to the latest (#7473) 2025-04-08 17:40:39 +02:00
Prowler Bot 62bf2fbb9c chore(regions_update): Changes in regions for AWS services (#7467)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-08 10:21:42 -04:00
dependabot[bot] e57930d6c2 chore(deps): bump github/codeql-action from 3.28.13 to 3.28.15 (#7463)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-08 09:38:18 -04:00
Pepe Fagoaga e0c417a466 fix(action): Use poetry > v2 (#7472) 2025-04-08 18:34:24 +05:45
Sergio Garcia b55f8efed1 fix: handle errors in AWS, Azure, and GCP (#7456) 2025-04-08 18:05:43 +05:45
Pablo Lara 7cbc60d977 feat: add link with the service status using static icon (#7468) 2025-04-08 12:06:21 +02:00
Adrián Jesús Peña Rodríguez 5b7912b558 fix(provider): disable periodic task on views before deleting (#7466)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-08 15:35:22 +05:45
Pedro Martín 57fca3e54d fix(soc2_aws): update compliance and remove some requirements (#7452) 2025-04-07 15:47:19 -04:00
Pedro Martín e31c27b123 fix(gcp): handle logic for empty project names (#7436) 2025-04-07 11:51:15 -04:00
Sergio Garcia 74f1da818e fix(gcp): ignore redirect balancers and add regional ones (#7442) 2025-04-07 11:47:02 -04:00
Pedro Martín 910cfa601b fix(aws): add resource arn for transit gateways (#7447) 2025-04-07 11:46:53 -04:00
dependabot[bot] fe321c3f8a chore(deps): bump tj-actions/changed-files from 46.0.3 to 46.0.4 (#7443)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-07 09:11:54 -04:00
Prowler Bot 43de0d405f chore(regions_update): Changes in regions for AWS services (#7446)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-07 09:11:23 -04:00
dependabot[bot] ac6ed31c8e chore(deps): bump trufflesecurity/trufflehog from 3.88.22 to 3.88.23 (#7444)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-07 09:11:07 -04:00
Prowler Bot 9d47437de4 chore(regions_update): Changes in regions for AWS services (#7445)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-07 09:10:49 -04:00
Pablo Lara eb7a62ff77 refactor: extract common auth headers into reusable helper (#7439) 2025-04-07 08:16:55 +02:00
Pedro Martín 67bc16b46d fix(defender): add default resource name in contacts (#7438) 2025-04-04 09:35:11 -04:00
Sergio Garcia 8552a578a0 fix(aws): solve multiple errors (#7431) 2025-04-04 09:34:58 -04:00
Sergio Garcia a5d277e045 fix(docs): solve broken links (#7432) 2025-04-04 09:15:48 -04:00
Adrián Jesús Peña Rodríguez 6dbf2ac606 feat: add missing SDK fields to API findings and resources (#7318) 2025-04-04 14:57:49 +02:00
Prowler Bot b1569ac2f3 chore(regions_update): Changes in regions for AWS services (#7434)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-04 08:36:23 -04:00
dependabot[bot] 3d0145b522 chore(deps): bump trufflesecurity/trufflehog from 3.88.20 to 3.88.22 (#7433)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-04 08:34:51 -04:00
Pedro Martín 44174526d6 docs: add onboarding information step by step for each provider (#7362) 2025-04-04 13:00:43 +02:00
Pablo Lara 0fd395ea83 fix: correct fetch variable name from invitations to roles (#7437) 2025-04-04 12:08:57 +02:00
dependabot[bot] 5e9d4a80a1 chore(deps): bump msgraph-sdk from 1.18.0 to 1.23.0 (#7128)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-04 11:27:39 +02:00
Pedro Martín e4d234fe03 fix(azure): remove resource_name inside the Check_Report (#7420) 2025-04-03 11:35:02 -04:00
Prowler Bot 3202184718 chore(regions_update): Changes in regions for AWS services (#7424)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-03 09:39:00 -04:00
Sergio Garcia 41e576f4f1 fix(gcp): make logging sink check at project level (#7421) 2025-04-03 09:37:46 -04:00
Pepe Fagoaga d8dce07019 chore(deletion): Add environment variable for batch size (#7423)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-03 15:31:13 +05:45
Prowler Bot 2b0a3144c7 chore(regions_update): Changes in regions for AWS services (#7417)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-02 09:59:08 -04:00
dependabot[bot] 62fbce0b5e chore(deps): bump azure-identity from 1.19.0 to 1.21.0 (#7192)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-02 11:16:47 +02:00
Pedro Martín 5a59bb335c fix(resources): add the correct id and names for resources (#7410) 2025-04-01 20:30:37 +02:00
Sergio Garcia 2719991630 fix(report): log as error when Resource ID or Name do not exist (#7411) 2025-04-01 20:24:18 +02:00
Daniel Barranquero 6a3b8c4674 feat(entra): add new check entra_admin_users_cloud_only (#7286) 2025-04-01 19:14:15 +02:00
dependabot[bot] 191fbf0177 chore(deps): bump azure-mgmt-applicationinsights from 4.0.0 to 4.1.0 (#7161)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-01 14:55:37 +02:00
Víctor Fernández Poyatos 228dd2952a fix(scans): Handle duplicated scan tasks (#7401) 2025-04-01 11:55:14 +02:00
dependabot[bot] 97db38aa25 chore(deps): bump azure-mgmt-containerregistry from 10.3.0 to 12.0.0 (#7025)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-01 10:29:31 +02:00
Pedro Martín dc953a6e22 docs(python): add annotations about Python version (#7402) 2025-03-31 18:14:59 +02:00
Bogdan A 51e796a48d feat(gcp): add check for dormant (unused) SA keys (#7348)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-03-31 18:14:21 +02:00
Hugo Pereira Brito 024f1425df feat(entra): add new check entra_legacy_authentication_blocked (#7240) 2025-03-31 18:12:26 +02:00
Hugo Pereira Brito a7ed610da9 feat(entra): add new check entra_users_mfa_enabled (#7228) 2025-03-31 17:54:52 +02:00
Hugo Pereira Brito 7ba99f22cd feat(entra): add new check entra_admin_users_phishing_resistant_mfa_enabled (#7211)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-31 17:52:28 +02:00
Hugo Pereira Brito b8ce09ec34 fix(entra): check name and logic of entra_admin_users_have_mfa_enabled (#7230) 2025-03-31 17:50:51 +02:00
Daniel Barranquero c243110a49 feat(entra): add new check entra_policy_guest_invite_only_for_admin_roles (#7241)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-31 14:53:50 +02:00
Daniel Barranquero ee27636f32 fix(redshift): validation error for Cluster.multi_az (#7381) 2025-03-31 13:55:48 +02:00
dependabot[bot] f2f41c9c44 chore(deps): bump azure-mgmt-resource from 23.2.0 to 23.3.0 (#7054)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-31 13:29:49 +02:00
Daniel Barranquero 9312890e6a feat(entra): add new check entra_policy_guest_users_access_restrictions (#7234) 2025-03-31 12:45:26 +02:00
Daniel Barranquero 9578281b4f feat(entra): add new check entra_policy_restricts_user_consent_for_apps (#7225) 2025-03-31 12:32:51 +02:00
Víctor Fernández Poyatos 08690068fc feat(findings): Handle muted findings in API and UI (#7378)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-31 12:25:58 +02:00
Hugo Pereira Brito e06a33de84 feat(entra): add new check entra_managed_device_required_for_mfa_registration (#7203)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-31 12:24:47 +02:00
Prowler Bot 6a3db10fda chore(regions_update): Changes in regions for AWS services (#7395)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-31 10:18:53 +02:00
Andoni Alonso bbed445efa chore(sentry): ignore exception when aws service not available in a region (#7352) 2025-03-31 10:13:19 +02:00
dependabot[bot] 9d65fb0bf2 chore(deps): bump trufflesecurity/trufflehog from 3.88.18 to 3.88.20 (#7394)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-31 10:12:55 +02:00
Prowler Bot 34f03ca110 chore(regions_update): Changes in regions for AWS services (#7391)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-27 11:10:07 +01:00
Daniel Barranquero 87c038f0c2 fix(rds): handle Certificate rds-ca-2019 not found (#7383) 2025-03-27 11:09:33 +01:00
dependabot[bot] b3014f03b1 chore(deps): bump actions/setup-python from 5.4.0 to 5.5.0 (#7390)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-27 09:13:50 +01:00
Daniel Barranquero d39598c9fc fix(stepfunctions): Nonetype object has no attribute level (#7386) 2025-03-26 19:39:27 +01:00
Daniel Barranquero 5ea9106259 fix(fms): resource metadata could not be converted to dict (#7379) 2025-03-26 19:25:00 +01:00
Prowler Bot bcc0b59de1 chore(regions_update): Changes in regions for AWS services (#7382)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-26 12:52:35 +01:00
Daniel Barranquero 5d6ed640f0 fix(vm): handle Nonetype is not iterable for extensions (#7360)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 12:25:15 +01:00
Sergio Garcia dd1cc2d025 fix(s3): handle None S3 account public access block (#7350) 2025-03-25 11:39:19 +01:00
Andoni Alonso 52e5cc23e4 fix(storagegateway): describe smb/nfs share per region (#7374) 2025-03-25 10:35:37 +01:00
Pablo Lara 76a8e2be1f chore: tweak for button see findings (#7369) 2025-03-25 09:52:36 +01:00
Andoni Alonso d989425490 fix(vm): handle NoneType accessing security_profile (#7221) 2025-03-25 09:33:00 +01:00
Hugo Pereira Brito 1e324b7ed2 fix(network): handle Nonetype is not iterable for security groups (#7208) 2025-03-25 09:28:37 +01:00
Sergio Garcia e68aa62f94 fix(iam): handle none SAML Providers (#7359) 2025-03-25 09:24:32 +01:00
Daniel Barranquero 332b98a1ab fix(iam): handle UnboundLocalError cannot access local variable 'report' (#7361) 2025-03-25 09:22:35 +01:00
Pablo Lara dd05ef7974 chore(scans): properly enable link to findings when scan is completed (#7368) 2025-03-25 08:45:37 +01:00
dependabot[bot] d6862766d3 chore(deps): bump github/codeql-action from 3.28.12 to 3.28.13 (#7367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-25 12:43:02 +05:45
dependabot[bot] f52d005e2d chore(deps): bump tj-actions/changed-files from 46.0.1 to 46.0.3 (#7363)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-25 12:42:50 +05:45
Víctor Fernández Poyatos bf475234a5 build(api): Force django-allauth==65.4.1 (#7358) 2025-03-24 17:39:47 +01:00
Pablo Lara cd5985c056 docs: update readme (#7357) 2025-03-24 15:41:35 +01:00
Pablo Lara ce33dbf823 chore(findings): apply default filter to show failed findings (#7356) 2025-03-24 15:38:09 +01:00
Pablo Lara 0a9d0688a7 docs(changelog): document addition of download column in scans table … (#7354) 2025-03-24 15:28:13 +01:00
Pablo Lara 24784f2ce5 feat(scans): add download button column for completed scans in table (#7353) 2025-03-24 15:22:36 +01:00
Víctor Fernández Poyatos 7a1e611b88 ref(providers): Refactor provider deletion functions (#7349) 2025-03-24 14:39:14 +01:00
Pepe Fagoaga 3073150008 chore(next): Remove x-powered-by header (#7346) 2025-03-24 16:17:18 +05:45
Jonny 9923def4cb chore(awslambda): update obsolete lambda runtimes (#7330)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-24 11:21:01 +01:00
Víctor Fernández Poyatos a7f612303f feat(compliance): Add endpoint to retrieve compliance overviews metadata (#7333) 2025-03-24 10:34:43 +01:00
Pablo Lara 64c2a2217a docs: update changelog with Next.js security patch (#7339) (#7341) 2025-03-24 09:59:59 +01:00
Pablo Lara 4689d7a952 chore: upgrade Next.js to 14.2.25 to fix auth middleware vulnerability (#7339) 2025-03-24 09:48:41 +01:00
Prowler Bot 87cd143967 chore(regions_update): Changes in regions for AWS services (#7219)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:57 +01:00
Prowler Bot e37fd05d58 chore(regions_update): Changes in regions for AWS services (#7246)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:26 +01:00
Prowler Bot acc708bda5 chore(regions_update): Changes in regions for AWS services (#7250)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:08 +01:00
Prowler Bot c7460bb69c chore(regions_update): Changes in regions for AWS services (#7334)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-24 09:35:47 +01:00
Pepe Fagoaga 84b273dab9 fix(action): Use Poetry v2 (#7329) 2025-03-20 18:49:32 +01:00
Prowler Bot bb7ce2157e chore(regions_update): Changes in regions for AWS services (#7323)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-20 18:10:28 +05:45
Pepe Fagoaga 07b9e1d3a4 chore(api): Update CHANGELOG (#7325) 2025-03-20 15:22:00 +05:45
Pepe Fagoaga 96a879d761 fix(scan_id): Read the ID from the Scan object (#7324) 2025-03-20 15:18:31 +05:45
Pepe Fagoaga 283127c3f4 chore(aws-regions): remove backport to v3 (#7319) 2025-03-19 22:14:41 +05:45
dependabot[bot] beeee80a0b chore(deps): bump github/codeql-action from 3.28.11 to 3.28.12 (#7321)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-19 22:14:23 +05:45
Pepe Fagoaga 06b62826b4 chore(dependabot): disable for v3 (#7316) 2025-03-19 21:56:52 +05:45
Pedro Martín d0736af209 fix(gcp): make provider id mandatory in test_connection (#7296) 2025-03-19 18:33:49 +05:45
Pablo Lara 716c8c1a5f docs: add social login images and update documentation (#7314)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-19 17:16:37 +05:45
Pepe Fagoaga e6cdda1bd9 chore(dependabot): Disable for API and UI (#7300) 2025-03-19 14:46:11 +05:45
Pedro Martín 2747a633bc fix(k8s): remove typos from PCI 4.0 (#7294) 2025-03-19 09:31:40 +01:00
Pepe Fagoaga 74118f5cfe chore(social-login): improve copy when not enabled (#7295) 2025-03-19 13:36:22 +05:45
dependabot[bot] 598bdf28bb chore(deps): bump trufflesecurity/trufflehog from 3.88.17 to 3.88.18 (#7297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-19 12:31:52 +05:45
Pepe Fagoaga d75f681c87 chore(security): Configure HTTP Security Headers (#7220)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-18 17:49:12 +01:00
Pepe Fagoaga c7956ede6a chore(security): Add HTTP Security Headers (#7289) 2025-03-18 17:44:57 +01:00
Pablo Lara 64f5a69e84 fix: prevent SSR mismatch in OAuth URL generation (#7288) 2025-03-18 17:22:29 +01:00
dependabot[bot] bfb15c34b8 chore(deps): bump azure-mgmt-containerservice from 34.0.0 to 34.1.0 (#6989)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-18 17:14:25 +01:00
Pablo Lara 638b3ac0cd chore(providers): change wording when adding a new provider (#7280) 2025-03-18 21:50:56 +05:45
Daniel Barranquero 9d6147a037 fix(route53): solve false positive in route53_public_hosted_zones_cloudwatch_logging_enabled (#7201) 2025-03-18 16:54:49 +01:00
Pepe Fagoaga 802c786ac2 fix(test-connection): Handle provider without secret (#7283) 2025-03-18 21:34:36 +05:45
Pepe Fagoaga c8be8dbd9a fix(aws-regions): Use @prowler-bot as author (#7285) 2025-03-18 20:27:19 +05:45
Pablo Lara 7053b2bb37 chore: add env vars for social login (#7257)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-18 13:43:46 +01:00
Prowler Bot 447bf832cd chore(regions_update): Changes in regions for AWS services (#7281)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-18 17:35:44 +05:45
Pablo Lara 7c4571b55e feat(providers): add component to render a link to the documentation (#7282) 2025-03-18 12:05:38 +01:00
dependabot[bot] eb7c16aba5 chore(deps): bump azure-mgmt-storage from 21.2.1 to 22.1.1 (#7098)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-18 11:06:46 +01:00
Adrián Jesús Peña Rodríguez b09e83b171 chore: add api reference to download report section (#7243) 2025-03-18 14:54:13 +05:45
Hugo Pereira Brito bb149a30a7 fix(microsoft365): typo Microsoft365NotTenantIdButClientIdAndClienSecretError (#7244) 2025-03-17 21:16:47 +05:45
Pablo Lara d5be35af49 chore: Rename keyServer and extract to helper (#7256) 2025-03-17 21:11:27 +05:45
Pedro Martín f6aa56d92b fix(.env): remove spaces (#7255) 2025-03-17 20:48:55 +05:45
Pedro Martín 6a4df15c47 fix(prowler): change from prowler.py to prowler-cli.py (#7253) 2025-03-17 15:44:15 +01:00
Pablo Lara 72de5fdb1b chore: update git ignore file (#7254) 2025-03-17 14:53:58 +01:00
Pedro Martín a7f55d06af feat(jira): add basic auth method (#7233) 2025-03-17 14:31:35 +01:00
Pepe Fagoaga 97da78d4e7 fix(backport): Use container tagged version (#7252) 2025-03-17 18:19:43 +05:45
Pepe Fagoaga c4f6161c73 chore(security): Pin actions to the Full-Length Commit SHA (#7249) 2025-03-17 17:11:28 +05:45
Pablo Lara db7ffea24d chore: add env var for social login (#7251) 2025-03-17 10:23:01 +01:00
Prowler Bot 489b5abf82 chore(regions_update): Changes in regions for AWS services (#7237)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-17 13:47:56 +05:45
Prowler Bot 3a55c2ee07 chore(regions_update): Changes in regions for AWS services (#7245)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-17 12:34:44 +05:45
Pedro Martín 64d866271c fix(scan): add compliance info inside finding (#5649) 2025-03-17 12:18:00 +05:45
Pablo Lara 1ab2a80eab chore: improve UX when social login is not enabled (#7242) 2025-03-15 12:12:30 +01:00
Pablo Lara 89d4c521ba chore(social-login): disable social login buttons when env vars are not set (#7238) 2025-03-14 11:32:22 +01:00
Pablo Lara f2e19d377a chore(social-login): rename env.vars for social login (#7232) 2025-03-13 17:07:17 +01:00
Pablo Lara 2b7b887b87 chore: social auth is also in sign-up page (#7231) 2025-03-13 14:20:09 +01:00
Pablo Lara 44c70b5d01 chore: remove unused regions (#7229) 2025-03-13 13:57:16 +01:00
Pablo Lara 7514484c42 chore: change wording for launching a single scan (#7226) 2025-03-13 13:48:01 +01:00
Adrián Jesús Peña Rodríguez 9594c4c99f fix: add a handled response in case local files are missing (#7183) 2025-03-13 13:47:00 +01:00
Pablo Lara 56445c9753 chore: update changelog (#7223) 2025-03-13 13:39:26 +01:00
Adrián Jesús Peña Rodríguez 07419fd5e1 fix(exports): change the way to remove the local export files after s3 upload (#7172) 2025-03-13 13:37:17 +01:00
Pablo Lara 2e4dd12b41 feat(social-login): social login with Google is working (#7218)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-13 12:52:30 +01:00
Víctor Fernández Poyatos fed2046c49 fix(migrations): add through parameter to integration.providers (#7222) 2025-03-13 12:47:34 +01:00
Pepe Fagoaga db79db4786 fix(pyproject): Rename prowler.py (#7217) 2025-03-13 16:53:38 +05:45
Víctor Fernández Poyatos 6f027e3c57 feat(integrations): Added new endpoints to allow configuring integrations (#7167) 2025-03-12 19:57:55 +05:45
Daniel Barranquero bdb877009f feat(entra): add new check entra_admin_mfa_enabled_for_administrative_roles (#7181)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 14:47:29 +01:00
Sergio Garcia 6564ec1ff5 fix(cloudwatch): handle None metric alarms (#7205) 2025-03-12 14:44:36 +01:00
Pedro Martín 443dc067b3 feat(kubernetes): add ISO 27001 2022 compliance framework (#7204) 2025-03-12 14:24:53 +01:00
Hugo Pereira Brito 6221650c5f feat(entra): add new check entra_identity_protection_sign_in_risk_enabled (#7171)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 13:53:47 +01:00
Andoni Alonso 034d0fd1f4 refactor(check): add docstrings and improve report handling (#7113) 2025-03-12 13:38:42 +01:00
Hugo Pereira Brito e617ff0460 feat(docs): add microsoft365 configurable checks (#7200) 2025-03-12 12:52:35 +01:00
Hugo Pereira Brito 4b1ed607a7 feat(entra): add new check entra_identity_protection_user_risk_enabled (#7126)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 12:44:31 +01:00
Pepe Fagoaga 137365a670 chore(poetry): Upgrade to v2 (#7112) 2025-03-12 17:28:34 +05:45
Hugo Pereira Brito 1891a1b24f feat(entra): add new check entra_managed_device_required_for_authentication (#7115)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 11:34:14 +01:00
Daniel Barranquero e57e070866 feat(entra): add new check entra_password_hash_sync_enabled (#7061) 2025-03-12 11:31:49 +01:00
dependabot[bot] 66998cd1ad chore(deps): bump google-api-python-client from 2.162.0 to 2.163.0 (#7191)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-12 11:25:24 +01:00
Prowler Bot c0b1833446 chore(regions_update): Changes in regions for AWS services (#7197)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-12 11:25:06 +01:00
Pablo Lara 329a72c77c chore: update changelog (#7199) 2025-03-12 10:12:33 +01:00
Pablo Lara 2610ee9d0c feat(invitations): Disable editing for accepted invites (#7198) 2025-03-12 10:06:46 +01:00
Pablo Lara a13ca9034e chore(scans): rename type to trigger (#7196) 2025-03-12 09:47:02 +01:00
Pablo Lara 5d1abb3689 chore: auto refresh if the state is also available (#7195) 2025-03-12 09:33:24 +01:00
Pablo Lara e1d1c6d154 styles: tweaks styles (#7194) 2025-03-12 09:23:02 +01:00
Pablo Lara e18e0e7cd4 chore(launch-scan): update wording (#7193) 2025-03-12 08:20:15 +01:00
Pablo Lara eaf3d07a3f chore: update the changelog (#7190) 2025-03-12 08:15:28 +01:00
Hugo Pereira Brito c88ae32b7f feat(microsoft365): add new check entra_admin_users_sign_in_frequency_enabled (#7020)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-11 19:18:33 +01:00
Pablo Lara 605613e220 feat(scans): allow running a scan once (#7188) 2025-03-11 17:47:47 +01:00
Sergio Garcia d2772000ec chore(sentry): ignore new exceptions in Sentry (#7187) 2025-03-11 17:46:14 +01:00
Adrián Jesús Peña Rodríguez 42939a79f5 docs: add users, invitations and RBAC (#7109) 2025-03-11 21:59:04 +05:45
Daniel Barranquero ed17931117 feat(entra): add new check entra_dynamic_group_for_guests_created (#7168)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-11 16:21:17 +01:00
Daniel Barranquero 66df5f7a1c chore(providers): enhance Remediation.Code.CLI field from check's metadata (#7094)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-03-11 16:15:58 +01:00
Pedro Martín fc6e6696e5 feat(gcp): add ISO 27001 2022 compliance framework (#7185) 2025-03-11 15:16:40 +01:00
Sergio Garcia 465748c8a1 chore(sentry): ignore expected errors in GCP API (#7184) 2025-03-11 14:32:37 +01:00
Pedro Martín e59cd71bbf fix(azure): add remaining checks for reqA.5.25 (#7182) 2025-03-11 14:16:10 +01:00
Daniel Barranquero 8a76fea310 feat(entra): add new check entra_admin_consent_workflow_enabled (#7110) 2025-03-11 13:18:17 +01:00
Adrián Jesús Peña Rodríguez 0e46be54ec docs: add generate_output documentation (#7122)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-11 17:23:32 +05:45
Pedro Martín dc81813fdf fix(ens): remove and change duplicated ids (#7165) 2025-03-11 11:35:31 +01:00
Hugo Pereira Brito eaa0df16bb refactor(microsoft365): resource metadata assertions (#7169) 2025-03-11 11:30:37 +01:00
Pedro Martín c23e911028 feat(azure): add ISO 27001 2022 compliance framework (#7170) 2025-03-11 11:29:40 +01:00
dependabot[bot] 06b96a1007 chore(deps): bump tzlocal from 5.3 to 5.3.1 (#7162)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-11 11:17:50 +01:00
Prowler Bot fa545c591f chore(regions_update): Changes in regions for AWS services (#7177)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-11 11:17:27 +01:00
dependabot[bot] e828b780c7 chore(deps): bump trufflesecurity/trufflehog from 3.88.15 to 3.88.16 (#7174)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-11 11:16:57 +01:00
Harshit Raj Singh eca8c5cabd feat(aws): AWS Found Sec Best Practices & PCI DSS v3.2.1 upgrade (#7017)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-03-11 09:31:16 +01:00
Pablo Lara b7bce6008f fix: tweak z-index for custom inputs (#7166) 2025-03-10 11:55:04 +01:00
Pablo Lara 2fdf89883d feat(scans): improve scan launch provider selection (#7164) 2025-03-10 10:05:33 +01:00
dependabot[bot] 6c5d4bbaaa chore(deps): bump django from 5.1.5 to 5.1.7 in /api (#7145)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-10 09:50:09 +01:00
Gary Mclean cb2f926d4f fix(azure): correct check title for SQL Server Unrestricted (#7123) 2025-03-07 18:24:24 +01:00
ryan-stavella 12c01b437e fix(metadata): typo in ec2_securitygroup_allow_wide_open_public_ipv4 (#7116) 2025-03-07 15:28:08 +01:00
dependabot[bot] 3253a58942 chore(deps-dev): bump mock from 5.1.0 to 5.2.0 (#7099)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 15:01:43 +01:00
Kay Agahd 199f7f14ea fix(doc): event_time has been changed to time_dt but was not documented (#7136) 2025-03-07 14:36:51 +01:00
Andoni Alonso d42406d765 fix(metadata): match type with check results (#7111) 2025-03-07 14:34:07 +01:00
Kay Agahd 2276ffb1f6 fix(aws): ecs_task_definitions_no_environment_secrets.metadata.json (#7135) 2025-03-07 14:31:03 +01:00
dependabot[bot] 218fb3afb0 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 (#7151)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 14:27:29 +01:00
Prowler Bot a9fb890979 chore(regions_update): Changes in regions for AWS services (#7108)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 14:06:28 +01:00
Prowler Bot 54ebf5b455 chore(regions_update): Changes in regions for AWS services (#7119)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 14:04:48 +01:00
dependabot[bot] c9a0475aa8 chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.3.0 to 1.4.1 (#7129)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 14:03:44 +01:00
Prowler Bot 5567d9f88c chore(regions_update): Changes in regions for AWS services (#7131)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 13:19:08 +01:00
dependabot[bot] 56f3e661ae chore(deps): bump trufflesecurity/trufflehog from 3.88.14 to 3.88.15 (#7127)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 13:17:45 +01:00
César Arroba 1aa4479a10 chore: increase release to 5.5.0 (#7143) 2025-03-07 13:16:24 +01:00
Prowler Bot 7b625d0a91 chore(regions_update): Changes in regions for AWS services (#7146)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 13:15:51 +01:00
Pablo Lara fd0529529d chore: update changelog (#7149) 2025-03-07 11:47:23 +01:00
Pablo Lara af43191954 fix: tweaks for compliance cards (#7147) 2025-03-07 11:32:58 +01:00
Pablo Lara 2ce2ca7c91 feat: add changelog (#7141) 2025-03-06 16:46:55 +01:00
Víctor Fernández Poyatos a0fc3db665 fix(overviews): manage overview exceptions and use batch_size with bulk (#7140) 2025-03-06 15:35:29 +01:00
César Arroba feb458027f chore(ui-gha): delete double quotes on prowler version (#7139) 2025-03-06 19:48:53 +05:45
Pablo Lara e5a5b7af5c fix(groups): display uid if alias is missing (#7137) 2025-03-06 14:37:36 +01:00
Pablo Lara ad456ae2fe fix(credentials): adjust helper links to fit width (#7133) 2025-03-06 11:42:26 +01:00
Pepe Fagoaga 690cb51f6c revert(findings): change uid from varchar to text (#7132) 2025-03-06 16:24:35 +05:45
dependabot[bot] 14aaa2f376 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 in /api (#7130)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-06 09:39:24 +01:00
César Arroba 6e47ca2c41 chore(ui-gha): add version prefix (#7125) 2025-03-05 21:13:24 +05:45
Víctor Fernández Poyatos 0d99d2be9b fix(reports): Fix task kwargs and result (#7124) 2025-03-05 21:10:44 +05:45
César Arroba c322ef00e7 chore(ui): add prowler version on build (#7120) 2025-03-05 20:46:16 +05:45
Pablo Lara 3513421225 feat(compliance): new compliance selector (#7118) 2025-03-05 15:12:10 +01:00
Víctor Fernández Poyatos b0e6bfbefe chore(api): Update changelog (#7090) 2025-03-04 17:44:34 +01:00
dependabot[bot] f7a918730e chore(deps-dev): bump pytest from 8.3.4 to 8.3.5 (#7097)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-04 09:16:05 +01:00
Pablo Lara cef33319c5 chore(ui): update label from 'Select a scan job' to 'Select a cloud p… (#7107) 2025-03-04 09:11:39 +01:00
Pablo Lara 2036a59210 fix(roles): show the correct error message (#7089) 2025-03-03 15:46:02 +01:00
Pablo Lara e5eccb6227 fix: bug with create role and unlimited visibility checkbox (#7088) 2025-03-03 15:45:39 +01:00
Sergio Garcia 48c2c8567c feat(aws): add fixers for threat detection checks (#7085) 2025-03-03 14:20:23 +01:00
Pablo Lara bbeef0299f feat(version): add prowler version to the sidebar (#7086) 2025-03-03 13:40:09 +01:00
Pablo Lara bec5584d63 chore: Update the latest table findings with the most recent changes (#7084) 2025-03-03 13:16:30 +01:00
Pablo Lara bdc759d34c feat(sidebar): sidebar with new functionalities (#7018) 2025-03-03 12:30:28 +01:00
Prowler Bot 8db442d8ba chore(regions_update): Changes in regions for AWS services (#7067)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 09:29:48 +01:00
Sergio Garcia 9e7a0d4175 fix(threat detection): run single threat detection check (#7065) 2025-02-28 13:51:07 +01:00
Pepe Fagoaga 9c33b3f5a9 refactor(stats): Use Finding instead of Check_Report (#7053)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-02-28 10:54:48 +01:00
Pepe Fagoaga 7e7e2c87dc chore(examples): Scan AWS (#7064) 2025-02-28 15:25:10 +05:45
Sergio Garcia 2f741f35a8 chore(gcp): enhance GCP APIs logic (#7046) 2025-02-28 14:55:43 +05:45
dependabot[bot] c411466df7 chore(deps): bump trufflesecurity/trufflehog from 3.88.13 to 3.88.14 (#7063)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-28 09:10:47 +01:00
Daniel Barranquero 9679939307 feat(m365): add sharepoint service with 4 checks (#7057)
Co-authored-by: MarioRgzLpz <mariorgzlpz1809@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-27 18:15:17 +01:00
Pedro Martín 8539423b22 feat(docs): add info related with sts assume role and regions (#7062) 2025-02-27 17:40:31 +01:00
Daniel Barranquero 81edafdf09 fix(azure): handle account not supporting Blob (#7060) 2025-02-27 13:20:56 +01:00
Sergio Garcia e0a262882a fix(ecs): ensure unique finding id in ECS checks (#7059) 2025-02-27 13:02:22 +01:00
Prowler Bot 89237ab99e chore(regions_update): Changes in regions for AWS services (#7056)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-27 11:00:13 +01:00
Hugo Pereira Brito 0f414e451e feat(microsoft365): add new check entra_policy_ensure_default_user_cannot_create_tenants (#6918)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-27 10:31:02 +01:00
Pablo Lara 1180522725 feat(exports): download scan exports (#7006) 2025-02-27 14:08:12 +05:45
Pepe Fagoaga 81c7ebf123 fix(env): UI version must be stable (#7055) 2025-02-27 13:32:53 +05:45
Víctor Fernández Poyatos 258f05e6f4 fix(migrations): Fix migration dependency order (#7051) 2025-02-26 17:26:21 +01:00
Víctor Fernández Poyatos 53efb1c153 feat(labeler): apply label on migration changes (#7052) 2025-02-26 17:03:12 +01:00
Pepe Fagoaga 26014a9705 fix(findings): change uid from varchar to text (#7048) 2025-02-26 21:17:16 +05:45
Víctor Fernández Poyatos 00ef037e45 feat(findings): Add Django management command to populate database with dummy data (#7049) 2025-02-26 16:15:37 +01:00
Adrián Jesús Peña Rodríguez 669ec74e67 feat(export): add API export system (#6878) 2025-02-26 15:49:44 +01:00
dependabot[bot] c4528200b0 chore(deps-dev): bump black from 24.10.0 to 25.1.0 (#6733)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-26 11:38:09 +01:00
Daniel Barranquero ba7cd0250a fix(elasticache): improve logic in elasticache_redis_cluster_backup_enabled (#7042) 2025-02-26 10:31:14 +01:00
Rubén De la Torre Vico c5e97678a1 fix(azure): migrate resource models to avoid using SDK defaults (#6880) 2025-02-26 09:54:53 +01:00
Pedro Martín 337a46cdcc feat(aws): add ISO 27001 2022 compliance framework (#7035)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-26 08:34:08 +01:00
Hugo Pereira Brito 7f74b67f1f chore(iam): enhance iam_role_cross_service_confused_deputy_prevention recommendation (#7023) 2025-02-26 07:37:57 +01:00
Prowler Bot 5dcc48d2e5 chore(regions_update): Changes in regions for AWS services (#7034)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-26 07:30:07 +01:00
Prowler Bot 8b04aab07d chore(regions_update): Changes in regions for AWS services (#7015)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-26 07:29:42 +01:00
dependabot[bot] eab4f6cf2e chore(deps): bump google-api-python-client from 2.161.0 to 2.162.0 (#7037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-26 07:25:14 +01:00
Hugo Pereira Brito 7f8d623283 refactor(microsoft365): CheckReportMicrosoft365 and resource metadata (#6952) 2025-02-26 07:24:54 +01:00
Víctor Fernández Poyatos dbffed8f1f feat(findings): Optimize findings endpoint (#7019) 2025-02-25 12:41:47 +01:00
Pepe Fagoaga 7e3688fdd0 chore(action): Conventional Commit Check (#7033) 2025-02-25 09:51:55 +01:00
dependabot[bot] 2e111e9ad3 chore(deps): bump trufflesecurity/trufflehog from 3.88.12 to 3.88.13 (#7026)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-25 14:34:24 +05:45
Pedro Martín 6d6070ff3f feat(outputs): add sample outputs (#6945) 2025-02-25 14:33:16 +05:45
Pedro Martín 391bbde353 fix(cis): show report table on the CLI (#6979) 2025-02-25 14:28:58 +05:45
Pedro Martín 3c56eb3762 feat(azure): add PCI DSS 4.0 (#6982) 2025-02-25 14:27:50 +05:45
Pedro Martín 7c14ea354b feat(kubernetes): add PCI DSS 4.0 (#7013) 2025-02-25 14:27:14 +05:45
Pedro Martín c96aad0b77 feat(dashboard): take the latest finding uid by timestamp (#6987) 2025-02-25 14:25:03 +05:45
Víctor Fernández Poyatos a9dd3e424b feat(tasks): add deletion queue for deletion tasks (#7022) 2025-02-24 18:02:52 +01:00
Pedro Martín 8a144a4046 feat(gcp): add PCI DSS 4.0 (#7010) 2025-02-21 16:19:20 +05:30
Prowler Bot 75f86d7267 chore(regions_update): Changes in regions for AWS services (#7011)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-21 15:37:15 +05:30
dependabot[bot] bbf875fc2f chore(deps-dev): bump mkdocs-material from 9.6.4 to 9.6.5 (#7007)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-21 14:28:18 +05:30
Raj Chowdhury 59d491f61b fix(typo): solve typo in dashboard.md (#7009) 2025-02-21 14:17:08 +05:30
dependabot[bot] ed640a1324 chore(deps): bump trufflesecurity/trufflehog from 3.88.11 to 3.88.12 (#7008)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-21 14:16:15 +05:30
César Arroba e86fbcaef7 feat(api): setup sentry for OSS API (#6874) 2025-02-20 23:08:01 +05:45
Pablo Lara 7f48212054 chore(users): renaming the account now triggers a re-render in the sidebar (#7005) 2025-02-20 16:58:45 +01:00
dependabot[bot] a2c5c71baf chore(deps): bump python from 3.12.8-alpine3.20 to 3.12.9-alpine3.20 (#6882)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 21:11:45 +05:30
dependabot[bot] b904f81cb9 chore(deps): bump tzlocal from 5.2 to 5.3 (#6932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 21:10:46 +05:30
dependabot[bot] d64fe374dd chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7001)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 12:55:36 +01:00
Hugo Pereira Brito fe25e7938e docs(tutorials): update all deprecated poetry shell references (#7002) 2025-02-20 17:04:19 +05:45
Prowler Bot 931df361bf chore(regions_update): Changes in regions for AWS services (#6998)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-20 15:52:36 +05:30
Pedro Martín d7c45f4aee chore(github): add compliance to PR labeler (#6996) 2025-02-20 14:50:43 +05:30
Pedro Martín 5e5bef581b fix(soc2_aws): remove duplicated checks (#6995) 2025-02-20 14:38:26 +05:30
Hugo Pereira Brito 2d9e95d812 docs(installation): add warning for poetry shell deprecation in README (#6983) 2025-02-20 14:19:35 +05:45
Pablo Lara e5f979d106 chore(findings): add 'Status Extended' attribute to finding details (#6997) 2025-02-20 09:33:03 +01:00
Sergio Garcia c7a5815203 fix(deps): update vulnerable cryptography dependency (#6993) 2025-02-20 12:18:15 +05:30
Pedro Martín 03e268722e feat(aws): add PCI DSS 4.0 (#6949) 2025-02-20 11:07:06 +05:30
dependabot[bot] 78a2774329 chore(deps): bump trufflesecurity/trufflehog from 3.88.9 to 3.88.11 (#6988)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 11:04:15 +05:30
dependabot[bot] c1b5ab7f53 chore(deps): bump kubernetes from 32.0.0 to 32.0.1 (#6992)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 10:46:19 +05:30
Sergio Garcia b861d97ad4 fix(report): remove invalid resources in report (#6852) 2025-02-19 21:27:52 +05:45
Pablo Lara f3abcc9dd6 feat(scans): update the progress for executing scans (#6972) 2025-02-19 16:10:29 +01:00
César Arroba cab13fe018 chore(gha): trigger API or UI deployment when push to master (#6946) 2025-02-19 18:08:51 +05:45
Prowler Bot cc4b19c7ce chore(regions_update): Changes in regions for AWS services (#6978) 2025-02-19 11:04:45 +01:00
Pablo Lara a754d9aee5 fix(roles): handle empty response in deleteRole and ensure revalidation (#6976) 2025-02-19 09:03:49 +01:00
Pedro Martín 22b54b2d8d feat(aws): add compliance CIS 4.0 (#6937) 2025-02-19 08:23:49 +05:30
dependabot[bot] d12ca6301a chore(deps-dev): bump flake8 from 7.1.1 to 7.1.2 (#6954)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-19 08:09:58 +05:30
Hugo Pereira Brito bc1b2ad9ab test(cloudfront): add name retrieval test for cloudfront bucket domains (#6969) 2025-02-19 08:08:55 +05:30
Pepe Fagoaga 1782ab1514 fix(ocsf): Adapt for 1.4.0 (#6971) 2025-02-19 08:06:13 +05:30
Prowler Bot 0384fc50e3 chore(regions_update): Changes in regions for AWS services (#6968)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-18 18:40:01 +05:30
dependabot[bot] cc46dee9ee chore(deps-dev): bump bandit from 1.8.2 to 1.8.3 (#6955)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-18 18:39:10 +05:30
Hugo Pereira Brito ed5a0ae45a fix(cloudfront): Incorrect bucket name retrieval (#6947) 2025-02-17 17:08:28 +01:00
Prowler Bot 928ccfefb8 chore(regions_update): Changes in regions for AWS services (#6944)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-17 16:55:15 +01:00
dependabot[bot] 7f6bfb7b3e chore(deps): bump trufflesecurity/trufflehog from 3.88.8 to 3.88.9 (#6943)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 16:54:52 +01:00
Rubén De la Torre Vico bcbc9bf675 fix(gcp): Correct false positive when sslMode=ENCRYPTED_ONLY in CloudSQL (#6936) 2025-02-14 15:16:21 -05:00
dependabot[bot] 0ec4366f4c chore(deps): bump google-api-python-client from 2.160.0 to 2.161.0 (#6933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-14 10:09:33 -05:00
César Arroba ff72b7eea1 fix(gha): fix short sha step (#6939) 2025-02-14 19:11:26 +05:45
César Arroba a32ca19251 chore(gha): add tag for api and ui images on push to master (#6920) 2025-02-14 18:01:22 +05:45
Pablo Lara b79508956a fix(issue pages): apply sorting by default in issue pages (#6934) 2025-02-14 10:32:34 +01:00
dependabot[bot] d76c5bd658 chore(deps): bump trufflesecurity/trufflehog from 3.88.7 to 3.88.8 (#6931)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-13 18:17:25 -05:00
Kay Agahd 580e11126c fix(aws): codebuild service threw KeyError for projects type CODEPIPELINE (#6919) 2025-02-13 12:22:09 -05:00
Sergio Garcia 736d40546a fix(gcp): handle DNS Managed Zone with no DNSSEC (#6924) 2025-02-13 12:18:50 -05:00
dependabot[bot] 88810d2bb5 chore(deps-dev): bump mkdocs-material from 9.6.3 to 9.6.4 (#6913)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-13 11:36:07 -05:00
Víctor Fernández Poyatos 3a8f4d2ffb feat(social-login): Add social login integration for Google and Github OAuth providers (#6906) 2025-02-13 16:54:38 +01:00
Sergio Garcia 1fe125a65f chore(docs): external K8s cluster Prowler App credentials (#6921) 2025-02-13 09:46:05 -05:00
Kay Agahd 0ff4df0836 fix(aws): SNS threw IndexError if SubscriptionArn is PendingConfirmation (#6896) 2025-02-13 09:34:48 -05:00
Pedro Martín 16b4775e2d fix(gcp): remove typos on CIS 3.0 (#6917) 2025-02-13 13:48:19 +01:00
dependabot[bot] c3a13b8a29 chore(deps): bump trufflesecurity/trufflehog from 3.88.6 to 3.88.7 (#6915)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-12 19:15:03 -05:00
Sergio Garcia d1053375b7 fix(aws): handle AccessDenied when retrieving resource policy (#6908)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-12 15:31:26 -05:00
César Arroba 0fa4538256 fix(gha): fix test build containers on pull requests actions (#6909) 2025-02-12 23:26:54 +05:45
Ogonna Iwunze 738644f288 fix(kms): Amazon KMS API call error handling (#6843)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-12 10:09:15 -05:00
dependabot[bot] 2f80b055ac chore(deps-dev): bump coverage from 7.6.11 to 7.6.12 (#6897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-12 10:08:26 -05:00
Prowler Bot fd62a1df10 chore(regions_update): Changes in regions for AWS services (#6900)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-12 10:06:42 -05:00
César Arroba a85d0ebd0a chore(api): test build container image on pull request (#6850) 2025-02-12 15:44:05 +05:45
César Arroba 2c06902baa chore(ui): test build container image on pull request (#6849) 2025-02-12 15:43:22 +05:45
Pepe Fagoaga 76ac6429fe chore(version): Update version to 5.4.0 (#6894) 2025-02-11 17:51:08 -05:00
dependabot[bot] 43cae66b0d chore(deps-dev): bump coverage from 7.6.10 to 7.6.11 (#6887)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 19:30:36 -05:00
dependabot[bot] dacddecc7d chore(deps): bump trufflesecurity/trufflehog from 3.88.5 to 3.88.6 (#6888)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 18:15:25 -05:00
Mario Rodriguez Lopez dcb9267c2f feat(microsoft365): Add documentation and compliance file (#6195)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-10 11:13:06 -05:00
Víctor Fernández Poyatos ff35fd90fa chore(api): Update changelog and specs (#6876) 2025-02-10 12:06:34 +01:00
Víctor Fernández Poyatos 7469377079 chore: Add needed steps for API in PR template (#6875) 2025-02-10 15:20:09 +05:45
Pepe Fagoaga c8441f8d38 fix(kubernetes): Change UID validation (#6869)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-10 14:55:24 +05:45
Pepe Fagoaga abf4eb0ffc chore: Rename dashboard table latest findings (#6873)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-10 09:55:44 +01:00
dependabot[bot] 93717cc830 chore(deps-dev): bump mkdocs-material from 9.6.2 to 9.6.3 (#6871)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-07 18:24:49 -05:00
Sergio Garcia b629bc81f8 docs(eks): add documentation about EKS onboarding (#6853)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-07 10:59:01 -05:00
Pedro Martín f628897fe1 fix(dashboard): adjust the bar chart display (#6690) 2025-02-07 10:05:30 -05:00
Prowler Bot 54b82a78e3 chore(regions_update): Changes in regions for AWS services (#6858)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-07 10:02:28 -05:00
Víctor Fernández Poyatos 377faf145f feat(findings): Use ArrayAgg and subqueries on metadata endpoint (#6863)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-07 19:36:01 +05:45
Kay Agahd 69e316948f fix(aws): key error for detect-secrets (#6710) 2025-02-07 14:48:16 +01:00
Pablo Lara 62cbff4f53 feat: implement new functionality with inserted_at__gte in findings a… (#6864) 2025-02-07 14:25:25 +01:00
Víctor Fernández Poyatos 5582265e9d docs: Add details about user creation in Prowler app (#6862) 2025-02-07 13:29:25 +01:00
dependabot[bot] fb5ea3c324 chore(deps): bump microsoft-kiota-abstractions from 1.9.1 to 1.9.2 (#6856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-07 11:07:43 +01:00
Víctor Fernández Poyatos 9b5f676f50 feat(findings): Require date filters for findings endpoints (#6800) 2025-02-07 13:54:55 +05:45
Pranay Girase 88cfc0fa7e fix(typo): typos in Dashboard and Report in HTML (#6847) 2025-02-06 10:42:31 -05:00
Prowler Bot 665bfa2f13 chore(regions_update): Changes in regions for AWS services (#6848)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-06 08:46:32 -05:00
dependabot[bot] b89b1a64f4 chore(deps): bump trufflesecurity/trufflehog from 3.88.4 to 3.88.5 (#6844)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-05 18:02:42 -05:00
Sergio Garcia 9ba657c261 fix(kms): handle error in DescribeKey function (#6839) 2025-02-05 14:03:31 -05:00
Mario Rodriguez Lopez bce958b8e6 feat(entra): add new check entra_thirdparty_integrated_apps_not_allowed (#6357)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 12:45:48 -05:00
Daniel Barranquero 914012de2b fix(cloudfront): fix false positive in s3 origins (#6823) 2025-02-05 12:39:49 -05:00
Ogonna Iwunze 8d1c476aed feat(kms): add kms_cmk_not_multi_region AWS check (#6794)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 11:20:29 -05:00
Gary Mclean 567c729e9e fix(findings): Spelling mistakes correction (#6822) 2025-02-05 10:26:50 -05:00
Kay Agahd 3f03dd20e4 fix(aws): wording of report.status_extended in awslambda_function_not_publicly_accessible (#6824) 2025-02-05 10:23:52 -05:00
Daniel Barranquero 1c778354da fix(directoryservice): handle ClientException (#6781)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 10:22:32 -05:00
Prowler Bot 3a149fa459 chore(regions_update): Changes in regions for AWS services (#6821)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-05 09:19:56 -05:00
Mario Rodriguez Lopez f3b121950d feat(entra): add new entra service for Microsoft365 (#6326)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 19:47:14 -05:00
Mario Rodriguez Lopez 43c13b7ba1 feat(microsoft365): add new check admincenter_settings_password_never_expire (#6023)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 17:24:11 -05:00
dependabot[bot] 9447b33800 chore(deps): bump kubernetes from 31.0.0 to 32.0.0 (#6678)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-04 17:22:51 -05:00
Hugo Pereira Brito 2934752eeb fix(elasticache): InvalidReplicationGroupStateFault error (#6815)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 14:28:31 -05:00
dependabot[bot] dd6d8c71fd chore(deps-dev): bump moto from 5.0.27 to 5.0.28 (#6804)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 12:58:48 -05:00
Pablo Lara 80267c389b style(forms): improve spacing consistency (#6814) 2025-02-04 13:20:24 +01:00
Pablo Lara acfbaf75d5 chore(forms): improvements to the sign-in and sign-up forms (#6813) 2025-02-04 12:46:07 +01:00
Pedro Martín 5f54377407 chore(aws_audit_manager_control_tower_guardrails): add checks to reqs (#6699) 2025-02-03 14:59:08 -05:00
Drew Kerrigan 552aa64741 docs(): add description of changed and new delta values to prowler app tutorial (#6801) 2025-02-03 20:51:03 +01:00
dependabot[bot] d64f611f51 chore(deps): bump pytz from 2024.2 to 2025.1 (#6765)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:48:18 -05:00
dependabot[bot] a96cc92d77 chore(deps-dev): bump mkdocs-material from 9.5.50 to 9.6.2 (#6799)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 11:37:02 -05:00
dependabot[bot] 3858cccc41 chore(deps-dev): bump pylint from 3.3.3 to 3.3.4 (#6721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 10:32:42 -05:00
Pedro Martín 072828512a fix(cis_1.5_aws): add checks to needed reqs (#6695)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-03 10:32:20 -05:00
Pedro Martín a73ffe5642 fix(cis_1.4_aws): add checks to needed reqs (#6696)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-03 10:32:10 -05:00
Pablo Lara 8e784a5b6d feat(scans): show scan details right after launch (#6791) 2025-02-03 16:08:47 +01:00
dependabot[bot] 1b6f9332f1 chore(deps): bump trufflesecurity/trufflehog from 3.88.2 to 3.88.4 (#6760)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 09:35:53 -05:00
secretcod3r db8b472729 fix(gcp): fix wrong provider value in check (#6691) 2025-02-03 09:29:08 -05:00
Pedro Martín 867b371522 fix(cis_2.0_aws): add checks to needed reqs (#6694) 2025-02-03 09:28:04 -05:00
dependabot[bot] c0d7c9fc7d chore(deps): bump google-api-python-client from 2.159.0 to 2.160.0 (#6720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 09:27:17 -05:00
Pablo Lara bb4685cf90 fix(findings): remove default status filtering (#6784) 2025-02-03 15:20:18 +01:00
Pablo Lara 6a95426749 fix(findings): order findings by inserted_at DESC (#6782) 2025-02-03 11:51:07 +01:00
Víctor Fernández Poyatos ef6af8e84d feat(schedules): Rework daily schedule to always show the next scan (#6700) 2025-02-03 11:08:27 +01:00
Víctor Fernández Poyatos 763130f253 fix(celery): Kill celery worker process after every task to release memory (#6761) 2025-01-31 19:30:08 +05:45
Hugo Pereira Brito 1256c040e9 fix: microsoft365 mutelist (#6724) 2025-01-31 12:32:39 +01:00
dependabot[bot] 18b7b48a99 chore(deps): bump microsoft-kiota-abstractions from 1.6.8 to 1.9.1 (#6734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-31 10:07:17 +01:00
Pepe Fagoaga 627c11503f fix(db_event): Handle other events (#6754) 2025-01-30 21:46:43 +05:45
Víctor Fernández Poyatos 712ba84f06 feat(scans): Optimize read queries during scans (#6753) 2025-01-30 20:51:12 +05:45
Pepe Fagoaga 5186e029b3 fix(set_report_color): Add more details to error (#6751) 2025-01-30 20:48:51 +05:45
Pablo Lara 5bfaedf903 fix: Enable hot reloading when using Docker Compose for UI (#6750) 2025-01-30 14:05:39 +01:00
Víctor Fernández Poyatos 5061da6897 feat(findings): Improve /findings/metadata performance (#6748) 2025-01-30 13:31:43 +01:00
Pepe Fagoaga c159a28016 fix(neptune): correct service name (#6743) 2025-01-30 17:16:18 +05:45
Pepe Fagoaga 82a1b1c921 fix(finding): raise when generating invalid findings (#6738) 2025-01-30 15:59:38 +05:45
Pepe Fagoaga bf2210d0f4 fix(acm): Key Error DomainName (#6739) 2025-01-30 15:54:31 +05:45
Kay Agahd 8f0772cb94 fix(aws): iam_user_with_temporary_credentials resource in OCSF (#6697)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2025-01-30 15:28:21 +05:45
Pepe Fagoaga 5b57079ecd fix(sns): Add region to subscriptions (#6731) 2025-01-30 14:38:21 +05:45
Matt Johnson 350d759517 chore: Update Google Analytics ID across all docs.prowler.com sites. (#6730) 2025-01-30 12:47:01 +05:45
Pablo Lara edd793c9f5 fix(scans): change label for next scan (#6725) 2025-01-29 10:46:49 +01:00
Víctor Fernández Poyatos 545c2dc685 fix(migrations): Use indexes instead of constraints to define an index (#6722) 2025-01-29 14:24:04 +05:45
Víctor Fernández Poyatos 84955c066c revert: Update Django DB manager to use psycopg3 and connection pooling (#6717) 2025-01-28 22:15:01 +05:45
Víctor Fernández Poyatos 06dd03b170 fix(scan-summaries): Improve efficiency on providers overview (#6716) 2025-01-28 21:56:29 +05:45
Pedro Martín 47bc2ed2dc fix(defender): add field to SecurityContacts (#6693) 2025-01-28 15:52:56 +01:00
Pablo Lara 44281afc54 fix(scans): filters and sorting for scan table (#6713) 2025-01-28 13:26:31 +01:00
Víctor Fernández Poyatos 4d2859d145 fix(scans, findings): Improve API performance ordering by inserted_at instead of id (#6711) 2025-01-28 16:41:58 +05:45
Pablo Lara 45d44a1669 fix: fixed bug when opening finding details while a scan is in progress (#6708) 2025-01-28 06:58:18 +01:00
dependabot[bot] ddd83b340e chore(deps): bump uuid from 10.0.0 to 11.0.5 in /ui (#6516)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-26 13:39:42 +01:00
Mario Rodriguez Lopez ccdb54d7c3 feat(m365): add Microsoft 365 provider (#5902)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-24 13:14:17 -05:00
Rubén De la Torre Vico bcc246d950 fix(cloudsql): add trusted client certificates case for cloudsql_instance_ssl_connections (#6682) 2025-01-24 10:42:45 -05:00
dependabot[bot] 62139e252a chore(deps): bump azure-mgmt-web from 7.3.1 to 8.0.0 (#6680)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 12:40:11 +01:00
dependabot[bot] 86950c3a0a chore(deps): bump msgraph-sdk from 1.17.0 to 1.18.0 (#6679)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 10:47:09 +01:00
dependabot[bot] f4865ef68d chore(deps): bump azure-storage-blob from 12.24.0 to 12.24.1 (#6666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 09:44:16 +01:00
Pepe Fagoaga ea7209e7ae chore: bump for next minor (#6672) 2025-01-23 13:13:08 -05:00
Hugo Pereira Brito 998c551cf3 fix(cloudwatch): NoneType object is not iterable (#6671) 2025-01-23 12:27:07 -05:00
Paolo Frigo e6f29b0116 docs: update # of checks, services, frameworks and categories (#6528)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 11:11:03 -05:00
Pepe Fagoaga eb90bb39dc chore(api): Bump to v1.3.0 (#6670) 2025-01-23 21:25:29 +05:45
Pepe Fagoaga ad189b35ad chore(scan): Remove ._findings (#6667) 2025-01-23 20:43:02 +05:45
Pablo Lara 7d2989a233 chore: adjust DateWithTime component height when used with InfoField (#6669) 2025-01-23 15:18:24 +01:00
Pablo Lara 862137ae7d chore(scans): improve scan details (#6665) 2025-01-23 13:20:41 +01:00
Pedro Martín c86e082d9a feat(detect-secrets): get secrets plugins from config.yaml (#6544)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-23 17:18:19 +05:45
Sergio Garcia 80fe048f97 feat(resource metadata): add resource metadata to JSON OCSF (#6592)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-01-23 16:06:30 +05:45
dependabot[bot] f2bffb3ce7 chore(deps): bump azure-mgmt-containerservice from 33.0.0 to 34.0.0 (#6630)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 16:37:07 -05:00
dependabot[bot] cbe2f9eef8 chore(deps): bump azure-mgmt-compute from 33.1.0 to 34.0.0 (#6628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 20:00:56 +01:00
Pepe Fagoaga 688f41f570 fix(templates): Customize principals and add validation (#6655) 2025-01-22 21:47:57 +05:45
Anton Rubets a29197637e chore(helm): Add prowler helm support (#6580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-22 10:55:26 -05:00
Prowler Bot 7a2712a37f chore(regions_update): Changes in regions for AWS services (#6652)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-22 09:30:03 -05:00
dependabot[bot] 189f5cfd8c chore(deps): bump boto3 from 1.35.94 to 1.35.99 (#6651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 09:29:41 -05:00
Kay Agahd e509480892 fix: add detector and line number of potential secret (#6654) 2025-01-22 20:13:23 +05:45
Pepe Fagoaga 7f7955351a chore(pre-commit): poetry checks for API and SDK (#6658) 2025-01-22 20:05:26 +05:45
Pepe Fagoaga 46f1db21a8 chore(api): Use prowler from master (#6657) 2025-01-22 20:05:02 +05:45
Pablo Lara fbe7bc6951 feat(providers): show the cloud formation and terraform template links on the form (#6660) 2025-01-22 14:49:38 +01:00
Pablo Lara f658507847 feat(providers): make external id field mandatory in the aws role secret form (#6656) 2025-01-22 12:45:31 +01:00
dependabot[bot] 374078683b chore(deps-dev): bump moto from 5.0.16 to 5.0.27 (#6632)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-21 13:56:06 -05:00
dependabot[bot] 114c4e0886 chore(deps): bump botocore from 1.35.94 to 1.35.99 (#6520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-21 09:17:18 -05:00
Pablo Lara 67c62766d4 fix(filters): fix dynamic filters (#6642) 2025-01-21 13:33:27 +01:00
dependabot[bot] 3f2947158d chore(deps): bump prowler from 5.1.1 to 5.1.4 in /api (#6641)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-21 14:27:59 +05:45
dependabot[bot] 278a7cb356 chore(deps-dev): bump mkdocs-material from 9.5.49 to 9.5.50 (#6631)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-20 18:31:44 -05:00
Rubén De la Torre Vico 890158a79c fix(OCSF): fix OCSF output when timestamp is UNIX format (#6606) 2025-01-20 17:11:28 -05:00
Rubén De la Torre Vico 4dc1602b77 fix: update Azure CIS with existing App checks (#6611) 2025-01-20 15:12:00 -05:00
Kay Agahd bbba0abac9 fix(aws): list tags for DocumentDB clusters (#6605) 2025-01-20 15:10:58 -05:00
Prowler Bot d04fd807c6 chore(regions_update): Changes in regions for AWS services (#6599)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-20 15:09:35 -05:00
Pablo Lara 3456df4cf1 fix(snippet-id): improve provider ID readability in tables (#6615) 2025-01-20 17:23:19 +01:00
Pablo Lara f56aaa791e chore(RBAC): add permission's info (#6612) 2025-01-20 16:14:48 +01:00
Adrián Jesús Peña Rodríguez 465a758770 fix(rbac): remove invalid required permission (#6608) 2025-01-20 15:21:52 +01:00
Pablo Lara 0f7c0c1b2c fix(RBAC): tweaks for edit role form (#6609) 2025-01-20 14:09:16 +01:00
Adrián Jesús Peña Rodríguez bf8d10b6f6 feat(api): restrict the deletion of users, only the user of the request can be deleted (#6607) 2025-01-20 13:26:47 +01:00
Pablo Lara 20d04553d6 fix(RBAC): restore manage_account permission for roles (#6602) 2025-01-20 11:35:29 +01:00
Daniel Barranquero b56d62e3c4 fix(sqs): fix flaky test (#6593) 2025-01-17 11:48:39 -05:00
Hugo Pereira Brito 9a332dcba1 chore(services): delete all comment headers (#6585) 2025-01-17 08:21:28 -05:00
Hugo Pereira Brito 166d9f8823 fix(apigatewayv2): managed exception NotFoundException (#6576) 2025-01-17 08:17:51 -05:00
Prowler Bot 42f5eed75f chore(regions_update): Changes in regions for AWS services (#6577)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-17 08:17:00 -05:00
Rubén De la Torre Vico 01a7db18dd fix: add missing Check_Report_Azure parameters (#6583) 2025-01-17 08:16:43 -05:00
Pablo Lara d4507465a3 fix(providers): update the label and placeholder based on the cloud provider (#6581) 2025-01-17 12:28:38 +01:00
Pablo Lara 3ac92ed10a fix(findings): remove filter delta_in applied by default (#6578) 2025-01-17 11:03:12 +01:00
Pablo Lara 43c76ca85c feat(findings): add first seen in findings details (#6575) 2025-01-17 10:19:10 +01:00
dependabot[bot] 54d87fa96a chore(deps): bump prowler from 5.0.2 to 5.1.1 in /api (#6573)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-17 13:26:07 +05:45
Daniel Barranquero f041f17268 fix(gcp): fix flaky tests from dns service (#6569) 2025-01-16 14:49:25 -05:00
dependabot[bot] 31c80a6967 chore(deps): bump msgraph-sdk from 1.16.0 to 1.17.0 (#6547)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-16 12:55:30 -05:00
Rubén De la Torre Vico 783ce136f4 feat(network): extract Network resource metadata automated (#6555)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 12:41:02 -05:00
Rubén De la Torre Vico f829145781 feat(storage): extract Storage resource metadata automated (#6563)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 11:44:43 -05:00
Rubén De la Torre Vico 389337f8cd feat(vm): extract VM resource metadata automated (#6564) 2025-01-16 11:16:02 -05:00
Pedro Martín a0713c2d66 fix(cis): add subsections if needed (#6559) 2025-01-16 11:10:54 -05:00
Rubén De la Torre Vico f94d3cbce4 feat(sqlserver): extract SQL Server resource metadata automated (#6562) 2025-01-16 10:47:21 -05:00
Daniel Barranquero 8d8994b468 feat(aws): include resource metadata to remaining checks (#6551)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-16 10:44:14 -05:00
Rubén De la Torre Vico 784a9097a5 feat(postgresql): extract PostgreSQL resource metadata automated (#6560) 2025-01-16 10:37:55 -05:00
Pedro Martín b9601626e3 fix(detect_secrets): refactor logic for detect-secrets (#6537) 2025-01-16 21:15:44 +05:45
Rubén De la Torre Vico dc80b011f2 feat(policy): extract Policy resource metadata automated (#6558) 2025-01-16 10:29:28 -05:00
Rubén De la Torre Vico ee7d32d460 feat(entra): extract Entra resource metadata automated (#6542)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 10:24:53 -05:00
Rubén De la Torre Vico 43fd9ee94e feat(monitor): extract monitor resource metadata automated (#6554)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 10:16:19 -05:00
Víctor Fernández Poyatos 8821a91f3f feat(db): Update Django DB manager to use psycopg3 and connection pooling (#6541) 2025-01-16 15:29:02 +01:00
Rubén De la Torre Vico 98d9256f92 feat(mysql): extract MySQL resource metadata automated (#6556) 2025-01-16 09:24:06 -05:00
Rubén De la Torre Vico b35495eaa7 feat(keyvault): extract KeyVault resource metadata automated (#6553) 2025-01-16 09:17:36 -05:00
Rubén De la Torre Vico 74d6b614b3 feat(iam): extract IAM resource metadata automated (#6552) 2025-01-16 09:05:23 -05:00
Sergio Garcia dd63c16a74 fix(gcp): iterate through service projects (#6549)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-01-16 08:52:52 -05:00
Pablo Lara 4280266a96 fix(dep): address compatibility issues (#6543) 2025-01-16 14:28:49 +01:00
Hugo Pereira Brito b1f02098ff feat(aws): include resource metadata in services from r* to s* (#6536)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-15 18:10:53 -05:00
Pedro Martín 95189b574a feat(gcp): add resource metadata to report (#6500)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-15 18:09:35 -05:00
Hugo Pereira Brito c5d23503bf feat(aws): include resource metadata in services from a* to b* (#6504)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-15 18:03:37 -05:00
Daniel Barranquero 77950f6069 chore(aws): add resource metadata to services from t to w (#6546) 2025-01-15 17:22:08 -05:00
Daniel Barranquero ec5f2b3753 chore(aws): add resource metadata to services from f to o (#6545) 2025-01-15 17:15:50 -05:00
Rubén De la Torre Vico 9e7104fb7f feat(defender): extract Defender resource metadata in automated way (#6538) 2025-01-15 12:14:24 -05:00
Rubén De la Torre Vico 6b3b6ca45e feat(appinsights): extract App Insights resource metadata in automated way (#6540) 2025-01-15 11:45:23 -05:00
Hugo Pereira Brito 20b8b0b24e feat: add resource metadata to emr_cluster_account_public_block_enabled (#6539) 2025-01-15 11:44:51 -05:00
Sergio Garcia 4e11540458 feat(kubernetes): add resource metadata to report (#6479) 2025-01-15 11:36:09 -05:00
Hugo Pereira Brito ee87f2676d feat(aws): include resource metadata in services from d* to e* (#6532)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-15 10:05:04 -05:00
Daniel Barranquero 74a90aab98 feat(aws): add resource metadata to all services starting with c (#6493) 2025-01-15 09:04:19 -05:00
Rubén De la Torre Vico 48ff9a5100 feat(cosmosdb): extract CosmosDB resource metadata in automated way (#6533) 2025-01-15 08:51:48 -05:00
Rubén De la Torre Vico 3dfd578ee5 feat(containerregistry): extract Container Registry resource metadata in automated way (#6530) 2025-01-15 08:51:16 -05:00
Rubén De la Torre Vico 0db46cdc81 feat(azure-app): extract Web App resource metadata in automated way (#6529) 2025-01-15 08:48:36 -05:00
Prowler Bot fdac58d031 chore(regions_update): Changes in regions for AWS services (#6526)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-15 08:46:35 -05:00
dependabot[bot] df9d4ce856 chore(deps): bump google-api-python-client from 2.158.0 to 2.159.0 (#6521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 08:33:47 -05:00
Pedro Martín e6ae4e97e8 docs(readme): update pr template to add check for readme (#6531) 2025-01-15 12:12:45 +01:00
Adrián Jesús Peña Rodríguez 10a4c28922 feat(finding): add first_seen attribute (#6460) 2025-01-15 11:25:41 +01:00
dependabot[bot] 8a828c6e51 chore(deps): bump django from 5.1.4 to 5.1.5 in /api (#6519)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 10:52:11 +01:00
Víctor Fernández Poyatos d7b40905ff feat(findings): Add resource_tag filters for findings endpoint (#6527) 2025-01-15 10:30:36 +01:00
Adrián Jesús Peña Rodríguez f9a3b5f3cd feat(provider-secret): make existing external_id field mandatory (#6510) 2025-01-15 10:14:44 +01:00
Pablo Lara b73b89242f feat(filters): add resource type filter for findings (#6524) 2025-01-15 08:40:53 +01:00
dependabot[bot] 23a0f6e8de chore(deps-dev): bump eslint-config-prettier from 9.1.0 to 10.0.1 in /ui (#6518)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 06:55:25 +01:00
Pedro Martín 87967abc3f feat(kubernetes): add CIS 1.10 compliance (#6508)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-14 14:16:00 -05:00
Rubén De la Torre Vico ce60c286dc feat(aks): use Check_Report_Azure constructor properly in AKS checks (#6509) 2025-01-14 14:14:02 -05:00
Pepe Fagoaga 90fd9b0eb8 chore(version): set next minor (#6511) 2025-01-14 14:06:24 -05:00
Prowler Bot ca262a6797 chore(regions_update): Changes in regions for AWS services (#6495)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-14 12:43:44 -05:00
Rubén De la Torre Vico c056d39775 feat(aisearch): use Check_Report_Azure constructor properly in AISearch checks (#6506) 2025-01-14 12:37:01 -05:00
johannes-engler-mw 1c4426ea4b fix(Azure TDE): add filter for master DB (#6351) 2025-01-14 12:34:52 -05:00
Pedro Martín 36520bd7a1 feat(azure): add CIS 3.0 for Azure (#5226) 2025-01-14 12:07:22 -05:00
Pepe Fagoaga badf0ace76 feat(prowler-role): Add templates to deploy it in AWS (#6499) 2025-01-14 12:04:20 -05:00
Rubén De la Torre Vico f1f61249e0 feat(azure): include resource metadata in Check_Report_Azure (#6505) 2025-01-14 11:32:40 -05:00
dependabot[bot] b371cac18c chore(deps): bump jinja2 from 3.1.4 to 3.1.5 (#6457)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-14 10:03:45 -05:00
Víctor Fernández Poyatos 1846535d8d feat(findings): add /findings/metadata to retrieve dynamic filters information (#6503) 2025-01-14 15:30:03 +01:00
dependabot[bot] d7d9118b9b chore(deps-dev): bump bandit from 1.8.0 to 1.8.2 (#6485)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-14 08:49:37 -05:00
2855 changed files with 217769 additions and 39586 deletions
+43 -2
@@ -3,8 +3,8 @@
 # For production, it is recommended to use a secure method to store these variables and change the default secret keys.
 #### Prowler UI Configuration ####
-PROWLER_UI_VERSION="latest"
-SITE_URL=http://localhost:3000
+PROWLER_UI_VERSION="stable"
+AUTH_URL=http://localhost:3000
 API_BASE_URL=http://prowler-api:8080/api/v1
 NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
 AUTH_TRUST_HOST=true
@@ -30,6 +30,30 @@ VALKEY_HOST=valkey
 VALKEY_PORT=6379
 VALKEY_DB=0
+# API scan settings
+# The path to the directory where scan output should be stored
+DJANGO_TMP_OUTPUT_DIRECTORY="/tmp/prowler_api_output"
+# The maximum number of findings to process in a single batch
+DJANGO_FINDINGS_BATCH_SIZE=1000
+# The AWS access key to be used when uploading scan output to an S3 bucket
+# If left empty, default AWS credentials resolution behavior will be used
+DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID=""
+# The AWS secret key to be used when uploading scan output to an S3 bucket
+DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY=""
+# An optional AWS session token
+DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN=""
+# The AWS region where your S3 bucket is located (e.g., "us-east-1")
+DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION=""
+# The name of the S3 bucket where scan output should be stored
+DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET=""
 # Django settings
 DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1,prowler-api
 DJANGO_BIND_ADDRESS=0.0.0.0
@@ -92,3 +116,20 @@ jQIDAQAB
 # openssl rand -base64 32
 DJANGO_SECRETS_ENCRYPTION_KEY="oE/ltOhp/n1TdbHjVmzcjDPLcLA41CVI/4Rk+UB5ESc="
 DJANGO_BROKER_VISIBILITY_TIMEOUT=86400
+DJANGO_SENTRY_DSN=
+# Sentry settings
+SENTRY_ENVIRONMENT=local
+SENTRY_RELEASE=local
+#### Prowler release version ####
+NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0
+# Social login credentials
+SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
+SOCIAL_GOOGLE_OAUTH_CLIENT_ID=""
+SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET=""
+SOCIAL_GITHUB_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/github"
+SOCIAL_GITHUB_OAUTH_CLIENT_ID=""
+SOCIAL_GITHUB_OAUTH_CLIENT_SECRET=""
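
Editor's note: the DJANGO_OUTPUT_S3_* block above documents a deliberate fallback, where the static keys are used only when set and the uploader otherwise relies on default AWS credential resolution. A minimal Python sketch of that logic (illustrative only, not the API's actual upload code; names mirror the variables above):

import os

import boto3  # assumed available in the API's environment


def build_output_s3_client():
    """Build an S3 client from the DJANGO_OUTPUT_S3_* variables.

    Empty strings mean "not configured", in which case boto3 falls back
    to its default credential chain (env vars, shared config, IAM role).
    """
    access_key = os.getenv("DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID", "")
    secret_key = os.getenv("DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY", "")
    session_token = os.getenv("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
    region = os.getenv("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "") or None

    if access_key and secret_key:
        return boto3.client(
            "s3",
            aws_access_key_id=access_key,
            aws_secret_access_key=secret_key,
            aws_session_token=session_token or None,
            region_name=region,
        )
    return boto3.client("s3", region_name=region)  # default credential chain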
+80 -76
@@ -9,108 +9,112 @@ updates:
   - package-ecosystem: "pip"
     directory: "/"
     schedule:
-      interval: "daily"
-    open-pull-requests-limit: 10
+      interval: "monthly"
+    open-pull-requests-limit: 25
     target-branch: master
     labels:
       - "dependencies"
       - "pip"
-  - package-ecosystem: "pip"
-    directory: "/api"
-    schedule:
-      interval: "daily"
-    open-pull-requests-limit: 10
-    target-branch: master
-    labels:
-      - "dependencies"
-      - "pip"
-      - "component/api"
+  # Dependabot Updates are temporary disabled - 2025/03/19
+  # - package-ecosystem: "pip"
+  #   directory: "/api"
+  #   schedule:
+  #     interval: "daily"
+  #   open-pull-requests-limit: 10
+  #   target-branch: master
+  #   labels:
+  #     - "dependencies"
+  #     - "pip"
+  #     - "component/api"
   - package-ecosystem: "github-actions"
     directory: "/"
     schedule:
-      interval: "daily"
-    open-pull-requests-limit: 10
+      interval: "monthly"
+    open-pull-requests-limit: 25
     target-branch: master
     labels:
       - "dependencies"
       - "github_actions"
-  - package-ecosystem: "npm"
-    directory: "/ui"
-    schedule:
-      interval: "daily"
-    open-pull-requests-limit: 10
-    target-branch: master
-    labels:
-      - "dependencies"
-      - "npm"
-      - "component/ui"
+  # Dependabot Updates are temporary disabled - 2025/03/19
+  # - package-ecosystem: "npm"
+  #   directory: "/ui"
+  #   schedule:
+  #     interval: "daily"
+  #   open-pull-requests-limit: 10
+  #   target-branch: master
+  #   labels:
+  #     - "dependencies"
+  #     - "npm"
+  #     - "component/ui"
   - package-ecosystem: "docker"
     directory: "/"
     schedule:
-      interval: "weekly"
-    open-pull-requests-limit: 10
+      interval: "monthly"
+    open-pull-requests-limit: 25
     target-branch: master
     labels:
       - "dependencies"
       - "docker"
+  # Dependabot Updates are temporary disabled - 2025/04/15
   # v4.6
-  - package-ecosystem: "pip"
-    directory: "/"
-    schedule:
-      interval: "weekly"
-    open-pull-requests-limit: 10
-    target-branch: v4.6
-    labels:
-      - "dependencies"
-      - "pip"
-      - "v4"
+  # - package-ecosystem: "pip"
+  #   directory: "/"
+  #   schedule:
+  #     interval: "weekly"
+  #   open-pull-requests-limit: 10
+  #   target-branch: v4.6
+  #   labels:
+  #     - "dependencies"
+  #     - "pip"
+  #     - "v4"
-  - package-ecosystem: "github-actions"
-    directory: "/"
-    schedule:
-      interval: "weekly"
-    open-pull-requests-limit: 10
-    target-branch: v4.6
-    labels:
-      - "dependencies"
-      - "github_actions"
-      - "v4"
+  # - package-ecosystem: "github-actions"
+  #   directory: "/"
+  #   schedule:
+  #     interval: "weekly"
+  #   open-pull-requests-limit: 10
+  #   target-branch: v4.6
+  #   labels:
+  #     - "dependencies"
+  #     - "github_actions"
+  #     - "v4"
-  - package-ecosystem: "docker"
-    directory: "/"
-    schedule:
-      interval: "weekly"
-    open-pull-requests-limit: 10
-    target-branch: v4.6
-    labels:
-      - "dependencies"
-      - "docker"
-      - "v4"
+  # - package-ecosystem: "docker"
+  #   directory: "/"
+  #   schedule:
+  #     interval: "weekly"
+  #   open-pull-requests-limit: 10
+  #   target-branch: v4.6
+  #   labels:
+  #     - "dependencies"
+  #     - "docker"
+  #     - "v4"
+  # Dependabot Updates are temporary disabled - 2025/03/19
   # v3
-  - package-ecosystem: "pip"
-    directory: "/"
-    schedule:
-      interval: "monthly"
-    open-pull-requests-limit: 10
-    target-branch: v3
-    labels:
-      - "dependencies"
-      - "pip"
-      - "v3"
+  # - package-ecosystem: "pip"
+  #   directory: "/"
+  #   schedule:
+  #     interval: "monthly"
+  #   open-pull-requests-limit: 10
+  #   target-branch: v3
+  #   labels:
+  #     - "dependencies"
+  #     - "pip"
+  #     - "v3"
-  - package-ecosystem: "github-actions"
-    directory: "/"
-    schedule:
-      interval: "monthly"
-    open-pull-requests-limit: 10
-    target-branch: v3
-    labels:
-      - "dependencies"
-      - "github_actions"
-      - "v3"
+  # - package-ecosystem: "github-actions"
+  #   directory: "/"
+  #   schedule:
+  #     interval: "monthly"
+  #   open-pull-requests-limit: 10
+  #   target-branch: v3
+  #   labels:
+  #     - "dependencies"
+  #     - "github_actions"
+  #     - "v3"
+10
@@ -92,3 +92,13 @@ component/api:
 component/ui:
   - changed-files:
       - any-glob-to-any-file: "ui/**"
+compliance:
+  - changed-files:
+      - any-glob-to-any-file: "prowler/compliance/**"
+      - any-glob-to-any-file: "prowler/lib/outputs/compliance/**"
+      - any-glob-to-any-file: "tests/lib/outputs/compliance/**"
+review-django-migrations:
+  - changed-files:
+      - any-glob-to-any-file: "api/src/backend/api/migrations/**"
+8 -1
@@ -15,7 +15,14 @@ Please include a summary of the change and which issue is fixed. List any depend
 - [ ] Review if the code is being covered by tests.
 - [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
 - [ ] Review if backport is needed.
+- [ ] Review if is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
+- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
+
+#### API
+- [ ] Verify if API specs need to be regenerated.
+- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
+- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.
 
 ### License
 
-By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
+By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
@@ -61,33 +61,40 @@ jobs:
     steps:
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+
+      - name: Set short git commit SHA
+        id: vars
+        run: |
+          shortSha=$(git rev-parse --short ${{ github.sha }})
+          echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
 
       - name: Login to DockerHub
-        uses: docker/login-action@v3
+        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
 
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
 
       - name: Build and push container image (latest)
         # Comment the following line for testing
         if: github.event_name == 'push'
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
         with:
           context: ${{ env.WORKING_DIRECTORY }}
           # Set push: false for testing
           push: true
           tags: |
             ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.LATEST_TAG }}
+            ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.SHORT_SHA }}
           cache-from: type=gha
           cache-to: type=gha,mode=max
 
       - name: Build and push container image (release)
         if: github.event_name == 'release'
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
         with:
           context: ${{ env.WORKING_DIRECTORY }}
           push: true
@@ -96,3 +103,12 @@
             ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.STABLE_TAG }}
           cache-from: type=gha
           cache-to: type=gha,mode=max
+
+      - name: Trigger deployment
+        if: github.event_name == 'push'
+        uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
+        with:
+          token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+          repository: ${{ secrets.CLOUD_DISPATCH }}
+          event-type: prowler-api-deploy
+          client-payload: '{"sha": "${{ github.sha }}", "short_sha": "${{ env.SHORT_SHA }}"}'
+3 -3
@@ -44,16 +44,16 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@v3
+        uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
         with:
           languages: ${{ matrix.language }}
           config-file: ./.github/codeql/api-codeql-config.yml
 
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@v3
+        uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
         with:
           category: "/language:${{matrix.language}}"
+39 -9
@@ -6,6 +6,7 @@ on:
       - "master"
       - "v5.*"
     paths:
+      - ".github/workflows/api-pull-request.yml"
       - "api/**"
   pull_request:
     branches:
@@ -14,7 +15,6 @@
     paths:
       - "api/**"
-
 env:
   POSTGRES_HOST: localhost
   POSTGRES_PORT: 5432
@@ -26,7 +26,8 @@ env:
   VALKEY_HOST: localhost
   VALKEY_PORT: 6379
   VALKEY_DB: 0
-
+  API_WORKING_DIR: ./api
+  IMAGE_NAME: prowler-api
 
 jobs:
   test:
@@ -70,11 +71,11 @@
           --health-retries 5
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
       - name: Test if changes are in not ignored paths
        id: are-non-ignored-files-changed
-        uses: tj-actions/changed-files@v45
+        uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
         with:
          files: api/**
          files_ignore: |
@@ -84,16 +85,30 @@
            api/README.md
            api/mkdocs.yml
+      - name: Replace @master with current branch in pyproject.toml
+        working-directory: ./api
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
+        run: |
+          BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
+          echo "Using branch: $BRANCH_NAME"
+          sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
       - name: Install poetry
         working-directory: ./api
         if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
           python -m pip install --upgrade pip
-          pipx install poetry==1.8.5
+          pipx install poetry==2.1.1
+      - name: Update poetry.lock after the branch name change
+        working-directory: ./api
+        if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
+        run: |
+          poetry lock
       - name: Set up Python ${{ matrix.python-version }}
         if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ matrix.python-version }}
           cache: "poetry"
@@ -102,7 +117,7 @@
         working-directory: ./api
         if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
-          poetry install
+          poetry install --no-root
           poetry run pip list
           VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
           grep '"tag_name":' | \
@@ -144,7 +159,7 @@
         working-directory: ./api
         if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
         run: |
-          poetry run safety check --ignore 70612,66963
+          poetry run safety check --ignore 70612,66963,74429
 
       - name: Vulture
         working-directory: ./api
@@ -166,8 +181,23 @@
       - name: Upload coverage reports to Codecov
         if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
-        uses: codecov/codecov-action@v5
+        uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
         env:
           CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
         with:
           flags: api
+
+  test-container-build:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
+      - name: Build Container
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
+        with:
+          context: ${{ env.API_WORKING_DIR }}
+          push: false
+          tags: ${{ env.IMAGE_NAME }}:latest
+          outputs: type=docker
+          cache-from: type=gha
+          cache-to: type=gha,mode=max
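
Editor's note: the new "Replace @master with current branch" step exists so PR builds of the API are tested against the matching SDK branch; it rewrites git dependencies in api/pyproject.toml before poetry lock runs. A sketch of the same substitution in Python (mirroring the sed one-liner, with the path and env vars as used by the workflow):

import os
from pathlib import Path

# Same precedence as ${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}:
# the head ref on pull requests, the ref name on pushes.
branch = os.environ.get("GITHUB_HEAD_REF") or os.environ.get("GITHUB_REF_NAME", "master")

pyproject = Path("api/pyproject.toml")
pyproject.write_text(pyproject.read_text().replace("@master", f"@{branch}"))
print(f"Using branch: {branch}")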
+2 -2
@@ -23,7 +23,7 @@ jobs:
     steps:
       - name: Check labels
         id: preview_label_check
-        uses: docker://agilepathway/pull-request-label-checker:v1.6.55
+        uses: agilepathway/label-checker@c3d16ad512e7cea5961df85ff2486bb774caf3c5 # v1.6.65
         with:
           allow_failure: true
           prefix_mode: true
@@ -33,7 +33,7 @@
       - name: Backport Action
         if: steps.preview_label_check.outputs.label_check == 'success'
-        uses: sorenlouv/backport-github-action@v9.5.1
+        uses: sorenlouv/backport-github-action@ad888e978060bc1b2798690dd9d03c4036560947 # v9.5.1
         with:
           github_token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
           auto_backport_label_prefix: ${{ env.BACKPORT_LABEL_PREFIX }}
@@ -17,7 +17,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Leave PR comment with the Prowler Documentation URI
-        uses: peter-evans/create-or-update-comment@v4
+        uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
         with:
           issue-number: ${{ env.PR_NUMBER }}
           body: |
+23
@@ -0,0 +1,23 @@
+name: Prowler - Conventional Commit
+
+on:
+  pull_request:
+    types:
+      - "opened"
+      - "edited"
+      - "synchronize"
+    branches:
+      - "master"
+      - "v3"
+      - "v4.*"
+      - "v5.*"
+
+jobs:
+  conventional-commit-check:
+    runs-on: ubuntu-latest
+    steps:
+      - name: conventional-commit-check
+        id: conventional-commit-check
+        uses: agenthunt/conventional-commit-checker-action@9e552d650d0e205553ec7792d447929fc78e012b # v2.0.0
+        with:
+          pr-title-regex: '^([^\s(]+)(?:\(([^)]+)\))?: (.+)'
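
Editor's note: the pr-title-regex enforces the type(scope): subject shape used by virtually every commit in the list above, with the scope optional. A quick illustration of what it accepts and rejects (the first two titles are taken from this compare, the third is an invented counterexample):

import re

PR_TITLE_REGEX = r"^([^\s(]+)(?:\(([^)]+)\))?: (.+)"

for title in [
    "feat(aws): add compliance CIS 4.0",                      # type + scope
    "fix: add detector and line number of potential secret",  # scope omitted
    "Add PCI DSS 4.0 for GCP",                                # no type prefix: rejected
]:
    verdict = "OK  " if re.match(PR_TITLE_REGEX, title) else "FAIL"
    print(f"{verdict} {title}")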
+2 -2
@@ -7,11 +7,11 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
         with:
           fetch-depth: 0
       - name: TruffleHog OSS
-        uses: trufflesecurity/trufflehog@v3.88.2
+        uses: trufflesecurity/trufflehog@b06f6d72a3791308bb7ba59c2b8cb7a083bd17e4 # v3.88.26
         with:
           path: ./
           base: ${{ github.event.repository.default_branch }}
+1 -1
@@ -14,4 +14,4 @@ jobs:
     pull-requests: write
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/labeler@v5
+      - uses: actions/labeler@8558fd74291d67161a8a78ce36a881fa63b766a9 # v5.0.0
+34
@@ -0,0 +1,34 @@
+name: Prowler - Merged Pull Request
+
+on:
+  pull_request_target:
+    branches: ['master']
+    types: ['closed']
+
+jobs:
+  trigger-cloud-pull-request:
+    name: Trigger Cloud Pull Request
+    if: github.event.pull_request.merged == true && github.repository == 'prowler-cloud/prowler'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+
+      - name: Set short git commit SHA
+        id: vars
+        run: |
+          shortSha=$(git rev-parse --short ${{ github.sha }})
+          echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
+
+      - name: Trigger pull request
+        uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
+        with:
+          token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+          repository: ${{ secrets.CLOUD_DISPATCH }}
+          event-type: prowler-pull-request-merged
+          client-payload: '{
+            "PROWLER_COMMIT_SHA": "${{ github.sha }}",
+            "PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
+            "PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
+            "PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
+            "PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }}
+          }'
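
Editor's note: peter-evans/repository-dispatch is a thin wrapper around GitHub's repository_dispatch REST endpoint, and the client-payload above arrives as the client_payload object in the receiving repository's workflow. A roughly equivalent call in Python (a sketch: the target repo and payload are placeholders, the real target sits behind secrets.CLOUD_DISPATCH, and requests is assumed available):

import os

import requests


def trigger_dispatch(repo, event_type, client_payload):
    """POST a repository_dispatch event, as the action above does."""
    response = requests.post(
        f"https://api.github.com/repos/{repo}/dispatches",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        },
        json={"event_type": event_type, "client_payload": client_payload},
    )
    response.raise_for_status()  # GitHub answers 204 No Content on success


trigger_dispatch(
    "example-org/example-repo",  # placeholder repository
    "prowler-pull-request-merged",
    {"PROWLER_COMMIT_SHA": "abc1234", "PROWLER_PR_TITLE": "fix: typo"},
)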
@@ -59,16 +59,16 @@ jobs:
     steps:
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
 
       - name: Setup Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
         with:
           python-version: ${{ env.PYTHON_VERSION }}
 
       - name: Install Poetry
         run: |
-          pipx install poetry==1.8.5
+          pipx install poetry==2.*
           pipx inject poetry poetry-bumpversion
 
       - name: Get Prowler version
@@ -108,13 +108,13 @@
           esac
 
       - name: Login to DockerHub
-        uses: docker/login-action@v3
+        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
 
       - name: Login to Public ECR
-        uses: docker/login-action@v3
+        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
         with:
           registry: public.ecr.aws
           username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -123,11 +123,11 @@
           AWS_REGION: ${{ env.AWS_REGION }}
 
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v3
+        uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
 
       - name: Build and push container image (latest)
         if: github.event_name == 'push'
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
         with:
           push: true
           tags: |
@@ -140,7 +140,7 @@
       - name: Build and push container image (release)
         if: github.event_name == 'release'
-        uses: docker/build-push-action@v6
+        uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
         with:
           # Use local context to get changes
           # https://github.com/docker/build-push-action#path-context
+94
View File
@@ -0,0 +1,94 @@
name: SDK - Bump Version
on:
release:
types: [published]
env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
jobs:
bump-version:
name: Bump Version
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Prowler version
shell: bash
run: |
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
MAJOR_VERSION=${BASH_REMATCH[1]}
MINOR_VERSION=${BASH_REMATCH[2]}
FIX_VERSION=${BASH_REMATCH[3]}
if (( MAJOR_VERSION == 5 )); then
if (( FIX_VERSION == 0 )); then
echo "Minor Release: $PROWLER_VERSION"
BUMP_VERSION_TO=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).${FIX_VERSION}
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=${BASE_BRANCH}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next minor version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
else
echo "Patch Release: $PROWLER_VERSION"
BUMP_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.$((FIX_VERSION + 1))
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next patch version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
fi
else
echo "Releasing another Prowler major version, aborting..."
exit 1
fi
else
echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
exit 1
fi
- name: Bump versions in files
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using BUMP_VERSION_TO=$BUMP_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${BUMP_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${BUMP_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${BUMP_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.TARGET_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.BUMP_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
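
The branching in the `Get Prowler version` step can be exercised in isolation; a minimal sketch of the same bump logic, with an assumed release tag and without the major-version guard:

```bash
#!/usr/bin/env bash
# Minimal sketch of the bump branching above; the release tag is an assumption.
PROWLER_VERSION="5.6.0"
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
  MAJOR=${BASH_REMATCH[1]}; MINOR=${BASH_REMATCH[2]}; FIX=${BASH_REMATCH[3]}
  if (( FIX == 0 )); then
    # Minor release: bump master to the next minor version
    echo "Bump to ${MAJOR}.$((MINOR + 1)).0 in branch master"
  else
    # Patch release: bump the maintenance branch to the next patch version
    echo "Bump to ${MAJOR}.${MINOR}.$((FIX + 1)) in branch v${MAJOR}.${MINOR}"
  fi
else
  echo "Invalid version syntax: '${PROWLER_VERSION}' (must be N.N.N)" >&2
fi
```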
+5 -3
View File
@@ -21,6 +21,7 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
pull_request:
branches:
- "master"
@@ -30,6 +31,7 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
schedule:
- cron: '00 12 * * *'
@@ -50,16 +52,16 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
category: "/language:${{matrix.language}}"
+109 -8
View File
@@ -21,11 +21,11 @@ jobs:
python-version: ["3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@v45
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: ./**
files_ignore: |
@@ -34,21 +34,24 @@ jobs:
permissions/**
api/**
ui/**
prowler/CHANGELOG.md
README.md
mkdocs.yml
.backportrc.json
.env
docker-compose*
examples/**
.gitignore
- name: Install poetry
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
python -m pip install --upgrade pip
pipx install poetry==1.8.5
pipx install poetry==2.1.1
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@v5
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -56,7 +59,7 @@ jobs:
- name: Install dependencies
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry install
poetry install --no-root
poetry run pip list
VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
grep '"tag_name":' | \
@@ -104,15 +107,113 @@ jobs:
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
- name: Test with pytest
# Test AWS
- name: AWS - Check if any file has changed
id: aws-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/aws/**
./tests/providers/aws/**
.poetry.lock
- name: AWS - Test
if: steps.aws-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
# Test Azure
- name: Azure - Check if any file has changed
id: azure-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/azure/**
./tests/providers/azure/**
.poetry.lock
- name: Azure - Test
if: steps.azure-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
# Test GCP
- name: GCP - Check if any file has changed
id: gcp-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/gcp/**
./tests/providers/gcp/**
.poetry.lock
- name: GCP - Test
if: steps.gcp-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
# Test Kubernetes
- name: Kubernetes - Check if any file has changed
id: kubernetes-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/kubernetes/**
./tests/providers/kubernetes/**
.poetry.lock
- name: Kubernetes - Test
if: steps.kubernetes-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
# Test NHN
- name: NHN - Check if any file has changed
id: nhn-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/nhn/**
./tests/providers/nhn/**
.poetry.lock
- name: NHN - Test
if: steps.nhn-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
# Test M365
- name: M365 - Check if any file has changed
id: m365-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/m365/**
./tests/providers/m365/**
.poetry.lock
- name: M365 - Test
if: steps.m365-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Common Tests
- name: Lib - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
- name: Config - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@v5
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler
files: ./aws_coverage.xml,./azure_coverage.xml,./gcp_coverage.xml,./kubernetes_coverage.xml,./nhn_coverage.xml,./m365_coverage.xml,./lib_coverage.xml,./config_coverage.xml
+5 -5
View File
@@ -7,7 +7,7 @@ on:
env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
CACHE: "poetry"
# CACHE: "poetry"
jobs:
repository-check:
@@ -64,17 +64,17 @@ jobs:
;;
esac
- uses: actions/checkout@v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install dependencies
run: |
pipx install poetry==1.8.5
pipx install poetry==2.1.1
- name: Setup Python
uses: actions/setup-python@v5
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: ${{ env.CACHE }}
# cache: ${{ env.CACHE }}
- name: Build Prowler package
run: |
@@ -4,7 +4,7 @@ name: SDK - Refresh AWS services' regions
on:
schedule:
- cron: "0 9 * * *" #runs at 09:00 UTC everyday
- cron: "0 9 * * 1" # runs at 09:00 UTC every Monday
env:
GITHUB_BRANCH: "master"
@@ -23,12 +23,12 @@ jobs:
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@v5
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.9 #install the python needed
@@ -38,7 +38,7 @@ jobs:
pip install boto3
- name: Configure AWS Credentials -- DEV
uses: aws-actions/configure-aws-credentials@v4
uses: aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722 # v4.1.0
with:
aws-region: ${{ env.AWS_REGION_DEV }}
role-to-assume: ${{ secrets.DEV_IAM_ROLE_ARN }}
@@ -50,12 +50,13 @@ jobs:
# Create pull request
- name: Create Pull Request
uses: peter-evans/create-pull-request@v7
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services"
branch: "aws-services-regions-updated-${{ github.sha }}"
labels: "status/waiting-for-revision, severity/low, provider/aws, backport-to-v3"
labels: "status/waiting-for-revision, severity/low, provider/aws"
title: "chore(regions_update): Changes in regions for AWS services"
body: |
### Description
@@ -61,38 +61,58 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Login to DockerHub
uses: docker/login-action@v3
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@v6
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=${{ env.SHORT_SHA }}
# Set push: false for testing
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.LATEST_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.SHORT_SHA }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@v6
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${{ env.RELEASE_TAG }}
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.RELEASE_TAG }}
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.STABLE_TAG }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Trigger deployment
if: github.event_name == 'push'
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-ui-deploy
client-payload: '{"sha": "${{ github.sha }}", "short_sha": "${{ env.SHORT_SHA }}"}'
+3 -3
View File
@@ -44,16 +44,16 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
category: "/language:${{matrix.language}}"
+80
View File
@@ -0,0 +1,80 @@
name: UI - E2E Tests
on:
pull_request:
branches:
- master
- "v5.*"
paths:
- 'ui/**'
env:
SERVICES_TO_START: "api-dev postgres valkey worker-beat worker-dev"
DOCKER_COMPOSE_FILE: "docker-compose-dev.yml"
jobs:
e2e:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 20
cache: 'npm'
cache-dependency-path: './ui/package-lock.json'
# - name: Cache Playwright Browsers
# uses: actions/cache@v4
# with:
# path: ~/.cache/ms-playwright
# key: playwright-${{ runner.os }}-${{ hashFiles('**/package-lock.json') }}
# restore-keys: |
# playwright-${{ runner.os }}-
- name: Install dependencies
run: npm ci
working-directory: ./ui
- name: Install Playwright Browsers
run: npx playwright install --with-deps
working-directory: ./ui
- name: Set up Docker Compose
uses: docker/setup-compose-action@364cc21a5de5b1ee4a7f5f9d3fa374ce0ccde746 #v1.2.0
- name: Start Docker Compose
run: docker compose -f ${DOCKER_COMPOSE_FILE} up -d ${SERVICES_TO_START}
- name: Wait for API to be ready
run: |
for i in {1..30}; do
if curl -s http://localhost:8000/api/v1; then
echo "API is up!"
break
fi
echo "Waiting for API..."
sleep 5
done
- name: Run Playwright tests
run: npx playwright test
working-directory: ./ui
- name: Upload Playwright report
uses: actions/upload-artifact@v4
with:
name: playwright-report
path: ./ui/playwright-report
- name: Upload Playwright videos
uses: actions/upload-artifact@v4
with:
name: test-videos
path: ./ui/test-results/**/*.webm
- name: Docker Compose Down
if: always()
run: docker compose -f ${DOCKER_COMPOSE_FILE} down
+23 -2
View File
@@ -6,6 +6,7 @@ on:
- "master"
- "v5.*"
paths:
- ".github/workflows/ui-pull-request.yml"
- "ui/**"
pull_request:
branches:
@@ -13,6 +14,9 @@ on:
- "v5.*"
paths:
- 'ui/**'
env:
UI_WORKING_DIR: ./ui
IMAGE_NAME: prowler-ui
jobs:
test-and-coverage:
@@ -23,11 +27,11 @@ jobs:
node-version: [20.x]
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v4
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
@@ -39,3 +43,20 @@ jobs:
- name: Build the application
working-directory: ./ui
run: npm run build
test-container-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target
target: prod
push: false
tags: ${{ env.IMAGE_NAME }}:latest
outputs: type=docker
build-args: |
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_test_51LwpXXXX
+14 -1
View File
@@ -31,7 +31,7 @@ tags
*.DS_Store
# Prowler output
output/
/output
# Prowler found secrets
secrets-*/
@@ -42,13 +42,19 @@ junit-reports/
# VSCode files
.vscode/
# Cursor files
.cursorignore
.cursor/
# Terraform
.terraform*
*.tfstate
*.tfstate.*
# .env
ui/.env*
api/.env*
.env.local
# Coverage
.coverage*
@@ -60,3 +66,10 @@ node_modules
# Persistent data
_data/
# Playwright
node_modules/
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/
+21 -3
View File
@@ -27,6 +27,7 @@ repos:
hooks:
- id: shellcheck
exclude: contrib
## PYTHON
- repo: https://github.com/myint/autoflake
rev: v2.3.1
@@ -58,11 +59,28 @@ repos:
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
rev: 1.8.0
rev: 2.1.1
hooks:
- id: poetry-check
name: API - poetry-check
args: ["--directory=./api"]
pass_filenames: false
- id: poetry-lock
args: ["--no-update"]
name: API - poetry-lock
args: ["--directory=./api"]
pass_filenames: false
- id: poetry-check
name: SDK - poetry-check
args: ["--directory=./"]
pass_filenames: false
- id: poetry-lock
name: SDK - poetry-lock
args: ["--directory=./"]
pass_filenames: false
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
@@ -97,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963'
entry: bash -c 'safety check --ignore 70612,66963,74429'
language: system
- id: vulture
+45 -19
View File
@@ -1,38 +1,64 @@
FROM python:3.12.8-alpine3.20
FROM python:3.12.10-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git
ARG POWERSHELL_VERSION=7.5.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler
# Copy necessary files
WORKDIR /home/prowler
# Copy necessary files
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py
# Install Python dependencies
ENV HOME='/home/prowler'
ENV PATH="$HOME/.local/bin:$PATH"
RUN pip install --no-cache-dir --upgrade pip setuptools wheel && \
pip install --no-cache-dir .
ENV PATH="${HOME}/.local/bin:${PATH}"
#hadolint ignore=DL3013
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
# By default poetry does not compile Python source files to bytecode during installation.
# This speeds up the installation process, but the first execution may take a little more
# time because Python then compiles source files to bytecode automatically. If you want to
# compile source files to bytecode during installation, you can use the --compile option
RUN poetry install --compile && \
rm -rf ~/.cache/pip
# Install PowerShell modules
RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y
# Remove Prowler directory and build files
USER 0
RUN rm -rf /home/prowler/prowler /home/prowler/pyproject.toml /home/prowler/README.md /home/prowler/build /home/prowler/prowler.egg-info
USER prowler
ENTRYPOINT ["prowler"]
ENTRYPOINT ["poetry", "run", "prowler"]
+27 -12
View File
@@ -71,10 +71,14 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 561 | 81 -> `prowler aws --list-services` | 30 -> `prowler aws --list-compliance` | 9 -> `prowler aws --list-categories` |
| GCP | 77 | 13 -> `prowler gcp --list-services` | 4 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
| Azure | 139 | 18 -> `prowler azure --list-services` | 4 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
| Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 1 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |
| AWS | 564 | 82 | 33 | 10 |
| GCP | 79 | 13 | 7 | 3 |
| Azure | 140 | 18 | 8 | 3 |
| Kubernetes | 83 | 7 | 4 | 7 |
| M365 | 44 | 2 | 1 | 0 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
# 💻 Installation
@@ -106,7 +110,7 @@ docker compose up -d
**Requirements**
* `git` installed.
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* `poetry` v2 installed: [poetry installation](https://python-poetry.org/docs/#installation).
* `npm` installed: [npm installation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
@@ -116,7 +120,7 @@ docker compose up -d
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
poetry shell
eval $(poetry env activate)
set -a
source .env
docker compose up postgres valkey -d
@@ -124,6 +128,11 @@ cd src/backend
python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
> [!IMPORTANT]
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below 2.0.0, you must keep using `poetry shell` to activate your environment.
> If you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment

> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
@@ -133,7 +142,7 @@ gunicorn -c config/guniconf.py config.wsgi:application
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
poetry shell
eval $(poetry env activate)
set -a
source .env
cd src/backend
@@ -146,7 +155,7 @@ python -m celery -A config.celery worker -l info -E
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
poetry shell
eval $(poetry env activate)
set -a
source .env
cd src/backend
@@ -167,7 +176,7 @@ npm start
## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python >= 3.9, < 3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python > 3.9.1, < 3.13:
```console
pip install prowler
@@ -197,15 +206,21 @@ The container images are available here:
### From GitHub
Python >= 3.9, < 3.13 is required with pip and poetry:
Python > 3.9.1, < 3.13 is required with pip and poetry:
``` console
git clone https://github.com/prowler-cloud/prowler
cd prowler
poetry shell
eval $(poetry env activate)
poetry install
python prowler.py -v
python prowler-cli.py -v
```
> [!IMPORTANT]
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below 2.0.0, you must keep using `poetry shell` to activate your environment.
> If you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
+17
View File
@@ -23,6 +23,7 @@ DJANGO_SECRETS_ENCRYPTION_KEY=""
DJANGO_MANAGE_DB_PARTITIONS=[True|False]
DJANGO_CELERY_DEADLOCK_ATTEMPTS=5
DJANGO_BROKER_VISIBILITY_TIMEOUT=86400
DJANGO_SENTRY_DSN=
# PostgreSQL settings
# If running django and celery on host, use 'localhost', else use 'postgres-db'
@@ -39,3 +40,19 @@ POSTGRES_DB=prowler_db
VALKEY_HOST=[localhost|valkey]
VALKEY_PORT=6379
VALKEY_DB=0
# Sentry settings
SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
# Social login credentials
DJANGO_GOOGLE_OAUTH_CLIENT_ID=""
DJANGO_GOOGLE_OAUTH_CLIENT_SECRET=""
DJANGO_GOOGLE_OAUTH_CALLBACK_URL=""
DJANGO_GITHUB_OAUTH_CLIENT_ID=""
DJANGO_GITHUB_OAUTH_CLIENT_SECRET=""
DJANGO_GITHUB_OAUTH_CALLBACK_URL=""
# Deletion Task Batch Size
DJANGO_DELETION_BATCH_SIZE=5000
+1 -1
View File
@@ -80,7 +80,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963'
entry: bash -c 'poetry run safety check --ignore 70612,66963,74429'
language: system
- id: vulture
+80
View File
@@ -0,0 +1,80 @@
# Prowler API Changelog
All notable changes to the **Prowler API** are documented in this file.
## [v1.7.0] (Prowler v5.6.0)
### Added
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
- Added a `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
- Added a new API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
---
## [v1.6.0] (Prowler v5.5.0)
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
---
## [v1.5.4] (Prowler v5.4.4)
### Fixed
- Fixed a bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466).
---
## [v1.5.3] (Prowler v5.4.3)
### Fixed
- Added handling for duplicated scheduled scans [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401).
- Added an environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423).
---
## [v1.5.2] (Prowler v5.4.2)
### Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
---
## [v1.5.1] (Prowler v5.4.1)
### Fixed
- Added a handled response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183).
- Fixed a race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172).
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
---
## [v1.5.0] (Prowler v5.4.0)
### Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906).
- Added an API scan report system; all scans launched from the API now generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878).
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874).
### Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019).
---
## [v1.4.0] (Prowler v5.3.0)
### Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700).
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800).
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863).
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869).
---
+31 -15
View File
@@ -1,13 +1,33 @@
FROM python:3.12.8-alpine3.20 AS build
FROM python:3.12.10-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
# hadolint ignore=DL3018
RUN apk --no-cache add gcc python3-dev musl-dev linux-headers curl-dev
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
RUN apk --no-cache upgrade && \
addgroup -g 1000 prowler && \
adduser -D -u 1000 -G prowler prowler
USER prowler
WORKDIR /home/prowler
@@ -17,27 +37,23 @@ COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
COPY src/backend/ ./backend/
COPY src/backend/ ./backend/
ENV PATH="/home/prowler/.local/bin:$PATH"
RUN poetry install && \
# Add `--no-root` to avoid installing the current project as a package
RUN poetry install --no-root && \
rm -rf ~/.cache/pip
COPY docker-entrypoint.sh ./docker-entrypoint.sh
RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
WORKDIR /home/prowler/backend
# Development image
# hadolint ignore=DL3006
FROM build AS dev
USER 0
# hadolint ignore=DL3018
RUN apk --no-cache add curl vim
USER prowler
ENTRYPOINT ["../docker-entrypoint.sh", "dev"]
# Production image
+64
View File
@@ -235,6 +235,7 @@ To view the logs for any component (e.g., Django, Celery worker), you can use th
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Applying migrations
@@ -269,3 +270,66 @@ poetry shell
cd src/backend
pytest
```
# Custom commands
Django provides a way to create custom commands that can be run from the command line.
> These commands can be found in: `prowler/api/src/backend/api/management/commands`
To run a custom command, you need to be in the `prowler/api/src/backend` directory and run:
```console
poetry shell
python manage.py <command_name>
```
## Generate dummy data
```console
python manage.py findings --tenant <TENANT_ID> --findings <NUM_FINDINGS> --resources <NUM_RESOURCES> --batch <TRANSACTION_BATCH_SIZE> --alias <ALIAS>
```
This command creates, for a given tenant, a provider, a scan, and a set of related findings and resources.
> Scan progress and state are updated in real time.
> - 0-33%: Create resources.
> - 33-66%: Create findings.
> - 66%: Create resource-finding mapping.
>
> The last step is required to access the finding details, since the UI needs the mapping to display all the information.
### Example
```console
~/backend $ poetry run python manage.py findings --tenant fffb1893-3fc7-4623-a5d9-fae47da1c528 --findings 25000 --resources 1000 --batch 5000 --alias test-script
Starting data population
Tenant: fffb1893-3fc7-4623-a5d9-fae47da1c528
Alias: test-script
Resources: 1000
Findings: 25000
Batch size: 5000
Creating resources...
100%|███████████████████████| 1/1 [00:00<00:00, 7.72it/s]
Resources created successfully.
Creating findings...
100%|███████████████████████| 5/5 [00:05<00:00, 1.09s/it]
Findings created successfully.
Creating resource-finding mappings...
100%|███████████████████████| 5/5 [00:02<00:00, 1.81it/s]
Resource-finding mappings created successfully.
Successfully populated test data.
```
+1 -1
View File
@@ -28,7 +28,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans -E
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}
start_worker_beat() {
+1854 -1441
View File
File diff suppressed because it is too large.
+35 -29
View File
@@ -2,43 +2,51 @@
build-backend = "poetry.core.masonry.api"
requires = ["poetry-core"]
[tool.poetry]
authors = ["Prowler Team"]
[project]
authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.8",
"django-allauth==65.4.1",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
"django-environ==0.11.2",
"django-filter==24.3",
"django-guid==3.5.0",
"django-postgres-extra (>=2.0.8,<3.0.0)",
"djangorestframework==3.15.2",
"djangorestframework-jsonapi==7.0.2",
"djangorestframework-simplejwt (>=5.3.1,<6.0.0)",
"drf-nested-routers (>=0.94.1,<1.0.0)",
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
name = "prowler-api"
package-mode = false
version = "1.1.0"
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.7.0"
[tool.poetry.dependencies]
celery = {extras = ["pytest"], version = "^5.4.0"}
django = "5.1.4"
django-celery-beat = "^2.7.0"
django-celery-results = "^2.5.1"
django-cors-headers = "4.4.0"
django-environ = "0.11.2"
django-filter = "24.3"
django-guid = "3.5.0"
django-postgres-extra = "^2.0.8"
djangorestframework = "3.15.2"
djangorestframework-jsonapi = "7.0.2"
djangorestframework-simplejwt = "^5.3.1"
drf-nested-routers = "^0.94.1"
drf-spectacular = "0.27.2"
drf-spectacular-jsonapi = "0.5.1"
gunicorn = "23.0.0"
prowler = "^5.0"
psycopg2-binary = "2.9.9"
pytest-celery = {extras = ["redis"], version = "^1.0.1"}
# Needed for prowler compatibility
python = ">=3.11,<3.13"
uuid6 = "2024.7.10"
[project.scripts]
celery = "src.backend.config.settings.celery"
[tool.poetry.group.dev.dependencies]
bandit = "1.7.9"
coverage = "7.5.4"
django-silk = "5.3.2"
docker = "7.1.0"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"
@@ -49,7 +57,5 @@ pytest-randomly = "3.15.0"
pytest-xdist = "3.6.1"
ruff = "0.5.0"
safety = "3.2.9"
tqdm = "4.67.1"
vulture = "2.14"
[tool.poetry.scripts]
celery = "src.backend.config.settings.celery"
+61
View File
@@ -0,0 +1,61 @@
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from django.db import transaction
from api.db_router import MainRouter
from api.db_utils import rls_transaction
from api.models import Membership, Role, Tenant, User, UserRoleRelationship
class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
@staticmethod
def get_user_by_email(email: str):
try:
return User.objects.get(email=email)
except User.DoesNotExist:
return None
def pre_social_login(self, request, sociallogin):
# Link existing accounts with the same email address
email = sociallogin.account.extra_data.get("email")
if email:
existing_user = self.get_user_by_email(email)
if existing_user:
sociallogin.connect(request, existing_user)
def save_user(self, request, sociallogin, form=None):
"""
Called after the user data is fully populated from the provider
and is about to be saved to the DB for the first time.
"""
with transaction.atomic(using=MainRouter.admin_db):
user = super().save_user(request, sociallogin, form)
user.save(using=MainRouter.admin_db)
social_account_name = sociallogin.account.extra_data.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
return user
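
django-allauth picks an adapter like this up via the `SOCIALACCOUNT_ADAPTER` setting; a minimal sketch of the wiring, with an assumed dotted module path:

```python
# settings.py — hypothetical wiring; the dotted module path is an assumption.
SOCIALACCOUNT_ADAPTER = "api.adapters.ProwlerSocialAccountAdapter"
```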
+28 -2
View File
@@ -1,12 +1,38 @@
from types import MappingProxyType
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
from api.models import Provider
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
provider will return the cached result.
Args:
provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
available compliance frameworks (e.g., "aws", "azure", "gcp", "m365").
Returns:
list[str]: A list of framework identifiers (e.g., "cis_1.4_aws", "mitre_attack_azure") available
for the given provider.
"""
global AVAILABLE_COMPLIANCE_FRAMEWORKS
if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
get_available_compliance_frameworks(provider_type)
)
return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]
def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):
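
A usage sketch of the lazy cache above (the provider string is assumed to be a valid `Provider.ProviderChoices` value):

```python
# First call loads the frameworks and caches them at module level.
frameworks = get_compliance_frameworks("aws")
# Later calls for the same provider return the same cached list object.
assert frameworks is get_compliance_frameworks("aws")
```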
+13 -2
View File
@@ -1,18 +1,29 @@
ALLOWED_APPS = ("django", "socialaccount", "account", "authtoken", "silk")
class MainRouter:
default_db = "default"
admin_db = "admin"
def db_for_read(self, model, **hints): # noqa: F841
model_table_name = model._meta.db_table
if model_table_name.startswith("django_"):
if model_table_name.startswith("django_") or any(
model_table_name.startswith(f"{app}_") for app in ALLOWED_APPS
):
return self.admin_db
return None
def db_for_write(self, model, **hints): # noqa: F841
model_table_name = model._meta.db_table
if model_table_name.startswith("django_"):
if any(model_table_name.startswith(f"{app}_") for app in ALLOWED_APPS):
return self.admin_db
return None
def allow_migrate(self, db, app_label, model_name=None, **hints): # noqa: F841
return db == self.admin_db
def allow_relation(self, obj1, obj2, **hints): # noqa: F841
# Allow relations if both objects are in either "default" or "admin" db connectors
if {obj1._state.db, obj2._state.db} <= {self.default_db, self.admin_db}:
return True
return None
+36 -9
View File
@@ -6,6 +6,7 @@ from datetime import datetime, timedelta, timezone
from django.conf import settings
from django.contrib.auth.models import BaseUserManager
from django.db import connection, models, transaction
from django_celery_beat.models import PeriodicTask
from psycopg2 import connect as psycopg2_connect
from psycopg2.extensions import AsIs, new_type, register_adapter, register_type
from rest_framework_json_api.serializers import ValidationError
@@ -105,11 +106,12 @@ def generate_random_token(length: int = 14, symbols: str | None = None) -> str:
return "".join(secrets.choice(symbols or _symbols) for _ in range(length))
def batch_delete(queryset, batch_size=5000):
def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_SIZE):
"""
Deletes objects in batches and returns the total number of deletions and a summary.
Args:
tenant_id (str): Tenant ID the queryset belongs to.
queryset (QuerySet): The queryset of objects to delete.
batch_size (int): The number of objects to delete in each batch.
@@ -120,15 +122,16 @@ def batch_delete(queryset, batch_size=5000):
deletion_summary = {}
while True:
# Get a batch of IDs to delete
batch_ids = set(
queryset.values_list("id", flat=True).order_by("id")[:batch_size]
)
if not batch_ids:
# No more objects to delete
break
with rls_transaction(tenant_id, POSTGRES_TENANT_VAR):
# Get a batch of IDs to delete
batch_ids = set(
queryset.values_list("id", flat=True).order_by("id")[:batch_size]
)
if not batch_ids:
# No more objects to delete
break
deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
total_deleted += deleted_count
for model_label, count in deleted_info.items():
@@ -137,6 +140,18 @@ def batch_delete(queryset, batch_size=5000):
return total_deleted, deletion_summary
def delete_related_daily_task(provider_id: str):
"""
Deletes the periodic task associated with a specific provider.
Args:
provider_id (str): The unique identifier for the provider
whose related periodic task should be deleted.
"""
task_name = f"scan-perform-scheduled-{provider_id}"
PeriodicTask.objects.filter(name=task_name).delete()
# Postgres Enums
@@ -318,3 +333,15 @@ class InvitationStateEnum(EnumType):
class InvitationStateEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("invitation_state", *args, **kwargs)
# Postgres enum definition for Integration type
class IntegrationTypeEnum(EnumType):
enum_type_name = "integration_type"
class IntegrationTypeEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("integration_type", *args, **kwargs)
+25 -16
View File
@@ -7,7 +7,7 @@ from rest_framework_json_api.serializers import ValidationError
from api.db_utils import POSTGRES_TENANT_VAR, SET_CONFIG_QUERY
def set_tenant(func):
def set_tenant(func=None, *, keep_tenant=False):
"""
Decorator to set the tenant context for a Celery task based on the provided tenant_id.
@@ -40,20 +40,29 @@ def set_tenant(func):
# The tenant context will be set before the task logic executes.
"""
@wraps(func)
@transaction.atomic
def wrapper(*args, **kwargs):
try:
tenant_id = kwargs.pop("tenant_id")
except KeyError:
raise KeyError("This task requires the tenant_id")
try:
uuid.UUID(tenant_id)
except ValueError:
raise ValidationError("Tenant ID must be a valid UUID")
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])
def decorator(func):
@wraps(func)
@transaction.atomic
def wrapper(*args, **kwargs):
try:
if not keep_tenant:
tenant_id = kwargs.pop("tenant_id")
else:
tenant_id = kwargs["tenant_id"]
except KeyError:
raise KeyError("This task requires the tenant_id")
try:
uuid.UUID(tenant_id)
except ValueError:
raise ValidationError("Tenant ID must be a valid UUID")
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])
return func(*args, **kwargs)
return func(*args, **kwargs)
return wrapper
return wrapper
if func is None:
return decorator
else:
return decorator(func)
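
With this refactor, `set_tenant` works both bare and parametrized; a sketch of the two decorator forms (the task names and bodies are assumptions):

```python
from celery import shared_task

@shared_task
@set_tenant
def generate_report(scan_id: str):
    # tenant_id was popped by the decorator after setting the Postgres tenant context
    ...

@shared_task
@set_tenant(keep_tenant=True)
def delete_tenant_data(tenant_id: str):
    # keep_tenant=True leaves tenant_id in kwargs so the task body can still use it
    ...
```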
+120 -25
View File
@@ -1,4 +1,4 @@
from datetime import date, datetime, timezone
from datetime import date, datetime, timedelta, timezone
from django.conf import settings
from django.db.models import Q
@@ -24,6 +24,7 @@ from api.db_utils import (
from api.models import (
ComplianceOverview,
Finding,
Integration,
Invitation,
Membership,
PermissionChoices,
@@ -286,6 +287,9 @@ class FindingFilter(FilterSet):
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
@@ -319,13 +323,41 @@ class FindingFilter(FilterSet):
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
scan = UUIDFilter(method="filter_scan_id")
scan__in = UUIDInFilter(method="filter_scan_id_in")
inserted_at = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__date = DateFilter(method="filter_inserted_at", lookup_expr="date")
inserted_at__gte = DateFilter(method="filter_inserted_at_gte")
inserted_at__lte = DateFilter(method="filter_inserted_at_lte")
inserted_at__gte = DateFilter(
method="filter_inserted_at_gte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
inserted_at__lte = DateFilter(
method="filter_inserted_at_lte",
help_text=f"Maximum date range is {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
)
class Meta:
model = Finding
@@ -353,6 +385,52 @@ class FindingFilter(FilterSet):
},
}
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
or self.data.get("inserted_at__date")
or self.data.get("inserted_at__gte")
or self.data.get("inserted_at__lte")
):
raise ValidationError(
[
{
"detail": "At least one date filter is required: filter[inserted_at], filter[inserted_at.gte], "
"or filter[inserted_at.lte].",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "required",
}
]
)
gte_date = (
datetime.strptime(self.data.get("inserted_at__gte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__gte")
else datetime.now(timezone.utc).date()
)
lte_date = (
datetime.strptime(self.data.get("inserted_at__lte"), "%Y-%m-%d").date()
if self.data.get("inserted_at__lte")
else datetime.now(timezone.utc).date()
)
if abs(lte_date - gte_date) > timedelta(
days=settings.FINDINGS_MAX_DAYS_IN_RANGE
):
raise ValidationError(
[
{
"detail": f"The date range cannot exceed {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
"status": 400,
"source": {"pointer": "/data/attributes/inserted_at"},
"code": "invalid",
}
]
)
return super().filter_queryset(queryset)
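
In request terms, the validation above means a findings query must carry a scan or date filter, and any date window must stay within `FINDINGS_MAX_DAYS_IN_RANGE`; illustrative requests (the path is assumed from the API docs):

```console
GET /api/v1/findings?filter[inserted_at.gte]=2024-09-01&filter[inserted_at.lte]=2024-09-07   # accepted: window within the limit
GET /api/v1/findings                                                                         # rejected: 400, a date filter is required
```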
# Convert filter values to UUIDv7 values for use with partitioning
def filter_scan_id(self, queryset, name, value):
try:
@@ -373,9 +451,7 @@ class FindingFilter(FilterSet):
)
return (
queryset.filter(id__gte=start)
.filter(id__lt=end)
.filter(scan__id=value_uuid)
queryset.filter(id__gte=start).filter(id__lt=end).filter(scan_id=value_uuid)
)
def filter_scan_id_in(self, queryset, name, value):
@@ -400,31 +476,42 @@ class FindingFilter(FilterSet):
]
)
if start == end:
return queryset.filter(id__gte=start).filter(scan__id__in=uuid_list)
return queryset.filter(id__gte=start).filter(scan_id__in=uuid_list)
else:
return (
queryset.filter(id__gte=start)
.filter(id__lt=end)
.filter(scan__id__in=uuid_list)
.filter(scan_id__in=uuid_list)
)
def filter_inserted_at(self, queryset, name, value):
value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(value))
datetime_value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(datetime_value))
end = uuid7_start(datetime_to_uuid7(datetime_value + timedelta(days=1)))
return queryset.filter(id__gte=start).filter(inserted_at__date=value)
return queryset.filter(id__gte=start, id__lt=end)
def filter_inserted_at_gte(self, queryset, name, value):
value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(value))
datetime_value = self.maybe_date_to_datetime(value)
start = uuid7_start(datetime_to_uuid7(datetime_value))
return queryset.filter(id__gte=start).filter(inserted_at__gte=value)
return queryset.filter(id__gte=start)
def filter_inserted_at_lte(self, queryset, name, value):
value = self.maybe_date_to_datetime(value)
end = uuid7_start(datetime_to_uuid7(value))
datetime_value = self.maybe_date_to_datetime(value)
end = uuid7_start(datetime_to_uuid7(datetime_value + timedelta(days=1)))
return queryset.filter(id__lte=end).filter(inserted_at__lte=value)
return queryset.filter(id__lt=end)
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
@staticmethod
def maybe_date_to_datetime(value):
@@ -530,12 +617,6 @@ class ScanSummaryFilter(FilterSet):
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
region = CharFilter(field_name="region")
muted_findings = BooleanFilter(method="filter_muted_findings")
def filter_muted_findings(self, queryset, name, value):
if not value:
return queryset.exclude(muted__gt=0)
return queryset
class Meta:
model = ScanSummary
@@ -546,8 +627,6 @@ class ScanSummaryFilter(FilterSet):
class ServiceOverviewFilter(ScanSummaryFilter):
muted_findings = None
def is_valid(self):
# Check if at least one of the inserted_at filters is present
inserted_at_filters = [
@@ -565,3 +644,19 @@ class ServiceOverviewFilter(ScanSummaryFilter):
}
)
return super().is_valid()
class IntegrationFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
integration_type = ChoiceFilter(choices=Integration.IntegrationChoices.choices)
integration_type__in = ChoiceInFilter(
choices=Integration.IntegrationChoices.choices,
field_name="integration_type",
lookup_expr="in",
)
class Meta:
model = Integration
fields = {
"inserted_at": ["date", "gte", "lte"],
}
@@ -122,6 +122,22 @@
"scanner_args": {}
}
},
{
"model": "api.provider",
"pk": "7791914f-d646-4fe2-b2ed-73f2c6499a36",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:45:26.352Z",
"updated_at": "2024-10-18T11:16:23.533Z",
"provider": "kubernetes",
"uid": "gke_lucky-coast-419309_us-central1_autopilot-cluster-2",
"alias": "k8s_testing_2",
"connected": true,
"connection_last_checked_at": "2024-10-18T11:16:23.503Z",
"metadata": {},
"scanner_args": {}
}
},
{
"model": "api.providersecret",
"pk": "11491b47-75ae-4f71-ad8d-3e630a72182e",
@@ -11,9 +11,7 @@
"unique_resource_count": 1,
"duration": 5,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled"
]
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-01T17:25:27.050Z",
"started_at": "2024-09-01T17:25:27.050Z",
@@ -33,9 +31,7 @@
"unique_resource_count": 1,
"duration": 20,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled"
]
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-02T17:24:27.050Z",
"started_at": "2024-09-02T17:24:27.050Z",
@@ -55,9 +51,7 @@
"unique_resource_count": 10,
"duration": 10,
"scanner_args": {
"checks_to_execute": [
"cloudsql_instance_automated_backups"
]
"checks_to_execute": ["cloudsql_instance_automated_backups"]
},
"inserted_at": "2024-09-02T19:26:27.050Z",
"started_at": "2024-09-02T19:26:27.050Z",
@@ -77,9 +71,7 @@
"unique_resource_count": 1,
"duration": 35,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled"
]
"checks_to_execute": ["accessanalyzer_enabled"]
},
"inserted_at": "2024-09-02T19:27:27.050Z",
"started_at": "2024-09-02T19:27:27.050Z",
@@ -97,9 +89,7 @@
"name": "test scheduled aws scan",
"state": "available",
"scanner_args": {
"checks_to_execute": [
"cloudformation_stack_outputs_find_secrets"
]
"checks_to_execute": ["cloudformation_stack_outputs_find_secrets"]
},
"scheduled_at": "2030-09-02T19:20:27.050Z",
"inserted_at": "2024-09-02T19:24:27.050Z",
@@ -178,9 +168,7 @@
"unique_resource_count": 19,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled"
]
"checks_to_execute": ["accessanalyzer_enabled"]
},
"duration": 7,
"scheduled_at": null,
@@ -190,6 +178,56 @@
"completed_at": "2024-10-18T10:46:05.127Z"
}
},
{
"model": "api.scan",
"pk": "6dd8925f-a52d-48de-a546-d2d90db30ab1",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan azure",
"provider": "1b59e032-3eb6-4694-93a5-df84cd9b3ce2",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 20,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"account_security_contact_information_is_registered"
]
},
"duration": 4,
"scheduled_at": null,
"inserted_at": "2024-10-18T11:16:21.358Z",
"updated_at": "2024-10-18T11:16:26.060Z",
"started_at": "2024-10-18T11:16:21.593Z",
"completed_at": "2024-10-18T11:16:26.060Z"
}
},
{
"model": "api.scan",
"pk": "4ca7ce89-3236-41a8-a369-8937bc152af5",
"fields": {
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"name": "real scan k8s",
"provider": "7791914f-d646-4fe2-b2ed-73f2c6499a36",
"trigger": "manual",
"state": "completed",
"unique_resource_count": 20,
"progress": 100,
"scanner_args": {
"checks_to_execute": [
"accessanalyzer_enabled",
"account_security_contact_information_is_registered"
]
},
"duration": 4,
"scheduled_at": null,
"inserted_at": "2024-10-18T11:16:21.358Z",
"updated_at": "2024-10-18T11:16:26.060Z",
"started_at": "2024-10-18T11:16:21.593Z",
"completed_at": "2024-10-18T11:16:26.060Z"
}
},
{
"model": "api.scan",
"pk": "01929f57-c0ee-7553-be0b-cbde006fb6f7",
@@ -6,6 +6,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.823Z",
"updated_at": "2024-10-18T10:46:04.841Z",
"first_seen_at": "2024-10-18T10:46:04.823Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-south-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -61,6 +62,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.855Z",
"updated_at": "2024-10-18T10:46:04.858Z",
"first_seen_at": "2024-10-18T10:46:04.855Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-3-112233445566",
"delta": "new",
"status": "FAIL",
@@ -116,6 +118,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.869Z",
"updated_at": "2024-10-18T10:46:04.876Z",
"first_seen_at": "2024-10-18T10:46:04.869Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-central-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -171,6 +174,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.888Z",
"updated_at": "2024-10-18T10:46:04.892Z",
"first_seen_at": "2024-10-18T10:46:04.888Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -226,6 +230,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.901Z",
"updated_at": "2024-10-18T10:46:04.905Z",
"first_seen_at": "2024-10-18T10:46:04.901Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-east-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -281,6 +286,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.915Z",
"updated_at": "2024-10-18T10:46:04.919Z",
"first_seen_at": "2024-10-18T10:46:04.915Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-south-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -336,6 +342,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.929Z",
"updated_at": "2024-10-18T10:46:04.934Z",
"first_seen_at": "2024-10-18T10:46:04.929Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-west-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -391,6 +398,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.944Z",
"updated_at": "2024-10-18T10:46:04.947Z",
"first_seen_at": "2024-10-18T10:46:04.944Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ca-central-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -446,6 +454,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.957Z",
"updated_at": "2024-10-18T10:46:04.962Z",
"first_seen_at": "2024-10-18T10:46:04.957Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-east-1-ConsoleAnalyzer-83b66ad7-d024-454e-b851-52d11cc1cf7c",
"delta": "new",
"status": "PASS",
@@ -501,6 +510,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.971Z",
"updated_at": "2024-10-18T10:46:04.975Z",
"first_seen_at": "2024-10-18T10:46:04.971Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -556,6 +566,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.984Z",
"updated_at": "2024-10-18T10:46:04.989Z",
"first_seen_at": "2024-10-18T10:46:04.984Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-sa-east-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -611,6 +622,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:04.999Z",
"updated_at": "2024-10-18T10:46:05.003Z",
"first_seen_at": "2024-10-18T10:46:04.999Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-north-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -666,6 +678,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.013Z",
"updated_at": "2024-10-18T10:46:05.018Z",
"first_seen_at": "2024-10-18T10:46:05.013Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-west-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -721,6 +734,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.029Z",
"updated_at": "2024-10-18T10:46:05.033Z",
"first_seen_at": "2024-10-18T10:46:05.029Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-southeast-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -776,6 +790,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.045Z",
"updated_at": "2024-10-18T10:46:05.050Z",
"first_seen_at": "2024-10-18T10:46:05.045Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-central-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -831,6 +846,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.061Z",
"updated_at": "2024-10-18T10:46:05.065Z",
"first_seen_at": "2024-10-18T10:46:05.061Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-1-112233445566",
"delta": "new",
"status": "FAIL",
@@ -886,6 +902,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.080Z",
"updated_at": "2024-10-18T10:46:05.085Z",
"first_seen_at": "2024-10-18T10:46:05.080Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-southeast-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -941,6 +958,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.099Z",
"updated_at": "2024-10-18T10:46:05.104Z",
"first_seen_at": "2024-10-18T10:46:05.099Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-2-112233445566",
"delta": "new",
"status": "FAIL",
@@ -996,6 +1014,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T10:46:05.115Z",
"updated_at": "2024-10-18T10:46:05.121Z",
"first_seen_at": "2024-10-18T10:46:05.115Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-3-112233445566",
"delta": "new",
"status": "FAIL",
@@ -1051,6 +1070,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.489Z",
"updated_at": "2024-10-18T11:16:24.506Z",
"first_seen_at": "2024-10-18T10:46:04.823Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-south-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1106,6 +1126,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.518Z",
"updated_at": "2024-10-18T11:16:24.521Z",
"first_seen_at": "2024-10-18T10:46:04.855Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-3-112233445566",
"delta": null,
"status": "FAIL",
@@ -1161,6 +1182,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.526Z",
"updated_at": "2024-10-18T11:16:24.529Z",
"first_seen_at": "2024-10-18T10:46:04.869Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-central-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1216,6 +1238,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.535Z",
"updated_at": "2024-10-18T11:16:24.538Z",
"first_seen_at": "2024-10-18T10:46:04.888Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1271,6 +1294,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.544Z",
"updated_at": "2024-10-18T11:16:24.546Z",
"first_seen_at": "2024-10-18T10:46:04.901Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-east-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1326,6 +1350,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.551Z",
"updated_at": "2024-10-18T11:16:24.554Z",
"first_seen_at": "2024-10-18T10:46:04.915Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-south-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1381,6 +1406,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.560Z",
"updated_at": "2024-10-18T11:16:24.562Z",
"first_seen_at": "2024-10-18T10:46:04.929Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-west-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1436,6 +1462,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.567Z",
"updated_at": "2024-10-18T11:16:24.569Z",
"first_seen_at": "2024-10-18T10:46:04.944Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ca-central-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1491,6 +1518,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.573Z",
"updated_at": "2024-10-18T11:16:24.575Z",
"first_seen_at": "2024-10-18T10:46:04.957Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-east-1-ConsoleAnalyzer-83b66ad7-d024-454e-b851-52d11cc1cf7c",
"delta": null,
"status": "PASS",
@@ -1546,6 +1574,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.580Z",
"updated_at": "2024-10-18T11:16:24.582Z",
"first_seen_at": "2024-10-18T10:46:04.971Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-west-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1601,6 +1630,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.587Z",
"updated_at": "2024-10-18T11:16:24.589Z",
"first_seen_at": "2024-10-18T10:46:04.984Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-sa-east-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1656,6 +1686,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.595Z",
"updated_at": "2024-10-18T11:16:24.597Z",
"first_seen_at": "2024-10-18T10:46:04.999Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-north-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1711,6 +1742,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.602Z",
"updated_at": "2024-10-18T11:16:24.604Z",
"first_seen_at": "2024-10-18T10:46:05.013Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-us-west-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1766,6 +1798,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.610Z",
"updated_at": "2024-10-18T11:16:24.612Z",
"first_seen_at": "2024-10-18T10:46:05.029Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-southeast-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1821,6 +1854,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.617Z",
"updated_at": "2024-10-18T11:16:24.620Z",
"first_seen_at": "2024-10-18T10:46:05.045Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-eu-central-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1876,6 +1910,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.625Z",
"updated_at": "2024-10-18T11:16:24.627Z",
"first_seen_at": "2024-10-18T10:46:05.061Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-1-112233445566",
"delta": null,
"status": "FAIL",
@@ -1931,6 +1966,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.632Z",
"updated_at": "2024-10-18T11:16:24.634Z",
"first_seen_at": "2024-10-18T10:46:05.080Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-southeast-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -1986,6 +2022,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.639Z",
"updated_at": "2024-10-18T11:16:24.642Z",
"first_seen_at": "2024-10-18T10:46:05.099Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-2-112233445566",
"delta": null,
"status": "FAIL",
@@ -2041,6 +2078,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:24.646Z",
"updated_at": "2024-10-18T11:16:24.648Z",
"first_seen_at": "2024-10-18T10:46:05.115Z",
"uid": "prowler-aws-accessanalyzer_enabled-112233445566-ap-northeast-3-112233445566",
"delta": null,
"status": "FAIL",
@@ -2096,6 +2134,7 @@
"tenant": "12646005-9067-4d2a-a098-8bb378604362",
"inserted_at": "2024-10-18T11:16:26.033Z",
"updated_at": "2024-10-18T11:16:26.045Z",
"first_seen_at": "2024-10-18T11:16:26.033Z",
"uid": "prowler-aws-account_security_contact_information_is_registered-112233445566-us-east-1-112233445566",
"delta": "new",
"status": "MANUAL",
@@ -0,0 +1,237 @@
import random
from datetime import datetime, timezone
from math import ceil
from uuid import uuid4
from django.core.management.base import BaseCommand
from tqdm import tqdm
from api.db_utils import rls_transaction
from api.models import (
Finding,
Provider,
Resource,
ResourceFindingMapping,
Scan,
StatusChoices,
)
from prowler.lib.check.models import CheckMetadata
class Command(BaseCommand):
help = "Populates the database with test data for performance testing."
def add_arguments(self, parser):
parser.add_argument(
"--tenant",
type=str,
required=True,
help="Tenant id for which the data will be populated.",
)
parser.add_argument(
"--resources",
type=int,
required=True,
help="The number of resources to create.",
)
parser.add_argument(
"--findings",
type=int,
required=True,
help="The number of findings to create.",
)
parser.add_argument(
"--batch", type=int, required=True, help="The batch size for bulk creation."
)
parser.add_argument(
"--alias",
type=str,
required=False,
help="Optional alias for the provider and scan",
)
def handle(self, *args, **options):
tenant_id = options["tenant"]
num_resources = options["resources"]
num_findings = options["findings"]
batch_size = options["batch"]
alias = options["alias"] or "Testing"
uid_token = str(uuid4())
self.stdout.write(self.style.NOTICE("Starting data population"))
self.stdout.write(self.style.NOTICE(f"\tTenant: {tenant_id}"))
self.stdout.write(self.style.NOTICE(f"\tAlias: {alias}"))
self.stdout.write(self.style.NOTICE(f"\tResources: {num_resources}"))
self.stdout.write(self.style.NOTICE(f"\tFindings: {num_findings}"))
self.stdout.write(self.style.NOTICE(f"\tBatch size: {batch_size}\n\n"))
# Resource metadata
possible_regions = [
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2",
"ca-central-1",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"ap-southeast-1",
"ap-southeast-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"sa-east-1",
]
possible_services = []
possible_types = []
bulk_check_metadata = CheckMetadata.get_bulk(provider="aws")
for check_metadata in bulk_check_metadata.values():
if check_metadata.ServiceName not in possible_services:
possible_services.append(check_metadata.ServiceName)
if (
check_metadata.ResourceType
and check_metadata.ResourceType not in possible_types
):
possible_types.append(check_metadata.ResourceType)
with rls_transaction(tenant_id):
provider, _ = Provider.all_objects.get_or_create(
tenant_id=tenant_id,
provider="aws",
connected=True,
uid=str(random.randint(100000000000, 999999999999)),
defaults={
"alias": alias,
},
)
with rls_transaction(tenant_id):
scan = Scan.all_objects.create(
tenant_id=tenant_id,
provider=provider,
name=alias,
trigger="manual",
state="executing",
progress=0,
started_at=datetime.now(timezone.utc),
)
scan_state = "completed"
try:
# Create resources
resources = []
for i in range(num_resources):
resources.append(
Resource(
tenant_id=tenant_id,
provider_id=provider.id,
uid=f"testing-{uid_token}-{i}",
name=f"Testing {uid_token}-{i}",
region=random.choice(possible_regions),
service=random.choice(possible_services),
type=random.choice(possible_types),
)
)
num_batches = ceil(len(resources) / batch_size)
self.stdout.write(self.style.WARNING("Creating resources..."))
for i in tqdm(range(0, len(resources), batch_size), total=num_batches):
with rls_transaction(tenant_id):
Resource.all_objects.bulk_create(resources[i : i + batch_size])
self.stdout.write(self.style.SUCCESS("Resources created successfully.\n\n"))
with rls_transaction(tenant_id):
scan.progress = 33
scan.save()
# Create Findings
findings = []
possible_deltas = ["new", "changed", None]
possible_severities = ["critical", "high", "medium", "low"]
findings_resources_mapping = []
for i in range(num_findings):
severity = random.choice(possible_severities)
check_id = random.randint(1, 1000)
assigned_resource_num = random.randint(0, len(resources) - 1)
assigned_resource = resources[assigned_resource_num]
findings_resources_mapping.append(assigned_resource_num)
findings.append(
Finding(
tenant_id=tenant_id,
scan=scan,
uid=f"testing-{uid_token}-{i}",
delta=random.choice(possible_deltas),
check_id=f"check-{check_id}",
status=random.choice(list(StatusChoices)),
severity=severity,
impact=severity,
raw_result={},
check_metadata={
"checktitle": f"Test title for check {check_id}",
"risk": f"Testing risk {uid_token}-{i}",
"provider": "aws",
"severity": severity,
"categories": ["category1", "category2", "category3"],
"description": "This is a random description that should not matter for testing purposes.",
"servicename": assigned_resource.service,
"resourcetype": assigned_resource.type,
},
)
)
num_batches = ceil(len(findings) / batch_size)
self.stdout.write(self.style.WARNING("Creating findings..."))
for i in tqdm(range(0, len(findings), batch_size), total=num_batches):
with rls_transaction(tenant_id):
Finding.all_objects.bulk_create(findings[i : i + batch_size])
self.stdout.write(self.style.SUCCESS("Findings created successfully.\n\n"))
with rls_transaction(tenant_id):
scan.progress = 66
scan.save()
# Create ResourceFindingMapping
mappings = []
for index, f in enumerate(findings):
mappings.append(
ResourceFindingMapping(
tenant_id=tenant_id,
resource=resources[findings_resources_mapping[index]],
finding=f,
)
)
num_batches = ceil(len(mappings) / batch_size)
self.stdout.write(
self.style.WARNING("Creating resource-finding mappings...")
)
for i in tqdm(range(0, len(mappings), batch_size), total=num_batches):
with rls_transaction(tenant_id):
ResourceFindingMapping.objects.bulk_create(
mappings[i : i + batch_size]
)
self.stdout.write(
self.style.SUCCESS(
"Resource-finding mappings created successfully.\n\n"
)
)
except Exception as e:
self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
scan_state = "failed"
finally:
scan.completed_at = datetime.now(timezone.utc)
scan.duration = int(
(datetime.now(timezone.utc) - scan.started_at).total_seconds()
)
scan.progress = 100
scan.state = scan_state
scan.unique_resource_count = num_resources
with rls_transaction(tenant_id):
scan.save()
self.stdout.write(self.style.NOTICE("Successfully populated test data."))
@@ -0,0 +1,15 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0005_rbac_missing_admin_roles"),
]
operations = [
migrations.AddField(
model_name="finding",
name="first_seen_at",
field=models.DateTimeField(editable=False, null=True),
),
]
@@ -0,0 +1,25 @@
# Generated by Django 5.1.5 on 2025-01-28 15:03
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0006_findings_first_seen"),
]
operations = [
migrations.AddIndex(
model_name="scan",
index=models.Index(
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
),
migrations.AddIndex(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id"], name="scan_summaries_tenant_scan_idx"
),
),
]
@@ -0,0 +1,64 @@
import json
from datetime import datetime, timedelta, timezone
import django.db.models.deletion
from django.db import migrations, models
from django_celery_beat.models import PeriodicTask
from api.db_utils import rls_transaction
from api.models import Scan, StateChoices
def migrate_daily_scheduled_scan_tasks(apps, schema_editor):
for daily_scheduled_scan_task in PeriodicTask.objects.filter(
task="scan-perform-scheduled"
):
task_kwargs = json.loads(daily_scheduled_scan_task.kwargs)
tenant_id = task_kwargs["tenant_id"]
provider_id = task_kwargs["provider_id"]
current_time = datetime.now(timezone.utc)
scheduled_time_today = datetime.combine(
current_time.date(),
daily_scheduled_scan_task.start_time.time(),
tzinfo=timezone.utc,
)
if current_time < scheduled_time_today:
next_scan_date = scheduled_time_today
else:
next_scan_date = scheduled_time_today + timedelta(days=1)
with rls_transaction(tenant_id):
Scan.objects.create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.SCHEDULED,
scheduled_at=next_scan_date,
scheduler_task_id=daily_scheduled_scan_task.id,
)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0007_scan_and_scan_summaries_indexes"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.AddField(
model_name="scan",
name="scheduler_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="django_celery_beat.periodictask",
),
),
migrations.RunPython(migrate_daily_scheduled_scan_tasks),
]
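The next-run computation in migrate_daily_scheduled_scan_tasks boils down to: if today's scheduled time is still ahead, schedule today; otherwise schedule tomorrow. A small worked example with assumed timestamps:

from datetime import datetime, timedelta, timezone

start_time = datetime(2025, 2, 7, 9, 30, tzinfo=timezone.utc)   # assumed PeriodicTask.start_time
current_time = datetime(2025, 2, 7, 8, 0, tzinfo=timezone.utc)  # assumed "now"

scheduled_time_today = datetime.combine(
    current_time.date(), start_time.time(), tzinfo=timezone.utc
)
# 08:00 < 09:30, so the migrated scan is scheduled for today at 09:30 UTC;
# had "now" been 10:00, it would be tomorrow at 09:30 UTC.
next_scan_date = (
    scheduled_time_today
    if current_time < scheduled_time_today
    else scheduled_time_today + timedelta(days=1)
)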
@@ -0,0 +1,22 @@
# Generated by Django 5.1.5 on 2025-02-07 09:42
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0008_daily_scheduled_tasks_update"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="uid",
field=models.CharField(
max_length=250,
validators=[django.core.validators.MinLengthValidator(3)],
verbose_name="Unique identifier for the provider, set by the provider",
),
),
]
@@ -0,0 +1,109 @@
from functools import partial
from django.db import connection, migrations
def create_index_on_partitions(
apps, schema_editor, parent_table: str, index_name: str, index_details: str
):
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass;
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
# Iterate over partitions and create index concurrently.
# Note: PostgreSQL does not allow CONCURRENTLY inside a transaction,
# so we need atomic = False for this migration.
for partition in partitions:
sql = (
f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {partition.replace('.', '_')}_{index_name} ON {partition} "
f"{index_details};"
)
schema_editor.execute(sql)
def drop_index_on_partitions(apps, schema_editor, parent_table: str, index_name: str):
with schema_editor.connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass;
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
# Iterate over partitions and drop index concurrently.
for partition in partitions:
partition_index = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {partition_index};"
schema_editor.execute(sql)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0009_increase_provider_uid_maximum_length"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_tenant_and_id_idx",
index_details="(tenant_id, id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_tenant_and_id_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_idx",
index_details="(tenant_id, scan_id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_id_idx",
index_details="(tenant_id, scan_id, id)",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_id_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_delta_new_idx",
index_details="(tenant_id, id) where delta = 'new'",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_delta_new_idx",
),
),
]
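To make the helper above concrete, this is the statement create_index_on_partitions renders for a single partition (the partition name is hypothetical; the index name and fields come from the first RunPython operation above):

partition = "findings_2025_jan"  # hypothetical partition discovered via pg_inherits
index_name = "findings_tenant_and_id_idx"
index_details = "(tenant_id, id)"
sql = (
    f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {partition.replace('.', '_')}_{index_name} "
    f"ON {partition} {index_details};"
)
# -> CREATE INDEX CONCURRENTLY IF NOT EXISTS
#    findings_2025_jan_findings_tenant_and_id_idx ON findings_2025_jan (tenant_id, id);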
@@ -0,0 +1,49 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0010_findings_performance_indexes_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "scan_id", "id"], name="find_tenant_scan_id_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=models.Index(
condition=models.Q(("delta", "new")),
fields=["tenant_id", "id"],
name="find_delta_new_idx",
),
),
migrations.AddIndex(
model_name="resourcetagmapping",
index=models.Index(
fields=["tenant_id", "resource_id"], name="resource_tag_tenant_idx"
),
),
migrations.AddIndex(
model_name="resource",
index=models.Index(
fields=["tenant_id", "service", "region", "type"],
name="resource_tenant_metadata_idx",
),
),
]
@@ -0,0 +1,15 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0011_findings_performance_indexes_parent"),
]
operations = [
migrations.AddField(
model_name="scan",
name="output_location",
field=models.CharField(blank=True, max_length=200, null=True),
),
]
@@ -0,0 +1,35 @@
# Generated by Django 5.1.5 on 2025-03-03 15:46
from functools import partial
from django.db import migrations
from api.db_utils import IntegrationTypeEnum, PostgresEnumMigration, register_enum
from api.models import Integration
IntegrationTypeEnumMigration = PostgresEnumMigration(
enum_name="integration_type",
enum_values=tuple(
integration_type[0]
for integration_type in Integration.IntegrationChoices.choices
),
)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0012_scan_report_output"),
]
operations = [
migrations.RunPython(
IntegrationTypeEnumMigration.create_enum_type,
reverse_code=IntegrationTypeEnumMigration.drop_enum_type,
),
migrations.RunPython(
partial(register_enum, enum_class=IntegrationTypeEnum),
reverse_code=migrations.RunPython.noop,
),
]
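For orientation, the enum migration above presumably reduces to DDL of this shape. The exact SQL emitted by PostgresEnumMigration is an assumption; the type name and values are taken from the migration itself:

# Assumed equivalent of IntegrationTypeEnumMigration.create_enum_type:
assumed_ddl = (
    "CREATE TYPE integration_type AS ENUM "
    "('amazon_s3', 'saml', 'aws_security_hub', 'jira', 'slack');"
)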
@@ -0,0 +1,131 @@
# Generated by Django 5.1.5 on 2025-03-03 15:46
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
from api.rls import RowLevelSecurityConstraint
class Migration(migrations.Migration):
dependencies = [
("api", "0013_integrations_enum"),
]
operations = [
migrations.CreateModel(
name="Integration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("enabled", models.BooleanField(default=False)),
("connected", models.BooleanField(blank=True, null=True)),
(
"connection_last_checked_at",
models.DateTimeField(blank=True, null=True),
),
(
"integration_type",
api.db_utils.IntegrationTypeEnumField(
choices=[
("amazon_s3", "Amazon S3"),
("saml", "SAML"),
("aws_security_hub", "AWS Security Hub"),
("jira", "JIRA"),
("slack", "Slack"),
]
),
),
("configuration", models.JSONField(default=dict)),
("_credentials", models.BinaryField(db_column="credentials")),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={"db_table": "integrations", "abstract": False},
),
migrations.AddConstraint(
model_name="integration",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_integration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.CreateModel(
name="IntegrationProviderRelationship",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
(
"integration",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.integration",
),
),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.provider"
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "integration_provider_mappings",
"constraints": [
models.UniqueConstraint(
fields=("integration_id", "provider_id"),
name="unique_integration_provider_rel",
),
],
},
),
migrations.AddConstraint(
model_name="IntegrationProviderRelationship",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_integrationproviderrelationship",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddField(
model_name="integration",
name="providers",
field=models.ManyToManyField(
blank=True,
related_name="integrations",
through="api.IntegrationProviderRelationship",
to="api.provider",
),
),
]
@@ -0,0 +1,26 @@
# Generated by Django 5.1.5 on 2025-03-25 11:29
from django.db import migrations, models
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0014_integrations"),
]
operations = [
migrations.AddField(
model_name="finding",
name="muted",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="finding",
name="status",
field=api.db_utils.StatusEnumField(
choices=[("FAIL", "Fail"), ("PASS", "Pass"), ("MANUAL", "Manual")]
),
),
]
@@ -0,0 +1,32 @@
# Generated by Django 5.1.5 on 2025-03-31 10:46
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0015_finding_muted"),
]
operations = [
migrations.AddField(
model_name="finding",
name="compliance",
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AddField(
model_name="resource",
name="details",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="metadata",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="partition",
field=models.TextField(blank=True, null=True),
),
]
@@ -0,0 +1,32 @@
# Generated by Django 5.1.7 on 2025-04-16 08:47
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0016_finding_compliance_resource_details_and_more"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'm365';",
reverse_sql=migrations.RunSQL.noop,
),
]
@@ -11,6 +11,7 @@ from django.core.validators import MinLengthValidator
from django.db import models
from django.db.models import Q
from django.utils.translation import gettext_lazy as _
from django_celery_beat.models import PeriodicTask
from django_celery_results.models import TaskResult
from psqlextra.manager import PostgresManager
from psqlextra.models import PostgresPartitionedModel
@@ -20,6 +21,7 @@ from uuid6 import uuid7
from api.db_utils import (
CustomUserManager,
FindingDeltaEnumField,
IntegrationTypeEnumField,
InvitationStateEnumField,
MemberRoleEnumField,
ProviderEnumField,
@@ -57,7 +59,6 @@ class StatusChoices(models.TextChoices):
FAIL = "FAIL", _("Fail")
PASS = "PASS", _("Pass")
MANUAL = "MANUAL", _("Manual")
MUTED = "MUTED", _("Muted")
class StateChoices(models.TextChoices):
@@ -190,6 +191,7 @@ class Provider(RowLevelSecurityProtectedModel):
AZURE = "azure", _("Azure")
GCP = "gcp", _("GCP")
KUBERNETES = "kubernetes", _("Kubernetes")
M365 = "m365", _("M365")
@staticmethod
def validate_aws_uid(value):
@@ -213,6 +215,15 @@ class Provider(RowLevelSecurityProtectedModel):
pointer="/data/attributes/uid",
)
@staticmethod
def validate_m365_uid(value):
if not re.match(r"^[a-zA-Z0-9-]+\.onmicrosoft\.com$", value):
raise ModelValidationError(
detail="M365 tenant ID must be a valid domain.",
code="m365-uid",
pointer="/data/attributes/uid",
)
@staticmethod
def validate_gcp_uid(value):
if not re.match(r"^[a-z][a-z0-9-]{5,29}$", value):
@@ -226,13 +237,13 @@ class Provider(RowLevelSecurityProtectedModel):
@staticmethod
def validate_kubernetes_uid(value):
if not re.match(
r"(^[a-z0-9]([-a-z0-9]{1,61}[a-z0-9])?$)|(^arn:aws(-cn|-us-gov|-iso|-iso-b)?:[a-zA-Z0-9\-]+:([a-z]{2}-[a-z]+-\d{1})?:(\d{12})?:[a-zA-Z0-9\-_\/:\.\*]+(:\d+)?$)",
r"^[a-z0-9][A-Za-z0-9_.:\/-]{1,250}$",
value,
):
raise ModelValidationError(
detail="The value must either be a valid Kubernetes UID (up to 63 characters, "
"starting and ending with a lowercase letter or number, containing only "
"lowercase alphanumeric characters and hyphens) or a valid EKS ARN.",
"lowercase alphanumeric characters and hyphens) or a valid AWS EKS Cluster ARN, GCP GKE Context Name or Azure AKS Cluster Name.",
code="kubernetes-uid",
pointer="/data/attributes/uid",
)
@@ -246,7 +257,7 @@ class Provider(RowLevelSecurityProtectedModel):
)
uid = models.CharField(
"Unique identifier for the provider, set by the provider",
- max_length=63,
+ max_length=250,
blank=False,
validators=[MinLengthValidator(3)],
)
@@ -410,6 +421,10 @@ class Scan(RowLevelSecurityProtectedModel):
started_at = models.DateTimeField(null=True, blank=True)
completed_at = models.DateTimeField(null=True, blank=True)
next_scan_at = models.DateTimeField(null=True, blank=True)
scheduler_task = models.ForeignKey(
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
)
output_location = models.CharField(blank=True, null=True, max_length=200)
# TODO: mutelist foreign key
class Meta(RowLevelSecurityProtectedModel.Meta):
@@ -428,6 +443,10 @@ class Scan(RowLevelSecurityProtectedModel):
fields=["provider", "state", "trigger", "scheduled_at"],
name="scans_prov_state_trig_sche_idx",
),
models.Index(
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
]
class JSONAPIMeta:
@@ -509,6 +528,11 @@ class Resource(RowLevelSecurityProtectedModel):
editable=False,
)
metadata = models.TextField(blank=True, null=True)
details = models.TextField(blank=True, null=True)
partition = models.TextField(blank=True, null=True)
# Relationships
tags = models.ManyToManyField(
ResourceTag,
verbose_name="Tags associated with the resource, by provider",
@@ -544,6 +568,10 @@ class Resource(RowLevelSecurityProtectedModel):
fields=["uid", "region", "service", "name"],
name="resource_uid_reg_serv_name_idx",
),
models.Index(
fields=["tenant_id", "service", "region", "type"],
name="resource_tenant_metadata_idx",
),
GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
]
@@ -591,6 +619,12 @@ class ResourceTagMapping(RowLevelSecurityProtectedModel):
),
]
indexes = [
models.Index(
fields=["tenant_id", "resource_id"], name="resource_tag_tenant_idx"
),
]
class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
"""
@@ -615,6 +649,7 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
id = models.UUIDField(primary_key=True, default=uuid7, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
first_seen_at = models.DateTimeField(editable=False, null=True)
uid = models.CharField(max_length=300)
delta = FindingDeltaEnumField(
@@ -635,6 +670,8 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
tags = models.JSONField(default=dict, null=True, blank=True)
check_id = models.CharField(max_length=100, blank=False, null=False)
check_metadata = models.JSONField(default=dict, null=False)
muted = models.BooleanField(default=False, null=False)
compliance = models.JSONField(default=dict, null=True, blank=True)
# Relationships
scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -688,7 +725,17 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
],
name="findings_filter_idx",
),
models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
models.Index(
fields=["tenant_id", "scan_id", "id"], name="find_tenant_scan_id_idx"
),
models.Index(
fields=["tenant_id", "id"],
condition=Q(delta="new"),
name="find_delta_new_idx",
),
]
class JSONAPIMeta:
@@ -1099,6 +1146,89 @@ class ScanSummary(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
indexes = [
models.Index(
fields=["tenant_id", "scan_id"],
name="scan_summaries_tenant_scan_idx",
)
]
class JSONAPIMeta:
resource_name = "scan-summaries"
class Integration(RowLevelSecurityProtectedModel):
class IntegrationChoices(models.TextChoices):
S3 = "amazon_s3", _("Amazon S3")
SAML = "saml", _("SAML")
AWS_SECURITY_HUB = "aws_security_hub", _("AWS Security Hub")
JIRA = "jira", _("JIRA")
SLACK = "slack", _("Slack")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
enabled = models.BooleanField(default=False)
connected = models.BooleanField(null=True, blank=True)
connection_last_checked_at = models.DateTimeField(null=True, blank=True)
integration_type = IntegrationTypeEnumField(choices=IntegrationChoices.choices)
configuration = models.JSONField(default=dict)
_credentials = models.BinaryField(db_column="credentials")
providers = models.ManyToManyField(
Provider,
related_name="integrations",
through="IntegrationProviderRelationship",
blank=True,
)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "integrations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class JSONAPIMeta:
resource_name = "integrations"
@property
def credentials(self):
if isinstance(self._credentials, memoryview):
encrypted_bytes = self._credentials.tobytes()
elif isinstance(self._credentials, str):
encrypted_bytes = self._credentials.encode()
else:
encrypted_bytes = self._credentials
decrypted_data = fernet.decrypt(encrypted_bytes)
return json.loads(decrypted_data.decode())
@credentials.setter
def credentials(self, value):
encrypted_data = fernet.encrypt(json.dumps(value).encode())
self._credentials = encrypted_data
class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
integration = models.ForeignKey(Integration, on_delete=models.CASCADE)
provider = models.ForeignKey(Provider, on_delete=models.CASCADE)
inserted_at = models.DateTimeField(auto_now_add=True)
class Meta:
db_table = "integration_provider_mappings"
constraints = [
models.UniqueConstraint(
fields=["integration_id", "provider_id"],
name="unique_integration_provider_rel",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
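A usage sketch for the encrypted credentials property on the new Integration model (all values are made up; fernet is the module-level Fernet instance the property already relies on):

# Hypothetical values for illustration only.
integration = Integration(
    tenant_id="12646005-9067-4d2a-a098-8bb378604362",
    integration_type=Integration.IntegrationChoices.S3,
    configuration={"bucket_name": "findings-bucket", "output_directory": "prowler"},
)
# The setter JSON-serializes and Fernet-encrypts into the "credentials" column;
# the getter decrypts and deserializes transparently.
integration.credentials = {"aws_access_key_id": "AKIAEXAMPLE", "aws_secret_access_key": "secret"}
assert integration.credentials["aws_access_key_id"] == "AKIAEXAMPLE"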
@@ -2,8 +2,7 @@ from typing import Any
from uuid import uuid4
from django.core.exceptions import ValidationError
- from django.db import DEFAULT_DB_ALIAS
- from django.db import models
+ from django.db import DEFAULT_DB_ALIAS, models
from django.db.backends.ddl_references import Statement, Table
from api.db_utils import DB_USER, POSTGRES_TENANT_VAR
@@ -59,11 +58,11 @@ class RowLevelSecurityConstraint(models.BaseConstraint):
drop_sql_query = """
ALTER TABLE %(table_name)s NO FORCE ROW LEVEL SECURITY;
ALTER TABLE %(table_name)s DISABLE ROW LEVEL SECURITY;
- REVOKE ALL ON TABLE %(table_name) TO %(db_user)s;
+ REVOKE ALL ON TABLE %(table_name)s FROM %(db_user)s;
"""
drop_policy_sql_query = """
- DROP POLICY IF EXISTS %(db_user)s_%(table_name)s_{statement} on %(table_name)s;
+ DROP POLICY IF EXISTS %(db_user)s_%(raw_table_name)s_{statement} ON %(table_name)s;
"""
def __init__(
@@ -88,9 +87,7 @@ class RowLevelSecurityConstraint(models.BaseConstraint):
f"{grant_queries}{self.grant_sql_query.format(statement=statement)}"
)
- full_create_sql_query = (
- f"{self.rls_sql_query}" f"{policy_queries}" f"{grant_queries}"
- )
+ full_create_sql_query = f"{self.rls_sql_query}{policy_queries}{grant_queries}"
table_name = model._meta.db_table
if self.partition_name:
@@ -107,16 +104,20 @@ class RowLevelSecurityConstraint(models.BaseConstraint):
def remove_sql(self, model: Any, schema_editor: Any) -> Any:
field_column = schema_editor.quote_name(self.target_field)
+ raw_table_name = model._meta.db_table
+ table_name = raw_table_name
+ if self.partition_name:
+ raw_table_name = f"{raw_table_name}_{self.partition_name}"
+ table_name = raw_table_name
full_drop_sql_query = (
f"{self.drop_sql_query}"
f"{''.join([self.drop_policy_sql_query.format(statement) for statement in self.statements])}"
f"{''.join([self.drop_policy_sql_query.format(statement=statement) for statement in self.statements])}"
)
- table_name = model._meta.db_table
- if self.partition_name:
- table_name = f"{table_name}_{self.partition_name}"
return Statement(
full_drop_sql_query,
table_name=Table(table_name, schema_editor.quote_name),
+ raw_table_name=raw_table_name,
field_column=field_column,
db_user=DB_USER,
partition_name=self.partition_name,
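One of the fixes in remove_sql above is easy to miss: drop_policy_sql_query uses a named {statement} placeholder, so str.format must receive a keyword argument. A stripped-down illustration:

template = "DROP POLICY IF EXISTS policy_{statement};"
print(template.format(statement="SELECT"))  # DROP POLICY IF EXISTS policy_SELECT;
try:
    template.format("SELECT")  # the old positional call cannot fill a named placeholder
except KeyError as err:
    print(f"KeyError: {err}")  # KeyError: 'statement'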
@@ -1,12 +1,12 @@
from celery import states
from celery.signals import before_task_publish
- from config.celery import celery_app
from django.db.models.signals import post_delete
from django.dispatch import receiver
- from django_celery_beat.models import PeriodicTask
from django_celery_results.backends.database import DatabaseBackend
+ from api.db_utils import delete_related_daily_task
from api.models import Provider
+ from config.celery import celery_app
def create_task_result_on_publish(sender=None, headers=None, **kwargs): # noqa: F841
@@ -31,5 +31,4 @@ before_task_publish.connect(
@receiver(post_delete, sender=Provider)
def delete_provider_scan_task(sender, instance, **kwargs): # noqa: F841
# Delete the associated periodic task when the provider is deleted
task_name = f"scan-perform-scheduled-{instance.id}"
PeriodicTask.objects.filter(name=task_name).delete()
delete_related_daily_task(instance.id)
File diff suppressed because it is too large
@@ -3,6 +3,8 @@ from conftest import TEST_PASSWORD, get_api_tokens, get_authorization_header
from django.urls import reverse
from rest_framework.test import APIClient
from api.models import Membership, User
@pytest.mark.django_db
def test_basic_authentication():
@@ -177,3 +179,122 @@ def test_user_me_when_inviting_users(create_test_user, tenants_fixture, roles_fi
user2_me = client.get(reverse("user-me"), headers=user2_headers)
assert user2_me.status_code == 200
assert user2_me.json()["data"]["attributes"]["email"] == user2_email
@pytest.mark.django_db
class TestTokenSwitchTenant:
def test_switch_tenant_with_valid_token(self, tenants_fixture, providers_fixture):
client = APIClient()
test_user = "test_email@prowler.com"
test_password = "test_password"
# Check that we can create a new user without any kind of authentication
user_creation_response = client.post(
reverse("user-list"),
data={
"data": {
"type": "users",
"attributes": {
"name": "test",
"email": test_user,
"password": test_password,
},
}
},
format="vnd.api+json",
)
assert user_creation_response.status_code == 201
# Create a new relationship between this user and another tenant
tenant_id = tenants_fixture[0].id
user_instance = User.objects.get(email=test_user)
Membership.objects.create(user=user_instance, tenant_id=tenant_id)
# Check that using our new user's credentials we can authenticate and get the providers
access_token, _ = get_api_tokens(client, test_user, test_password)
auth_headers = get_authorization_header(access_token)
user_me_response = client.get(
reverse("user-me"),
headers=auth_headers,
)
assert user_me_response.status_code == 200
# Assert this user belongs to two tenants
assert (
user_me_response.json()["data"]["relationships"]["memberships"]["meta"][
"count"
]
== 2
)
provider_response = client.get(
reverse("provider-list"),
headers=auth_headers,
)
assert provider_response.status_code == 200
# Empty response since there are no providers in this tenant
assert not provider_response.json()["data"]
switch_tenant_response = client.post(
reverse("token-switch"),
data={
"data": {
"type": "tokens-switch-tenant",
"attributes": {"tenant_id": tenant_id},
}
},
headers=auth_headers,
)
assert switch_tenant_response.status_code == 200
new_access_token = switch_tenant_response.json()["data"]["attributes"]["access"]
new_auth_headers = get_authorization_header(new_access_token)
provider_response = client.get(
reverse("provider-list"),
headers=new_auth_headers,
)
assert provider_response.status_code == 200
# Now it must be data because we switched to another tenant with providers
assert provider_response.json()["data"]
def test_switch_tenant_with_invalid_token(self, create_test_user, tenants_fixture):
client = APIClient()
access_token, refresh_token = get_api_tokens(
client, create_test_user.email, TEST_PASSWORD
)
auth_headers = get_authorization_header(access_token)
invalid_token_response = client.post(
reverse("token-switch"),
data={
"data": {
"type": "tokens-switch-tenant",
"attributes": {"tenant_id": "invalid_tenant_id"},
}
},
headers=auth_headers,
)
assert invalid_token_response.status_code == 400
assert invalid_token_response.json()["errors"][0]["code"] == "invalid"
assert (
invalid_token_response.json()["errors"][0]["detail"]
== "Must be a valid UUID."
)
invalid_tenant_response = client.post(
reverse("token-switch"),
data={
"data": {
"type": "tokens-switch-tenant",
"attributes": {"tenant_id": tenants_fixture[-1].id},
}
},
headers=auth_headers,
)
assert invalid_tenant_response.status_code == 400
assert invalid_tenant_response.json()["errors"][0]["code"] == "invalid"
assert invalid_tenant_response.json()["errors"][0]["detail"] == (
"Tenant does not exist or user is not a " "member."
)
@@ -131,9 +131,10 @@ class TestBatchDelete:
return provider_count
@pytest.mark.django_db
- def test_batch_delete(self, create_test_providers):
+ def test_batch_delete(self, tenants_fixture, create_test_providers):
+ tenant_id = str(tenants_fixture[0].id)
_, summary = batch_delete(
- Provider.objects.all(), batch_size=create_test_providers // 2
+ tenant_id, Provider.objects.all(), batch_size=create_test_providers // 2
)
assert Provider.objects.all().count() == 0
assert summary == {"api.Provider": create_test_providers}
@@ -92,3 +92,31 @@ class TestResourceModel:
assert len(resource.tags.filter(tenant_id=tenant_id)) == 0
assert resource.get_tags(tenant_id=tenant_id) == {}
# @pytest.mark.django_db
# class TestFindingModel:
# def test_add_finding_with_long_uid(
# self, providers_fixture, scans_fixture, resources_fixture
# ):
# provider, *_ = providers_fixture
# tenant_id = provider.tenant_id
# long_uid = "1" * 500
# _ = Finding.objects.create(
# tenant_id=tenant_id,
# uid=long_uid,
# delta=Finding.DeltaChoices.NEW,
# check_metadata={},
# status=StatusChoices.PASS,
# status_extended="",
# severity="high",
# impact="high",
# raw_result={},
# check_id="test_check",
# scan=scans_fixture[0],
# first_seen_at=None,
# muted=False,
# compliance={},
# )
# assert Finding.objects.filter(uid=long_uid).exists()
@@ -1,7 +1,19 @@
+ from unittest.mock import ANY, Mock, patch
import pytest
from django.urls import reverse
from rest_framework import status
- from unittest.mock import patch, ANY, Mock
+ from api.models import (
+ Membership,
+ ProviderGroup,
+ ProviderGroupMembership,
+ Role,
+ RoleProviderGroupRelationship,
+ User,
+ UserRoleRelationship,
+ )
+ from api.v1.serializers import TokenSerializer
@pytest.mark.django_db
@@ -304,3 +316,96 @@ class TestProviderViewSet:
reverse("provider-connection", kwargs={"pk": provider.id})
)
assert response.status_code == status.HTTP_403_FORBIDDEN
@pytest.mark.django_db
class TestLimitedVisibility:
TEST_EMAIL = "rbac@rbac.com"
TEST_PASSWORD = "thisisapassword123"
@pytest.fixture
def limited_admin_user(
self, django_db_setup, django_db_blocker, tenants_fixture, providers_fixture
):
with django_db_blocker.unblock():
tenant = tenants_fixture[0]
provider = providers_fixture[0]
user = User.objects.create_user(
name="testing",
email=self.TEST_EMAIL,
password=self.TEST_PASSWORD,
)
Membership.objects.create(
user=user,
tenant=tenant,
role=Membership.RoleChoices.OWNER,
)
role = Role.objects.create(
name="limited_visibility",
tenant=tenant,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=False,
)
UserRoleRelationship.objects.create(
user=user,
role=role,
tenant=tenant,
)
provider_group = ProviderGroup.objects.create(
name="limited_visibility_group",
tenant=tenant,
)
ProviderGroupMembership.objects.create(
tenant=tenant,
provider=provider,
provider_group=provider_group,
)
RoleProviderGroupRelationship.objects.create(
tenant=tenant, role=role, provider_group=provider_group
)
return user
@pytest.fixture
def authenticated_client_rbac_limited(
self, limited_admin_user, tenants_fixture, client
):
client.user = limited_admin_user
tenant_id = tenants_fixture[0].id
serializer = TokenSerializer(
data={
"type": "tokens",
"email": self.TEST_EMAIL,
"password": self.TEST_PASSWORD,
"tenant_id": tenant_id,
}
)
serializer.is_valid(raise_exception=True)
access_token = serializer.validated_data["access"]
client.defaults["HTTP_AUTHORIZATION"] = f"Bearer {access_token}"
return client
def test_integrations(
self, authenticated_client_rbac_limited, integrations_fixture, providers_fixture
):
# Integration 2 is related to provider1 and provider 2
# This user cannot see provider 2
integration = integrations_fixture[1]
response = authenticated_client_rbac_limited.get(
reverse("integration-detail", kwargs={"pk": integration.id})
)
assert response.status_code == status.HTTP_200_OK
assert integration.providers.count() == 2
assert (
response.json()["data"]["relationships"]["providers"]["meta"]["count"] == 1
)
@@ -1,25 +1,25 @@
from datetime import datetime, timedelta, timezone
- from unittest.mock import patch, MagicMock
+ from unittest.mock import MagicMock, patch
import pytest
+ from rest_framework.exceptions import NotFound, ValidationError
+ from api.db_router import MainRouter
+ from api.exceptions import InvitationTokenExpiredException
+ from api.models import Invitation, Provider
+ from api.utils import (
+ get_prowler_provider_kwargs,
+ initialize_prowler_provider,
+ merge_dicts,
+ prowler_provider_connection_test,
+ return_prowler_provider,
+ validate_invitation,
+ )
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
- from rest_framework.exceptions import ValidationError, NotFound
- from api.db_router import MainRouter
- from api.exceptions import InvitationTokenExpiredException
- from api.models import Invitation
- from api.models import Provider
- from api.utils import (
- merge_dicts,
- return_prowler_provider,
- initialize_prowler_provider,
- prowler_provider_connection_test,
- get_prowler_provider_kwargs,
- )
- from api.utils import validate_invitation
+ from prowler.providers.m365.m365_provider import M365Provider
class TestMergeDicts:
@@ -105,6 +105,7 @@ class TestReturnProwlerProvider:
(Provider.ProviderChoices.GCP.value, GcpProvider),
(Provider.ProviderChoices.AZURE.value, AzureProvider),
(Provider.ProviderChoices.KUBERNETES.value, KubernetesProvider),
(Provider.ProviderChoices.M365.value, M365Provider),
],
)
def test_return_prowler_provider(self, provider_type, expected_provider):
@@ -144,6 +145,18 @@ class TestProwlerProviderConnectionTest:
key="value", provider_id="1234567890", raise_on_exception=False
)
@pytest.mark.django_db
@patch("api.utils.return_prowler_provider")
def test_prowler_provider_connection_test_without_secret(
self, mock_return_prowler_provider, providers_fixture
):
mock_return_prowler_provider.return_value = MagicMock()
connection = prowler_provider_connection_test(providers_fixture[0])
assert connection.is_connected is False
assert isinstance(connection.error, Provider.secret.RelatedObjectDoesNotExist)
assert str(connection.error) == "Provider has no secret."
class TestGetProwlerProviderKwargs:
@pytest.mark.parametrize(
@@ -165,6 +178,10 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.KUBERNETES.value,
{"context": "provider_uid"},
),
(
Provider.ProviderChoices.M365.value,
{},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):
@@ -274,9 +291,10 @@ class TestValidateInvitation:
expired_time = datetime.now(timezone.utc) - timedelta(days=1)
invitation.expires_at = expired_time
with patch("api.utils.Invitation.objects.using") as mock_using, patch(
"api.utils.datetime"
) as mock_datetime:
with (
patch("api.utils.Invitation.objects.using") as mock_using,
patch("api.utils.datetime") as mock_datetime,
):
mock_db = mock_using.return_value
mock_db.get.return_value = invitation
mock_datetime.now.return_value = datetime.now(timezone.utc)
File diff suppressed because it is too large
@@ -1,15 +1,26 @@
from datetime import datetime, timezone
+ from allauth.socialaccount.providers.oauth2.client import OAuth2Client
+ from rest_framework.exceptions import NotFound, ValidationError
+ from api.db_router import MainRouter
+ from api.exceptions import InvitationTokenExpiredException
+ from api.models import Invitation, Provider
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
- from rest_framework.exceptions import ValidationError, NotFound
+ from prowler.providers.m365.m365_provider import M365Provider
- from api.db_router import MainRouter
- from api.exceptions import InvitationTokenExpiredException
- from api.models import Provider, Invitation
class CustomOAuth2Client(OAuth2Client):
def __init__(self, client_id, secret, *args, **kwargs):
# Remove any duplicate "scope_delimiter" from kwargs
# Bug present in dj-rest-auth after version v7.0.1
# https://github.com/iMerica/dj-rest-auth/issues/673
kwargs.pop("scope_delimiter", None)
super().__init__(client_id, secret, *args, **kwargs)
def merge_dicts(default_dict: dict, replacement_dict: dict) -> dict:
@@ -41,14 +52,14 @@ def merge_dicts(default_dict: dict, replacement_dict: dict) -> dict:
def return_prowler_provider(
provider: Provider,
- ) -> [AwsProvider | AzureProvider | GcpProvider | KubernetesProvider]:
+ ) -> [AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider]:
"""Return the Prowler provider class based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
- AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: The corresponding provider class.
+ AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: The corresponding provider class.
Raises:
ValueError: If the provider type specified in `provider.provider` is not supported.
@@ -62,6 +73,8 @@ def return_prowler_provider(
prowler_provider = AzureProvider
case Provider.ProviderChoices.KUBERNETES.value:
prowler_provider = KubernetesProvider
case Provider.ProviderChoices.M365.value:
prowler_provider = M365Provider
case _:
raise ValueError(f"Provider type {provider.provider} not supported")
return prowler_provider
@@ -94,15 +107,15 @@ def get_prowler_provider_kwargs(provider: Provider) -> dict:
def initialize_prowler_provider(
provider: Provider,
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider:
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider:
"""Initialize a Prowler provider instance based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, or `KubernetesProvider`) initialized with the
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, `KubernetesProvider` or `M365Provider`) initialized with the
provider's secrets.
"""
prowler_provider = return_prowler_provider(provider)
@@ -120,7 +133,12 @@ def prowler_provider_connection_test(provider: Provider) -> Connection:
Connection: A connection object representing the result of the connection test for the specified provider.
"""
prowler_provider = return_prowler_provider(provider)
prowler_provider_kwargs = provider.secret.secret
try:
prowler_provider_kwargs = provider.secret.secret
except Provider.secret.RelatedObjectDoesNotExist as secret_error:
return Connection(is_connected=False, error=secret_error)
return prowler_provider.test_connection(
**prowler_provider_kwargs, provider_id=provider.uid, raise_on_exception=False
)
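The guard above turns a missing one-to-one secret into a failed Connection instead of letting the exception bubble up to the caller. A minimal stand-alone sketch of that pattern, assuming hypothetical stand-ins for the project's Connection and exception classes:

from dataclasses import dataclass
from typing import Callable, Optional

class RelatedObjectDoesNotExist(Exception):
    """Stand-in for Django's descriptor exception on a missing one-to-one."""

@dataclass
class Connection:
    is_connected: bool
    error: Optional[Exception] = None

def connection_test(get_secret: Callable[[], dict]) -> Connection:
    try:
        secret = get_secret()  # may raise RelatedObjectDoesNotExist
    except RelatedObjectDoesNotExist as secret_error:
        # Same behavior as above: report a failed connection and keep the error.
        return Connection(is_connected=False, error=secret_error)
    return Connection(is_connected=bool(secret))

def missing_secret() -> dict:
    raise RelatedObjectDoesNotExist("Provider has no secret.")

print(connection_test(missing_secret))  # is_connected=False, error set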
+1 -1
@@ -106,7 +106,7 @@ def uuid7_end(uuid_obj: UUID, offset_months: int = 1) -> UUID:
Args:
uuid_obj: A UUIDv7 object.
offset_days: Number of months to offset from the given UUID's date. Defaults to 1 to handle if
offset_months: Number of months to offset from the given UUID's date. Defaults to 1 to handle if
partitions are not being used; if they are, the value will be the one set in FINDINGS_TABLE_PARTITION_MONTHS.
Returns:
@@ -0,0 +1,122 @@
from drf_spectacular.utils import extend_schema_field
from rest_framework_json_api import serializers
from rest_framework_json_api.serializers import ValidationError
class BaseValidateSerializer(serializers.Serializer):
def validate(self, data):
if hasattr(self, "initial_data"):
initial_data = set(self.initial_data.keys()) - {"id", "type"}
unknown_keys = initial_data - set(self.fields.keys())
if unknown_keys:
raise ValidationError(f"Invalid fields: {unknown_keys}")
return data
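BaseValidateSerializer rejects any payload key that is not a declared serializer field, after setting aside the JSON:API envelope keys "id" and "type". A stdlib-only sketch of the same check, with a hypothetical helper name and payload:

def reject_unknown_fields(initial_data: dict, declared_fields: set) -> dict:
    # "id" and "type" belong to the JSON:API envelope, so they are never unknown.
    unknown_keys = (set(initial_data) - {"id", "type"}) - declared_fields
    if unknown_keys:
        raise ValueError(f"Invalid fields: {unknown_keys}")
    return initial_data

reject_unknown_fields(
    {"type": "integrations", "bucket_name": "b", "extra": 1},
    {"bucket_name", "output_directory"},
)  # raises ValueError: Invalid fields: {'extra'}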
# Integrations
class S3ConfigSerializer(BaseValidateSerializer):
bucket_name = serializers.CharField()
output_directory = serializers.CharField()
class Meta:
resource_name = "integrations"
class AWSCredentialSerializer(BaseValidateSerializer):
role_arn = serializers.CharField(required=False)
external_id = serializers.CharField(required=False)
role_session_name = serializers.CharField(required=False)
session_duration = serializers.IntegerField(
required=False, min_value=900, max_value=43200
)
aws_access_key_id = serializers.CharField(required=False)
aws_secret_access_key = serializers.CharField(required=False)
aws_session_token = serializers.CharField(required=False)
class Meta:
resource_name = "integrations"
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Credentials",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
},
]
}
)
class IntegrationCredentialField(serializers.JSONField):
pass
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "Amazon S3",
"properties": {
"bucket_name": {
"type": "string",
"description": "The name of the S3 bucket where files will be stored.",
},
"output_directory": {
"type": "string",
"description": "The directory path within the bucket where files will be saved.",
},
},
"required": ["bucket_name", "output_directory"],
},
]
}
)
class IntegrationConfigField(serializers.JSONField):
pass
@@ -0,0 +1,172 @@
from drf_spectacular.utils import extend_schema_field
from rest_framework_json_api import serializers
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn", "external_id"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "M365 Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
"user": {
"type": "email",
"description": "User microsoft email address.",
},
"encrypted_password": {
"type": "string",
"description": "User encrypted password.",
},
},
"required": [
"client_id",
"client_secret",
"tenant_id",
"user",
"encrypted_password",
],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
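For illustration, a payload matching the "M365 Static Credentials" branch of the schema above could look like the following; every value is a placeholder:

m365_secret = {
    "client_id": "00000000-0000-0000-0000-000000000000",
    "client_secret": "<client-secret>",
    "tenant_id": "00000000-0000-0000-0000-000000000000",
    "user": "scanner@contoso.onmicrosoft.com",
    "encrypted_password": "<encrypted-password>",
}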
+477 -184
@@ -16,6 +16,8 @@ from rest_framework_simplejwt.tokens import RefreshToken
from api.models import (
ComplianceOverview,
Finding,
Integration,
IntegrationProviderRelationship,
Invitation,
InvitationRoleRelationship,
Membership,
@@ -34,11 +36,76 @@ from api.models import (
UserRoleRelationship,
)
from api.rls import Tenant
from api.v1.serializer_utils.integrations import (
AWSCredentialSerializer,
IntegrationConfigField,
IntegrationCredentialField,
S3ConfigSerializer,
)
from api.v1.serializer_utils.providers import ProviderSecretField
# Tokens
class TokenSerializer(TokenObtainPairSerializer):
def generate_tokens(user: User, tenant_id: str) -> dict:
try:
refresh = RefreshToken.for_user(user)
except InvalidKeyError:
# Handle invalid key error
raise ValidationError(
{
"detail": "Token generation failed due to invalid key configuration. Provide valid "
"DJANGO_TOKEN_SIGNING_KEY and DJANGO_TOKEN_VERIFYING_KEY in the environment."
}
)
except Exception as e:
raise ValidationError({"detail": str(e)})
# Post-process the tokens
# Set the tenant_id
refresh["tenant_id"] = tenant_id
# Set the nbf (not before) claim to the iat (issued at) claim. At this moment, simplejwt does not provide a
# way to set the nbf claim
refresh.payload["nbf"] = refresh["iat"]
# Get the access token
access = refresh.access_token
if settings.SIMPLE_JWT["UPDATE_LAST_LOGIN"]:
update_last_login(None, user)
return {"access": str(access), "refresh": str(refresh)}
class BaseTokenSerializer(TokenObtainPairSerializer):
def custom_validate(self, attrs, social: bool = False):
email = attrs.get("email")
password = attrs.get("password")
tenant_id = str(attrs.get("tenant_id", ""))
# Authenticate user
user = (
User.objects.get(email=email)
if social
else authenticate(username=email, password=password)
)
if user is None:
raise ValidationError("Invalid credentials")
if tenant_id:
if not user.is_member_of_tenant(tenant_id):
raise ValidationError("Tenant does not exist or user is not a member.")
else:
first_membership = user.memberships.order_by("date_joined").first()
if first_membership is None:
raise ValidationError("User has no memberships.")
tenant_id = str(first_membership.tenant_id)
return generate_tokens(user, tenant_id)
class TokenSerializer(BaseTokenSerializer):
email = serializers.EmailField(write_only=True)
password = serializers.CharField(write_only=True)
tenant_id = serializers.UUIDField(
@@ -56,53 +123,25 @@ class TokenSerializer(TokenObtainPairSerializer):
resource_name = "tokens"
def validate(self, attrs):
email = attrs.get("email")
password = attrs.get("password")
tenant_id = str(attrs.get("tenant_id", ""))
return super().custom_validate(attrs)
# Authenticate user
user = authenticate(username=email, password=password)
if user is None:
raise ValidationError("Invalid credentials")
if tenant_id:
if not user.is_member_of_tenant(tenant_id):
raise ValidationError("Tenant does not exist or user is not a member.")
else:
first_membership = user.memberships.order_by("date_joined").first()
if first_membership is None:
raise ValidationError("User has no memberships.")
tenant_id = str(first_membership.tenant_id)
class TokenSocialLoginSerializer(BaseTokenSerializer):
email = serializers.EmailField(write_only=True)
# Generate tokens
try:
refresh = RefreshToken.for_user(user)
except InvalidKeyError:
# Handle invalid key error
raise ValidationError(
{
"detail": "Token generation failed due to invalid key configuration. Provide valid "
"DJANGO_TOKEN_SIGNING_KEY and DJANGO_TOKEN_VERIFYING_KEY in the environment."
}
)
except Exception as e:
raise ValidationError({"detail": str(e)})
# Output tokens
refresh = serializers.CharField(read_only=True)
access = serializers.CharField(read_only=True)
# Post-process the tokens
# Set the tenant_id
refresh["tenant_id"] = tenant_id
class JSONAPIMeta:
resource_name = "tokens"
# Set the nbf (not before) claim to the iat (issued at) claim. At this moment, simplejwt does not provide a
# way to set the nbf claim
refresh.payload["nbf"] = refresh["iat"]
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields.pop("password", None)
# Get the access token
access = refresh.access_token
if settings.SIMPLE_JWT["UPDATE_LAST_LOGIN"]:
update_last_login(None, user)
return {"access": str(access), "refresh": str(refresh)}
def validate(self, attrs):
return super().custom_validate(attrs, social=True)
# TODO: Check if we can change the parent class to TokenRefreshSerializer from rest_framework_simplejwt.serializers
@@ -140,6 +179,30 @@ class TokenRefreshSerializer(serializers.Serializer):
raise ValidationError({"refresh": "Invalid or expired token"})
class TokenSwitchTenantSerializer(serializers.Serializer):
tenant_id = serializers.UUIDField(
write_only=True, help_text="The tenant ID for which to request a new token."
)
access = serializers.CharField(read_only=True)
refresh = serializers.CharField(read_only=True)
class JSONAPIMeta:
resource_name = "tokens-switch-tenant"
def validate(self, attrs):
request = self.context["request"]
user = request.user
if not user.is_authenticated:
raise ValidationError("Invalid or expired token.")
tenant_id = str(attrs.get("tenant_id"))
if not user.is_member_of_tenant(tenant_id):
raise ValidationError("Tenant does not exist or user is not a member.")
return generate_tokens(user, tenant_id)
# Base
@@ -691,6 +754,43 @@ class ProviderSerializer(RLSSerializer):
}
class ProviderIncludeSerializer(RLSSerializer):
"""
Serializer for the Provider model.
"""
provider = ProviderEnumSerializerField()
connection = serializers.SerializerMethodField(read_only=True)
class Meta:
model = Provider
fields = [
"id",
"inserted_at",
"updated_at",
"provider",
"uid",
"alias",
"connection",
# "scanner_args",
]
@extend_schema_field(
{
"type": "object",
"properties": {
"connected": {"type": "boolean"},
"last_checked_at": {"type": "string", "format": "date-time"},
},
}
)
def get_connection(self, obj):
return {
"connected": obj.connected,
"last_checked_at": obj.connection_last_checked_at,
}
class ProviderCreateSerializer(RLSSerializer, BaseWriteSerializer):
class Meta:
model = Provider
@@ -752,6 +852,39 @@ class ScanSerializer(RLSSerializer):
"url",
]
included_serializers = {
"provider": "api.v1.serializers.ProviderIncludeSerializer",
}
class ScanIncludeSerializer(RLSSerializer):
trigger = serializers.ChoiceField(
choices=Scan.TriggerChoices.choices, read_only=True
)
state = StateEnumSerializerField(read_only=True)
class Meta:
model = Scan
fields = [
"id",
"name",
"trigger",
"state",
"unique_resource_count",
"progress",
# "scanner_args",
"duration",
"inserted_at",
"started_at",
"completed_at",
"scheduled_at",
"provider",
]
included_serializers = {
"provider": "api.v1.serializers.ProviderIncludeSerializer",
}
class ScanCreateSerializer(RLSSerializer, BaseWriteSerializer):
class Meta:
@@ -819,6 +952,23 @@ class ScanTaskSerializer(RLSSerializer):
]
class ScanReportSerializer(serializers.Serializer):
id = serializers.CharField(source="scan")
class Meta:
resource_name = "scan-reports"
fields = ["id"]
class ScanComplianceReportSerializer(serializers.Serializer):
id = serializers.CharField(source="scan")
name = serializers.CharField()
class Meta:
resource_name = "scan-reports"
fields = ["id", "name"]
class ResourceTagSerializer(RLSSerializer):
"""
Serializer for the ResourceTag model
@@ -884,6 +1034,51 @@ class ResourceSerializer(RLSSerializer):
return fields
class ResourceIncludeSerializer(RLSSerializer):
"""
Serializer for the Resource model.
"""
tags = serializers.SerializerMethodField()
type_ = serializers.CharField(read_only=True)
class Meta:
model = Resource
fields = [
"id",
"inserted_at",
"updated_at",
"uid",
"name",
"region",
"service",
"type_",
"tags",
]
extra_kwargs = {
"id": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
@extend_schema_field(
{
"type": "object",
"description": "Tags associated with the resource",
"example": {"env": "prod", "owner": "johndoe"},
}
)
def get_tags(self, obj):
return obj.get_tags(self.context.get("tenant_id"))
def get_fields(self):
"""`type` is a Python reserved keyword."""
fields = super().get_fields()
type_ = fields.pop("type_")
fields["type"] = type_
return fields
class FindingSerializer(RLSSerializer):
"""
Serializer for the Finding model.
@@ -905,6 +1100,8 @@ class FindingSerializer(RLSSerializer):
"raw_result",
"inserted_at",
"updated_at",
"first_seen_at",
"muted",
"url",
# Relationships
"scan",
@@ -912,11 +1109,12 @@ class FindingSerializer(RLSSerializer):
]
included_serializers = {
"scan": ScanSerializer,
"resources": ResourceSerializer,
"scan": ScanIncludeSerializer,
"resources": ResourceIncludeSerializer,
}
# To be removed when the related endpoint is removed as well
class FindingDynamicFilterSerializer(serializers.Serializer):
services = serializers.ListField(child=serializers.CharField(), allow_empty=True)
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
@@ -925,6 +1123,19 @@ class FindingDynamicFilterSerializer(serializers.Serializer):
resource_name = "finding-dynamic-filters"
class FindingMetadataSerializer(serializers.Serializer):
services = serializers.ListField(child=serializers.CharField(), allow_empty=True)
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
resource_types = serializers.ListField(
child=serializers.CharField(), allow_empty=True
)
# Temporarily disabled until we implement tag filtering in the UI
# tags = serializers.JSONField(help_text="Tags are described as key-value pairs.")
class Meta:
resource_name = "findings-metadata"
# Provider secrets
class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
@staticmethod
@@ -940,6 +1151,8 @@ class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
serializer = GCPProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.KUBERNETES.value:
serializer = KubernetesProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.M365.value:
serializer = M365ProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"provider": f"Provider type not supported {provider_type}"}
@@ -979,6 +1192,17 @@ class AzureProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
class M365ProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
tenant_id = serializers.CharField()
user = serializers.EmailField()
encrypted_password = serializers.CharField()
class Meta:
resource_name = "provider-secrets"
class GCPProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
@@ -997,7 +1221,7 @@ class KubernetesProviderSecret(serializers.Serializer):
class AWSRoleAssumptionProviderSecret(serializers.Serializer):
role_arn = serializers.CharField()
external_id = serializers.CharField(required=False)
external_id = serializers.CharField()
role_session_name = serializers.CharField(required=False)
session_duration = serializers.IntegerField(
required=False, min_value=900, max_value=43200
@@ -1010,142 +1234,6 @@ class AWSRoleAssumptionProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"external_id": {
"type": "string",
"description": "An optional identifier to enhance security for role assumption; may be "
"required by the role administrator.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
class ProviderSecretSerializer(RLSSerializer):
"""
Serializer for the ProviderSecret model.
@@ -1419,8 +1507,8 @@ class RoleSerializer(RLSSerializer, BaseWriteSerializer):
"manage_account",
# Disable for the first release
# "manage_billing",
# "manage_integrations",
# /Disable for the first release
"manage_integrations",
"manage_providers",
"manage_scans",
"permission_state",
@@ -1702,6 +1790,13 @@ class ComplianceOverviewFullSerializer(ComplianceOverviewSerializer):
return obj.requirements
class ComplianceOverviewMetadataSerializer(serializers.Serializer):
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
class Meta:
resource_name = "compliance-overviews-metadata"
# Overviews
@@ -1722,7 +1817,7 @@ class OverviewProviderSerializer(serializers.Serializer):
"properties": {
"pass": {"type": "integer"},
"fail": {"type": "integer"},
"manual": {"type": "integer"},
"muted": {"type": "integer"},
"total": {"type": "integer"},
},
}
@@ -1731,7 +1826,7 @@ class OverviewProviderSerializer(serializers.Serializer):
return {
"pass": obj["findings_passed"],
"fail": obj["findings_failed"],
"manual": obj["findings_manual"],
"muted": obj["findings_muted"],
"total": obj["total_findings"],
}
@@ -1826,3 +1921,201 @@ class ScheduleDailyCreateSerializer(serializers.Serializer):
if unknown_keys:
raise ValidationError(f"Invalid fields: {unknown_keys}")
return data
# Integrations
class BaseWriteIntegrationSerializer(BaseWriteSerializer):
@staticmethod
def validate_integration_data(
integration_type: str,
providers: list[Provider], # noqa
configuration: dict,
credentials: dict,
):
if integration_type == Integration.IntegrationChoices.S3:
config_serializer = S3ConfigSerializer
credentials_serializers = [AWSCredentialSerializer]
# TODO: This will be required for AWS Security Hub
# if providers and not all(
# provider.provider == Provider.ProviderChoices.AWS
# for provider in providers
# ):
# raise serializers.ValidationError(
# {"providers": "All providers must be AWS for the S3 integration."}
# )
else:
raise serializers.ValidationError(
{
"integration_type": f"Integration type not supported yet: {integration_type}"
}
)
config_serializer(data=configuration).is_valid(raise_exception=True)
for cred_serializer in credentials_serializers:
try:
cred_serializer(data=credentials).is_valid(raise_exception=True)
break
except ValidationError:
continue
else:
raise ValidationError(
{"credentials": "Invalid credentials for the integration type."}
)
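The for/else above accepts the first credentials serializer that validates and only raises when every candidate rejects the payload. A stand-alone sketch of that control flow, with plain callables standing in for the serializers:

def validate_credentials(credentials: dict, validators: list) -> None:
    for validate in validators:
        try:
            validate(credentials)  # stand-in for serializer.is_valid(raise_exception=True)
            break  # the first validator that accepts the payload wins
        except ValueError:
            continue
    else:  # no break: every validator rejected the payload
        raise ValueError("Invalid credentials for the integration type.")

def require_role_arn(creds: dict) -> None:
    if "role_arn" not in creds:
        raise ValueError("role_arn is required")

validate_credentials({"role_arn": "arn:aws:iam::123456789012:role/x"}, [require_role_arn])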
class IntegrationSerializer(RLSSerializer):
"""
Serializer for the Integration model.
"""
providers = serializers.ResourceRelatedField(
queryset=Provider.objects.all(), many=True
)
class Meta:
model = Integration
fields = [
"id",
"inserted_at",
"updated_at",
"enabled",
"connected",
"connection_last_checked_at",
"integration_type",
"configuration",
"providers",
"url",
]
included_serializers = {
"providers": "api.v1.serializers.ProviderIncludeSerializer",
}
def to_representation(self, instance):
representation = super().to_representation(instance)
allowed_providers = self.context.get("allowed_providers")
if allowed_providers:
allowed_provider_ids = {str(provider.id) for provider in allowed_providers}
representation["providers"] = [
provider
for provider in representation["providers"]
if provider["id"] in allowed_provider_ids
]
return representation
class IntegrationCreateSerializer(BaseWriteIntegrationSerializer):
credentials = IntegrationCredentialField(write_only=True)
configuration = IntegrationConfigField()
providers = serializers.ResourceRelatedField(
queryset=Provider.objects.all(), many=True, required=False
)
class Meta:
model = Integration
fields = [
"inserted_at",
"updated_at",
"enabled",
"connected",
"connection_last_checked_at",
"integration_type",
"configuration",
"credentials",
"providers",
]
extra_kwargs = {
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
"connected": {"read_only": True},
"enabled": {"read_only": True},
"connection_last_checked_at": {"read_only": True},
}
def validate(self, attrs):
integration_type = attrs.get("integration_type")
providers = attrs.get("providers")
configuration = attrs.get("configuration")
credentials = attrs.get("credentials")
validated_attrs = super().validate(attrs)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
return validated_attrs
def create(self, validated_data):
tenant_id = self.context.get("tenant_id")
providers = validated_data.pop("providers", [])
integration = Integration.objects.create(tenant_id=tenant_id, **validated_data)
through_model_instances = [
IntegrationProviderRelationship(
integration=integration,
provider=provider,
tenant_id=tenant_id,
)
for provider in providers
]
IntegrationProviderRelationship.objects.bulk_create(through_model_instances)
return integration
class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
credentials = IntegrationCredentialField(write_only=True, required=False)
configuration = IntegrationConfigField(required=False)
providers = serializers.ResourceRelatedField(
queryset=Provider.objects.all(), many=True, required=False
)
class Meta:
model = Integration
fields = [
"inserted_at",
"updated_at",
"enabled",
"connected",
"connection_last_checked_at",
"integration_type",
"configuration",
"credentials",
"providers",
]
extra_kwargs = {
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
"connected": {"read_only": True},
"connection_last_checked_at": {"read_only": True},
"integration_type": {"read_only": True},
}
def validate(self, attrs):
integration_type = self.instance.integration_type
providers = attrs.get("providers")
configuration = attrs.get("configuration") or self.instance.configuration
credentials = attrs.get("credentials") or self.instance.credentials
validated_attrs = super().validate(attrs)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
return validated_attrs
def update(self, instance, validated_data):
tenant_id = self.context.get("tenant_id")
if validated_data.get("providers") is not None:
instance.providers.clear()
new_relationships = [
IntegrationProviderRelationship(
integration=instance, provider=provider, tenant_id=tenant_id
)
for provider in validated_data["providers"]
]
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
return super().update(instance, validated_data)
+17 -9
@@ -3,28 +3,32 @@ from drf_spectacular.views import SpectacularRedocView
from rest_framework_nested import routers
from api.v1.views import (
ComplianceOverviewViewSet,
CustomTokenObtainView,
CustomTokenRefreshView,
CustomTokenSwitchTenantView,
FindingViewSet,
MembershipViewSet,
ProviderGroupViewSet,
ProviderGroupProvidersRelationshipView,
ProviderSecretViewSet,
InvitationViewSet,
GithubSocialLoginView,
GoogleSocialLoginView,
IntegrationViewSet,
InvitationAcceptViewSet,
RoleViewSet,
RoleProviderGroupRelationshipView,
UserRoleRelationshipView,
InvitationViewSet,
MembershipViewSet,
OverviewViewSet,
ComplianceOverviewViewSet,
ProviderGroupProvidersRelationshipView,
ProviderGroupViewSet,
ProviderSecretViewSet,
ProviderViewSet,
ResourceViewSet,
RoleProviderGroupRelationshipView,
RoleViewSet,
ScanViewSet,
ScheduleViewSet,
SchemaView,
TaskViewSet,
TenantMembersViewSet,
TenantViewSet,
UserRoleRelationshipView,
UserViewSet,
)
@@ -44,6 +48,7 @@ router.register(
)
router.register(r"overviews", OverviewViewSet, basename="overview")
router.register(r"schedules", ScheduleViewSet, basename="schedule")
router.register(r"integrations", IntegrationViewSet, basename="integration")
tenants_router = routers.NestedSimpleRouter(router, r"tenants", lookup="tenant")
tenants_router.register(
@@ -56,6 +61,7 @@ users_router.register(r"memberships", MembershipViewSet, basename="user-membersh
urlpatterns = [
path("tokens", CustomTokenObtainView.as_view(), name="token-obtain"),
path("tokens/refresh", CustomTokenRefreshView.as_view(), name="token-refresh"),
path("tokens/switch", CustomTokenSwitchTenantView.as_view(), name="token-switch"),
path(
"providers/secrets",
ProviderSecretViewSet.as_view({"get": "list", "post": "create"}),
@@ -106,6 +112,8 @@ urlpatterns = [
),
name="provider_group-providers-relationship",
),
path("tokens/google", GoogleSocialLoginView.as_view(), name="token-google"),
path("tokens/github", GithubSocialLoginView.as_view(), name="token-github"),
path("", include(router.urls)),
path("", include(tenants_router.urls)),
path("", include(users_router.urls)),
+624 -95
@@ -1,12 +1,28 @@
import glob
import os
import sentry_sdk
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
from allauth.socialaccount.providers.google.views import GoogleOAuth2Adapter
from botocore.exceptions import ClientError, NoCredentialsError, ParamValidationError
from celery.result import AsyncResult
from config.env import env
from config.settings.social_login import (
GITHUB_OAUTH_CALLBACK_URL,
GOOGLE_OAUTH_CALLBACK_URL,
)
from dj_rest_auth.registration.views import SocialLoginView
from django.conf import settings as django_settings
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.search import SearchQuery
from django.db import transaction
from django.db.models import Count, F, OuterRef, Prefetch, Q, Subquery, Sum
from django.db.models import Count, Exists, F, OuterRef, Prefetch, Q, Subquery, Sum
from django.db.models.functions import Coalesce
from django.http import HttpResponse
from django.urls import reverse
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django_celery_beat.models import PeriodicTask
from drf_spectacular.settings import spectacular_settings
from drf_spectacular.utils import (
OpenApiParameter,
@@ -30,19 +46,21 @@ from rest_framework.permissions import SAFE_METHODS
from rest_framework_json_api.views import RelationshipView, Response
from rest_framework_simplejwt.exceptions import InvalidToken, TokenError
from tasks.beat import schedule_provider_scan
from tasks.jobs.export import get_s3_client
from tasks.tasks import (
check_provider_connection_task,
delete_provider_task,
delete_tenant_task,
perform_scan_summary_task,
perform_scan_task,
)
from api.base_views import BaseRLSViewSet, BaseTenantViewset, BaseUserViewset
from api.compliance import get_compliance_frameworks
from api.db_router import MainRouter
from api.filters import (
ComplianceOverviewFilter,
FindingFilter,
IntegrationFilter,
InvitationFilter,
MembershipFilter,
ProviderFilter,
@@ -60,6 +78,7 @@ from api.filters import (
from api.models import (
ComplianceOverview,
Finding,
Integration,
Invitation,
Membership,
Provider,
@@ -67,13 +86,13 @@ from api.models import (
ProviderGroupMembership,
ProviderSecret,
Resource,
ResourceFindingMapping,
Role,
RoleProviderGroupRelationship,
Scan,
ScanSummary,
SeverityChoices,
StateChoices,
StatusChoices,
Task,
User,
UserRoleRelationship,
@@ -81,13 +100,17 @@ from api.models import (
from api.pagination import ComplianceOverviewPagination
from api.rbac.permissions import Permissions, get_providers, get_role
from api.rls import Tenant
from api.utils import validate_invitation
from api.uuid_utils import datetime_to_uuid7
from api.utils import CustomOAuth2Client, validate_invitation
from api.v1.serializers import (
ComplianceOverviewFullSerializer,
ComplianceOverviewMetadataSerializer,
ComplianceOverviewSerializer,
FindingDynamicFilterSerializer,
FindingMetadataSerializer,
FindingSerializer,
IntegrationCreateSerializer,
IntegrationSerializer,
IntegrationUpdateSerializer,
InvitationAcceptSerializer,
InvitationCreateSerializer,
InvitationSerializer,
@@ -112,7 +135,9 @@ from api.v1.serializers import (
RoleProviderGroupRelationshipSerializer,
RoleSerializer,
RoleUpdateSerializer,
ScanComplianceReportSerializer,
ScanCreateSerializer,
ScanReportSerializer,
ScanSerializer,
ScanUpdateSerializer,
ScheduleDailyCreateSerializer,
@@ -120,6 +145,8 @@ from api.v1.serializers import (
TenantSerializer,
TokenRefreshSerializer,
TokenSerializer,
TokenSocialLoginSerializer,
TokenSwitchTenantSerializer,
UserCreateSerializer,
UserRoleRelationshipSerializer,
UserSerializer,
@@ -186,13 +213,43 @@ class CustomTokenRefreshView(GenericAPIView):
)
@extend_schema(
tags=["Token"],
summary="Switch tenant using a valid tenant ID",
description="Switch tenant by providing a valid tenant ID. The authenticated user must belong to the tenant.",
)
class CustomTokenSwitchTenantView(GenericAPIView):
permission_classes = [permissions.IsAuthenticated]
resource_name = "tokens-switch-tenant"
serializer_class = TokenSwitchTenantSerializer
http_method_names = ["post"]
def post(self, request):
serializer = TokenSwitchTenantSerializer(
data=request.data, context={"request": request}
)
try:
serializer.is_valid(raise_exception=True)
except TokenError as e:
raise InvalidToken(e.args[0])
return Response(
data={
"type": "tokens-switch-tenant",
"attributes": serializer.validated_data,
},
status=status.HTTP_200_OK,
)
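A hypothetical client call for this endpoint, assuming the route is mounted at /api/v1/tokens/switch (see the urls diff above) and that the payload follows the JSON:API resource name "tokens-switch-tenant"; host, token, and tenant ID are placeholders:

import requests

resp = requests.post(
    "https://api.example.com/api/v1/tokens/switch",
    headers={
        "Authorization": "Bearer <access-token>",
        "Content-Type": "application/vnd.api+json",
    },
    json={
        "data": {
            "type": "tokens-switch-tenant",
            "attributes": {"tenant_id": "11111111-1111-1111-1111-111111111111"},
        }
    },
)
print(resp.status_code, resp.json())  # 200 with new access/refresh tokens on success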
@extend_schema(exclude=True)
class SchemaView(SpectacularAPIView):
serializer_class = None
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.1.1"
spectacular_settings.VERSION = "1.7.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -248,10 +305,67 @@ class SchemaView(SpectacularAPIView):
"description": "Endpoints for task management, allowing retrieval of task status and "
"revoking tasks that have not started.",
},
{
"name": "Integration",
"description": "Endpoints for managing third-party integrations, including registration, configuration,"
" retrieval, and deletion of integrations such as S3, JIRA, or other services.",
},
]
return super().get(request, *args, **kwargs)
@extend_schema(exclude=True)
class GoogleSocialLoginView(SocialLoginView):
adapter_class = GoogleOAuth2Adapter
client_class = CustomOAuth2Client
callback_url = GOOGLE_OAUTH_CALLBACK_URL
def get_response(self):
original_response = super().get_response()
if self.user and self.user.is_authenticated:
serializer = TokenSocialLoginSerializer(data={"email": self.user.email})
try:
serializer.is_valid(raise_exception=True)
except TokenError as e:
raise InvalidToken(e.args[0])
return Response(
data={
"type": "google-social-tokens",
"attributes": serializer.validated_data,
},
status=status.HTTP_200_OK,
)
return original_response
@extend_schema(exclude=True)
class GithubSocialLoginView(SocialLoginView):
adapter_class = GitHubOAuth2Adapter
client_class = CustomOAuth2Client
callback_url = GITHUB_OAUTH_CALLBACK_URL
def get_response(self):
original_response = super().get_response()
if self.user and self.user.is_authenticated:
serializer = TokenSocialLoginSerializer(data={"email": self.user.email})
try:
serializer.is_valid(raise_exception=True)
except TokenError as e:
raise InvalidToken(e.args[0])
return Response(
data={
"type": "github-social-tokens",
"attributes": serializer.validated_data,
},
status=status.HTTP_200_OK,
)
return original_response
@extend_schema_view(
list=extend_schema(
tags=["User"],
@@ -275,8 +389,8 @@ class SchemaView(SpectacularAPIView):
),
destroy=extend_schema(
tags=["User"],
summary="Delete a user account",
description="Remove a user account from the system.",
summary="Delete the user account",
description="Remove the current user account from the system.",
),
me=extend_schema(
tags=["User"],
@@ -340,6 +454,12 @@ class UserViewSet(BaseUserViewset):
status=status.HTTP_200_OK,
)
def destroy(self, request, *args, **kwargs):
if kwargs["pk"] != str(self.request.user.id):
raise ValidationError("Only the current user can be deleted.")
return super().destroy(request, *args, **kwargs)
@extend_schema(
parameters=[
OpenApiParameter(
@@ -971,6 +1091,8 @@ class ProviderViewSet(BaseRLSViewSet):
provider = get_object_or_404(Provider, pk=pk)
provider.is_deleted = True
provider.save()
task_name = f"scan-perform-scheduled-{pk}"
PeriodicTask.objects.filter(name=task_name).update(enabled=False)
with transaction.atomic():
task = delete_provider_task.delay(
@@ -1018,6 +1140,39 @@ class ProviderViewSet(BaseRLSViewSet):
request=ScanCreateSerializer,
responses={202: OpenApiResponse(response=TaskSerializer)},
),
report=extend_schema(
tags=["Scan"],
summary="Download ZIP report",
description="Returns a ZIP file containing the requested report",
request=ScanReportSerializer,
responses={
200: OpenApiResponse(description="Report obtained successfully"),
202: OpenApiResponse(description="The task is in progress"),
403: OpenApiResponse(description="There is a problem with credentials"),
404: OpenApiResponse(description="The scan has no reports"),
},
),
compliance=extend_schema(
tags=["Scan"],
summary="Retrieve compliance report as CSV",
description="Download a specific compliance report (e.g., 'cis_1.4_aws') as a CSV file.",
parameters=[
OpenApiParameter(
name="name",
type=str,
location=OpenApiParameter.PATH,
required=True,
description="The compliance report name, like 'cis_1.4_aws'",
),
],
responses={
200: OpenApiResponse(
description="CSV file containing the compliance report"
),
404: OpenApiResponse(description="Compliance report not found"),
},
request=None,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -1044,7 +1199,7 @@ class ScanViewSet(BaseRLSViewSet):
"""
if self.request.method in SAFE_METHODS:
# No permissions required for GET requests
self.required_permissions = [Permissions.MANAGE_PROVIDERS]
self.required_permissions = []
else:
# Require permission for non-GET requests
self.required_permissions = [Permissions.MANAGE_SCANS]
@@ -1066,6 +1221,14 @@ class ScanViewSet(BaseRLSViewSet):
return ScanCreateSerializer
elif self.action == "partial_update":
return ScanUpdateSerializer
elif self.action == "report":
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanReportSerializer
elif self.action == "compliance":
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanComplianceReportSerializer
return super().get_serializer_class()
def partial_update(self, request, *args, **kwargs):
@@ -1083,6 +1246,210 @@ class ScanViewSet(BaseRLSViewSet):
)
return Response(data=read_serializer.data, status=status.HTTP_200_OK)
def _get_task_status(self, scan_instance):
"""
Returns task status if the scan or its associated report-generation task is still executing.
If the scan is in an EXECUTING state or if a background task related to report generation
is found and also executing, this method returns a 202 Accepted response with the task
metadata and a `Content-Location` header pointing to the task detail endpoint.
Args:
scan_instance (Scan): The scan instance for which the task status is being checked.
Returns:
Response or None:
- A `Response` with HTTP 202 status and serialized task data if the task is executing.
- `None` if no running task is found or if the task has already completed.
"""
task = None
if scan_instance.state == StateChoices.EXECUTING and scan_instance.task:
task = scan_instance.task
else:
try:
task = Task.objects.get(
task_runner_task__task_name="scan-report",
task_runner_task__task_args__contains=str(scan_instance.id),
)
except Task.DoesNotExist:
return None
self.response_serializer_class = TaskSerializer
serializer = self.get_serializer(task)
if serializer.data.get("state") != StateChoices.EXECUTING:
return None
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": serializer.data["id"]}
)
},
)
def _load_file(self, path_pattern, s3=False, bucket=None, list_objects=False):
"""
Loads a binary file (e.g., ZIP or CSV) and returns its content and filename.
Depending on the input parameters, this method supports loading:
- From S3 using a direct key.
- From S3 by listing objects under a prefix and matching suffix.
- From the local filesystem using glob pattern matching.
Args:
path_pattern (str): The key or glob pattern representing the file location.
s3 (bool, optional): Whether the file is stored in S3. Defaults to False.
bucket (str, optional): The name of the S3 bucket, required if `s3=True`. Defaults to None.
list_objects (bool, optional): If True and `s3=True`, list objects by prefix to find the file. Defaults to False.
Returns:
tuple[bytes, str]: A tuple containing the file content as bytes and the filename if successful.
Response: A DRF `Response` object with an appropriate status and error detail if an error occurs.
"""
if s3:
try:
client = get_s3_client()
except (ClientError, NoCredentialsError, ParamValidationError):
return Response(
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
if list_objects:
# list keys under prefix then match suffix
prefix = os.path.dirname(path_pattern)
suffix = os.path.basename(path_pattern)
try:
resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
except ClientError as e:
sentry_sdk.capture_exception(e)
return Response(
{
"detail": "Unable to list compliance files in S3: encountered an AWS error."
},
status=status.HTTP_502_BAD_GATEWAY,
)
contents = resp.get("Contents", [])
keys = [obj["Key"] for obj in contents if obj["Key"].endswith(suffix)]
if not keys:
return Response(
{
"detail": f"No compliance file found for name '{os.path.splitext(suffix)[0]}'."
},
status=status.HTTP_404_NOT_FOUND,
)
# When listing, path_pattern is a prefix-and-suffix pattern; the compliance action builds the correct suffix before calling this method
key = keys[0]
else:
# path_pattern is exact key
key = path_pattern
try:
s3_obj = client.get_object(Bucket=bucket, Key=key)
except ClientError as e:
code = e.response.get("Error", {}).get("Code")
if code == "NoSuchKey":
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
)
return Response(
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
content = s3_obj["Body"].read()
filename = os.path.basename(key)
else:
files = glob.glob(path_pattern)
if not files:
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
)
filepath = files[0]
with open(filepath, "rb") as f:
content = f.read()
filename = os.path.basename(filepath)
return content, filename
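When list_objects=True, the method above locates the object by listing keys under a prefix and matching on a filename suffix. A minimal sketch of that matching step, with a plain list of keys standing in for the boto3 client:

import os

def find_key(keys: list, path_pattern: str) -> str | None:
    prefix = os.path.dirname(path_pattern)   # stands in for list_objects_v2 Prefix
    suffix = os.path.basename(path_pattern)  # matched with endswith, as above
    matches = [k for k in keys if k.startswith(prefix) and k.endswith(suffix)]
    return matches[0] if matches else None

keys = ["scans/1/compliance/20250101120000_cis_1.4_aws.csv"]
print(find_key(keys, "scans/1/compliance/cis_1.4_aws.csv"))  # the timestamped key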
def _serve_file(self, content, filename, content_type):
response = HttpResponse(content, content_type=content_type)
response["Content-Disposition"] = f'attachment; filename="{filename}"'
return response
@action(detail=True, methods=["get"], url_name="report")
def report(self, request, pk=None):
scan = self.get_object()
# Check for executing tasks
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
loader = self._load_file(
key_prefix, s3=True, bucket=bucket, list_objects=False
)
else:
loader = self._load_file(scan.output_location, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "application/x-zip-compressed")
@action(
detail=True,
methods=["get"],
url_path="compliance/(?P<name>[^/]+)",
url_name="compliance",
)
def compliance(self, request, pk=None, name=None):
scan = self.get_object()
if name not in get_compliance_frameworks(scan.provider.provider):
return Response(
{"detail": f"Compliance '{name}' not found."},
status=status.HTTP_404_NOT_FOUND,
)
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
prefix = os.path.join(
os.path.dirname(key_prefix), "compliance", f"{name}.csv"
)
loader = self._load_file(prefix, s3=True, bucket=bucket, list_objects=True)
else:
base = os.path.dirname(scan.output_location)
pattern = os.path.join(base, "compliance", f"*_{name}.csv")
loader = self._load_file(pattern, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "text/csv")
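A hypothetical download of that CSV, assuming the ScanViewSet is registered at /scans on the same router so the action resolves to /scans/{id}/compliance/{name}; host, scan ID, and token are placeholders:

import requests

resp = requests.get(
    "https://api.example.com/api/v1/scans/<scan-id>/compliance/cis_1.4_aws",
    headers={"Authorization": "Bearer <access-token>"},
)
if resp.status_code == 200:
    with open("cis_1.4_aws.csv", "wb") as f:
        f.write(resp.content)  # Content-Disposition carries the server's filename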
def create(self, request, *args, **kwargs):
input_serializer = self.get_serializer(data=request.data)
input_serializer.is_valid(raise_exception=True)
@@ -1097,10 +1464,6 @@ class ScanViewSet(BaseRLSViewSet):
# Disabled for now
# checks_to_execute=scan.scanner_args.get("checks_to_execute"),
},
link=perform_scan_summary_task.si(
tenant_id=self.request.tenant_id,
scan_id=str(scan.id),
),
)
scan.task_id = task.id
@@ -1264,6 +1627,14 @@ class ResourceViewSet(BaseRLSViewSet):
tags=["Finding"],
summary="List all findings",
description="Retrieve a list of all findings with options for filtering by various criteria.",
parameters=[
OpenApiParameter(
name="filter[inserted_at]",
description="At least one of the variations of the `filter[inserted_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
),
retrieve=extend_schema(
tags=["Finding"],
@@ -1274,33 +1645,51 @@ class ResourceViewSet(BaseRLSViewSet):
tags=["Finding"],
summary="Retrieve the services and regions that are impacted by findings",
description="Fetch services and regions affected in findings.",
responses={201: OpenApiResponse(response=MembershipSerializer)},
filters=True,
deprecated=True,
),
metadata=extend_schema(
tags=["Finding"],
summary="Retrieve metadata values from findings",
description="Fetch unique metadata values from a set of findings. This is useful for dynamic filtering.",
parameters=[
OpenApiParameter(
name="filter[inserted_at]",
description="At least one of the variations of the `filter[inserted_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
filters=True,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class FindingViewSet(BaseRLSViewSet):
queryset = Finding.objects.all()
queryset = Finding.all_objects.all()
serializer_class = FindingSerializer
prefetch_for_includes = {
"__all__": [],
"resources": [
Prefetch("resources", queryset=Resource.objects.select_related("findings"))
],
"scan": [Prefetch("scan", queryset=Scan.objects.select_related("findings"))],
}
http_method_names = ["get"]
filterset_class = FindingFilter
ordering = ["-id"]
http_method_names = ["get"]
ordering = ["-inserted_at"]
ordering_fields = [
"id",
"status",
"severity",
"check_id",
"inserted_at",
"updated_at",
]
prefetch_for_includes = {
"__all__": [],
"resources": [
Prefetch(
"resources",
queryset=Resource.all_objects.prefetch_related("tags", "findings"),
)
],
"scan": [
Prefetch("scan", queryset=Scan.all_objects.select_related("findings"))
],
}
# RBAC required permissions (implicit -> MANAGE_PROVIDERS enables unlimited visibility; otherwise visibility is
# checked per provider through the role's provider groups)
required_permissions = []
@@ -1308,52 +1697,65 @@ class FindingViewSet(BaseRLSViewSet):
def get_serializer_class(self):
if self.action == "findings_services_regions":
return FindingDynamicFilterSerializer
elif self.action == "metadata":
return FindingMetadataSerializer
return super().get_serializer_class()
def get_queryset(self):
tenant_id = self.request.tenant_id
user_roles = get_role(self.request.user)
if user_roles.unlimited_visibility:
# User has unlimited visibility, return all scans
queryset = Finding.objects.filter(tenant_id=self.request.tenant_id)
# User has unlimited visibility, return all findings
queryset = Finding.all_objects.filter(tenant_id=tenant_id)
else:
# User lacks permission, filter providers based on provider groups associated with the role
queryset = Finding.objects.filter(
# User lacks permission, filter findings based on provider groups associated with the role
queryset = Finding.all_objects.filter(
scan__provider__in=get_providers(user_roles)
)
search_value = self.request.query_params.get("filter[search]", None)
if search_value:
# Django's ORM will build a LEFT JOIN and OUTER JOIN on any "through" tables, resulting in duplicates
# The duplicates then require a `distinct` query
search_query = SearchQuery(
search_value, config="simple", search_type="plain"
)
resource_match = Resource.all_objects.filter(
text_search=search_query,
id__in=ResourceFindingMapping.objects.filter(
resource_id=OuterRef("pk"),
tenant_id=tenant_id,
).values("resource_id"),
)
queryset = queryset.filter(
Q(impact_extended__contains=search_value)
| Q(status_extended__contains=search_value)
| Q(check_id=search_value)
| Q(check_id__icontains=search_value)
| Q(text_search=search_query)
| Q(resources__uid=search_value)
| Q(resources__name=search_value)
| Q(resources__region=search_value)
| Q(resources__service=search_value)
| Q(resources__type=search_value)
| Q(resources__uid__contains=search_value)
| Q(resources__name__contains=search_value)
| Q(resources__region__contains=search_value)
| Q(resources__service__contains=search_value)
| Q(resources__tags__text_search=search_query)
| Q(resources__text_search=search_query)
).distinct()
Q(text_search=search_query) | Q(Exists(resource_match))
)
return queryset
def inserted_at_to_uuidv7(self, inserted_at):
if inserted_at is None:
return None
return datetime_to_uuid7(inserted_at)
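datetime_to_uuid7 itself lives in api.uuid_utils and is not shown in this diff; as background, a UUIDv7 stores a 48-bit Unix timestamp in milliseconds in its top bits, so a datetime maps naturally to the smallest UUIDv7 of its millisecond. A hypothetical sketch of that construction (not the repository's implementation):

from datetime import datetime, timezone
from uuid import UUID

def datetime_to_uuid7_floor(dt: datetime) -> UUID:
    ms = int(dt.timestamp() * 1000) & ((1 << 48) - 1)  # 48-bit unix_ts_ms
    # version 7 in bits 76-79, RFC 4122 variant "10" in bits 62-63;
    # random fields left at zero give the lower bound for this millisecond
    value = (ms << 80) | (0x7 << 76) | (0x2 << 62)
    return UUID(int=value)

print(datetime_to_uuid7_floor(datetime(2025, 5, 1, tzinfo=timezone.utc)))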
def filter_queryset(self, queryset):
# Do not apply filters when retrieving specific finding
if self.action == "retrieve":
return queryset
return super().filter_queryset(queryset)
def list(self, request, *args, **kwargs):
base_qs = self.filter_queryset(self.get_queryset())
paginated_ids = self.paginate_queryset(base_qs.values_list("id", flat=True))
if paginated_ids is not None:
ids = list(paginated_ids)
findings = (
Finding.all_objects.filter(tenant_id=self.request.tenant_id, id__in=ids)
.select_related("scan")
.prefetch_related("resources")
)
# Re-sort in Python to preserve ordering:
findings = sorted(findings, key=lambda x: ids.index(x.id))
serializer = self.get_serializer(findings, many=True)
return self.get_paginated_response(serializer.data)
serializer = self.get_serializer(base_qs, many=True)
return Response(serializer.data)
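The two-phase listing above paginates over bare ids first (cheap on a large table), then fetches the full rows for just that page and restores the page order in Python, since an id__in query does not guarantee it. A stand-alone sketch of the reordering step; for large pages a position map avoids the quadratic cost of list.index:

def fetch_page_preserving_order(ids: list, fetch_by_ids) -> list:
    rows = fetch_by_ids(ids)  # e.g. an id__in query; result order is arbitrary
    position = {id_: i for i, id_ in enumerate(ids)}
    return sorted(rows, key=lambda row: position[row["id"]])

page = fetch_page_preserving_order(
    [3, 1, 2],
    lambda ids: [{"id": 1}, {"id": 2}, {"id": 3}],
)
print([row["id"] for row in page])  # [3, 1, 2]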
@action(detail=False, methods=["get"], url_name="findings_services_regions")
def findings_services_regions(self, request):
@@ -1376,6 +1778,38 @@ class FindingViewSet(BaseRLSViewSet):
return Response(data=serializer.data, status=status.HTTP_200_OK)
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
tenant_id = self.request.tenant_id
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
filtered_ids = filtered_queryset.order_by().values("id")
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data, status=status.HTTP_200_OK)
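The metadata action reduces the resources behind the filtered findings to sorted, de-duplicated filter options, dropping empty regions. The same shaping step as a stand-alone sketch over plain dicts:

def build_metadata(resources: list) -> dict:
    return {
        "services": sorted({r["service"] for r in resources}),
        "regions": sorted({r["region"] for r in resources if r["region"]}),
        "resource_types": sorted({r["type"] for r in resources}),
    }

print(build_metadata([
    {"service": "s3", "region": "eu-west-1", "type": "bucket"},
    {"service": "ec2", "region": "", "type": "instance"},
]))
# {'services': ['ec2', 's3'], 'regions': ['eu-west-1'], 'resource_types': ['bucket', 'instance']}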
@extend_schema_view(
list=extend_schema(
@@ -1761,6 +2195,21 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Fetch detailed information about a specific compliance overview by its ID, including detailed "
"requirement information and check's status.",
),
metadata=extend_schema(
tags=["Compliance Overview"],
summary="Retrieve metadata values from compliance overviews",
description="Fetch unique metadata values from a set of compliance overviews. This is useful for dynamic "
"filtering.",
parameters=[
OpenApiParameter(
name="filter[scan_id]",
required=True,
type=OpenApiTypes.UUID,
location=OpenApiParameter.QUERY,
description="Related scan ID.",
),
],
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -1822,6 +2271,8 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
def get_serializer_class(self):
if self.action == "retrieve":
return ComplianceOverviewFullSerializer
elif self.action == "metadata":
return ComplianceOverviewMetadataSerializer
return super().get_serializer_class()
def list(self, request, *args, **kwargs):
@@ -1838,6 +2289,35 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
)
return super().list(request, *args, **kwargs)
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
scan_id = request.query_params.get("filter[scan_id]")
if not scan_id:
raise ValidationError(
[
{
"detail": "This query parameter is required.",
"status": 400,
"source": {"pointer": "filter[scan_id]"},
"code": "required",
}
]
)
tenant_id = self.request.tenant_id
regions = list(
ComplianceOverview.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.values_list("region", flat=True)
.order_by("region")
.distinct()
)
result = {"regions": regions}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data, status=status.HTTP_200_OK)
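# Hedged usage sketch (the URL shape is assumed from the router conventions and
# the response body is simplified): the action above answers
#   GET /api/v1/compliance-overviews/metadata?filter[scan_id]=<scan-uuid>
# with the scan's distinct compliance-overview regions, e.g.
#   {"regions": ["eu-west-1", "us-east-1"]}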
@extend_schema(tags=["Overview"])
@extend_schema_view(
@@ -1938,68 +2418,53 @@ class OverviewViewSet(BaseRLSViewSet):
@action(detail=False, methods=["get"], url_name="providers")
def providers(self, request):
tenant_id = self.request.tenant_id
# Subquery to get the most recent finding for each uid
latest_finding_ids = (
Finding.objects.filter(
latest_scan_ids = (
Scan.objects.filter(
tenant_id=tenant_id,
uid=OuterRef("uid"),
scan__provider=OuterRef("scan__provider"),
state=StateChoices.COMPLETED,
)
.order_by("-id") # Most recent by id
.values("id")[:1]
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
# Filter findings to only include the most recent for each uid
recent_findings = Finding.objects.filter(
tenant_id=tenant_id, id__in=Subquery(latest_finding_ids)
)
# Aggregate findings by provider
findings_aggregated = (
recent_findings.values("scan__provider__provider")
ScanSummary.objects.filter(tenant_id=tenant_id, scan_id__in=latest_scan_ids)
.values("scan__provider__provider")
.annotate(
findings_passed=Count("id", filter=Q(status=StatusChoices.PASS.value)),
findings_failed=Count("id", filter=Q(status=StatusChoices.FAIL.value)),
findings_manual=Count(
"id", filter=Q(status=StatusChoices.MANUAL.value)
),
total_findings=Count("id"),
findings_passed=Coalesce(Sum("_pass"), 0),
findings_failed=Coalesce(Sum("fail"), 0),
findings_muted=Coalesce(Sum("muted"), 0),
total_findings=Coalesce(Sum("total"), 0),
)
.order_by("-findings_failed")
)
# Aggregate total resources by provider
resources_aggregated = (
Resource.objects.filter(tenant_id=tenant_id)
.values("provider__provider")
.annotate(total_resources=Count("id"))
)
resources_dict = {
row["provider__provider"]: row["total_resources"]
for row in resources_aggregated
}
# Combine findings and resources data
overview = []
for findings in findings_aggregated:
provider = findings["scan__provider__provider"]
total_resources = next(
(
res["total_resources"]
for res in resources_aggregated
if res["provider__provider"] == provider
),
0,
)
for row in findings_aggregated:
provider_type = row["scan__provider__provider"]
overview.append(
{
"provider": provider,
"total_resources": total_resources,
"total_findings": findings["total_findings"],
"findings_passed": findings["findings_passed"],
"findings_failed": findings["findings_failed"],
"findings_manual": findings["findings_manual"],
"provider": provider_type,
"total_resources": resources_dict.get(provider_type, 0),
"total_findings": row["total_findings"],
"findings_passed": row["findings_passed"],
"findings_failed": row["findings_failed"],
"findings_muted": row["findings_muted"],
}
)
serializer = OverviewProviderSerializer(overview, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
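# The latest_scan_ids queryset above compiles to PostgreSQL's DISTINCT ON
# (illustrative SQL; the table name is an assumption):
#   SELECT DISTINCT ON (provider_id) id
#   FROM scans
#   WHERE tenant_id = %s AND state = 'completed'
#   ORDER BY provider_id, inserted_at DESC;
# Ordering by the distinct column first and by recency second keeps exactly one
# row per provider: its most recent completed scan.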
@action(detail=False, methods=["get"], url_name="findings")
@@ -2014,7 +2479,7 @@ class OverviewViewSet(BaseRLSViewSet):
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-id")
.order_by("-inserted_at")
.values("id")[:1]
)
@@ -2059,7 +2524,7 @@ class OverviewViewSet(BaseRLSViewSet):
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-id")
.order_by("-inserted_at")
.values("id")[:1]
)
@@ -2095,7 +2560,7 @@ class OverviewViewSet(BaseRLSViewSet):
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-id")
.order_by("-inserted_at")
.values("id")[:1]
)
@@ -2173,3 +2638,67 @@ class ScheduleViewSet(BaseRLSViewSet):
)
},
)
@extend_schema_view(
list=extend_schema(
tags=["Integration"],
summary="List all integrations",
description="Retrieve a list of all configured integrations with options for filtering by various criteria.",
),
retrieve=extend_schema(
tags=["Integration"],
summary="Retrieve integration details",
description="Fetch detailed information about a specific integration by its ID.",
),
create=extend_schema(
tags=["Integration"],
summary="Create a new integration",
description="Register a new integration with the system, providing necessary configuration details.",
),
partial_update=extend_schema(
tags=["Integration"],
summary="Partially update an integration",
description="Modify certain fields of an existing integration without affecting other settings.",
),
destroy=extend_schema(
tags=["Integration"],
summary="Delete an integration",
description="Remove an integration from the system by its ID.",
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class IntegrationViewSet(BaseRLSViewSet):
queryset = Integration.objects.all()
serializer_class = IntegrationSerializer
http_method_names = ["get", "post", "patch", "delete"]
filterset_class = IntegrationFilter
ordering = ["integration_type", "-inserted_at"]
# RBAC required permissions
required_permissions = [Permissions.MANAGE_INTEGRATIONS]
allowed_providers = None
def get_queryset(self):
user_roles = get_role(self.request.user)
if user_roles.unlimited_visibility:
# User has unlimited visibility, return all integrations
queryset = Integration.objects.filter(tenant_id=self.request.tenant_id)
else:
# User lacks unlimited visibility; restrict to providers in the provider groups associated with the role
allowed_providers = get_providers(user_roles)
queryset = Integration.objects.filter(providers__in=allowed_providers)
self.allowed_providers = allowed_providers
return queryset
def get_serializer_class(self):
if self.action == "create":
return IntegrationCreateSerializer
elif self.action == "partial_update":
return IntegrationUpdateSerializer
return super().get_serializer_class()
def get_serializer_context(self):
context = super().get_serializer_context()
context["allowed_providers"] = self.allowed_providers
return context
+2 -2
@@ -50,9 +50,9 @@ class RLSTask(Task):
tenant_id = kwargs.get("tenant_id")
with rls_transaction(tenant_id):
APITask.objects.create(
APITask.objects.update_or_create(
id=task_result_instance.task_id,
tenant_id=tenant_id,
task_runner_task=task_result_instance,
defaults={"task_runner_task": task_result_instance},
)
return result
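# Note: update_or_create makes this bookkeeping idempotent; a Celery retry that
# reuses the same task_id updates the existing APITask row instead of raising
# an integrity error on the duplicate id.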
+4 -5
@@ -2,9 +2,8 @@ import json
import logging
from enum import StrEnum
from django_guid.log_filters import CorrelationId
from config.env import env
from django_guid.log_filters import CorrelationId
class BackendLogger(StrEnum):
@@ -39,9 +38,9 @@ class NDJSONFormatter(logging.Formatter):
"funcName": record.funcName,
"process": record.process,
"thread": record.thread,
"transaction_id": record.transaction_id
if hasattr(record, "transaction_id")
else None,
"transaction_id": (
record.transaction_id if hasattr(record, "transaction_id") else None
),
}
# Add REST API extra fields
+36
@@ -4,6 +4,8 @@ from config.custom_logging import LOGGING # noqa
from config.env import BASE_DIR, env # noqa
from config.settings.celery import * # noqa
from config.settings.partitions import * # noqa
from config.settings.sentry import * # noqa
from config.settings.social_login import * # noqa
SECRET_KEY = env("SECRET_KEY", default="secret")
DEBUG = env.bool("DJANGO_DEBUG", default=False)
@@ -29,6 +31,13 @@ INSTALLED_APPS = [
"django_celery_results",
"django_celery_beat",
"rest_framework_simplejwt.token_blacklist",
"allauth",
"allauth.account",
"allauth.socialaccount",
"allauth.socialaccount.providers.google",
"allauth.socialaccount.providers.github",
"dj_rest_auth.registration",
"rest_framework.authtoken",
]
MIDDLEWARE = [
@@ -42,8 +51,11 @@ MIDDLEWARE = [
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"api.middleware.APILoggingMiddleware",
"allauth.account.middleware.AccountMiddleware",
]
SITE_ID = 1
CORS_ALLOWED_ORIGINS = ["http://localhost", "http://127.0.0.1"]
ROOT_URLCONF = "config.urls"
@@ -207,3 +219,27 @@ CACHE_STALE_WHILE_REVALIDATE = env.int("DJANGO_STALE_WHILE_REVALIDATE", 60)
TESTING = False
FINDINGS_MAX_DAYS_IN_RANGE = env.int("DJANGO_FINDINGS_MAX_DAYS_IN_RANGE", 7)
# API export settings
DJANGO_TMP_OUTPUT_DIRECTORY = env.str(
"DJANGO_TMP_OUTPUT_DIRECTORY", "/tmp/prowler_api_output"
)
DJANGO_FINDINGS_BATCH_SIZE = env.int("DJANGO_FINDINGS_BATCH_SIZE", 1000)
DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = env.str("DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID", "")
DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
"DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY", ""
)
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
-1
@@ -1,7 +1,6 @@
from config.django.base import * # noqa
from config.env import env
DEBUG = env.bool("DJANGO_DEBUG", default=True)
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=["*"])
@@ -1,7 +1,6 @@
from config.django.base import * # noqa
from config.env import env
DEBUG = env.bool("DJANGO_DEBUG", default=False)
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=["localhost", "127.0.0.1"])
-1
@@ -1,7 +1,6 @@
from config.django.base import * # noqa
from config.env import env
DEBUG = env.bool("DJANGO_DEBUG", default=False)
ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=["localhost", "127.0.0.1"])
+102
@@ -0,0 +1,102 @@
import sentry_sdk
from config.env import env
IGNORED_EXCEPTIONS = [
# Provider is not connected due to credentials errors
"is not connected",
# Authentication Errors from AWS
"InvalidToken",
"AccessDeniedException",
"AuthorizationErrorException",
"UnrecognizedClientException",
"UnauthorizedOperation",
"AuthFailure",
"InvalidClientTokenId",
"AWSInvalidProviderIdError",
"InternalServerErrorException",
"AccessDenied",
"No Shodan API Key", # Shodan Check
"RequestLimitExceeded", # For now we don't want to log the RequestLimitExceeded errors
"ThrottlingException",
"Rate exceeded",
"SubscriptionRequiredException",
"UnknownOperationException",
"OptInRequired",
"ReadTimeout",
"LimitExceeded",
"ConnectTimeoutError",
"ExpiredToken",
"IncompleteSignature",
"RegionDisabledException",
"TooManyRequestsException",
"SignatureDoesNotMatch",
"InvalidParameterValueException",
"InvalidInputException",
"ValidationException",
"AWSSecretAccessKeyInvalidError",
"InvalidAction",
"InvalidRequestException",
"RequestExpired",
"ConnectionClosedError",
"MaxRetryError",
"AWSAccessKeyIDInvalidError",
"AWSSessionTokenExpiredError",
"EndpointConnectionError", # AWS Service is not available in a region
"Pool is closed", # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
# Authentication Errors from GCP
"ClientAuthenticationError",
"AuthorizationFailed",
"Reauthentication is needed",
"Permission denied to get service",
"API has not been used in project",
"HttpError 404 when requesting",
"HttpError 403 when requesting",
"HttpError 400 when requesting",
"GCPNoAccesibleProjectsError",
# Authentication Errors from Azure
"ClientAuthenticationError",
"AuthorizationFailed",
"Subscription Not Registered",
"AzureNotValidClientIdError",
"AzureNotValidClientSecretError",
"AzureNotValidTenantIdError",
"AzureInvalidProviderIdError",
"AzureTenantIdAndClientSecretNotBelongingToClientIdError",
"AzureTenantIdAndClientIdNotBelongingToClientSecretError",
"AzureClientIdAndClientSecretNotBelongingToTenantIdError",
"AzureHTTPResponseError",
"Error with credentials provided",
]
def before_send(event, hint):
"""
before_send inspects each Sentry event and decides whether to send or drop it
"""
# Drop ERROR-level log events whose message matches one of the ignored
# exceptions; everything else is sent unchanged.
# https://docs.python.org/3/library/logging.html#logrecord-objects
if "log_record" in hint:
log_msg = hint["log_record"].msg
log_lvl = hint["log_record"].levelno
# levelno 40 == logging.ERROR
if log_lvl == 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return
return event
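# A sketch of the hook's behavior (levelno 40 == logging.ERROR): an ERROR log
# record whose message contains an ignored marker is dropped, anything else
# passes through, e.g.
#   record = logging.LogRecord("api", logging.ERROR, __file__, 0,
#                              "AccessDenied while scanning", None, None)
#   before_send({"event": "stub"}, {"log_record": record})  # -> None (dropped)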
sentry_sdk.init(
dsn=env.str("DJANGO_SENTRY_DSN", ""),
# Add data like request headers and IP for users,
# see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
before_send=before_send,
send_default_pii=True,
_experiments={
# Set continuous_profiling_auto_start to True
# to automatically start the profiler when possible.
"continuous_profiling_auto_start": True,
},
attach_stacktrace=True,
ignore_errors=IGNORED_EXCEPTIONS,
)
@@ -0,0 +1,53 @@
from config.env import env
# Provider Oauth settings
GOOGLE_OAUTH_CLIENT_ID = env("SOCIAL_GOOGLE_OAUTH_CLIENT_ID", default="")
GOOGLE_OAUTH_CLIENT_SECRET = env("SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET", default="")
GOOGLE_OAUTH_CALLBACK_URL = env("SOCIAL_GOOGLE_OAUTH_CALLBACK_URL", default="")
GITHUB_OAUTH_CLIENT_ID = env("SOCIAL_GITHUB_OAUTH_CLIENT_ID", default="")
GITHUB_OAUTH_CLIENT_SECRET = env("SOCIAL_GITHUB_OAUTH_CLIENT_SECRET", default="")
GITHUB_OAUTH_CALLBACK_URL = env("SOCIAL_GITHUB_OAUTH_CALLBACK_URL", default="")
# Allauth settings
ACCOUNT_LOGIN_METHODS = {"email"} # Use Email / Password authentication
ACCOUNT_USERNAME_REQUIRED = False
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_EMAIL_VERIFICATION = "none" # Do not require email confirmation
ACCOUNT_USER_MODEL_USERNAME_FIELD = None
REST_AUTH = {
"TOKEN_MODEL": None,
"REST_USE_JWT": True,
}
# django-allauth (social)
# Authenticate if a local account with this email address already exists
SOCIALACCOUNT_EMAIL_AUTHENTICATION = True
# Connect the social account to the local account if one with that email address already exists
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = True
SOCIALACCOUNT_ADAPTER = "api.adapters.ProwlerSocialAccountAdapter"
SOCIALACCOUNT_PROVIDERS = {
"google": {
"APP": {
"client_id": GOOGLE_OAUTH_CLIENT_ID,
"secret": GOOGLE_OAUTH_CLIENT_SECRET,
"key": "",
},
"SCOPE": [
"email",
"profile",
],
"AUTH_PARAMS": {
"access_type": "online",
},
},
"github": {
"APP": {
"client_id": GITHUB_OAUTH_CLIENT_ID,
"secret": GITHUB_OAUTH_CLIENT_SECRET,
},
"SCOPE": [
"user",
"read:org",
],
},
}
+46 -1
@@ -15,6 +15,8 @@ from api.db_utils import rls_transaction
from api.models import (
ComplianceOverview,
Finding,
Integration,
IntegrationProviderRelationship,
Invitation,
Membership,
Provider,
@@ -486,7 +488,7 @@ def scans_fixture(tenants_fixture, providers_fixture):
name="Scan 1",
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.AVAILABLE,
state=StateChoices.COMPLETED,
tenant_id=tenant.id,
started_at="2024-01-02T00:00:00Z",
)
@@ -626,6 +628,7 @@ def findings_fixture(scans_fixture, resources_fixture):
"CheckId": "test_check_id",
"Description": "test description apple sauce",
},
first_seen_at="2024-01-02T00:00:00Z",
)
finding1.add_resources([resource1])
@@ -651,6 +654,8 @@ def findings_fixture(scans_fixture, resources_fixture):
"CheckId": "test_check_id",
"Description": "test description orange juice",
},
first_seen_at="2024-01-02T00:00:00Z",
muted=True,
)
finding2.add_resources([resource2])
@@ -875,6 +880,46 @@ def scan_summaries_fixture(tenants_fixture, providers_fixture):
)
@pytest.fixture
def integrations_fixture(providers_fixture):
provider1, provider2, *_ = providers_fixture
tenant_id = provider1.tenant_id
integration1 = Integration.objects.create(
tenant_id=tenant_id,
enabled=True,
connected=True,
integration_type="amazon_s3",
configuration={"key": "value"},
credentials={"psswd": "1234"},
)
IntegrationProviderRelationship.objects.create(
tenant_id=tenant_id,
integration=integration1,
provider=provider1,
)
integration2 = Integration.objects.create(
tenant_id=tenant_id,
enabled=True,
connected=True,
integration_type="amazon_s3",
configuration={"key": "value"},
credentials={"psswd": "1234"},
)
IntegrationProviderRelationship.objects.create(
tenant_id=tenant_id,
integration=integration2,
provider=provider1,
)
IntegrationProviderRelationship.objects.create(
tenant_id=tenant_id,
integration=integration2,
provider=provider2,
)
return integration1, integration2
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}
+36 -19
@@ -5,10 +5,14 @@ from django_celery_beat.models import IntervalSchedule, PeriodicTask
from rest_framework_json_api.serializers import ValidationError
from tasks.tasks import perform_scheduled_scan_task
from api.models import Provider
from api.db_utils import rls_transaction
from api.models import Provider, Scan, StateChoices
def schedule_provider_scan(provider_instance: Provider):
tenant_id = str(provider_instance.tenant_id)
provider_id = str(provider_instance.id)
schedule, _ = IntervalSchedule.objects.get_or_create(
every=24,
period=IntervalSchedule.HOURS,
@@ -17,23 +21,9 @@ def schedule_provider_scan(provider_instance: Provider):
# Create a unique name for the periodic task
task_name = f"scan-perform-scheduled-{provider_instance.id}"
# Schedule the task
_, created = PeriodicTask.objects.get_or_create(
interval=schedule,
name=task_name,
task="scan-perform-scheduled",
kwargs=json.dumps(
{
"tenant_id": str(provider_instance.tenant_id),
"provider_id": str(provider_instance.id),
}
),
one_off=False,
defaults={
"start_time": datetime.now(timezone.utc) + timedelta(hours=24),
},
)
if not created:
if PeriodicTask.objects.filter(
interval=schedule, name=task_name, task="scan-perform-scheduled"
).exists():
raise ValidationError(
[
{
@@ -45,9 +35,36 @@ def schedule_provider_scan(provider_instance: Provider):
]
)
with rls_transaction(tenant_id):
scheduled_scan = Scan.objects.create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.AVAILABLE,
scheduled_at=datetime.now(timezone.utc),
)
# Schedule the task
periodic_task_instance = PeriodicTask.objects.create(
interval=schedule,
name=task_name,
task="scan-perform-scheduled",
kwargs=json.dumps(
{
"tenant_id": tenant_id,
"provider_id": provider_id,
}
),
one_off=False,
start_time=datetime.now(timezone.utc) + timedelta(hours=24),
)
scheduled_scan.scheduler_task_id = periodic_task_instance.id
scheduled_scan.save()
return perform_scheduled_scan_task.apply_async(
kwargs={
"tenant_id": str(provider_instance.tenant_id),
"provider_id": str(provider_instance.id),
"provider_id": provider_id,
},
)
+27 -29
@@ -1,5 +1,5 @@
from celery.utils.log import get_task_logger
from django.db import transaction
from django.db import DatabaseError
from api.db_router import MainRouter
from api.db_utils import batch_delete, rls_transaction
@@ -8,11 +8,12 @@ from api.models import Finding, Provider, Resource, Scan, ScanSummary, Tenant
logger = get_task_logger(__name__)
def delete_provider(pk: str):
def delete_provider(tenant_id: str, pk: str):
"""
Gracefully deletes an instance of a provider along with its related data.
Args:
tenant_id (str): Tenant ID the resources belong to.
pk (str): The primary key of the Provider instance to delete.
Returns:
@@ -22,33 +23,31 @@ def delete_provider(pk: str):
Raises:
Provider.DoesNotExist: If no instance with the provided primary key exists.
"""
instance = Provider.all_objects.get(pk=pk)
deletion_summary = {}
with rls_transaction(tenant_id):
instance = Provider.all_objects.get(pk=pk)
deletion_summary = {}
deletion_steps = [
("Scan Summaries", ScanSummary.all_objects.filter(scan__provider=instance)),
("Findings", Finding.all_objects.filter(scan__provider=instance)),
("Resources", Resource.all_objects.filter(provider=instance)),
("Scans", Scan.all_objects.filter(provider=instance)),
]
with transaction.atomic():
# Delete Scan Summaries
scan_summaries_qs = ScanSummary.all_objects.filter(scan__provider=instance)
_, scans_summ_summary = batch_delete(scan_summaries_qs)
deletion_summary.update(scans_summ_summary)
for step_name, queryset in deletion_steps:
try:
_, step_summary = batch_delete(tenant_id, queryset)
deletion_summary.update(step_summary)
except DatabaseError as db_error:
logger.error(f"Error deleting {step_name}: {db_error}")
raise
# Delete Findings
findings_qs = Finding.all_objects.filter(scan__provider=instance)
_, findings_summary = batch_delete(findings_qs)
deletion_summary.update(findings_summary)
# Delete Resources
resources_qs = Resource.all_objects.filter(provider=instance)
_, resources_summary = batch_delete(resources_qs)
deletion_summary.update(resources_summary)
# Delete Scans
scans_qs = Scan.all_objects.filter(provider=instance)
_, scans_summary = batch_delete(scans_qs)
deletion_summary.update(scans_summary)
provider_deleted_count, provider_summary = instance.delete()
try:
with rls_transaction(tenant_id):
_, provider_summary = instance.delete()
deletion_summary.update(provider_summary)
except DatabaseError as db_error:
logger.error(f"Error deleting Provider: {db_error}")
raise
return deletion_summary
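# A hedged sketch of the chunked-delete idea behind api.db_utils.batch_delete
# (the real helper takes (tenant_id, queryset); these internals are assumed):
# deleting bounded pk slices keeps each statement and its lock window small.
def batch_delete_sketch(queryset, batch_size=5000):
    deletion_summary = {}
    while True:
        pks = list(queryset.values_list("pk", flat=True)[:batch_size])
        if not pks:
            break
        _, per_model = queryset.filter(pk__in=pks).delete()
        for label, count in per_model.items():
            deletion_summary[label] = deletion_summary.get(label, 0) + count
    return sum(deletion_summary.values()), deletion_summary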
@@ -66,9 +65,8 @@ def delete_tenant(pk: str):
deletion_summary = {}
for provider in Provider.objects.using(MainRouter.admin_db).filter(tenant_id=pk):
with rls_transaction(pk):
summary = delete_provider(provider.id)
deletion_summary.update(summary)
summary = delete_provider(pk, provider.id)
deletion_summary.update(summary)
Tenant.objects.using(MainRouter.admin_db).filter(id=pk).delete()
+249
@@ -0,0 +1,249 @@
import os
import zipfile
import boto3
import config.django.base as base
from botocore.exceptions import ClientError, NoCredentialsError, ParamValidationError
from celery.utils.log import get_task_logger
from django.conf import settings
from prowler.config.config import (
csv_file_suffix,
html_file_suffix,
json_ocsf_file_suffix,
output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
from prowler.lib.outputs.compliance.ens.ens_gcp import GCPENS
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_azure import AzureISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
KubernetesISO27001,
)
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_aws import (
ProwlerThreatScoreAWS,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_azure import (
ProwlerThreatScoreAzure,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_gcp import (
ProwlerThreatScoreGCP,
)
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
logger = get_task_logger(__name__)
COMPLIANCE_CLASS_MAP = {
"aws": [
(lambda name: name.startswith("cis_"), AWSCIS),
(lambda name: name == "mitre_attack_aws", AWSMitreAttack),
(lambda name: name.startswith("ens_"), AWSENS),
(
lambda name: name.startswith("aws_well_architected_framework"),
AWSWellArchitected,
),
(lambda name: name.startswith("iso27001_"), AWSISO27001),
(lambda name: name.startswith("kisa"), AWSKISAISMSP),
(lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
],
"azure": [
(lambda name: name.startswith("cis_"), AzureCIS),
(lambda name: name == "mitre_attack_azure", AzureMitreAttack),
(lambda name: name.startswith("ens_"), AzureENS),
(lambda name: name.startswith("iso27001_"), AzureISO27001),
(lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
],
"gcp": [
(lambda name: name.startswith("cis_"), GCPCIS),
(lambda name: name == "mitre_attack_gcp", GCPMitreAttack),
(lambda name: name.startswith("ens_"), GCPENS),
(lambda name: name.startswith("iso27001_"), GCPISO27001),
(lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
],
"kubernetes": [
(lambda name: name.startswith("cis_"), KubernetesCIS),
(lambda name: name.startswith("iso27001_"), KubernetesISO27001),
],
"m365": [
(lambda name: name.startswith("cis_"), M365CIS),
],
}
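# Dispatch note: the (predicate, class) pairs are checked in order and the
# first match wins (see the lookup loop in generate_outputs further down),
# with GenericCompliance as the fallback when nothing matches.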
# Predefined mapping for output formats and their configurations
OUTPUT_FORMATS_MAPPING = {
"csv": {
"class": CSV,
"suffix": csv_file_suffix,
"kwargs": {},
},
"json-ocsf": {"class": OCSF, "suffix": json_ocsf_file_suffix, "kwargs": {}},
"html": {"class": HTML, "suffix": html_file_suffix, "kwargs": {"stats": {}}},
}
def _compress_output_files(output_directory: str) -> str:
"""
Compress output files from all configured output formats into a ZIP archive.
Args:
output_directory (str): The directory where the output files are located.
The function walks the parent directory of `output_directory` and adds every
file it finds there (except the resulting archive itself) to a single ZIP.
Returns:
str: The full path to the newly created ZIP archive.
"""
zip_path = f"{output_directory}.zip"
parent_dir = os.path.dirname(output_directory)
zip_path_abs = os.path.abspath(zip_path)
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
for foldername, _, filenames in os.walk(parent_dir):
for filename in filenames:
file_path = os.path.join(foldername, filename)
if os.path.abspath(file_path) == zip_path_abs:
continue
arcname = os.path.relpath(file_path, start=parent_dir)
zipf.write(file_path, arcname)
return zip_path
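# Hedged usage sketch (paths illustrative): given
#   /tmp/.../<scan_id>/prowler-output-aws-<timestamp>
# the archive lands at
#   /tmp/.../<scan_id>/prowler-output-aws-<timestamp>.zip
# and also contains the sibling compliance/ folder, because the walk starts at
# the parent directory rather than at output_directory itself.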
def get_s3_client():
"""
Create and return a boto3 S3 client using AWS credentials from environment variables.
This function attempts to initialize an S3 client using the AWS access key, secret key,
session token, and region taken from Django settings (which are populated from the
corresponding environment variables). It then validates the client by listing available
S3 buckets. If an error occurs during this process (for example, due to missing or
invalid credentials), it falls back to creating an S3 client without explicitly provided
credentials, which may rely on other configuration sources (e.g., IAM roles).
Returns:
boto3.client: A configured S3 client instance.
Raises:
ClientError, NoCredentialsError, or ParamValidationError if both attempts to create a client fail.
"""
s3_client = None
try:
s3_client = boto3.client(
"s3",
aws_access_key_id=settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY,
aws_session_token=settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN,
region_name=settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION,
)
s3_client.list_buckets()
except (ClientError, NoCredentialsError, ParamValidationError, ValueError):
s3_client = boto3.client("s3")
s3_client.list_buckets()
return s3_client
def _upload_to_s3(tenant_id: str, zip_path: str, scan_id: str) -> str:
"""
Upload the specified ZIP file to an S3 bucket.
If the S3 output bucket setting is not configured,
the function returns None without performing an upload.
Args:
tenant_id (str): The tenant identifier, used as part of the S3 key prefix.
zip_path (str): The local file system path to the ZIP file to be uploaded.
scan_id (str): The scan identifier, used as part of the S3 key prefix.
Returns:
str: The S3 URI of the uploaded file (e.g., "s3://<bucket>/<key>") if successful.
None: If the S3 output bucket setting is not configured, or if a known upload
error occurs (the error is logged and None is returned instead of raising).
"""
bucket = base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET
if not bucket:
return None
try:
s3 = get_s3_client()
# Upload the ZIP file (outputs) to the S3 bucket
zip_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3.upload_file(
Filename=zip_path,
Bucket=bucket,
Key=zip_key,
)
# Upload the compliance directory to the S3 bucket
compliance_dir = os.path.join(os.path.dirname(zip_path), "compliance")
for filename in os.listdir(compliance_dir):
local_path = os.path.join(compliance_dir, filename)
if not os.path.isfile(local_path):
continue
file_key = f"{tenant_id}/{scan_id}/compliance/{filename}"
s3.upload_file(Filename=local_path, Bucket=bucket, Key=file_key)
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{zip_key}"
except (ClientError, NoCredentialsError, ParamValidationError, ValueError) as e:
logger.error(f"S3 upload failed: {str(e)}")
def _generate_output_directory(
output_directory, prowler_provider: object, tenant_id: str, scan_id: str
) -> tuple[str, str]:
"""
Generate a file system path for the output directory of a prowler scan.
This function constructs the output directory path by combining a base
temporary output directory, the tenant ID, the scan ID, and details about
the prowler provider along with a timestamp. The resulting path is used to
store the output files of a prowler scan.
Note:
This function depends on one external variable:
- `output_file_timestamp`: A timestamp (as a string) used to uniquely identify the output.
Args:
output_directory (str): The base output directory.
prowler_provider (object): An identifier or descriptor for the prowler provider.
Typically, this is a string indicating the provider (e.g., "aws").
tenant_id (str): The unique identifier for the tenant.
scan_id (str): The unique identifier for the scan.
Returns:
tuple[str, str]: The scan output directory path and the compliance output directory path.
Example:
>>> _generate_output_directory("/tmp", "aws", "tenant-1234", "scan-5678")
('/tmp/tenant-1234/scan-5678/prowler-output-aws-2023-02-15T12:34:56',
'/tmp/tenant-1234/scan-5678/compliance/prowler-output-aws-2023-02-15T12:34:56')
"""
"""
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
return path, compliance_path
+68 -28
@@ -1,3 +1,4 @@
import json
import time
from copy import deepcopy
from datetime import datetime, timezone
@@ -6,6 +7,7 @@ from celery.utils.log import get_task_logger
from config.settings.celery import CELERY_DEADLOCK_ATTEMPTS
from django.db import IntegrityError, OperationalError
from django.db.models import Case, Count, IntegerField, Sum, When
from tasks.utils import CustomEncoder
from api.compliance import (
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE,
@@ -152,6 +154,9 @@ def perform_prowler_scan(
for progress, findings in prowler_scan.scan():
for finding in findings:
if finding is None:
logger.error(f"None finding detected on scan {scan_id}.")
continue
for attempt in range(CELERY_DEADLOCK_ATTEMPTS):
try:
with rls_transaction(tenant_id):
@@ -176,7 +181,10 @@ def perform_prowler_scan(
# Update resource fields if necessary
updated_fields = []
if resource_instance.region != finding.region:
if (
finding.region
and resource_instance.region != finding.region
):
resource_instance.region = finding.region
updated_fields.append("region")
if resource_instance.service != finding.service_name:
@@ -185,6 +193,17 @@ def perform_prowler_scan(
if resource_instance.type != finding.resource_type:
resource_instance.type = finding.resource_type
updated_fields.append("type")
if resource_instance.metadata != finding.resource_metadata:
resource_instance.metadata = json.dumps(
finding.resource_metadata, cls=CustomEncoder
)
updated_fields.append("metadata")
if resource_instance.details != finding.resource_details:
resource_instance.details = finding.resource_details
updated_fields.append("details")
if resource_instance.partition != finding.partition:
resource_instance.partition = finding.partition
updated_fields.append("partition")
if updated_fields:
with rls_transaction(tenant_id):
resource_instance.save(update_fields=updated_fields)
@@ -219,24 +238,33 @@ def perform_prowler_scan(
# Process finding
with rls_transaction(tenant_id):
finding_uid = finding.uid
last_first_seen_at = None
if finding_uid not in last_status_cache:
most_recent_finding = (
Finding.objects.filter(uid=finding_uid)
.order_by("-id")
.values("status")
Finding.all_objects.filter(
tenant_id=tenant_id, uid=finding_uid
)
.order_by("-inserted_at")
.values("status", "first_seen_at")
.first()
)
last_status = (
most_recent_finding["status"]
if most_recent_finding
else None
)
last_status_cache[finding_uid] = last_status
last_status = None
if most_recent_finding:
last_status = most_recent_finding["status"]
last_first_seen_at = most_recent_finding["first_seen_at"]
last_status_cache[finding_uid] = last_status, last_first_seen_at
else:
last_status = last_status_cache[finding_uid]
last_status, last_first_seen_at = last_status_cache[finding_uid]
status = FindingStatus[finding.status]
delta = _create_finding_delta(last_status, status)
# Findings that predate this change: the first time a finding UID appears with
# delta != "new", first_seen_at is set to the current date, and subsequent
# findings with the same UID inherit the date from the previous finding.
# For new findings (delta == "new"), first_seen_at is set to the current date
# on first sight, and the findings that follow reuse that date.
if not last_first_seen_at:
last_first_seen_at = datetime.now(tz=timezone.utc)
# Create the finding
finding_instance = Finding.objects.create(
@@ -251,6 +279,9 @@ def perform_prowler_scan(
raw_result=finding.raw,
check_id=finding.check_id,
scan=scan_instance,
first_seen_at=last_first_seen_at,
muted=finding.muted,
compliance=finding.compliance,
)
finding_instance.add_resources([resource_instance])
@@ -328,9 +359,18 @@ def perform_prowler_scan(
total_requirements=compliance["total_requirements"],
)
)
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(compliance_overview_objects)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=100
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
if exception is not None:
raise exception
@@ -367,7 +407,7 @@ def aggregate_findings(tenant_id: str, scan_id: str):
- muted_changed: Muted findings with a delta of 'changed'.
"""
with rls_transaction(tenant_id):
findings = Finding.objects.filter(scan_id=scan_id)
findings = Finding.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
aggregation = findings.values(
"check_id",
@@ -377,21 +417,21 @@ def aggregate_findings(tenant_id: str, scan_id: str):
).annotate(
fail=Sum(
Case(
When(status="FAIL", then=1),
When(status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
_pass=Sum(
Case(
When(status="PASS", then=1),
When(status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
muted=Sum(
muted_count=Sum(
Case(
When(status="MUTED", then=1),
When(muted=True, then=1),
default=0,
output_field=IntegerField(),
)
@@ -399,63 +439,63 @@ def aggregate_findings(tenant_id: str, scan_id: str):
total=Count("id"),
new=Sum(
Case(
When(delta="new", then=1),
When(delta="new", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
changed=Sum(
Case(
When(delta="changed", then=1),
When(delta="changed", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
unchanged=Sum(
Case(
When(delta__isnull=True, then=1),
When(delta__isnull=True, muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
fail_new=Sum(
Case(
When(delta="new", status="FAIL", then=1),
When(delta="new", status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
fail_changed=Sum(
Case(
When(delta="changed", status="FAIL", then=1),
When(delta="changed", status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
pass_new=Sum(
Case(
When(delta="new", status="PASS", then=1),
When(delta="new", status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
pass_changed=Sum(
Case(
When(delta="changed", status="PASS", then=1),
When(delta="changed", status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
muted_new=Sum(
Case(
When(delta="new", status="MUTED", then=1),
When(delta="new", muted=True, then=1),
default=0,
output_field=IntegerField(),
)
),
muted_changed=Sum(
Case(
When(delta="changed", status="MUTED", then=1),
When(delta="changed", muted=True, then=1),
default=0,
output_field=IntegerField(),
)
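# Illustrative equivalent of the Case/When counters above using Django's
# filtered aggregates (shown only for comparison, not the repo's code):
from django.db.models import Count, Q

fail_unmuted = Count("id", filter=Q(status="FAIL", muted=False))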
@@ -473,7 +513,7 @@ def aggregate_findings(tenant_id: str, scan_id: str):
region=agg["resources__region"],
fail=agg["fail"],
_pass=agg["_pass"],
muted=agg["muted"],
muted=agg["muted_count"],
total=agg["total"],
new=agg["new"],
changed=agg["changed"],
+247 -28
@@ -1,15 +1,35 @@
from datetime import datetime, timedelta, timezone
from pathlib import Path
from shutil import rmtree
from celery import shared_task
from celery import chain, shared_task
from celery.utils.log import get_task_logger
from config.celery import RLSTask
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
from django_celery_beat.models import PeriodicTask
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.export import (
COMPLIANCE_CLASS_MAP,
OUTPUT_FORMATS_MAPPING,
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
)
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
from tasks.utils import batched, get_next_execution_datetime
from api.compliance import get_compliance_frameworks
from api.db_utils import rls_transaction
from api.decorators import set_tenant
from api.models import Provider, Scan
from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
from api.utils import initialize_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
@shared_task(base=RLSTask, name="provider-connection-check")
@@ -29,9 +49,10 @@ def check_provider_connection_task(provider_id: str):
return check_provider_connection(provider_id=provider_id)
@shared_task(base=RLSTask, name="provider-deletion")
@set_tenant
def delete_provider_task(provider_id: str):
@shared_task(
base=RLSTask, name="provider-deletion", queue="deletion", autoretry_for=(Exception,)
)
def delete_provider_task(provider_id: str, tenant_id: str):
"""
Task to delete a specific Provider instance.
@@ -39,6 +60,7 @@ def delete_provider_task(provider_id: str):
Args:
provider_id (str): The primary key of the `Provider` instance to be deleted.
tenant_id (str): Tenant ID the provider belongs to.
Returns:
tuple: A tuple containing:
@@ -46,7 +68,7 @@ def delete_provider_task(provider_id: str):
- A dictionary with the count of deleted instances per model,
including related models if cascading deletes were triggered.
"""
return delete_provider(pk=provider_id)
return delete_provider(tenant_id=tenant_id, pk=provider_id)
@shared_task(base=RLSTask, name="scan-perform", queue="scans")
@@ -69,13 +91,22 @@ def perform_scan_task(
Returns:
dict: The result of the scan execution, typically including the status and results of the performed checks.
"""
return perform_prowler_scan(
result = perform_prowler_scan(
tenant_id=tenant_id,
scan_id=scan_id,
provider_id=provider_id,
checks_to_execute=checks_to_execute,
)
chain(
perform_scan_summary_task.si(tenant_id, scan_id),
generate_outputs.si(
scan_id=scan_id, provider_id=provider_id, tenant_id=tenant_id
),
).apply_async()
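# .si() builds immutable signatures: the summary task's return value is not fed
# into generate_outputs, and each task in the chain receives only the kwargs
# bound here.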
return result
@shared_task(base=RLSTask, bind=True, name="scan-perform-scheduled", queue="scans")
def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
@@ -100,34 +131,90 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
task_id = self.request.id
with rls_transaction(tenant_id):
provider_instance = Provider.objects.get(pk=provider_id)
periodic_task_instance = PeriodicTask.objects.get(
name=f"scan-perform-scheduled-{provider_id}"
)
next_scan_date = datetime.combine(
datetime.now(timezone.utc), periodic_task_instance.start_time.time()
) + timedelta(hours=24)
scan_instance = Scan.objects.create(
executed_scan = Scan.objects.filter(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider=provider_instance,
provider_id=provider_id,
task__task_runner_task__task_id=task_id,
).order_by("completed_at")
if (
Scan.objects.filter(
tenant_id=tenant_id,
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.EXECUTING,
scheduler_task_id=periodic_task_instance.id,
scheduled_at__date=datetime.now(timezone.utc).date(),
).exists()
or executed_scan.exists()
):
# Duplicated task execution due to a visibility timeout, or the scan is already running
logger.warning(f"Duplicated scheduled scan for provider {provider_id}.")
try:
affected_scan = executed_scan.first()
if not affected_scan:
raise ValueError(
"Error retrieving affected scan details after detecting duplicated scheduled "
"scan."
)
# Return the affected scan details to avoid losing data
serializer = ScanTaskSerializer(instance=affected_scan)
except Exception as duplicated_scan_exception:
logger.error(
f"Duplicated scheduled scan for provider {provider_id}. Error retrieving affected scan details: "
f"{str(duplicated_scan_exception)}"
)
raise duplicated_scan_exception
return serializer.data
next_scan_datetime = get_next_execution_datetime(task_id, provider_id)
scan_instance, _ = Scan.objects.get_or_create(
tenant_id=tenant_id,
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
next_scan_at=next_scan_date,
task_id=task_id,
state__in=(StateChoices.SCHEDULED, StateChoices.AVAILABLE),
scheduler_task_id=periodic_task_instance.id,
defaults={
"state": StateChoices.SCHEDULED,
"name": "Daily scheduled scan",
"scheduled_at": next_scan_datetime - timedelta(days=1),
},
)
result = perform_prowler_scan(
tenant_id=tenant_id,
scan_id=str(scan_instance.id),
provider_id=provider_id,
)
perform_scan_summary_task.apply_async(
kwargs={
"tenant_id": tenant_id,
"scan_id": str(scan_instance.id),
}
)
scan_instance.task_id = task_id
scan_instance.save()
try:
result = perform_prowler_scan(
tenant_id=tenant_id,
scan_id=str(scan_instance.id),
provider_id=provider_id,
)
except Exception as e:
raise e
finally:
with rls_transaction(tenant_id):
Scan.objects.get_or_create(
tenant_id=tenant_id,
name="Daily scheduled scan",
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.SCHEDULED,
scheduled_at=next_scan_datetime,
scheduler_task_id=periodic_task_instance.id,
)
chain(
perform_scan_summary_task.si(tenant_id, scan_instance.id),
generate_outputs.si(
scan_id=str(scan_instance.id), provider_id=provider_id, tenant_id=tenant_id
),
).apply_async()
return result
@@ -136,6 +223,138 @@ def perform_scan_summary_task(tenant_id: str, scan_id: str):
return aggregate_findings(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(name="tenant-deletion")
@shared_task(name="tenant-deletion", queue="deletion", autoretry_for=(Exception,))
def delete_tenant_task(tenant_id: str):
return delete_tenant(pk=tenant_id)
@shared_task(
base=RLSTask,
name="scan-report",
queue="scan-reports",
)
@set_tenant(keep_tenant=True)
def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
"""
Process findings in batches and generate output files in multiple formats.
This function retrieves findings associated with a scan, processes them
in batches of DJANGO_FINDINGS_BATCH_SIZE, and writes each batch to the corresponding output files.
It reuses output writer instances across batches, updates them with each
batch of transformed findings, and uses a flag to indicate when the final
batch is being processed. Finally, the output files are compressed and
uploaded to S3.
Args:
tenant_id (str): The tenant identifier.
scan_id (str): The scan identifier.
provider_id (str): The provider ID used when generating the outputs.
"""
# Check if the scan has findings
if not ScanSummary.objects.filter(scan_id=scan_id).exists():
logger.info(f"No findings found for scan {scan_id}")
return {"upload": False}
provider_obj = Provider.objects.get(id=provider_id)
prowler_provider = initialize_prowler_provider(provider_obj)
provider_uid = provider_obj.uid
provider_type = provider_obj.provider
frameworks_bulk = Compliance.get_bulk(provider_type)
frameworks_avail = get_compliance_frameworks(provider_type)
out_dir, comp_dir = _generate_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY, provider_uid, tenant_id, scan_id
)
def get_writer(writer_map, name, factory, is_last):
"""
Return existing writer_map[name] or create via factory().
In both cases set `.close_file = is_last`.
"""
initialization = False
if name not in writer_map:
writer_map[name] = factory()
initialization = True
w = writer_map[name]
w.close_file = is_last
return w, initialization
output_writers = {}
compliance_writers = {}
scan_summary = FindingOutput._transform_findings_stats(
ScanSummary.objects.filter(scan_id=scan_id)
)
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
if mode == "html":
extra.update(provider=prowler_provider, stats=scan_summary)
writer, initialization = get_writer(
output_writers,
cls,
lambda cls=cls, fos=fos, suffix=suffix: cls(
findings=fos,
file_path=out_dir,
file_extension=suffix,
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos)
writer.batch_write_data_to_file(**extra)
writer._data.clear()
# Compliance CSVs
for name in frameworks_avail:
compliance_obj = frameworks_bulk[name]
klass = GenericCompliance
for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
if condition(name):
klass = cls
break
filename = f"{comp_dir}_{name}.csv"
writer, initialization = get_writer(
compliance_writers,
name,
lambda klass=klass, fos=fos: klass(
findings=fos,
compliance=compliance_obj,
file_path=filename,
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos, compliance_obj, name)
writer.batch_write_data_to_file()
writer._data.clear()
compressed = _compress_output_files(out_dir)
upload_uri = _upload_to_s3(tenant_id, compressed, scan_id)
if upload_uri:
try:
rmtree(Path(compressed).parent, ignore_errors=True)
except Exception as e:
logger.error(f"Error deleting output files: {e}")
final_location, did_upload = upload_uri, True
else:
final_location, did_upload = compressed, False
Scan.all_objects.filter(id=scan_id).update(output_location=final_location)
logger.info(f"Scan outputs at {final_location}")
return {"upload": did_upload}
+4
@@ -6,6 +6,8 @@ from django_celery_beat.models import IntervalSchedule, PeriodicTask
from rest_framework_json_api.serializers import ValidationError
from tasks.beat import schedule_provider_scan
from api.models import Scan
@pytest.mark.django_db
class TestScheduleProviderScan:
@@ -15,9 +17,11 @@ class TestScheduleProviderScan:
with patch(
"tasks.tasks.perform_scheduled_scan_task.apply_async"
) as mock_apply_async:
assert Scan.all_objects.count() == 0
result = schedule_provider_scan(provider_instance)
assert result is not None
assert Scan.all_objects.count() == 1
mock_apply_async.assert_called_once_with(
kwargs={
+5 -3
@@ -9,17 +9,19 @@ from api.models import Provider, Tenant
class TestDeleteProvider:
def test_delete_provider_success(self, providers_fixture):
instance = providers_fixture[0]
result = delete_provider(instance.id)
tenant_id = str(instance.tenant_id)
result = delete_provider(tenant_id, instance.id)
assert result
with pytest.raises(ObjectDoesNotExist):
Provider.objects.get(pk=instance.id)
def test_delete_provider_does_not_exist(self):
def test_delete_provider_does_not_exist(self, tenants_fixture):
tenant_id = str(tenants_fixture[0].id)
non_existent_pk = "babf6796-cfcc-4fd3-9dcf-88d012247645"
with pytest.raises(ObjectDoesNotExist):
delete_provider(non_existent_pk)
delete_provider(tenant_id, non_existent_pk)
@pytest.mark.django_db
+142
@@ -0,0 +1,142 @@
import os
import zipfile
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError
from tasks.jobs.export import (
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
get_s3_client,
)
@pytest.mark.django_db
class TestOutputs:
def test_compress_output_files_creates_zip(self, tmp_path):
output_dir = tmp_path / "output"
output_dir.mkdir()
file_path = output_dir / "result.csv"
file_path.write_text("data")
zip_path = _compress_output_files(str(output_dir))
assert zip_path.endswith(".zip")
assert os.path.exists(zip_path)
with zipfile.ZipFile(zip_path, "r") as zipf:
assert "output/result.csv" in zipf.namelist()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_success(self, mock_settings, mock_boto_client):
mock_settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = "token"
mock_settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = "eu-west-1"
client_mock = MagicMock()
mock_boto_client.return_value = client_mock
client = get_s3_client()
assert client is not None
client_mock.list_buckets.assert_called()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_fallback(self, mock_settings, mock_boto_client):
mock_boto_client.side_effect = [
ClientError({"Error": {"Code": "403"}}, "ListBuckets"),
MagicMock(),
]
client = get_s3_client()
assert client is not None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_success(self, mock_base, mock_get_client, tmp_path):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
zip_path = tmp_path / "outputs.zip"
zip_path.write_bytes(b"dummy")
compliance_dir = tmp_path / "compliance"
compliance_dir.mkdir()
compliance_file = compliance_dir / "report.csv"
compliance_file.write_text("ok")
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant-id", str(zip_path), "scan-id")
expected_uri = "s3://test-bucket/tenant-id/scan-id/outputs.zip"
assert result == expected_uri
assert client_mock.upload_file.call_count == 2
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_missing_bucket(self, mock_base, mock_get_client):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = ""
result = _upload_to_s3("tenant", "/tmp/fake.zip", "scan")
assert result is None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_skips_non_files(self, mock_base, mock_get_client, tmp_path):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
zip_path = tmp_path / "results.zip"
zip_path.write_bytes(b"zip")
compliance_dir = tmp_path / "compliance"
compliance_dir.mkdir()
(compliance_dir / "subdir").mkdir()
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant", str(zip_path), "scan")
expected_uri = "s3://test-bucket/tenant/scan/results.zip"
assert result == expected_uri
client_mock.upload_file.assert_called_once()
@patch(
"tasks.jobs.export.get_s3_client",
side_effect=ClientError({"Error": {}}, "Upload"),
)
@patch("tasks.jobs.export.base")
@patch("tasks.jobs.export.logger.error")
def test_upload_to_s3_failure_logs_error(
self, mock_logger, mock_base, mock_get_client, tmp_path
):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "bucket"
zip_path = tmp_path / "zipfile.zip"
zip_path.write_bytes(b"zip")
compliance_dir = tmp_path / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("csv")
_upload_to_s3("tenant", str(zip_path), "scan")
mock_logger.assert_called()
def test_generate_output_directory_creates_paths(self, tmp_path):
from prowler.config.config import output_file_timestamp
base_dir = str(tmp_path)
tenant_id = "t1"
scan_id = "s1"
provider = "aws"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")
+15
@@ -1,3 +1,4 @@
import json
import uuid
from unittest.mock import MagicMock, patch
@@ -7,6 +8,7 @@ from tasks.jobs.scan import (
_store_resources,
perform_prowler_scan,
)
from tasks.utils import CustomEncoder
from api.models import (
Finding,
@@ -107,7 +109,13 @@ class TestPerformScan:
finding.service_name = "service_name"
finding.resource_type = "resource_type"
finding.resource_tags = {"tag1": "value1", "tag2": "value2"}
finding.muted = False
finding.raw = {}
finding.resource_metadata = {"test": "metadata"}
finding.resource_details = {"details": "test"}
finding.partition = "partition"
finding.muted = True
finding.compliance = {"compliance1": "PASS"}
# Mock the ProwlerScan instance
mock_prowler_scan_instance = MagicMock()
@@ -145,6 +153,8 @@ class TestPerformScan:
assert scan_finding.severity == finding.severity
assert scan_finding.check_id == finding.check_id
assert scan_finding.raw_result == finding.raw
assert scan_finding.muted
assert scan_finding.compliance == finding.compliance
assert scan_resource.tenant == tenant
assert scan_resource.uid == finding.resource_uid
@@ -152,6 +162,11 @@ class TestPerformScan:
assert scan_resource.service == finding.service_name
assert scan_resource.type == finding.resource_type
assert scan_resource.name == finding.resource_name
assert scan_resource.metadata == json.dumps(
finding.resource_metadata, cls=CustomEncoder
)
assert scan_resource.details == f"{finding.resource_details}"
assert scan_resource.partition == finding.partition
# Assert that the resource tags have been created and associated
tags = scan_resource.tags.all()
+415
@@ -0,0 +1,415 @@
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from tasks.tasks import generate_outputs
@pytest.mark.django_db
class TestGenerateOutputs:
def setup_method(self):
self.scan_id = str(uuid.uuid4())
self.provider_id = str(uuid.uuid4())
self.tenant_id = str(uuid.uuid4())
def test_no_findings_returns_early(self):
with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
mock_filter.return_value.exists.return_value = False
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_filter.assert_called_once_with(scan_id=self.scan_id)
@patch("tasks.tasks.rmtree")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Finding.all_objects.filter")
def test_generate_outputs_happy_path(
self,
mock_finding_filter,
mock_scan_summary_filter,
mock_provider_get,
mock_initialize_provider,
mock_compliance_get_bulk,
mock_get_available_frameworks,
mock_compress,
mock_upload,
mock_rmtree,
):
mock_scan_summary_filter.return_value.exists.return_value = True
mock_provider = MagicMock()
mock_provider.uid = "provider-uid"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
prowler_provider = MagicMock()
mock_initialize_provider.return_value = prowler_provider
mock_compliance_get_bulk.return_value = {"cis": MagicMock()}
mock_get_available_frameworks.return_value = ["cis"]
dummy_finding = MagicMock(uid="f1")
mock_finding_filter.return_value.order_by.return_value.iterator.return_value = [
[dummy_finding],
True,
]
mock_transformed_stats = {"some": "stats"}
with (
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value=mock_transformed_stats,
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value={"transformed": "f1"},
),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="JSONWriter"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock(name="CSVCompliance"))]},
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("out-dir", "comp-dir"),
),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_compress.return_value = "/tmp/zipped.zip"
mock_upload.return_value = "s3://bucket/zipped.zip"
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
mock_scan_update.return_value.update.assert_called_once_with(
output_location="s3://bucket/zipped.zip"
)
mock_rmtree.assert_called_once_with(
Path("/tmp/zipped.zip").parent, ignore_errors=True
)
def test_generate_outputs_fails_upload(self):
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks"),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch("tasks.tasks.FindingOutput.transform_api_finding"),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="Writer"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value=None),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
result = generate_outputs(
scan_id="scan",
provider_id="provider",
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_scan_update.return_value.update.assert_called_once()
def test_generate_outputs_triggers_html_extra_update(self):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/f.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
html_writer_mock = MagicMock()
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"html": {
"class": lambda *args, **kwargs: html_writer_mock,
"suffix": ".html",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
html_writer_mock.batch_write_data_to_file.assert_called_once()
def test_transform_called_only_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
tf1 = MagicMock()
tf1.compliance = {}
tf2 = MagicMock()
tf2.compliance = {}
writer_instances = []
class TrackingWriter:
def __init__(self, findings, file_path, file_extension, from_cli):
self.transform_called = 0
self.batch_write_data_to_file = MagicMock()
self._data = []
self.close_file = False
writer_instances.append(self)
def transform(self, fos):
self.transform_called += 1
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks", return_value=[]),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=[tf1, tf2],
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch(
"tasks.tasks.batched",
return_value=[
([raw1], False),
([raw2], True),
],
),
):
mock_summary.return_value.exists.return_value = True
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": TrackingWriter,
"suffix": ".json",
"kwargs": {},
}
},
):
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_called == 1
def test_compliance_transform_called_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
compliance_obj = MagicMock()
writer_instances = []
class TrackingComplianceWriter:
def __init__(self, *args, **kwargs):
self.transform_calls = []
self._data = []
writer_instances.append(self)
def transform(self, fos, comp_obj, name):
self.transform_calls.append((fos, comp_obj, name))
def batch_write_data_to_file(self):
pass
two_batches = [
([raw1], False),
([raw2], True),
]
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch(
"tasks.tasks.Provider.objects.get",
return_value=MagicMock(uid="UID", provider="aws"),
),
patch("tasks.tasks.initialize_prowler_provider"),
patch(
"tasks.tasks.Compliance.get_bulk", return_value={"cis": compliance_obj}
),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=lambda f, prov: f,
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch(
"tasks.tasks.Scan.all_objects.filter",
return_value=MagicMock(update=lambda **kw: None),
),
patch("tasks.tasks.batched", return_value=two_batches),
patch("tasks.tasks.OUTPUT_FORMATS_MAPPING", {}),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda name: True, TrackingComplianceWriter)]},
),
):
mock_summary.return_value.exists.return_value = True
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_calls == [([raw2], compliance_obj, "cis")]
assert result == {"upload": True}
def test_generate_outputs_logs_rmtree_exception(self, caplog):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/file.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch("tasks.tasks.rmtree", side_effect=Exception("Test deletion error")),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": lambda *args, **kwargs: MagicMock(),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
with caplog.at_level("ERROR"):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert "Error deleting output files" in caplog.text
@@ -0,0 +1,102 @@
from datetime import datetime, timedelta, timezone
from unittest.mock import patch
import pytest
from django_celery_beat.models import IntervalSchedule, PeriodicTask
from django_celery_results.models import TaskResult
from tasks.utils import batched, get_next_execution_datetime
@pytest.mark.django_db
class TestGetNextExecutionDatetime:
@pytest.fixture
def setup_periodic_task(self, db):
# Create a periodic task with an hourly interval
interval = IntervalSchedule.objects.create(
every=1, period=IntervalSchedule.HOURS
)
periodic_task = PeriodicTask.objects.create(
name="scan-perform-scheduled-123",
task="scan-perform-scheduled",
interval=interval,
)
return periodic_task
@pytest.fixture
def setup_task_result(self, db):
# Create a task result record
task_result = TaskResult.objects.create(
task_id="abc123",
task_name="scan-perform-scheduled",
status="SUCCESS",
date_created=datetime.now(timezone.utc) - timedelta(hours=1),
result="Success",
)
return task_result
def test_get_next_execution_datetime_success(
self, setup_task_result, setup_periodic_task
):
task_result = setup_task_result
periodic_task = setup_periodic_task
# Mock periodic_task_name on TaskResult
with patch.object(
TaskResult, "periodic_task_name", return_value=periodic_task.name
):
next_execution = get_next_execution_datetime(
task_id=task_result.task_id, provider_id="123"
)
expected_time = task_result.date_created + timedelta(hours=1)
assert next_execution == expected_time
def test_get_next_execution_datetime_fallback_to_provider_id(
self, setup_task_result, setup_periodic_task
):
task_result = setup_task_result
# Simulate the case where `periodic_task_name` is missing
with patch.object(TaskResult, "periodic_task_name", return_value=None):
next_execution = get_next_execution_datetime(
task_id=task_result.task_id, provider_id="123"
)
expected_time = task_result.date_created + timedelta(hours=1)
assert next_execution == expected_time
def test_get_next_execution_datetime_periodic_task_does_not_exist(
self, setup_task_result
):
task_result = setup_task_result
with pytest.raises(PeriodicTask.DoesNotExist):
get_next_execution_datetime(
task_id=task_result.task_id, provider_id="nonexistent"
)
class TestBatchedFunction:
def test_empty_iterable(self):
result = list(batched([], 3))
assert result == [([], True)]
def test_exact_batches(self):
result = list(batched([1, 2, 3, 4], 2))
expected = [([1, 2], False), ([3, 4], False), ([], True)]
assert result == expected
def test_inexact_batches(self):
result = list(batched([1, 2, 3, 4, 5], 2))
expected = [([1, 2], False), ([3, 4], False), ([5], True)]
assert result == expected
def test_batch_size_one(self):
result = list(batched([1, 2, 3], 1))
expected = [([1], False), ([2], False), ([3], False), ([], True)]
assert result == expected
def test_batch_size_greater_than_length(self):
result = list(batched([1, 2, 3], 5))
expected = [([1, 2, 3], True)]
assert result == expected
@@ -0,0 +1,73 @@
import json
from datetime import datetime, timedelta, timezone
from enum import Enum
from django_celery_beat.models import PeriodicTask
from django_celery_results.models import TaskResult
class CustomEncoder(json.JSONEncoder):
def default(self, o):
# Enum serialization
if isinstance(o, Enum):
return o.value
# Datetime and timedelta serialization
if isinstance(o, datetime):
return o.isoformat(timespec="seconds")
if isinstance(o, timedelta):
return o.total_seconds()
# Custom object serialization
try:
return super().default(o)
except TypeError:
try:
return o.__dict__
except AttributeError:
return str(o)
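# Illustration (not part of the original module): Enum members serialize by
# value and datetimes as ISO-8601 with seconds precision:
#
#   >>> class Severity(Enum):
#   ...     HIGH = "high"
#   >>> json.dumps(
#   ...     {"sev": Severity.HIGH, "at": datetime(2025, 5, 12, 9, 0, 0)},
#   ...     cls=CustomEncoder,
#   ... )
#   '{"sev": "high", "at": "2025-05-12T09:00:00"}'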
def get_next_execution_datetime(task_id: str, provider_id: str) -> datetime:
task_instance = TaskResult.objects.get(task_id=task_id)
try:
periodic_task_instance = PeriodicTask.objects.get(
name=task_instance.periodic_task_name
)
except PeriodicTask.DoesNotExist:
periodic_task_instance = PeriodicTask.objects.get(
name=f"scan-perform-scheduled-{provider_id}"
)
interval = periodic_task_instance.interval
current_scheduled_time = datetime.combine(
datetime.now(timezone.utc).date(),
task_instance.date_created.time(),
tzinfo=timezone.utc,
)
return current_scheduled_time + timedelta(**{interval.period: interval.every})
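# Note (assumed django-celery-beat semantics): IntervalSchedule.period holds the
# plural unit name ("hours", "minutes", ...), so an hourly schedule expands to:
#
#   >>> timedelta(**{"hours": 1})
#   datetime.timedelta(seconds=3600)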
def batched(iterable, batch_size):
"""
Yield successive batches from an iterable.
Args:
iterable: An iterable source of items.
batch_size (int): The number of items per batch.
Yields:
tuple: A pair (batch, is_last_batch) where:
- batch (list): A list of items (with length equal to batch_size,
except possibly for the last batch).
- is_last_batch (bool): True if this is the final batch, False otherwise.
"""
batch = []
for item in iterable:
batch.append(item)
if len(batch) == batch_size:
yield batch, False
batch = []
yield batch, True
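# Contract in short (matches the tests above):
#
#   >>> list(batched([1, 2, 3], 2))
#   [([1, 2], False), ([3], True)]
#   >>> list(batched([1, 2], 2))
#   [([1, 2], False), ([], True)]
#
# When the length is an exact multiple of batch_size, the final yield is an
# empty batch flagged True, so consumers always get exactly one "last" signal.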
@@ -0,0 +1,156 @@
#!/usr/bin/env python3
import argparse
import re
import subprocess
import sys
from pathlib import Path
import matplotlib.pyplot as plt
import pandas as pd
plt.style.use("ggplot")
def run_locust(
locust_file: str,
host: str,
users: int,
hatch_rate: int,
run_time: str,
csv_prefix: Path,
) -> Path:
artifacts_dir = Path("artifacts")
artifacts_dir.mkdir(parents=True, exist_ok=True)
cmd = [
"locust",
"-f",
f"scenarios/{locust_file}",
"--headless",
"-u",
str(users),
"-r",
str(hatch_rate),
"-t",
run_time,
"--host",
host,
"--csv",
str(artifacts_dir / csv_prefix.name),
]
print(f"Running Locust: {' '.join(cmd)}")
process = subprocess.run(cmd)
if process.returncode:
sys.exit("Locust execution failed")
stats_file = artifacts_dir / f"{csv_prefix.stem}_stats.csv"
if not stats_file.exists():
sys.exit(f"Stats CSV not found: {stats_file}")
return stats_file
def load_percentiles(csv_path: Path) -> pd.DataFrame:
df = pd.read_csv(csv_path)
mapping = {"50%": "p50", "75%": "p75", "90%": "p90", "95%": "p95"}
available = [col for col in mapping if col in df.columns]
renamed = {col: mapping[col] for col in available}
df = df.rename(columns=renamed).set_index("Name")[renamed.values()]
return df.drop(index=["Aggregated"], errors="ignore")
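# Locust's <prefix>_stats.csv is expected to carry a "Name" column plus the
# percentile columns mapped above (illustrative values):
#
#   Name,50%,75%,90%,95%
#   /findings,120,180,250,400
#   Aggregated,140,200,280,450
#
# The synthetic "Aggregated" row is dropped so charts compare real endpoints only.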
def sanitize_label(label: str) -> str:
text = re.sub(r"[^\w]+", "_", label.strip().lower())
return text.strip("_")
def plot_multi_comparison(metrics: dict[str, pd.DataFrame]) -> None:
common = sorted(set.intersection(*(set(df.index) for df in metrics.values())))
percentiles = list(next(iter(metrics.values())).columns)
groups = len(metrics)
width = 0.8 / groups
for endpoint in common:
fig, ax = plt.subplots(figsize=(10, 5), dpi=100)
for idx, (label, df) in enumerate(metrics.items()):
series = df.loc[endpoint]
positions = [
i + (idx - groups / 2) * width + width / 2
for i in range(len(percentiles))
]
bars = ax.bar(positions, series.values, width, label=label)
for bar in bars:
height = bar.get_height()
ax.annotate(
f"{int(height)}",
xy=(bar.get_x() + bar.get_width() / 2, height),
xytext=(0, 3),
textcoords="offset points",
ha="center",
va="bottom",
fontsize=8,
)
ax.set_xticks(range(len(percentiles)))
ax.set_xticklabels(percentiles)
ax.set_ylabel("Latency (ms)")
ax.set_title(endpoint, fontsize=12)
ax.grid(True, axis="y", linestyle="--", alpha=0.7)
fig.tight_layout()
fig.subplots_adjust(right=0.75)
ax.legend(loc="center left", bbox_to_anchor=(1, 0.5), framealpha=0.9)
output = Path("artifacts") / f"comparison_{sanitize_label(endpoint)}.png"
plt.savefig(output)
plt.close(fig)
print(f"Saved chart: {output}")
def main() -> None:
parser = argparse.ArgumentParser(description="Run Locust and compare metrics")
parser.add_argument("--locustfile", required=True, help="Locust file in scenarios/")
parser.add_argument("--host", required=True, help="Target host URL")
parser.add_argument(
"--users", type=int, default=10, help="Number of simulated users"
)
parser.add_argument("--rate", type=int, default=1, help="Hatch rate per second")
parser.add_argument("--time", default="1m", help="Test duration (e.g. 30s, 1m)")
parser.add_argument(
"--metrics-dir", default="baselines", help="Directory with CSV baselines"
)
parser.add_argument("--version", default="current", help="Test version")
args = parser.parse_args()
metrics_dir = Path(args.metrics_dir)
if not metrics_dir.is_dir():
sys.exit(f"Metrics directory not found: {metrics_dir}")
metrics_data: dict[str, pd.DataFrame] = {}
for csv_file in sorted(metrics_dir.glob("*.csv")):
metrics_data[csv_file.stem] = load_percentiles(csv_file)
current_prefix = Path(args.version)
current_csv = run_locust(
locust_file=args.locustfile,
host=args.host,
users=args.users,
hatch_rate=args.rate,
run_time=args.time,
csv_prefix=current_prefix,
)
metrics_data[args.version] = load_percentiles(current_csv)
for endpoint in sorted(
set.intersection(*(set(df.index) for df in metrics_data.values()))
):
parts = [endpoint]
for label, df in metrics_data.items():
s = df.loc[endpoint]
parts.append(f"{label}: p50 {s.p50}, p75 {s.p75}, p90 {s.p90}, p95 {s.p95}")
print(" | ".join(parts))
plot_multi_comparison(metrics_data)
if __name__ == "__main__":
main()
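# Example invocation (hypothetical script/file names, assuming this script sits
# next to scenarios/ and baselines/):
#
#   python run_benchmark.py --locustfile findings.py --host http://localhost:8080 \
#       --users 10 --rate 1 --time 1m --metrics-dir baselines --version v5.6.0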
@@ -0,0 +1,2 @@
locust==2.34.1
matplotlib==3.10.1
@@ -0,0 +1,202 @@
from locust import events, task
from utils.config import (
FINDINGS_UI_SORT_VALUES,
L_PROVIDER_NAME,
M_PROVIDER_NAME,
S_PROVIDER_NAME,
TARGET_INSERTED_AT,
)
from utils.helpers import (
APIUserBase,
get_api_token,
get_auth_headers,
get_next_resource_filter,
get_resource_filters_pairs,
get_scan_id_from_provider_name,
get_sort_value,
)
GLOBAL = {
"token": None,
"scan_ids": {},
"resource_filters": None,
"large_resource_filters": None,
}
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
GLOBAL["scan_ids"]["small"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], S_PROVIDER_NAME
)
GLOBAL["scan_ids"]["medium"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], M_PROVIDER_NAME
)
GLOBAL["scan_ids"]["large"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], L_PROVIDER_NAME
)
GLOBAL["resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"]
)
GLOBAL["large_resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"], GLOBAL["scan_ids"]["large"]
)
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
self.s_scan_id = GLOBAL["scan_ids"]["small"]
self.m_scan_id = GLOBAL["scan_ids"]["medium"]
self.l_scan_id = GLOBAL["scan_ids"]["large"]
self.available_resource_filters = GLOBAL["resource_filters"]
self.available_resource_filters_large_scan = GLOBAL["large_resource_filters"]
@task
def findings_default(self):
name = "/findings"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_default_include(self):
name = "/findings?include"
page = self._next_page(name)
endpoint = (
f"/findings?page[number]={page}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata(self):
endpoint = f"/findings/metadata?" f"filter[inserted_at]={TARGET_INSERTED_AT}"
self.client.get(
endpoint, headers=get_auth_headers(self.token), name="/findings/metadata"
)
@task
def findings_scan_small(self):
name = "/findings?filter[scan_id] - 50k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.s_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_small(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.s_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 50k",
)
@task(2)
def findings_scan_medium(self):
name = "/findings?filter[scan_id] - 250k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.m_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_medium(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.m_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 250k",
)
@task
def findings_scan_large(self):
name = "/findings?filter[scan_id] - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_scan_large_include(self):
name = "/findings?filter[scan_id]&include - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_large(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.l_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 500k",
)
@task(2)
def findings_resource_filter(self):
name = "/findings?filter[resource_filter]&include"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter(self):
name = "/findings/metadata?filter[resource_filter]"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_resource_filter_large_scan_include(self):
name = "/findings?filter[resource_filter][scan]&include - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
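The helpers imported from utils.helpers are not part of this diff. A hedged sketch of what the tasks above appear to assume from the two simplest ones (hypothetical bodies, illustration only):

import random

def get_auth_headers(token: str) -> dict:
    # Every request above attaches the shared API token as a bearer header.
    return {"Authorization": f"Bearer {token}"}

def get_sort_value(sort_values: list[str]) -> str:
    # Interpolated after "&" in the query strings above, so it presumably
    # returns a ready-made "sort=<field>" pair, rotating through the UI's
    # sort orders.
    return f"sort={random.choice(sort_values)}"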