Compare commits

554 Commits

Author SHA1 Message Date
Adrián Jesús Peña Rodríguez
221310ba94 fix: update unittest 2025-04-22 17:27:53 +02:00
Adrián Jesús Peña Rodríguez
33135f6c9a chore: update lock files 2025-04-22 17:05:03 +02:00
Adrián Jesús Peña Rodríguez
4aaacc9e16 feat: add bash to debian 2025-04-22 16:12:22 +02:00
Adrián Jesús Peña Rodríguez
165d09768f feat: change api base image to debian 2025-04-22 16:12:22 +02:00
Adrián Jesús Peña Rodríguez
6b8af51c23 fix: update base image 2025-04-22 16:12:08 +02:00
Adrián Jesús Peña Rodríguez
e40591a25f chore: change name 2025-04-22 16:12:08 +02:00
Adrián Jesús Peña Rodríguez
d6ac87341b chore: update changelog and spec 2025-04-22 16:12:08 +02:00
Adrián Jesús Peña Rodríguez
d431af6186 feat: change to domain_id 2025-04-22 16:12:08 +02:00
Adrián Jesús Peña Rodríguez
7b3b728207 feat: add m365 to API 2025-04-22 16:11:47 +02:00
César Arroba
cfab4dcef5 chore: pass labels on PR merge trigger (#7558) 2025-04-22 16:11:47 +02:00
César Arroba
cd48037670 chore: revert pass labels (#7556) 2025-04-22 16:11:47 +02:00
César Arroba
ab7e352029 chore: pass labels as json is required (#7555) 2025-04-22 16:11:47 +02:00
César Arroba
79ce899812 chore: fix merged PR action, incorrect order on payload (#7554) 2025-04-22 16:11:47 +02:00
César Arroba
29d4159b8c chore: pass labels (#7553) 2025-04-22 16:11:47 +02:00
César Arroba
aaf7188365 chore: fix json body (#7552) 2025-04-22 16:11:47 +02:00
César Arroba
6cbba3c5b1 chore: fix trigger (#7551) 2025-04-22 16:11:47 +02:00
César Arroba
9510b4c3ef chore(gha): trigger cloud pull-request when a PR is merged (#7212) 2025-04-22 16:11:47 +02:00
Felix Dreissig
c7754c4331 fix(aws): remove SHA-1 from ACM insecure key algorithms (#7547) 2025-04-22 16:11:47 +02:00
Daniel Barranquero
7800320189 feat(defender): add new check defender_antiphishing_policy_configured (#7453) 2025-04-22 16:11:47 +02:00
Daniel Barranquero
972c88e421 feat(defender): add new check defender_malware_policy_notifications_internal_users_malware_enabled (#7435)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 16:11:47 +02:00
Daniel Barranquero
3ee420769a feat(defender): add service and new check defender_malware_policy_common_attachments_filter_enabled (#7425)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 16:11:47 +02:00
Daniel Barranquero
f8d9873655 feat(exchange): add new check exchange_mailbox_audit_bypass_disabled (#7418)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 16:11:47 +02:00
Daniel Barranquero
3f5508847a feat(exchange): add service and new check exchange_organization_mailbox_auditing_enabled (#7408)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-22 16:11:47 +02:00
Hugo Pereira Brito
ed54b39211 feat(teams): add new check teams_email_sending_to_channel_disabled (#7533)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 16:11:47 +02:00
HugoPBrito
a6dd97b6bc feat: pwsh setup and print credentials verification 2025-04-21 17:54:12 +02:00
HugoPBrito
b6e1551397 Revert "feat: remove provider_id and adapt logic"
This reverts commit 3d9d118d2c.
2025-04-21 17:49:30 +02:00
HugoPBrito
c8567b5d2b Revert "fix: restore 2 tests"
This reverts commit 361269370a.
2025-04-21 17:49:12 +02:00
HugoPBrito
361269370a fix: restore 2 tests 2025-04-21 15:02:53 +02:00
HugoPBrito
3d9d118d2c feat: remove provider_id and adapt logic 2025-04-21 14:58:13 +02:00
HugoPBrito
b50a91d3e6 feat: enhance json parse logging 2025-04-21 13:23:37 +02:00
HugoPBrito
0843c49475 feat: add tests 2025-04-21 12:52:32 +02:00
HugoPBrito
3fe65b7a03 fix: missing provider_id 2025-04-21 11:40:53 +02:00
HugoPBrito
68369ca54f chore: remove old implementation 2025-04-21 11:31:18 +02:00
HugoPBrito
63f4b4f7d4 feat: ensure pwsh user belongs to tenant 2025-04-16 19:10:57 +02:00
HugoPBrito
754a0748da fix(m365): test_connection 2025-04-16 16:38:15 +02:00
Sergio Garcia
aa3182ebc5 feat(gcp): support CLOUDSDK_AUTH_ACCESS_TOKEN (#7495) 2025-04-16 10:35:04 -04:00
Sergio Garcia
32d27df0ba chore(regions): change interval to weekly (#7539) 2025-04-16 09:35:30 -04:00
Prowler Bot
6439f0a5f3 chore(regions_update): Changes in regions for AWS services (#7538)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-16 09:25:29 -04:00
Sergio Garcia
19476632ff chore(dependabot): change settings (#7536) 2025-04-16 11:26:57 +05:45
Pedro Martín
d4c12e4632 fix(iam): change some logger.info values (#7526)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-15 13:25:37 -04:00
Hugo Pereira Brito
52bd48168f feat: adapt Microsoft365 provider to use PowerShell (#7331)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-15 13:24:09 -04:00
Bogdan A
c0d935e232 docs(gcp): update required permissions for GCP (#7488) 2025-04-15 10:23:45 -04:00
Pepe Fagoaga
24dfd47329 fix(pypi): package name location in pyproject.toml while replicating for prowler-cloud (#7531) 2025-04-15 20:01:27 +05:45
dependabot[bot]
fbae338689 chore(deps): bump python from 3.12.9-alpine3.20 to 3.12.10-alpine3.20 (#7520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:26:04 -04:00
dependabot[bot]
186fd88f8c chore(deps): bump codecov/codecov-action from 5.4.0 to 5.4.2 (#7522)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:25:44 -04:00
dependabot[bot]
14ff34c00a chore(deps): bump actions/setup-node from 4.3.0 to 4.4.0 (#7521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-15 09:25:23 -04:00
Prowler Bot
a66fa394d3 chore(regions_update): Changes in regions for AWS services (#7527)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-15 09:20:20 -04:00
Pepe Fagoaga
931766fe08 chore(action): Remove cache in PyPI release (#7532) 2025-04-15 18:58:26 +05:45
Pepe Fagoaga
c134914896 revert: fix(findings): increase uid max length to 600 (#7528) 2025-04-15 15:54:32 +05:45
Pepe Fagoaga
25dac080a5 chore(changelog): prepare for 5.5.1 (#7523) 2025-04-15 11:46:20 +05:45
Sergio Garcia
910d39eee4 chore(sdk): update changelog (#7512) 2025-04-15 11:19:50 +05:45
Pepe Fagoaga
d604ae5569 fix(pyproject): Restore packages location (#7510) 2025-04-14 16:50:50 -04:00
Bogdan A
42f46b0fb1 feat(gcp): add check for unused Service Accounts (#7419)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-14 11:53:54 -04:00
Pepe Fagoaga
abb5864224 chore(release): bump for 5.6.0 (#7503) 2025-04-14 11:50:46 -04:00
Prowler Bot
2e2a2bd89a chore(regions_update): Changes in regions for AWS services (#7491)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 10:29:19 -04:00
Sergio Garcia
f8ee841921 fix(gcp): handle projects without ID (#7496) 2025-04-14 10:25:54 -04:00
Pedro Martín
ceda8c76d2 feat(azure): add SOC2 compliance framework (#7489) 2025-04-14 10:16:20 -04:00
Pedro Martín
afe0b7443f fix(defender): add default name to contacts (#7483) 2025-04-14 10:16:07 -04:00
Prowler Bot
9b773897d2 chore(regions_update): Changes in regions for AWS services (#7487)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 09:53:40 -04:00
Pedro Martín
d6ec4c2c96 feat(sdk): add changelog file (#7499) 2025-04-14 09:22:50 -04:00
Prowler Bot
14ef169e99 chore(regions_update): Changes in regions for AWS services (#7497)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-14 09:22:21 -04:00
Pepe Fagoaga
22141f9706 fix(findings): increase uid max length to 600 (#7498)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-14 17:46:13 +05:45
Pablo Lara
a5c6fee5b4 fix: update redirect URL for SSO (#7493) 2025-04-11 18:25:28 +05:45
Pablo Lara
d3a5a5c0a1 fix: resolve social login issue in AuthForm on sign-up page (#7490) 2025-04-11 09:59:10 +02:00
dependabot[bot]
5d81869de4 chore(deps): bump tj-actions/changed-files from 46.0.4 to 46.0.5 (#7486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 22:31:33 -04:00
Pepe Fagoaga
73ebf95d89 chore(changelog): Prepare for v5.5.0 (#7484) 2025-04-09 20:50:56 +05:45
Sergio Garcia
9f4574f4ff fix: handle errors in AWS and Azure (#7482) 2025-04-09 20:19:38 +05:45
Pedro Martín
cb239b20ab fix(aws): add default session_duration (#7479) 2025-04-09 19:19:17 +05:45
eeche
3ef79588b4 feat(NHN): add NHN cloud provider with 6 checks (#6870)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 09:13:24 -04:00
Prowler Bot
61000e386b chore(regions_update): Changes in regions for AWS services (#7478)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 09:11:29 -04:00
Pablo Lara
53cb57901f fix: fix TS type for session duration (#7481) 2025-04-09 13:44:53 +02:00
Pedro Martín
993ff4d78e feat(gcp): add SOC2 compliance framework (#7476) 2025-04-08 15:04:08 -04:00
Drew Kerrigan
8fb10fbbf7 fix(ui): Remove UTC from timestamps in app (#7474) 2025-04-08 17:43:44 +02:00
Pablo Lara
11e834f639 feat: update the NextJS version to the latest (#7473) 2025-04-08 17:40:39 +02:00
Prowler Bot
62bf2fbb9c chore(regions_update): Changes in regions for AWS services (#7467)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-08 10:21:42 -04:00
dependabot[bot]
e57930d6c2 chore(deps): bump github/codeql-action from 3.28.13 to 3.28.15 (#7463)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-08 09:38:18 -04:00
Pepe Fagoaga
e0c417a466 fix(action): Use poetry > v2 (#7472) 2025-04-08 18:34:24 +05:45
Sergio Garcia
b55f8efed1 fix: handle errors in AWS, Azure, and GCP (#7456) 2025-04-08 18:05:43 +05:45
Pablo Lara
7cbc60d977 feat: add link with the service status using static icon (#7468) 2025-04-08 12:06:21 +02:00
Adrián Jesús Peña Rodríguez
5b7912b558 fix(provider): disable periodic task on views before deleting (#7466)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-08 15:35:22 +05:45
Pedro Martín
57fca3e54d fix(soc2_aws): update compliance and remove some requirements (#7452) 2025-04-07 15:47:19 -04:00
Pedro Martín
e31c27b123 fix(gcp): handle logic for empty project names (#7436) 2025-04-07 11:51:15 -04:00
Sergio Garcia
74f1da818e fix(gcp): ignore redirect balancers and add regional ones (#7442) 2025-04-07 11:47:02 -04:00
Pedro Martín
910cfa601b fix(aws): add resource arn for transit gateways (#7447) 2025-04-07 11:46:53 -04:00
dependabot[bot]
fe321c3f8a chore(deps): bump tj-actions/changed-files from 46.0.3 to 46.0.4 (#7443)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-07 09:11:54 -04:00
Prowler Bot
43de0d405f chore(regions_update): Changes in regions for AWS services (#7446)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-07 09:11:23 -04:00
dependabot[bot]
ac6ed31c8e chore(deps): bump trufflesecurity/trufflehog from 3.88.22 to 3.88.23 (#7444)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-07 09:11:07 -04:00
Prowler Bot
9d47437de4 chore(regions_update): Changes in regions for AWS services (#7445)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-07 09:10:49 -04:00
Pablo Lara
eb7a62ff77 refactor: extract common auth headers into reusable helper (#7439) 2025-04-07 08:16:55 +02:00
Pedro Martín
67bc16b46d fix(defender): add default resource name in contacts (#7438) 2025-04-04 09:35:11 -04:00
Sergio Garcia
8552a578a0 fix(aws): solve multiple errors (#7431) 2025-04-04 09:34:58 -04:00
Sergio Garcia
a5d277e045 fix(docs): solve broken links (#7432) 2025-04-04 09:15:48 -04:00
Adrián Jesús Peña Rodríguez
6dbf2ac606 feat: add missing SDK fields to API findings and resources (#7318) 2025-04-04 14:57:49 +02:00
Prowler Bot
b1569ac2f3 chore(regions_update): Changes in regions for AWS services (#7434)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-04 08:36:23 -04:00
dependabot[bot]
3d0145b522 chore(deps): bump trufflesecurity/trufflehog from 3.88.20 to 3.88.22 (#7433)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-04 08:34:51 -04:00
Pedro Martín
44174526d6 docs: add onboarding information step by step for each provider (#7362) 2025-04-04 13:00:43 +02:00
Pablo Lara
0fd395ea83 fix: correct fetch variable name from invitations to roles (#7437) 2025-04-04 12:08:57 +02:00
dependabot[bot]
5e9d4a80a1 chore(deps): bump msgraph-sdk from 1.18.0 to 1.23.0 (#7128)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-04 11:27:39 +02:00
Pedro Martín
e4d234fe03 fix(azure): remove resource_name inside the Check_Report (#7420) 2025-04-03 11:35:02 -04:00
Prowler Bot
3202184718 chore(regions_update): Changes in regions for AWS services (#7424)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-03 09:39:00 -04:00
Sergio Garcia
41e576f4f1 fix(gcp): make logging sink check at project level (#7421) 2025-04-03 09:37:46 -04:00
Pepe Fagoaga
d8dce07019 chore(deletion): Add environment variable for batch size (#7423)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-03 15:31:13 +05:45
Prowler Bot
2b0a3144c7 chore(regions_update): Changes in regions for AWS services (#7417)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-02 09:59:08 -04:00
dependabot[bot]
62fbce0b5e chore(deps): bump azure-identity from 1.19.0 to 1.21.0 (#7192)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-02 11:16:47 +02:00
Pedro Martín
5a59bb335c fix(resources): add the correct id and names for resources (#7410) 2025-04-01 20:30:37 +02:00
Sergio Garcia
2719991630 fix(report): log as error when Resource ID or Name do not exist (#7411) 2025-04-01 20:24:18 +02:00
Daniel Barranquero
6a3b8c4674 feat(entra): add new check entra_admin_users_cloud_only (#7286) 2025-04-01 19:14:15 +02:00
dependabot[bot]
191fbf0177 chore(deps): bump azure-mgmt-applicationinsights from 4.0.0 to 4.1.0 (#7161)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-01 14:55:37 +02:00
Víctor Fernández Poyatos
228dd2952a fix(scans): Handle duplicated scan tasks (#7401) 2025-04-01 11:55:14 +02:00
dependabot[bot]
97db38aa25 chore(deps): bump azure-mgmt-containerregistry from 10.3.0 to 12.0.0 (#7025)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-01 10:29:31 +02:00
Pedro Martín
dc953a6e22 docs(python): add annotations about Python version (#7402) 2025-03-31 18:14:59 +02:00
Bogdan A
51e796a48d feat(gcp): add check for dormant (unused) SA keys (#7348)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-03-31 18:14:21 +02:00
Hugo Pereira Brito
024f1425df feat(entra): add new check entra_legacy_authentication_blocked (#7240) 2025-03-31 18:12:26 +02:00
Hugo Pereira Brito
a7ed610da9 feat(entra): add new check entra_users_mfa_enabled (#7228) 2025-03-31 17:54:52 +02:00
Hugo Pereira Brito
7ba99f22cd feat(entra): add new check entra_admin_users_phishing_resistant_mfa_enabled (#7211)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-31 17:52:28 +02:00
Hugo Pereira Brito
b8ce09ec34 fix(entra): check name and logic of entra_admin_users_have_mfa_enabled (#7230) 2025-03-31 17:50:51 +02:00
Daniel Barranquero
c243110a49 feat(entra): add new check entra_policy_guest_invite_only_for_admin_roles (#7241)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-31 14:53:50 +02:00
Daniel Barranquero
ee27636f32 fix(redshift): validation error for Cluster.multi_az (#7381) 2025-03-31 13:55:48 +02:00
dependabot[bot]
f2f41c9c44 chore(deps): bump azure-mgmt-resource from 23.2.0 to 23.3.0 (#7054)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-31 13:29:49 +02:00
Daniel Barranquero
9312890e6a feat(entra): add new check entra_policy_guest_users_access_restrictions (#7234) 2025-03-31 12:45:26 +02:00
Daniel Barranquero
9578281b4f feat(entra): add new check entra_policy_restricts_user_consent_for_apps (#7225) 2025-03-31 12:32:51 +02:00
Víctor Fernández Poyatos
08690068fc feat(findings): Handle muted findings in API and UI (#7378)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-31 12:25:58 +02:00
Hugo Pereira Brito
e06a33de84 feat(entra): add new check entra_managed_device_required_for_mfa_registration (#7203)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-31 12:24:47 +02:00
Prowler Bot
6a3db10fda chore(regions_update): Changes in regions for AWS services (#7395)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-31 10:18:53 +02:00
Andoni Alonso
bbed445efa chore(sentry): ignore exception when aws service not available in a region (#7352) 2025-03-31 10:13:19 +02:00
dependabot[bot]
9d65fb0bf2 chore(deps): bump trufflesecurity/trufflehog from 3.88.18 to 3.88.20 (#7394)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-31 10:12:55 +02:00
Prowler Bot
34f03ca110 chore(regions_update): Changes in regions for AWS services (#7391)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-27 11:10:07 +01:00
Daniel Barranquero
87c038f0c2 fix(rds): handle Certificate rds-ca-2019 not found (#7383) 2025-03-27 11:09:33 +01:00
dependabot[bot]
b3014f03b1 chore(deps): bump actions/setup-python from 5.4.0 to 5.5.0 (#7390)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-27 09:13:50 +01:00
Daniel Barranquero
d39598c9fc fix(stepfunctions): Nonetype object has no attribute level (#7386) 2025-03-26 19:39:27 +01:00
Daniel Barranquero
5ea9106259 fix(fms): resource metadata could not be converted to dict (#7379) 2025-03-26 19:25:00 +01:00
Prowler Bot
bcc0b59de1 chore(regions_update): Changes in regions for AWS services (#7382)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-26 12:52:35 +01:00
Daniel Barranquero
5d6ed640f0 fix(vm): handle Nonetype is not iterable for extensions (#7360)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 12:25:15 +01:00
Sergio Garcia
dd1cc2d025 fix(s3): handle None S3 account public access block (#7350) 2025-03-25 11:39:19 +01:00
Andoni Alonso
52e5cc23e4 fix(storagegateway): describe smb/nfs share per region (#7374) 2025-03-25 10:35:37 +01:00
Pablo Lara
76a8e2be1f chore: tweak for button see findings (#7369) 2025-03-25 09:52:36 +01:00
Andoni Alonso
d989425490 fix(vm): handle NoneType accessing security_profile (#7221) 2025-03-25 09:33:00 +01:00
Hugo Pereira Brito
1e324b7ed2 fix(network): handle Nonetype is not iterable for security groups (#7208) 2025-03-25 09:28:37 +01:00
Sergio Garcia
e68aa62f94 fix(iam): handle none SAML Providers (#7359) 2025-03-25 09:24:32 +01:00
Daniel Barranquero
332b98a1ab fix(iam): handle UnboundLocalError cannot access local variable 'report' (#7361) 2025-03-25 09:22:35 +01:00
Pablo Lara
dd05ef7974 chore(scans): properly enable link to findings when scan is completed (#7368) 2025-03-25 08:45:37 +01:00
dependabot[bot]
d6862766d3 chore(deps): bump github/codeql-action from 3.28.12 to 3.28.13 (#7367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-25 12:43:02 +05:45
dependabot[bot]
f52d005e2d chore(deps): bump tj-actions/changed-files from 46.0.1 to 46.0.3 (#7363)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-25 12:42:50 +05:45
Víctor Fernández Poyatos
bf475234a5 build(api): Force django-allauth==65.4.1 (#7358) 2025-03-24 17:39:47 +01:00
Pablo Lara
cd5985c056 docs: update readme (#7357) 2025-03-24 15:41:35 +01:00
Pablo Lara
ce33dbf823 chore(findings): apply default filter to show failed findings (#7356) 2025-03-24 15:38:09 +01:00
Pablo Lara
0a9d0688a7 docs(changelog): document addition of download column in scans table … (#7354) 2025-03-24 15:28:13 +01:00
Pablo Lara
24784f2ce5 feat(scans): add download button column for completed scans in table (#7353) 2025-03-24 15:22:36 +01:00
Víctor Fernández Poyatos
7a1e611b88 ref(providers): Refactor provider deletion functions (#7349) 2025-03-24 14:39:14 +01:00
Pepe Fagoaga
3073150008 chore(next): Remove x-powered-by header (#7346) 2025-03-24 16:17:18 +05:45
Jonny
9923def4cb chore(awslambda): update obsolete lambda runtimes (#7330)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-24 11:21:01 +01:00
Víctor Fernández Poyatos
a7f612303f feat(compliance): Add endpoint to retrieve compliance overviews metadata (#7333) 2025-03-24 10:34:43 +01:00
Pablo Lara
64c2a2217a docs: update changelog with Next.js security patch (#7339) (#7341) 2025-03-24 09:59:59 +01:00
Pablo Lara
4689d7a952 chore: upgrade Next.js to 14.2.25 to fix auth middleware vulnerability (#7339) 2025-03-24 09:48:41 +01:00
Prowler Bot
87cd143967 chore(regions_update): Changes in regions for AWS services (#7219)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:57 +01:00
Prowler Bot
e37fd05d58 chore(regions_update): Changes in regions for AWS services (#7246)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:26 +01:00
Prowler Bot
acc708bda5 chore(regions_update): Changes in regions for AWS services (#7250)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-24 09:46:08 +01:00
Prowler Bot
c7460bb69c chore(regions_update): Changes in regions for AWS services (#7334)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-24 09:35:47 +01:00
Pepe Fagoaga
84b273dab9 fix(action): Use Poetry v2 (#7329) 2025-03-20 18:49:32 +01:00
Prowler Bot
bb7ce2157e chore(regions_update): Changes in regions for AWS services (#7323)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-03-20 18:10:28 +05:45
Pepe Fagoaga
07b9e1d3a4 chore(api): Update CHANGELOG (#7325) 2025-03-20 15:22:00 +05:45
Pepe Fagoaga
96a879d761 fix(scan_id): Read the ID from the Scan object (#7324) 2025-03-20 15:18:31 +05:45
Pepe Fagoaga
283127c3f4 chore(aws-regions): remove backport to v3 (#7319) 2025-03-19 22:14:41 +05:45
dependabot[bot]
beeee80a0b chore(deps): bump github/codeql-action from 3.28.11 to 3.28.12 (#7321)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-19 22:14:23 +05:45
Pepe Fagoaga
06b62826b4 chore(dependabot): disable for v3 (#7316) 2025-03-19 21:56:52 +05:45
Pedro Martín
d0736af209 fix(gcp): make provider id mandatory in test_connection (#7296) 2025-03-19 18:33:49 +05:45
Pablo Lara
716c8c1a5f docs: add social login images and update documentation (#7314)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-19 17:16:37 +05:45
Pepe Fagoaga
e6cdda1bd9 chore(dependabot): Disable for API and UI (#7300) 2025-03-19 14:46:11 +05:45
Pedro Martín
2747a633bc fix(k8s): remove typos from PCI 4.0 (#7294) 2025-03-19 09:31:40 +01:00
Pepe Fagoaga
74118f5cfe chore(social-login): improve copy when not enabled (#7295) 2025-03-19 13:36:22 +05:45
dependabot[bot]
598bdf28bb chore(deps): bump trufflesecurity/trufflehog from 3.88.17 to 3.88.18 (#7297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-19 12:31:52 +05:45
Pepe Fagoaga
d75f681c87 chore(security): Configure HTTP Security Headers (#7220)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-18 17:49:12 +01:00
Pepe Fagoaga
c7956ede6a chore(security): Add HTTP Security Headers (#7289) 2025-03-18 17:44:57 +01:00
Pablo Lara
64f5a69e84 fix: prevent SSR mismatch in OAuth URL generation (#7288) 2025-03-18 17:22:29 +01:00
dependabot[bot]
bfb15c34b8 chore(deps): bump azure-mgmt-containerservice from 34.0.0 to 34.1.0 (#6989)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-18 17:14:25 +01:00
Pablo Lara
638b3ac0cd chore(providers): change wording when adding a new provider (#7280) 2025-03-18 21:50:56 +05:45
Daniel Barranquero
9d6147a037 fix(route53): solve false positive in route53_public_hosted_zones_cloudwatch_logging_enabled (#7201) 2025-03-18 16:54:49 +01:00
Pepe Fagoaga
802c786ac2 fix(test-connection): Handle provider without secret (#7283) 2025-03-18 21:34:36 +05:45
Pepe Fagoaga
c8be8dbd9a fix(aws-regions): Use @prowler-bot as author (#7285) 2025-03-18 20:27:19 +05:45
Pablo Lara
7053b2bb37 chore: add env vars for social login (#7257)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-18 13:43:46 +01:00
Prowler Bot
447bf832cd chore(regions_update): Changes in regions for AWS services (#7281)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-18 17:35:44 +05:45
Pablo Lara
7c4571b55e feat(providers): add component to render a link to the documentation (#7282) 2025-03-18 12:05:38 +01:00
dependabot[bot]
eb7c16aba5 chore(deps): bump azure-mgmt-storage from 21.2.1 to 22.1.1 (#7098)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-03-18 11:06:46 +01:00
Adrián Jesús Peña Rodríguez
b09e83b171 chore: add api reference to download report section (#7243) 2025-03-18 14:54:13 +05:45
Hugo Pereira Brito
bb149a30a7 fix(microsoft365): typo Microsoft365NotTenantIdButClientIdAndClienSecretError (#7244) 2025-03-17 21:16:47 +05:45
Pablo Lara
d5be35af49 chore: Rename keyServer and extract to helper (#7256) 2025-03-17 21:11:27 +05:45
Pedro Martín
f6aa56d92b fix(.env): remove spaces (#7255) 2025-03-17 20:48:55 +05:45
Pedro Martín
6a4df15c47 fix(prowler): change from prowler.py to prowler-cli.py (#7253) 2025-03-17 15:44:15 +01:00
Pablo Lara
72de5fdb1b chore: update git ignore file (#7254) 2025-03-17 14:53:58 +01:00
Pedro Martín
a7f55d06af feat(jira): add basic auth method (#7233) 2025-03-17 14:31:35 +01:00
Pepe Fagoaga
97da78d4e7 fix(backport): Use container tagged version (#7252) 2025-03-17 18:19:43 +05:45
Pepe Fagoaga
c4f6161c73 chore(security): Pin actions to the Full-Length Commit SHA (#7249) 2025-03-17 17:11:28 +05:45
Pablo Lara
db7ffea24d chore: add env var for social login (#7251) 2025-03-17 10:23:01 +01:00
Prowler Bot
489b5abf82 chore(regions_update): Changes in regions for AWS services (#7237)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-17 13:47:56 +05:45
Prowler Bot
3a55c2ee07 chore(regions_update): Changes in regions for AWS services (#7245)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-17 12:34:44 +05:45
Pedro Martín
64d866271c fix(scan): add compliance info inside finding (#5649) 2025-03-17 12:18:00 +05:45
Pablo Lara
1ab2a80eab chore: improve UX when social login is not enabled (#7242) 2025-03-15 12:12:30 +01:00
Pablo Lara
89d4c521ba chore(social-login): disable social login buttons when env vars are not set (#7238) 2025-03-14 11:32:22 +01:00
Pablo Lara
f2e19d377a chore(social-login): rename env.vars for social login (#7232) 2025-03-13 17:07:17 +01:00
Pablo Lara
2b7b887b87 chore: social auth is also in sign-up page (#7231) 2025-03-13 14:20:09 +01:00
Pablo Lara
44c70b5d01 chore: remove unused regions (#7229) 2025-03-13 13:57:16 +01:00
Pablo Lara
7514484c42 chore: change wording for launching a single scan (#7226) 2025-03-13 13:48:01 +01:00
Adrián Jesús Peña Rodríguez
9594c4c99f fix: add a handled response in case local files are missing (#7183) 2025-03-13 13:47:00 +01:00
Pablo Lara
56445c9753 chore: update changelog (#7223) 2025-03-13 13:39:26 +01:00
Adrián Jesús Peña Rodríguez
07419fd5e1 fix(exports): change the way to remove the local export files after s3 upload (#7172) 2025-03-13 13:37:17 +01:00
Pablo Lara
2e4dd12b41 feat(social-login): social login with Google is working (#7218)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-13 12:52:30 +01:00
Víctor Fernández Poyatos
fed2046c49 fix(migrations): add through parameter to integration.providers (#7222) 2025-03-13 12:47:34 +01:00
Pepe Fagoaga
db79db4786 fix(pyproject): Rename prowler.py (#7217) 2025-03-13 16:53:38 +05:45
Víctor Fernández Poyatos
6f027e3c57 feat(integrations): Added new endpoints to allow configuring integrations (#7167) 2025-03-12 19:57:55 +05:45
Daniel Barranquero
bdb877009f feat(entra): add new check entra_admin_mfa_enabled_for_administrative_roles (#7181)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 14:47:29 +01:00
Sergio Garcia
6564ec1ff5 fix(cloudwatch): handle None metric alarms (#7205) 2025-03-12 14:44:36 +01:00
Pedro Martín
443dc067b3 feat(kubernetes): add ISO 27001 2022 compliance framework (#7204) 2025-03-12 14:24:53 +01:00
Hugo Pereira Brito
6221650c5f feat(entra): add new check entra_identity_protection_sign_in_risk_enabled (#7171)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 13:53:47 +01:00
Andoni Alonso
034d0fd1f4 refactor(check): add docstrings and improve report handling (#7113) 2025-03-12 13:38:42 +01:00
Hugo Pereira Brito
e617ff0460 feat(docs): add microsoft365 configurable checks (#7200) 2025-03-12 12:52:35 +01:00
Hugo Pereira Brito
4b1ed607a7 feat(entra): add new check entra_identity_protection_user_risk_enabled (#7126)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 12:44:31 +01:00
Pepe Fagoaga
137365a670 chore(poetry): Upgrade to v2 (#7112) 2025-03-12 17:28:34 +05:45
Hugo Pereira Brito
1891a1b24f feat(entra): add new check entra_managed_device_required_for_authentication (#7115)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-12 11:34:14 +01:00
Daniel Barranquero
e57e070866 feat(entra): add new check entra_password_hash_sync_enabled (#7061) 2025-03-12 11:31:49 +01:00
dependabot[bot]
66998cd1ad chore(deps): bump google-api-python-client from 2.162.0 to 2.163.0 (#7191)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-12 11:25:24 +01:00
Prowler Bot
c0b1833446 chore(regions_update): Changes in regions for AWS services (#7197)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-12 11:25:06 +01:00
Pablo Lara
329a72c77c chore: update changelog (#7199) 2025-03-12 10:12:33 +01:00
Pablo Lara
2610ee9d0c feat(invitations): Disable editing for accepted invites (#7198) 2025-03-12 10:06:46 +01:00
Pablo Lara
a13ca9034e chore(scans): rename type to trigger (#7196) 2025-03-12 09:47:02 +01:00
Pablo Lara
5d1abb3689 chore: auto refresh if the state is also available (#7195) 2025-03-12 09:33:24 +01:00
Pablo Lara
e1d1c6d154 styles: tweaks styles (#7194) 2025-03-12 09:23:02 +01:00
Pablo Lara
e18e0e7cd4 chore(launch-scan): update wording (#7193) 2025-03-12 08:20:15 +01:00
Pablo Lara
eaf3d07a3f chore: update the changelog (#7190) 2025-03-12 08:15:28 +01:00
Hugo Pereira Brito
c88ae32b7f feat(microsoft365): add new check entra_admin_users_sign_in_frequency_enabled (#7020)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-11 19:18:33 +01:00
Pablo Lara
605613e220 feat(scans): allow running a scan once (#7188) 2025-03-11 17:47:47 +01:00
Sergio Garcia
d2772000ec chore(sentry): ignore new exceptions in Sentry (#7187) 2025-03-11 17:46:14 +01:00
Adrián Jesús Peña Rodríguez
42939a79f5 docs: add users, invitations and RBAC (#7109) 2025-03-11 21:59:04 +05:45
Daniel Barranquero
ed17931117 feat(entra): add new check entra_dynamic_group_for_guests_created (#7168)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-11 16:21:17 +01:00
Daniel Barranquero
66df5f7a1c chore(providers): enhance Remediation.Code.CLI field from check's metadata (#7094)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-03-11 16:15:58 +01:00
Pedro Martín
fc6e6696e5 feat(gcp): add ISO 27001 2022 compliance framework (#7185) 2025-03-11 15:16:40 +01:00
Sergio Garcia
465748c8a1 chore(sentry): ignore expected errors in GCP API (#7184) 2025-03-11 14:32:37 +01:00
Pedro Martín
e59cd71bbf fix(azure): add remaining checks for reqA.5.25 (#7182) 2025-03-11 14:16:10 +01:00
Daniel Barranquero
8a76fea310 feat(entra): add new check entra_admin_consent_workflow_enabled (#7110) 2025-03-11 13:18:17 +01:00
Adrián Jesús Peña Rodríguez
0e46be54ec docs: add generate_output documentation (#7122)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-11 17:23:32 +05:45
Pedro Martín
dc81813fdf fix(ens): remove and change duplicated ids (#7165) 2025-03-11 11:35:31 +01:00
Hugo Pereira Brito
eaa0df16bb refactor(microsoft365): resource metadata assertions (#7169) 2025-03-11 11:30:37 +01:00
Pedro Martín
c23e911028 feat(azure): add ISO 27001 2022 compliance framework (#7170) 2025-03-11 11:29:40 +01:00
dependabot[bot]
06b96a1007 chore(deps): bump tzlocal from 5.3 to 5.3.1 (#7162)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-11 11:17:50 +01:00
Prowler Bot
fa545c591f chore(regions_update): Changes in regions for AWS services (#7177)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-11 11:17:27 +01:00
dependabot[bot]
e828b780c7 chore(deps): bump trufflesecurity/trufflehog from 3.88.15 to 3.88.16 (#7174)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-11 11:16:57 +01:00
Harshit Raj Singh
eca8c5cabd feat(aws): AWS Found Sec Best Practices & PCI DSS v3.2.1 upgrade (#7017)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-03-11 09:31:16 +01:00
Pablo Lara
b7bce6008f fix: tweak z-index for custom inputs (#7166) 2025-03-10 11:55:04 +01:00
Pablo Lara
2fdf89883d feat(scans): improve scan launch provider selection (#7164) 2025-03-10 10:05:33 +01:00
dependabot[bot]
6c5d4bbaaa chore(deps): bump django from 5.1.5 to 5.1.7 in /api (#7145)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-10 09:50:09 +01:00
Gary Mclean
cb2f926d4f fix(azure): correct check title for SQL Server Unrestricted (#7123) 2025-03-07 18:24:24 +01:00
ryan-stavella
12c01b437e fix(metadata): typo in ec2_securitygroup_allow_wide_open_public_ipv4 (#7116) 2025-03-07 15:28:08 +01:00
dependabot[bot]
3253a58942 chore(deps-dev): bump mock from 5.1.0 to 5.2.0 (#7099)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 15:01:43 +01:00
Kay Agahd
199f7f14ea fix(doc): event_time has been changed to time_dt but was not documented (#7136) 2025-03-07 14:36:51 +01:00
Andoni Alonso
d42406d765 fix(metadata): match type with check results (#7111) 2025-03-07 14:34:07 +01:00
Kay Agahd
2276ffb1f6 fix(aws): ecs_task_definitions_no_environment_secrets.metadata.json (#7135) 2025-03-07 14:31:03 +01:00
dependabot[bot]
218fb3afb0 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 (#7151)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 14:27:29 +01:00
Prowler Bot
a9fb890979 chore(regions_update): Changes in regions for AWS services (#7108)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 14:06:28 +01:00
Prowler Bot
54ebf5b455 chore(regions_update): Changes in regions for AWS services (#7119)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 14:04:48 +01:00
dependabot[bot]
c9a0475aa8 chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.3.0 to 1.4.1 (#7129)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 14:03:44 +01:00
Prowler Bot
5567d9f88c chore(regions_update): Changes in regions for AWS services (#7131)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 13:19:08 +01:00
dependabot[bot]
56f3e661ae chore(deps): bump trufflesecurity/trufflehog from 3.88.14 to 3.88.15 (#7127)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-07 13:17:45 +01:00
César Arroba
1aa4479a10 chore: increase release to 5.5.0 (#7143) 2025-03-07 13:16:24 +01:00
Prowler Bot
7b625d0a91 chore(regions_update): Changes in regions for AWS services (#7146)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-07 13:15:51 +01:00
Pablo Lara
fd0529529d chore: update changelog (#7149) 2025-03-07 11:47:23 +01:00
Pablo Lara
af43191954 fix: tweaks for compliance cards (#7147) 2025-03-07 11:32:58 +01:00
Pablo Lara
2ce2ca7c91 feat: add changelog (#7141) 2025-03-06 16:46:55 +01:00
Víctor Fernández Poyatos
a0fc3db665 fix(overviews): manage overview exceptions and use batch_size with bulk (#7140) 2025-03-06 15:35:29 +01:00
César Arroba
feb458027f chore(ui-gha): delete double quotes on prowler version (#7139) 2025-03-06 19:48:53 +05:45
Pablo Lara
e5a5b7af5c fix(groups): display uid if alias is missing (#7137) 2025-03-06 14:37:36 +01:00
Pablo Lara
ad456ae2fe fix(credentials): adjust helper links to fit width (#7133) 2025-03-06 11:42:26 +01:00
Pepe Fagoaga
690cb51f6c revert(findings): change uid from varchar to text (#7132) 2025-03-06 16:24:35 +05:45
dependabot[bot]
14aaa2f376 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 in /api (#7130)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-06 09:39:24 +01:00
César Arroba
6e47ca2c41 chore(ui-gha): add version prefix (#7125) 2025-03-05 21:13:24 +05:45
Víctor Fernández Poyatos
0d99d2be9b fix(reports): Fix task kwargs and result (#7124) 2025-03-05 21:10:44 +05:45
César Arroba
c322ef00e7 chore(ui): add prowler version on build (#7120) 2025-03-05 20:46:16 +05:45
Pablo Lara
3513421225 feat(compliance): new compliance selector (#7118) 2025-03-05 15:12:10 +01:00
Víctor Fernández Poyatos
b0e6bfbefe chore(api): Update changelog (#7090) 2025-03-04 17:44:34 +01:00
dependabot[bot]
f7a918730e chore(deps-dev): bump pytest from 8.3.4 to 8.3.5 (#7097)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-04 09:16:05 +01:00
Pablo Lara
cef33319c5 chore(ui): update label from 'Select a scan job' to 'Select a cloud p… (#7107) 2025-03-04 09:11:39 +01:00
Pablo Lara
2036a59210 fix(roles): show the correct error message (#7089) 2025-03-03 15:46:02 +01:00
Pablo Lara
e5eccb6227 fix: bug with create role and unlimited visibility checkbox (#7088) 2025-03-03 15:45:39 +01:00
Sergio Garcia
48c2c8567c feat(aws): add fixers for threat detection checks (#7085) 2025-03-03 14:20:23 +01:00
Pablo Lara
bbeef0299f feat(version): add prowler version to the sidebar (#7086) 2025-03-03 13:40:09 +01:00
Pablo Lara
bec5584d63 chore: Update the latest table findings with the most recent changes (#7084) 2025-03-03 13:16:30 +01:00
Pablo Lara
bdc759d34c feat(sidebar): sidebar with new functionalities (#7018) 2025-03-03 12:30:28 +01:00
Prowler Bot
8db442d8ba chore(regions_update): Changes in regions for AWS services (#7067)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 09:29:48 +01:00
Sergio Garcia
9e7a0d4175 fix(threat detection): run single threat detection check (#7065) 2025-02-28 13:51:07 +01:00
Pepe Fagoaga
9c33b3f5a9 refactor(stats): Use Finding instead of Check_Report (#7053)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-02-28 10:54:48 +01:00
Pepe Fagoaga
7e7e2c87dc chore(examples): Scan AWS (#7064) 2025-02-28 15:25:10 +05:45
Sergio Garcia
2f741f35a8 chore(gcp): enhance GCP APIs logic (#7046) 2025-02-28 14:55:43 +05:45
dependabot[bot]
c411466df7 chore(deps): bump trufflesecurity/trufflehog from 3.88.13 to 3.88.14 (#7063)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-28 09:10:47 +01:00
Daniel Barranquero
9679939307 feat(m365): add sharepoint service with 4 checks (#7057)
Co-authored-by: MarioRgzLpz <mariorgzlpz1809@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-27 18:15:17 +01:00
Pedro Martín
8539423b22 feat(docs): add info related with sts assume role and regions (#7062) 2025-02-27 17:40:31 +01:00
Daniel Barranquero
81edafdf09 fix(azure): handle account not supporting Blob (#7060) 2025-02-27 13:20:56 +01:00
Sergio Garcia
e0a262882a fix(ecs): ensure unique finding id in ECS checks (#7059) 2025-02-27 13:02:22 +01:00
Prowler Bot
89237ab99e chore(regions_update): Changes in regions for AWS services (#7056)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-27 11:00:13 +01:00
Hugo Pereira Brito
0f414e451e feat(microsoft365): add new check entra_policy_ensure_default_user_cannot_create_tenants (#6918)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-27 10:31:02 +01:00
Pablo Lara
1180522725 feat(exports): download scan exports (#7006) 2025-02-27 14:08:12 +05:45
Pepe Fagoaga
81c7ebf123 fix(env): UI version must be stable (#7055) 2025-02-27 13:32:53 +05:45
Víctor Fernández Poyatos
258f05e6f4 fix(migrations): Fix migration dependency order (#7051) 2025-02-26 17:26:21 +01:00
Víctor Fernández Poyatos
53efb1c153 feat(labeler): apply label on migration changes (#7052) 2025-02-26 17:03:12 +01:00
Pepe Fagoaga
26014a9705 fix(findings): change uid from varchar to text (#7048) 2025-02-26 21:17:16 +05:45
Víctor Fernández Poyatos
00ef037e45 feat(findings): Add Django management command to populate database with dummy data (#7049) 2025-02-26 16:15:37 +01:00
Adrián Jesús Peña Rodríguez
669ec74e67 feat(export): add API export system (#6878) 2025-02-26 15:49:44 +01:00
dependabot[bot]
c4528200b0 chore(deps-dev): bump black from 24.10.0 to 25.1.0 (#6733)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-26 11:38:09 +01:00
Daniel Barranquero
ba7cd0250a fix(elasticache): improve logic in elasticache_redis_cluster_backup_enabled (#7042) 2025-02-26 10:31:14 +01:00
Rubén De la Torre Vico
c5e97678a1 fix(azure): migrate resource models to avoid using SDK defaults (#6880) 2025-02-26 09:54:53 +01:00
Pedro Martín
337a46cdcc feat(aws): add ISO 27001 2022 compliance framework (#7035)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-26 08:34:08 +01:00
Hugo Pereira Brito
7f74b67f1f chore(iam): enhance iam_role_cross_service_confused_deputy_prevention recommendation (#7023) 2025-02-26 07:37:57 +01:00
Prowler Bot
5dcc48d2e5 chore(regions_update): Changes in regions for AWS services (#7034)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-26 07:30:07 +01:00
Prowler Bot
8b04aab07d chore(regions_update): Changes in regions for AWS services (#7015)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-26 07:29:42 +01:00
dependabot[bot]
eab4f6cf2e chore(deps): bump google-api-python-client from 2.161.0 to 2.162.0 (#7037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-26 07:25:14 +01:00
Hugo Pereira Brito
7f8d623283 refactor(microsoft365): CheckReportMicrosoft365 and resource metadata (#6952) 2025-02-26 07:24:54 +01:00
Víctor Fernández Poyatos
dbffed8f1f feat(findings): Optimize findings endpoint (#7019) 2025-02-25 12:41:47 +01:00
Pepe Fagoaga
7e3688fdd0 chore(action): Conventional Commit Check (#7033) 2025-02-25 09:51:55 +01:00
dependabot[bot]
2e111e9ad3 chore(deps): bump trufflesecurity/trufflehog from 3.88.12 to 3.88.13 (#7026)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-25 14:34:24 +05:45
Pedro Martín
6d6070ff3f feat(outputs): add sample outputs (#6945) 2025-02-25 14:33:16 +05:45
Pedro Martín
391bbde353 fix(cis): show report table on the CLI (#6979) 2025-02-25 14:28:58 +05:45
Pedro Martín
3c56eb3762 feat(azure): add PCI DSS 4.0 (#6982) 2025-02-25 14:27:50 +05:45
Pedro Martín
7c14ea354b feat(kubernetes): add PCI DSS 4.0 (#7013) 2025-02-25 14:27:14 +05:45
Pedro Martín
c96aad0b77 feat(dashboard): take the latest finding uid by timestamp (#6987) 2025-02-25 14:25:03 +05:45
Víctor Fernández Poyatos
a9dd3e424b feat(tasks): add deletion queue for deletion tasks (#7022) 2025-02-24 18:02:52 +01:00
Pedro Martín
8a144a4046 feat(gcp): add PCI DSS 4.0 (#7010) 2025-02-21 16:19:20 +05:30
Prowler Bot
75f86d7267 chore(regions_update): Changes in regions for AWS services (#7011)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-21 15:37:15 +05:30
dependabot[bot]
bbf875fc2f chore(deps-dev): bump mkdocs-material from 9.6.4 to 9.6.5 (#7007)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-21 14:28:18 +05:30
Raj Chowdhury
59d491f61b fix(typo): solve typo in dashboard.md (#7009) 2025-02-21 14:17:08 +05:30
dependabot[bot]
ed640a1324 chore(deps): bump trufflesecurity/trufflehog from 3.88.11 to 3.88.12 (#7008)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-21 14:16:15 +05:30
César Arroba
e86fbcaef7 feat(api): setup sentry for OSS API (#6874) 2025-02-20 23:08:01 +05:45
Pablo Lara
7f48212054 chore(users): renaming the account now triggers a re-render in the sidebar (#7005) 2025-02-20 16:58:45 +01:00
dependabot[bot]
a2c5c71baf chore(deps): bump python from 3.12.8-alpine3.20 to 3.12.9-alpine3.20 (#6882)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 21:11:45 +05:30
dependabot[bot]
b904f81cb9 chore(deps): bump tzlocal from 5.2 to 5.3 (#6932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 21:10:46 +05:30
dependabot[bot]
d64fe374dd chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7001)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 12:55:36 +01:00
Hugo Pereira Brito
fe25e7938e docs(tutorials): update all deprecated poetry shell references (#7002) 2025-02-20 17:04:19 +05:45
Prowler Bot
931df361bf chore(regions_update): Changes in regions for AWS services (#6998)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-20 15:52:36 +05:30
Pedro Martín
d7c45f4aee chore(github): add compliance to PR labeler (#6996) 2025-02-20 14:50:43 +05:30
Pedro Martín
5e5bef581b fix(soc2_aws): remove duplicated checks (#6995) 2025-02-20 14:38:26 +05:30
Hugo Pereira Brito
2d9e95d812 docs(installation): add warning for poetry shell deprecation in README (#6983) 2025-02-20 14:19:35 +05:45
Pablo Lara
e5f979d106 chore(findings): add 'Status Extended' attribute to finding details (#6997) 2025-02-20 09:33:03 +01:00
Sergio Garcia
c7a5815203 fix(deps): update vulnerable cryptography dependency (#6993) 2025-02-20 12:18:15 +05:30
Pedro Martín
03e268722e feat(aws): add PCI DSS 4.0 (#6949) 2025-02-20 11:07:06 +05:30
dependabot[bot]
78a2774329 chore(deps): bump trufflesecurity/trufflehog from 3.88.9 to 3.88.11 (#6988)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 11:04:15 +05:30
dependabot[bot]
c1b5ab7f53 chore(deps): bump kubernetes from 32.0.0 to 32.0.1 (#6992)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-20 10:46:19 +05:30
Sergio Garcia
b861d97ad4 fix(report): remove invalid resources in report (#6852) 2025-02-19 21:27:52 +05:45
Pablo Lara
f3abcc9dd6 feat(scans): update the progress for executing scans (#6972) 2025-02-19 16:10:29 +01:00
César Arroba
cab13fe018 chore(gha): trigger API or UI deployment when push to master (#6946) 2025-02-19 18:08:51 +05:45
Prowler Bot
cc4b19c7ce chore(regions_update): Changes in regions for AWS services (#6978) 2025-02-19 11:04:45 +01:00
Pablo Lara
a754d9aee5 fix(roles): handle empty response in deleteRole and ensure revalidation (#6976) 2025-02-19 09:03:49 +01:00
Pedro Martín
22b54b2d8d feat(aws): add compliance CIS 4.0 (#6937) 2025-02-19 08:23:49 +05:30
dependabot[bot]
d12ca6301a chore(deps-dev): bump flake8 from 7.1.1 to 7.1.2 (#6954)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-19 08:09:58 +05:30
Hugo Pereira Brito
bc1b2ad9ab test(cloudfront): add name retrieval test for cloudfront bucket domains (#6969) 2025-02-19 08:08:55 +05:30
Pepe Fagoaga
1782ab1514 fix(ocsf): Adapt for 1.4.0 (#6971) 2025-02-19 08:06:13 +05:30
Prowler Bot
0384fc50e3 chore(regions_update): Changes in regions for AWS services (#6968)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-18 18:40:01 +05:30
dependabot[bot]
cc46dee9ee chore(deps-dev): bump bandit from 1.8.2 to 1.8.3 (#6955)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-18 18:39:10 +05:30
Hugo Pereira Brito
ed5a0ae45a fix(cloudfront): Incorrect bucket name retrieval (#6947) 2025-02-17 17:08:28 +01:00
Prowler Bot
928ccfefb8 chore(regions_update): Changes in regions for AWS services (#6944)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-17 16:55:15 +01:00
dependabot[bot]
7f6bfb7b3e chore(deps): bump trufflesecurity/trufflehog from 3.88.8 to 3.88.9 (#6943)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-17 16:54:52 +01:00
Rubén De la Torre Vico
bcbc9bf675 fix(gcp): Correct false positive when sslMode=ENCRYPTED_ONLY in CloudSQL (#6936) 2025-02-14 15:16:21 -05:00
dependabot[bot]
0ec4366f4c chore(deps): bump google-api-python-client from 2.160.0 to 2.161.0 (#6933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-14 10:09:33 -05:00
César Arroba
ff72b7eea1 fix(gha): fix short sha step (#6939) 2025-02-14 19:11:26 +05:45
César Arroba
a32ca19251 chore(gha): add tag for api and ui images on push to master (#6920) 2025-02-14 18:01:22 +05:45
Pablo Lara
b79508956a fix(issue pages): apply sorting by default in issue pages (#6934) 2025-02-14 10:32:34 +01:00
dependabot[bot]
d76c5bd658 chore(deps): bump trufflesecurity/trufflehog from 3.88.7 to 3.88.8 (#6931)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-13 18:17:25 -05:00
Kay Agahd
580e11126c fix(aws): codebuild service threw KeyError for projects type CODEPIPELINE (#6919) 2025-02-13 12:22:09 -05:00
Sergio Garcia
736d40546a fix(gcp): handle DNS Managed Zone with no DNSSEC (#6924) 2025-02-13 12:18:50 -05:00
dependabot[bot]
88810d2bb5 chore(deps-dev): bump mkdocs-material from 9.6.3 to 9.6.4 (#6913)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-13 11:36:07 -05:00
Víctor Fernández Poyatos
3a8f4d2ffb feat(social-login): Add social login integration for Google and Github OAuth providers (#6906) 2025-02-13 16:54:38 +01:00
Sergio Garcia
1fe125a65f chore(docs): external K8s cluster Prowler App credentials (#6921) 2025-02-13 09:46:05 -05:00
Kay Agahd
0ff4df0836 fix(aws): SNS threw IndexError if SubscriptionArn is PendingConfirmation (#6896) 2025-02-13 09:34:48 -05:00
Pedro Martín
16b4775e2d fix(gcp): remove typos on CIS 3.0 (#6917) 2025-02-13 13:48:19 +01:00
dependabot[bot]
c3a13b8a29 chore(deps): bump trufflesecurity/trufflehog from 3.88.6 to 3.88.7 (#6915)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-12 19:15:03 -05:00
Sergio Garcia
d1053375b7 fix(aws): handle AccessDenied when retrieving resource policy (#6908)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-12 15:31:26 -05:00
César Arroba
0fa4538256 fix(gha): fix test build containers on pull requests actions (#6909) 2025-02-12 23:26:54 +05:45
Ogonna Iwunze
738644f288 fix(kms): Amazon KMS API call error handling (#6843)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-12 10:09:15 -05:00
dependabot[bot]
2f80b055ac chore(deps-dev): bump coverage from 7.6.11 to 7.6.12 (#6897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-12 10:08:26 -05:00
Prowler Bot
fd62a1df10 chore(regions_update): Changes in regions for AWS services (#6900)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-12 10:06:42 -05:00
César Arroba
a85d0ebd0a chore(api): test build container image on pull request (#6850) 2025-02-12 15:44:05 +05:45
César Arroba
2c06902baa chore(ui): test build container image on pull request (#6849) 2025-02-12 15:43:22 +05:45
Pepe Fagoaga
76ac6429fe chore(version): Update version to 5.4.0 (#6894) 2025-02-11 17:51:08 -05:00
dependabot[bot]
43cae66b0d chore(deps-dev): bump coverage from 7.6.10 to 7.6.11 (#6887)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 19:30:36 -05:00
dependabot[bot]
dacddecc7d chore(deps): bump trufflesecurity/trufflehog from 3.88.5 to 3.88.6 (#6888)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 18:15:25 -05:00
Mario Rodriguez Lopez
dcb9267c2f feat(microsoft365): Add documentation and compliance file (#6195)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-10 11:13:06 -05:00
Víctor Fernández Poyatos
ff35fd90fa chore(api): Update changelog and specs (#6876) 2025-02-10 12:06:34 +01:00
Víctor Fernández Poyatos
7469377079 chore: Add needed steps for API in PR template (#6875) 2025-02-10 15:20:09 +05:45
Pepe Fagoaga
c8441f8d38 fix(kubernetes): Change UID validation (#6869)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-10 14:55:24 +05:45
Pepe Fagoaga
abf4eb0ffc chore: Rename dashboard table latest findings (#6873)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-10 09:55:44 +01:00
dependabot[bot]
93717cc830 chore(deps-dev): bump mkdocs-material from 9.6.2 to 9.6.3 (#6871)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-07 18:24:49 -05:00
Sergio Garcia
b629bc81f8 docs(eks): add documentation about EKS onboarding (#6853)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-07 10:59:01 -05:00
Pedro Martín
f628897fe1 fix(dashboard): adjust the bar chart display (#6690) 2025-02-07 10:05:30 -05:00
Prowler Bot
54b82a78e3 chore(regions_update): Changes in regions for AWS services (#6858)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-07 10:02:28 -05:00
Víctor Fernández Poyatos
377faf145f feat(findings): Use ArrayAgg and subqueries on metadata endpoint (#6863)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-07 19:36:01 +05:45
Kay Agahd
69e316948f fix(aws): key error for detect-secrets (#6710) 2025-02-07 14:48:16 +01:00
Pablo Lara
62cbff4f53 feat: implement new functionality with inserted_at__gte in findings a… (#6864) 2025-02-07 14:25:25 +01:00
Víctor Fernández Poyatos
5582265e9d docs: Add details about user creation in Prowler app (#6862) 2025-02-07 13:29:25 +01:00
dependabot[bot]
fb5ea3c324 chore(deps): bump microsoft-kiota-abstractions from 1.9.1 to 1.9.2 (#6856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-07 11:07:43 +01:00
Víctor Fernández Poyatos
9b5f676f50 feat(findings): Require date filters for findings endpoints (#6800) 2025-02-07 13:54:55 +05:45
Pranay Girase
88cfc0fa7e fix(typo): typos in Dashboard and Report in HTML (#6847) 2025-02-06 10:42:31 -05:00
Prowler Bot
665bfa2f13 chore(regions_update): Changes in regions for AWS services (#6848)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-06 08:46:32 -05:00
dependabot[bot]
b89b1a64f4 chore(deps): bump trufflesecurity/trufflehog from 3.88.4 to 3.88.5 (#6844)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-05 18:02:42 -05:00
Sergio Garcia
9ba657c261 fix(kms): handle error in DescribeKey function (#6839) 2025-02-05 14:03:31 -05:00
Mario Rodriguez Lopez
bce958b8e6 feat(entra): add new check entra_thirdparty_integrated_apps_not_allowed (#6357)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 12:45:48 -05:00
Daniel Barranquero
914012de2b fix(cloudfront): fix false positive in s3 origins (#6823) 2025-02-05 12:39:49 -05:00
Ogonna Iwunze
8d1c476aed feat(kms): add kms_cmk_not_multi_region AWS check (#6794)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 11:20:29 -05:00
Gary Mclean
567c729e9e fix(findings) Spelling mistakes correction (#6822) 2025-02-05 10:26:50 -05:00
Kay Agahd
3f03dd20e4 fix(aws) wording of report.status_extended in awslambda_function_not_publicly_accessible (#6824) 2025-02-05 10:23:52 -05:00
Daniel Barranquero
1c778354da fix(directoryservice): handle ClientException (#6781)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-05 10:22:32 -05:00
Prowler Bot
3a149fa459 chore(regions_update): Changes in regions for AWS services (#6821)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-05 09:19:56 -05:00
Mario Rodriguez Lopez
f3b121950d feat(entra): add new entra service for Microsoft365 (#6326)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 19:47:14 -05:00
Mario Rodriguez Lopez
43c13b7ba1 feat(microsoft365): add new check admincenter_settings_password_never_expire (#6023)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 17:24:11 -05:00
dependabot[bot]
9447b33800 chore(deps): bump kubernetes from 31.0.0 to 32.0.0 (#6678)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-04 17:22:51 -05:00
Hugo Pereira Brito
2934752eeb fix(elasticache): InvalidReplicationGroupStateFault error (#6815)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 14:28:31 -05:00
dependabot[bot]
dd6d8c71fd chore(deps-dev): bump moto from 5.0.27 to 5.0.28 (#6804)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 12:58:48 -05:00
Pablo Lara
80267c389b style(forms): improve spacing consistency (#6814) 2025-02-04 13:20:24 +01:00
Pablo Lara
acfbaf75d5 chore(forms): improvements to the sign-in and sign-up forms (#6813) 2025-02-04 12:46:07 +01:00
Pedro Martín
5f54377407 chore(aws_audit_manager_control_tower_guardrails): add checks to reqs (#6699) 2025-02-03 14:59:08 -05:00
Drew Kerrigan
552aa64741 docs(): add description of changed and new delta values to prowler app tutorial (#6801) 2025-02-03 20:51:03 +01:00
dependabot[bot]
d64f611f51 chore(deps): bump pytz from 2024.2 to 2025.1 (#6765)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 12:48:18 -05:00
dependabot[bot]
a96cc92d77 chore(deps-dev): bump mkdocs-material from 9.5.50 to 9.6.2 (#6799)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 11:37:02 -05:00
dependabot[bot]
3858cccc41 chore(deps-dev): bump pylint from 3.3.3 to 3.3.4 (#6721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 10:32:42 -05:00
Pedro Martín
072828512a fix(cis_1.5_aws): add checks to needed reqs (#6695)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-03 10:32:20 -05:00
Pedro Martín
a73ffe5642 fix(cis_1.4_aws): add checks to needed reqs (#6696)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-03 10:32:10 -05:00
Pablo Lara
8e784a5b6d feat(scans): show scan details right after launch (#6791) 2025-02-03 16:08:47 +01:00
dependabot[bot]
1b6f9332f1 chore(deps): bump trufflesecurity/trufflehog from 3.88.2 to 3.88.4 (#6760)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 09:35:53 -05:00
secretcod3r
db8b472729 fix(gcp): fix wrong provider value in check (#6691) 2025-02-03 09:29:08 -05:00
Pedro Martín
867b371522 fix(cis_2.0_aws): add checks to needed reqs (#6694) 2025-02-03 09:28:04 -05:00
dependabot[bot]
c0d7c9fc7d chore(deps): bump google-api-python-client from 2.159.0 to 2.160.0 (#6720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-03 09:27:17 -05:00
Pablo Lara
bb4685cf90 fix(findings): remove default status filtering (#6784) 2025-02-03 15:20:18 +01:00
Pablo Lara
6a95426749 fix(findings): order findings by inserted_at DESC (#6782) 2025-02-03 11:51:07 +01:00
Víctor Fernández Poyatos
ef6af8e84d feat(schedules): Rework daily schedule to always show the next scan (#6700) 2025-02-03 11:08:27 +01:00
Víctor Fernández Poyatos
763130f253 fix(celery): Kill celery worker process after every task to release memory (#6761) 2025-01-31 19:30:08 +05:45
Hugo Pereira Brito
1256c040e9 fix: microsoft365 mutelist (#6724) 2025-01-31 12:32:39 +01:00
dependabot[bot]
18b7b48a99 chore(deps): bump microsoft-kiota-abstractions from 1.6.8 to 1.9.1 (#6734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-31 10:07:17 +01:00
Pepe Fagoaga
627c11503f fix(db_event): Handle other events (#6754) 2025-01-30 21:46:43 +05:45
Víctor Fernández Poyatos
712ba84f06 feat(scans): Optimize read queries during scans (#6753) 2025-01-30 20:51:12 +05:45
Pepe Fagoaga
5186e029b3 fix(set_report_color): Add more details to error (#6751) 2025-01-30 20:48:51 +05:45
Pablo Lara
5bfaedf903 fix: Enable hot reloading when using Docker Compose for UI (#6750) 2025-01-30 14:05:39 +01:00
Víctor Fernández Poyatos
5061da6897 feat(findings): Improve /findings/metadata performance (#6748) 2025-01-30 13:31:43 +01:00
Pepe Fagoaga
c159a28016 fix(neptune): correct service name (#6743) 2025-01-30 17:16:18 +05:45
Pepe Fagoaga
82a1b1c921 fix(finding): raise when generating invalid findings (#6738) 2025-01-30 15:59:38 +05:45
Pepe Fagoaga
bf2210d0f4 fix(acm): Key Error DomainName (#6739) 2025-01-30 15:54:31 +05:45
Kay Agahd
8f0772cb94 fix(aws): iam_user_with_temporary_credentials resource in OCSF (#6697)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2025-01-30 15:28:21 +05:45
Pepe Fagoaga
5b57079ecd fix(sns): Add region to subscriptions (#6731) 2025-01-30 14:38:21 +05:45
Matt Johnson
350d759517 chore: Update Google Analytics ID across all docs.prowler.com sites. (#6730) 2025-01-30 12:47:01 +05:45
Pablo Lara
edd793c9f5 fix(scans): change label for next scan (#6725) 2025-01-29 10:46:49 +01:00
Víctor Fernández Poyatos
545c2dc685 fix(migrations): Use indexes instead of constraints to define an index (#6722) 2025-01-29 14:24:04 +05:45
Víctor Fernández Poyatos
84955c066c revert: Update Django DB manager to use psycopg3 and connection pooling (#6717) 2025-01-28 22:15:01 +05:45
Víctor Fernández Poyatos
06dd03b170 fix(scan-summaries): Improve efficiency on providers overview (#6716) 2025-01-28 21:56:29 +05:45
Pedro Martín
47bc2ed2dc fix(defender): add field to SecurityContacts (#6693) 2025-01-28 15:52:56 +01:00
Pablo Lara
44281afc54 fix(scans): filters and sorting for scan table (#6713) 2025-01-28 13:26:31 +01:00
Víctor Fernández Poyatos
4d2859d145 fix(scans, findings): Improve API performance ordering by inserted_at instead of id (#6711) 2025-01-28 16:41:58 +05:45
Pablo Lara
45d44a1669 fix: fixed bug when opening finding details while a scan is in progress (#6708) 2025-01-28 06:58:18 +01:00
dependabot[bot]
ddd83b340e chore(deps): bump uuid from 10.0.0 to 11.0.5 in /ui (#6516)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-26 13:39:42 +01:00
Mario Rodriguez Lopez
ccdb54d7c3 feat(m365): add Microsoft 365 provider (#5902)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-24 13:14:17 -05:00
Rubén De la Torre Vico
bcc246d950 fix(cloudsql): add trusted client certificates case for cloudsql_instance_ssl_connections (#6682) 2025-01-24 10:42:45 -05:00
dependabot[bot]
62139e252a chore(deps): bump azure-mgmt-web from 7.3.1 to 8.0.0 (#6680)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 12:40:11 +01:00
dependabot[bot]
86950c3a0a chore(deps): bump msgraph-sdk from 1.17.0 to 1.18.0 (#6679)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 10:47:09 +01:00
dependabot[bot]
f4865ef68d chore(deps): bump azure-storage-blob from 12.24.0 to 12.24.1 (#6666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-24 09:44:16 +01:00
Pepe Fagoaga
ea7209e7ae chore: bump for next minor (#6672) 2025-01-23 13:13:08 -05:00
Hugo Pereira Brito
998c551cf3 fix(cloudwatch): NoneType object is not iterable (#6671) 2025-01-23 12:27:07 -05:00
Paolo Frigo
e6f29b0116 docs: update # of checks, services, frameworks and categories (#6528)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 11:11:03 -05:00
Pepe Fagoaga
eb90bb39dc chore(api): Bump to v1.3.0 (#6670) 2025-01-23 21:25:29 +05:45
Pepe Fagoaga
ad189b35ad chore(scan): Remove ._findings (#6667) 2025-01-23 20:43:02 +05:45
Pablo Lara
7d2989a233 chore: adjust DateWithTime component height when used with InfoField (#6669) 2025-01-23 15:18:24 +01:00
Pablo Lara
862137ae7d chore(scans): improve scan details (#6665) 2025-01-23 13:20:41 +01:00
Pedro Martín
c86e082d9a feat(detect-secrets): get secrets plugins from config.yaml (#6544)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-23 17:18:19 +05:45
Sergio Garcia
80fe048f97 feat(resource metadata): add resource metadata to JSON OCSF (#6592)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-01-23 16:06:30 +05:45
dependabot[bot]
f2bffb3ce7 chore(deps): bump azure-mgmt-containerservice from 33.0.0 to 34.0.0 (#6630)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 16:37:07 -05:00
dependabot[bot]
cbe2f9eef8 chore(deps): bump azure-mgmt-compute from 33.1.0 to 34.0.0 (#6628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 20:00:56 +01:00
Pepe Fagoaga
688f41f570 fix(templates): Customize principals and add validation (#6655) 2025-01-22 21:47:57 +05:45
Anton Rubets
a29197637e chore(helm): Add prowler helm support (#6580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-22 10:55:26 -05:00
Prowler Bot
7a2712a37f chore(regions_update): Changes in regions for AWS services (#6652)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-22 09:30:03 -05:00
dependabot[bot]
189f5cfd8c chore(deps): bump boto3 from 1.35.94 to 1.35.99 (#6651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-22 09:29:41 -05:00
Kay Agahd
e509480892 fix: add detector and line number of potential secret (#6654) 2025-01-22 20:13:23 +05:45
Pepe Fagoaga
7f7955351a chore(pre-commit): poetry checks for API and SDK (#6658) 2025-01-22 20:05:26 +05:45
Pepe Fagoaga
46f1db21a8 chore(api): Use prowler from master (#6657) 2025-01-22 20:05:02 +05:45
Pablo Lara
fbe7bc6951 feat(providers): show the cloud formation and terraform template links on the form (#6660) 2025-01-22 14:49:38 +01:00
Pablo Lara
f658507847 feat(providers): make external id field mandatory in the aws role secret form (#6656) 2025-01-22 12:45:31 +01:00
dependabot[bot]
374078683b chore(deps-dev): bump moto from 5.0.16 to 5.0.27 (#6632)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-21 13:56:06 -05:00
dependabot[bot]
114c4e0886 chore(deps): bump botocore from 1.35.94 to 1.35.99 (#6520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-21 09:17:18 -05:00
Pablo Lara
67c62766d4 fix(filters): fix dynamic filters (#6642) 2025-01-21 13:33:27 +01:00
dependabot[bot]
3f2947158d chore(deps): bump prowler from 5.1.1 to 5.1.4 in /api (#6641)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-21 14:27:59 +05:45
dependabot[bot]
278a7cb356 chore(deps-dev): bump mkdocs-material from 9.5.49 to 9.5.50 (#6631)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-20 18:31:44 -05:00
Rubén De la Torre Vico
890158a79c fix(OCSF): fix OCSF output when timestamp is UNIX format (#6606) 2025-01-20 17:11:28 -05:00
Rubén De la Torre Vico
4dc1602b77 fix: update Azure CIS with existing App checks (#6611) 2025-01-20 15:12:00 -05:00
Kay Agahd
bbba0abac9 fix(aws): list tags for DocumentDB clusters (#6605) 2025-01-20 15:10:58 -05:00
Prowler Bot
d04fd807c6 chore(regions_update): Changes in regions for AWS services (#6599)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-20 15:09:35 -05:00
Pablo Lara
3456df4cf1 fix(snippet-id): improve provider ID readability in tables (#6615) 2025-01-20 17:23:19 +01:00
Pablo Lara
f56aaa791e chore(RBAC): add permission's info (#6612) 2025-01-20 16:14:48 +01:00
Adrián Jesús Peña Rodríguez
465a758770 fix(rbac): remove invalid required permission (#6608) 2025-01-20 15:21:52 +01:00
Pablo Lara
0f7c0c1b2c fix(RBAC): tweaks for edit role form (#6609) 2025-01-20 14:09:16 +01:00
Adrián Jesús Peña Rodríguez
bf8d10b6f6 feat(api): restrict the deletion of users, only the user of the request can be deleted (#6607) 2025-01-20 13:26:47 +01:00
Pablo Lara
20d04553d6 fix(RBAC): restore manage_account permission for roles (#6602) 2025-01-20 11:35:29 +01:00
Daniel Barranquero
b56d62e3c4 fix(sqs): fix flaky test (#6593) 2025-01-17 11:48:39 -05:00
Hugo Pereira Brito
9a332dcba1 chore(services): delete all comment headers (#6585) 2025-01-17 08:21:28 -05:00
Hugo Pereira Brito
166d9f8823 fix(apigatewayv2): managed exception NotFoundException (#6576) 2025-01-17 08:17:51 -05:00
Prowler Bot
42f5eed75f chore(regions_update): Changes in regions for AWS services (#6577)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-17 08:17:00 -05:00
Rubén De la Torre Vico
01a7db18dd fix: add missing Check_Report_Azure parameters (#6583) 2025-01-17 08:16:43 -05:00
Pablo Lara
d4507465a3 fix(providers): update the label and placeholder based on the cloud provider (#6581) 2025-01-17 12:28:38 +01:00
Pablo Lara
3ac92ed10a fix(findings): remove filter delta_in applied by default (#6578) 2025-01-17 11:03:12 +01:00
Pablo Lara
43c76ca85c feat(findings): add first seen in findings details (#6575) 2025-01-17 10:19:10 +01:00
dependabot[bot]
54d87fa96a chore(deps): bump prowler from 5.0.2 to 5.1.1 in /api (#6573)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-17 13:26:07 +05:45
Daniel Barranquero
f041f17268 fix(gcp): fix flaky tests from dns service (#6569) 2025-01-16 14:49:25 -05:00
dependabot[bot]
31c80a6967 chore(deps): bump msgraph-sdk from 1.16.0 to 1.17.0 (#6547)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-16 12:55:30 -05:00
Rubén De la Torre Vico
783ce136f4 feat(network): extract Network resource metadata automated (#6555)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 12:41:02 -05:00
Rubén De la Torre Vico
f829145781 feat(storage): extract Storage resource metadata automated (#6563)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 11:44:43 -05:00
Rubén De la Torre Vico
389337f8cd feat(vm): extract VM resource metadata automated (#6564) 2025-01-16 11:16:02 -05:00
Pedro Martín
a0713c2d66 fix(cis): add subsections if needed (#6559) 2025-01-16 11:10:54 -05:00
Rubén De la Torre Vico
f94d3cbce4 feat(sqlserver): extract SQL Server resource metadata automated (#6562) 2025-01-16 10:47:21 -05:00
Daniel Barranquero
8d8994b468 feat(aws): include resource metadata to remaining checks (#6551)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-16 10:44:14 -05:00
Rubén De la Torre Vico
784a9097a5 feat(postgresql): extract PostgreSQL resource metadata automated (#6560) 2025-01-16 10:37:55 -05:00
Pedro Martín
b9601626e3 fix(detect_secrets): refactor logic for detect-secrets (#6537) 2025-01-16 21:15:44 +05:45
Rubén De la Torre Vico
dc80b011f2 feat(policy): extract Policy resource metadata automated (#6558) 2025-01-16 10:29:28 -05:00
Rubén De la Torre Vico
ee7d32d460 feat(entra): extract Entra resource metadata automated (#6542)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 10:24:53 -05:00
Rubén De la Torre Vico
43fd9ee94e feat(monitor): extract monitor resource metadata automated (#6554)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-16 10:16:19 -05:00
Víctor Fernández Poyatos
8821a91f3f feat(db): Update Django DB manager to use psycopg3 and connection pooling (#6541) 2025-01-16 15:29:02 +01:00
Rubén De la Torre Vico
98d9256f92 feat(mysql): extract MySQL resource metadata automated (#6556) 2025-01-16 09:24:06 -05:00
Rubén De la Torre Vico
b35495eaa7 feat(keyvault): extract KeyVault resource metadata automated (#6553) 2025-01-16 09:17:36 -05:00
Rubén De la Torre Vico
74d6b614b3 feat(iam): extract IAM resource metadata automated (#6552) 2025-01-16 09:05:23 -05:00
Sergio Garcia
dd63c16a74 fix(gcp): iterate through service projects (#6549)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-01-16 08:52:52 -05:00
Pablo Lara
4280266a96 fix(dep): address compatibility issues (#6543) 2025-01-16 14:28:49 +01:00
Hugo Pereira Brito
b1f02098ff feat(aws): include resource metadata in services from r* to s* (#6536)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-15 18:10:53 -05:00
Pedro Martín
95189b574a feat(gcp): add resource metadata to report (#6500)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-15 18:09:35 -05:00
Hugo Pereira Brito
c5d23503bf feat(aws): include resource metadata in services from a* to b* (#6504)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-15 18:03:37 -05:00
Daniel Barranquero
77950f6069 chore(aws): add resource metadata to services from t to w (#6546) 2025-01-15 17:22:08 -05:00
Daniel Barranquero
ec5f2b3753 chore(aws): add resource metadata to services from f to o (#6545) 2025-01-15 17:15:50 -05:00
Rubén De la Torre Vico
9e7104fb7f feat(defender): extract Defender resource metadata in automated way (#6538) 2025-01-15 12:14:24 -05:00
Rubén De la Torre Vico
6b3b6ca45e feat(appinsights): extract App Insights resource metadata in automated way (#6540) 2025-01-15 11:45:23 -05:00
Hugo Pereira Brito
20b8b0b24e feat: add resource metadata to emr_cluster_account_public_block_enabled (#6539) 2025-01-15 11:44:51 -05:00
Sergio Garcia
4e11540458 feat(kubernetes): add resource metadata to report (#6479) 2025-01-15 11:36:09 -05:00
Hugo Pereira Brito
ee87f2676d feat(aws): include resource metadata in services from d* to e* (#6532)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-15 10:05:04 -05:00
Daniel Barranquero
74a90aab98 feat(aws): add resource metadata to all services starting with c (#6493) 2025-01-15 09:04:19 -05:00
Rubén De la Torre Vico
48ff9a5100 feat(cosmosdb): extract CosmosDB resource metadata in automated way (#6533) 2025-01-15 08:51:48 -05:00
Rubén De la Torre Vico
3dfd578ee5 feat(containerregistry): extract Container Registry resource metadata in automated way (#6530) 2025-01-15 08:51:16 -05:00
Rubén De la Torre Vico
0db46cdc81 feat(azure-app): extract Web App resource metadata in automated way (#6529) 2025-01-15 08:48:36 -05:00
Prowler Bot
fdac58d031 chore(regions_update): Changes in regions for AWS services (#6526)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-15 08:46:35 -05:00
dependabot[bot]
df9d4ce856 chore(deps): bump google-api-python-client from 2.158.0 to 2.159.0 (#6521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 08:33:47 -05:00
Pedro Martín
e6ae4e97e8 docs(readme): update pr template to add check for readme (#6531) 2025-01-15 12:12:45 +01:00
Adrián Jesús Peña Rodríguez
10a4c28922 feat(finding): add first_seen attribute (#6460) 2025-01-15 11:25:41 +01:00
dependabot[bot]
8a828c6e51 chore(deps): bump django from 5.1.4 to 5.1.5 in /api (#6519)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 10:52:11 +01:00
Víctor Fernández Poyatos
d7b40905ff feat(findings): Add resource_tag filters for findings endpoint (#6527) 2025-01-15 10:30:36 +01:00
Adrián Jesús Peña Rodríguez
f9a3b5f3cd feat(provider-secret): make existing external_id field mandatory (#6510) 2025-01-15 10:14:44 +01:00
Pablo Lara
b73b89242f feat(filters): add resource type filter for findings (#6524) 2025-01-15 08:40:53 +01:00
dependabot[bot]
23a0f6e8de chore(deps-dev): bump eslint-config-prettier from 9.1.0 to 10.0.1 in /ui (#6518)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-15 06:55:25 +01:00
Pedro Martín
87967abc3f feat(kubernetes): add CIS 1.10 compliance (#6508)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-14 14:16:00 -05:00
Rubén De la Torre Vico
ce60c286dc feat(aks): use Check_Report_Azure constructor properly in AKS checks (#6509) 2025-01-14 14:14:02 -05:00
Pepe Fagoaga
90fd9b0eb8 chore(version): set next minor (#6511) 2025-01-14 14:06:24 -05:00
Prowler Bot
ca262a6797 chore(regions_update): Changes in regions for AWS services (#6495)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-14 12:43:44 -05:00
Rubén De la Torre Vico
c056d39775 feat(aisearch): use Check_Report_Azure constructor properly in AISearch checks (#6506) 2025-01-14 12:37:01 -05:00
johannes-engler-mw
1c4426ea4b fix(Azure TDE): add filter for master DB (#6351) 2025-01-14 12:34:52 -05:00
Pedro Martín
36520bd7a1 feat(azure): add CIS 3.0 for Azure (#5226) 2025-01-14 12:07:22 -05:00
Pepe Fagoaga
badf0ace76 feat(prowler-role): Add templates to deploy it in AWS (#6499) 2025-01-14 12:04:20 -05:00
Rubén De la Torre Vico
f1f61249e0 feat(azure): include resource metadata in Check_Report_Azure (#6505) 2025-01-14 11:32:40 -05:00
dependabot[bot]
b371cac18c chore(deps): bump jinja2 from 3.1.4 to 3.1.5 (#6457)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-14 10:03:45 -05:00
Víctor Fernández Poyatos
1846535d8d feat(findings): add /findings/metadata to retrieve dynamic filters information (#6503) 2025-01-14 15:30:03 +01:00
dependabot[bot]
d7d9118b9b chore(deps-dev): bump bandit from 1.8.0 to 1.8.2 (#6485)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-14 08:49:37 -05:00
559 changed files with 4857 additions and 40744 deletions

.env (2 changed lines)

@@ -123,7 +123,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.5.1
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
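This hunk touches the variable that appears to drive the release banner in the UI (the `NEXT_PUBLIC_` prefix suggests it is exposed to the Next.js client). A minimal sketch of pinning it for a single run, assuming the compose file interpolates the variable from the shell environment, which this diff does not show:

```console
# Pin the displayed release version for one run (sketch; assumes the compose
# file interpolates NEXT_PUBLIC_PROWLER_RELEASE_VERSION from the environment)
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0 docker compose up -d
```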


@@ -16,6 +16,7 @@ Please include a summary of the change and which issue is fixed. List any depend
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
#### API
- [ ] Verify if API specs need to be regenerated.


@@ -81,7 +81,7 @@ jobs:
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
@@ -94,7 +94,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
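Both hunks in this file swap the pinned commit SHA together with its version comment. The comment is documentation only, so when reviewing such bumps it is worth confirming the SHA really is the tagged commit; a hedged sketch using the repository and tag from the diff:

```console
# List the tag refs; for an annotated tag, the peeled "^{}" line is the commit
# SHA that should match the one pinned in the workflow
git ls-remote --tags https://github.com/docker/build-push-action v6.16.0
```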


@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -85,14 +85,6 @@ jobs:
api/README.md
api/mkdocs.yml
- name: Replace @master with current branch in pyproject.toml
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
- name: Install poetry
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -100,15 +92,9 @@ jobs:
python -m pip install --upgrade pip
pipx install poetry==2.1.1
- name: Update poetry.lock after the branch name change
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry lock
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -159,7 +145,7 @@ jobs:
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 70612,66963,74429
poetry run safety check --ignore 70612,66963
- name: Vulture
working-directory: ./api
@@ -193,7 +179,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false


@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@b06f6d72a3791308bb7ba59c2b8cb7a083bd17e4 # v3.88.26
uses: trufflesecurity/trufflehog@690e5c7aff8347c3885096f3962a0633d9129607 # v3.88.23
with:
path: ./
base: ${{ github.event.repository.default_branch }}
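The scan this workflow runs can be reproduced locally before pushing; a sketch assuming TruffleHog v3 (check `trufflehog git --help` on your installed version, since flags have changed across releases):

```console
# Scan the local clone for secrets introduced since the default branch,
# mirroring the workflow's path/base inputs; --fail returns a non-zero
# exit code when findings exist
trufflehog git file://. --since-commit master --fail
```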


@@ -12,8 +12,6 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ github.event.pull_request.merge_commit_sha }}
- name: Set short git commit SHA
id: vars
@@ -32,6 +30,5 @@ jobs:
"PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
"PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
"PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }},
"PROWLER_PR_URL":${{ toJson(github.event.pull_request.html_url) }}
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }}
}'


@@ -62,7 +62,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
@@ -127,7 +127,7 @@ jobs:
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
push: true
tags: |
@@ -140,7 +140,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context


@@ -1,145 +0,0 @@
name: SDK - Bump Version
on:
release:
types: [published]
env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
jobs:
bump-version:
name: Bump Version
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Prowler version
shell: bash
run: |
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
MAJOR_VERSION=${BASH_REMATCH[1]}
MINOR_VERSION=${BASH_REMATCH[2]}
FIX_VERSION=${BASH_REMATCH[3]}
# Export version components to GitHub environment
echo "MAJOR_VERSION=${MAJOR_VERSION}" >> "${GITHUB_ENV}"
echo "MINOR_VERSION=${MINOR_VERSION}" >> "${GITHUB_ENV}"
echo "FIX_VERSION=${FIX_VERSION}" >> "${GITHUB_ENV}"
if (( MAJOR_VERSION == 5 )); then
if (( FIX_VERSION == 0 )); then
echo "Minor Release: $PROWLER_VERSION"
# Set up next minor version for master
BUMP_VERSION_TO=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).${FIX_VERSION}
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=${BASE_BRANCH}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
# Set up patch version for version branch
PATCH_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.1
echo "PATCH_VERSION_TO=${PATCH_VERSION_TO}" >> "${GITHUB_ENV}"
VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next minor version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
echo "Bumping to next patch version: ${PATCH_VERSION_TO} in branch ${VERSION_BRANCH}"
else
echo "Patch Release: $PROWLER_VERSION"
BUMP_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.$((FIX_VERSION + 1))
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next patch version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
fi
else
echo "Releasing another Prowler major version, aborting..."
exit 1
fi
else
echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
exit 1
fi
- name: Bump versions in files
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using BUMP_VERSION_TO=$BUMP_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${BUMP_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${BUMP_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${BUMP_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.TARGET_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.BUMP_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: Handle patch version for minor release
if: env.FIX_VERSION == '0'
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using PATCH_VERSION_TO=$PATCH_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${PATCH_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${PATCH_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PATCH_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request for patch version
if: env.FIX_VERSION == '0'
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
branch: "version-bump-to-v${{ env.PATCH_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.PATCH_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
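For reference, the version-parsing logic of the workflow deleted above (its header reads `@@ -1,145 +0,0 @@`, i.e. the whole file goes away) can be exercised standalone. A sketch that keeps the same regex and arithmetic but only prints the computed targets, omitting the major-version gate and the file edits; the file name `bump.sh` is illustrative:

```console
#!/usr/bin/env bash
# bump.sh - sketch of the deleted bump logic; prints targets instead of editing files
set -euo pipefail
PROWLER_VERSION="${1:?usage: bump.sh N.N.N}"
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
  MAJOR=${BASH_REMATCH[1]}; MINOR=${BASH_REMATCH[2]}; FIX=${BASH_REMATCH[3]}
  if (( FIX == 0 )); then
    # Minor release: master moves to the next minor, the version branch gets a patch
    echo "master -> ${MAJOR}.$((MINOR + 1)).0"
    echo "v${MAJOR}.${MINOR} -> ${MAJOR}.${MINOR}.1"
  else
    # Patch release: only the version branch moves
    echo "v${MAJOR}.${MINOR} -> ${MAJOR}.${MINOR}.$((FIX + 1))"
  fi
else
  echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
  exit 1
fi
```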


@@ -56,12 +56,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -51,7 +51,7 @@ jobs:
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -107,123 +107,11 @@ jobs:
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
# Test AWS
- name: AWS - Check if any file has changed
id: aws-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/aws/**
./tests/providers/aws/**
.poetry.lock
- name: AWS - Test
if: steps.aws-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
# Test Azure
- name: Azure - Check if any file has changed
id: azure-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/azure/**
./tests/providers/azure/**
.poetry.lock
- name: Azure - Test
if: steps.azure-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
# Test GCP
- name: GCP - Check if any file has changed
id: gcp-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/gcp/**
./tests/providers/gcp/**
.poetry.lock
- name: GCP - Test
if: steps.gcp-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
# Test Kubernetes
- name: Kubernetes - Check if any file has changed
id: kubernetes-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/kubernetes/**
./tests/providers/kubernetes/**
.poetry.lock
- name: Kubernetes - Test
if: steps.kubernetes-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
# Test GitHub
- name: GitHub - Check if any file has changed
id: github-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/github/**
./tests/providers/github/**
.poetry.lock
- name: GitHub - Test
if: steps.github-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github
# Test NHN
- name: NHN - Check if any file has changed
id: nhn-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/nhn/**
./tests/providers/nhn/**
.poetry.lock
- name: NHN - Test
if: steps.nhn-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
# Test M365
- name: M365 - Check if any file has changed
id: m365-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/m365/**
./tests/providers/m365/**
.poetry.lock
- name: M365 - Test
if: steps.m365-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Common Tests
- name: Lib - Test
- name: Test with pytest
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
- name: Config - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
@@ -231,4 +119,3 @@ jobs:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler
files: ./aws_coverage.xml,./azure_coverage.xml,./gcp_coverage.xml,./kubernetes_coverage.xml,./github_coverage.xml,./nhn_coverage.xml,./m365_coverage.xml,./lib_coverage.xml,./config_coverage.xml
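The per-provider test steps in this hunk all follow one pattern, which remains handy for running a single provider's suite locally; the AWS variant, copied from the step above:

```console
poetry run pytest -n auto --cov=./prowler/providers/aws \
  --cov-report=xml:aws_coverage.xml tests/providers/aws
```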


@@ -71,7 +71,7 @@ jobs:
pipx install poetry==2.1.1
- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
# cache: ${{ env.CACHE }}


@@ -28,7 +28,7 @@ jobs:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: 3.9 #install the python needed


@@ -81,7 +81,7 @@ jobs:
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
@@ -96,7 +96,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |


@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -50,7 +50,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target

.gitignore (3 changed lines)

@@ -42,9 +42,6 @@ junit-reports/
# VSCode files
.vscode/
# Cursor files
.cursorignore
# Terraform
.terraform*
*.tfstate


@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963,74429'
entry: bash -c 'safety check --ignore 70612,66963'
language: system
- id: vulture
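This is the same `safety` ignore-list edit applied to the CI job earlier in this diff; the hook's command can be run by hand (IDs copied from the longer variant above):

```console
safety check --ignore 70612,66963,74429
```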


@@ -1,43 +1,24 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12.10-alpine3.20
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
ARG POWERSHELL_VERSION=7.5.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git gcc python3-dev musl-dev linux-headers
# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler
WORKDIR /home/prowler
# Copy necessary files
WORKDIR /home/prowler
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py
# Install Python dependencies
ENV HOME='/home/prowler'
@@ -53,9 +34,6 @@ RUN pip install --no-cache-dir --upgrade pip && \
RUN poetry install --compile && \
rm -rf ~/.cache/pip
# Install PowerShell modules
RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y
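The PowerShell install step in this hunk keys off `uname -m`; the architecture mapping, restated as a standalone check (a sketch, not part of the diff):

```console
# Mirror of the install step's architecture switch (PowerShell 7.5.0, per the hunk)
ARCH=$(uname -m)
case "$ARCH" in
  x86_64)  echo "would fetch powershell-7.5.0-linux-x64.tar.gz" ;;
  aarch64) echo "would fetch powershell-7.5.0-linux-arm64.tar.gz" ;;
  *)       echo "Unsupported architecture: $ARCH" >&2 ;;
esac
```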

README.md (157 changed lines)

@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment it secures. It is trusted by the industry leaders to uphold the highest standards in security.
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment they're meant to protect. Trusted by the leaders in security.
</p>
<p align="center">
<b>Learn more at <a href="https://prowler.com">prowler.com</i></b>
@@ -43,29 +43,15 @@
# Description
**Prowler** is an open-source security tool designed to assess and enforce security best practices across AWS, Azure, Google Cloud, and Kubernetes. It supports tasks such as security audits, incident response, continuous monitoring, system hardening, forensic readiness, and remediation processes.
Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:
- **Industry Standards:** CIS, NIST 800, NIST CSF, and CISA
- **Regulatory Compliance and Governance:** RBI, FedRAMP, and PCI-DSS
- **Frameworks for Sensitive Data and Privacy:** GDPR, HIPAA, and FFIEC
- **Frameworks for Organizational Governance and Quality Control:** SOC2 and GXP
- **AWS-Specific Frameworks:** AWS Foundational Technical Review (FTR) and AWS Well-Architected Framework (Security Pillar)
- **National Security Standards:** ENS (Spanish National Security Scheme)
- **Custom Security Frameworks:** Tailored to your needs
## Prowler CLI and Prowler Cloud
Prowler offers a Command Line Interface (CLI), known as Prowler Open Source, and an additional service built on top of it, called <a href="https://prowler.com">Prowler Cloud</a>.
**Prowler** is an Open Source security tool to perform AWS, Azure, Google Cloud and Kubernetes security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness, and also remediations! We have Prowler CLI (Command Line Interface) that we call Prowler Open Source and a service on top of it that we call <a href="https://prowler.com">Prowler Cloud</a>.
## Prowler App
Prowler App is a web-based application that simplifies running Prowler across your cloud provider accounts. It provides a user-friendly interface to visualize the results and streamline your security assessments.
Prowler App is a web application that allows you to run Prowler in your cloud provider accounts and visualize the results in a user-friendly interface.
![Prowler App](docs/img/overview.png)
>For more details, refer to the [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
>More details at [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
## Prowler CLI
@@ -74,7 +60,6 @@ prowler <provider>
```
![Prowler CLI Execution](docs/img/short-display.png)
## Prowler Dashboard
```console
@@ -82,28 +67,26 @@ prowler dashboard
```
![Prowler Dashboard](docs/img/dashboard.png)
# Prowler at a Glance
It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks.
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 564 | 82 | 34 | 10 |
| AWS | 564 | 82 | 33 | 10 |
| GCP | 79 | 13 | 7 | 3 |
| Azure | 140 | 18 | 8 | 3 |
| Kubernetes | 83 | 7 | 4 | 7 |
| GitHub | 3 | 2 | 1 | 0 |
| M365 | 44 | 2 | 2 | 0 |
| M365 | 5 | 2 | 1 | 0 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories: `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
# 💻 Installation
## Prowler App
Installing Prowler App
Prowler App offers flexible installation methods tailored to various environments:
Prowler App can be installed in different ways, depending on your environment:
> For detailed instructions on using Prowler App, refer to the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
> See how to use Prowler App in the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
### Docker Compose
@@ -119,16 +102,8 @@ curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/mast
docker compose up -d
```
> Containers are built for `linux/amd64`.
### Configuring Your Workstation for Prowler App
If your workstation's architecture is incompatible, you can resolve this by:
- **Setting the environment variable**: `DOCKER_DEFAULT_PLATFORM=linux/amd64`
- **Using the following flag in your Docker command**: `--platform linux/amd64`
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
### From GitHub
@@ -154,12 +129,12 @@ python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> After completing the setup, access the API documentation at http://localhost:8080/api/v1/docs.
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
**Commands to run the API Worker**
@@ -197,31 +172,29 @@ npm run build
npm start
```
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >3.9.1, <3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python > 3.9.1, < 3.13:
```console
pip install prowler
prowler -v
```
>For further guidance, refer to [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
### Containers
**Available Versions of Prowler CLI**
The available versions of Prowler CLI are the following:
The following versions of Prowler CLI are available, depending on your requirements:
- `latest`: Synchronizes with the `master` branch. Note that this version is not stable.
- `v4-latest`: Synchronizes with the `v4` branch. Note that this version is not stable.
- `v3-latest`: Synchronizes with the `v3` branch. Note that this version is not stable.
- `<x.y.z>` (release): Stable releases corresponding to specific versions. You can find the complete list of releases [here](https://github.com/prowler-cloud/prowler/releases).
- `stable`: Always points to the latest release.
- `v4-stable`: Always points to the latest release for v4.
- `v3-stable`: Always points to the latest release for v3.
- `latest`: in sync with `master` branch (bear in mind that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (bear in mind that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always points to the latest release.
- `v4-stable`: this tag always points to the latest release for v4.
- `v3-stable`: this tag always points to the latest release for v3.
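For example, pulling by tag looks like this; the registry path is a placeholder, see the image list below:

```console
docker pull <registry>/prowler:stable    # always the latest release
docker pull <registry>/prowler:<x.y.z>   # a pinned, specific release
```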
The container images are available here:
- Prowler CLI:
@@ -233,7 +206,7 @@ The container images are available here:
### From GitHub
Python >3.9.1, <3.13 is required with pip and Poetry:
Python > 3.9.1, < 3.13 is required with pip and poetry:
``` console
git clone https://github.com/prowler-cloud/prowler
@@ -243,46 +216,25 @@ poetry install
python prowler-cli.py -v
```
> [!IMPORTANT]
> To clone Prowler on Windows, configure Git to support long file paths by running the following command: `git config core.longpaths true`.
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
# ✏️ High level architecture
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
## Prowler App
**Prowler App** is composed of three key components:
The **Prowler App** consists of three main components:
- **Prowler UI**: A web-based interface, built with Next.js, providing a user-friendly experience for executing Prowler scans and visualizing results.
- **Prowler API**: A backend service, developed with Django REST Framework, responsible for running Prowler scans and storing the generated results.
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
- **Prowler UI**: A user-friendly web interface for running Prowler and viewing results, powered by Next.js.
- **Prowler API**: The backend API that executes Prowler scans and stores the results, built with Django REST Framework.
- **Prowler SDK**: A Python SDK that integrates with the Prowler CLI for advanced functionality.
![Prowler App Architecture](docs/img/prowler-app-architecture.png)
## Prowler CLI
**Running Prowler**
Prowler can be executed across various environments, offering flexibility to meet your needs. It can be run from:
- Your own workstation
- A Kubernetes Job
- Google Compute Engine
- Azure Virtual Machines (VMs)
- Amazon EC2 instances
- AWS Fargate or other container platforms
- CloudShell
And many more environments.
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine instance, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.
![Architecture](docs/img/architecture.png)
@@ -290,36 +242,23 @@ And many more environments.
## General
- `Allowlist` now is called `Mutelist`.
- The `--quiet` option has been deprecated. Use the `--status` flag to filter findings based on their status: PASS, FAIL, or MANUAL.
- All findings with an `INFO` status have been reclassified as `MANUAL`.
- The CSV output format is standardized across all providers.
- The `--quiet` option has been deprecated, now use the `--status` flag to select the findings' status you want to get from PASS, FAIL or MANUAL.
- The status of all `INFO` findings has changed to `MANUAL`.
- The CSV output format is common for all the providers.
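As a before/after sketch of the `--quiet` migration (provider argument illustrative):

```console
# Before (deprecated):
prowler aws --quiet
# Now, filter by finding status instead:
prowler aws --status FAIL
```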
**Deprecated Output Formats**
The following formats are now deprecated:
- Native JSON has been replaced with JSON in [OCSF](https://schema.ocsf.io/) v1.1.0 format, which is standardized across all providers.
We have deprecated some of our outputs formats:
- The native JSON is replaced for the JSON [OCSF](https://schema.ocsf.io/) v1.1.0, common for all the providers.
## AWS
**AWS Flag Deprecation**
The flag `--sts-endpoint-region` has been deprecated due to the adoption of AWS STS regional tokens.
**Sending FAIL Results to AWS Security Hub**
- To send only FAILS to AWS Security Hub, use one of the following options: `--send-sh-only-fails` or `--security-hub --status FAIL`.
- Deprecate the AWS flag `--sts-endpoint-region` since we use AWS STS regional tokens.
- To send only FAILS to AWS Security Hub, now use either `--send-sh-only-fails` or `--security-hub --status FAIL`.
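Both invocations below should be equivalent for this purpose (provider argument illustrative):

```console
prowler aws --security-hub --send-sh-only-fails
prowler aws --security-hub --status FAIL
```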
# 📖 Documentation
**Documentation Resources**
For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
Install, Usage, Tutorials and Developer Guide are at https://docs.prowler.com/
# 📃 License
**Prowler License Information**
Prowler is licensed under the Apache License 2.0, as indicated in each file within the repository.
**Obtaining a Copy of the License**
A copy of the License is available at <http://www.apache.org/licenses/LICENSE-2.0>
Prowler is licensed as Apache License 2.0 as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>

View File

@@ -80,7 +80,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963,74429'
entry: bash -c 'poetry run safety check --ignore 70612,66963'
language: system
- id: vulture

View File

@@ -2,31 +2,12 @@
All notable changes to the **Prowler API** are documented in this file.
## [v1.8.1] (Prowler v5.7.1)
### Fixed
- Added database index to improve performance on finding lookup. [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800)
---
## [v1.8.0] (Prowler v5.7.0)
### Added
- Added huge improvements to `/findings/metadata` and resource-related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743).
- Added export support for Prowler ThreatScore in M365 [(#7783)](https://github.com/prowler-cloud/prowler/pull/7783).
---
## [v1.7.0] (Prowler v5.6.0)
## [v1.7.0] (UNRELEASED)
### Added
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
- Added a `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
- Added a new API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
---
@@ -72,10 +53,6 @@ All notable changes to the **Prowler API** are documented in this file.
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
---
## [v1.5.0] (Prowler v5.4.0)

View File

@@ -1,20 +1,21 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
RUN apt-get update && apt-get install -y --no-install-recommends \
gcc python3-dev bash \
libcurl4-openssl-dev ca-certificates less ncurses-base \
krb5-multidev libgcc-s1 gettext libssl3 \
libstdc++6 tzdata liburcu8 zlib1g libicu72 curl openssh-client \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-x64.tar.gz -o /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-arm64.tar.gz -o /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
@@ -47,13 +48,19 @@ RUN poetry install --no-root && \
COPY docker-entrypoint.sh ./docker-entrypoint.sh
RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
WORKDIR /home/prowler/backend
# Development image
FROM build AS dev
USER root
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
curl vim \
&& rm -rf /var/lib/apt/lists/*
USER prowler
ENTRYPOINT ["../docker-entrypoint.sh", "dev"]
# Production image

View File

@@ -235,7 +235,6 @@ To view the logs for any component (e.g., Django, Celery worker), you can use th
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Applying migrations

View File

@@ -28,7 +28,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}
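The new `backfill` queue added above is consumed by the same worker; a minimal sketch, with hypothetical task and app-object names, of how a background task might be routed to it:

```python
# Assumes the Celery app lives in config.celery, as used by the worker command above;
# the app object name and the task name are illustrative.
from config.celery import celery_app

celery_app.send_task(
    "tasks.backfill_finding_filters",   # hypothetical task name
    kwargs={"tenant_id": "<tenant-uuid>"},
    queue="backfill",                   # route to the queue added in this diff
)
```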
start_worker_beat() {

api/poetry.lock generated
View File

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -880,6 +880,7 @@ description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
@@ -949,7 +950,6 @@ files = [
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[package.dependencies]
pycparser = "*"
@@ -1469,14 +1469,14 @@ with-social = ["django-allauth[socialaccount] (>=64.0.0)"]
[[package]]
name = "django"
version = "5.1.8"
version = "5.1.7"
description = "A high-level Python web framework that encourages rapid development and clean, pragmatic design."
optional = false
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "Django-5.1.8-py3-none-any.whl", hash = "sha256:11b28fa4b00e59d0def004e9ee012fefbb1065a5beb39ee838983fd24493ad4f"},
{file = "Django-5.1.8.tar.gz", hash = "sha256:42e92a1dd2810072bcc40a39a212b693f94406d0ba0749e68eb642f31dc770b4"},
{file = "Django-5.1.7-py3-none-any.whl", hash = "sha256:1323617cb624add820cb9611cdcc788312d250824f92ca6048fda8625514af2b"},
{file = "Django-5.1.7.tar.gz", hash = "sha256:30de4ee43a98e5d3da36a9002f287ff400b43ca51791920bfb35f6917bfe041c"},
]
[package.dependencies]
@@ -2237,14 +2237,14 @@ tornado = ["tornado (>=0.2)"]
[[package]]
name = "h11"
version = "0.16.0"
version = "0.14.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
optional = false
python-versions = ">=3.8"
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86"},
{file = "h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1"},
{file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
{file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
]
[[package]]
@@ -2277,19 +2277,19 @@ files = [
[[package]]
name = "httpcore"
version = "1.0.9"
version = "1.0.8"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55"},
{file = "httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8"},
{file = "httpcore-1.0.8-py3-none-any.whl", hash = "sha256:5254cf149bcb5f75e9d1b2b9f729ea4a4b883d1ad7379fc632b727cec23674be"},
{file = "httpcore-1.0.8.tar.gz", hash = "sha256:86e94505ed24ea06514883fd44d2bc02d90e77e7979c8eb71b90f41d364a1bad"},
]
[package.dependencies]
certifi = "*"
h11 = ">=0.16"
h11 = ">=0.13,<0.15"
[package.extras]
asyncio = ["anyio (>=4.0,<5.0)"]
@@ -3597,7 +3597,7 @@ files = [
[[package]]
name = "prowler"
version = "5.7.0"
version = "5.6.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">3.9.1,<3.13"
@@ -3645,7 +3645,6 @@ numpy = "2.0.2"
pandas = "2.2.3"
py-ocsf-models = "0.3.1"
pydantic = "1.10.21"
pygithub = "2.5.0"
python-dateutil = ">=2.9.0.post0,<3.0.0"
pytz = "2025.1"
schema = "0.7.7"
@@ -3657,8 +3656,8 @@ tzlocal = "5.3.1"
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "v5.7"
resolved_reference = "a3b606fc7124ce94f27ed2fd2ba8ad8f734a69d1"
reference = "master"
resolved_reference = "0edf19928269b6d42d1c463ab13b90d7c21a65e4"
[[package]]
name = "psutil"
@@ -3835,11 +3834,11 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[[package]]
name = "pycurl"
@@ -3950,26 +3949,6 @@ typing-extensions = ">=4.2.0"
dotenv = ["python-dotenv (>=0.10.4)"]
email = ["email-validator (>=1.0.3)"]
[[package]]
name = "pygithub"
version = "2.5.0"
description = "Use the full Github API v3"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyGithub-2.5.0-py3-none-any.whl", hash = "sha256:b0b635999a658ab8e08720bdd3318893ff20e2275f6446fcf35bf3f44f2c0fd2"},
{file = "pygithub-2.5.0.tar.gz", hash = "sha256:e1613ac508a9be710920d26eb18b1905ebd9926aa49398e88151c1b526aad3cf"},
]
[package.dependencies]
Deprecated = "*"
pyjwt = {version = ">=2.4.0", extras = ["crypto"]}
pynacl = ">=1.4.0"
requests = ">=2.14.0"
typing-extensions = ">=4.0.0"
urllib3 = ">=1.26.0"
[[package]]
name = "pygments"
version = "2.19.1"
@@ -4034,33 +4013,6 @@ tomlkit = ">=0.10.1"
spelling = ["pyenchant (>=3.2,<4.0)"]
testutils = ["gitpython (>3)"]
[[package]]
name = "pynacl"
version = "1.5.0"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
{file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
{file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
{file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
]
[package.dependencies]
cffi = ">=1.4.1"
[package.extras]
docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
[[package]]
name = "pyparsing"
version = "3.2.3"
@@ -4685,7 +4637,6 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -4694,7 +4645,6 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -4703,7 +4653,6 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -4712,7 +4661,6 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -4721,7 +4669,6 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
{file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
@@ -5536,4 +5483,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
content-hash = "db1beb68c9757678759b79a515ff19a21b1201502c1e7c24f579ccc47aef8644"
content-hash = "f3ede96fb76c14d02da37c1132b41a14fa4045787807446fac2cd1750be193e0"

View File

@@ -7,7 +7,7 @@ authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.8",
"django==5.1.7",
"django-allauth==65.4.1",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
@@ -23,7 +23,7 @@ dependencies = [
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.7",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
@@ -35,7 +35,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.8.1"
version = "1.6.0"
[project.scripts]
celery = "src.backend.config.settings.celery"

View File

@@ -1,38 +1,12 @@
from types import MappingProxyType
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
from api.models import Provider
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
provider will return the cached result.
Args:
provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
available compliance frameworks (e.g., "aws", "azure", "gcp", "m365").
Returns:
list[str]: A list of framework identifiers (e.g., "cis_1.4_aws", "mitre_attack_azure") available
for the given provider.
"""
global AVAILABLE_COMPLIANCE_FRAMEWORKS
if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
get_available_compliance_frameworks(provider_type)
)
return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]
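For instance (enum member name assumed), repeated lookups for the same provider are served from the module-level cache:

```python
# The first call loads and caches; the second returns the same cached list object.
frameworks = get_compliance_frameworks(Provider.ProviderChoices.AWS)
assert get_compliance_frameworks(Provider.ProviderChoices.AWS) is frameworks
```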
def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):

View File

@@ -227,77 +227,6 @@ def register_enum(apps, schema_editor, enum_class): # noqa: F841
register_adapter(enum_class, enum_adapter)
def create_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
columns: str,
method: str = "BTREE",
where: str = "",
):
"""
Create an index on every existing partition of `parent_table`.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: A short name for the index (will be prefixed per-partition).
columns: The parenthesized column list, e.g. "tenant_id, scan_id, status".
method: The index method—BTREE, GIN, etc. Defaults to BTREE.
where: Optional WHERE clause (without the leading "WHERE"), e.g. "status = 'FAIL'".
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
where_sql = f" WHERE {where}" if where else ""
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = (
f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {idx_name} "
f"ON {partition} USING {method} ({columns})"
f"{where_sql};"
)
schema_editor.execute(sql)
def drop_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
):
"""
Drop the per-partition indexes that were created by create_index_on_partitions.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: The same short name used when creating them.
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {idx_name};"
schema_editor.execute(sql)
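A sketch of how a migration could wire these helpers together (index name and columns illustrative; the real migrations later in this diff follow the same `RunPython`/`partial` pattern):

```python
from functools import partial

from django.db import migrations

from api.db_utils import create_index_on_partitions, drop_index_on_partitions

operations = [
    migrations.RunPython(
        partial(
            create_index_on_partitions,
            parent_table="findings",
            index_name="example_idx",     # illustrative name
            columns="tenant_id, status",  # illustrative columns
        ),
        reverse_code=partial(
            drop_index_on_partitions,
            parent_table="findings",
            index_name="example_idx",
        ),
    ),
]
```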
# Postgres enum definition for member role

View File

@@ -81,114 +81,6 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class CommonFindingFilters(FilterSet):
# We filter providers from the scan in findings
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(method="filter_resource_region")
region__in = CharInFilter(field_name="resource_regions", lookup_expr="overlap")
region__icontains = CharFilter(
field_name="resource_regions", lookup_expr="icontains"
)
service = CharFilter(method="filter_resource_service")
service__in = CharInFilter(field_name="resource_services", lookup_expr="overlap")
service__icontains = CharFilter(
field_name="resource_services", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(method="filter_resource_type")
resource_type__in = CharInFilter(field_name="resource_types", lookup_expr="overlap")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
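Roughly, these custom filter methods resolve to PostgreSQL ArrayField lookups on the denormalized columns (values illustrative):

```python
# Single-value filters use __contains on the array field...
Finding.objects.filter(resource_regions__contains=["eu-west-1"])
# ...while the __in variants use __overlap for an any-of match.
Finding.objects.filter(resource_services__overlap=["s3", "iam"])
```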
class TenantFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -365,7 +257,94 @@ class ResourceFilter(ProviderRelationshipFilterSet):
return queryset.filter(tags__text_search=value)
class FindingFilter(CommonFindingFilters):
class FindingFilter(FilterSet):
# We filter providers from the scan in findings
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(field_name="resources__region")
region__in = CharInFilter(field_name="resources__region", lookup_expr="in")
region__icontains = CharFilter(
field_name="resources__region", lookup_expr="icontains"
)
service = CharFilter(field_name="resources__service")
service__in = CharInFilter(field_name="resources__service", lookup_expr="in")
service__icontains = CharFilter(
field_name="resources__service", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(field_name="resources__type")
resource_type__in = CharInFilter(field_name="resources__type", lookup_expr="in")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
scan = UUIDFilter(method="filter_scan_id")
scan__in = UUIDInFilter(method="filter_scan_id_in")
@@ -406,15 +385,6 @@ class FindingFilter(CommonFindingFilters):
},
}
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
@@ -533,6 +503,16 @@ class FindingFilter(CommonFindingFilters):
return queryset.filter(id__lt=end)
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
@staticmethod
def maybe_date_to_datetime(value):
dt = value
@@ -541,31 +521,6 @@ class FindingFilter(CommonFindingFilters):
return dt
class LatestFindingFilter(CommonFindingFilters):
class Meta:
model = Finding
fields = {
"id": ["exact", "in"],
"uid": ["exact", "in"],
"delta": ["exact", "in"],
"status": ["exact", "in"],
"severity": ["exact", "in"],
"impact": ["exact", "in"],
"check_id": ["exact", "in", "icontains"],
}
filter_overrides = {
FindingDeltaEnumField: {
"filter_class": CharFilter,
},
StatusEnumField: {
"filter_class": CharFilter,
},
SeverityEnumField: {
"filter_class": CharFilter,
},
}
class ProviderSecretFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")

View File

@@ -12,7 +12,6 @@ from api.models import (
Provider,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StatusChoices,
)
@@ -134,7 +133,6 @@ class Command(BaseCommand):
region=random.choice(possible_regions),
service=random.choice(possible_services),
type=random.choice(possible_types),
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -183,10 +181,6 @@ class Command(BaseCommand):
"servicename": assigned_resource.service,
"resourcetype": assigned_resource.type,
},
resource_types=[assigned_resource.type],
resource_regions=[assigned_resource.region],
resource_services=[assigned_resource.service],
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -203,22 +197,12 @@ class Command(BaseCommand):
# Create ResourceFindingMapping
mappings = []
scan_resource_cache: set[tuple] = set()
for index, finding_instance in enumerate(findings):
resource_instance = resources[findings_resources_mapping[index]]
for index, f in enumerate(findings):
mappings.append(
ResourceFindingMapping(
tenant_id=tenant_id,
resource=resource_instance,
finding=finding_instance,
)
)
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
resource=resources[findings_resources_mapping[index]],
finding=f,
)
)
@@ -236,38 +220,6 @@ class Command(BaseCommand):
"Resource-finding mappings created successfully.\n\n"
)
)
with rls_transaction(tenant_id):
scan.progress = 99
scan.save()
self.stdout.write(self.style.WARNING("Creating finding filter values..."))
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=str(scan.id),
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
num_batches = ceil(len(resource_scan_summaries) / batch_size)
with rls_transaction(tenant_id):
for i in tqdm(
range(0, len(resource_scan_summaries), batch_size),
total=num_batches,
):
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries[i : i + batch_size],
ignore_conflicts=True,
)
self.stdout.write(
self.style.SUCCESS("Finding filter values created successfully.\n\n")
)
except Exception as e:
self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
scan_state = "failed"

View File

@@ -25,8 +25,4 @@ class Migration(migrations.Migration):
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'm365';",
reverse_sql=migrations.RunSQL.noop,
),
]

View File

@@ -1,81 +0,0 @@
# Generated by Django 5.1.7 on 2025-05-05 10:01
import uuid
import django.db.models.deletion
import uuid6
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0017_m365_provider"),
]
operations = [
migrations.CreateModel(
name="ResourceScanSummary",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("scan_id", models.UUIDField(db_index=True, default=uuid6.uuid7)),
("resource_id", models.UUIDField(db_index=True, default=uuid.uuid4)),
("service", models.CharField(max_length=100)),
("region", models.CharField(max_length=100)),
("resource_type", models.CharField(max_length=100)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "resource_scan_summaries",
"indexes": [
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
],
"unique_together": {("tenant_id", "scan_id", "resource_id")},
},
),
migrations.AddConstraint(
model_name="resourcescansummary",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_resourcescansummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -1,42 +0,0 @@
import django.contrib.postgres.fields
import django.contrib.postgres.indexes
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0018_resource_scan_summaries"),
]
operations = [
migrations.AddField(
model_name="finding",
name="resource_regions",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_services",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_types",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
]

View File

@@ -1,86 +0,0 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0019_finding_denormalize_resource_fields"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
columns="resource_services",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
columns="resource_regions",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
columns="resource_types",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
columns="uid",
method="BTREE",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
columns="scan_id, impact, severity, status, check_id, delta",
method="BTREE",
),
),
]

View File

@@ -1,37 +0,0 @@
import django.contrib.postgres.indexes
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0020_findings_new_performance_indexes_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_services"], name="gin_find_service_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_regions"], name="gin_find_region_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_types"], name="gin_find_rtype_idx"
),
),
migrations.RemoveIndex(
model_name="finding",
name="findings_uid_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="findings_filter_idx",
),
]

View File

@@ -1,38 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:04
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0021_findings_new_performance_indexes_parent"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
AddIndexConcurrently(
model_name="scan",
index=models.Index(
condition=models.Q(("state", "completed")),
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
name="scans_prov_state_ins_desc_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
),
]

View File

@@ -1,28 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:18
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0022_scan_summaries_performance_indexes"),
]
operations = [
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "id"], name="resources_tenant_id_idx"
),
),
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
),
]

View File

@@ -1,29 +0,0 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0023_resources_lookup_optimization"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
columns="tenant_id, uid, inserted_at DESC",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
),
)
]

View File

@@ -1,17 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0024_findings_uid_index_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
),
]

View File

@@ -5,7 +5,6 @@ from uuid import UUID, uuid4
from cryptography.fernet import Fernet
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.core.validators import MinLengthValidator
@@ -218,13 +217,9 @@ class Provider(RowLevelSecurityProtectedModel):
@staticmethod
def validate_m365_uid(value):
if not re.match(
r"""^(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?(?:\.(?!-)[A-Za-z0-9]"""
r"""(?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?)*\.[A-Za-z]{2,}$""",
value,
):
if not re.match(r"^[a-zA-Z0-9-]+\.onmicrosoft\.com$", value):
raise ModelValidationError(
detail="M365 domain ID must be a valid domain.",
detail="M365 tenant ID must be a valid domain.",
code="m365-uid",
pointer="/data/attributes/uid",
)
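Illustrative inputs for the two patterns above: the relaxed generic-domain check accepts any well-formed domain, while the old pattern only accepted `*.onmicrosoft.com` tenants:

```python
# Passes under both the old and the new pattern:
Provider.validate_m365_uid("contoso.onmicrosoft.com")
# Passes only under the new, generic domain pattern:
Provider.validate_m365_uid("contoso.com")
```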
@@ -430,7 +425,6 @@ class Scan(RowLevelSecurityProtectedModel):
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
)
output_location = models.CharField(blank=True, null=True, max_length=200)
# TODO: mutelist foreign key
class Meta(RowLevelSecurityProtectedModel.Meta):
@@ -453,11 +447,6 @@ class Scan(RowLevelSecurityProtectedModel):
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
models.Index(
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
condition=Q(state=StateChoices.COMPLETED),
name="scans_prov_state_ins_desc_idx",
),
]
class JSONAPIMeta:
@@ -584,11 +573,6 @@ class Resource(RowLevelSecurityProtectedModel):
name="resource_tenant_metadata_idx",
),
GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
]
constraints = [
@@ -689,21 +673,6 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
muted = models.BooleanField(default=False, null=False)
compliance = models.JSONField(default=dict, null=True, blank=True)
# Denormalize resource data for performance
resource_regions = ArrayField(
models.CharField(max_length=100), blank=True, null=True
)
resource_services = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
resource_types = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
# Relationships
scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -744,6 +713,18 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
]
indexes = [
models.Index(fields=["uid"], name="findings_uid_idx"),
models.Index(
fields=[
"scan_id",
"impact",
"severity",
"status",
"check_id",
"delta",
],
name="findings_filter_idx",
),
models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
@@ -755,42 +736,19 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
condition=Q(delta="new"),
name="find_delta_new_idx",
),
models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
GinIndex(fields=["resource_services"], name="gin_find_service_idx"),
GinIndex(fields=["resource_regions"], name="gin_find_region_idx"),
GinIndex(fields=["resource_types"], name="gin_find_rtype_idx"),
]
class JSONAPIMeta:
resource_name = "findings"
def add_resources(self, resources: list[Resource] | None):
if not resources:
return
self.resource_regions = self.resource_regions or []
self.resource_services = self.resource_services or []
self.resource_types = self.resource_types or []
# Deduplication
regions = set(self.resource_regions)
services = set(self.resource_services)
types = set(self.resource_types)
# Add new relationships with the tenant_id field
for resource in resources:
ResourceFindingMapping.objects.update_or_create(
resource=resource, finding=self, tenant_id=self.tenant_id
)
regions.add(resource.region)
services.add(resource.service)
types.add(resource.type)
self.resource_regions = list(regions)
self.resource_services = list(services)
self.resource_types = list(types)
# Save the instance
self.save()
@@ -1192,15 +1150,7 @@ class ScanSummary(RowLevelSecurityProtectedModel):
models.Index(
fields=["tenant_id", "scan_id"],
name="scan_summaries_tenant_scan_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
)
]
class JSONAPIMeta:
@@ -1282,52 +1232,3 @@ class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class ResourceScanSummary(RowLevelSecurityProtectedModel):
scan_id = models.UUIDField(default=uuid7, db_index=True)
resource_id = models.UUIDField(default=uuid4, db_index=True)
service = models.CharField(max_length=100)
region = models.CharField(max_length=100)
resource_type = models.CharField(max_length=100)
class Meta:
db_table = "resource_scan_summaries"
unique_together = (("tenant_id", "scan_id", "resource_id"),)
indexes = [
# Single-dimension lookups:
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
# Two-dimension cross-filters:
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
]
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]

View File

@@ -1,7 +1,7 @@
openapi: 3.0.3
info:
title: Prowler API
version: 1.8.1
version: 1.6.0
description: |-
Prowler API specification.
@@ -1238,416 +1238,6 @@ paths:
schema:
$ref: '#/components/schemas/FindingDynamicFilterResponse'
description: ''
/api/v1/findings/latest:
get:
operationId: findings_latest_retrieve
description: Retrieve a list of the latest findings from the latest scans for
each provider with options for filtering by various criteria.
summary: List the latest findings
parameters:
- in: query
name: fields[findings]
schema:
type: array
items:
type: string
enum:
- uid
- delta
- status
- status_extended
- severity
- check_id
- check_metadata
- raw_result
- inserted_at
- updated_at
- first_seen_at
- muted
- url
- scan
- resources
description: endpoint return only specific fields in the response on a per-type
basis by including a fields[TYPE] query parameter.
explode: false
- in: query
name: filter[check_id]
schema:
type: string
- in: query
name: filter[check_id__icontains]
schema:
type: string
- in: query
name: filter[check_id__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[delta]
schema:
type: string
nullable: true
enum:
- changed
- new
description: |-
* `new` - New
* `changed` - Changed
- in: query
name: filter[delta__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[id]
schema:
type: string
format: uuid
- in: query
name: filter[id__in]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[impact]
schema:
type: string
enum:
- critical
- high
- informational
- low
- medium
description: |-
* `critical` - Critical
* `high` - High
* `medium` - Medium
* `low` - Low
* `informational` - Informational
- in: query
name: filter[impact__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[muted]
schema:
type: boolean
description: If this filter is not provided, muted and non-muted findings
will be returned.
- in: query
name: filter[provider]
schema:
type: string
format: uuid
- in: query
name: filter[provider__in]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[provider_alias]
schema:
type: string
- in: query
name: filter[provider_alias__icontains]
schema:
type: string
- in: query
name: filter[provider_alias__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[provider_type]
schema:
type: string
enum:
- aws
- azure
- gcp
- kubernetes
- m365
description: |-
* `aws` - AWS
* `azure` - Azure
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
- in: query
name: filter[provider_type__in]
schema:
type: array
items:
type: string
enum:
- aws
- azure
- gcp
- kubernetes
- m365
description: |-
Multiple values may be separated by commas.
* `aws` - AWS
* `azure` - Azure
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
explode: false
style: form
- in: query
name: filter[provider_uid]
schema:
type: string
- in: query
name: filter[provider_uid__icontains]
schema:
type: string
- in: query
name: filter[provider_uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[region]
schema:
type: string
- in: query
name: filter[region__icontains]
schema:
type: string
- in: query
name: filter[region__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_name]
schema:
type: string
- in: query
name: filter[resource_name__icontains]
schema:
type: string
- in: query
name: filter[resource_name__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_type]
schema:
type: string
- in: query
name: filter[resource_type__icontains]
schema:
type: string
- in: query
name: filter[resource_type__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_uid]
schema:
type: string
- in: query
name: filter[resource_uid__icontains]
schema:
type: string
- in: query
name: filter[resource_uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resources]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- name: filter[search]
required: false
in: query
description: A search term.
schema:
type: string
- in: query
name: filter[service]
schema:
type: string
- in: query
name: filter[service__icontains]
schema:
type: string
- in: query
name: filter[service__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[severity]
schema:
type: string
enum:
- critical
- high
- informational
- low
- medium
description: |-
* `critical` - Critical
* `high` - High
* `medium` - Medium
* `low` - Low
* `informational` - Informational
- in: query
name: filter[severity__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[status]
schema:
type: string
enum:
- FAIL
- MANUAL
- PASS
description: |-
* `FAIL` - Fail
* `PASS` - Pass
* `MANUAL` - Manual
- in: query
name: filter[status__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[uid]
schema:
type: string
- in: query
name: filter[uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[updated_at]
schema:
type: string
format: date
- in: query
name: include
schema:
type: array
items:
type: string
enum:
- scan
- resources
description: The include query parameter allows the client to customize which
related resources should be returned.
explode: false
- name: sort
required: false
in: query
description: '[list of fields to sort by](https://jsonapi.org/format/#fetching-sorting)'
schema:
type: array
items:
type: string
enum:
- status
- -status
- severity
- -severity
- check_id
- -check_id
- inserted_at
- -inserted_at
- updated_at
- -updated_at
explode: false
tags:
- Finding
security:
- jwtAuth: []
responses:
'200':
content:
application/vnd.api+json:
schema:
$ref: '#/components/schemas/FindingResponse'
description: ''
/api/v1/findings/metadata:
get:
operationId: findings_metadata_retrieve
@@ -2084,392 +1674,6 @@ paths:
schema:
$ref: '#/components/schemas/FindingMetadataResponse'
description: ''
/api/v1/findings/metadata/latest:
get:
operationId: findings_metadata_latest_retrieve
description: Fetch unique metadata values from a set of findings from the latest
scans for each provider. This is useful for dynamic filtering.
summary: Retrieve metadata values from the latest findings
parameters:
- in: query
name: fields[findings-metadata]
schema:
type: array
items:
type: string
enum:
- services
- regions
- resource_types
description: The endpoint returns only specific fields in the response on a per-type
basis when a fields[TYPE] query parameter is included.
explode: false
- in: query
name: filter[check_id]
schema:
type: string
- in: query
name: filter[check_id__icontains]
schema:
type: string
- in: query
name: filter[check_id__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[delta]
schema:
type: string
nullable: true
enum:
- changed
- new
description: |-
* `new` - New
* `changed` - Changed
- in: query
name: filter[delta__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[id]
schema:
type: string
format: uuid
- in: query
name: filter[id__in]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[impact]
schema:
type: string
enum:
- critical
- high
- informational
- low
- medium
description: |-
* `critical` - Critical
* `high` - High
* `medium` - Medium
* `low` - Low
* `informational` - Informational
- in: query
name: filter[impact__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[muted]
schema:
type: boolean
description: If this filter is not provided, muted and non-muted findings
will be returned.
- in: query
name: filter[provider]
schema:
type: string
format: uuid
- in: query
name: filter[provider__in]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[provider_alias]
schema:
type: string
- in: query
name: filter[provider_alias__icontains]
schema:
type: string
- in: query
name: filter[provider_alias__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[provider_type]
schema:
type: string
enum:
- aws
- azure
- gcp
- kubernetes
- m365
description: |-
* `aws` - AWS
* `azure` - Azure
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
- in: query
name: filter[provider_type__in]
schema:
type: array
items:
type: string
enum:
- aws
- azure
- gcp
- kubernetes
- m365
description: |-
Multiple values may be separated by commas.
* `aws` - AWS
* `azure` - Azure
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
explode: false
style: form
- in: query
name: filter[provider_uid]
schema:
type: string
- in: query
name: filter[provider_uid__icontains]
schema:
type: string
- in: query
name: filter[provider_uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[region]
schema:
type: string
- in: query
name: filter[region__icontains]
schema:
type: string
- in: query
name: filter[region__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_name]
schema:
type: string
- in: query
name: filter[resource_name__icontains]
schema:
type: string
- in: query
name: filter[resource_name__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_type]
schema:
type: string
- in: query
name: filter[resource_type__icontains]
schema:
type: string
- in: query
name: filter[resource_type__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resource_uid]
schema:
type: string
- in: query
name: filter[resource_uid__icontains]
schema:
type: string
- in: query
name: filter[resource_uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[resources]
schema:
type: array
items:
type: string
format: uuid
description: Multiple values may be separated by commas.
explode: false
style: form
- name: filter[search]
required: false
in: query
description: A search term.
schema:
type: string
- in: query
name: filter[service]
schema:
type: string
- in: query
name: filter[service__icontains]
schema:
type: string
- in: query
name: filter[service__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[severity]
schema:
type: string
enum:
- critical
- high
- informational
- low
- medium
description: |-
* `critical` - Critical
* `high` - High
* `medium` - Medium
* `low` - Low
* `informational` - Informational
- in: query
name: filter[severity__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[status]
schema:
type: string
enum:
- FAIL
- MANUAL
- PASS
description: |-
* `FAIL` - Fail
* `PASS` - Pass
* `MANUAL` - Manual
- in: query
name: filter[status__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[uid]
schema:
type: string
- in: query
name: filter[uid__in]
schema:
type: array
items:
type: string
description: Multiple values may be separated by commas.
explode: false
style: form
- in: query
name: filter[updated_at]
schema:
type: string
format: date
- name: sort
required: false
in: query
description: '[list of fields to sort by](https://jsonapi.org/format/#fetching-sorting)'
schema:
type: array
items:
type: string
enum:
- status
- -status
- severity
- -severity
- check_id
- -check_id
- inserted_at
- -inserted_at
- updated_at
- -updated_at
explode: false
tags:
- Finding
security:
- jwtAuth: []
responses:
'200':
content:
application/vnd.api+json:
schema:
$ref: '#/components/schemas/FindingMetadataResponse'
description: ''
/api/v1/integrations:
get:
operationId: integrations_list
@@ -5299,47 +4503,6 @@ paths:
schema:
$ref: '#/components/schemas/ScanUpdateResponse'
description: ''
/api/v1/scans/{id}/compliance/{name}:
get:
operationId: scans_compliance_retrieve
description: Download a specific compliance report (e.g., 'cis_1.4_aws') as
a CSV file.
summary: Retrieve compliance report as CSV
parameters:
- in: query
name: fields[scan-reports]
schema:
type: array
items:
type: string
enum:
- id
- name
description: The endpoint returns only specific fields in the response on a per-type
basis when a fields[TYPE] query parameter is included.
explode: false
- in: path
name: id
schema:
type: string
format: uuid
description: A UUID string identifying this scan.
required: true
- in: path
name: name
schema:
type: string
description: The compliance report name, like 'cis_1.4_aws'
required: true
tags:
- Scan
security:
- jwtAuth: []
responses:
'200':
description: CSV file containing the compliance report
'404':
description: Compliance report not found
/api/v1/scans/{id}/report:
get:
operationId: scans_report_retrieve

View File

@@ -2,20 +2,17 @@ import glob
import io
import json
import os
import tempfile
from datetime import datetime, timedelta, timezone
from pathlib import Path
from unittest.mock import ANY, MagicMock, Mock, patch
from unittest.mock import ANY, Mock, patch
import jwt
import pytest
from botocore.exceptions import ClientError, NoCredentialsError
from botocore.exceptions import NoCredentialsError
from conftest import API_JSON_CONTENT_TYPE, TEST_PASSWORD, TEST_USER
from django.conf import settings
from django.urls import reverse
from rest_framework import status
from api.compliance import get_compliance_frameworks
from api.models import (
ComplianceOverview,
Integration,
@@ -918,26 +915,6 @@ class TestProviderViewSet:
"uid": "8851db6b-42e5-4533-aa9e-30a32d67e875",
"alias": "test",
},
{
"provider": "m365",
"uid": "TestingPro.onMirosoft.com",
"alias": "test",
},
{
"provider": "m365",
"uid": "subdomain.domain.es",
"alias": "test",
},
{
"provider": "m365",
"uid": "microsoft.net",
"alias": "test",
},
{
"provider": "m365",
"uid": "subdomain1.subdomain2.subdomain3.subdomain4.domain.net",
"alias": "test",
},
]
),
)
@@ -1006,51 +983,6 @@ class TestProviderViewSet:
"invalid_choice",
"provider",
),
(
{
"provider": "m365",
"uid": "https://test.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": "thisisnotadomain",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": "http://test.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": f"{'a' * 64}.domain.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": f"subdomain.{'a' * 64}.com",
"alias": "test",
},
"m365-uid",
"uid",
),
]
),
)
@@ -2345,8 +2277,7 @@ class TestScanViewSet:
scan.save()
monkeypatch.setattr(
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
"api.v1.views.env", type("env", (), {"str": lambda self, key: bucket})()
)
class FakeS3Client:
@@ -2385,296 +2316,35 @@ class TestScanViewSet:
assert response.status_code == 404
assert response.json()["errors"]["detail"] == "The scan has no reports."
def test_report_local_file(self, authenticated_client, scans_fixture, monkeypatch):
scan = scans_fixture[0]
with tempfile.TemporaryDirectory() as tmp:
tmp_path = Path(tmp)
base_tmp = tmp_path / "report_local_file"
base_tmp.mkdir(parents=True, exist_ok=True)
file_content = b"local zip file content"
file_path = base_tmp / "report.zip"
file_path.write_bytes(file_content)
scan.output_location = str(file_path)
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
glob,
"glob",
lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
)
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == 200
assert response.content == file_content
content_disposition = response.get("Content-Disposition")
assert content_disposition.startswith('attachment; filename="')
assert f'filename="{file_path.name}"' in content_disposition
def test_compliance_invalid_framework(self, authenticated_client, scans_fixture):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = "dummy"
scan.save()
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "invalid"})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert resp.json()["errors"]["detail"] == "Compliance 'invalid' not found."
def test_compliance_executing(
self, authenticated_client, scans_fixture, monkeypatch
def test_report_local_file(
self, authenticated_client, scans_fixture, tmp_path, monkeypatch
):
"""
When output_location is a local file path, the view should read the file from disk
and return it with proper headers.
"""
scan = scans_fixture[0]
scan.state = StateChoices.EXECUTING
scan.save()
task = Task.objects.create(tenant_id=scan.tenant_id)
scan.task = task
scan.save()
dummy = {"id": str(task.id), "state": StateChoices.EXECUTING}
file_content = b"local zip file content"
file_path = tmp_path / "report.zip"
file_path.write_bytes(file_content)
monkeypatch.setattr(
"api.v1.views.TaskSerializer",
lambda *args, **kwargs: type("S", (), {"data": dummy}),
)
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_202_ACCEPTED
assert "Content-Location" in resp
assert dummy["id"] in resp["Content-Location"]
def test_compliance_no_output(self, authenticated_client, scans_fixture):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = ""
scan.save()
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert resp.json()["errors"]["detail"] == "The scan has no reports."
def test_compliance_s3_no_credentials(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
key = "file.zip"
scan.output_location = f"s3://{bucket}/{key}"
scan.output_location = str(file_path)
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
"api.v1.views.get_s3_client",
lambda: (_ for _ in ()).throw(NoCredentialsError()),
glob,
"glob",
lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
)
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_403_FORBIDDEN
assert resp.json()["errors"]["detail"] == "There is a problem with credentials."
def test_compliance_s3_success(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
prefix = "path/scan.zip"
scan.output_location = f"s3://{bucket}/{prefix}"
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
)
match_key = "path/compliance/mitre_attack_aws.csv"
class FakeS3Client:
def list_objects_v2(self, Bucket, Prefix):
return {"Contents": [{"Key": match_key}]}
def get_object(self, Bucket, Key):
return {"Body": io.BytesIO(b"ignored")}
monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())
framework = match_key.split("/")[-1].split(".")[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_200_OK
cd = resp["Content-Disposition"]
assert cd.startswith('attachment; filename="')
assert cd.endswith('filename="mitre_attack_aws.csv"')
def test_compliance_s3_not_found(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
scan.output_location = f"s3://{bucket}/x/scan.zip"
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
)
class FakeS3Client:
def list_objects_v2(self, Bucket, Prefix):
return {"Contents": []}
def get_object(self, Bucket, Key):
return {"Body": io.BytesIO(b"ignored")}
monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert (
resp.json()["errors"]["detail"]
== "No compliance file found for name 'cis_1.4_aws'."
)
def test_compliance_local_file(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
with tempfile.TemporaryDirectory() as tmp:
tmp_path = Path(tmp)
base = tmp_path / "reports"
comp_dir = base / "compliance"
comp_dir.mkdir(parents=True, exist_ok=True)
fname = comp_dir / "scan_cis.csv"
fname.write_bytes(b"ignored")
scan.output_location = str(base / "scan.zip")
scan.save()
monkeypatch.setattr(
glob,
"glob",
lambda p: [str(fname)] if p.endswith("*_cis_1.4_aws.csv") else [],
)
url = reverse(
"scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"}
)
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_200_OK
cd = resp["Content-Disposition"]
assert cd.startswith('attachment; filename="')
assert cd.endswith(f'filename="{fname.name}"')
@patch("api.v1.views.Task.objects.get")
@patch("api.v1.views.TaskSerializer")
def test__get_task_status_returns_none_if_task_not_executing(
self, mock_task_serializer, mock_task_get, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = "dummy"
scan.save()
task = Task.objects.create(tenant_id=scan.tenant_id)
mock_task_get.return_value = task
mock_task_serializer.return_value.data = {
"id": str(task.id),
"state": StateChoices.COMPLETED,
}
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
@patch("api.v1.views.get_s3_client")
@patch("api.v1.views.sentry_sdk.capture_exception")
def test_compliance_list_objects_client_error(
self,
mock_sentry_capture,
mock_get_s3_client,
authenticated_client,
scans_fixture,
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/path/to/scan.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.list_objects_v2.side_effect = ClientError(
{"Error": {"Code": "InternalError"}}, "ListObjectsV2"
)
mock_get_s3_client.return_value = fake_client
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_502_BAD_GATEWAY
assert (
response.json()["errors"]["detail"]
== "Unable to list compliance files in S3: encountered an AWS error."
)
mock_sentry_capture.assert_called()
@patch("api.v1.views.get_s3_client")
def test_report_s3_nosuchkey(
self, mock_get_s3_client, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/report.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.get_object.side_effect = ClientError(
{"Error": {"Code": "NoSuchKey"}}, "GetObject"
)
mock_get_s3_client.return_value = fake_client
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
assert response.json()["errors"]["detail"] == "The scan has no reports."
@patch("api.v1.views.get_s3_client")
def test_report_s3_client_error_other(
self, mock_get_s3_client, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/report.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.get_object.side_effect = ClientError(
{"Error": {"Code": "AccessDenied"}}, "GetObject"
)
mock_get_s3_client.return_value = fake_client
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert (
response.json()["errors"]["detail"]
== "There is a problem with credentials."
)
assert response.status_code == 200
assert response.content == file_content
content_disposition = response.get("Content-Disposition")
assert content_disposition.startswith('attachment; filename="')
assert f'filename="{file_path.name}"' in content_disposition
@pytest.mark.django_db
@@ -3167,9 +2837,7 @@ class TestFindingViewSet:
)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_findings_metadata_retrieve(
self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
):
def test_findings_metadata_retrieve(self, authenticated_client, findings_fixture):
finding_1, *_ = findings_fixture
response = authenticated_client.get(
reverse("finding-metadata"),
@@ -3192,14 +2860,14 @@ class TestFindingViewSet:
)
# assert data["data"]["attributes"]["tags"] == expected_tags
def test_findings_metadata_resource_filter_retrieve(
self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
def test_findings_metadata_severity_retrieve(
self, authenticated_client, findings_fixture
):
finding_1, *_ = findings_fixture
response = authenticated_client.get(
reverse("finding-metadata"),
{
"filter[region]": "eu-west-1",
"filter[severity__in]": ["low", "medium"],
"filter[inserted_at]": finding_1.inserted_at.strftime("%Y-%m-%d"),
},
)
@@ -3250,29 +2918,6 @@ class TestFindingViewSet:
]
}
def test_findings_latest(self, authenticated_client, latest_scan_finding):
response = authenticated_client.get(
reverse("finding-latest"),
)
assert response.status_code == status.HTTP_200_OK
# The latest scan only has one finding, in comparison with `GET /findings`
assert len(response.json()["data"]) == 1
assert (
response.json()["data"][0]["attributes"]["status"]
== latest_scan_finding.status
)
def test_findings_metadata_latest(self, authenticated_client, latest_scan_finding):
response = authenticated_client.get(
reverse("finding-metadata_latest"),
)
assert response.status_code == status.HTTP_200_OK
attributes = response.json()["data"]["attributes"]
assert attributes["services"] == latest_scan_finding.resource_services
assert attributes["regions"] == latest_scan_finding.resource_regions
assert attributes["resource_types"] == latest_scan_finding.resource_types
@pytest.mark.django_db
class TestJWTFields:
@@ -4910,8 +4555,9 @@ class TestOverviewViewSet:
assert response.json()["data"][0]["attributes"]["findings"]["pass"] == 2
assert response.json()["data"][0]["attributes"]["findings"]["fail"] == 1
assert response.json()["data"][0]["attributes"]["findings"]["muted"] == 1
# Since we rely on completed scans, there are only 2 resources now
assert response.json()["data"][0]["attributes"]["resources"]["total"] == 2
assert response.json()["data"][0]["attributes"]["resources"]["total"] == len(
resources_fixture
)
def test_overview_services_list_no_required_filters(
self, authenticated_client, scan_summaries_fixture

View File

@@ -1,14 +1,11 @@
from datetime import datetime, timezone
from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import Subquery
from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
from api.models import Invitation, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from api.models import Invitation, Provider
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
@@ -208,33 +205,3 @@ def validate_invitation(
)
return invitation
# ToRemove after removing the fallback mechanism in /findings/metadata
def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
filtered_ids = filtered_queryset.order_by().values("id")
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = FindingMetadataSerializer(data=result)
serializer.is_valid(raise_exception=True)
return serializer.data
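Note that ArrayAgg yields None rather than an empty list when no rows match, which is what the `or []` guards above absorb; the set/sorted passes then deduplicate and drop empty region values. A small illustration of that post-processing with hypothetical values:

aggregation = {
    "services": ["s3", "ec2", "s3"],
    "regions": ["eu-west-1", "", "eu-west-1"],
    "resource_types": ["bucket", "instance"],
}
services = sorted(set(aggregation["services"] or []))              # ["ec2", "s3"]
regions = sorted({r for r in aggregation["regions"] or [] if r})  # ["eu-west-1"]
resource_types = sorted(set(aggregation["resource_types"] or []))  # ["bucket", "instance"]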

View File

@@ -1,33 +0,0 @@
from rest_framework.response import Response
class PaginateByPkMixin:
"""
Mixin to paginate on a list of PKs (cheaper than heavy JOINs),
re-fetch the full objects with the desired select/prefetch,
re-sort them to preserve DB ordering, then serialize + return.
"""
def paginate_by_pk(
self,
request, # noqa: F841
base_queryset,
manager,
select_related: list[str] | None = None,
prefetch_related: list[str] | None = None,
) -> Response:
pk_list = base_queryset.values_list("id", flat=True)
page = self.paginate_queryset(pk_list)
if page is None:
return Response(self.get_serializer(base_queryset, many=True).data)
queryset = manager.filter(id__in=page)
if select_related:
queryset = queryset.select_related(*select_related)
if prefetch_related:
queryset = queryset.prefetch_related(*prefetch_related)
queryset = sorted(queryset, key=lambda obj: page.index(obj.id))
serialized = self.get_serializer(queryset, many=True).data
return self.get_paginated_response(serialized)
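The views module in this changeset wires the mixin up from FindingViewSet.list; a condensed sketch of that usage, assuming a DRF paginator is configured on the viewset:

class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
    def list(self, request, *args, **kwargs):
        filtered_queryset = self.filter_queryset(self.get_queryset())
        return self.paginate_by_pk(
            request,
            filtered_queryset,
            manager=Finding.all_objects,      # re-fetch full rows via the unfiltered manager
            select_related=["scan"],          # avoid N+1 on the scan foreign key
            prefetch_related=["resources"],   # batch-load the related resources
        )

Paginating over bare primary keys keeps the LIMIT/OFFSET query free of joins; the second query then hydrates only one page of full rows.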

View File

@@ -960,15 +960,6 @@ class ScanReportSerializer(serializers.Serializer):
fields = ["id"]
class ScanComplianceReportSerializer(serializers.Serializer):
id = serializers.CharField(source="scan")
name = serializers.CharField()
class Meta:
resource_name = "scan-reports"
fields = ["id", "name"]
class ResourceTagSerializer(RLSSerializer):
"""
Serializer for the ResourceTag model

View File

@@ -1,6 +1,5 @@
import glob
import os
from datetime import datetime, timedelta, timezone
import sentry_sdk
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
@@ -21,7 +20,6 @@ from django.db.models import Count, Exists, F, OuterRef, Prefetch, Q, Subquery,
from django.db.models.functions import Coalesce
from django.http import HttpResponse
from django.urls import reverse
from django.utils.dateparse import parse_date
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django_celery_beat.models import PeriodicTask
@@ -50,7 +48,6 @@ from rest_framework_simplejwt.exceptions import InvalidToken, TokenError
from tasks.beat import schedule_provider_scan
from tasks.jobs.export import get_s3_client
from tasks.tasks import (
backfill_scan_resource_summaries_task,
check_provider_connection_task,
delete_provider_task,
delete_tenant_task,
@@ -58,14 +55,12 @@ from tasks.tasks import (
)
from api.base_views import BaseRLSViewSet, BaseTenantViewset, BaseUserViewset
from api.compliance import get_compliance_frameworks
from api.db_router import MainRouter
from api.filters import (
ComplianceOverviewFilter,
FindingFilter,
IntegrationFilter,
InvitationFilter,
LatestFindingFilter,
MembershipFilter,
ProviderFilter,
ProviderGroupFilter,
@@ -91,7 +86,6 @@ from api.models import (
ProviderSecret,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Role,
RoleProviderGroupRelationship,
Scan,
@@ -105,13 +99,7 @@ from api.models import (
from api.pagination import ComplianceOverviewPagination
from api.rbac.permissions import Permissions, get_providers, get_role
from api.rls import Tenant
from api.utils import (
CustomOAuth2Client,
get_findings_metadata_no_aggregations,
validate_invitation,
)
from api.uuid_utils import datetime_to_uuid7, uuid7_start
from api.v1.mixins import PaginateByPkMixin
from api.utils import CustomOAuth2Client, validate_invitation
from api.v1.serializers import (
ComplianceOverviewFullSerializer,
ComplianceOverviewMetadataSerializer,
@@ -146,7 +134,6 @@ from api.v1.serializers import (
RoleProviderGroupRelationshipSerializer,
RoleSerializer,
RoleUpdateSerializer,
ScanComplianceReportSerializer,
ScanCreateSerializer,
ScanReportSerializer,
ScanSerializer,
@@ -260,7 +247,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.8.1"
spectacular_settings.VERSION = "1.6.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -1163,27 +1150,6 @@ class ProviderViewSet(BaseRLSViewSet):
404: OpenApiResponse(description="The scan has no reports"),
},
),
compliance=extend_schema(
tags=["Scan"],
summary="Retrieve compliance report as CSV",
description="Download a specific compliance report (e.g., 'cis_1.4_aws') as a CSV file.",
parameters=[
OpenApiParameter(
name="name",
type=str,
location=OpenApiParameter.PATH,
required=True,
description="The compliance report name, like 'cis_1.4_aws'",
),
],
responses={
200: OpenApiResponse(
description="CSV file containing the compliance report"
),
404: OpenApiResponse(description="Compliance report not found"),
},
request=None,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -1236,10 +1202,6 @@ class ScanViewSet(BaseRLSViewSet):
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanReportSerializer
elif self.action == "compliance":
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanComplianceReportSerializer
return super().get_serializer_class()
def partial_update(self, request, *args, **kwargs):
@@ -1257,111 +1219,70 @@ class ScanViewSet(BaseRLSViewSet):
)
return Response(data=read_serializer.data, status=status.HTTP_200_OK)
def _get_task_status(self, scan_instance):
"""
Returns task status if the scan or its associated report-generation task is still executing.
@action(detail=True, methods=["get"], url_name="report")
def report(self, request, pk=None):
scan_instance = self.get_object()
If the scan is in an EXECUTING state or if a background task related to report generation
is found and also executing, this method returns a 202 Accepted response with the task
metadata and a `Content-Location` header pointing to the task detail endpoint.
if scan_instance.state == StateChoices.EXECUTING:
# If the scan is still running, return the task
prowler_task = Task.objects.get(id=scan_instance.task.id)
self.response_serializer_class = TaskSerializer
output_serializer = self.get_serializer(prowler_task)
return Response(
data=output_serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": output_serializer.data["id"]}
)
},
)
Args:
scan_instance (Scan): The scan instance for which the task status is being checked.
Returns:
Response or None:
- A `Response` with HTTP 202 status and serialized task data if the task is executing.
- `None` if no running task is found or if the task has already completed.
"""
task = None
if scan_instance.state == StateChoices.EXECUTING and scan_instance.task:
task = scan_instance.task
else:
try:
task = Task.objects.get(
task_runner_task__task_name="scan-report",
task_runner_task__task_args__contains=str(scan_instance.id),
try:
output_celery_task = Task.objects.get(
task_runner_task__task_name="scan-report",
task_runner_task__task_args__contains=pk,
)
self.response_serializer_class = TaskSerializer
output_serializer = self.get_serializer(output_celery_task)
if output_serializer.data["state"] == StateChoices.EXECUTING:
# If the task is still running, return the task
return Response(
data=output_serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": output_serializer.data["id"]}
)
},
)
except Task.DoesNotExist:
return None
except Task.DoesNotExist:
# If the task does not exist, it has already been removed from the database
pass
self.response_serializer_class = TaskSerializer
serializer = self.get_serializer(task)
output_location = scan_instance.output_location
if not output_location:
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
)
if serializer.data.get("state") != StateChoices.EXECUTING:
return None
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": serializer.data["id"]}
)
},
)
def _load_file(self, path_pattern, s3=False, bucket=None, list_objects=False):
"""
Loads a binary file (e.g., ZIP or CSV) and returns its content and filename.
Depending on the input parameters, this method supports loading:
- From S3 using a direct key.
- From S3 by listing objects under a prefix and matching suffix.
- From the local filesystem using glob pattern matching.
Args:
path_pattern (str): The key or glob pattern representing the file location.
s3 (bool, optional): Whether the file is stored in S3. Defaults to False.
bucket (str, optional): The name of the S3 bucket, required if `s3=True`. Defaults to None.
list_objects (bool, optional): If True and `s3=True`, list objects by prefix to find the file. Defaults to False.
Returns:
tuple[bytes, str]: A tuple containing the file content as bytes and the filename if successful.
Response: A DRF `Response` object with an appropriate status and error detail if an error occurs.
"""
if s3:
if scan_instance.output_location.startswith("s3://"):
try:
client = get_s3_client()
s3_client = get_s3_client()
except (ClientError, NoCredentialsError, ParamValidationError):
return Response(
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
if list_objects:
# list keys under prefix then match suffix
prefix = os.path.dirname(path_pattern)
suffix = os.path.basename(path_pattern)
try:
resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
except ClientError as e:
sentry_sdk.capture_exception(e)
return Response(
{
"detail": "Unable to list compliance files in S3: encountered an AWS error."
},
status=status.HTTP_502_BAD_GATEWAY,
)
contents = resp.get("Contents", [])
keys = [obj["Key"] for obj in contents if obj["Key"].endswith(suffix)]
if not keys:
return Response(
{
"detail": f"No compliance file found for name '{os.path.splitext(suffix)[0]}'."
},
status=status.HTTP_404_NOT_FOUND,
)
# path_pattern was passed as prefix + suffix; the compliance action builds the correct suffix before calling this
key = keys[0]
else:
# path_pattern is exact key
key = path_pattern
bucket_name = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET")
key = output_location[len(f"s3://{bucket_name}/") :]
try:
s3_obj = client.get_object(Bucket=bucket, Key=key)
s3_object = s3_client.get_object(Bucket=bucket_name, Key=key)
except ClientError as e:
code = e.response.get("Error", {}).get("Code")
if code == "NoSuchKey":
error_code = e.response.get("Error", {}).get("Code")
if error_code == "NoSuchKey":
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
@@ -1370,97 +1291,28 @@ class ScanViewSet(BaseRLSViewSet):
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
content = s3_obj["Body"].read()
filename = os.path.basename(key)
file_content = s3_object["Body"].read()
filename = os.path.basename(output_location.split("/")[-1])
else:
files = glob.glob(path_pattern)
if not files:
zip_files = glob.glob(output_location)
try:
file_path = zip_files[0]
except IndexError as e:
sentry_sdk.capture_exception(e)
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
)
filepath = files[0]
with open(filepath, "rb") as f:
content = f.read()
filename = os.path.basename(filepath)
with open(file_path, "rb") as f:
file_content = f.read()
filename = os.path.basename(file_path)
return content, filename
def _serve_file(self, content, filename, content_type):
response = HttpResponse(content, content_type=content_type)
response = HttpResponse(
file_content, content_type="application/x-zip-compressed"
)
response["Content-Disposition"] = f'attachment; filename="{filename}"'
return response
@action(detail=True, methods=["get"], url_name="report")
def report(self, request, pk=None):
scan = self.get_object()
# Check for executing tasks
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
loader = self._load_file(
key_prefix, s3=True, bucket=bucket, list_objects=False
)
else:
loader = self._load_file(scan.output_location, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "application/x-zip-compressed")
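The report action above returns 202 with a Content-Location header while the export task is still executing, and otherwise streams the ZIP as an attachment (or 404s when no output exists). A client-side polling sketch under those assumptions; the base URL, token and polling interval are hypothetical:

import time

import requests

def download_report(base_url: str, token: str, scan_id: str) -> bytes:
    """Poll the report endpoint until the ZIP is ready, then return its bytes."""
    url = f"{base_url}/api/v1/scans/{scan_id}/report"
    headers = {"Authorization": f"Bearer {token}"}
    while True:
        response = requests.get(url, headers=headers)
        if response.status_code == 202:
            # Export still running; task detail lives at the Content-Location header.
            time.sleep(5)
            continue
        response.raise_for_status()  # surfaces the 403/404 error branches
        return response.content      # ZIP bytes, per the Content-Disposition header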
@action(
detail=True,
methods=["get"],
url_path="compliance/(?P<name>[^/]+)",
url_name="compliance",
)
def compliance(self, request, pk=None, name=None):
scan = self.get_object()
if name not in get_compliance_frameworks(scan.provider.provider):
return Response(
{"detail": f"Compliance '{name}' not found."},
status=status.HTTP_404_NOT_FOUND,
)
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{"detail": "The scan has no reports."}, status=status.HTTP_404_NOT_FOUND
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
prefix = os.path.join(
os.path.dirname(key_prefix), "compliance", f"{name}.csv"
)
loader = self._load_file(prefix, s3=True, bucket=bucket, list_objects=True)
else:
base = os.path.dirname(scan.output_location)
pattern = os.path.join(base, "compliance", f"*_{name}.csv")
loader = self._load_file(pattern, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "text/csv")
def create(self, request, *args, **kwargs):
input_serializer = self.get_serializer(data=request.data)
input_serializer.is_valid(raise_exception=True)
@@ -1673,24 +1525,10 @@ class ResourceViewSet(BaseRLSViewSet):
],
filters=True,
),
latest=extend_schema(
tags=["Finding"],
summary="List the latest findings",
description="Retrieve a list of the latest findings from the latest scans for each provider with options for "
"filtering by various criteria.",
filters=True,
),
metadata_latest=extend_schema(
tags=["Finding"],
summary="Retrieve metadata values from the latest findings",
description="Fetch unique metadata values from a set of findings from the latest scans for each provider. "
"This is useful for dynamic filtering.",
filters=True,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
class FindingViewSet(BaseRLSViewSet):
queryset = Finding.all_objects.all()
serializer_class = FindingSerializer
filterset_class = FindingFilter
@@ -1722,16 +1560,11 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
def get_serializer_class(self):
if self.action == "findings_services_regions":
return FindingDynamicFilterSerializer
elif self.action in ["metadata", "metadata_latest"]:
elif self.action == "metadata":
return FindingMetadataSerializer
return super().get_serializer_class()
def get_filterset_class(self):
if self.action in ["latest", "metadata_latest"]:
return LatestFindingFilter
return FindingFilter
def get_queryset(self):
tenant_id = self.request.tenant_id
user_roles = get_role(self.request.user)
@@ -1771,14 +1604,21 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
return super().filter_queryset(queryset)
def list(self, request, *args, **kwargs):
filtered_queryset = self.filter_queryset(self.get_queryset())
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Finding.all_objects,
select_related=["scan"],
prefetch_related=["resources"],
)
base_qs = self.filter_queryset(self.get_queryset())
paginated_ids = self.paginate_queryset(base_qs.values_list("id", flat=True))
if paginated_ids is not None:
ids = list(paginated_ids)
findings = (
Finding.all_objects.filter(tenant_id=self.request.tenant_id, id__in=ids)
.select_related("scan")
.prefetch_related("resources")
)
# Re-sort in Python to preserve ordering:
findings = sorted(findings, key=lambda x: ids.index(x.id))
serializer = self.get_serializer(findings, many=True)
return self.get_paginated_response(serializer.data)
serializer = self.get_serializer(base_qs, many=True)
return Response(serializer.data)
@action(detail=False, methods=["get"], url_name="findings_services_regions")
def findings_services_regions(self, request):
@@ -1803,104 +1643,26 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
# Force filter validation
filtered_queryset = self.filter_queryset(self.get_queryset())
tenant_id = self.request.tenant_id
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
tenant_id = request.tenant_id
query_params = request.query_params
filtered_ids = filtered_queryset.order_by().values("id")
queryset = ResourceScanSummary.objects.filter(tenant_id=tenant_id)
scan_based_filters = {}
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
if scans := query_params.get("filter[scan__in]") or query_params.get(
"filter[scan]"
):
queryset = queryset.filter(scan_id__in=scans.split(","))
scan_based_filters = {"id__in": scans.split(",")}
else:
exact = query_params.get("filter[inserted_at]")
gte = query_params.get("filter[inserted_at__gte]")
lte = query_params.get("filter[inserted_at__lte]")
date_filters = {}
if exact:
date = parse_date(exact)
datetime_start = datetime.combine(
date, datetime.min.time(), tzinfo=timezone.utc
)
datetime_end = datetime_start + timedelta(days=1)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
else:
if gte:
date_start = parse_date(gte)
datetime_start = datetime.combine(
date_start, datetime.min.time(), tzinfo=timezone.utc
)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
if lte:
date_end = parse_date(lte)
datetime_end = datetime.combine(
date_end + timedelta(days=1),
datetime.min.time(),
tzinfo=timezone.utc,
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
if date_filters:
queryset = queryset.filter(**date_filters)
scan_based_filters = {
key.lstrip("scan_"): value for key, value in date_filters.items()
}
# ToRemove: Temporary fallback mechanism
if not queryset.exists():
scan_ids = Scan.objects.filter(
tenant_id=tenant_id, **scan_based_filters
).values_list("id", flat=True)
for scan_id in scan_ids:
backfill_scan_resource_summaries_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
return Response(
get_findings_metadata_no_aggregations(tenant_id, filtered_queryset)
)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get(
"filter[resource_type]"
) or query_params.get("filter[resource_type__in]"):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.distinct()
.order_by("resource_type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
"regions": regions,
@@ -1909,108 +1671,7 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data)
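The inserted_at branch above exploits scan_id being a UUIDv7, whose most significant bits encode the creation timestamp: a calendar-date filter becomes a plain range condition on scan_id instead of a join against Scan. A standalone sketch of that translation, assuming datetime_to_uuid7 and uuid7_start from api.uuid_utils behave as their names suggest:

from datetime import date, datetime, timedelta, timezone

from api.uuid_utils import datetime_to_uuid7, uuid7_start

def scan_id_bounds_for_day(day: date):
    """Map a calendar day to a [lower, upper) UUIDv7 range for scan_id."""
    start = datetime.combine(day, datetime.min.time(), tzinfo=timezone.utc)
    end = start + timedelta(days=1)
    lower = uuid7_start(datetime_to_uuid7(start))  # smallest UUIDv7 stamped at `start`
    upper = uuid7_start(datetime_to_uuid7(end))    # smallest UUIDv7 stamped at `end`
    return lower, upper

# Usage mirroring the filter above (the date is hypothetical):
# lower, upper = scan_id_bounds_for_day(date(2025, 4, 22))
# queryset.filter(scan_id__gte=lower, scan_id__lt=upper)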
@action(detail=False, methods=["get"], url_name="latest")
def latest(self, request):
tenant_id = request.tenant_id
filtered_queryset = self.filter_queryset(self.get_queryset())
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Finding.all_objects,
select_related=["scan"],
prefetch_related=["resources"],
)
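Both latest endpoints select one scan per provider by ordering on ("provider_id", "-inserted_at") and calling distinct("provider_id"), which compiles to PostgreSQL's DISTINCT ON: within each provider_id group, the first row under that ordering (the newest scan) survives. The same pattern in isolation, annotated, using the models from this changeset:

latest_scan_ids = (
    Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
    .order_by("provider_id", "-inserted_at")  # newest scan first within each provider
    .distinct("provider_id")                  # DISTINCT ON (provider_id): keep that first row
    .values_list("id", flat=True)
)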
@action(
detail=False,
methods=["get"],
url_name="metadata_latest",
url_path="metadata/latest",
)
def metadata_latest(self, request):
tenant_id = request.tenant_id
query_params = request.query_params
latest_scans_queryset = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
)
latest_scans_ids = list(latest_scans_queryset.values_list("id", flat=True))
queryset = ResourceScanSummary.objects.filter(
tenant_id=tenant_id,
scan_id__in=latest_scans_queryset.values_list("id", flat=True),
)
# ToRemove: Temporary fallback mechanism
present_ids = set(
ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id__in=latest_scans_ids
)
.values_list("scan_id", flat=True)
.distinct()
)
missing_scan_ids = [sid for sid in latest_scans_ids if sid not in present_ids]
if missing_scan_ids:
for scan_id in missing_scan_ids:
backfill_scan_resource_summaries_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
return Response(
get_findings_metadata_no_aggregations(
tenant_id, self.filter_queryset(self.get_queryset())
)
)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get(
"filter[resource_type]"
) or query_params.get("filter[resource_type__in]"):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.distinct()
.order_by("resource_type")
)
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data)
return Response(serializer.data, status=status.HTTP_200_OK)
@extend_schema_view(
@@ -2577,8 +2238,8 @@ class OverviewViewSet(BaseRLSViewSet):
def _get_filtered_queryset(model):
if role.unlimited_visibility:
return model.all_objects.filter(tenant_id=self.request.tenant_id)
return model.all_objects.filter(
return model.objects.filter(tenant_id=self.request.tenant_id)
return model.objects.filter(
tenant_id=self.request.tenant_id, scan__provider__in=providers
)
@@ -2622,38 +2283,51 @@ class OverviewViewSet(BaseRLSViewSet):
tenant_id = self.request.tenant_id
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
resource_count_queryset = (
Resource.all_objects.filter(
tenant_id=tenant_id,
provider_id=OuterRef("scan__provider_id"),
)
.order_by()
.values("provider_id")
.annotate(cnt=Count("id"))
.values("cnt")
)
overview_queryset = (
ScanSummary.all_objects.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
.values(provider=F("scan__provider__provider"))
findings_aggregated = (
ScanSummary.objects.filter(tenant_id=tenant_id, scan_id__in=latest_scan_ids)
.values("scan__provider__provider")
.annotate(
findings_passed=Coalesce(Sum("_pass"), 0),
findings_failed=Coalesce(Sum("fail"), 0),
findings_muted=Coalesce(Sum("muted"), 0),
total_findings=Coalesce(Sum("total"), 0),
total_resources=Coalesce(Subquery(resource_count_queryset), 0),
)
)
serializer = OverviewProviderSerializer(overview_queryset, many=True)
resources_aggregated = (
Resource.objects.filter(tenant_id=tenant_id)
.values("provider__provider")
.annotate(total_resources=Count("id"))
)
resources_dict = {
row["provider__provider"]: row["total_resources"]
for row in resources_aggregated
}
overview = []
for row in findings_aggregated:
provider_type = row["scan__provider__provider"]
overview.append(
{
"provider": provider_type,
"total_resources": resources_dict.get(provider_type, 0),
"total_findings": row["total_findings"],
"findings_passed": row["findings_passed"],
"findings_failed": row["findings_failed"],
"findings_muted": row["findings_muted"],
}
)
serializer = OverviewProviderSerializer(overview, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@action(detail=False, methods=["get"], url_name="findings")
@@ -2662,16 +2336,22 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
aggregated_totals = filtered_queryset.aggregate(
_pass=Sum("_pass") or 0,
fail=Sum("fail") or 0,
@@ -2701,16 +2381,22 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
severity_counts = (
filtered_queryset.values("severity")
.annotate(count=Sum("total"))
@@ -2731,16 +2417,22 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
services_data = (
filtered_queryset.values("service")
.annotate(_pass=Sum("_pass"))

View File

@@ -111,7 +111,6 @@ SPECTACULAR_SETTINGS = {
"PREPROCESSING_HOOKS": [
"drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
],
"TITLE": "API Reference - Prowler",
}
WSGI_APPLICATION = "config.wsgi.application"
@@ -238,8 +237,9 @@ DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)

View File

@@ -39,9 +39,6 @@ IGNORED_EXCEPTIONS = [
"RequestExpired",
"ConnectionClosedError",
"MaxRetryError",
"AWSAccessKeyIDInvalidError",
"AWSSessionTokenExpiredError",
"EndpointConnectionError", # AWS Service is not available in a region
"Pool is closed", # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
# Authentication Errors from GCP
"ClientAuthenticationError",
@@ -66,6 +63,8 @@ IGNORED_EXCEPTIONS = [
"AzureClientIdAndClientSecretNotBelongingToTenantIdError",
"AzureHTTPResponseError",
"Error with credentials provided",
# AWS Service is not available in a region
"EndpointConnectionError",
]
@@ -97,6 +96,4 @@ sentry_sdk.init(
# possible.
"continuous_profiling_auto_start": True,
},
attach_stacktrace=True,
ignore_errors=IGNORED_EXCEPTIONS,
)
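The hunk above passes IGNORED_EXCEPTIONS straight to `ignore_errors`. An equivalent, more explicit filter can be written with sentry-sdk's standard `before_send` hook, dropping any event whose exception type name or message matches an entry; this is an alternative sketch, not what the code above does:

    import sentry_sdk

    def _drop_ignored(event, hint):
        exc_info = hint.get("exc_info")
        if exc_info:
            exc_type, exc_value, _ = exc_info
            for pattern in IGNORED_EXCEPTIONS:
                if pattern == exc_type.__name__ or pattern in str(exc_value):
                    return None  # drop the event entirely
        return event

    sentry_sdk.init(dsn="", before_send=_drop_ignored)  # empty DSN disables sending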

View File

@@ -10,7 +10,6 @@ from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.test import APIClient
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.db_utils import rls_transaction
from api.models import (
@@ -921,55 +920,6 @@ def integrations_fixture(providers_fixture):
return integration1, integration2
@pytest.fixture
def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
for scan_instance in scans_fixture:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
provider = providers_fixture[0]
tenant_id = str(providers_fixture[0].tenant_id)
resource = resources_fixture[0]
scan = Scan.objects.create(
name="latest completed scan",
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.COMPLETED,
tenant_id=tenant_id,
)
finding = Finding.objects.create(
tenant_id=tenant_id,
uid="test_finding_uid_1",
scan=scan,
delta="new",
status=Status.FAIL,
status_extended="test status extended ",
impact=Severity.critical,
impact_extended="test impact extended one",
severity=Severity.critical,
raw_result={
"status": Status.FAIL,
"impact": Severity.critical,
"severity": Severity.critical,
},
tags={"test": "dev-qa"},
check_id="test_check_id",
check_metadata={
"CheckId": "test_check_id",
"Description": "test description apple sauce",
},
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
return finding
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}

View File

@@ -1,61 +0,0 @@
from api.db_utils import rls_transaction
from api.models import (
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StateChoices,
)
def backfill_resource_scan_summaries(tenant_id: str, scan_id: str):
with rls_transaction(tenant_id):
if ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).exists():
return {"status": "already backfilled"}
with rls_transaction(tenant_id):
if not Scan.objects.filter(
tenant_id=tenant_id,
id=scan_id,
state__in=(StateChoices.COMPLETED, StateChoices.FAILED),
).exists():
return {"status": "scan is not completed"}
resource_ids_qs = (
ResourceFindingMapping.objects.filter(
tenant_id=tenant_id, finding__scan_id=scan_id
)
.values_list("resource_id", flat=True)
.distinct()
)
resource_ids = list(resource_ids_qs)
if not resource_ids:
return {"status": "no resources to backfill"}
resources_qs = Resource.objects.filter(
tenant_id=tenant_id, id__in=resource_ids
).only("id", "service", "region", "type")
summaries = []
for resource in resources_qs.iterator():
summaries.append(
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=str(resource.id),
service=resource.service,
region=resource.region,
resource_type=resource.type,
)
)
for i in range(0, len(summaries), 500):
ResourceScanSummary.objects.bulk_create(
summaries[i : i + 500], ignore_conflicts=True
)
return {"status": "backfilled", "inserted": len(summaries)}

View File

@@ -13,41 +13,6 @@ from prowler.config.config import (
json_ocsf_file_suffix,
output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
from prowler.lib.outputs.compliance.ens.ens_gcp import GCPENS
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_azure import AzureISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
KubernetesISO27001,
)
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_aws import (
ProwlerThreatScoreAWS,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_azure import (
ProwlerThreatScoreAzure,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_gcp import (
ProwlerThreatScoreGCP,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_m365 import (
ProwlerThreatScoreM365,
)
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
@@ -55,44 +20,6 @@ from prowler.lib.outputs.ocsf.ocsf import OCSF
logger = get_task_logger(__name__)
COMPLIANCE_CLASS_MAP = {
"aws": [
(lambda name: name.startswith("cis_"), AWSCIS),
(lambda name: name == "mitre_attack_aws", AWSMitreAttack),
(lambda name: name.startswith("ens_"), AWSENS),
(
lambda name: name.startswith("aws_well_architected_framework"),
AWSWellArchitected,
),
(lambda name: name.startswith("iso27001_"), AWSISO27001),
(lambda name: name.startswith("kisa"), AWSKISAISMSP),
(lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
],
"azure": [
(lambda name: name.startswith("cis_"), AzureCIS),
(lambda name: name == "mitre_attack_azure", AzureMitreAttack),
(lambda name: name.startswith("ens_"), AzureENS),
(lambda name: name.startswith("iso27001_"), AzureISO27001),
(lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
],
"gcp": [
(lambda name: name.startswith("cis_"), GCPCIS),
(lambda name: name == "mitre_attack_gcp", GCPMitreAttack),
(lambda name: name.startswith("ens_"), GCPENS),
(lambda name: name.startswith("iso27001_"), GCPISO27001),
(lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
],
"kubernetes": [
(lambda name: name.startswith("cis_"), KubernetesCIS),
(lambda name: name.startswith("iso27001_"), KubernetesISO27001),
],
"m365": [
(lambda name: name.startswith("cis_"), M365CIS),
(lambda name: name == "prowler_threatscore_m365", ProwlerThreatScoreM365),
],
}
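Each entry in the map pairs a name predicate with a writer class, so resolving a framework is a first-match walk over the provider's list with a generic fallback; this mirrors the loop used further down in `generate_outputs` (a sketch, assuming `GenericCompliance` is importable as in tasks.py):

    def resolve_compliance_class(provider_type: str, framework_name: str):
        # First matching predicate wins; GenericCompliance is the fallback.
        for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
            if condition(framework_name):
                return cls
        return GenericCompliance

    resolve_compliance_class("aws", "cis_3.0_aws")        # -> AWSCIS
    resolve_compliance_class("aws", "unknown_framework")  # -> GenericCompliance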
# Predefined mapping for output formats and their configurations
OUTPUT_FORMATS_MAPPING = {
"csv": {
@@ -116,17 +43,13 @@ def _compress_output_files(output_directory: str) -> str:
str: The full path to the newly created ZIP archive.
"""
zip_path = f"{output_directory}.zip"
parent_dir = os.path.dirname(output_directory)
zip_path_abs = os.path.abspath(zip_path)
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
for foldername, _, filenames in os.walk(parent_dir):
for filename in filenames:
file_path = os.path.join(foldername, filename)
if os.path.abspath(file_path) == zip_path_abs:
continue
arcname = os.path.relpath(file_path, start=parent_dir)
zipf.write(file_path, arcname)
for suffix in [config["suffix"] for config in OUTPUT_FORMATS_MAPPING.values()]:
zipf.write(
f"{output_directory}{suffix}",
f"output/{output_directory.split('/')[-1]}{suffix}",
)
return zip_path
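The directory-walking variant of `_compress_output_files` must skip the archive it is writing: the zip lives inside the tree being walked, so without the guard `os.walk` would pick up the partially written file and the archive would try to include itself. A self-contained sketch of that guard (paths are illustrative):

    import os
    import zipfile

    def zip_tree(root: str, zip_path: str) -> None:
        zip_abs = os.path.abspath(zip_path)
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
            for folder, _, filenames in os.walk(root):
                for filename in filenames:
                    file_path = os.path.join(folder, filename)
                    if os.path.abspath(file_path) == zip_abs:
                        continue  # skip the in-progress archive
                    zipf.write(file_path, os.path.relpath(file_path, start=root))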
@@ -179,38 +102,25 @@ def _upload_to_s3(tenant_id: str, zip_path: str, scan_id: str) -> str:
Raises:
botocore.exceptions.ClientError: If the upload attempt to S3 fails for any reason.
"""
bucket = base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET
if not bucket:
return None
if not base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET:
return
try:
s3 = get_s3_client()
# Upload the ZIP file (outputs) to the S3 bucket
zip_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3.upload_file(
Filename=zip_path,
Bucket=bucket,
Key=zip_key,
Bucket=base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET,
Key=s3_key,
)
# Upload the compliance directory to the S3 bucket
compliance_dir = os.path.join(os.path.dirname(zip_path), "compliance")
for filename in os.listdir(compliance_dir):
local_path = os.path.join(compliance_dir, filename)
if not os.path.isfile(local_path):
continue
file_key = f"{tenant_id}/{scan_id}/compliance/{filename}"
s3.upload_file(Filename=local_path, Bucket=bucket, Key=file_key)
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{zip_key}"
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{s3_key}"
except (ClientError, NoCredentialsError, ParamValidationError, ValueError) as e:
logger.error(f"S3 upload failed: {str(e)}")
def _generate_output_directory(
output_directory, prowler_provider: object, tenant_id: str, scan_id: str
) -> tuple[str, str]:
) -> str:
"""
Generate a file system path for the output directory of a prowler scan.
@@ -235,8 +145,7 @@ def _generate_output_directory(
Example:
>>> _generate_output_directory("/tmp", "aws", "tenant-1234", "scan-5678")
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56',
'/tmp/tenant-1234/aws/scan-5678/compliance/prowler-output-2023-02-15T12:34:56'
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56'
"""
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
@@ -244,10 +153,4 @@ def _generate_output_directory(
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
return path, compliance_path
return path

View File

@@ -19,7 +19,6 @@ from api.models import (
Finding,
Provider,
Resource,
ResourceScanSummary,
ResourceTag,
Scan,
ScanSummary,
@@ -122,7 +121,6 @@ def perform_prowler_scan(
check_status_by_region = {}
exception = None
unique_resources = set()
scan_resource_cache: set[tuple[str, str, str, str]] = set()
start_time = time.time()
with rls_transaction(tenant_id):
@@ -297,16 +295,6 @@ def perform_prowler_scan(
continue
region_dict[finding.check_id] = finding.status.value
# Update scan resource summaries
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
)
)
# Update scan progress
with rls_transaction(tenant_id):
scan_instance.progress = progress
@@ -326,90 +314,66 @@ def perform_prowler_scan(
scan_instance.unique_resource_count = len(unique_resources)
scan_instance.save()
if exception is None:
try:
regions = prowler_provider.get_regions()
except AttributeError:
regions = set()
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance overview objects
compliance_overview_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
compliance_overview_objects.append(
ComplianceOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
description=compliance["description"],
requirements=compliance["requirements"],
requirements_passed=compliance["requirements_status"]["passed"],
requirements_failed=compliance["requirements_status"]["failed"],
requirements_manual=compliance["requirements_status"]["manual"],
total_requirements=compliance["total_requirements"],
)
)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=100
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
if exception is not None:
raise exception
try:
regions = prowler_provider.get_regions()
except AttributeError:
regions = set()
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance overview objects
compliance_overview_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
compliance_overview_objects.append(
ComplianceOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
description=compliance["description"],
requirements=compliance["requirements"],
requirements_passed=compliance["requirements_status"]["passed"],
requirements_failed=compliance["requirements_status"]["failed"],
requirements_manual=compliance["requirements_status"]["manual"],
total_requirements=compliance["total_requirements"],
)
)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=500
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
try:
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries, batch_size=500, ignore_conflicts=True
)
except Exception as filter_exception:
import sentry_sdk
sentry_sdk.capture_exception(filter_exception)
logger.error(
f"Error storing filter values for scan {scan_id}: {filter_exception}"
)
serializer = ScanTaskSerializer(instance=scan_instance)
return serializer.data
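One detail worth noting in the compliance block above: the region map is built with `deepcopy(compliance_template)` per region because the template is a nested dict of counters, and sharing a single reference would let one region's totals bleed into every other region. A tiny illustration of why the copy matters (the framework id and fields are made up):

    from copy import deepcopy

    template = {"cis_2.0": {"requirements_status": {"passed": 0, "failed": 0}}}
    by_region = {r: deepcopy(template) for r in ("us-east-1", "eu-west-1")}
    by_region["us-east-1"]["cis_2.0"]["requirements_status"]["passed"] += 1
    assert by_region["eu-west-1"]["cis_2.0"]["requirements_status"]["passed"] == 0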

View File

@@ -7,11 +7,9 @@ from celery.utils.log import get_task_logger
from config.celery import RLSTask
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
from django_celery_beat.models import PeriodicTask
from tasks.jobs.backfill import backfill_resource_scan_summaries
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.export import (
COMPLIANCE_CLASS_MAP,
OUTPUT_FORMATS_MAPPING,
_compress_output_files,
_generate_output_directory,
@@ -20,14 +18,11 @@ from tasks.jobs.export import (
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
from tasks.utils import batched, get_next_execution_datetime
from api.compliance import get_compliance_frameworks
from api.db_utils import rls_transaction
from api.decorators import set_tenant
from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
from api.utils import initialize_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
@@ -256,118 +251,84 @@ def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
logger.info(f"No findings found for scan {scan_id}")
return {"upload": False}
provider_obj = Provider.objects.get(id=provider_id)
prowler_provider = initialize_prowler_provider(provider_obj)
provider_uid = provider_obj.uid
provider_type = provider_obj.provider
# Initialize the prowler provider
prowler_provider = initialize_prowler_provider(Provider.objects.get(id=provider_id))
frameworks_bulk = Compliance.get_bulk(provider_type)
frameworks_avail = get_compliance_frameworks(provider_type)
out_dir, comp_dir = _generate_output_directory(
# Get the provider UID
provider_uid = Provider.objects.get(id=provider_id).uid
# Generate and ensure the output directory exists
output_directory = _generate_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY, provider_uid, tenant_id, scan_id
)
def get_writer(writer_map, name, factory, is_last):
"""
Return existing writer_map[name] or create via factory().
In both cases set `.close_file = is_last`.
"""
initialization = False
if name not in writer_map:
writer_map[name] = factory()
initialization = True
w = writer_map[name]
w.close_file = is_last
return w, initialization
# Define auxiliary variables
output_writers = {}
compliance_writers = {}
scan_summary = FindingOutput._transform_findings_stats(
ScanSummary.objects.filter(scan_id=scan_id)
)
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Retrieve findings queryset
findings_qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid")
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
# Process findings in batches
for batch, is_last_batch in batched(
findings_qs.iterator(), DJANGO_FINDINGS_BATCH_SIZE
):
finding_outputs = [
FindingOutput.transform_api_finding(finding, prowler_provider)
for finding in batch
]
# Generate output files
for mode, config in OUTPUT_FORMATS_MAPPING.items():
kwargs = dict(config.get("kwargs", {}))
if mode == "html":
extra.update(provider=prowler_provider, stats=scan_summary)
kwargs["provider"] = prowler_provider
kwargs["stats"] = scan_summary
writer, initialization = get_writer(
output_writers,
cls,
lambda cls=cls, fos=fos, suffix=suffix: cls(
findings=fos,
file_path=out_dir,
file_extension=suffix,
writer_class = config["class"]
if writer_class in output_writers:
writer = output_writers[writer_class]
writer.transform(finding_outputs)
writer.close_file = is_last_batch
else:
writer = writer_class(
findings=finding_outputs,
file_path=output_directory,
file_extension=config["suffix"],
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos)
writer.batch_write_data_to_file(**extra)
writer._data.clear()
)
writer.close_file = is_last_batch
output_writers[writer_class] = writer
# Compliance CSVs
for name in frameworks_avail:
compliance_obj = frameworks_bulk[name]
# Write the current batch using the writer
writer.batch_write_data_to_file(**kwargs)
klass = GenericCompliance
for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
if condition(name):
klass = cls
break
# TODO: Refactor the output classes to avoid this manual reset
writer._data = []
filename = f"{comp_dir}_{name}.csv"
# Compress output files
output_directory = _compress_output_files(output_directory)
writer, initialization = get_writer(
compliance_writers,
name,
lambda klass=klass, fos=fos: klass(
findings=fos,
compliance=compliance_obj,
file_path=filename,
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos, compliance_obj, name)
writer.batch_write_data_to_file()
writer._data.clear()
# Save to configured storage
uploaded = _upload_to_s3(tenant_id, output_directory, scan_id)
compressed = _compress_output_files(out_dir)
upload_uri = _upload_to_s3(tenant_id, compressed, scan_id)
if upload_uri:
if uploaded:
# Remove the local files after upload
try:
rmtree(Path(compressed).parent, ignore_errors=True)
except Exception as e:
rmtree(Path(output_directory).parent, ignore_errors=True)
except FileNotFoundError as e:
logger.error(f"Error deleting output files: {e}")
final_location, did_upload = upload_uri, True
output_directory = uploaded
uploaded = True
else:
final_location, did_upload = compressed, False
uploaded = False
Scan.all_objects.filter(id=scan_id).update(output_location=final_location)
logger.info(f"Scan outputs at {final_location}")
return {"upload": did_upload}
# Update the scan instance with the output path
Scan.all_objects.filter(id=scan_id).update(output_location=output_directory)
logger.info(f"Scan output files generated, output location: {output_directory}")
@shared_task(name="backfill-scan-resource-summaries", queue="backfill")
def backfill_scan_resource_summaries_task(tenant_id: str, scan_id: str):
"""
Tries to backfill the resource scan summaries table for a given scan.
Args:
tenant_id (str): The tenant identifier.
scan_id (str): The scan identifier.
"""
return backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
return {"upload": uploaded}

View File

@@ -1,79 +0,0 @@
from uuid import uuid4
import pytest
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.models import ResourceScanSummary, Scan, StateChoices
@pytest.mark.django_db
class TestBackfillResourceScanSummaries:
@pytest.fixture(scope="function")
def resource_scan_summary_data(self, scans_fixture):
scan = scans_fixture[0]
return ResourceScanSummary.objects.create(
tenant_id=scan.tenant_id,
scan_id=scan.id,
resource_id=str(uuid4()),
service="aws",
region="us-east-1",
resource_type="instance",
)
@pytest.fixture(scope="function")
def get_not_completed_scans(self, providers_fixture):
provider_id = providers_fixture[0].id
tenant_id = providers_fixture[0].tenant_id
scan_1 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.EXECUTING,
provider_id=provider_id,
)
scan_2 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.AVAILABLE,
provider_id=provider_id,
)
return scan_1, scan_2
def test_already_backfilled(self, resource_scan_summary_data):
tenant_id = resource_scan_summary_data.tenant_id
scan_id = resource_scan_summary_data.scan_id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "already backfilled"}
def test_not_completed_scan(self, get_not_completed_scans):
for scan_instance in get_not_completed_scans:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "scan is not completed"}
def test_successful_backfill_inserts_one_summary(
self, resources_fixture, findings_fixture
):
tenant_id = findings_fixture[0].tenant_id
scan_id = findings_fixture[0].scan_id
# This scan affects the first two resources
resources = resources_fixture[:2]
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "backfilled", "inserted": len(resources)}
# Verify correct values
summaries = ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
)
assert summaries.count() == len(resources)
for resource in resources:
summary = summaries.get(resource_id=resource.id)
assert summary.resource_id == resource.id
assert summary.service == resource.service
assert summary.region == resource.region
assert summary.resource_type == resource.type

View File

@@ -1,147 +0,0 @@
import os
import zipfile
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError
from tasks.jobs.export import (
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
get_s3_client,
)
@pytest.mark.django_db
class TestOutputs:
def test_compress_output_files_creates_zip(self, tmpdir):
base_tmp = Path(str(tmpdir.mkdir("compress_output")))
output_dir = base_tmp / "output"
output_dir.mkdir()
file_path = output_dir / "result.csv"
file_path.write_text("data")
zip_path = _compress_output_files(str(output_dir))
assert zip_path.endswith(".zip")
assert os.path.exists(zip_path)
with zipfile.ZipFile(zip_path, "r") as zipf:
assert "output/result.csv" in zipf.namelist()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_success(self, mock_settings, mock_boto_client):
mock_settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = "token"
mock_settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = "eu-west-1"
client_mock = MagicMock()
mock_boto_client.return_value = client_mock
client = get_s3_client()
assert client is not None
client_mock.list_buckets.assert_called()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_fallback(self, mock_settings, mock_boto_client):
mock_boto_client.side_effect = [
ClientError({"Error": {"Code": "403"}}, "ListBuckets"),
MagicMock(),
]
client = get_s3_client()
assert client is not None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_success(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_success")))
zip_path = base_tmp / "outputs.zip"
zip_path.write_bytes(b"dummy")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("ok")
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant-id", str(zip_path), "scan-id")
expected_uri = "s3://test-bucket/tenant-id/scan-id/outputs.zip"
assert result == expected_uri
assert client_mock.upload_file.call_count == 2
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_missing_bucket(self, mock_base, mock_get_client):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = ""
result = _upload_to_s3("tenant", "/tmp/fake.zip", "scan")
assert result is None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_skips_non_files(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_skips_non_files")))
zip_path = base_tmp / "results.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "subdir").mkdir()
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant", str(zip_path), "scan")
expected_uri = "s3://test-bucket/tenant/scan/results.zip"
assert result == expected_uri
client_mock.upload_file.assert_called_once()
@patch(
"tasks.jobs.export.get_s3_client",
side_effect=ClientError({"Error": {}}, "Upload"),
)
@patch("tasks.jobs.export.base")
@patch("tasks.jobs.export.logger.error")
def test_upload_to_s3_failure_logs_error(
self, mock_logger, mock_base, mock_get_client, tmpdir
):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_failure_logs")))
zip_path = base_tmp / "zipfile.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("csv")
_upload_to_s3("tenant", str(zip_path), "scan")
mock_logger.assert_called()
def test_generate_output_directory_creates_paths(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")

View File

@@ -1,415 +0,0 @@
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from tasks.tasks import generate_outputs
@pytest.mark.django_db
class TestGenerateOutputs:
def setup_method(self):
self.scan_id = str(uuid.uuid4())
self.provider_id = str(uuid.uuid4())
self.tenant_id = str(uuid.uuid4())
def test_no_findings_returns_early(self):
with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
mock_filter.return_value.exists.return_value = False
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_filter.assert_called_once_with(scan_id=self.scan_id)
@patch("tasks.tasks.rmtree")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Finding.all_objects.filter")
def test_generate_outputs_happy_path(
self,
mock_finding_filter,
mock_scan_summary_filter,
mock_provider_get,
mock_initialize_provider,
mock_compliance_get_bulk,
mock_get_available_frameworks,
mock_compress,
mock_upload,
mock_rmtree,
):
mock_scan_summary_filter.return_value.exists.return_value = True
mock_provider = MagicMock()
mock_provider.uid = "provider-uid"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
prowler_provider = MagicMock()
mock_initialize_provider.return_value = prowler_provider
mock_compliance_get_bulk.return_value = {"cis": MagicMock()}
mock_get_available_frameworks.return_value = ["cis"]
dummy_finding = MagicMock(uid="f1")
mock_finding_filter.return_value.order_by.return_value.iterator.return_value = [
[dummy_finding],
True,
]
mock_transformed_stats = {"some": "stats"}
with (
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value=mock_transformed_stats,
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value={"transformed": "f1"},
),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="JSONWriter"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock(name="CSVCompliance"))]},
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("out-dir", "comp-dir"),
),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_compress.return_value = "/tmp/zipped.zip"
mock_upload.return_value = "s3://bucket/zipped.zip"
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
mock_scan_update.return_value.update.assert_called_once_with(
output_location="s3://bucket/zipped.zip"
)
mock_rmtree.assert_called_once_with(
Path("/tmp/zipped.zip").parent, ignore_errors=True
)
def test_generate_outputs_fails_upload(self):
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks"),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch("tasks.tasks.FindingOutput.transform_api_finding"),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="Writer"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value=None),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
result = generate_outputs(
scan_id="scan",
provider_id="provider",
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_scan_update.return_value.update.assert_called_once()
def test_generate_outputs_triggers_html_extra_update(self):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/f.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
html_writer_mock = MagicMock()
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"html": {
"class": lambda *args, **kwargs: html_writer_mock,
"suffix": ".html",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
html_writer_mock.batch_write_data_to_file.assert_called_once()
def test_transform_called_only_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
tf1 = MagicMock()
tf1.compliance = {}
tf2 = MagicMock()
tf2.compliance = {}
writer_instances = []
class TrackingWriter:
def __init__(self, findings, file_path, file_extension, from_cli):
self.transform_called = 0
self.batch_write_data_to_file = MagicMock()
self._data = []
self.close_file = False
writer_instances.append(self)
def transform(self, fos):
self.transform_called += 1
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks", return_value=[]),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=[tf1, tf2],
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch(
"tasks.tasks.batched",
return_value=[
([raw1], False),
([raw2], True),
],
),
):
mock_summary.return_value.exists.return_value = True
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": TrackingWriter,
"suffix": ".json",
"kwargs": {},
}
},
):
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_called == 1
def test_compliance_transform_called_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
compliance_obj = MagicMock()
writer_instances = []
class TrackingComplianceWriter:
def __init__(self, *args, **kwargs):
self.transform_calls = []
self._data = []
writer_instances.append(self)
def transform(self, fos, comp_obj, name):
self.transform_calls.append((fos, comp_obj, name))
def batch_write_data_to_file(self):
pass
two_batches = [
([raw1], False),
([raw2], True),
]
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch(
"tasks.tasks.Provider.objects.get",
return_value=MagicMock(uid="UID", provider="aws"),
),
patch("tasks.tasks.initialize_prowler_provider"),
patch(
"tasks.tasks.Compliance.get_bulk", return_value={"cis": compliance_obj}
),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=lambda f, prov: f,
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch(
"tasks.tasks.Scan.all_objects.filter",
return_value=MagicMock(update=lambda **kw: None),
),
patch("tasks.tasks.batched", return_value=two_batches),
patch("tasks.tasks.OUTPUT_FORMATS_MAPPING", {}),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda name: True, TrackingComplianceWriter)]},
),
):
mock_summary.return_value.exists.return_value = True
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_calls == [([raw2], compliance_obj, "cis")]
assert result == {"upload": True}
def test_generate_outputs_logs_rmtree_exception(self, caplog):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/file.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch("tasks.tasks.rmtree", side_effect=Exception("Test deletion error")),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": lambda *args, **kwargs: MagicMock(),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
with caplog.at_level("ERROR"):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert "Error deleting output files" in caplog.text

View File

@@ -1,156 +0,0 @@
#!/usr/bin/env python3
import argparse
import os
import re
import subprocess
import sys
from pathlib import Path
import matplotlib.pyplot as plt
import pandas as pd
plt.style.use("ggplot")
def run_locust(
locust_file: str,
host: str,
users: int,
hatch_rate: int,
run_time: str,
csv_prefix: Path,
) -> Path:
artifacts_dir = Path("artifacts")
artifacts_dir.mkdir(parents=True, exist_ok=True)
cmd = [
"locust",
"-f",
f"scenarios/{locust_file}",
"--headless",
"-u",
str(users),
"-r",
str(hatch_rate),
"-t",
run_time,
"--host",
host,
"--csv",
str(artifacts_dir / csv_prefix.name),
]
print(f"Running Locust: {' '.join(cmd)}")
process = subprocess.run(cmd)
if process.returncode:
sys.exit("Locust execution failed")
stats_file = artifacts_dir / f"{csv_prefix.stem}_stats.csv"
if not stats_file.exists():
sys.exit(f"Stats CSV not found: {stats_file}")
return stats_file
def load_percentiles(csv_path: Path) -> pd.DataFrame:
df = pd.read_csv(csv_path)
mapping = {"50%": "p50", "75%": "p75", "90%": "p90", "95%": "p95"}
available = [col for col in mapping if col in df.columns]
renamed = {col: mapping[col] for col in available}
df = df.rename(columns=renamed).set_index("Name")[renamed.values()]
return df.drop(index=["Aggregated"], errors="ignore")
def sanitize_label(label: str) -> str:
text = re.sub(r"[^\w]+", "_", label.strip().lower())
return text.strip("_")
def plot_multi_comparison(metrics: dict[str, pd.DataFrame]) -> None:
common = sorted(set.intersection(*(set(df.index) for df in metrics.values())))
percentiles = list(next(iter(metrics.values())).columns)
groups = len(metrics)
width = 0.8 / groups
for endpoint in common:
fig, ax = plt.subplots(figsize=(10, 5), dpi=100)
for idx, (label, df) in enumerate(metrics.items()):
series = df.loc[endpoint]
positions = [
i + (idx - groups / 2) * width + width / 2
for i in range(len(percentiles))
]
bars = ax.bar(positions, series.values, width, label=label)
for bar in bars:
height = bar.get_height()
ax.annotate(
f"{int(height)}",
xy=(bar.get_x() + bar.get_width() / 2, height),
xytext=(0, 3),
textcoords="offset points",
ha="center",
va="bottom",
fontsize=8,
)
ax.set_xticks(range(len(percentiles)))
ax.set_xticklabels(percentiles)
ax.set_ylabel("Latency (ms)")
ax.set_title(endpoint, fontsize=12)
ax.grid(True, axis="y", linestyle="--", alpha=0.7)
fig.tight_layout()
fig.subplots_adjust(right=0.75)
ax.legend(loc="center left", bbox_to_anchor=(1, 0.5), framealpha=0.9)
output = Path("artifacts") / f"comparison_{sanitize_label(endpoint)}.png"
plt.savefig(output)
plt.close(fig)
print(f"Saved chart: {output}")
def main() -> None:
parser = argparse.ArgumentParser(description="Run Locust and compare metrics")
parser.add_argument("--locustfile", required=True, help="Locust file in scenarios/")
parser.add_argument("--host", required=True, help="Target host URL")
parser.add_argument(
"--users", type=int, default=10, help="Number of simulated users"
)
parser.add_argument("--rate", type=int, default=1, help="Hatch rate per second")
parser.add_argument("--time", default="1m", help="Test duration (e.g. 30s, 1m)")
parser.add_argument(
"--metrics-dir", default="baselines", help="Directory with CSV baselines"
)
parser.add_argument("--version", default="current", help="Test version")
args = parser.parse_args()
metrics_dir = Path(args.metrics_dir)
os.makedirs(metrics_dir, exist_ok=True)
metrics_data: dict[str, pd.DataFrame] = {}
for csv_file in sorted(metrics_dir.glob("*.csv")):
metrics_data[csv_file.stem] = load_percentiles(csv_file)
current_prefix = Path(args.version)
current_csv = run_locust(
locust_file=args.locustfile,
host=args.host,
users=args.users,
hatch_rate=args.rate,
run_time=args.time,
csv_prefix=current_prefix,
)
metrics_data[args.version] = load_percentiles(current_csv)
for endpoint in sorted(
set.intersection(*(set(df.index) for df in metrics_data.values()))
):
parts = [endpoint]
for label, df in metrics_data.items():
s = df.loc[endpoint]
parts.append(f"{label}: p50 {s.p50}, p75 {s.p75}, p90 {s.p90}, p95 {s.p95}")
print(" | ".join(parts))
plot_multi_comparison(metrics_data)
if __name__ == "__main__":
main()
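A quick, self-contained check of `load_percentiles`, assuming the usual Locust stats CSV layout with a `Name` column plus percentile columns; the `Aggregated` row is dropped exactly as the function does:

    import tempfile
    from pathlib import Path

    with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as fh:
        fh.write("Name,50%,75%,90%,95%\n/findings,120,180,240,300\nAggregated,130,190,250,310\n")

    frame = load_percentiles(Path(fh.name))
    assert list(frame.columns) == ["p50", "p75", "p90", "p95"]
    assert "/findings" in frame.index and "Aggregated" not in frame.index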

View File

@@ -1,3 +0,0 @@
locust==2.34.1
matplotlib==3.10.1
pandas==2.2.3

View File

@@ -1,216 +0,0 @@
from locust import events, task
from utils.config import (
FINDINGS_UI_SORT_VALUES,
L_PROVIDER_NAME,
M_PROVIDER_NAME,
S_PROVIDER_NAME,
TARGET_INSERTED_AT,
)
from utils.helpers import (
APIUserBase,
get_api_token,
get_auth_headers,
get_next_resource_filter,
get_resource_filters_pairs,
get_scan_id_from_provider_name,
get_sort_value,
)
GLOBAL = {
"token": None,
"scan_ids": {},
"resource_filters": None,
"large_resource_filters": None,
}
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
GLOBAL["scan_ids"]["small"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], S_PROVIDER_NAME
)
GLOBAL["scan_ids"]["medium"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], M_PROVIDER_NAME
)
GLOBAL["scan_ids"]["large"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], L_PROVIDER_NAME
)
GLOBAL["resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"]
)
GLOBAL["large_resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"], GLOBAL["scan_ids"]["large"]
)
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
self.s_scan_id = GLOBAL["scan_ids"]["small"]
self.m_scan_id = GLOBAL["scan_ids"]["medium"]
self.l_scan_id = GLOBAL["scan_ids"]["large"]
self.available_resource_filters = GLOBAL["resource_filters"]
self.available_resource_filters_large_scan = GLOBAL["large_resource_filters"]
@task
def findings_default(self):
name = "/findings"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_default_include(self):
name = "/findings?include"
page = self._next_page(name)
endpoint = (
f"/findings?page[number]={page}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata(self):
endpoint = f"/findings/metadata?" f"filter[inserted_at]={TARGET_INSERTED_AT}"
self.client.get(
endpoint, headers=get_auth_headers(self.token), name="/findings/metadata"
)
@task
def findings_scan_small(self):
name = "/findings?filter[scan_id] - 50k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.s_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_small(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.s_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 50k",
)
@task(2)
def findings_scan_medium(self):
name = "/findings?filter[scan_id] - 250k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.m_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_medium(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.m_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 250k",
)
@task
def findings_scan_large(self):
name = "/findings?filter[scan_id] - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_scan_large_include(self):
name = "/findings?filter[scan_id]&include - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_large(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.l_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 500k",
)
@task(2)
def findings_resource_filter(self):
name = "/findings?filter[resource_filter]&include"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter(self):
name = "/findings/metadata?filter[resource_filter]"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter_scan_large(self):
name = "/findings/metadata?filter[resource_filter]&filter[scan_id] - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[scan]={self.l_scan_id}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def findings_resource_filter_large_scan_include(self):
name = "/findings?filter[resource_filter][scan]&include - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

View File

@@ -1,19 +0,0 @@
import os
USER_EMAIL = os.environ.get("USER_EMAIL")
USER_PASSWORD = os.environ.get("USER_PASSWORD")
BASE_HEADERS = {"Content-Type": "application/vnd.api+json"}
FINDINGS_UI_SORT_VALUES = ["severity", "status", "-inserted_at"]
TARGET_INSERTED_AT = os.environ.get("TARGET_INSERTED_AT", "2025-04-22")
FINDINGS_RESOURCE_METADATA = {
"regions": "region",
"resource_types": "resource_type",
"services": "service",
}
S_PROVIDER_NAME = "provider-50k"
M_PROVIDER_NAME = "provider-250k"
L_PROVIDER_NAME = "provider-500k"

View File

@@ -1,168 +0,0 @@
import random
from collections import defaultdict
from threading import Lock
import requests
from locust import HttpUser, between
from utils.config import (
BASE_HEADERS,
FINDINGS_RESOURCE_METADATA,
TARGET_INSERTED_AT,
USER_EMAIL,
USER_PASSWORD,
)
_global_page_counters = defaultdict(int)
_page_lock = Lock()
class APIUserBase(HttpUser):
"""
Base class for API user simulation in Locust performance tests.
Attributes:
abstract (bool): Indicates this is an abstract user class.
wait_time: Time between task executions, randomized between 1 and 5 seconds.
"""
abstract = True
wait_time = between(1, 5)
def _next_page(self, endpoint_name: str) -> int:
"""
Returns the next page number for a given endpoint. Thread-safe.
Args:
endpoint_name (str): Name of the API endpoint being paginated.
Returns:
int: The next page number for the given endpoint.
"""
with _page_lock:
_global_page_counters[endpoint_name] += 1
return _global_page_counters[endpoint_name]
def get_next_resource_filter(available_values: dict) -> tuple:
"""
Randomly selects a filter type and value from available options.
Args:
available_values (dict): Dictionary with filter types as keys and list of possible values.
Returns:
tuple: A (filter_type, filter_value) pair randomly selected.
"""
filter_type = random.choice(list(available_values.keys()))
filter_value = random.choice(available_values[filter_type])
return filter_type, filter_value
def get_auth_headers(token: str) -> dict:
"""
Returns the headers for the API requests.
Args:
token (str): The token to be included in the headers.
Returns:
dict: The headers for the API requests.
"""
return {
"Authorization": f"Bearer {token}",
**BASE_HEADERS,
}
def get_api_token(host: str) -> str:
"""
Authenticates with the API and retrieves a bearer token.
Args:
host (str): The host URL of the API.
Returns:
str: The access token for authenticated requests.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
login_payload = {
"data": {
"type": "tokens",
"attributes": {"email": USER_EMAIL, "password": USER_PASSWORD},
}
}
response = requests.post(f"{host}/tokens", json=login_payload, headers=BASE_HEADERS)
assert response.status_code == 200, f"Failed to get token: {response.text}"
return response.json()["data"]["attributes"]["access"]
def get_scan_id_from_provider_name(host: str, token: str, provider_name: str) -> str:
"""
Retrieves the scan ID associated with a specific provider name.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
provider_name (str): Name of the provider to filter scans by.
Returns:
str: The ID of the scan.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
response = requests.get(
f"{host}/scans?fields[scans]=id&filter[provider_alias]={provider_name}",
headers=get_auth_headers(token),
)
assert response.status_code == 200, f"Failed to get scan: {response.text}"
return response.json()["data"][0]["id"]
def get_resource_filters_pairs(host: str, token: str, scan_id: str = "") -> dict:
"""
Retrieves and maps resource metadata filter values from the findings endpoint.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
scan_id (str, optional): Optional scan ID to filter metadata. Defaults to using inserted_at timestamp.
Returns:
dict: A dictionary of resource filter metadata.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
metadata_filters = (
f"filter[scan]={scan_id}"
if scan_id
else f"filter[inserted_at]={TARGET_INSERTED_AT}"
)
response = requests.get(
f"{host}/findings/metadata?{metadata_filters}", headers=get_auth_headers(token)
)
assert (
response.status_code == 200
), f"Failed to get resource filters values: {response.text}"
attributes = response.json()["data"]["attributes"]
return {
FINDINGS_RESOURCE_METADATA[key]: values
for key, values in attributes.items()
if key in FINDINGS_RESOURCE_METADATA.keys()
}
def get_sort_value(sort_values: list) -> str:
"""
Constructs a sort query string from a list of sort keys.
Args:
sort_values (list): The list of sort values to include in the query.
Returns:
str: A formatted sort query string (e.g., "sort=created_at,-severity").
"""
return f"sort={','.join(sort_values)}"

View File

@@ -1,9 +1,6 @@
#!/bin/bash
# Run Prowler against All AWS Accounts in an AWS Organization
# Activate Poetry Environment
eval "$(poetry env activate)"
# Show Prowler Version
prowler -v

View File

@@ -16,6 +16,7 @@ spec:
containers:
- name: prowler
image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
command: ["prowler"]
args: ["kubernetes", "-z", "-b"]
imagePullPolicy: {{ .Values.image.pullPolicy }}
volumeMounts:

View File

@@ -161,7 +161,7 @@ def update_nav_bar(pathname):
html.Span(
[
html.Img(src="assets/favicon.ico", className="w-5"),
"Subscribe to Prowler Cloud",
"Subscribe to prowler SaaS",
],
className="flex items-center gap-x-3 text-white",
),

Binary file not shown (removed image, 68 KiB).

View File

@@ -1361,9 +1361,6 @@ video {
.lg\:grid-cols-4 {
grid-template-columns: repeat(4, minmax(0, 1fr));
}
.lg\:grid-cols-5 {
grid-template-columns: repeat(5, minmax(0, 1fr));
}
.lg\:justify-normal {
justify-content: normal;
@@ -1406,4 +1403,4 @@ video {
.\32xl\:w-\[9\%\] {
width: 9%;
}
}
}

View File

@@ -2228,356 +2228,13 @@ def get_section_containers_ens(data, section_1, section_2, section_3, section_4)
return html.Div(section_containers, className="compliance-data-layout")
def get_section_containers_3_levels(data, section_1, section_2, section_3):
data["STATUS"] = data["STATUS"].apply(map_status_to_icon)
findings_counts_marco = (
data.groupby([section_1, "STATUS"]).size().unstack(fill_value=0)
)
section_containers = []
data[section_1] = data[section_1].astype(str)
data[section_2] = data[section_2].astype(str)
data[section_3] = data[section_3].astype(str)
data.sort_values(
by=section_3,
key=lambda x: x.map(extract_numeric_values),
ascending=True,
inplace=True,
)
for marco in data[section_1].unique():
success_marco = findings_counts_marco.loc[marco].get(pass_emoji, 0)
failed_marco = findings_counts_marco.loc[marco].get(fail_emoji, 0)
fig_name = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_marco],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_marco],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_name.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_marco + failed_marco,
y=0,
xref="x",
yref="y",
text=str(success_marco),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_marco),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_name.add_annotation(
x=failed_marco,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div = html.Div(
dcc.Graph(
figure=fig_name, config={"staticPlot": True}, className="info-bar"
),
className="graph-section",
)
direct_internal_items = []
for categoria in data[data[section_1] == marco][section_2].unique():
specific_data = data[
(data[section_1] == marco) & (data[section_2] == categoria)
]
findings_counts_categoria = (
specific_data.groupby([section_2, "STATUS"])
.size()
.unstack(fill_value=0)
)
success_categoria = findings_counts_categoria.loc[categoria].get(
pass_emoji, 0
)
failed_categoria = findings_counts_categoria.loc[categoria].get(
fail_emoji, 0
)
fig_section = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_categoria],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_categoria],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_section.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_categoria + failed_categoria,
y=0,
xref="x",
yref="y",
text=str(success_categoria),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_categoria),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_section.add_annotation(
x=failed_categoria,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div_section = html.Div(
dcc.Graph(
figure=fig_section,
config={"staticPlot": True},
className="info-bar-child",
),
className="graph-section-req",
)
direct_internal_items_idgrupocontrol = []
for idgrupocontrol in specific_data[section_3].unique():
specific_data2 = specific_data[
specific_data[section_3] == idgrupocontrol
]
findings_counts_idgrupocontrol = (
specific_data2.groupby([section_3, "STATUS"])
.size()
.unstack(fill_value=0)
)
success_idgrupocontrol = findings_counts_idgrupocontrol.loc[
idgrupocontrol
].get(pass_emoji, 0)
failed_idgrupocontrol = findings_counts_idgrupocontrol.loc[
idgrupocontrol
].get(fail_emoji, 0)
fig_idgrupocontrol = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_idgrupocontrol],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_idgrupocontrol],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_idgrupocontrol.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_idgrupocontrol + failed_idgrupocontrol,
y=0,
xref="x",
yref="y",
text=str(success_idgrupocontrol),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_idgrupocontrol),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_idgrupocontrol.add_annotation(
x=failed_idgrupocontrol,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div_idgrupocontrol = html.Div(
dcc.Graph(
figure=fig_idgrupocontrol,
config={"staticPlot": True},
className="info-bar-child",
),
className="graph-section-req",
)
data_table = dash_table.DataTable(
data=specific_data2.to_dict("records"),
columns=[
{"name": i, "id": i}
for i in [
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
],
style_table={"overflowX": "auto"},
style_as_list_view=True,
style_cell={"textAlign": "left", "padding": "5px"},
)
internal_accordion_item_2 = dbc.AccordionItem(
title=idgrupocontrol,
children=[
graph_div_idgrupocontrol,
html.Div([data_table], className="inner-accordion-content"),
],
)
direct_internal_items_idgrupocontrol.append(
html.Div(
[
graph_div_idgrupocontrol,
dbc.Accordion(
[internal_accordion_item_2],
start_collapsed=True,
flush=True,
),
],
className="accordion-inner--child",
)
)
internal_accordion_item = dbc.AccordionItem(
title=categoria,
children=direct_internal_items_idgrupocontrol,
)
internal_section_container = html.Div(
[
graph_div_section,
dbc.Accordion(
[internal_accordion_item], start_collapsed=True, flush=True
),
],
className="accordion-inner--child",
)
direct_internal_items.append(internal_section_container)
accordion_item = dbc.AccordionItem(title=marco, children=direct_internal_items)
section_container = html.Div(
[
graph_div,
dbc.Accordion([accordion_item], start_collapsed=True, flush=True),
],
className="accordion-inner",
)
section_containers.append(section_container)
return html.Div(section_containers, className="compliance-data-layout")
# This function extracts up to three numeric values, ensuring correct sorting for version-like strings.
def extract_numeric_values(value):
    numbers = re.findall(r"\d+", str(value))
    if len(numbers) == 3:
        return int(numbers[0]), int(numbers[1]), int(numbers[2])
    if len(numbers) == 2:
        return int(numbers[0]), int(numbers[1]), 0
    if len(numbers) == 1:
        return int(numbers[0]), 0, 0
    return 0, 0, 0

View File

@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_format2
warnings.filterwarnings("ignore")
@@ -10,7 +10,6 @@ def get_table(data):
[
"REQUIREMENTS_ATTRIBUTES_NAME",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"CHECKID",
"STATUS",
"REGION",
@@ -18,10 +17,6 @@ def get_table(data):
"RESOURCEID",
]
]
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"REQUIREMENTS_ATTRIBUTES_NAME",
return get_section_containers_format2(
aux, "REQUIREMENTS_ATTRIBUTES_NAME", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_format2
warnings.filterwarnings("ignore")
@@ -10,7 +10,6 @@ def get_table(data):
[
"REQUIREMENTS_ATTRIBUTES_NAME",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"CHECKID",
"STATUS",
"REGION",
@@ -19,9 +18,6 @@ def get_table(data):
]
]
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"REQUIREMENTS_ATTRIBUTES_NAME",
return get_section_containers_format2(
aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ATTRIBUTES_NAME"
)

View File

@@ -1,24 +0,0 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,24 +0,0 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_kisa_ismsp
warnings.filterwarnings("ignore")
@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
@@ -20,9 +20,6 @@ def get_table(data):
]
].copy()
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_3_levels
from dashboard.common_methods import get_section_containers_kisa_ismsp
warnings.filterwarnings("ignore")
@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
@@ -20,9 +20,6 @@ def get_table(data):
]
].copy()
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,24 +0,0 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,24 +0,0 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,24 +0,0 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)

View File

@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
from dashboard.common_methods import get_section_containers_format3
warnings.filterwarnings("ignore")
@@ -19,6 +19,6 @@ def get_table(data):
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
return get_section_containers_format3(
aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
)

View File

@@ -57,9 +57,8 @@ def create_layout_overview(
html.Div(className="flex", id="azure_card", n_clicks=0),
html.Div(className="flex", id="gcp_card", n_clicks=0),
html.Div(className="flex", id="k8s_card", n_clicks=0),
html.Div(className="flex", id="m365_card", n_clicks=0),
],
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-5",
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-4",
),
html.H4(
"Count of Findings by severity",
@@ -132,7 +131,7 @@ def create_layout_compliance(
html.A(
[
html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
html.Span("Subscribe to Prowler Cloud"),
html.Span("Subscribe to prowler SaaS"),
],
href="https://prowler.pro/",
target="_blank",

View File

@@ -76,8 +76,6 @@ def load_csv_files(csv_files):
result = result.replace("_AZURE", " - AZURE")
if "KUBERNETES" in result:
result = result.replace("_KUBERNETES", " - KUBERNETES")
if "M65" in result:
result = result.replace("_M65", " - M65")
results.append(result)
unique_results = set(results)
@@ -269,15 +267,6 @@ def display_data(
data["REQUIREMENTS_ATTRIBUTES_PROFILE"] = data[
"REQUIREMENTS_ATTRIBUTES_PROFILE"
].apply(lambda x: x.split(" - ")[0])
# Add the column ACCOUNTID to the data if the provider is m365
if "m365" in analytics_input:
data.rename(columns={"TENANTID": "ACCOUNTID"}, inplace=True)
data.rename(columns={"LOCATION": "REGION"}, inplace=True)
if "REQUIREMENTS_ATTRIBUTES_PROFILE" in data.columns:
data["REQUIREMENTS_ATTRIBUTES_PROFILE"] = data[
"REQUIREMENTS_ATTRIBUTES_PROFILE"
].apply(lambda x: x.split(" - ")[0])
# Filter the chosen level of the CIS
if is_level_1:
data = data[data["REQUIREMENTS_ATTRIBUTES_PROFILE"] == "Level 1"]
@@ -408,13 +397,7 @@ def display_data(
compliance_module = importlib.import_module(
f"dashboard.compliance.{current}"
)
data = data.drop_duplicates(
subset=["CHECKID", "STATUS", "MUTED", "RESOURCEID", "STATUSEXTENDED"]
)
if "threatscore" in analytics_input:
data = get_threatscore_mean_by_pillar(data)
data.drop_duplicates(keep="first", inplace=True)
table = compliance_module.get_table(data)
except ModuleNotFoundError:
table = html.Div(
@@ -433,9 +416,6 @@ def display_data(
)
df = data.copy()
# Remove Muted rows
if "MUTED" in df.columns:
df = df[df["MUTED"] == "False"]
df = df.groupby(["STATUS"]).size().reset_index(name="counts")
df = df.sort_values(by=["counts"], ascending=False)
@@ -450,9 +430,6 @@ def display_data(
if "pci" in analytics_input:
pie_2 = get_bar_graph(df, "REQUIREMENTS_ID")
current_filter = "req_id"
elif "threatscore" in analytics_input:
pie_2 = get_table_prowler_threatscore(df)
current_filter = "threatscore"
elif (
"REQUIREMENTS_ATTRIBUTES_SECTION" in df.columns
and not df["REQUIREMENTS_ATTRIBUTES_SECTION"].isnull().values.any()
@@ -511,13 +488,6 @@ def display_data(
pie_2, f"Top 5 failed {current_filter} by requirements"
)
if "threatscore" in analytics_input:
security_level_graph = get_graph(
pie_2,
"Pillar Score by requirements (1 = Lowest Risk, 5 = Highest Risk)",
margin_top=0,
)
return (
table_output,
overall_status_result_graph,
@@ -531,7 +501,7 @@ def display_data(
)
def get_graph(pie, title, margin_top=7):
def get_graph(pie, title):
return [
html.Span(
title,
@@ -544,7 +514,7 @@ def get_graph(pie, title, margin_top=7):
"display": "flex",
"justify-content": "center",
"align-items": "center",
"margin-top": f"{margin_top}%",
"margin-top": "7%",
},
),
]
@@ -648,87 +618,3 @@ def get_table(current_compliance, table):
className="relative flex flex-col bg-white shadow-provider rounded-xl px-4 py-3 flex-wrap w-full",
),
]
def get_threatscore_mean_by_pillar(df):
modified_df = df[df["STATUS"] == "FAIL"]
modified_df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"] = pd.to_numeric(
modified_df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"], errors="coerce"
)
pillar_means = (
modified_df.groupby("REQUIREMENTS_ATTRIBUTES_SECTION")[
"REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"
]
.mean()
.round(2)
)
output = []
for pillar, mean in pillar_means.items():
output.append(f"{pillar} - [{mean}]")
for value in output:
if value.split(" - ")[0] in df["REQUIREMENTS_ATTRIBUTES_SECTION"].values:
df.loc[
df["REQUIREMENTS_ATTRIBUTES_SECTION"] == value.split(" - ")[0],
"REQUIREMENTS_ATTRIBUTES_SECTION",
] = value
return df
def get_table_prowler_threatscore(df):
df = df[df["STATUS"] == "FAIL"]
# Delete " - " from the column REQUIREMENTS_ATTRIBUTES_SECTION
df["REQUIREMENTS_ATTRIBUTES_SECTION"] = (
df["REQUIREMENTS_ATTRIBUTES_SECTION"].str.split(" - ").str[0]
)
df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"] = pd.to_numeric(
df["REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"], errors="coerce"
)
score_df = (
df.groupby("REQUIREMENTS_ATTRIBUTES_SECTION")[
"REQUIREMENTS_ATTRIBUTES_LEVELOFRISK"
]
.mean()
.reset_index()
.rename(
columns={
"REQUIREMENTS_ATTRIBUTES_SECTION": "Pillar",
"REQUIREMENTS_ATTRIBUTES_LEVELOFRISK": "Score",
}
)
)
fig = px.bar(
score_df,
x="Pillar",
y="Score",
color="Score",
color_continuous_scale=[
"#45cc6e",
"#f4d44d",
"#e77676",
], # green → yellow → red
hover_data={"Score": True, "Pillar": True},
labels={"Score": "Average Risk Score", "Pillar": "Section"},
height=400,
)
fig.update_layout(
xaxis_title="Pillar",
yaxis_title="Level of Risk",
margin=dict(l=20, r=20, t=30, b=20),
plot_bgcolor="rgba(0,0,0,0)",
paper_bgcolor="rgba(0,0,0,0)",
coloraxis_colorbar=dict(title="Risk"),
)
return dcc.Graph(
figure=fig,
style={"height": "25rem", "width": "40rem"},
)

View File

@@ -74,9 +74,6 @@ gcp_provider_logo = html.Img(
ks8_provider_logo = html.Img(
src="assets/images/providers/k8s_provider.png", alt="k8s provider"
)
m365_provider_logo = html.Img(
src="assets/images/providers/m365_provider.png", alt="m365 provider"
)
def load_csv_files(csv_files):
@@ -226,8 +223,6 @@ else:
accounts.append(account + " - AZURE")
if "gcp" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
accounts.append(account + " - GCP")
if "m365" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
accounts.append(account + " - M365")
if "ACCOUNT_UID" in data.columns:
for account in data["ACCOUNT_UID"].unique():
@@ -278,8 +273,6 @@ else:
services.append(service + " - AZURE")
if "gcp" in list(data[data["SERVICE_NAME"] == service]["PROVIDER"]):
services.append(service + " - GCP")
if "m365" in list(data[data["SERVICE_NAME"] == service]["PROVIDER"]):
services.append(service + " - M365")
services = ["All"] + services
services = [
@@ -492,7 +485,6 @@ else:
Output("azure_card", "children"),
Output("gcp_card", "children"),
Output("k8s_card", "children"),
Output("m365_card", "children"),
Output("subscribe_card", "children"),
Output("info-file-over", "title"),
Output("severity-filter", "value"),
@@ -507,7 +499,6 @@ else:
Output("azure_card", "n_clicks"),
Output("gcp_card", "n_clicks"),
Output("k8s_card", "n_clicks"),
Output("m365_card", "n_clicks"),
],
Input("cloud-account-filter", "value"),
Input("region-filter", "value"),
@@ -522,7 +513,6 @@ else:
Input("azure_card", "n_clicks"),
Input("gcp_card", "n_clicks"),
Input("k8s_card", "n_clicks"),
Input("m365_card", "n_clicks"),
Input("sort_button_check_name", "n_clicks"),
Input("sort_button_severity", "n_clicks"),
Input("sort_button_status", "n_clicks"),
@@ -544,7 +534,6 @@ def filter_data(
azure_clicks,
gcp_clicks,
k8s_clicks,
m365_clicks,
sort_button_check_name,
sort_button_severity,
sort_button_status,
@@ -565,7 +554,6 @@ def filter_data(
azure_clicks = 0
gcp_clicks = 0
k8s_clicks = 0
m365_clicks = 0
if azure_clicks > 0:
filtered_data = data.copy()
if azure_clicks % 2 != 0 and "azure" in list(data["PROVIDER"]):
@@ -573,7 +561,6 @@ def filter_data(
aws_clicks = 0
gcp_clicks = 0
k8s_clicks = 0
m365_clicks = 0
if gcp_clicks > 0:
filtered_data = data.copy()
if gcp_clicks % 2 != 0 and "gcp" in list(data["PROVIDER"]):
@@ -581,7 +568,6 @@ def filter_data(
aws_clicks = 0
azure_clicks = 0
k8s_clicks = 0
m365_clicks = 0
if k8s_clicks > 0:
filtered_data = data.copy()
if k8s_clicks % 2 != 0 and "kubernetes" in list(data["PROVIDER"]):
@@ -589,15 +575,6 @@ def filter_data(
aws_clicks = 0
azure_clicks = 0
gcp_clicks = 0
m365_clicks = 0
if m365_clicks > 0:
filtered_data = data.copy()
if m365_clicks % 2 != 0 and "m365" in list(data["PROVIDER"]):
filtered_data = filtered_data[filtered_data["PROVIDER"] == "m365"]
aws_clicks = 0
azure_clicks = 0
gcp_clicks = 0
k8s_clicks = 0
# For all the data, we will add to the status column the value 'MUTED (FAIL)' and 'MUTED (PASS)' depending on the value of the column 'STATUS' and 'MUTED'
if "MUTED" in filtered_data.columns:
@@ -698,8 +675,6 @@ def filter_data(
all_account_names.append(account)
if "gcp" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
all_account_names.append(account)
if "m365" in list(data[data["ACCOUNT_NAME"] == account]["PROVIDER"]):
all_account_names.append(account)
all_items = all_account_ids + all_account_names + ["All"]
@@ -717,8 +692,6 @@ def filter_data(
cloud_accounts_options.append(item + " - AZURE")
if "gcp" in list(data[data["ACCOUNT_NAME"] == item]["PROVIDER"]):
cloud_accounts_options.append(item + " - GCP")
if "m365" in list(data[data["ACCOUNT_NAME"] == item]["PROVIDER"]):
cloud_accounts_options.append(item + " - M365")
# Filter ACCOUNT
if cloud_account_values == ["All"]:
@@ -817,7 +790,6 @@ def filter_data(
service_filter_options = ["All"]
all_items = filtered_data["SERVICE_NAME"].unique()
for item in all_items:
if item not in service_filter_options and item.__class__.__name__ == "str":
if "aws" in list(
@@ -836,10 +808,6 @@ def filter_data(
filtered_data[filtered_data["SERVICE_NAME"] == item]["PROVIDER"]
):
service_filter_options.append(item + " - GCP")
if "m365" in list(
filtered_data[filtered_data["SERVICE_NAME"] == item]["PROVIDER"]
):
service_filter_options.append(item + " - M365")
# Filter Service
if service_values == ["All"]:
@@ -1267,10 +1235,6 @@ def filter_data(
filtered_data.loc[
filtered_data["ACCOUNT_UID"] == account, "ACCOUNT_UID"
] = (account + " - GCP")
if "m365" in list(data[data["ACCOUNT_UID"] == account]["PROVIDER"]):
filtered_data.loc[
filtered_data["ACCOUNT_UID"] == account, "ACCOUNT_UID"
] = (account + " - M365")
table_collapsible = []
for item in filtered_data.to_dict("records"):
@@ -1338,17 +1302,14 @@ def filter_data(
k8s_card = create_provider_card(
"kubernetes", ks8_provider_logo, "Clusters", full_filtered_data
)
m365_card = create_provider_card(
"m365", m365_provider_logo, "Accounts", full_filtered_data
)
# Subscribe to Prowler Cloud card
# Subscribe to prowler SaaS card
subscribe_card = [
html.Div(
html.A(
[
html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
html.Span("Subscribe to Prowler Cloud"),
html.Span("Subscribe to prowler SaaS"),
],
href="https://prowler.pro/",
target="_blank",
@@ -1385,7 +1346,6 @@ def filter_data(
azure_card,
gcp_card,
k8s_card,
m365_card,
subscribe_card,
list_files,
severity_values,
@@ -1400,7 +1360,6 @@ def filter_data(
azure_clicks,
gcp_clicks,
k8s_clicks,
m365_clicks,
)
else:
return (
@@ -1418,7 +1377,6 @@ def filter_data(
azure_card,
gcp_card,
k8s_card,
m365_card,
subscribe_card,
list_files,
severity_values,
@@ -1433,7 +1391,6 @@ def filter_data(
azure_clicks,
gcp_clicks,
k8s_clicks,
m365_clicks,
)

View File

@@ -130,12 +130,10 @@ Prowler for M365 currently supports the following authentication types:
???+ warning
For Prowler App only the Service Principal with User Credentials authentication method is supported.
For Prowler App only the Service Principal with an application authentication method is supported.
### Service Principal authentication
Authentication flag: `--sp-env-auth`
To allow Prowler to assume the service principal identity and start the scan, you need to configure the following environment variables:
```console
@@ -145,14 +143,9 @@ export AZURE_TENANT_ID="XXXXXXXXX"
```
If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution will fail.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/microsoft365/getting-started-m365.md#create-the-service-principal-app) section to create a service principal.
With these credentials you will only be able to run the checks that work through MS Graph, which means you won't run the full provider. If you want to scan all the M365 checks, you will need to use the recommended authentication method.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md) section to create a service principal.
### Service Principal and User Credentials authentication (recommended)
Authentication flag: `--env-auth`
This authentication method follows the same approach as the service principal method but introduces two additional environment variables for user credentials: `M365_USER` and `M365_ENCRYPTED_PASSWORD`.
```console
@@ -163,322 +156,24 @@ export M365_USER="your_email@example.com"
export M365_ENCRYPTED_PASSWORD="6500780061006d0070006c006500700061007300730077006f0072006400" # replace this with yours
```
These two new environment variables are **required** to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to Microsoft PowerShell modules.
These two new environment variables are required to execute the PowerShell modules needed to retrieve information from M365 services. Prowler will use service principal authentication to log into MS Graph and user credentials to authenticate to Microsoft PowerShell modules.
- `M365_USER` should be your Microsoft account email using the default domain. This means it must look like `example@YourCompany.onmicrosoft.com`.
The `M365_USER` should be your Microsoft account email, and `M365_ENCRYPTED_PASSWORD` must be an encrypted SecureString.
To convert your password into a valid encrypted string, run the following commands in PowerShell:
To ensure that you are using the default domain, you can see how to verify it [here](../tutorials/microsoft365/getting-started-m365.md#step-1-obtain-your-domain).
```console
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString
```
If you don't have a user created with that domain, Prowler will not work, as it will not be able to ensure that both the app and the user belong to the same tenant. To proceed, you can either create a new user with that domain or modify the domain of an existing user.
![User Domains](../tutorials/microsoft365/img/user-domains.png)
- `M365_ENCRYPTED_PASSWORD` must be an encrypted SecureString. To convert your password into a valid encrypted string, you need to use PowerShell.
???+ warning
Passwords encrypted using ConvertTo-SecureString can only be decrypted in the same OS/user context. If you generate an encrypted password on macOS or Linux (both UNIX), it will fail on Windows and vice versa. Since Prowler Cloud runs on UNIX, a password generated on Windows won't work there, so you'll need to generate a new one using any UNIX distro (example above)
If you are working from Windows and will use your encrypted password on a different system (for example, executing Prowler on macOS or adding your password to Prowler Cloud), you will need to generate a "UNIX-compatible" version of your encrypted password. This can be done using WSL, which is easy to install on Windows.
=== "UNIX"
Open a PowerShell cmd with a [supported version](requirements.md#supported-powershell-versions) and then run the following command:
```console
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString
Write-Output $encryptedPassword
6500780061006d0070006c006500700061007300730077006f0072006400
```
If everything is done correctly, you will see the encrypted string that you need to set as the `M365_ENCRYPTED_PASSWORD` environment variable.
=== "Windows"
Install WSL and PowerShell on it to generate that password (you can use a different distro, but this one is known to work):
```console
wsl --install -d Ubuntu-22.04
```
Then, open the Ubuntu terminal and run the following commands:
```console
sudo apt update && sudo apt install -y wget apt-transport-https software-properties-common
wget -q "https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/packages-microsoft-prod.deb"
sudo dpkg -i packages-microsoft-prod.deb
sudo apt update
sudo apt install -y powershell
pwsh
```
Once this is done, a prompt running the latest version of PowerShell will open, where you can generate your encrypted password:
```console
$securePassword = ConvertTo-SecureString "examplepassword" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString
Write-Output $encryptedPassword
6500780061006d0070006c006500700061007300730077006f0072006400
```
If everything is done correctly, you will see the encrypted string that you need to set as the `M365_ENCRYPTED_PASSWORD` environment variable.
### Interactive Browser authentication
Authentication flag: `--browser-auth`
This authentication method requires the user to authenticate against Azure using the default browser to start the scan; the `--tenant-id` flag is also required.
With these credentials you will only be able to run the checks that work through MS Graph, which means you won't run the full provider. If you want to scan all the M365 checks, you will need to use the recommended authentication method.
Since this is a delegated permission authentication method, necessary permissions should be given to the user, not the app.
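For example, to start a browser-authenticated scan (the tenant ID below is a placeholder):

```console
prowler m365 --browser-auth --tenant-id "XXXXXXXX"
```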
### Needed permissions
Prowler for M365 requires two types of permission scopes to be set (if you want to run the full provider including PowerShell checks). Both must be configured using Microsoft Entra ID:
- **Service Principal Application Permissions**: These are set at the **application** level and are used to retrieve data from the identity being assessed:
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
- `Sites.Read.All`: Required for SharePoint service.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
- **Powershell Modules Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
- `Global Reader` (recommended): this allows you to read all roles needed.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles, but with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a Global Reader (since only read access is needed, Global Reader is recommended).
To learn how to assign those permissions and roles, follow the instructions in the Microsoft Entra ID [permissions](../tutorials/microsoft365/getting-started-m365.md#grant-required-api-permissions) and [roles](../tutorials/microsoft365/getting-started-m365.md#assign-required-roles-to-your-user) sections.
### Supported PowerShell versions
You must have PowerShell installed to run certain M365 checks.
Currently, we support **PowerShell version 7.4 or higher** (7.5 is recommended).
This requirement exists because **PowerShell 5.1** (the version that comes by default on some Windows systems) does not support several cmdlets needed to run the checks properly.
Additionally, earlier [PowerShell Cross-Platform versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are no longer under technical support, which may cause unexpected errors.
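To check which version is installed, you can inspect the built-in `$PSVersionTable` variable from a PowerShell prompt (a quick sanity check, not a Prowler requirement):

```console
$PSVersionTable.PSVersion
```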
???+ note
Installing PowerShell is only needed if you install Prowler from pip or other sources; the SDK and API containers ship with PowerShell installed by default.
Installing PowerShell is different depending on your OS.
- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): you will need to update PowerShell to 7.4+ to be able to run Prowler; otherwise some checks will not show findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, Windows Server 2016 and higher versions.
```console
winget install --id Microsoft.PowerShell --source winget
```
- [MacOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): installing PowerShell on macOS requires [brew](https://brew.sh/); once you have it, just run the command below. Pwsh is only supported on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64:
```console
brew install powershell/tap/powershell
```
Once it's installed run `pwsh` on your terminal to verify it's working.
- Linux: installing PowerShell on Linux depends on the distro you are using:
- [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): The supported versions for installing PowerShell 7.4+ on Ubuntu are Ubuntu 22.04 and Ubuntu 24.04. The recommended way to install it is to download the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget apt-transport-https software-properties-common
# Get the version of Ubuntu
source /etc/os-release
# Download the Microsoft repository keys
wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): The only supported version for installing PowerShell 7.4+ on Alpine is Alpine 3.20. The only way to install it is to download the tar.gz package available on [PowerShell GitHub](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz). Follow these steps:
```console
# Install the requirements
sudo apk add --no-cache \
ca-certificates \
less \
ncurses-terminfo-base \
krb5-libs \
libgcc \
libintl \
libssl3 \
libstdc++ \
tzdata \
userspace-rcu \
zlib \
icu-libs \
curl
apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
lttng-ust \
openssh-client
# Download the powershell '.tar.gz' archive
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz
# Create the target folder where powershell will be placed
sudo mkdir -p /opt/microsoft/powershell/7
# Expand powershell to the target folder
sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7
# Set execute permissions
sudo chmod +x /opt/microsoft/powershell/7/pwsh
# Create the symbolic link that points to pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
# Start PowerShell
pwsh
```
- [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): The supported versions for installing PowerShell 7.4+ on Debian are Debian 11 and Debian 12. The recommended way to install it is to download the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget
# Get the version of Debian
source /etc/os-release
# Download the Microsoft repository GPG keys
wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository GPG keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): The supported versions for installing PowerShell 7.4+ on Red Hat are RHEL 8 and RHEL 9. The recommended way to install it is to download the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Get version of RHEL
source /etc/os-release
if [ ${VERSION_ID%.*} -lt 8 ]
then majorver=7
elif [ ${VERSION_ID%.*} -lt 9 ]
then majorver=8
else majorver=9
fi
# Download the Microsoft RedHat repository package
curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm
# Register the Microsoft RedHat repository
sudo rpm -i packages-microsoft-prod.rpm
# Delete the downloaded package after installing
rm packages-microsoft-prod.rpm
# Update package index files
sudo dnf update
# Install PowerShell
sudo dnf install powershell -y
```
- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): The following command downloads the latest stable version of PowerShell:
```console
docker pull mcr.microsoft.com/dotnet/sdk:9.0
```
To start an interactive pwsh shell, you just need to run:
```console
docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
```
### Needed PowerShell modules
To obtain the required data for this provider, we use several PowerShell cmdlets.
These cmdlets come from different modules that must be installed.
The installation of these modules is performed automatically if you run Prowler with the `--init-modules` flag. This is an example of running Prowler and installing the modules:
```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```
If you already have them installed, there is no problem even if you use the flag, because it automatically checks whether the needed modules are already installed.
???+ note
Prowler installs the modules using `-Scope CurrentUser`.
If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
The command needed to install a module manually is:
```powershell
Install-Module -Name "ModuleName" -Scope AllUsers -Force
```
The required modules are:
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0): Minimum version 3.6.0. Required for several checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0): Minimum version 6.6.0. Required for all Teams checks.
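If you want to confirm manually that both modules are available, one option (not required by Prowler, since `--init-modules` performs its own check) is:

```powershell
Get-Module -ListAvailable -Name ExchangeOnlineManagement, MicrosoftTeams
```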
## GitHub
### Authentication
Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:
- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**
This flexibility allows you to scan and analyze your GitHub account, including repositories, organizations, and applications, using the method that best suits your use case.
The provided credentials must have the appropriate permissions to perform all the required checks.
???+ note
GitHub App Credentials support fewer checks than other authentication methods.
To use `--browser-auth`, the user needs to authenticate against Azure using the default browser to start the scan; the `--tenant-id` flag is also required.

Binary file not shown (before: 65 KiB)

Binary file not shown (before: 9.0 KiB)

Binary file not shown (before: 313 KiB, after: 746 KiB)

Binary file not shown (before: 39 KiB, after: 153 KiB)

Binary file not shown (before: 347 KiB)

Binary file not shown (before: 54 KiB, after: 27 KiB)

Binary file not shown (before: 16 KiB)

View File

@@ -53,7 +53,7 @@ Prowler App can be installed in different ways, depending on your environment:
You can change the environment variables in the `.env` file. Note that it is not recommended to use the default values in production environments.
???+ note
There is a development mode available; you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose-dev.yml to run the app in development mode.
There is a development mode available; you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose.dev.yml to run the app in development mode.
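As a minimal sketch, assuming Docker Compose v2 and the dev compose file name that matches your checkout:

```console
docker compose -f docker-compose.dev.yml up -d
```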
???+ warning
Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
@@ -565,16 +565,11 @@ kubectl logs prowler-XXXXX --namespace prowler-ns
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the flag `--context` to specify the context to be scanned and `--namespaces` to specify the namespaces to be scanned.
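For example, with hypothetical context and namespace names:

```console
prowler kubernetes --context my-cluster --namespaces kube-system default
```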
#### Microsoft 365
With M365 you need to specify which auth method is going to be used:
```console
# To use both service principal (for MSGraph) and user credentials (for PowerShell modules)
prowler m365 --env-auth
# To use service principal authentication
prowler m365 --sp-env-auth
@@ -586,31 +581,7 @@ prowler m365 --browser-auth --tenant-id "XXXXXXXX"
```
See more details about M365 Authentication in [Requirements](getting-started/requirements.md#microsoft-365)
#### GitHub
Prowler allows you to scan your GitHub account, including your repositories, organizations or applications.
There are several supported login methods:
```console
# Personal Access Token (PAT):
prowler github --personal-access-token pat
# OAuth App Token:
prowler github --oauth-app-token oauth_token
# GitHub App Credentials:
prowler github --github-app-id app_id --github-app-key app_key
```
???+ note
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence:
1. `GITHUB_PERSONAL_ACCESS_TOKEN`
2. `OAUTH_APP_TOKEN`
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`
See more details about M365 Authentication in [Requirements](getting-started/requirements/#microsoft-365)
## Prowler v2 Documentation
For **Prowler v2 Documentation**, please check it out [here](https://github.com/prowler-cloud/prowler/blob/8818f47333a0c1c1a457453c87af0ea5b89a385f/README.md).

View File

@@ -1,14 +1,14 @@
# Getting Started with AWS on Prowler Cloud/App
# Getting Started with AWS on Prowler Cloud
<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/RPgIWOCERzY" title="Prowler Cloud Onboarding AWS" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
Set up your AWS account to enable security scanning using Prowler Cloud/App.
Set up your AWS account to enable security scanning using Prowler Cloud.
## Requirements
To configure your AWS account, you'll need:
1. Access to Prowler Cloud/App
1. Access to Prowler Cloud
2. Properly configured AWS credentials (either static or via an assumed IAM role)
---
@@ -22,9 +22,9 @@ To configure your AWS account, you'll need:
---
## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud
1. Navigate to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Navigate to [Prowler Cloud](https://cloud.prowler.com/)
2. Go to `Configuration` > `Cloud Providers`
![Cloud Providers Page](../img/cloud-providers-page.png)
@@ -117,7 +117,7 @@ This method grants permanent access and is the recommended setup for production
terraform apply
```
2. During `plan` and `apply`, you will be prompted for the **External ID**, which is available in the Prowler Cloud/App UI:
2. During `plan` and `apply`, you will be prompted for the **External ID**, which is available in the Prowler Cloud UI:
![Get External ID](./img/get-external-id-prowler-cloud.png)
@@ -135,7 +135,7 @@ This method grants permanent access and is the recommended setup for production
![New Role Info](./img/get-role-arn.png)
10. Paste the ARN into the corresponding field in Prowler Cloud/App
10. Paste the ARN into the corresponding field in Prowler Cloud
![Input the Role ARN](./img/paste-role-arn-prowler.png)
@@ -171,7 +171,7 @@ You can also configure your AWS account using static credentials (not recommende
![CloudShell Output](./img/cloudshell-output.png)
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud/App setup screen.
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.
=== "Short term credentials (Recommended)"
@@ -203,9 +203,9 @@ You can also configure your AWS account using static credentials (not recommende
}
```
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud/App setup screen.
> ⚠️ Save these credentials securely and paste them into the Prowler Cloud setup screen.
Complete the form in Prowler Cloud/App and click `Next`
Complete the form in Prowler Cloud and click `Next`
![Filled credentials page](./img/prowler-cloud-credentials-next.png)

View File

@@ -1,15 +1,15 @@
# Getting Started with Azure on Prowler Cloud/App
# Getting Started with Azure on Prowler Cloud
<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/v1as8vTFlMg" title="Prowler Cloud Onboarding Azure" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
Set up your Azure subscription to enable security scanning using Prowler Cloud/App.
Set up your Azure subscription to enable security scanning using Prowler Cloud.
## Requirements
To configure your Azure subscription, you'll need:
1. Get the `Subscription ID`
2. Access to Prowler Cloud/App
2. Access to Prowler Cloud
3. Configure authentication in Azure:
3.1 Create a Service Principal
@@ -18,7 +18,7 @@ To configure your Azure subscription, you'll need:
3.3 Assign permissions at the subscription level
4. Add the credentials to Prowler Cloud/App
4. Add the credentials to Prowler Cloud
---
@@ -32,9 +32,9 @@ To configure your Azure subscription, you'll need:
---
## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud
1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Go to [Prowler Cloud](https://cloud.prowler.com/)
2. Navigate to `Configuration` > `Cloud Providers`
![Cloud Providers Page](../img/cloud-providers-page.png)
@@ -148,13 +148,13 @@ Assign the following Microsoft Graph permissions:
---
## Step 4: Add Credentials to Prowler Cloud/App
## Step 4: Add Credentials to Prowler Cloud
1. Go to your App Registration overview and copy the `Client ID` and `Tenant ID`
![App Overview](./img/app-overview.png)
2. Go to Prowler Cloud/App and paste:
2. Go to Prowler Cloud and paste:
- `Client ID`
- `Tenant ID`

View File

@@ -23,7 +23,55 @@ In order to see which compliance frameworks are covered by Prowler, you can use op
prowler <provider> --list-compliance
```
The full and updated list of supported compliance frameworks for each provider is available at [Prowler Hub](https://hub.prowler.com/compliance).
### AWS
- `aws_account_security_onboarding_aws`
- `aws_audit_manager_control_tower_guardrails_aws`
- `aws_foundational_security_best_practices_aws`
- `aws_foundational_technical_review_aws`
- `aws_well_architected_framework_reliability_pillar_aws`
- `aws_well_architected_framework_security_pillar_aws`
- `cis_1.4_aws`
- `cis_1.5_aws`
- `cis_2.0_aws`
- `cis_3.0_aws`
- `cisa_aws`
- `ens_rd2022_aws`
- `fedramp_low_revision_4_aws`
- `fedramp_moderate_revision_4_aws`
- `ffiec_aws`
- `gdpr_aws`
- `gxp_21_cfr_part_11_aws`
- `gxp_eu_annex_11_aws`
- `hipaa_aws`
- `iso27001_2013_aws`
- `kisa_isms_p_2023_aws`
- `kisa_isms_p_2023_korean_aws`
- `mitre_attack_aws`
- `nist_800_171_revision_2_aws`
- `nist_800_53_revision_4_aws`
- `nist_800_53_revision_5_aws`
- `nist_csf_1.1_aws`
- `pci_3.2.1_aws`
- `rbi_cyber_security_framework_aws`
- `soc2_aws`
### Azure
- `cis_2.0_azure`
- `cis_2.1_azure`
- `ens_rd2022_azure`
- `mitre_attack_azure`
### GCP
- `cis_2.0_gcp`
- `ens_rd2022_gcp`
- `mitre_attack_gcp`
### Kubernetes
- `cis_1.8_kubernetes`
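To run a scan against one of these frameworks, pass its name to the `--compliance` flag; for example, with an entry from the AWS list above:

```console
prowler aws --compliance cis_3.0_aws
```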
## List Requirements of Compliance Frameworks
For each compliance framework, you can use option `--list-compliance-requirements` to list its requirements:

View File

@@ -106,7 +106,6 @@ The following list includes all the Microsoft 365 checks with configurable varia
|---------------------------------------------------------------|--------------------------------------------------|-----------------|
| `entra_admin_users_sign_in_frequency_enabled` | `sign_in_frequency` | Integer |
| `teams_external_file_sharing_restricted` | `allowed_cloud_storage_services` | List of Strings |
| `exchange_organization_mailtips_enabled` | `recommended_mailtips_large_audience_threshold` | Integer |
## Config YAML File Structure
@@ -521,9 +520,5 @@ m365:
#"allow_google_drive",
#"allow_share_file",
]
# Exchange Organization Settings
# m365.exchange_organization_mailtips_enabled
recommended_mailtips_large_audience_threshold: 25 # maximum number of recipients
```

View File

@@ -1,20 +1,20 @@
# Getting Started with GCP on Prowler Cloud/App
# Getting Started with GCP on Prowler Cloud
<iframe width="560" height="380" src="https://www.youtube-nocookie.com/embed/v1as8vTFlMg" title="Prowler Cloud Onboarding GCP" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
Set up your GCP project to enable security scanning using Prowler Cloud/App.
Set up your GCP project to enable security scanning using Prowler Cloud.
## Requirements
To configure your GCP project, you'll need:
1. Get the `Project ID`
2. Access to Prowler Cloud/App
2. Access to Prowler Cloud
3. Configure authentication in GCP:
3.1 Retrieve credentials from Google Cloud
4. Add the credentials to Prowler Cloud/App
4. Add the credentials to Prowler Cloud
---
@@ -27,9 +27,9 @@ To configure your GCP project, you'll need:
---
## Step 2: Access Prowler Cloud/App
## Step 2: Access Prowler Cloud
1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
1. Go to [Prowler Cloud](https://cloud.prowler.com/)
2. Navigate to `Configuration` > `Cloud Providers`
![Cloud Providers Page](../img/cloud-providers-page.png)
@@ -86,7 +86,7 @@ To configure your GCP project, you'll need:
![Get the FileName](./img/get-temp-file-credentials.png)
8. Extract the following values for Prowler Cloud/App:
8. Extract the following values for Prowler Cloud:
- `client_id`
- `client_secret`
@@ -96,9 +96,9 @@ To configure your GCP project, you'll need:
---
## Step 4: Add Credentials to Prowler Cloud/App
## Step 4: Add Credentials to Prowler Cloud
1. Go back to Prowler Cloud/App and enter the required credentials, then click `Next`
1. Go back to Prowler Cloud and enter the required credentials, then click `Next`
![Enter the Credentials](./img/enter-credentials-prowler-cloud.png)

View File

@@ -1,44 +0,0 @@
# GitHub Authentication
Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:
- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**
This flexibility allows you to scan and analyze your GitHub account, including repositories, organizations, and applications, using the method that best suits your use case.
## Supported Login Methods
Here are the available login methods and their respective flags:
### Personal Access Token (PAT)
Use this method by providing your personal access token directly.
```console
prowler github --personal-access-token pat
```
### OAuth App Token
Authenticate using an OAuth app token.
```console
prowler github --oauth-app-token oauth_token
```
### GitHub App Credentials
Use GitHub App credentials by specifying the App ID and the private key.
```console
prowler github --github-app-id app_id --github-app-key app_key
```
### Automatic Login Method Detection
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence:
1. `GITHUB_PERSONAL_ACCESS_TOKEN`
2. `OAUTH_APP_TOKEN`
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`
???+ note
If you don't plan to specify the login method, ensure the corresponding environment variables are set before running Prowler so automatic detection can work.
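For instance, a minimal sketch using a PAT through the environment (the token value is a placeholder):

```console
export GITHUB_PERSONAL_ACCESS_TOKEN="ghp_xxxxxxxxxxxxxxxx"
prowler github
```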

View File

@@ -20,19 +20,3 @@ kubectl logs prowler-XXXXX --namespace prowler-ns
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.
???+ tip "Identifying the cluster in reports"
When running in in-cluster mode, the Kubernetes API does not expose the actual cluster name by default.
To uniquely identify the cluster in logs and reports, you can:
- Use the `--cluster-name` flag to manually set the cluster name:
```bash
prowler -p kubernetes --cluster-name production-cluster
```
- Or set the `CLUSTER_NAME` environment variable:
```yaml
env:
- name: CLUSTER_NAME
value: production-cluster
```

View File

@@ -1,205 +0,0 @@
# Getting Started with M365 on Prowler Cloud/App
Set up your M365 account to enable security scanning using Prowler Cloud/App.
## Requirements
To configure your M365 account, you'll need:
1. Obtain your `Default Domain` from the Entra ID portal.
2. Access Prowler Cloud/App and add a new cloud provider `Microsoft 365`.
3. Configure your M365 account:
3.1 Create the Service Principal app.
3.2 Grant the required API permissions.
3.3 Assign the required roles to your user.
3.4 Retrieve your encrypted password.
4. Add the credentials to Prowler Cloud/App.
## Step 1: Obtain your Domain
Go to the Entra ID portal, then search for `Domain` or go to Identity > Settings > Domain Names.
![Search Domain Names](./img/search-domain-names.png)
<br>
![Custom Domain Names](./img/custom-domain-names.png)
Once you are there, look for the `Default Domain`; it should be something similar to `YourCompany.onmicrosoft.com`. To ensure you are picking the correct domain, click on it and verify that the type is `Initial` and that it can't be deleted.
![Search Default Domain](./img/search-default-domain.png)
---
## Step 2: Access Prowler Cloud/App
1. Go to [Prowler Cloud](https://cloud.prowler.com/) or launch [Prowler App](../prowler-app.md)
2. Navigate to `Configuration` > `Cloud Providers`
![Cloud Providers Page](../img/cloud-providers-page.png)
3. Click on `Add Cloud Provider`
![Add a Cloud Provider](../img/add-cloud-provider.png)
4. Select `Microsoft 365`
![Select Microsoft 365](./img/select-m365-prowler-cloud.png)
5. Add the Domain ID and an optional alias, then click `Next`
![Add Domain ID](./img/add-domain-id.png)
---
## Step 3: Configure your M365 account
### Create the Service Principal app
A Service Principal is required to grant Prowler the necessary privileges.
1. Access **Microsoft Entra ID**
![Overview of Microsoft Entra ID](./img/microsoft-entra-id.png)
2. Navigate to `Applications` > `App registrations`
![App Registration nav](./img/app-registration-menu.png)
3. Click `+ New registration`, complete the form, and click `Register`
![New Registration](./img/new-registration.png)
4. Go to `Certificates & secrets` > `+ New client secret`
![Certificate & Secrets nav](./img/certificates-and-secrets.png)
5. Fill in the required fields and click `Add`, then copy the generated value (this value is your `AZURE_CLIENT_SECRET`)
![New Client Secret](./img/new-client-secret.png)
With this done, you will have all the required values, summarized in the following table:
| Value | Description |
|-------|-------------|
| Client ID | Application (client) ID |
| Client Secret | The secret value generated above (`AZURE_CLIENT_SECRET`) |
| Tenant ID | Directory (tenant) ID |
---
### Grant required API permissions
Assign the following Microsoft Graph permissions:
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for sign-in.
- `Sites.Read.All`: Required for SharePoint service.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
Follow these steps to assign the permissions:
1. Go to your App Registration, select the Prowler app you created earlier, and click `API permissions`
![API Permission Page](./img/api-permissions-page.png)
2. Click `+ Add a permission` > `Microsoft Graph` > `Application permissions`
![Add API Permission](./img/add-app-api-permission.png)
3. Search for and select each of the permissions below:
- `Directory.Read.All`
- `Policy.Read.All`
- `Sites.Read.All`
- `SharePointTenantSettings.Read.All`
![Permission Screenshots](./img/directory-permission.png)
4. Click `Add permissions`, then grant admin consent
![Grant Admin Consent](./img/grant-admin-consent.png)
5. Click `+ Add a permission` > `Microsoft Graph` > `Delegated permissions`
![Add API Permission](./img/add-delegated-api-permission.png)
6. Search and select:
- `User.Read`
![Permission Screenshots](./img/directory-permission-delegated.png)
7. Click `Add permissions`, then grant admin consent
![Grant Admin Consent](./img/grant-admin-consent-delegated.png)
---
### Assign required roles to your user
Assign one of the following roles to your user:
- `Global Reader` (recommended): grants read access to everything Prowler needs.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles; together these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) give access to the same information as a Global Reader. Since Prowler only reads, `Global Reader` is the recommended role.
Follow these steps to assign the role:
1. Go to Users > All Users, then click on the email of the user you will use
![User Overview](./img/user-info-page.png)
2. Click `Assigned Roles`
![User Roles](./img/user-role-page.png)
3. Click on `Add assignments`, then search and select:
- `Global Reader` (recommended). If you prefer to use the alternative roles instead, search for them here.
![Global Reader Screenshots](./img/global-reader.png)
4. Click `Next`, assign the role as `Active`, then click `Assign`
![Grant Admin Consent for Role](./img/grant-admin-consent-for-role.png)
---
### Get your encrypted password
This step requires PowerShell: you will create an encrypted password based on the password of the user you are going to use. For details on how to generate it, go [here](../../getting-started/requirements.md#service-principal-and-user-credentials-authentication-recommended) and follow the steps to obtain `M365_ENCRYPTED_PASSWORD`.
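As a minimal sketch only (the linked requirements guide is the authoritative procedure), generating such an encrypted string with PowerShell's built-in cmdlets typically looks like this:
```powershell
# Prompt for the user's password without echoing it to the terminal
$securePassword = Read-Host -AsSecureString -Prompt "Enter the M365 user password"
# Serialize the secure string into an encrypted standard string;
# the printed value is what you use as M365_ENCRYPTED_PASSWORD
$encryptedPassword = $securePassword | ConvertFrom-SecureString
Write-Output $encryptedPassword
```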
---
## Step 4: Add credentials to Prowler Cloud/App
1. Go to your App Registration overview and copy the `Client ID` and `Tenant ID`
![App Overview](./img/app-overview.png)
2. Go to Prowler Cloud/App and paste:
- `Client ID`
- `Tenant ID`
- `AZURE_CLIENT_SECRET`: the client secret generated earlier
- `M365_USER`: your user on the default domain; more info [here](../../getting-started/requirements.md#service-principal-and-user-credentials-authentication-recommended)
- `M365_ENCRYPTED_PASSWORD`: the encrypted password generated in the previous step
![Prowler Cloud M365 Credentials](./img/m365-credentials.png)
3. Click `Next`
![Next Detail](./img/click-next-m365.png)
4. Click `Launch Scan`
![Launch Scan M365](./img/launch-scan.png)
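If you later want to supply the same credentials to the Prowler CLI instead of the UI, a common pattern is to export them as environment variables. This is a sketch under assumptions: `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` are conventional Azure names rather than names stated in this guide, and the exact Prowler invocation is covered in the requirements guide linked above.
```console
export AZURE_CLIENT_ID="<client-id>"          # assumed variable name
export AZURE_TENANT_ID="<tenant-id>"          # assumed variable name
export AZURE_CLIENT_SECRET="<client-secret>"
export M365_USER="user@YourCompany.onmicrosoft.com"
export M365_ENCRYPTED_PASSWORD="<encrypted-password>"
```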
