Compare commits

514 Commits

Author | SHA1 | Message | Date
Pablo Lara
d8e575e6dc fix: update redirect URL for SSO (#7493) 2025-04-11 14:42:30 +02:00
Pablo Lara
f7ed5bd365 fix: resolve social login issue in AuthForm on sign-up page (#7490) 2025-04-11 10:02:14 +02:00
dependabot[bot]
1c5348d846 chore(deps): bump tj-actions/changed-files from 46.0.4 to 46.0.5 (#7486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-11 10:02:08 +02:00
Pepe Fagoaga
2bde25a471 fix(api): poetry.lock 2025-04-10 12:32:59 +02:00
Pepe Fagoaga
c1b70ed0fc fix(api): build Prowler from v5 branch 2025-04-09 17:20:47 +02:00
Pepe Fagoaga
575ef41d4f fix(api): 1.6.0 in pyproject 2025-04-09 17:15:30 +02:00
Pepe Fagoaga
8f19bfda0d chore(changelog): Prepare for v5.5.0 (#7484) 2025-04-09 17:07:13 +02:00
Sergio Garcia
fefdce129b fix: handle errors in AWS and Azure (#7482) 2025-04-09 16:35:20 +02:00
Pepe Fagoaga
f85abf90fb fix: conflict in poetry.lock 2025-04-09 16:17:29 +02:00
Pedro Martín
855cc17a23 fix(aws): add default session_duration (#7479) 2025-04-09 16:14:23 +02:00
eeche
d489c80857 feat(NHN): add NHN cloud provider with 6 checks (#6870)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:14:22 +02:00
Prowler Bot
b81e12f697 chore(regions_update): Changes in regions for AWS services (#7478)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Pablo Lara
c02b9073d1 fix: fix TS type for session duration (#7481) 2025-04-09 16:14:22 +02:00
Pedro Martín
2ce4d72111 feat(gcp): add SOC2 compliance framework (#7476) 2025-04-09 16:14:22 +02:00
Drew Kerrigan
eec0b45b32 fix(ui): Remove UTC from timestamps in app (#7474) 2025-04-09 16:14:22 +02:00
Pablo Lara
f4746a1b09 feat: update the NextJS version to the latest (#7473) 2025-04-09 16:14:22 +02:00
Prowler Bot
d4c23efee3 chore(regions_update): Changes in regions for AWS services (#7467)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
dependabot[bot]
6d7711ca1c chore(deps): bump github/codeql-action from 3.28.13 to 3.28.15 (#7463)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Pepe Fagoaga
a7f733522c fix(action): Use poetry > v2 (#7472) 2025-04-09 16:14:22 +02:00
Pablo Lara
e2dfc1f383 feat: add link with the service status using static icon (#7468) 2025-04-09 16:14:22 +02:00
dependabot[bot]
163ae5d79d chore(deps): bump tj-actions/changed-files from 46.0.3 to 46.0.4 (#7443)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Prowler Bot
ff8c33ecf8 chore(regions_update): Changes in regions for AWS services (#7446)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:18 +02:00
dependabot[bot]
cf4e8e940d chore(deps): bump trufflesecurity/trufflehog from 3.88.22 to 3.88.23 (#7444)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Prowler Bot
763e88edb5 chore(regions_update): Changes in regions for AWS services (#7445)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Pablo Lara
c301fbd8b3 refactor: extract common auth headers into reusable helper (#7439) 2025-04-09 16:13:36 +02:00
Sergio Garcia
5121015586 fix(docs): solve broken links (#7432) 2025-04-09 16:13:36 +02:00
Adrián Jesús Peña Rodríguez
b750b6492d feat: add missing SDK fields to API findings and resources (#7318) 2025-04-09 16:13:36 +02:00
Prowler Bot
da656b5086 chore(regions_update): Changes in regions for AWS services (#7434)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
dependabot[bot]
c82437c32c chore(deps): bump trufflesecurity/trufflehog from 3.88.20 to 3.88.22 (#7433)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Pedro Martín
22c4216ed6 docs: add onboarding information step by step for each provider (#7362) 2025-04-09 16:13:36 +02:00
Pablo Lara
07d0f72239 fix: correct fetch variable name from invitations to roles (#7437) 2025-04-09 16:13:36 +02:00
Prowler Bot
6d14b5619c chore(regions_update): Changes in regions for AWS services (#7424)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:12:19 +02:00
Prowler Bot
f12c7000bb chore(regions_update): Changes in regions for AWS services (#7417)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:12:19 +02:00
dependabot[bot]
ec06fd0bf4 chore(deps): bump azure-identity from 1.19.0 to 1.21.0 (#7192)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:12:19 +02:00
Daniel Barranquero
92911d2c0b feat(entra): add new check entra_admin_users_cloud_only (#7286) 2025-04-09 16:12:19 +02:00
dependabot[bot]
985bfc1618 chore(deps): bump azure-mgmt-applicationinsights from 4.0.0 to 4.1.0 (#7161)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:12:03 +02:00
dependabot[bot]
d34c1556a1 chore(deps): bump azure-mgmt-containerregistry from 10.3.0 to 12.0.0 (#7025)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:11:44 +02:00
Pedro Martín
c0d4f43310 docs(python): add annotations about Python version (#7402) 2025-04-09 16:10:47 +02:00
Bogdan A
0b2dac83bd feat(gcp): add check for dormant (unused) SA keys (#7348)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
b3d7bb4e8d feat(entra): add new check entra_legacy_authentication_blocked (#7240) 2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
71f8bcefa8 feat(entra): add new check entra_users_mfa_enabled (#7228) 2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
a11b338994 feat(entra): add new check entra_admin_users_phishing_resistant_mfa_enabled (#7211)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
c7c8fc90b2 fix(entra): check name and logic of entra_admin_users_have_mfa_enabled (#7230) 2025-04-09 16:10:47 +02:00
Daniel Barranquero
2bd32f6e8f feat(entra): add new check entra_policy_guest_invite_only_for_admin_roles (#7241)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
32c8cc723e chore(deps): bump azure-mgmt-resource from 23.2.0 to 23.3.0 (#7054)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:10:47 +02:00
Daniel Barranquero
c298541d8d feat(entra): add new check entra_policy_guest_users_access_restrictions (#7234) 2025-04-09 16:10:47 +02:00
Daniel Barranquero
dcf53ea357 feat(entra): add new check entra_policy_restricts_user_consent_for_apps (#7225) 2025-04-09 16:10:47 +02:00
Víctor Fernández Poyatos
3c9ae06086 feat(findings): Handle muted findings in API and UI (#7378)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
735a8fbb95 feat(entra): add new check entra_managed_device_required_for_mfa_registration (#7203)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
7442eb398c chore(regions_update): Changes in regions for AWS services (#7395)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
509f185890 chore(deps): bump trufflesecurity/trufflehog from 3.88.18 to 3.88.20 (#7394)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
7712968eeb chore(regions_update): Changes in regions for AWS services (#7391)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
fc3ee5534d chore(deps): bump actions/setup-python from 5.4.0 to 5.5.0 (#7390)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
376ec4c73b chore(regions_update): Changes in regions for AWS services (#7382)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Pablo Lara
3db0a0ed6f chore: tweak for button see findings (#7369) 2025-04-09 16:10:46 +02:00
Pablo Lara
c3e5980eb0 chore(scans): properly enable link to findings when scan is completed (#7368) 2025-04-09 16:10:46 +02:00
dependabot[bot]
c830829b55 chore(deps): bump github/codeql-action from 3.28.12 to 3.28.13 (#7367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:46 +02:00
dependabot[bot]
3b21b7ecf0 chore(deps): bump tj-actions/changed-files from 46.0.1 to 46.0.3 (#7363)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:46 +02:00
Víctor Fernández Poyatos
a3810f3fda build(api): Force django-allauth==65.4.1 (#7358) 2025-04-09 16:10:46 +02:00
Pablo Lara
18fc405d15 docs: update readme (#7357) 2025-04-09 16:10:46 +02:00
Pablo Lara
aeb3bdc779 chore(findings): apply default filter to show failed findings (#7356) 2025-04-09 16:10:46 +02:00
Pablo Lara
b32f0997f5 docs(changelog): document addition of download column in scans table … (#7354) 2025-04-09 16:10:46 +02:00
Pablo Lara
a72d8d7c83 feat(scans): add download button column for completed scans in table (#7353) 2025-04-09 16:10:46 +02:00
Víctor Fernández Poyatos
a4f5566589 feat(compliance): Add endpoint to retrieve compliance overviews metadata (#7333) 2025-04-09 16:10:46 +02:00
Pablo Lara
bd839e1398 docs: update changelog with Next.js security patch (#7339) (#7341) 2025-04-09 16:10:46 +02:00
Prowler Bot
7e9b1630f0 chore(regions_update): Changes in regions for AWS services (#7219)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:29 +02:00
Prowler Bot
e2330acbff chore(regions_update): Changes in regions for AWS services (#7246)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:13 +02:00
Prowler Bot
0f15a20128 chore(regions_update): Changes in regions for AWS services (#7250)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:06 +02:00
Prowler Bot
237ec20513 chore(regions_update): Changes in regions for AWS services (#7334)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
497adbb4e3 fix(action): Use Poetry v2 (#7329) 2025-04-09 16:10:01 +02:00
Prowler Bot
1bb939c609 chore(regions_update): Changes in regions for AWS services (#7323)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
8ee73af0c0 chore(aws-regions): remove backport to v3 (#7319) 2025-04-09 16:10:01 +02:00
dependabot[bot]
6ead888399 chore(deps): bump github/codeql-action from 3.28.11 to 3.28.12 (#7321)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
c20f9edb0b chore(dependabot): disable for v3 (#7316) 2025-04-09 16:10:01 +02:00
Pablo Lara
747d62393e docs: add social login images and update documentation (#7314)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
e757bde7ca chore(dependabot): Disable for API and UI (#7300) 2025-04-09 16:10:01 +02:00
Pedro Martín
29ff19eead fix(k8s): remove typos from PCI 4.0 (#7294) 2025-04-09 16:10:01 +02:00
Pepe Fagoaga
c516773cff chore(social-login): improve copy when not enabled (#7295) 2025-04-09 16:10:01 +02:00
dependabot[bot]
e4a682e23e chore(deps): bump trufflesecurity/trufflehog from 3.88.17 to 3.88.18 (#7297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
d427455db5 chore(security): Configure HTTP Security Headers (#7220)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
f0cbcacbe4 chore(security): Add HTTP Security Headers (#7289) 2025-04-09 16:09:55 +02:00
Pablo Lara
df11afbcf0 fix: prevent SSR mismatch in OAuth URL generation (#7288) 2025-04-09 16:07:31 +02:00
dependabot[bot]
cc6db6b680 chore(deps): bump azure-mgmt-containerservice from 34.0.0 to 34.1.0 (#6989)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:07:31 +02:00
Pablo Lara
0eb51dc147 chore(providers): change wording when adding a new provider (#7280) 2025-04-09 16:07:31 +02:00
Pepe Fagoaga
aeb3dc9ac2 fix(aws-regions): Use @prowler-bot as author (#7285) 2025-04-09 16:07:31 +02:00
Pablo Lara
36c8240f2b chore: add env vars for social login (#7257)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-04-09 16:07:31 +02:00
Prowler Bot
a0852c397d chore(regions_update): Changes in regions for AWS services (#7281)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:07:31 +02:00
Pablo Lara
32f1d1a713 feat(providers): add component to render a link to the documentation (#7282) 2025-04-09 16:07:31 +02:00
dependabot[bot]
76be3bd4ae chore(deps): bump azure-mgmt-storage from 21.2.1 to 22.1.1 (#7098)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:07:31 +02:00
Adrián Jesús Peña Rodríguez
93b59e15e8 chore: add api reference to download report section (#7243) 2025-04-09 16:07:30 +02:00
Pablo Lara
f559b87018 chore: Rename keyServer and extract to helper (#7256) 2025-04-09 16:07:30 +02:00
Pedro Martín
84b50f2a3f fix(.env): remove spaces (#7255) 2025-04-09 16:07:30 +02:00
Pedro Martín
69c7dc339f fix(prowler): change from prowler.py to prowler-cli.py (#7253) 2025-04-09 16:07:30 +02:00
Pablo Lara
ea396a90e3 chore: update git ignore file (#7254) 2025-04-09 16:07:30 +02:00
Pedro Martín
d460509262 feat(jira): add basic auth method (#7233) 2025-04-09 16:07:30 +02:00
Pepe Fagoaga
4d2bb93a8c fix(backport): Use container tagged version (#7252) 2025-04-09 16:07:30 +02:00
Pepe Fagoaga
d936c3f934 chore(security): Pin actions to the Full-Length Commit SHA (#7249) 2025-04-09 16:07:30 +02:00
Pablo Lara
c1e36f0df7 chore: add env var for social login (#7251) 2025-04-09 16:07:30 +02:00
Prowler Bot
d1f04fefb0 chore(regions_update): Changes in regions for AWS services (#7237)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:07:27 +02:00
Prowler Bot
8703c55b5a chore(regions_update): Changes in regions for AWS services (#7245)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:06:57 +02:00
Pablo Lara
0af5c9c9c2 chore: improve UX when social login is not enabled (#7242) 2025-04-09 16:06:57 +02:00
Pablo Lara
57717467c7 chore(social-login): disable social login buttons when env vars are not set (#7238) 2025-04-09 16:06:57 +02:00
Pablo Lara
b59930803c chore(social-login): rename env.vars for social login (#7232) 2025-04-09 16:06:57 +02:00
Pablo Lara
835272b3ab chore: social auth is also in sign-up page (#7231) 2025-04-09 16:06:57 +02:00
Pablo Lara
6a67d8d93a chore: remove unused regions (#7229) 2025-04-09 16:06:57 +02:00
Pablo Lara
f87fd35c95 chore: change wording for launching a single scan (#7226) 2025-04-09 16:06:57 +02:00
Pablo Lara
7a00750e0a chore: update changelog (#7223) 2025-04-09 16:06:57 +02:00
Pablo Lara
412697c418 feat(social-login): social login with Google is working (#7218)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-09 16:06:57 +02:00
Víctor Fernández Poyatos
3cd4e4cbbf fix(migrations): add through parameter to integration.providers (#7222) 2025-04-09 16:06:57 +02:00
Pepe Fagoaga
e18cace3bb fix(pyproject): Rename prowler.py (#7217) 2025-04-09 16:06:57 +02:00
Víctor Fernández Poyatos
a1252e7afc feat(integrations): Added new endpoints to allow configuring integrations (#7167) 2025-04-09 16:06:34 +02:00
Daniel Barranquero
7bc9aa2424 feat(entra): add new check entra_admin_mfa_enabled_for_administrative_roles (#7181)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Pedro Martín
c4f87f45d4 feat(kubernetes): add ISO 27001 2022 compliance framework (#7204) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
93f3a094fc feat(entra): add new check entra_identity_protection_sign_in_risk_enabled (#7171)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Andoni Alonso
881133d6b9 refactor(check): add docstrings and improve report handling (#7113) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
ba02a26cc1 feat(docs): add microsoft365 configurable checks (#7200) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
bb1b81455c feat(entra): add new check entra_identity_protection_user_risk_enabled (#7126)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Pepe Fagoaga
3116fef89a chore(poetry): Upgrade to v2 (#7112) 2025-04-09 16:04:33 +02:00
Hugo Pereira Brito
f6dbf3ef5f feat(entra): add new check entra_managed_device_required_for_authentication (#7115)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Daniel Barranquero
35a9c03467 feat(entra): add new check entra_password_hash_sync_enabled (#7061) 2025-04-09 16:01:40 +02:00
dependabot[bot]
5fc95a879c chore(deps): bump google-api-python-client from 2.162.0 to 2.163.0 (#7191)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Prowler Bot
8ffa958c20 chore(regions_update): Changes in regions for AWS services (#7197)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Pablo Lara
476cc59ca1 chore: update changelog (#7199) 2025-04-09 16:01:40 +02:00
Pablo Lara
26390cdcd9 feat(invitations): Disable editing for accepted invites (#7198) 2025-04-09 16:01:40 +02:00
Pablo Lara
95b8390ddd chore(scans): rename type to trigger (#7196) 2025-04-09 16:01:40 +02:00
Pablo Lara
e9aebbd269 chore: auto refresh if the state is also available (#7195) 2025-04-09 16:01:40 +02:00
Pablo Lara
11ffbd86eb styles: tweaks styles (#7194) 2025-04-09 16:01:40 +02:00
Pablo Lara
9f32ff0c10 chore(launch-scan): update wording (#7193) 2025-04-09 16:01:40 +02:00
Pablo Lara
33046eb95d chore: update the changelog (#7190) 2025-04-09 16:01:40 +02:00
Hugo Pereira Brito
a77d6ebaa2 feat(microsoft365): add new check entra_admin_users_sign_in_frequency_enabled (#7020)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Pablo Lara
f1fa79b7ae feat(scans): allow running a scan once (#7188) 2025-04-09 16:01:40 +02:00
Adrián Jesús Peña Rodríguez
72874201c1 docs: add users, invitations and RBAC (#7109) 2025-04-09 16:01:40 +02:00
Daniel Barranquero
d979076814 feat(entra): add new check entra_dynamic_group_for_guests_created (#7168)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Daniel Barranquero
2bdc6fef26 chore(providers): enhance Remediation.Code.CLI field from check's metadata (#7094)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Pedro Martín
5bfcdbb0b9 feat(gcp): add ISO 27001 2022 compliance framework (#7185) 2025-04-09 16:01:39 +02:00
Pedro Martín
8812566c83 fix(azure): add remaining checks for reqA.5.25 (#7182) 2025-04-09 16:01:39 +02:00
Daniel Barranquero
fab7a107fa feat(entra): add new check entra_admin_consent_workflow_enabled (#7110) 2025-04-09 16:01:39 +02:00
Adrián Jesús Peña Rodríguez
cbd5197ddc docs: add generate_output documentation (#7122)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-09 16:01:39 +02:00
Hugo Pereira Brito
0b87b63cdd refactor(microsoft365): resource metadata assertions (#7169) 2025-04-09 16:01:39 +02:00
Pedro Martín
18d7ce3607 feat(azure): add ISO 27001 2022 compliance framework (#7170) 2025-04-09 16:01:39 +02:00
dependabot[bot]
8f20eb958d chore(deps): bump tzlocal from 5.3 to 5.3.1 (#7162)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
Prowler Bot
33ab90e119 chore(regions_update): Changes in regions for AWS services (#7177)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
dependabot[bot]
d4c59bf4fe chore(deps): bump trufflesecurity/trufflehog from 3.88.15 to 3.88.16 (#7174)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
Harshit Raj Singh
5e45b30823 feat(aws): AWS Found Sec Best Practices & PCI DSS v3.2.1 upgrade (#7017)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-04-09 16:01:39 +02:00
Pablo Lara
77f15a8749 fix: tweak z-index for custom inputs (#7166) 2025-04-09 16:01:39 +02:00
Pablo Lara
953fc0590b feat(scans): improve scan launch provider selection (#7164) 2025-04-09 16:01:39 +02:00
dependabot[bot]
8ed6e497a3 chore(deps): bump django from 5.1.5 to 5.1.7 in /api (#7145)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:01 +02:00
dependabot[bot]
10c3a1e3ce chore(deps-dev): bump mock from 5.1.0 to 5.2.0 (#7099)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:58:53 +02:00
Kay Agahd
32dec0b235 fix(doc): event_time has been changed to time_dt but was not documented (#7136) 2025-04-09 15:58:53 +02:00
dependabot[bot]
bf50c5e7b0 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 (#7151)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:58:53 +02:00
Prowler Bot
a660cf7024 chore(regions_update): Changes in regions for AWS services (#7108)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:58:49 +02:00
Prowler Bot
50851aec64 chore(regions_update): Changes in regions for AWS services (#7119)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:58:35 +02:00
dependabot[bot]
b931a81692 chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.3.0 to 1.4.1 (#7129)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:57:04 +02:00
Prowler Bot
7ce291d0d3 chore(regions_update): Changes in regions for AWS services (#7131)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:56:58 +02:00
dependabot[bot]
d257e6368e chore(deps): bump trufflesecurity/trufflehog from 3.88.14 to 3.88.15 (#7127)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:55:06 +02:00
César Arroba
9f43391521 chore: increase release to 5.5.0 (#7143) 2025-04-09 15:54:59 +02:00
Prowler Bot
02099b793d chore(regions_update): Changes in regions for AWS services (#7146)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:53:07 +02:00
Pablo Lara
56b760208e chore: update changelog (#7149) 2025-04-09 15:53:07 +02:00
Pablo Lara
042f138f56 feat: add changelog (#7141) 2025-04-09 15:53:07 +02:00
Pepe Fagoaga
f347080dbd chore(release): bump for 5.4.5 (#7475) 2025-04-08 13:14:51 -04:00
Prowler Bot
2cc8363697 fix: handle errors in AWS, Azure, and GCP (#7471)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-08 19:03:42 +05:45
Prowler Bot
154467e7c8 fix(provider): disable periodic task on views before deleting (#7470)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-08 15:50:39 +05:45
Prowler Bot
16c71f3c04 fix(soc2_aws): update compliance and remove some requirements (#7455)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-07 16:53:52 -04:00
Prowler Bot
849707166a fix(gcp): handle logic for empty project names (#7450)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 15:21:54 -04:00
Prowler Bot
59513d777c fix(aws): add resource arn for transit gateways (#7448)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 13:38:28 -04:00
Prowler Bot
f52929673d fix(gcp): ignore redirect balancers and add regional ones (#7449)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 12:43:45 -04:00
Prowler Bot
611681488d fix(defender): add default resource name in contacts (#7441)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-04 11:49:58 -04:00
Prowler Bot
9630f23585 fix(aws): solve multiple errors (#7440)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-04 10:47:08 -04:00
Prowler Bot
23cfd91708 fix(azure): remove resource_name inside the Check_Report (#7430)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-03 12:34:53 -04:00
Prowler Bot
0c44f28bf7 fix(gcp): make logging sink check at project level (#7428)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-03 11:34:03 -04:00
Pepe Fagoaga
09c529d810 chore(release): bump for 5.4.4 (#7427) 2025-04-03 10:19:07 -04:00
Prowler Bot
5be859d86c chore(deletion): Add environment variable for batch size (#7426)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-03 15:42:22 +05:45
Prowler Bot
5340008c8f chore(sentry): ignore exception when aws service not available in a region (#7398)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-04-03 13:01:14 +05:45
Prowler Bot
a5aa4c30d7 fix(scans): Handle duplicated scan tasks (#7409)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-03 13:00:58 +05:45
Prowler Bot
56d0c2fbea fix(resources): add the correct id and names for resources (#7414)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-01 22:24:33 +02:00
Prowler Bot
6b7ef199e0 fix(report): log as error when Resource ID or Name do not exist (#7412)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-01 21:23:16 +02:00
Prowler Bot
933c5063ee fix(redshift): validation error for Cluster.multi_az (#7400)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-31 14:51:49 +02:00
Prowler Bot
72a998a692 fix(rds): handle Certificate rds-ca-2019 not found (#7392)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-27 12:04:19 +01:00
Prowler Bot
a4d1e3bb69 fix(stepfunctions): Nonetype object has no attribute level (#7389)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-27 11:09:17 +01:00
Prowler Bot
2a8b04cced fix(fms): resource metadata could not be converted to dict (#7388)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-27 09:13:16 +01:00
Prowler Bot
cfc02186d4 fix(vm): handle Nonetype is not iterable for extensions (#7377)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-26 00:31:57 +01:00
Prowler Bot
7e432a3b69 fix(s3): handle None S3 account public access block (#7376)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 17:55:54 +01:00
Prowler Bot
fed8de314f fix(storagegateway): describe smb/nfs share per region (#7375)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 16:56:25 +01:00
Prowler Bot
ce23c4b5aa fix(network): handle Nonetype is not iterable for security groups (#7372)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-25 15:53:55 +01:00
Prowler Bot
33e99ef628 fix(vm): handle NoneType accessing security_profile (#7373)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 14:43:49 +01:00
Prowler Bot
72761a6ef6 fix(iam): handle none SAML Providers (#7371)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 11:18:10 +01:00
Prowler Bot
2a4fdff827 fix(iam): handle UnboundLocalError cannot access local variable 'report' (#7370)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-25 10:20:49 +01:00
Pepe Fagoaga
8e264313bc chore(release): bump for 5.4.3 2025-03-24 18:08:59 +01:00
Víctor Fernández Poyatos
21654b0bc0 chore: bump API version (#7355) 2025-03-24 20:20:52 +05:45
Prowler Bot
7fa57ba3e2 ref(providers): Refactor provider deletion functions (#7351)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-24 19:43:39 +05:45
Prowler Bot
ea9b6bb191 chore(awslambda): update obsolete lambda runtimes (#7345)
Co-authored-by: Jonny <106528116+jonathanbro@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-24 13:18:05 +01:00
Prowler Bot
95acb7b0c8 chore(next): Remove x-powered-by header (#7347)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-24 16:44:50 +05:45
Prowler Bot
d23edfa337 chore: upgrade Next.js to 14.2.25 to fix auth middleware vulnerability (#7340)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-24 09:52:42 +01:00
Pepe Fagoaga
40678a5863 chore(release): bump for 5.4.2 (#7328) 2025-03-20 18:49:13 +01:00
Prowler Bot
23aded92a3 chore(api): Update CHANGELOG (#7327)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-20 15:24:16 +05:45
Prowler Bot
6e56d3862d fix(scan_id): Read the ID from the Scan object (#7326)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-20 15:22:57 +05:45
Prowler Bot
d95fccd163 fix(gcp): make provider id mandatory in test_connection (#7315)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-19 20:38:37 +05:45
Prowler Bot
7ddf860a55 fix: add a handled response in case local files are missing (#7227)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-19 11:58:25 +01:00
Prowler Bot
3f41c75a45 fix(route53): solve false positive in route53_public_hosted_zones_cloudwatch_logging_enabled (#7293)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-19 13:39:56 +05:45
Prowler Bot
04b6dbf639 fix(microsoft365): typo Microsoft365NotTenantIdButClientIdAndClienSecretError (#7258)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-03-19 13:38:08 +05:45
Prowler Bot
ff4d16deb5 fix(scan): add compliance info inside finding (#7247)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-19 13:37:16 +05:45
Prowler Bot
562921cd5e fix(test-connection): Handle provider without secret (#7290)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-19 13:36:38 +05:45
Prowler Bot
8f061e4fed fix(exports): change the way to remove the local export files after s3 upload (#7224)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-17 17:30:57 +05:45
Prowler Bot
3fb86d754a fix(cloudwatch): handle None metric alarms (#7207)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-12 16:18:44 +01:00
Prowler Bot
7874707310 chore(sentry): ignore new exceptions in Sentry (#7189)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-12 11:35:32 +01:00
Prowler Bot
1c934e37c7 chore(sentry): ignore expected errors in GCP API (#7186)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-11 17:27:07 +01:00
Prowler Bot
8459cff16d fix(ens): remove and change duplicated ids (#7180)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-11 12:46:31 +01:00
Prowler Bot
57ae096395 fix(azure): correct check title for SQL Server Unrestricted (#7160)
Co-authored-by: Gary Mclean <gary.mclean@krrv.io>
2025-03-07 19:22:35 +01:00
Prowler Bot
200185de25 fix(metadata): match type with check results (#7155)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-07 18:16:32 +01:00
Prowler Bot
f8447b0f79 fix(metadata): typo in ec2_securitygroup_allow_wide_open_public_ipv4 (#7158)
Co-authored-by: ryan-stavella <71134114+ryan-stavella@users.noreply.github.com>
2025-03-07 16:32:36 +01:00
Prowler Bot
19289bbe20 fix(aws): ecs_task_definitions_no_environment_secrets.metadata.json (#7153)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-03-07 15:27:59 +01:00
César Arroba
b5b371fa0c chore: increase release to 5.4.1 (#7144)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-07 14:20:28 +01:00
Prowler Bot
939a623cec fix: tweaks for compliance cards (#7148)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-07 11:40:55 +01:00
Víctor Fernández Poyatos
926f449ae6 fix(overviews): manage overview exceptions and use batch_size with bulk (#7140) 2025-03-06 15:39:51 +01:00
César Arroba
646668c6ae chore(ui-gha): delete double quotes on prowler version (#7139) 2025-03-06 15:39:40 +01:00
Prowler Bot
8e6b92792b fix(groups): display uid if alias is missing (#7138)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-06 14:41:57 +01:00
Prowler Bot
65c081ce38 fix(credentials): adjust helper links to fit width (#7134)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-06 13:01:10 +01:00
Pepe Fagoaga
5600131d6a revert(findings): change uid from varchar to text (#7132) 2025-03-06 11:43:07 +01:00
César Arroba
dbd271980f chore(ui-gha): add version prefix (#7125) 2025-03-05 16:34:42 +01:00
Víctor Fernández Poyatos
ff532a899e fix(reports): Fix task kwargs and result (#7124) 2025-03-05 16:34:36 +01:00
César Arroba
0c9675ec70 chore(ui): add prowler version on build (#7120) 2025-03-05 16:34:26 +01:00
Pablo Lara
d45eda2b2b feat(compliance): new compliance selector (#7118) 2025-03-05 16:34:07 +01:00
dependabot[bot]
0abcf80d19 chore(deps-dev): bump pytest from 8.3.4 to 8.3.5 (#7097)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-05 16:33:35 +01:00
Pablo Lara
c0e10fd395 chore(ui): update label from 'Select a scan job' to 'Select a cloud p… (#7107) 2025-03-05 16:32:20 +01:00
Pepe Fagoaga
a80e9b26a8 chore(api): Use Prowler from v5.4 (#7092) 2025-03-03 22:30:58 +05:45
Sergio Garcia
2a9cd57fb8 fix(deps): update vulnerable cryptography dependency (#6993) 2025-03-03 17:38:23 +01:00
Prowler Bot
a0ad1a5f49 chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7003)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-03 17:21:15 +01:00
César Arroba
dff22dd166 Revert "chore(api): update prowler version to 5.4 (#7091)"
This reverts commit 58138810b9.
2025-03-03 17:11:20 +01:00
César Arroba
58138810b9 chore(api): update prowler version to 5.4 (#7091)
Co-authored-by: Prowler Bot <bot@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-03 17:08:45 +01:00
Víctor Fernández Poyatos
1ecc272fe4 chore: update changelog 2025-03-03 16:27:05 +01:00
Pablo Lara
b784167006 fix(roles): show the correct error message (#7089) 2025-03-03 15:50:28 +01:00
Pablo Lara
cf0ec8dea0 fix: bug with create role and unlimited visibility checkbox (#7088) 2025-03-03 15:50:23 +01:00
Sergio Garcia
96cae5e961 feat(aws): add fixers for threat detection checks (#7085) 2025-03-03 15:50:18 +01:00
Pablo Lara
a48e5cb15f feat(version): add prowler version to the sidebar (#7086) 2025-03-03 15:50:14 +01:00
Pablo Lara
5a9ff007e0 chore: Update the latest table findings with the most recent changes (#7084) 2025-03-03 15:50:06 +01:00
Prowler Bot
24c45f894c chore(regions_update): Changes in regions for AWS services (#7034)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 15:47:09 +01:00
dependabot[bot]
5d03c85629 chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7001)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 15:46:15 +01:00
Pablo Lara
41dc397a7a feat(sidebar): sidebar with new functionalities (#7018) 2025-03-03 15:01:38 +01:00
Prowler Bot
237a9adce9 chore(regions_update): Changes in regions for AWS services (#7067)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 15:01:29 +01:00
Sergio Garcia
a06167f1c2 fix(threat detection): run single threat detection check (#7065) 2025-03-03 15:01:21 +01:00
Pepe Fagoaga
a7d58c40dd refactor(stats): Use Finding instead of Check_Report (#7053)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-03-03 15:01:12 +01:00
Pepe Fagoaga
e260c46389 chore(examples): Scan AWS (#7064) 2025-03-03 15:01:02 +01:00
Sergio Garcia
115169a596 chore(gcp): enhance GCP APIs logic (#7046) 2025-03-03 15:00:49 +01:00
dependabot[bot]
5b19173c1d chore(deps): bump trufflesecurity/trufflehog from 3.88.13 to 3.88.14 (#7063)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 15:00:40 +01:00
Daniel Barranquero
d3dd1644e6 feat(m365): add sharepoint service with 4 checks (#7057)
Co-authored-by: MarioRgzLpz <mariorgzlpz1809@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 15:00:29 +01:00
Pedro Martín
8ff0c59964 feat(docs): add info related with sts assume role and regions (#7062) 2025-03-03 15:00:19 +01:00
Daniel Barranquero
285939c389 fix(azure): handle account not supporting Blob (#7060) 2025-03-03 15:00:04 +01:00
Sergio Garcia
a62ae8af51 fix(ecs): ensure unique finding id in ECS checks (#7059) 2025-03-03 14:59:56 +01:00
Prowler Bot
5d78b9e439 chore(regions_update): Changes in regions for AWS services (#7056)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:59:45 +01:00
Hugo Pereira Brito
1056c270ca feat(microsoft365): add new check entra_policy_ensure_default_user_cannot_create_tenants (#6918)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:59:35 +01:00
Pablo Lara
eeef6600b7 feat(exports): download scan exports (#7006) 2025-03-03 14:59:14 +01:00
Pepe Fagoaga
e142f17abe fix(env): UI version must be stable (#7055) 2025-03-03 14:58:45 +01:00
Víctor Fernández Poyatos
a65d858dac fix(migrations): Fix migration dependency order (#7051) 2025-03-03 14:58:32 +01:00
Víctor Fernández Poyatos
6235a1ba41 feat(labeler): apply label on migration changes (#7052) 2025-03-03 14:58:22 +01:00
Pepe Fagoaga
05007d03ee fix(findings): change uid from varchar to text (#7048) 2025-03-03 14:58:13 +01:00
Víctor Fernández Poyatos
102d099947 feat(findings): Add Django management command to populate database with dummy data (#7049) 2025-03-03 14:58:03 +01:00
Adrián Jesús Peña Rodríguez
3194675a5c feat(export): add API export system (#6878) 2025-03-03 14:57:40 +01:00
dependabot[bot]
14e6e4aa68 chore(deps-dev): bump black from 24.10.0 to 25.1.0 (#6733)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:57:22 +01:00
Pedro Martín
b24c3665b5 feat(aws): add ISO 27001 2022 compliance framework (#7035)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:56:31 +01:00
Prowler Bot
1f60878867 chore(regions_update): Changes in regions for AWS services (#7015)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:54:16 +01:00
dependabot[bot]
2dd18662d8 chore(deps): bump google-api-python-client from 2.161.0 to 2.162.0 (#7037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:54:02 +01:00
Hugo Pereira Brito
175360dbe6 refactor(microsoft365): CheckReportMicrosoft365 and resource metadata (#6952) 2025-03-03 14:53:42 +01:00
Víctor Fernández Poyatos
80e24b971f feat(findings): Optimize findings endpoint (#7019) 2025-03-03 14:53:22 +01:00
Pepe Fagoaga
78877c470a chore(action): Conventional Commit Check (#7033) 2025-03-03 14:53:08 +01:00
dependabot[bot]
9c9b100359 chore(deps): bump trufflesecurity/trufflehog from 3.88.12 to 3.88.13 (#7026)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:52:51 +01:00
Pedro Martín
10f3232294 feat(outputs): add sample outputs (#6945) 2025-03-03 14:52:37 +01:00
Pedro Martín
a2e5f70f36 fix(cis): show report table on the CLI (#6979) 2025-03-03 14:52:23 +01:00
Pedro Martín
8d8b31c757 feat(azure): add PCI DSS 4.0 (#6982) 2025-03-03 14:52:03 +01:00
Pedro Martín
cba1e718b9 feat(kubernetes): add PCI DSS 4.0 (#7013) 2025-03-03 14:51:42 +01:00
Pedro Martín
6c3c37fc26 feat(dashboard): take the latest finding uid by timestamp (#6987) 2025-03-03 14:51:29 +01:00
Víctor Fernández Poyatos
b610cacd0c feat(tasks): add deletion queue for deletion tasks (#7022) 2025-03-03 14:51:14 +01:00
Pedro Martín
027a5705cb feat(gcp): add PCI DSS 4.0 (#7010) 2025-03-03 14:51:00 +01:00
Prowler Bot
b7fbfb4360 chore(regions_update): Changes in regions for AWS services (#7011)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:50:41 +01:00
dependabot[bot]
5acf0a7e3d chore(deps-dev): bump mkdocs-material from 9.6.4 to 9.6.5 (#7007)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:50:26 +01:00
Raj Chowdhury
3a25e86e30 fix(typo): solve typo in dashboard.md (#7009) 2025-03-03 14:50:06 +01:00
dependabot[bot]
50f1592eb3 chore(deps): bump trufflesecurity/trufflehog from 3.88.11 to 3.88.12 (#7008)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:49:46 +01:00
César Arroba
0f2927cb88 feat(api): setup sentry for OSS API (#6874) 2025-03-03 14:49:32 +01:00
Pablo Lara
b4e1434052 chore(users): renaming the account now triggers a re-render in the sidebar (#7005) 2025-03-03 14:49:16 +01:00
dependabot[bot]
43710783f9 chore(deps): bump python from 3.12.8-alpine3.20 to 3.12.9-alpine3.20 (#6882)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:49:04 +01:00
dependabot[bot]
16f767e7b9 chore(deps): bump tzlocal from 5.2 to 5.3 (#6932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:48:41 +01:00
Hugo Pereira Brito
42818217a0 docs(tutorials): update all deprecated poetry shell references (#7002) 2025-03-03 14:44:50 +01:00
Prowler Bot
13405594b2 chore(regions_update): Changes in regions for AWS services (#6998)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:44:33 +01:00
Pedro Martín
b5a3852334 chore(github): add compliance to PR labeler (#6996) 2025-03-03 14:44:16 +01:00
Hugo Pereira Brito
181ff1acb3 docs(installation): add warning for poetry shell deprecation in README (#6983) 2025-03-03 14:43:35 +01:00
Pablo Lara
91e59a3279 chore(findings): add 'Status Extended' attribute to finding details (#6997) 2025-03-03 14:43:20 +01:00
Pedro Martín
b3fad1a765 feat(aws): add PCI DSS 4.0 (#6949) 2025-03-03 14:42:41 +01:00
dependabot[bot]
6f90927a79 chore(deps): bump trufflesecurity/trufflehog from 3.88.9 to 3.88.11 (#6988)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:42:24 +01:00
dependabot[bot]
0bb4a9a3e9 chore(deps): bump kubernetes from 32.0.0 to 32.0.1 (#6992)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:42:10 +01:00
Pablo Lara
80d9cde60b feat(scans): update the progress for executing scans (#6972) 2025-03-03 14:41:39 +01:00
César Arroba
11196c2f83 chore(gha): trigger API or UI deployment when push to master (#6946) 2025-03-03 14:41:21 +01:00
Prowler Bot
55a0d0a1b5 chore(regions_update): Changes in regions for AWS services (#6978) 2025-03-03 14:41:06 +01:00
Pedro Martín
4e5e1d7bd4 feat(aws): add compliance CIS 4.0 (#6937) 2025-03-03 14:40:17 +01:00
dependabot[bot]
06a0a434ab chore(deps-dev): bump flake8 from 7.1.1 to 7.1.2 (#6954)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:39:58 +01:00
Pepe Fagoaga
153833fc55 fix(ocsf): Adapt for 1.4.0 (#6971) 2025-03-03 14:38:57 +01:00
Prowler Bot
fc4877975f chore(regions_update): Changes in regions for AWS services (#6968)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:38:21 +01:00
dependabot[bot]
0797efd4fd chore(deps-dev): bump bandit from 1.8.2 to 1.8.3 (#6955)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:38:07 +01:00
Prowler Bot
fbec99a0b7 chore(regions_update): Changes in regions for AWS services (#6944)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:37:28 +01:00
dependabot[bot]
b2cb1de95e chore(deps): bump trufflesecurity/trufflehog from 3.88.8 to 3.88.9 (#6943)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:37:06 +01:00
dependabot[bot]
190c2316d7 chore(deps): bump google-api-python-client from 2.160.0 to 2.161.0 (#6933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:36:16 +01:00
César Arroba
f6c352281a fix(gha): fix short sha step (#6939) 2025-03-03 14:36:00 +01:00
César Arroba
66dfe89936 chore(gha): add tag for api and ui images on push to master (#6920) 2025-03-03 14:35:41 +01:00
dependabot[bot]
8b3942ca49 chore(deps): bump trufflesecurity/trufflehog from 3.88.7 to 3.88.8 (#6931)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:35:06 +01:00
dependabot[bot]
9d35213bd5 chore(deps-dev): bump mkdocs-material from 9.6.3 to 9.6.4 (#6913)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:34:13 +01:00
Víctor Fernández Poyatos
3e586e615d feat(social-login): Add social login integration for Google and Github OAuth providers (#6906) 2025-03-03 14:33:12 +01:00
Sergio Garcia
a4f950e093 chore(docs): external K8s cluster Prowler App credentials (#6921) 2025-03-03 14:32:30 +01:00
Pedro Martín
7c2441f6ff fix(gcp): remove typos on CIS 3.0 (#6917) 2025-03-03 14:31:48 +01:00
dependabot[bot]
0a92af3eb2 chore(deps): bump trufflesecurity/trufflehog from 3.88.6 to 3.88.7 (#6915)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:31:25 +01:00
César Arroba
666f3a0e20 fix(gha): fix test build containers on pull requests actions (#6909) 2025-03-03 14:30:35 +01:00
dependabot[bot]
06ef98b5cc chore(deps-dev): bump coverage from 7.6.11 to 7.6.12 (#6897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:29:39 +01:00
Prowler Bot
79125bdd40 chore(regions_update): Changes in regions for AWS services (#6900)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:28:42 +01:00
César Arroba
e8e8b085ac chore(api): test build container image on pull request (#6850) 2025-03-03 14:28:24 +01:00
César Arroba
c9b81d003a chore(ui): test build container image on pull request (#6849) 2025-03-03 14:27:33 +01:00
Pepe Fagoaga
23fa3c1e38 chore(version): Update version to 5.4.0 (#6894) 2025-03-03 14:26:51 +01:00
dependabot[bot]
03fbd0baca chore(deps-dev): bump coverage from 7.6.10 to 7.6.11 (#6887)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:24:50 +01:00
dependabot[bot]
d4fe24ef47 chore(deps): bump trufflesecurity/trufflehog from 3.88.5 to 3.88.6 (#6888)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 13:05:51 +01:00
Prowler Bot
9c5220ee98 fix(elasticache): improve logic in elasticache_redis_cluster_backup_enabled (#7045)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-26 12:12:44 +01:00
Prowler Bot
6491bce5a6 fix(azure): migrate resource models to avoid using SDK defaults (#7043)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-02-26 10:49:33 +01:00
Prowler Bot
ca375dd79c chore(iam): enhance iam_role_cross_service_confused_deputy_prevention recommendation (#7041)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-26 08:35:26 +01:00
Prowler Bot
e807573b54 fix(soc2_aws): remove duplicated checks (#6999)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 17:26:50 +05:30
Prowler Bot
c0f4c9743f fix(deps): update vulnerable cryptography dependency (#6994)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 16:04:14 +05:30
Prowler Bot
5974d0b5da fix(report): remove invalid resources in report (#6984)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 11:05:58 +05:30
Prowler Bot
6244a8a5f7 fix(roles): handle empty response in deleteRole and ensure revalidation (#6977)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-19 09:06:56 +01:00
Prowler Bot
5b9dae4529 test(cloudfront): add name retrieval test for cloudfront bucket domains (#6975)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-19 09:12:17 +05:30
Prowler Bot
a424374c44 fix(cloudfront): incorrect bucket name retrieval (#6967)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-19 08:08:31 +05:30
Prowler Bot
b7fc2542e8 fix(gcp): Correct false positive when sslMode=ENCRYPTED_ONLY in CloudSQL (#6942)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-02-14 16:36:26 -05:00
Prowler Bot
83a1598a1e fix(issue pages): apply sorting by default in issue pages (#6935)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-14 10:35:49 +01:00
Prowler Bot
b22b56a06b fix(gcp): handle DNS Managed Zone with no DNSSEC (#6928)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-13 14:50:05 -05:00
Prowler Bot
5020e4713c fix(aws): codebuild service threw KeyError for projects type CODEPIPELINE (#6930)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-13 13:53:05 -05:00
Prowler Bot
ee534a740e fix(aws): SNS threw IndexError if SubscriptionArn is PendingConfirmation (#6923)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-13 10:35:19 -05:00
Prowler Bot
48cb45b7a8 fix(aws): handle AccessDenied when retrieving resource policy (#6912)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-12 19:15:51 -05:00
Prowler Bot
91b74822e9 fix(kms): Amazon KMS API call error handling (#6904)
Co-authored-by: Ogonna Iwunze <1915636+wunzeco@users.noreply.github.com>
2025-02-12 11:08:47 -05:00
Pepe Fagoaga
287eef5085 chore(version): Update version to 5.3.1 (#6895) 2025-02-12 16:39:09 +05:45
Mario Rodriguez Lopez
45d359c84a feat(microsoft365): Add documentation and compliance file (#6195)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-10 17:18:43 +01:00
Pablo Lara
6049e5d4e8 chore: Update prowler api version (#6877)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 17:39:08 +05:45
Víctor Fernández Poyatos
dfd377f89e chore(api): Update changelog and specs (#6876) 2025-02-10 12:25:04 +01:00
Víctor Fernández Poyatos
37e6c52c14 chore: Add needed steps for API in PR template (#6875) 2025-02-10 12:24:51 +01:00
Pepe Fagoaga
d6a7f4d88f fix(kubernetes): Change UID validation (#6869)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-10 11:24:34 +01:00
Pepe Fagoaga
239cda0a90 chore: Rename dashboard table latest findings (#6873)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-10 11:24:27 +01:00
dependabot[bot]
4a821e425b chore(deps-dev): bump mkdocs-material from 9.6.2 to 9.6.3 (#6871)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:24:21 +01:00
Sergio Garcia
e1a2f0c204 docs(eks): add documentation about EKS onboarding (#6853)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:16 +01:00
Prowler Bot
c70860c733 chore(regions_update): Changes in regions for AWS services (#6858)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:24:09 +01:00
Víctor Fernández Poyatos
05e71e033f feat(findings): Use ArrayAgg and subqueries on metadata endpoint (#6863)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:03 +01:00
Pablo Lara
5164ec2eb9 feat: implement new functionality with inserted_at__gte in findings a… (#6864) 2025-02-10 11:23:58 +01:00
Víctor Fernández Poyatos
be18dac4f9 docs: Add details about user creation in Prowler app (#6862) 2025-02-10 11:23:52 +01:00
dependabot[bot]
bb126c242f chore(deps): bump microsoft-kiota-abstractions from 1.9.1 to 1.9.2 (#6856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:47 +01:00
Víctor Fernández Poyatos
e27780a856 feat(findings): Require date filters for findings endpoints (#6800) 2025-02-10 11:23:41 +01:00
Pranay Girase
196ec51751 fix(typo): typos in Dashboard and Report in HTML (#6847) 2025-02-10 11:23:33 +01:00
Prowler Bot
86abf9e64c chore(regions_update): Changes in regions for AWS services (#6848)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:23:28 +01:00
dependabot[bot]
9d8be578e3 chore(deps): bump trufflesecurity/trufflehog from 3.88.4 to 3.88.5 (#6844)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:19 +01:00
Mario Rodriguez Lopez
b3aa800082 feat(entra): add new check entra_thirdparty_integrated_apps_not_allowed (#6357)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:13 +01:00
Ogonna Iwunze
501674a778 feat(kms): add kms_cmk_not_multi_region AWS check (#6794)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:01 +01:00
Prowler Bot
5ff6ae79d8 chore(regions_update): Changes in regions for AWS services (#6821)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:21:08 +01:00
Mario Rodriguez Lopez
e518a869ab feat(entra): add new entra service for Microsoft365 (#6326)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:58 +01:00
Mario Rodriguez Lopez
43927a62f3 feat(microsoft365): add new check admincenter_settings_password_never_expire (#6023)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:52 +01:00
dependabot[bot]
335980c8d8 chore(deps): bump kubernetes from 31.0.0 to 32.0.0 (#6678)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:20:42 +01:00
Pablo Lara
ca3ee378db style(forms): improve spacing consistency (#6814) 2025-02-10 11:17:11 +01:00
Pablo Lara
c05bc1068a chore(forms): improvements to the sign-in and sign-up forms (#6813) 2025-02-10 11:17:03 +01:00
Drew Kerrigan
2e3164636d docs: add description of changed and new delta values to prowler app tutorial (#6801) 2025-02-10 11:16:56 +01:00
dependabot[bot]
c34e07fc40 chore(deps): bump pytz from 2024.2 to 2025.1 (#6765)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:50 +01:00
dependabot[bot]
6022122a61 chore(deps-dev): bump mkdocs-material from 9.5.50 to 9.6.2 (#6799)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:44 +01:00
dependabot[bot]
f65f5e4b46 chore(deps-dev): bump pylint from 3.3.3 to 3.3.4 (#6721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:37 +01:00
Pablo Lara
dee17733a0 feat(scans): show scan details right after launch (#6791) 2025-02-10 11:16:30 +01:00
dependabot[bot]
cddda1e64e chore(deps): bump trufflesecurity/trufflehog from 3.88.2 to 3.88.4 (#6760)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:23 +01:00
dependabot[bot]
f7b873db03 chore(deps): bump google-api-python-client from 2.159.0 to 2.160.0 (#6720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:17 +01:00
Víctor Fernández Poyatos
792bc70d0a feat(schedules): Rework daily schedule to always show the next scan (#6700) 2025-02-10 11:16:11 +01:00
Hugo Pereira Brito
185491b061 fix: microsoft365 mutelist (#6724) 2025-02-10 11:16:05 +01:00
dependabot[bot]
3af8a43480 chore(deps): bump microsoft-kiota-abstractions from 1.6.8 to 1.9.1 (#6734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:57 +01:00
Pablo Lara
fd78406b29 fix: Enable hot reloading when using Docker Compose for UI (#6750) 2025-02-10 11:15:49 +01:00
Matt Johnson
4758b258a3 chore: Update Google Analytics ID across all docs.prowler.com sites. (#6730) 2025-02-10 11:15:41 +01:00
dependabot[bot]
015e2b3b88 chore(deps): bump uuid from 10.0.0 to 11.0.5 in /ui (#6516)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:33 +01:00
Mario Rodriguez Lopez
e184c9cb61 feat(m365): add Microsoft 365 provider (#5902)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:15:24 +01:00
dependabot[bot]
9004a01183 chore(deps): bump azure-mgmt-web from 7.3.1 to 8.0.0 (#6680)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:18 +01:00
dependabot[bot]
dd65ba3d9e chore(deps): bump msgraph-sdk from 1.17.0 to 1.18.0 (#6679)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:08 +01:00
dependabot[bot]
bba616a18f chore(deps): bump azure-storage-blob from 12.24.0 to 12.24.1 (#6666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:14:33 +01:00
Pepe Fagoaga
aa0f8d2981 chore: bump for next minor (#6672) 2025-02-10 11:13:42 +01:00
Paolo Frigo
2511d6ffa9 docs: update # of checks, services, frameworks and categories (#6528)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:12:02 +01:00
Prowler Bot
27329457be fix(dashboard): adjust the bar chart display (#6868)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-07 11:04:23 -05:00
Prowler Bot
7189f3d526 fix(aws): key error for detect-secrets (#6865)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-07 10:04:54 -05:00
Prowler Bot
58e7589c9d fix(kms): handle error in DescribeKey function (#6842)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-05 15:27:43 -05:00
Prowler Bot
d60f4b5ded fix(cloudfront): fix false positive in s3 origins (#6838)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 13:36:49 -05:00
Prowler Bot
4c2ec094f6 fix(findings): Spelling mistakes correction (#6834)
Co-authored-by: Gary Mclean <gary.mclean@krrv.io>
2025-02-05 11:53:17 -05:00
Prowler Bot
395ecaff5b fix(directoryservice): handle ClientException (#6828)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 11:20:13 -05:00
Prowler Bot
c39506ef7d fix(aws): wording of report.status_extended in awslambda_function_not_publicly_accessible (#6831)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-05 11:18:27 -05:00
Prowler Bot
eb90d479e2 chore(aws_audit_manager_control_tower_guardrails): add checks to reqs (#6803)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 19:47:04 -05:00
Prowler Bot
b92a73f5ea fix(elasticache): InvalidReplicationGroupStateFault error (#6820)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-04 16:08:26 -05:00
Prowler Bot
ad121f3059 chore(deps-dev): bump moto from 5.0.27 to 5.0.28 (#6817)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 14:25:04 -05:00
Pepe Fagoaga
70e4c5a44e chore: bump for next patch (#6764) 2025-02-03 15:25:23 -05:00
Prowler Bot
b5a46b7b59 fix(cis_1.5_aws): add checks to needed reqs (#6798)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:53 -05:00
Prowler Bot
f1a97cd166 fix(cis_1.4_aws): add checks to needed reqs (#6796)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:39 -05:00
Prowler Bot
0774508093 fix(cis_2.0_aws): add checks to needed reqs (#6787)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:24 -05:00
Prowler Bot
0664ce6b94 fix(gcp): fix wrong provider value in check (#6789)
Co-authored-by: secretcod3r <101349794+secretcod3r@users.noreply.github.com>
2025-02-03 10:32:53 -05:00
Prowler Bot
407c779c52 fix(findings): remove default status filtering (#6785)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 15:25:11 +01:00
Prowler Bot
c60f13f23f fix(findings): order findings by inserted_at DESC (#6783)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 11:57:45 +01:00
Prowler Bot
37d912ef01 fix(celery): Kill celery worker process after every task to release memory (#6763)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-31 19:37:34 +05:45
Pepe Fagoaga
d3de89c017 chore: bump for next patch (#6762) 2025-01-31 19:34:54 +05:45
Prowler Bot
cb22af25c6 fix(db_event): Handle other events (#6757)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 21:48:42 +05:45
Prowler Bot
a534b94495 fix(set_report_color): Add more details to error (#6755)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 21:46:34 +05:45
Prowler Bot
6262b4ff0b feat(scans): Optimize read queries during scans (#6756)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-30 20:56:11 +05:45
Prowler Bot
84ecd7ab2c feat(findings): Improve /findings/metadata performance (#6749)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-30 18:39:25 +05:45
Prowler Bot
1a5428445a fix(neptune): correct service name (#6747)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:52:00 +05:45
Prowler Bot
ac8e991ca0 fix(aws): iam_user_with_temporary_credentials resource in OCSF (#6741)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-30 17:17:20 +05:45
Prowler Bot
83a0331472 fix(acm): Key Error DomainName (#6744)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:17:05 +05:45
Prowler Bot
cce31e2971 fix(finding): raise when generating invalid findings (#6745)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:16:50 +05:45
Prowler Bot
0adf7d6e77 fix(sns): Add region to subscriptions (#6740)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 15:46:55 +05:45
Pepe Fagoaga
295f8b557e chore: bump for next patch (#6727) 2025-01-29 20:00:47 +05:45
Prowler Bot
bb2c5c3161 fix(scans): change label for next scan (#6726)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-29 10:49:35 +01:00
Prowler Bot
0018f36a36 fix(migrations): Use indexes instead of constraints to define an index (#6723)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-29 14:29:03 +05:45
Prowler Bot
857de84f49 fix(defender): add field to SecurityContacts (#6715)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-29 13:53:14 +05:45
Prowler Bot
9630f2242a revert: Update Django DB manager to use psycopg3 and connection pooling (#6719)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-28 22:33:32 +05:45
Prowler Bot
1fe125867c fix(scan-summaries): Improve efficiency on providers overview (#6718)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-28 22:15:22 +05:45
Prowler Bot
0737893240 fix(scans): filters and sorting for scan table (#6714)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-28 13:29:39 +01:00
Prowler Bot
282fe3d348 fix(scans, findings): Improve API performance ordering by inserted_at instead of id (#6712)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-28 17:28:26 +05:45
Prowler Bot
b5d83640ae fix: bug when opening finding details while a scan is in progress (#6709)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-28 07:01:43 +01:00
Prowler Bot
2823d3ad21 fix(cloudsql): add trusted client certificates case for cloudsql_instance_ssl_connections (#6687)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-24 12:19:05 -05:00
Prowler Bot
00b93bfe86 fix(cloudwatch): NoneType object is not iterable (#6677)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-01-23 13:25:02 -05:00
Pepe Fagoaga
84c253d887 chore: bump for next patch (#6673) 2025-01-23 13:13:05 -05:00
Pepe Fagoaga
2ab5a702c9 chore(api): Bump to v1.3.0 (#6670) 2025-01-23 16:40:59 +01:00
Sergio Garcia
11d9cdf24e feat(resource metadata): add resource metadata to JSON OCSF (#6592)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-01-23 16:08:13 +01:00
Rubén De la Torre Vico
b1de41619b fix: add missing Check_Report_Azure parameters (#6583) 2025-01-23 16:07:23 +01:00
Rubén De la Torre Vico
18a4881a51 feat(network): extract Network resource metadata automated (#6555)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:44 +01:00
Rubén De la Torre Vico
de76a168c0 feat(storage): extract Storage resource metadata automated (#6563)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:40 +01:00
Rubén De la Torre Vico
bcc13a742d feat(vm): extract VM resource metadata automated (#6564) 2025-01-23 16:06:32 +01:00
Rubén De la Torre Vico
23b584f4bf feat(sqlserver): extract SQL Server resource metadata automated (#6562) 2025-01-23 16:06:25 +01:00
Daniel Barranquero
074b7c1ff5 feat(aws): include resource metadata to remaining checks (#6551)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:06:18 +01:00
Rubén De la Torre Vico
c04e9ed914 feat(postgresql): extract PostgreSQL resource metadata automated (#6560) 2025-01-23 16:06:11 +01:00
Rubén De la Torre Vico
1f1b126e79 feat(policy): extract Policy resource metadata automated (#6558) 2025-01-23 16:06:06 +01:00
Rubén De la Torre Vico
9b0dd80f13 feat(entra): extract Entra resource metadata automated (#6542)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:01 +01:00
Rubén De la Torre Vico
b0807478f2 feat(monitor): extract monitor resource metadata automated (#6554)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:05:54 +01:00
Rubén De la Torre Vico
11dfa58ecd feat(mysql): extract MySQL resource metadata automated (#6556) 2025-01-23 16:05:48 +01:00
Rubén De la Torre Vico
412f396e0a feat(keyvault): extract KeyVault resource metadata automated (#6553) 2025-01-23 16:05:44 +01:00
Rubén De la Torre Vico
8100d43ff2 feat(iam): extract IAM resource metadata automated (#6552) 2025-01-23 16:05:38 +01:00
Sergio Garcia
bc96acef48 fix(gcp): iterate through service projects (#6549)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-01-23 16:05:24 +01:00
Hugo Pereira Brito
6e9876b61a feat(aws): include resource metadata in services from r* to s* (#6536)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:05:17 +01:00
Pedro Martín
32253ca4f7 feat(gcp): add resource metadata to report (#6500)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:05:08 +01:00
Hugo Pereira Brito
4c6be5e283 feat(aws): include resource metadata in services from a* to b* (#6504)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:05:03 +01:00
Daniel Barranquero
69178fd7bd chore(aws): add resource metadata to services from t to w (#6546) 2025-01-23 16:04:49 +01:00
Daniel Barranquero
cd4432c14b chore(aws): add resource metadata to services from f to o (#6545) 2025-01-23 16:04:40 +01:00
Rubén De la Torre Vico
0e47172aec feat(defender): extract Defender resource metadata in automated way (#6538) 2025-01-23 16:04:30 +01:00
Rubén De la Torre Vico
efe18ae8b2 feat(appinsights): extract App Insights resource metadata in automated way (#6540) 2025-01-23 16:04:18 +01:00
Hugo Pereira Brito
d5e13df4fe feat: add resource metadata to emr_cluster_account_public_block_enabled (#6539) 2025-01-23 16:04:13 +01:00
Sergio Garcia
b0e84d74f2 feat(kubernetes): add resource metadata to report (#6479) 2025-01-23 16:04:05 +01:00
Hugo Pereira Brito
28526b591c feat(aws): include resource metadata in services from d* to e* (#6532)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:03:59 +01:00
Daniel Barranquero
51e6a6edb1 feat(aws): add resource metadata to all services starting with c (#6493) 2025-01-23 16:03:50 +01:00
Rubén De la Torre Vico
c1b132a29e feat(cosmosdb): extract CosmosDB resource metadata in automated way (#6533) 2025-01-23 16:03:43 +01:00
Rubén De la Torre Vico
e2530e7d57 feat(containerregistry): extract Container Registry resource metadata in automated way (#6530) 2025-01-23 16:03:37 +01:00
Rubén De la Torre Vico
f2fcb1599b feat(azure-app): extract Web App resource metadata in automated way (#6529) 2025-01-23 16:03:32 +01:00
Rubén De la Torre Vico
160eafa0c9 feat(aks): use Check_Report_Azure constructor properly in AKS checks (#6509) 2025-01-23 16:03:17 +01:00
Rubén De la Torre Vico
c2bc0f1368 feat(aisearch): use Check_Report_Azure constructor properly in AISearch checks (#6506) 2025-01-23 16:03:10 +01:00
Rubén De la Torre Vico
27d27fff81 feat(azure): include resource metadata in Check_Report_Azure (#6505) 2025-01-23 16:02:00 +01:00
Pepe Fagoaga
66e8f0ce18 chore(scan): Remove ._findings (#6667) 2025-01-23 15:58:37 +01:00
Pedro Martín
0ceae8361b feat(kubernetes): add CIS 1.10 compliance (#6508)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
# Conflicts:
#	prowler/compliance/kubernetes/cis_1.10_kubernetes.json
2025-01-23 15:39:40 +01:00
Pedro Martín
5e52fded83 feat(azure): add CIS 3.0 for Azure (#5226)
# Conflicts:
#	prowler/compliance/azure/cis_3.0_azure.json
2025-01-23 15:35:35 +01:00
Pablo Lara
0a7d07b4b6 chore: adjust DateWithTime component height when used with InfoField (#6669) 2025-01-23 15:21:42 +01:00
Pablo Lara
34b5f483d7 chore(scans): improve scan details (#6665) 2025-01-23 14:16:35 +01:00
Pedro Martín
9d04a1bc52 feat(detect-secrets): get secrets plugins from config.yaml (#6544)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-23 14:16:15 +01:00
dependabot[bot]
b25c0aaa00 chore(deps): bump azure-mgmt-containerservice from 33.0.0 to 34.0.0 (#6630)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:16:03 +01:00
dependabot[bot]
652b93ea45 chore(deps): bump azure-mgmt-compute from 33.1.0 to 34.0.0 (#6628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:15:54 +01:00
Pepe Fagoaga
ccb7726511 fix(templates): Customize principals and add validation (#6655) 2025-01-23 14:15:41 +01:00
Anton Rubets
c514a4e451 chore(helm): Add prowler helm support (#6580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:15:33 +01:00
Prowler Bot
d841bd6890 chore(regions_update): Changes in regions for AWS services (#6652)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:15:24 +01:00
dependabot[bot]
ff54e10ab2 chore(deps): bump boto3 from 1.35.94 to 1.35.99 (#6651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:15:03 +01:00
Pepe Fagoaga
718e562741 chore(pre-commit): poetry checks for API and SDK (#6658) 2025-01-23 14:14:10 +01:00
Pablo Lara
ce05e2a939 feat(providers): show the cloud formation and terraform template links on the form (#6660) 2025-01-23 14:13:37 +01:00
Pablo Lara
90831d3084 feat(providers): make external id field mandatory in the aws role secret form (#6656) 2025-01-23 14:13:30 +01:00
dependabot[bot]
2c928dead5 chore(deps-dev): bump moto from 5.0.16 to 5.0.27 (#6632)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:13:23 +01:00
dependabot[bot]
3ffe147664 chore(deps): bump botocore from 1.35.94 to 1.35.99 (#6520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:13:04 +01:00
dependabot[bot]
3373b0f47c chore(deps-dev): bump mkdocs-material from 9.5.49 to 9.5.50 (#6631)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:12:52 +01:00
Prowler Bot
b29db04560 chore(regions_update): Changes in regions for AWS services (#6599)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:12:38 +01:00
Hugo Pereira Brito
f964a0362e chore(services): delete all comment headers (#6585) 2025-01-23 14:11:58 +01:00
Prowler Bot
537b23dfae chore(regions_update): Changes in regions for AWS services (#6577)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:11:46 +01:00
Pablo Lara
48cf3528b4 feat(findings): add first seen in findings details (#6575) 2025-01-23 14:11:08 +01:00
dependabot[bot]
3c4b9d32c9 chore(deps): bump msgraph-sdk from 1.16.0 to 1.17.0 (#6547)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:09:46 +01:00
Víctor Fernández Poyatos
fd8361ae2c feat(db): Update Django DB manager to use psycopg3 and connection pooling (#6541)
# Conflicts:
#	api/poetry.lock
#	api/pyproject.toml
2025-01-23 14:07:34 +01:00
Prowler Bot
33cadaa932 chore(regions_update): Changes in regions for AWS services (#6526)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:02:14 +01:00
dependabot[bot]
cc126eb8b4 chore(deps): bump google-api-python-client from 2.158.0 to 2.159.0 (#6521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:02:07 +01:00
Pedro Martín
c996b3f6aa docs(readme): update pr template to add check for readme (#6531) 2025-01-23 14:01:53 +01:00
Adrián Jesús Peña Rodríguez
e2b406a300 feat(finding): add first_seen attribute (#6460) 2025-01-23 14:01:46 +01:00
dependabot[bot]
c2c69da603 chore(deps): bump django from 5.1.4 to 5.1.5 in /api (#6519)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
# Conflicts:
#	api/poetry.lock
2025-01-23 14:01:33 +01:00
Adrián Jesús Peña Rodríguez
4e51348ff2 feat(provider-secret): make existing external_id field mandatory (#6510) 2025-01-23 14:00:42 +01:00
dependabot[bot]
3257b82706 chore(deps-dev): bump eslint-config-prettier from 9.1.0 to 10.0.1 in /ui (#6518)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:59:26 +01:00
Pepe Fagoaga
c98d764daa chore(version): set next minor (#6511)
# Conflicts:
#	prowler/config/config.py
#	pyproject.toml
2025-01-23 13:57:20 +01:00
Prowler Bot
0448429a9f chore(regions_update): Changes in regions for AWS services (#6495)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 13:56:43 +01:00
Pepe Fagoaga
b948ac6125 feat(prowler-role): Add templates to deploy it in AWS (#6499) 2025-01-23 13:56:28 +01:00
dependabot[bot]
3acc09ea16 chore(deps): bump jinja2 from 3.1.4 to 3.1.5 (#6457)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:56:09 +01:00
dependabot[bot]
82d0d0de9d chore(deps-dev): bump bandit from 1.8.0 to 1.8.2 (#6485)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:56:00 +01:00
Prowler Bot
f9cfc4d087 fix: add detector and line number of potential secret (#6663)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-22 10:55:49 -05:00
Pepe Fagoaga
7d68ff455b chore(api): Use prowler from v5.1 (#6659) 2025-01-22 20:04:55 +05:45
Pepe Fagoaga
ddf4881971 chore(version): Update version to 5.1.6 (#6645) 2025-01-21 13:28:44 -05:00
Prowler Bot
9ad4944142 fix(filters): fix dynamic filters (#6643)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-21 13:40:18 +01:00
Prowler Bot
7f33ea76a4 fix(OCSF): fix OCSF output when timestamp is UNIX format (#6627)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-20 18:03:51 -05:00
Prowler Bot
1140c29384 fix(aws): list tags for DocumentDB clusters (#6622)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-20 17:22:06 -05:00
Prowler Bot
2441a62f39 fix: update Azure CIS with existing App checks (#6625)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-20 16:27:14 -05:00
Pepe Fagoaga
c26a231fc1 chore(version): Update version to 5.1.5 (#6618) 2025-01-20 15:07:58 -05:00
Prowler Bot
2fb2315037 chore(RBAC): add permission's info (#6617)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 17:31:52 +01:00
Prowler Bot
a9e475481a fix(snippet-id): improve provider ID readability in tables (#6616)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 17:28:04 +01:00
Prowler Bot
826d7c4dc3 fix(rbac): remove invalid required permission (#6614)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-01-20 17:02:31 +01:00
Prowler Bot
b7f4b37f66 feat(api): restrict the deletion of users, only the user of the request can be deleted (#6613)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-01-20 17:02:21 +01:00
Prowler Bot
193d691bfe fix(RBAC): tweaks for edit role form (#6610)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 14:12:43 +01:00
Prowler Bot
a359bc581c fix(RBAC): restore manage_account permission for roles (#6603)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 11:53:08 +01:00
Prowler Bot
9a28ff025a fix(sqs): fix flaky test (#6595)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-01-17 12:40:11 -05:00
Prowler Bot
f1c7050700 fix(apigatewayv2): managed exception NotFoundException (#6590)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-01-17 09:27:19 -05:00
Pepe Fagoaga
9391c27b9e chore(version): Update version to 5.1.4 (#6591) 2025-01-17 09:25:35 -05:00
Prowler Bot
4c54de092f feat(findings): Add resource_tag filters for findings endpoint (#6587)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-17 19:01:51 +05:45
Prowler Bot
690c482a43 fix(gcp): fix flaky tests from dns service (#6571)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-01-17 08:15:34 -05:00
Prowler Bot
ad2d857c6f feat(findings): add /findings/metadata to retrieve dynamic filters information (#6586)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-17 18:47:59 +05:45
Pepe Fagoaga
07ee59d2ef chore(version): Update version to 5.1.3 (#6584) 2025-01-17 18:46:08 +05:45
Prowler Bot
bec4617d0a fix(providers): update the label and placeholder based on the cloud provider (#6582)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-17 12:33:25 +01:00
Prowler Bot
94916f8305 fix(findings): remove filter delta_in applied by default (#6579)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-17 11:06:07 +01:00
Prowler Bot
44de651be3 fix(cis): add subsections if needed (#6568)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-16 14:49:49 -05:00
Prowler Bot
bdcba9c642 fix(detect_secrets): refactor logic for detect-secrets (#6566)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-16 13:07:18 -05:00
Prowler Bot
c172f75f1a fix(dep): address compatibility issues (#6557)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-16 14:35:06 +01:00
Prowler Bot
ec492fa13a feat(filters): add resource type filter for findings (#6525)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-15 08:43:49 +01:00
Prowler Bot
702659959c fix(Azure TDE): add filter for master DB (#6514)
Co-authored-by: johannes-engler-mw <132657752+johannes-engler-mw@users.noreply.github.com>
2025-01-14 15:25:27 -05:00
Pepe Fagoaga
fef332a591 chore(version): set next fixes 2025-01-14 18:05:04 +01:00
1403 changed files with 18911 additions and 140521 deletions

19
.env
View File

@@ -6,15 +6,11 @@
PROWLER_UI_VERSION="stable"
AUTH_URL=http://localhost:3000
API_BASE_URL=http://prowler-api:8080/api/v1
NEXT_PUBLIC_API_BASE_URL=${API_BASE_URL}
NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
AUTH_TRUST_HOST=true
UI_PORT=3000
# openssl rand -base64 32
AUTH_SECRET="N/c6mnaS5+SWq81+819OrzQZlmx1Vxtp/orjttJSmw8="
# Google Tag Manager ID
NEXT_PUBLIC_GOOGLE_TAG_MANAGER_ID=""
#### Prowler API Configuration ####
PROWLER_API_VERSION="stable"
@@ -28,10 +24,6 @@ POSTGRES_USER=prowler
POSTGRES_PASSWORD=postgres
POSTGRES_DB=prowler_db
# Celery-Prowler task settings
TASK_RETRY_DELAY_SECONDS=0.1
TASK_RETRY_ATTEMPTS=5
# Valkey settings
# If running Valkey and celery on host, use localhost, else use 'valkey'
VALKEY_HOST=valkey
@@ -131,7 +123,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.7.5
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.5.0
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
@@ -141,12 +133,3 @@ SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET=""
SOCIAL_GITHUB_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/github"
SOCIAL_GITHUB_OAUTH_CLIENT_ID=""
SOCIAL_GITHUB_OAUTH_CLIENT_SECRET=""
# Single Sign-On (SSO)
SAML_SSO_CALLBACK_URL="${AUTH_URL}/api/auth/callback/saml"
# Lighthouse tracing
LANGSMITH_TRACING=false
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_API_KEY=""
LANGCHAIN_PROJECT=""
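
As the inline comment above AUTH_SECRET notes, the secret is just 32 random bytes, base64-encoded. A minimal sketch for rotating it locally, assuming openssl is available on the host:

# Print a fresh 32-byte, base64-encoded value to paste into AUTH_SECRET
openssl rand -base64 32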

.github/dependabot.yml
View File

@@ -9,8 +9,8 @@ updates:
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
labels:
- "dependencies"
@@ -31,8 +31,8 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
labels:
- "dependencies"
@@ -53,47 +53,46 @@ updates:
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 25
interval: "weekly"
open-pull-requests-limit: 10
target-branch: master
labels:
- "dependencies"
- "docker"
# Dependabot Updates are temporarily disabled - 2025/04/15
# v4.6
# - package-ecosystem: "pip"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "pip"
# - "v4"
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "pip"
- "v4"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "github_actions"
# - "v4"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "github_actions"
- "v4"
# - package-ecosystem: "docker"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "docker"
# - "v4"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "docker"
- "v4"
# Dependabot Updates are temporarily disabled - 2025/03/19
# v3
@@ -108,13 +107,13 @@ updates:
# - "pip"
# - "v3"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "monthly"
# open-pull-requests-limit: 10
# target-branch: v3
# labels:
# - "dependencies"
# - "github_actions"
# - "v3"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 10
target-branch: v3
labels:
- "dependencies"
- "github_actions"
- "v3"

5
.github/labeler.yml vendored
View File

@@ -27,11 +27,6 @@ provider/github:
- any-glob-to-any-file: "prowler/providers/github/**"
- any-glob-to-any-file: "tests/providers/github/**"
provider/iac:
- changed-files:
- any-glob-to-any-file: "prowler/providers/iac/**"
- any-glob-to-any-file: "tests/providers/iac/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"

View File

@@ -16,7 +16,6 @@ Please include a summary of the change and which issue is fixed. List any depend
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if it is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
#### API
- [ ] Verify if API specs need to be regenerated.

View File

@@ -76,12 +76,12 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
@@ -94,7 +94,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true

View File

@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"

View File

@@ -28,10 +28,6 @@ env:
VALKEY_DB: 0
API_WORKING_DIR: ./api
IMAGE_NAME: prowler-api
IGNORE_FILES: |
api/docs/**
api/README.md
api/CHANGELOG.md
jobs:
test:
@@ -82,15 +78,12 @@ jobs:
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: ${{ env.IGNORE_FILES }}
- name: Replace @master with current branch in pyproject.toml
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
files_ignore: |
api/.github/**
api/docs/**
api/permissions/**
api/README.md
api/mkdocs.yml
- name: Install poetry
working-directory: ./api
@@ -99,25 +92,13 @@ jobs:
python -m pip install --upgrade pip
pipx install poetry==2.1.1
- name: Update poetry.lock after the branch name change
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry lock
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
- name: Install system dependencies for xmlsec
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
sudo apt-get update
sudo apt-get install -y libxml2-dev libxmlsec1-dev libxmlsec1-openssl pkg-config
- name: Install dependencies
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -136,12 +117,6 @@ jobs:
run: |
poetry check --lock
- name: Prevents known compatibility error between lxml and libxml2/libxmlsec versions - https://github.com/xmlsec/python-xmlsec/issues/320
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pip install --force-reinstall --no-binary lxml lxml
- name: Lint with ruff
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -169,9 +144,8 @@ jobs:
- name: Safety
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
# 76352, 76353, 77323 come from the SDK, which cannot be upgraded yet; they do not affect the API
run: |
poetry run safety check --ignore 70612,66963,74429,76352,76353,77323
poetry run safety check --ignore 70612,66963
- name: Vulture
working-directory: ./api
@@ -193,7 +167,7 @@ jobs:
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
uses: codecov/codecov-action@0565863a31f2c772f9f0395002a31e3f06189574 # v5.4.0
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -202,19 +176,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: ${{ env.IGNORE_FILES }}
- name: Set up Docker Buildx
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
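
The branch-substitution steps added in this workflow make PR runs test the API against the SDK from the same branch instead of @master. Outside of Actions, the same sequence is roughly the following sketch (GITHUB_HEAD_REF and GITHUB_REF_NAME are set by the runner):

# Resolve the branch under test: PR head ref if present, else the push ref
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
# Repoint the git dependency from @master to that branch, then refresh the lockfile
sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
poetry lock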

View File

@@ -1,67 +0,0 @@
name: Create Backport Label
on:
release:
types: [published]
jobs:
create_label:
runs-on: ubuntu-latest
permissions:
contents: write
issues: write
steps:
- name: Create backport label
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
RELEASE_TAG: ${{ github.event.release.tag_name }}
OWNER_REPO: ${{ github.repository }}
run: |
VERSION_ONLY=${RELEASE_TAG#v} # Remove 'v' prefix if present (e.g., v3.2.0 -> 3.2.0)
# Check if it's a minor version (X.Y.0)
if [[ "$VERSION_ONLY" =~ ^[0-9]+\.[0-9]+\.0$ ]]; then
echo "Release ${RELEASE_TAG} (version ${VERSION_ONLY}) is a minor version. Proceeding to create backport label."
TWO_DIGIT_VERSION=${VERSION_ONLY%.0} # Extract X.Y from X.Y.0 (e.g., 5.6 from 5.6.0)
FINAL_LABEL_NAME="backport-to-v${TWO_DIGIT_VERSION}"
FINAL_DESCRIPTION="Backport PR to the v${TWO_DIGIT_VERSION} branch"
echo "Effective label name will be: ${FINAL_LABEL_NAME}"
echo "Effective description will be: ${FINAL_DESCRIPTION}"
# Check if the label already exists
STATUS_CODE=$(curl -s -o /dev/null -w "%{http_code}" -H "Authorization: token ${GITHUB_TOKEN}" "https://api.github.com/repos/${OWNER_REPO}/labels/${FINAL_LABEL_NAME}")
if [ "${STATUS_CODE}" -eq 200 ]; then
echo "Label '${FINAL_LABEL_NAME}' already exists."
elif [ "${STATUS_CODE}" -eq 404 ]; then
echo "Label '${FINAL_LABEL_NAME}' does not exist. Creating it..."
# Prepare JSON data payload
JSON_DATA=$(printf '{"name":"%s","description":"%s","color":"B60205"}' "${FINAL_LABEL_NAME}" "${FINAL_DESCRIPTION}")
CREATE_STATUS_CODE=$(curl -s -o /tmp/curl_create_response.json -w "%{http_code}" -X POST \
-H "Accept: application/vnd.github.v3+json" \
-H "Authorization: token ${GITHUB_TOKEN}" \
--data "${JSON_DATA}" \
"https://api.github.com/repos/${OWNER_REPO}/labels")
CREATE_RESPONSE_BODY=$(cat /tmp/curl_create_response.json)
rm -f /tmp/curl_create_response.json
if [ "$CREATE_STATUS_CODE" -eq 201 ]; then
echo "Label '${FINAL_LABEL_NAME}' created successfully."
else
echo "Error creating label '${FINAL_LABEL_NAME}'. Status: $CREATE_STATUS_CODE"
echo "Response: $CREATE_RESPONSE_BODY"
exit 1
fi
else
echo "Error checking for label '${FINAL_LABEL_NAME}'. HTTP Status: ${STATUS_CODE}"
exit 1
fi
else
echo "Release ${RELEASE_TAG} (version ${VERSION_ONLY}) is not a minor version. Skipping backport label creation."
exit 0
fi
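
The minor-version gate in the removed script above comes down to a single bash regex; a quick sanity check with hypothetical tags:

# Only X.Y.0 releases get a backport label; patch releases are skipped
for VERSION_ONLY in 5.6.0 5.6.1; do
  if [[ "$VERSION_ONLY" =~ ^[0-9]+\.[0-9]+\.0$ ]]; then
    echo "$VERSION_ONLY -> backport-to-v${VERSION_ONLY%.0}"
  else
    echo "$VERSION_ONLY -> not a minor release, skipped"
  fi
done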

View File

@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@6641d4ba5b684fffe195b9820345de1bf19f3181 # v3.89.2
uses: trufflesecurity/trufflehog@690e5c7aff8347c3885096f3962a0633d9129607 # v3.88.23
with:
path: ./
base: ${{ github.event.repository.default_branch }}
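
For local use, the pinned action above wraps the TruffleHog CLI; a minimal sketch, assuming TruffleHog v3 is installed (subcommands and flags can vary between releases):

# Scan the working tree for leaked credentials, similar to what the CI step does
trufflehog filesystem .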

View File

@@ -1,86 +0,0 @@
name: Check Changelog
on:
pull_request:
types: [opened, synchronize, reopened, labeled, unlabeled]
jobs:
check-changelog:
if: contains(github.event.pull_request.labels.*.name, 'no-changelog') == false
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read
pull-requests: write
env:
MONITORED_FOLDERS: "api ui prowler"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- name: Get list of changed files
id: changed_files
run: |
git fetch origin ${{ github.base_ref }}
git diff --name-only origin/${{ github.base_ref }}...HEAD > changed_files.txt
cat changed_files.txt
- name: Check for folder changes and changelog presence
id: check_folders
run: |
missing_changelogs=""
for folder in $MONITORED_FOLDERS; do
if grep -q "^${folder}/" changed_files.txt; then
echo "Detected changes in ${folder}/"
if ! grep -q "^${folder}/CHANGELOG.md$" changed_files.txt; then
echo "No changelog update found for ${folder}/"
missing_changelogs="${missing_changelogs}- \`${folder}\`\n"
fi
fi
done
echo "missing_changelogs<<EOF" >> $GITHUB_OUTPUT
echo -e "${missing_changelogs}" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
- name: Find existing changelog comment
if: github.event.pull_request.head.repo.full_name == github.repository
id: find_comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e #v3.1.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-includes: '<!-- changelog-check -->'
- name: Comment on PR if changelog is missing
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs != ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
⚠️ **Changes detected in the following folders without a corresponding update to the `CHANGELOG.md`:**
${{ steps.check_folders.outputs.missing_changelogs }}
Please add an entry to the corresponding `CHANGELOG.md` file to maintain a clear history of changes.
- name: Comment on PR if all changelogs are present
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs == ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
✅ All necessary `CHANGELOG.md` files have been updated. Great job! 🎉
- name: Fail if changelog is missing
if: steps.check_folders.outputs.missing_changelogs != ''
run: |
echo "ERROR: Missing changelog updates in some folders."
exit 1
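
The removed changelog check is plain git plus grep; reproduced locally it is roughly the following sketch, assuming master is the base branch:

git fetch origin master
git diff --name-only origin/master...HEAD > changed_files.txt
# Flag any monitored folder that changed without a matching CHANGELOG.md update
for folder in api ui prowler; do
  if grep -q "^${folder}/" changed_files.txt && ! grep -q "^${folder}/CHANGELOG.md$" changed_files.txt; then
    echo "Missing changelog update for ${folder}/"
  fi
done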

View File

@@ -1,37 +0,0 @@
name: Prowler - Merged Pull Request
on:
pull_request_target:
branches: ['master']
types: ['closed']
jobs:
trigger-cloud-pull-request:
name: Trigger Cloud Pull Request
if: github.event.pull_request.merged == true && github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ github.event.pull_request.merge_commit_sha }}
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.event.pull_request.merge_commit_sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Trigger pull request
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-pull-request-merged
client-payload: '{
"PROWLER_COMMIT_SHA": "${{ github.event.pull_request.merge_commit_sha }}",
"PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
"PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
"PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }},
"PROWLER_PR_URL":${{ toJson(github.event.pull_request.html_url) }}
}'
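
Under the hood, peter-evans/repository-dispatch issues a repository_dispatch API call; the equivalent curl, with placeholder OWNER/REPO, a token in GITHUB_TOKEN, and an abbreviated payload, looks roughly like:

# Fire a repository_dispatch event carrying the merged PR's metadata
curl -s -X POST \
  -H "Accept: application/vnd.github.v3+json" \
  -H "Authorization: token ${GITHUB_TOKEN}" \
  -d '{"event_type": "prowler-pull-request-merged", "client_payload": {"PROWLER_COMMIT_SHA": "<sha>"}}' \
  "https://api.github.com/repos/OWNER/REPO/dispatches"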

View File

@@ -62,7 +62,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
@@ -123,11 +123,11 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
push: true
tags: |
@@ -140,7 +140,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context

View File

@@ -1,145 +0,0 @@
name: SDK - Bump Version
on:
release:
types: [published]
env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
jobs:
bump-version:
name: Bump Version
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Prowler version
shell: bash
run: |
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
MAJOR_VERSION=${BASH_REMATCH[1]}
MINOR_VERSION=${BASH_REMATCH[2]}
FIX_VERSION=${BASH_REMATCH[3]}
# Export version components to GitHub environment
echo "MAJOR_VERSION=${MAJOR_VERSION}" >> "${GITHUB_ENV}"
echo "MINOR_VERSION=${MINOR_VERSION}" >> "${GITHUB_ENV}"
echo "FIX_VERSION=${FIX_VERSION}" >> "${GITHUB_ENV}"
if (( MAJOR_VERSION == 5 )); then
if (( FIX_VERSION == 0 )); then
echo "Minor Release: $PROWLER_VERSION"
# Set up next minor version for master
BUMP_VERSION_TO=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).${FIX_VERSION}
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=${BASE_BRANCH}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
# Set up patch version for version branch
PATCH_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.1
echo "PATCH_VERSION_TO=${PATCH_VERSION_TO}" >> "${GITHUB_ENV}"
VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next minor version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
echo "Bumping to next patch version: ${PATCH_VERSION_TO} in branch ${VERSION_BRANCH}"
else
echo "Patch Release: $PROWLER_VERSION"
BUMP_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.$((FIX_VERSION + 1))
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next patch version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
fi
else
echo "Releasing another Prowler major version, aborting..."
exit 1
fi
else
echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
exit 1
fi
- name: Bump versions in files
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using BUMP_VERSION_TO=$BUMP_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${BUMP_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${BUMP_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${BUMP_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.TARGET_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.BUMP_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: Handle patch version for minor release
if: env.FIX_VERSION == '0'
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using PATCH_VERSION_TO=$PATCH_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${PATCH_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${PATCH_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PATCH_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request for patch version
if: env.FIX_VERSION == '0'
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
branch: "version-bump-to-v${{ env.PATCH_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.PATCH_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
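
The version arithmetic in the removed workflow reduces to one BASH_REMATCH split; a compact sketch with a hypothetical release tag:

PROWLER_VERSION="5.5.0"
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
  MAJOR=${BASH_REMATCH[1]}; MINOR=${BASH_REMATCH[2]}; FIX=${BASH_REMATCH[3]}
  if (( FIX == 0 )); then
    # Minor release: master is bumped to the next minor, the vX.Y branch to its first patch
    echo "master -> ${MAJOR}.$((MINOR + 1)).0, v${MAJOR}.${MINOR} -> ${MAJOR}.${MINOR}.1"
  else
    # Patch release: only the version branch moves forward
    echo "v${MAJOR}.${MINOR} -> ${MAJOR}.${MINOR}.$((FIX + 1))"
  fi
fi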

View File

@@ -21,7 +21,6 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
pull_request:
branches:
- "master"
@@ -31,7 +30,6 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
schedule:
- cron: '00 12 * * *'
@@ -56,12 +54,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"

View File

@@ -34,7 +34,6 @@ jobs:
permissions/**
api/**
ui/**
prowler/CHANGELOG.md
README.md
mkdocs.yml
.backportrc.json
@@ -51,7 +50,7 @@ jobs:
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -107,143 +106,15 @@ jobs:
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
# Test AWS
- name: AWS - Check if any file has changed
id: aws-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/aws/**
./tests/providers/aws/**
.poetry.lock
- name: AWS - Test
if: steps.aws-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
# Test Azure
- name: Azure - Check if any file has changed
id: azure-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/azure/**
./tests/providers/azure/**
.poetry.lock
- name: Azure - Test
if: steps.azure-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
# Test GCP
- name: GCP - Check if any file has changed
id: gcp-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/gcp/**
./tests/providers/gcp/**
.poetry.lock
- name: GCP - Test
if: steps.gcp-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
# Test Kubernetes
- name: Kubernetes - Check if any file has changed
id: kubernetes-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/kubernetes/**
./tests/providers/kubernetes/**
.poetry.lock
- name: Kubernetes - Test
if: steps.kubernetes-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
# Test GitHub
- name: GitHub - Check if any file has changed
id: github-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/github/**
./tests/providers/github/**
.poetry.lock
- name: GitHub - Test
if: steps.github-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github
# Test NHN
- name: NHN - Check if any file has changed
id: nhn-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/nhn/**
./tests/providers/nhn/**
.poetry.lock
- name: NHN - Test
if: steps.nhn-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
# Test M365
- name: M365 - Check if any file has changed
id: m365-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/m365/**
./tests/providers/m365/**
.poetry.lock
- name: M365 - Test
if: steps.m365-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Test IaC
- name: IaC - Check if any file has changed
id: iac-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/iac/**
./tests/providers/iac/**
.poetry.lock
- name: IaC - Test
if: steps.iac-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
# Common Tests
- name: Lib - Test
- name: Test with pytest
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
- name: Config - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
uses: codecov/codecov-action@0565863a31f2c772f9f0395002a31e3f06189574 # v5.4.0
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler
files: ./aws_coverage.xml,./azure_coverage.xml,./gcp_coverage.xml,./kubernetes_coverage.xml,./github_coverage.xml,./nhn_coverage.xml,./m365_coverage.xml,./lib_coverage.xml,./config_coverage.xml


@@ -7,7 +7,7 @@ on:
env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
# CACHE: "poetry"
CACHE: "poetry"
jobs:
repository-check:
@@ -71,10 +71,10 @@ jobs:
pipx install poetry==2.1.1
- name: Setup Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: ${{ env.PYTHON_VERSION }}
# cache: ${{ env.CACHE }}
cache: ${{ env.CACHE }}
- name: Build Prowler package
run: |


@@ -4,7 +4,7 @@ name: SDK - Refresh AWS services' regions
on:
schedule:
- cron: "0 9 * * 1" # runs at 09:00 UTC every Monday
- cron: "0 9 * * *" #runs at 09:00 UTC everyday
env:
GITHUB_BRANCH: "master"
@@ -28,7 +28,7 @@ jobs:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0
with:
python-version: 3.9 # install the required Python version
@@ -38,7 +38,7 @@ jobs:
pip install boto3
- name: Configure AWS Credentials -- DEV
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722 # v4.1.0
with:
aws-region: ${{ env.AWS_REGION_DEV }}
role-to-assume: ${{ secrets.DEV_IAM_ROLE_ARN }}


@@ -30,7 +30,6 @@ env:
# Container Registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-ui
NEXT_PUBLIC_API_BASE_URL: http://prowler-api:8080/api/v1
jobs:
repository-check:
@@ -77,17 +76,16 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=${{ env.SHORT_SHA }}
NEXT_PUBLIC_API_BASE_URL=${{ env.NEXT_PUBLIC_API_BASE_URL }}
# Set push: false for testing
push: true
tags: |
@@ -98,12 +96,11 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${{ env.RELEASE_TAG }}
NEXT_PUBLIC_API_BASE_URL=${{ env.NEXT_PUBLIC_API_BASE_URL }}
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.RELEASE_TAG }}


@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/init@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/analyze@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15
with:
category: "/language:${{matrix.language}}"


@@ -31,75 +31,26 @@ jobs:
with:
persist-credentials: false
- name: Setup Node.js ${{ matrix.node-version }}
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4.3.0
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
cache-dependency-path: './ui/package-lock.json'
- name: Install dependencies
working-directory: ./ui
run: npm ci
run: npm install
- name: Run Healthcheck
working-directory: ./ui
run: npm run healthcheck
- name: Build the application
working-directory: ./ui
run: npm run build
e2e-tests:
runs-on: ubuntu-latest
env:
AUTH_SECRET: 'fallback-ci-secret-for-testing'
AUTH_TRUST_HOST: true
NEXTAUTH_URL: http://localhost:3000
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: '20.x'
cache: 'npm'
cache-dependency-path: './ui/package-lock.json'
- name: Install dependencies
working-directory: ./ui
run: npm ci
- name: Cache Playwright browsers
uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
id: playwright-cache
with:
path: ~/.cache/ms-playwright
key: ${{ runner.os }}-playwright-${{ hashFiles('ui/package-lock.json') }}
restore-keys: |
${{ runner.os }}-playwright-
- name: Install Playwright browsers
working-directory: ./ui
if: steps.playwright-cache.outputs.cache-hit != 'true'
run: npm run test:e2e:install
- name: Build the application
working-directory: ./ui
run: npm run build
- name: Run Playwright tests
working-directory: ./ui
run: npm run test:e2e
- name: Upload Playwright report
uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b # v4.5.0
if: failure()
with:
name: playwright-report
path: ui/playwright-report/
retention-days: 30
test-container-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target

.gitignore

@@ -42,19 +42,6 @@ junit-reports/
# VSCode files
.vscode/
# Cursor files
.cursorignore
.cursor/
# RooCode files
.roo/
.rooignore
.roomodes
# Cline files
.cline/
.clineignore
# Terraform
.terraform*
*.tfstate


@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353'
entry: bash -c 'safety check --ignore 70612,66963'
language: system
- id: vulture


@@ -1,44 +1,24 @@
FROM python:3.12.11-slim-bookworm AS build
FROM python:3.12.9-alpine3.20
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
ARG POWERSHELL_VERSION=7.5.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
wget libicu72 libunwind8 libssl3 libcurl4 ca-certificates apt-transport-https gnupg \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git gcc python3-dev musl-dev linux-headers
# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler
WORKDIR /home/prowler
# Copy necessary files
WORKDIR /home/prowler
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py
# Install Python dependencies
ENV HOME='/home/prowler'
@@ -47,12 +27,13 @@ ENV PATH="${HOME}/.local/bin:${PATH}"
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
# By default poetry does not compile Python source files to bytecode during installation.
# This speeds up the installation process, but the first execution may take a little more
# time because Python then compiles source files to bytecode automatically. If you want to
# compile source files to bytecode during installation, you can use the --compile option
RUN poetry install --compile && \
rm -rf ~/.cache/pip
# Install PowerShell modules
RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y

README.md

@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler</i></b> is the Open Cloud Security platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
<b><i>Prowler Open Source</i></b> is as dynamic and adaptable as the environment they're meant to protect. Trusted by the leaders in security.
</p>
<p align="center">
<b>Learn more at <a href="https://prowler.com">prowler.com</a></b>
@@ -43,29 +43,15 @@
# Description
**Prowler** is an open-source security tool designed to assess and enforce security best practices across AWS, Azure, Google Cloud, and Kubernetes. It supports tasks such as security audits, incident response, continuous monitoring, system hardening, forensic readiness, and remediation processes.
Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:
- **Industry Standards:** CIS, NIST 800, NIST CSF, and CISA
- **Regulatory Compliance and Governance:** RBI, FedRAMP, and PCI-DSS
- **Frameworks for Sensitive Data and Privacy:** GDPR, HIPAA, and FFIEC
- **Frameworks for Organizational Governance and Quality Control:** SOC2 and GXP
- **AWS-Specific Frameworks:** AWS Foundational Technical Review (FTR) and AWS Well-Architected Framework (Security Pillar)
- **National Security Standards:** ENS (Spanish National Security Scheme)
- **Custom Security Frameworks:** Tailored to your needs
## Prowler CLI and Prowler Cloud
Prowler offers a Command Line Interface (CLI), known as Prowler Open Source, and an additional service built on top of it, called <a href="https://prowler.com">Prowler Cloud</a>.
**Prowler** is an Open Source security tool to perform AWS, Azure, Google Cloud and Kubernetes security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness, and also remediations! We have Prowler CLI (Command Line Interface) that we call Prowler Open Source and a service on top of it that we call <a href="https://prowler.com">Prowler Cloud</a>.
## Prowler App
Prowler App is a web-based application that simplifies running Prowler across your cloud provider accounts. It provides a user-friendly interface to visualize the results and streamline your security assessments.
Prowler App is a web application that allows you to run Prowler in your cloud provider accounts and visualize the results in a user-friendly interface.
![Prowler App](docs/img/overview.png)
>For more details, refer to the [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
>More details at [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
## Prowler CLI
@@ -74,7 +60,6 @@ prowler <provider>
```
![Prowler CLI Execution](docs/img/short-display.png)
## Prowler Dashboard
```console
@@ -82,34 +67,26 @@ prowler dashboard
```
![Prowler Dashboard](docs/img/dashboard.png)
# Prowler at a Glance
It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks.
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 567 | 82 | 36 | 10 |
| GCP | 79 | 13 | 10 | 3 |
| Azure | 142 | 18 | 10 | 3 |
| Kubernetes | 83 | 7 | 5 | 7 |
| GitHub | 16 | 2 | 1 | 0 |
| M365 | 69 | 7 | 3 | 2 |
| AWS | 564 | 82 | 33 | 10 |
| GCP | 78 | 13 | 7 | 3 |
| Azure | 140 | 18 | 7 | 3 |
| Kubernetes | 83 | 7 | 4 | 7 |
| Microsoft365 | 5 | 2 | 1 | 0 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> [!Note]
> The numbers in the table are updated periodically.
> [!Tip]
> For the most accurate and up-to-date information about checks, services, frameworks, and categories, visit [**Prowler Hub**](https://hub.prowler.com).
> [!Note]
> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories: `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
# 💻 Installation
## Prowler App
Prowler App offers flexible installation methods tailored to various environments:
Prowler App can be installed in different ways, depending on your environment:
> For detailed instructions on using Prowler App, refer to the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
> See how to use Prowler App in the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
### Docker Compose
@@ -125,16 +102,8 @@ curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/mast
docker compose up -d
```
> Containers are built for `linux/amd64`.
### Configuring Your Workstation for Prowler App
If your workstation's architecture is incompatible, you can resolve this by:
- **Setting the environment variable**: `DOCKER_DEFAULT_PLATFORM=linux/amd64`
- **Using the following flag in your Docker command**: `--platform linux/amd64`
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
### From GitHub
@@ -160,12 +129,12 @@ python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> After completing the setup, access the API documentation at http://localhost:8080/api/v1/docs.
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
**Commands to run the API Worker**
@@ -203,31 +172,29 @@ npm run build
npm start
```
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >3.9.1, <3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python > 3.9.1, < 3.13:
```console
pip install prowler
prowler -v
```
>For further guidance, refer to [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
### Containers
**Available Versions of Prowler CLI**
The available versions of Prowler CLI are the following:
The following versions of Prowler CLI are available, depending on your requirements:
- `latest`: Synchronizes with the `master` branch. Note that this version is not stable.
- `v4-latest`: Synchronizes with the `v4` branch. Note that this version is not stable.
- `v3-latest`: Synchronizes with the `v3` branch. Note that this version is not stable.
- `<x.y.z>` (release): Stable releases corresponding to specific versions. You can find the complete list of releases [here](https://github.com/prowler-cloud/prowler/releases).
- `stable`: Always points to the latest release.
- `v4-stable`: Always points to the latest release for v4.
- `v3-stable`: Always points to the latest release for v3.
- `latest`: in sync with `master` branch (bear in mind that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (bear in mind that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always point to the latest release.
- `v4-stable`: this tag always point to the latest release for v4.
- `v3-stable`: this tag always point to the latest release for v3.
The container images are available here:
- Prowler CLI:
@@ -239,7 +206,7 @@ The container images are available here:
### From GitHub
Python >3.9.1, <3.13 is required with pip and Poetry:
Python > 3.9.1, < 3.13 is required with pip and poetry:
``` console
git clone https://github.com/prowler-cloud/prowler
@@ -249,46 +216,25 @@ poetry install
python prowler-cli.py -v
```
> [!IMPORTANT]
> To clone Prowler on Windows, configure Git to support long file paths by running the following command: `git config core.longpaths true`.
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
# ✏️ High level architecture
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
## Prowler App
**Prowler App** is composed of three key components:
The **Prowler App** consists of three main components:
- **Prowler UI**: A web-based interface, built with Next.js, providing a user-friendly experience for executing Prowler scans and visualizing results.
- **Prowler API**: A backend service, developed with Django REST Framework, responsible for running Prowler scans and storing the generated results.
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
- **Prowler UI**: A user-friendly web interface for running Prowler and viewing results, powered by Next.js.
- **Prowler API**: The backend API that executes Prowler scans and stores the results, built with Django REST Framework.
- **Prowler SDK**: A Python SDK that integrates with the Prowler CLI for advanced functionality.
![Prowler App Architecture](docs/img/prowler-app-architecture.png)
## Prowler CLI
**Running Prowler**
Prowler can be executed across various environments, offering flexibility to meet your needs. It can be run from:
- Your own workstation
- A Kubernetes Job
- Google Compute Engine
- Azure Virtual Machines (VMs)
- Amazon EC2 instances
- AWS Fargate or other container platforms
- CloudShell
And many more environments.
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.
![Architecture](docs/img/architecture.png)
@@ -296,36 +242,23 @@ And many more environments.
## General
- `Allowlist` is now called `Mutelist`.
- The `--quiet` option has been deprecated. Use the `--status` flag to filter findings based on their status: PASS, FAIL, or MANUAL.
- All findings with an `INFO` status have been reclassified as `MANUAL`.
- The CSV output format is standardized across all providers.
- The `--quiet` option has been deprecated, now use the `--status` flag to select the finding's status you want to get from PASS, FAIL or MANUAL.
- All `INFO` finding's status has changed to `MANUAL`.
- The CSV output format is common for all the providers.
**Deprecated Output Formats**
The following formats are now deprecated:
- Native JSON has been replaced with JSON in [OCSF](https://schema.ocsf.io/) v1.1.0 format, which is standardized across all providers.
We have deprecated some of our outputs formats:
- The native JSON is replaced for the JSON [OCSF](https://schema.ocsf.io/) v1.1.0, common for all the providers.
## AWS
**AWS Flag Deprecation**
The flag --sts-endpoint-region has been deprecated due to the adoption of AWS STS regional tokens.
**Sending FAIL Results to AWS Security Hub**
- To send only FAILS to AWS Security Hub, use one of the following options: `--send-sh-only-fails` or `--security-hub --status FAIL`.
- Deprecate the AWS flag --sts-endpoint-region since we use AWS STS regional tokens.
- To send only FAILS to AWS Security Hub, now use either `--send-sh-only-fails` or `--security-hub --status FAIL`.
# 📖 Documentation
**Documentation Resources**
For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
Install, Usage, Tutorials and Developer Guide is at https://docs.prowler.com/
# 📃 License
**Prowler License Information**
Prowler is licensed under the Apache License 2.0, as indicated in each file within the repository.
**Obtaining a Copy of the License**
A copy of the License is available at <http://www.apache.org/licenses/LICENSE-2.0>
Prowler is licensed as Apache License 2.0 as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>

api/.gitignore (new file)

@@ -0,0 +1,168 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.pyc
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
/_data/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
*.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/
# VSCode
.vscode/


@@ -0,0 +1,91 @@
repos:
## GENERAL
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
- id: check-merge-conflict
- id: check-yaml
args: ["--unsafe"]
- id: check-json
- id: end-of-file-fixer
- id: trailing-whitespace
- id: no-commit-to-branch
- id: pretty-format-json
args: ["--autofix", "--no-sort-keys", "--no-ensure-ascii"]
exclude: 'src/backend/api/fixtures/dev/.*\.json$'
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.13.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.10.0
hooks:
- id: shellcheck
exclude: contrib
## PYTHON
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.5.0
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format
- repo: https://github.com/python-poetry/poetry
rev: 1.8.0
hooks:
- id: poetry-check
args: ["--directory=src"]
- id: poetry-lock
args: ["--no-update", "--directory=src"]
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
hooks:
- id: hadolint
args: ["--ignore=DL3013", "Dockerfile"]
- repo: local
hooks:
- id: pylint
name: pylint
entry: bash -c 'poetry run pylint --disable=W,C,R,E -j 0 -rn -sn src/'
language: system
files: '.*\.py'
- id: trufflehog
name: TruffleHog
description: Detect secrets in your data.
entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
# For running trufflehog in docker, use the following entry instead:
# entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
language: system
stages: ["commit", "push"]
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'poetry run bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
language: system
files: '.*\.py'
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963'
language: system
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'poetry run vulture --exclude "contrib,.venv,tests,conftest.py" --min-confidence 100 .'
language: system
files: '.*\.py'


@@ -2,147 +2,53 @@
All notable changes to the **Prowler API** are documented in this file.
## [v1.10.0] (Prowler UNRELEASED)
### Added
- SSO with SAML support [(#8175)](https://github.com/prowler-cloud/prowler/pull/8175)
---
## [v1.9.1] (Prowler v5.8.1)
### Fixed
- Scan with no resources will not trigger legacy code for findings metadata [(#8183)](https://github.com/prowler-cloud/prowler/pull/8183)
### Removed
- Validation of the provider's secret type during updates [(#8197)](https://github.com/prowler-cloud/prowler/pull/8197)
---
## [v1.9.0] (Prowler v5.8.0)
### Added
- Support GCP Service Account key [(#7824)](https://github.com/prowler-cloud/prowler/pull/7824)
- `GET /compliance-overviews` endpoints to retrieve compliance metadata and specific requirements statuses [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Lighthouse configuration support [(#7848)](https://github.com/prowler-cloud/prowler/pull/7848)
### Changed
- Reworked `GET /compliance-overviews` to return proper requirement metrics [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Optional `user` and `password` for M365 provider [(#7992)](https://github.com/prowler-cloud/prowler/pull/7992)
### Fixed
- Scheduled scans are no longer deleted when their daily schedule run is disabled [(#8082)](https://github.com/prowler-cloud/prowler/pull/8082)
---
## [v1.8.5] (Prowler v5.7.5)
### Fixed
- Normalize provider UID to ensure safe and unique export directory paths [(#8007)](https://github.com/prowler-cloud/prowler/pull/8007).
- Blank resource types in `/metadata` endpoints [(#8027)](https://github.com/prowler-cloud/prowler/pull/8027)
---
## [v1.8.4] (Prowler v5.7.4)
### Removed
- Reverted RLS transaction handling and DB custom backend [(#7994)](https://github.com/prowler-cloud/prowler/pull/7994)
---
## [v1.8.3] (Prowler v5.7.3)
### Added
- Database backend to handle already closed connections [(#7935)](https://github.com/prowler-cloud/prowler/pull/7935)
### Changed
- Renamed field encrypted_password to password for M365 provider [(#7784)](https://github.com/prowler-cloud/prowler/pull/7784)
### Fixed
- Transaction persistence with RLS operations [(#7916)](https://github.com/prowler-cloud/prowler/pull/7916)
- Reverted the change `get_with_retry` to use the original `get` method for retrieving tasks [(#7932)](https://github.com/prowler-cloud/prowler/pull/7932)
---
## [v1.8.2] (Prowler v5.7.2)
### Fixed
- Task lookup to use task_kwargs instead of task_args for scan report resolution [(#7830)](https://github.com/prowler-cloud/prowler/pull/7830)
- Kubernetes UID validation to allow valid context names [(#7871)](https://github.com/prowler-cloud/prowler/pull/7871)
- Connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
- Race condition when creating background tasks [(#7876)](https://github.com/prowler-cloud/prowler/pull/7876)
- Error when modifying or retrieving tenants due to missing user UUID in transaction context [(#7890)](https://github.com/prowler-cloud/prowler/pull/7890)
---
## [v1.8.1] (Prowler v5.7.1)
### Fixed
- Added database index to improve performance on finding lookup [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800)
---
## [v1.8.0] (Prowler v5.7.0)
### Added
- Huge improvements to `/findings/metadata` and resource related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- New endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743)
- Export support for Prowler ThreatScore in M365 [(7783)](https://github.com/prowler-cloud/prowler/pull/7783)
---
## [v1.7.0] (Prowler v5.6.0)
### Added
- M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563)
- `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653)
- API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653)
---
## [v1.6.0] (Prowler v5.5.0)
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167)
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289)
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333)
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378)
- Missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318)
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
---
## [v1.5.4] (Prowler v5.4.4)
### Fixed
- Bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466)
- Fixed a bug with periodic tasks when trying to delete a provider ([#7466])(https://github.com/prowler-cloud/prowler/pull/7466).
---
## [v1.5.3] (Prowler v5.4.3)
### Fixed
- Duplicated scheduled scans handling [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401)
- Environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423)
- Added duplicated scheduled scans handling ([#7401])(https://github.com/prowler-cloud/prowler/pull/7401).
- Added environment variable to configure the deletion task batch size ([#7423])(https://github.com/prowler-cloud/prowler/pull/7423).
---
## [v1.5.2] (Prowler v5.4.2)
### Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349)
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
---
## [v1.5.1] (Prowler v5.4.1)
### Fixed
- Handle response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183)
- Race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172)
- Handle exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283)
- Added a handled response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183).
- Fixed a race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172).
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
---
@@ -150,20 +56,20 @@ All notable changes to the **Prowler API** are documented in this file.
### Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906)
- API scan report system, now all scans launched from the API will generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878)
- Add API scan report system, now all scans launched from the API will generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878).
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874)
### Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019)
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019).
---
## [v1.4.0] (Prowler v5.3.0)
### Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700)
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800)
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863)
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869)
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700).
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800).
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863).
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869).
---


@@ -1,45 +1,13 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12.8-alpine3.20 AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends \
wget \
libicu72 \
gcc \
g++ \
make \
libxml2-dev \
libxmlsec1-dev \
libxmlsec1-openssl \
pkg-config \
libtool \
libxslt1-dev \
python3-dev \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# hadolint ignore=DL3018
RUN apk --no-cache add gcc python3-dev musl-dev linux-headers curl-dev
RUN apk --no-cache upgrade && \
addgroup -g 1000 prowler && \
adduser -D -u 1000 -G prowler prowler
USER prowler
WORKDIR /home/prowler
@@ -49,26 +17,28 @@ COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
COPY src/backend/ ./backend/
ENV PATH="/home/prowler/.local/bin:$PATH"
# Add `--no-root` to avoid installing the current project as a package
RUN poetry install --no-root && \
rm -rf ~/.cache/pip
RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
# Prevents known compatibility error between lxml and libxml2/libxmlsec versions.
# See: https://github.com/xmlsec/python-xmlsec/issues/320
RUN poetry run pip install --force-reinstall --no-binary lxml lxml
COPY src/backend/ ./backend/
COPY docker-entrypoint.sh ./docker-entrypoint.sh
WORKDIR /home/prowler/backend
# Development image
# hadolint ignore=DL3006
FROM build AS dev
USER 0
# hadolint ignore=DL3018
RUN apk --no-cache add curl vim
USER prowler
ENTRYPOINT ["../docker-entrypoint.sh", "dev"]
# Production image


@@ -235,7 +235,6 @@ To view the logs for any component (e.g., Django, Celery worker), you can use th
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Applying migrations

api/docker-compose.yml (new file)

@@ -0,0 +1,125 @@
services:
api:
build:
dockerfile: Dockerfile
image: prowler-api
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8000}:${DJANGO_PORT:-8000}"
profiles:
- prod
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "prod"
api-dev:
build:
dockerfile: Dockerfile
target: dev
image: prowler-api-dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_LOGGING_FORMATTER=human_readable
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8080}:${DJANGO_PORT:-8080}"
volumes:
- "./src/backend:/home/prowler/backend"
- "./pyproject.toml:/home/prowler/pyproject.toml"
profiles:
- dev
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "dev"
postgres:
image: postgres:16.3-alpine
ports:
- "${POSTGRES_PORT:-5432}:${POSTGRES_PORT:-5432}"
hostname: "postgres-db"
volumes:
- ./_data/postgres:/var/lib/postgresql/data
environment:
- POSTGRES_USER=${POSTGRES_ADMIN_USER:-prowler}
- POSTGRES_PASSWORD=${POSTGRES_ADMIN_PASSWORD:-S3cret}
- POSTGRES_DB=${POSTGRES_DB:-prowler_db}
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'pg_isready -U ${POSTGRES_ADMIN_USER:-prowler} -d ${POSTGRES_DB:-prowler_db}'"]
interval: 5s
timeout: 5s
retries: 5
valkey:
image: valkey/valkey:7-alpine3.19
ports:
- "${VALKEY_PORT:-6379}:6379"
hostname: "valkey"
volumes:
- ./_data/valkey:/data
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'valkey-cli ping'"]
interval: 10s
timeout: 5s
retries: 3
worker:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "worker"
worker-beat:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "beat"


@@ -3,10 +3,6 @@
apply_migrations() {
echo "Applying database migrations..."
# Fix Inconsistent migration history after adding sites app
poetry run python manage.py check_and_fix_socialaccount_sites_migration --database admin
poetry run python manage.py migrate --database admin
}
@@ -32,7 +28,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
}
start_worker_beat() {
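The worker command above gains a dedicated `backfill` queue alongside `celery,scans,scan-reports,deletion`. A minimal sketch of dispatching work onto it, assuming the standard Celery app from `config.celery`; the task name and body are hypothetical and only the queue name comes from the diff:

```python
# Hypothetical sketch: a task sent to the new "backfill" queue so that
# long-running backfills cannot starve the regular scan queues.
from celery import Celery

app = Celery("config")

@app.task
def backfill_findings_metadata(tenant_id: str) -> None:
    ...  # long-running backfill work

# Route the call explicitly to the dedicated queue:
backfill_findings_metadata.apply_async(args=["<tenant-uuid>"], queue="backfill")
```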

api/poetry.lock (diff suppressed because it is too large)


@@ -7,8 +7,8 @@ authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.10",
"django-allauth[saml] (>=65.8.0,<66.0.0)",
"django==5.1.7",
"django-allauth==65.4.1",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
@@ -23,12 +23,11 @@ dependencies = [
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.5",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10",
"openai (>=1.82.0,<2.0.0)"
"uuid6==2024.7.10"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -36,7 +35,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.9.0"
version = "1.6.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -47,7 +46,6 @@ coverage = "7.5.4"
django-silk = "5.3.2"
docker = "7.1.0"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"


@@ -17,8 +17,6 @@ class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
def pre_social_login(self, request, sociallogin):
# Link existing accounts with the same email address
email = sociallogin.account.extra_data.get("email")
if sociallogin.provider.id == "saml":
email = sociallogin.user.email
if email:
existing_user = self.get_user_by_email(email)
if existing_user:
@@ -31,39 +29,33 @@ class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
"""
with transaction.atomic(using=MainRouter.admin_db):
user = super().save_user(request, sociallogin, form)
provider = sociallogin.provider.id
extra = sociallogin.account.extra_data
if provider != "saml":
# Handle other providers (e.g., GitHub, Google)
user.save(using=MainRouter.admin_db)
social_account_name = sociallogin.account.extra_data.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
social_account_name = extra.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
return user


@@ -109,6 +109,16 @@ class BaseTenantViewset(BaseViewSet):
pass # Tenant might not exist, handle gracefully
def initial(self, request, *args, **kwargs):
if (
request.resolver_match.url_name != "tenant-detail"
and request.method != "DELETE"
):
user_id = str(request.user.id)
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
return super().initial(request, *args, **kwargs)
# TODO: DRY this when we have time
if request.auth is None:
raise NotAuthenticated
@@ -116,8 +126,8 @@ class BaseTenantViewset(BaseViewSet):
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
user_id = str(request.user.id)
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)
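`rls_transaction` appears throughout these diffs, sometimes keyed by a tenant ID and here by a user ID via `POSTGRES_USER_VAR`. Its implementation is outside this diff; as a rough mental model only, it behaves like a context manager that sets a transaction-local Postgres variable which row-level-security policies read. A minimal sketch, with the parameter name assumed:

```python
from contextlib import contextmanager

from django.db import connection, transaction

@contextmanager
def rls_transaction(value: str, parameter: str = "api.tenant_id"):
    # Sketch only: the real helper may differ. set_config(..., true)
    # scopes the variable to this transaction (like SET LOCAL), so RLS
    # policies can filter rows for the given tenant or user.
    with transaction.atomic():
        with connection.cursor() as cursor:
            cursor.execute("SELECT set_config(%s, %s, true)", [parameter, value])
        yield
```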


@@ -1,38 +1,12 @@
from types import MappingProxyType
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
from api.models import Provider
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
provider will return the cached result.
Args:
provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
available compliance frameworks (e.g., "aws", "azure", "gcp", "m365").
Returns:
list[str]: A list of framework identifiers (e.g., "cis_1.4_aws", "mitre_attack_azure") available
for the given provider.
"""
global AVAILABLE_COMPLIANCE_FRAMEWORKS
if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
get_available_compliance_frameworks(provider_type)
)
return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]
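A short usage sketch of the cached lookup above; the provider literal and framework identifier are borrowed from the docstring's examples:

```python
# First call loads and caches the list; subsequent calls are cache hits.
aws_frameworks = get_compliance_frameworks("aws")
assert "cis_1.4_aws" in aws_frameworks  # example identifier from the docstring
assert get_compliance_frameworks("aws") is aws_frameworks  # same cached list object
```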
def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):
@@ -190,16 +164,10 @@ def generate_compliance_overview_template(prowler_compliance: dict):
total_checks = len(requirement.Checks)
checks_dict = {check: None for check in requirement.Checks}
req_status_val = "MANUAL" if total_checks == 0 else "PASS"
# Build requirement dictionary
requirement_dict = {
"name": requirement.Name or requirement.Id,
"description": requirement.Description,
"tactics": getattr(requirement, "Tactics", []),
"subtechniques": getattr(requirement, "SubTechniques", []),
"platforms": getattr(requirement, "Platforms", []),
"technique_url": getattr(requirement, "TechniqueURL", ""),
"attributes": [
dict(attribute) for attribute in requirement.Attributes
],
@@ -210,18 +178,20 @@ def generate_compliance_overview_template(prowler_compliance: dict):
"manual": 0,
"total": total_checks,
},
"status": req_status_val,
"status": "PASS",
}
# Update requirements status counts for the framework
if req_status_val == "MANUAL":
# Update requirements status
if total_checks == 0:
requirements_status["manual"] += 1
elif req_status_val == "PASS":
requirements_status["passed"] += 1
# Add requirement to compliance requirements
compliance_requirements[requirement.Id] = requirement_dict
# Calculate pending requirements
pending_requirements = total_requirements - requirements_status["manual"]
requirements_status["passed"] = pending_requirements
# Build compliance dictionary
compliance_dict = {
"framework": compliance_data.Framework,


@@ -1,4 +1,3 @@
import re
import secrets
import uuid
from contextlib import contextmanager
@@ -153,28 +152,6 @@ def delete_related_daily_task(provider_id: str):
PeriodicTask.objects.filter(name=task_name).delete()
def create_objects_in_batches(
tenant_id: str, model, objects: list, batch_size: int = 500
):
"""
Bulk-create model instances in repeated, per-tenant RLS transactions.
All chunks execute in their own transaction, so no single transaction
grows too large.
Args:
tenant_id (str): UUID string of the tenant under which to set RLS.
model: Django model class whose `.objects.bulk_create()` will be called.
objects (list): List of model instances (unsaved) to bulk-create.
batch_size (int): Maximum number of objects per bulk_create call.
"""
total = len(objects)
for i in range(0, total, batch_size):
chunk = objects[i : i + batch_size]
with rls_transaction(value=tenant_id, parameter=POSTGRES_TENANT_VAR):
model.objects.bulk_create(chunk, batch_size)
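A usage sketch for the helper above; the `Finding` model and its fields are illustrative placeholders:

```python
# Hypothetical usage: build unsaved instances, then bulk-create them in
# per-tenant RLS transactions of at most 500 rows each.
findings = [
    Finding(tenant_id=tenant_id, uid=f"finding-{i}", status="FAIL")
    for i in range(5_000)
]
create_objects_in_batches(tenant_id, Finding, findings, batch_size=500)
```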
# Postgres Enums
@@ -250,167 +227,6 @@ def register_enum(apps, schema_editor, enum_class): # noqa: F841
register_adapter(enum_class, enum_adapter)
def _should_create_index_on_partition(
partition_name: str, all_partitions: bool = False
) -> bool:
"""
Determine if we should create an index on this partition.
Args:
partition_name: The name of the partition (e.g., "findings_2025_aug", "findings_default")
all_partitions: If True, create on all partitions. If False, only current/future partitions.
Returns:
bool: True if index should be created on this partition, False otherwise.
"""
if all_partitions:
return True
# Extract date from partition name if it follows the pattern
# Partition names look like: findings_2025_aug, findings_2025_jul, etc.
date_pattern = r"(\d{4})_([a-z]{3})$"
match = re.search(date_pattern, partition_name)
if not match:
# If we can't parse the date, include it to be safe (e.g., default partition)
return True
try:
year_str, month_abbr = match.groups()
year = int(year_str)
# Map month abbreviations to numbers
month_map = {
"jan": 1,
"feb": 2,
"mar": 3,
"apr": 4,
"may": 5,
"jun": 6,
"jul": 7,
"aug": 8,
"sep": 9,
"oct": 10,
"nov": 11,
"dec": 12,
}
month = month_map.get(month_abbr.lower())
if month is None:
# Unknown month abbreviation, include it to be safe
return True
partition_date = datetime(year, month, 1, tzinfo=timezone.utc)
# Get current month start
now = datetime.now(timezone.utc)
current_month_start = now.replace(
day=1, hour=0, minute=0, second=0, microsecond=0
)
# Include current month and future partitions
return partition_date >= current_month_start
except (ValueError, TypeError):
# If date parsing fails, include it to be safe
return True
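Concretely, with `all_partitions=False` and assuming the current month is, say, August 2025, the predicate above behaves like this:

```python
# Illustrative expectations only; the first three depend on today's date.
_should_create_index_on_partition("findings_2025_aug")  # True: current month
_should_create_index_on_partition("findings_2026_jan")  # True: future partition
_should_create_index_on_partition("findings_2024_dec")  # False: past partition
_should_create_index_on_partition("findings_default")   # True: unparsable name, kept to be safe
_should_create_index_on_partition("findings_2024_dec", all_partitions=True)  # True: forced
```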
def create_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
columns: str,
method: str = "BTREE",
where: str = "",
all_partitions: bool = True,
):
"""
Create an index on existing partitions of `parent_table`.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: A short name for the index (will be prefixed per-partition).
columns: The parenthesized column list, e.g. "tenant_id, scan_id, status".
method: The index method—BTREE, GIN, etc. Defaults to BTREE.
where: Optional WHERE clause (without the leading "WHERE"), e.g. "status = 'FAIL'".
all_partitions: Whether to create indexes on all partitions or just current/future ones.
Defaults to True; pass False to restrict creation to current and future
partitions and avoid maintenance overhead on old partitions where the
index may not be needed.
Examples:
# Create index only on current and future partitions (recommended for new indexes)
create_index_on_partitions(
apps, schema_editor,
parent_table="findings",
index_name="new_performance_idx",
columns="tenant_id, status, severity",
all_partitions=False  # Current and future partitions only
)
# Create index on all partitions (use when migrating existing critical indexes)
create_index_on_partitions(
apps, schema_editor,
parent_table="findings",
index_name="critical_existing_idx",
columns="tenant_id, scan_id",
all_partitions=True
)
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
where_sql = f" WHERE {where}" if where else ""
for partition in partitions:
if _should_create_index_on_partition(partition, all_partitions):
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = (
f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {idx_name} "
f"ON {partition} USING {method} ({columns})"
f"{where_sql};"
)
schema_editor.execute(sql)
def drop_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
):
"""
Drop the per-partition indexes that were created by create_index_on_partitions.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: The same short name used when creating them.
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {idx_name};"
schema_editor.execute(sql)
# Postgres enum definition for member role

View File

@@ -3,7 +3,7 @@ from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework_json_api.exceptions import exception_handler
from rest_framework_json_api.serializers import ValidationError
from rest_framework_simplejwt.exceptions import InvalidToken, TokenError
from rest_framework_simplejwt.exceptions import TokenError, InvalidToken
class ModelValidationError(ValidationError):
@@ -32,31 +32,6 @@ class InvitationTokenExpiredException(APIException):
default_code = "token_expired"
# Task Management Exceptions (non-HTTP)
class TaskManagementError(Exception):
"""Base exception for task management errors."""
def __init__(self, task=None):
self.task = task
super().__init__()
class TaskFailedException(TaskManagementError):
"""Raised when a task has failed."""
class TaskNotFoundException(TaskManagementError):
"""Raised when a task is not found."""
class TaskInProgressException(TaskManagementError):
"""Raised when a task is running but there's no related Task object to return."""
def __init__(self, task_result=None):
self.task_result = task_result
super().__init__()
def custom_exception_handler(exc, context):
if isinstance(exc, django_validation_error):
if hasattr(exc, "error_dict"):
@@ -64,12 +39,7 @@ def custom_exception_handler(exc, context):
else:
exc = ValidationError(detail=exc.messages[0], code=exc.code)
elif isinstance(exc, (TokenError, InvalidToken)):
if (
hasattr(exc, "detail")
and isinstance(exc.detail, dict)
and "messages" in exc.detail
):
exc.detail["messages"] = [
message_item["message"] for message_item in exc.detail["messages"]
]
exc.detail["messages"] = [
message_item["message"] for message_item in exc.detail["messages"]
]
return exception_handler(exc, context)
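# Illustrative sketch (editor's note, not part of the diff): the flattening step
# in isolation, with a hypothetical simplejwt error payload.
detail = {"messages": [{"message": "Token is invalid or expired"}]}
detail["messages"] = [item["message"] for item in detail["messages"]]
assert detail["messages"] == ["Token is invalid or expired"]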

View File

@@ -22,7 +22,7 @@ from api.db_utils import (
StatusEnumField,
)
from api.models import (
ComplianceRequirementOverview,
ComplianceOverview,
Finding,
Integration,
Invitation,
@@ -81,114 +81,6 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class CommonFindingFilters(FilterSet):
# Provider filters are resolved through the finding's scan relationship
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(method="filter_resource_region")
region__in = CharInFilter(field_name="resource_regions", lookup_expr="overlap")
region__icontains = CharFilter(
field_name="resource_regions", lookup_expr="icontains"
)
service = CharFilter(method="filter_resource_service")
service__in = CharInFilter(field_name="resource_services", lookup_expr="overlap")
service__icontains = CharFilter(
field_name="resource_services", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(method="filter_resource_type")
resource_type__in = CharInFilter(field_name="resource_types", lookup_expr="overlap")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
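# Illustrative note (editor's): filter_resource_tag ORs one Q object per
# "key:value" pair, so a query such as resource_tags=env:prod,team:sec matches
# findings whose resources carry either tag (case-insensitive substring match
# on key and value); distinct() guards against duplicates from the joins.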
class TenantFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -365,7 +257,94 @@ class ResourceFilter(ProviderRelationshipFilterSet):
return queryset.filter(tags__text_search=value)
class FindingFilter(CommonFindingFilters):
class FindingFilter(FilterSet):
# Provider filters are resolved through the finding's scan relationship
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(field_name="resources__region")
region__in = CharInFilter(field_name="resources__region", lookup_expr="in")
region__icontains = CharFilter(
field_name="resources__region", lookup_expr="icontains"
)
service = CharFilter(field_name="resources__service")
service__in = CharInFilter(field_name="resources__service", lookup_expr="in")
service__icontains = CharFilter(
field_name="resources__service", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(field_name="resources__type")
resource_type__in = CharInFilter(field_name="resources__type", lookup_expr="in")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
scan = UUIDFilter(method="filter_scan_id")
scan__in = UUIDInFilter(method="filter_scan_id_in")
@@ -406,15 +385,6 @@ class FindingFilter(CommonFindingFilters):
},
}
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
@@ -533,6 +503,16 @@ class FindingFilter(CommonFindingFilters):
return queryset.filter(id__lt=end)
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
@staticmethod
def maybe_date_to_datetime(value):
dt = value
@@ -541,31 +521,6 @@ class FindingFilter(CommonFindingFilters):
return dt
class LatestFindingFilter(CommonFindingFilters):
class Meta:
model = Finding
fields = {
"id": ["exact", "in"],
"uid": ["exact", "in"],
"delta": ["exact", "in"],
"status": ["exact", "in"],
"severity": ["exact", "in"],
"impact": ["exact", "in"],
"check_id": ["exact", "in", "icontains"],
}
filter_overrides = {
FindingDeltaEnumField: {
"filter_class": CharFilter,
},
StatusEnumField: {
"filter_class": CharFilter,
},
SeverityEnumField: {
"filter_class": CharFilter,
},
}
class ProviderSecretFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -637,11 +592,12 @@ class RoleFilter(FilterSet):
class ComplianceOverviewFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
scan_id = UUIDFilter(field_name="scan_id")
region = CharFilter(field_name="region")
provider_type = ChoiceFilter(choices=Provider.ProviderChoices.choices)
provider_type__in = ChoiceInFilter(choices=Provider.ProviderChoices.choices)
scan_id = UUIDFilter(field_name="scan__id")
class Meta:
model = ComplianceRequirementOverview
model = ComplianceOverview
fields = {
"inserted_at": ["date", "gte", "lte"],
"compliance_id": ["exact", "icontains"],

View File

@@ -1,80 +0,0 @@
from django.contrib.sites.models import Site
from django.core.management.base import BaseCommand
from django.db import DEFAULT_DB_ALIAS, connection, connections, transaction
from django.db.migrations.recorder import MigrationRecorder
def table_exists(table_name):
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT EXISTS (
SELECT 1 FROM information_schema.tables
WHERE table_name = %s
)
""",
[table_name],
)
return cursor.fetchone()[0]
class Command(BaseCommand):
help = "Fix migration inconsistency between socialaccount and sites"
def add_arguments(self, parser):
parser.add_argument(
"--database",
default=DEFAULT_DB_ALIAS,
help="Specifies the database to operate on.",
)
def handle(self, *args, **options):
db = options["database"]
connection = connections[db]
recorder = MigrationRecorder(connection)
applied = set(recorder.applied_migrations())
has_social = ("socialaccount", "0001_initial") in applied
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_name = 'django_site'
);
"""
)
site_table_exists = cursor.fetchone()[0]
if has_social and not site_table_exists:
self.stdout.write(
f"Detected inconsistency in '{db}'. Creating 'django_site' table manually..."
)
with transaction.atomic(using=db):
with connection.schema_editor() as schema_editor:
schema_editor.create_model(Site)
recorder.record_applied("sites", "0001_initial")
recorder.record_applied("sites", "0002_alter_domain_unique")
self.stdout.write(
"Fixed: 'django_site' table created and migrations registered."
)
# Ensure the relationship table also exists
if not table_exists("socialaccount_socialapp_sites"):
self.stdout.write(
"Detected missing 'socialaccount_socialapp_sites' table. Creating manually..."
)
with connection.schema_editor() as schema_editor:
from allauth.socialaccount.models import SocialApp
schema_editor.create_model(
SocialApp._meta.get_field("sites").remote_field.through
)
self.stdout.write(
"Fixed: 'socialaccount_socialapp_sites' table created."
)
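# Illustrative sketch (editor's note): invoking the command programmatically.
# The command name "fix_socialaccount_sites" is hypothetical; the diff does not
# show the management command's file name.
#
#   from django.core.management import call_command
#   call_command("fix_socialaccount_sites", database="default")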

View File

@@ -12,7 +12,6 @@ from api.models import (
Provider,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StatusChoices,
)
@@ -134,7 +133,6 @@ class Command(BaseCommand):
region=random.choice(possible_regions),
service=random.choice(possible_services),
type=random.choice(possible_types),
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -183,10 +181,6 @@ class Command(BaseCommand):
"servicename": assigned_resource.service,
"resourcetype": assigned_resource.type,
},
resource_types=[assigned_resource.type],
resource_regions=[assigned_resource.region],
resource_services=[assigned_resource.service],
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -203,22 +197,12 @@ class Command(BaseCommand):
# Create ResourceFindingMapping
mappings = []
scan_resource_cache: set[tuple] = set()
for index, finding_instance in enumerate(findings):
resource_instance = resources[findings_resources_mapping[index]]
for index, f in enumerate(findings):
mappings.append(
ResourceFindingMapping(
tenant_id=tenant_id,
resource=resource_instance,
finding=finding_instance,
)
)
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
resource=resources[findings_resources_mapping[index]],
finding=f,
)
)
@@ -236,38 +220,6 @@ class Command(BaseCommand):
"Resource-finding mappings created successfully.\n\n"
)
)
with rls_transaction(tenant_id):
scan.progress = 99
scan.save()
self.stdout.write(self.style.WARNING("Creating finding filter values..."))
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=str(scan.id),
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
num_batches = ceil(len(resource_scan_summaries) / batch_size)
with rls_transaction(tenant_id):
for i in tqdm(
range(0, len(resource_scan_summaries), batch_size),
total=num_batches,
):
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries[i : i + batch_size],
ignore_conflicts=True,
)
self.stdout.write(
self.style.SUCCESS("Finding filter values created successfully.\n\n")
)
except Exception as e:
self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
scan_state = "failed"

View File

@@ -1,32 +0,0 @@
# Generated by Django 5.1.7 on 2025-04-16 08:47
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0016_finding_compliance_resource_details_and_more"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'm365';",
reverse_sql=migrations.RunSQL.noop,
),
]

View File

@@ -1,81 +0,0 @@
# Generated by Django 5.1.7 on 2025-05-05 10:01
import uuid
import django.db.models.deletion
import uuid6
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0017_m365_provider"),
]
operations = [
migrations.CreateModel(
name="ResourceScanSummary",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("scan_id", models.UUIDField(db_index=True, default=uuid6.uuid7)),
("resource_id", models.UUIDField(db_index=True, default=uuid.uuid4)),
("service", models.CharField(max_length=100)),
("region", models.CharField(max_length=100)),
("resource_type", models.CharField(max_length=100)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "resource_scan_summaries",
"indexes": [
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
],
"unique_together": {("tenant_id", "scan_id", "resource_id")},
},
),
migrations.AddConstraint(
model_name="resourcescansummary",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_resourcescansummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -1,42 +0,0 @@
import django.contrib.postgres.fields
import django.contrib.postgres.indexes
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0018_resource_scan_summaries"),
]
operations = [
migrations.AddField(
model_name="finding",
name="resource_regions",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_services",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_types",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
]

View File

@@ -1,86 +0,0 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0019_finding_denormalize_resource_fields"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
columns="resource_services",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
columns="resource_regions",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
columns="resource_types",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
columns="uid",
method="BTREE",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
columns="scan_id, impact, severity, status, check_id, delta",
method="BTREE",
),
),
]
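# Illustrative sketch (editor's note): CREATE/DROP INDEX CONCURRENTLY cannot run
# inside a transaction block, hence atomic = False above. One way to verify the
# per-partition GIN indexes afterwards:
#
#   from django.db import connection
#   with connection.cursor() as cursor:
#       cursor.execute(
#           "SELECT indexname FROM pg_indexes WHERE indexname LIKE %s",
#           ["%gin_find_service_idx"],
#       )
#       print([row[0] for row in cursor.fetchall()])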

View File

@@ -1,37 +0,0 @@
import django.contrib.postgres.indexes
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0020_findings_new_performance_indexes_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_services"], name="gin_find_service_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_regions"], name="gin_find_region_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_types"], name="gin_find_rtype_idx"
),
),
migrations.RemoveIndex(
model_name="finding",
name="findings_uid_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="findings_filter_idx",
),
]

View File

@@ -1,38 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:04
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0021_findings_new_performance_indexes_parent"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
AddIndexConcurrently(
model_name="scan",
index=models.Index(
condition=models.Q(("state", "completed")),
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
name="scans_prov_state_ins_desc_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
),
]

View File

@@ -1,28 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 10:18
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0022_scan_summaries_performance_indexes"),
]
operations = [
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "id"], name="resources_tenant_id_idx"
),
),
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
),
]

View File

@@ -1,29 +0,0 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0023_resources_lookup_optimization"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
columns="tenant_id, uid, inserted_at DESC",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
),
)
]

View File

@@ -1,17 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0024_findings_uid_index_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
),
]

View File

@@ -1,14 +0,0 @@
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0025_findings_uid_index_parent"),
]
operations = [
migrations.RunSQL(
"ALTER TYPE provider_secret_type ADD VALUE IF NOT EXISTS 'service_account';",
reverse_sql=migrations.RunSQL.noop,
),
]
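# Illustrative note (editor's): ALTER TYPE ... ADD VALUE cannot be reversed by
# plain SQL, which is why reverse_sql is RunSQL.noop; IF NOT EXISTS keeps the
# forward migration idempotent.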

View File

@@ -1,124 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-21 11:37
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
from api.rls import RowLevelSecurityConstraint
class Migration(migrations.Migration):
dependencies = [
("api", "0026_provider_secret_gcp_service_account"),
]
operations = [
migrations.CreateModel(
name="ComplianceRequirementOverview",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("compliance_id", models.TextField(blank=False)),
("framework", models.TextField(blank=False)),
("version", models.TextField(blank=True)),
("description", models.TextField(blank=True)),
("region", models.TextField(blank=False)),
("requirement_id", models.TextField(blank=False)),
(
"requirement_status",
api.db_utils.StatusEnumField(
choices=[
("FAIL", "Fail"),
("PASS", "Pass"),
("MANUAL", "Manual"),
]
),
),
("passed_checks", models.IntegerField(default=0)),
("failed_checks", models.IntegerField(default=0)),
("total_checks", models.IntegerField(default=0)),
(
"scan",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="compliance_requirements_overviews",
related_query_name="compliance_requirements_overview",
to="api.scan",
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "compliance_requirements_overviews",
"abstract": False,
"indexes": [
models.Index(
fields=["tenant_id", "scan_id"], name="cro_tenant_scan_idx"
),
models.Index(
fields=["tenant_id", "scan_id", "compliance_id"],
name="cro_scan_comp_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "compliance_id", "region"],
name="cro_scan_comp_reg_idx",
),
models.Index(
fields=[
"tenant_id",
"scan_id",
"compliance_id",
"requirement_id",
],
name="cro_scan_comp_req_idx",
),
models.Index(
fields=[
"tenant_id",
"scan_id",
"compliance_id",
"requirement_id",
"region",
],
name="cro_scan_comp_req_reg_idx",
),
],
"constraints": [
models.UniqueConstraint(
fields=(
"tenant_id",
"scan_id",
"compliance_id",
"requirement_id",
"region",
),
name="unique_tenant_compliance_requirement_overview",
)
],
},
),
migrations.AddConstraint(
model_name="ComplianceRequirementOverview",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_compliancerequirementoverview",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -1,29 +0,0 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0027_compliance_requirement_overviews"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_check_idx",
columns="tenant_id, scan_id, check_id",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_scan_check_idx",
),
)
]

View File

@@ -1,17 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0028_findings_check_index_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "scan_id", "check_id"],
name="find_tenant_scan_check_idx",
),
),
]

View File

@@ -1,107 +0,0 @@
# Generated by Django 5.1.10 on 2025-06-12 12:45
import uuid
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0029_findings_check_index_parent"),
]
operations = [
migrations.CreateModel(
name="LighthouseConfiguration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"name",
models.CharField(
help_text="Name of the configuration",
max_length=100,
validators=[django.core.validators.MinLengthValidator(3)],
),
),
(
"api_key",
models.BinaryField(
help_text="Encrypted API key for the LLM service"
),
),
(
"model",
models.CharField(
choices=[
("gpt-4o-2024-11-20", "GPT-4o v2024-11-20"),
("gpt-4o-2024-08-06", "GPT-4o v2024-08-06"),
("gpt-4o-2024-05-13", "GPT-4o v2024-05-13"),
("gpt-4o", "GPT-4o Default"),
("gpt-4o-mini-2024-07-18", "GPT-4o Mini v2024-07-18"),
("gpt-4o-mini", "GPT-4o Mini Default"),
],
default="gpt-4o-2024-08-06",
help_text="Must be one of the supported model names",
max_length=50,
),
),
(
"temperature",
models.FloatField(default=0, help_text="Must be between 0 and 1"),
),
(
"max_tokens",
models.IntegerField(
default=4000, help_text="Must be between 500 and 5000"
),
),
(
"business_context",
models.TextField(
blank=True,
default="",
help_text="Additional business context for this AI model configuration",
),
),
("is_active", models.BooleanField(default=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "lighthouse_configurations",
"abstract": False,
"constraints": [
models.UniqueConstraint(
fields=("tenant_id",),
name="unique_lighthouse_config_per_tenant",
),
],
},
),
migrations.AddConstraint(
model_name="lighthouseconfiguration",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_lighthouseconfiguration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -1,24 +0,0 @@
# Generated by Django 5.1.10 on 2025-06-23 10:04
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0030_lighthouseconfiguration"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.AlterField(
model_name="scan",
name="scheduler_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="django_celery_beat.periodictask",
),
),
]

View File

@@ -1,150 +0,0 @@
# Generated by Django 5.1.10 on 2025-07-02 15:47
import uuid
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
import api.db_utils
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0031_scan_disable_on_cascade_periodic_tasks"),
]
operations = [
migrations.AlterField(
model_name="integration",
name="integration_type",
field=api.db_utils.IntegrationTypeEnumField(
choices=[
("amazon_s3", "Amazon S3"),
("aws_security_hub", "AWS Security Hub"),
("jira", "JIRA"),
("slack", "Slack"),
]
),
),
migrations.CreateModel(
name="SAMLToken",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("expires_at", models.DateTimeField(editable=False)),
("token", models.JSONField(unique=True)),
(
"user",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
],
options={
"db_table": "saml_tokens",
},
),
migrations.CreateModel(
name="SAMLConfiguration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
(
"email_domain",
models.CharField(
help_text="Email domain used to identify the tenant, e.g. prowlerdemo.com",
max_length=254,
unique=True,
),
),
(
"metadata_xml",
models.TextField(
help_text="Raw IdP metadata XML to configure SingleSignOnService, certificates, etc."
),
),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "saml_configurations",
},
),
migrations.AddConstraint(
model_name="samlconfiguration",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_samlconfiguration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddConstraint(
model_name="samlconfiguration",
constraint=models.UniqueConstraint(
fields=("tenant",), name="unique_samlconfig_per_tenant"
),
),
migrations.CreateModel(
name="SAMLDomainIndex",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("email_domain", models.CharField(max_length=254, unique=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "saml_domain_index",
},
),
migrations.AddConstraint(
model_name="samldomainindex",
constraint=models.UniqueConstraint(
fields=("email_domain", "tenant"),
name="unique_resources_by_email_domain",
),
),
migrations.AddConstraint(
model_name="samldomainindex",
constraint=api.rls.BaseSecurityConstraint(
name="statements_on_samldomainindex",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -1,21 +1,12 @@
import json
import logging
import re
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone
from uuid import UUID, uuid4
from allauth.socialaccount.models import SocialApp
from config.custom_logging import BackendLogger
from config.settings.social_login import SOCIALACCOUNT_PROVIDERS
from cryptography.fernet import Fernet, InvalidToken
from cryptography.fernet import Fernet
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.contrib.sites.models import Site
from django.core.exceptions import ValidationError
from django.core.validators import MinLengthValidator
from django.db import models
from django.db.models import Q
@@ -27,7 +18,6 @@ from psqlextra.models import PostgresPartitionedModel
from psqlextra.types import PostgresPartitioningMethod
from uuid6 import uuid7
from api.db_router import MainRouter
from api.db_utils import (
CustomUserManager,
FindingDeltaEnumField,
@@ -58,8 +48,6 @@ fernet = Fernet(settings.SECRETS_ENCRYPTION_KEY.encode())
# Convert Prowler Severity enum to Django TextChoices
SeverityChoices = enum_to_choices(Severity)
logger = logging.getLogger(BackendLogger.API)
class StatusChoices(models.TextChoices):
"""
@@ -203,7 +191,6 @@ class Provider(RowLevelSecurityProtectedModel):
AZURE = "azure", _("Azure")
GCP = "gcp", _("GCP")
KUBERNETES = "kubernetes", _("Kubernetes")
M365 = "m365", _("M365")
@staticmethod
def validate_aws_uid(value):
@@ -227,19 +214,6 @@ class Provider(RowLevelSecurityProtectedModel):
pointer="/data/attributes/uid",
)
@staticmethod
def validate_m365_uid(value):
if not re.match(
r"""^(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?(?:\.(?!-)[A-Za-z0-9]"""
r"""(?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?)*\.[A-Za-z]{2,}$""",
value,
):
raise ModelValidationError(
detail="M365 domain ID must be a valid domain.",
code="m365-uid",
pointer="/data/attributes/uid",
)
@staticmethod
def validate_gcp_uid(value):
if not re.match(r"^[a-z][a-z0-9-]{5,29}$", value):
@@ -253,7 +227,7 @@ class Provider(RowLevelSecurityProtectedModel):
@staticmethod
def validate_kubernetes_uid(value):
if not re.match(
r"^[a-zA-Z0-9][a-zA-Z0-9._@:\/-]{1,250}$",
r"^[a-z0-9][A-Za-z0-9_.:\/-]{1,250}$",
value,
):
raise ModelValidationError(
@@ -438,10 +412,9 @@ class Scan(RowLevelSecurityProtectedModel):
completed_at = models.DateTimeField(null=True, blank=True)
next_scan_at = models.DateTimeField(null=True, blank=True)
scheduler_task = models.ForeignKey(
PeriodicTask, on_delete=models.SET_NULL, null=True, blank=True
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
)
output_location = models.CharField(blank=True, null=True, max_length=200)
# TODO: mutelist foreign key
class Meta(RowLevelSecurityProtectedModel.Meta):
@@ -464,11 +437,6 @@ class Scan(RowLevelSecurityProtectedModel):
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
models.Index(
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
condition=Q(state=StateChoices.COMPLETED),
name="scans_prov_state_ins_desc_idx",
),
]
class JSONAPIMeta:
@@ -595,11 +563,6 @@ class Resource(RowLevelSecurityProtectedModel):
name="resource_tenant_metadata_idx",
),
GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
]
constraints = [
@@ -700,21 +663,6 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
muted = models.BooleanField(default=False, null=False)
compliance = models.JSONField(default=dict, null=True, blank=True)
# Denormalize resource data for performance
resource_regions = ArrayField(
models.CharField(max_length=100), blank=True, null=True
)
resource_services = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
resource_types = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
# Relationships
scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -755,6 +703,18 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
]
indexes = [
models.Index(fields=["uid"], name="findings_uid_idx"),
models.Index(
fields=[
"scan_id",
"impact",
"severity",
"status",
"check_id",
"delta",
],
name="findings_filter_idx",
),
models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
@@ -766,46 +726,19 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
condition=Q(delta="new"),
name="find_delta_new_idx",
),
models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
GinIndex(fields=["resource_services"], name="gin_find_service_idx"),
GinIndex(fields=["resource_regions"], name="gin_find_region_idx"),
GinIndex(fields=["resource_types"], name="gin_find_rtype_idx"),
models.Index(
fields=["tenant_id", "scan_id", "check_id"],
name="find_tenant_scan_check_idx",
),
]
class JSONAPIMeta:
resource_name = "findings"
def add_resources(self, resources: list[Resource] | None):
if not resources:
return
self.resource_regions = self.resource_regions or []
self.resource_services = self.resource_services or []
self.resource_types = self.resource_types or []
# Deduplication
regions = set(self.resource_regions)
services = set(self.resource_services)
types = set(self.resource_types)
# Add new relationships with the tenant_id field
for resource in resources:
ResourceFindingMapping.objects.update_or_create(
resource=resource, finding=self, tenant_id=self.tenant_id
)
regions.add(resource.region)
services.add(resource.service)
types.add(resource.type)
self.resource_regions = list(regions)
self.resource_services = list(services)
self.resource_types = list(types)
# Save the instance
self.save()
@@ -865,7 +798,6 @@ class ProviderSecret(RowLevelSecurityProtectedModel):
class TypeChoices(models.TextChoices):
STATIC = "static", _("Key-value pairs")
ROLE = "role", _("Role assumption")
SERVICE_ACCOUNT = "service_account", _("GCP Service Account Key")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
@@ -1158,78 +1090,6 @@ class ComplianceOverview(RowLevelSecurityProtectedModel):
resource_name = "compliance-overviews"
class ComplianceRequirementOverview(RowLevelSecurityProtectedModel):
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
compliance_id = models.TextField(blank=False)
framework = models.TextField(blank=False)
version = models.TextField(blank=True)
description = models.TextField(blank=True)
region = models.TextField(blank=False)
requirement_id = models.TextField(blank=False)
requirement_status = StatusEnumField(choices=StatusChoices)
passed_checks = models.IntegerField(default=0)
failed_checks = models.IntegerField(default=0)
total_checks = models.IntegerField(default=0)
scan = models.ForeignKey(
Scan,
on_delete=models.CASCADE,
related_name="compliance_requirements_overviews",
related_query_name="compliance_requirements_overview",
)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "compliance_requirements_overviews"
constraints = [
models.UniqueConstraint(
fields=(
"tenant_id",
"scan_id",
"compliance_id",
"requirement_id",
"region",
),
name="unique_tenant_compliance_requirement_overview",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "DELETE"],
),
]
indexes = [
models.Index(fields=["tenant_id", "scan_id"], name="cro_tenant_scan_idx"),
models.Index(
fields=["tenant_id", "scan_id", "compliance_id"],
name="cro_scan_comp_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "compliance_id", "region"],
name="cro_scan_comp_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "compliance_id", "requirement_id"],
name="cro_scan_comp_req_idx",
),
models.Index(
fields=[
"tenant_id",
"scan_id",
"compliance_id",
"requirement_id",
"region",
],
name="cro_scan_comp_req_reg_idx",
),
]
class JSONAPIMeta:
resource_name = "compliance-requirements-overviews"
class ScanSummary(RowLevelSecurityProtectedModel):
objects = ActiveProviderManager()
all_objects = models.Manager()
@@ -1280,15 +1140,7 @@ class ScanSummary(RowLevelSecurityProtectedModel):
models.Index(
fields=["tenant_id", "scan_id"],
name="scan_summaries_tenant_scan_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
)
]
class JSONAPIMeta:
@@ -1298,6 +1150,7 @@ class ScanSummary(RowLevelSecurityProtectedModel):
class Integration(RowLevelSecurityProtectedModel):
class IntegrationChoices(models.TextChoices):
S3 = "amazon_s3", _("Amazon S3")
SAML = "saml", _("SAML")
AWS_SECURITY_HUB = "aws_security_hub", _("AWS Security Hub")
JIRA = "jira", _("JIRA")
SLACK = "slack", _("Slack")
@@ -1369,419 +1222,3 @@ class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class SAMLToken(models.Model):
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
expires_at = models.DateTimeField(editable=False)
token = models.JSONField(unique=True)
user = models.ForeignKey(User, on_delete=models.CASCADE)
class Meta:
db_table = "saml_tokens"
def save(self, *args, **kwargs):
if not self.expires_at:
self.expires_at = datetime.now(timezone.utc) + timedelta(seconds=15)
super().save(*args, **kwargs)
def is_expired(self) -> bool:
return datetime.now(timezone.utc) >= self.expires_at
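# Illustrative sketch (editor's note, not part of the diff): save() stamps a
# 15-second expiry when expires_at is unset.
#
#   token = SAMLToken.objects.create(user=user, token={"jti": "example"})
#   token.is_expired()  # False right away, True roughly 15 seconds later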
class SAMLDomainIndex(models.Model):
"""
Public index of SAML email domains. Not protected by RLS; used for fast lookups in the SAML login flow.
"""
email_domain = models.CharField(max_length=254, unique=True)
tenant = models.ForeignKey("Tenant", on_delete=models.CASCADE)
class Meta:
db_table = "saml_domain_index"
constraints = [
models.UniqueConstraint(
fields=("email_domain", "tenant"),
name="unique_resources_by_email_domain",
),
BaseSecurityConstraint(
name="statements_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class SAMLConfiguration(RowLevelSecurityProtectedModel):
"""
Stores per-tenant SAML settings, including email domain and IdP metadata.
Automatically syncs to a SocialApp instance on save.
Note:
This model exists to provide a tenant-aware abstraction over SAML configuration.
It supports row-level security, custom validation, and metadata parsing, enabling
Prowler to expose a clean API and admin interface for managing SAML integrations.
Although Django Allauth uses the SocialApp model to store provider configuration,
it is not designed for multi-tenant use. SocialApp lacks support for tenant scoping,
email domain mapping, and structured metadata handling.
By managing SAMLConfiguration separately, we ensure:
- Strong isolation between tenants via RLS.
- Ownership of raw IdP metadata and its validation.
- An explicit link between SAML config and business-level identifiers (e.g. email domain).
- Programmatic transformation into the SocialApp format used by Allauth.
In short, this model acts as a secure and user-friendly layer over Allauth's lower-level primitives.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
email_domain = models.CharField(
max_length=254,
unique=True,
help_text="Email domain used to identify the tenant, e.g. prowlerdemo.com",
)
metadata_xml = models.TextField(
help_text="Raw IdP metadata XML to configure SingleSignOnService, certificates, etc."
)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class JSONAPIMeta:
resource_name = "saml-configurations"
class Meta:
db_table = "saml_configurations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
# 1 config per tenant
models.UniqueConstraint(
fields=["tenant"],
name="unique_samlconfig_per_tenant",
),
]
def clean(self, old_email_domain=None):
# Domain must not contain @
if "@" in self.email_domain:
raise ValidationError({"email_domain": "Domain must not contain @"})
# Enforce at most one config per tenant
qs = SAMLConfiguration.objects.filter(tenant=self.tenant)
# Exclude ourselves in case of update
if self.pk:
qs = qs.exclude(pk=self.pk)
if qs.exists():
raise ValidationError(
{"tenant": "A SAML configuration already exists for this tenant."}
)
# The email domain must be unique in the entire system
qs = SAMLConfiguration.objects.using(MainRouter.admin_db).filter(
email_domain__iexact=self.email_domain
)
if qs.exists() and old_email_domain != self.email_domain:
raise ValidationError(
{"tenant": "There is a problem with your email domain."}
)
def save(self, *args, **kwargs):
self.email_domain = self.email_domain.strip().lower()
is_create = not SAMLConfiguration.objects.filter(pk=self.pk).exists()
if not is_create:
old = SAMLConfiguration.objects.get(pk=self.pk)
old_email_domain = old.email_domain
old_metadata_xml = old.metadata_xml
else:
old_email_domain = None
old_metadata_xml = None
self.clean(old_email_domain)
super().save(*args, **kwargs)
if is_create or (
old_email_domain != self.email_domain
or old_metadata_xml != self.metadata_xml
):
self._sync_social_app(old_email_domain)
# Sync the public index
if not is_create and old_email_domain and old_email_domain != self.email_domain:
SAMLDomainIndex.objects.filter(email_domain=old_email_domain).delete()
# Create/update the new domain index
SAMLDomainIndex.objects.update_or_create(
email_domain=self.email_domain, defaults={"tenant": self.tenant}
)
def _parse_metadata(self):
"""
Parse the raw IdP metadata XML and extract:
- entity_id
- sso_url
- slo_url (may be None)
- x509cert (required)
"""
ns = {
"md": "urn:oasis:names:tc:SAML:2.0:metadata",
"ds": "http://www.w3.org/2000/09/xmldsig#",
}
try:
root = ET.fromstring(self.metadata_xml)
except ET.ParseError as e:
raise ValidationError({"metadata_xml": f"Invalid XML: {e}"})
# Entity ID
entity_id = root.attrib.get("entityID")
# SSO endpoint (must exist)
sso = root.find(".//md:IDPSSODescriptor/md:SingleSignOnService", ns)
if sso is None or "Location" not in sso.attrib:
raise ValidationError(
{"metadata_xml": "Missing SingleSignOnService in metadata."}
)
sso_url = sso.attrib["Location"]
# SLO endpoint (optional)
slo = root.find(".//md:IDPSSODescriptor/md:SingleLogoutService", ns)
slo_url = slo.attrib.get("Location") if slo is not None else None
# X.509 certificate (required)
cert = root.find(
'.//md:KeyDescriptor[@use="signing"]/ds:KeyInfo/ds:X509Data/ds:X509Certificate',
ns,
)
if cert is None or not cert.text or not cert.text.strip():
raise ValidationError(
{
"metadata_xml": 'Metadata must include a <ds:X509Certificate> under <KeyDescriptor use="signing">.'
}
)
x509cert = cert.text.strip()
return {
"entity_id": entity_id,
"sso_url": sso_url,
"slo_url": slo_url,
"x509cert": x509cert,
}
def _sync_social_app(self, previous_email_domain=None):
"""
Create or update the corresponding SocialApp based on email_domain.
If the domain changed, update the matching SocialApp.
"""
idp_settings = self._parse_metadata()
settings_dict = SOCIALACCOUNT_PROVIDERS["saml"].copy()
settings_dict["idp"] = idp_settings
current_site = Site.objects.get(id=settings.SITE_ID)
social_app_qs = SocialApp.objects.filter(
provider="saml", client_id=previous_email_domain or self.email_domain
)
client_id = self.email_domain[:191]
name = f"SAML-{self.email_domain}"[:40]
if social_app_qs.exists():
social_app = social_app_qs.first()
social_app.client_id = client_id
social_app.name = name
social_app.settings = settings_dict
social_app.provider_id = idp_settings["entity_id"]
social_app.save()
social_app.sites.set([current_site])
else:
social_app = SocialApp.objects.create(
provider="saml",
client_id=client_id,
name=name,
settings=settings_dict,
provider_id=idp_settings["entity_id"],
)
social_app.sites.set([current_site])
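# Illustrative sketch (editor's note): the minimal IdP metadata shape that
# _parse_metadata accepts: an entityID, a SingleSignOnService Location, and a
# signing X509Certificate. All values below are placeholders.
#
#   <md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
#                        entityID="https://idp.example.com/metadata">
#     <md:IDPSSODescriptor>
#       <md:KeyDescriptor use="signing">
#         <ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
#           <ds:X509Data><ds:X509Certificate>MIIC...</ds:X509Certificate></ds:X509Data>
#         </ds:KeyInfo>
#       </md:KeyDescriptor>
#       <md:SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
#                               Location="https://idp.example.com/sso"/>
#     </md:IDPSSODescriptor>
#   </md:EntityDescriptor>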
class ResourceScanSummary(RowLevelSecurityProtectedModel):
scan_id = models.UUIDField(default=uuid7, db_index=True)
resource_id = models.UUIDField(default=uuid4, db_index=True)
service = models.CharField(max_length=100)
region = models.CharField(max_length=100)
resource_type = models.CharField(max_length=100)
class Meta:
db_table = "resource_scan_summaries"
unique_together = (("tenant_id", "scan_id", "resource_id"),)
indexes = [
# Single-dimension lookups:
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
# Two-dimension cross-filters:
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
]
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
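# Illustrative sketch (editor's note, not part of the diff): the summary table
# answers filter-value lookups without touching the partitioned findings table.
#
#   services = (
#       ResourceScanSummary.objects
#       .filter(tenant_id=tenant_id, scan_id=scan_id)
#       .values_list("service", flat=True)
#       .distinct()
#   )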
class LighthouseConfiguration(RowLevelSecurityProtectedModel):
"""
Stores configuration and API keys for LLM services.
"""
class ModelChoices(models.TextChoices):
GPT_4O_2024_11_20 = "gpt-4o-2024-11-20", _("GPT-4o v2024-11-20")
GPT_4O_2024_08_06 = "gpt-4o-2024-08-06", _("GPT-4o v2024-08-06")
GPT_4O_2024_05_13 = "gpt-4o-2024-05-13", _("GPT-4o v2024-05-13")
GPT_4O = "gpt-4o", _("GPT-4o Default")
GPT_4O_MINI_2024_07_18 = "gpt-4o-mini-2024-07-18", _("GPT-4o Mini v2024-07-18")
GPT_4O_MINI = "gpt-4o-mini", _("GPT-4o Mini Default")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
name = models.CharField(
max_length=100,
validators=[MinLengthValidator(3)],
blank=False,
null=False,
help_text="Name of the configuration",
)
api_key = models.BinaryField(
blank=False, null=False, help_text="Encrypted API key for the LLM service"
)
model = models.CharField(
max_length=50,
choices=ModelChoices.choices,
blank=False,
null=False,
default=ModelChoices.GPT_4O_2024_08_06,
help_text="Must be one of the supported model names",
)
temperature = models.FloatField(default=0, help_text="Must be between 0 and 1")
max_tokens = models.IntegerField(
default=4000, help_text="Must be between 500 and 5000"
)
business_context = models.TextField(
blank=True,
null=False,
default="",
help_text="Additional business context for this AI model configuration",
)
is_active = models.BooleanField(default=True)
def __str__(self):
return self.name
def clean(self):
super().clean()
# Validate temperature
if not 0 <= self.temperature <= 1:
raise ModelValidationError(
detail="Temperature must be between 0 and 1",
code="invalid_temperature",
pointer="/data/attributes/temperature",
)
# Validate max_tokens
if not 500 <= self.max_tokens <= 5000:
raise ModelValidationError(
detail="Max tokens must be between 500 and 5000",
code="invalid_max_tokens",
pointer="/data/attributes/max_tokens",
)
@property
def api_key_decoded(self):
"""Return the decrypted API key, or None if unavailable or invalid."""
if not self.api_key:
return None
try:
decrypted_key = fernet.decrypt(bytes(self.api_key))
return decrypted_key.decode()
except InvalidToken:
logger.warning("Invalid token while decrypting API key.")
except Exception as e:
logger.exception("Unexpected error while decrypting API key: %s", e)
@api_key_decoded.setter
def api_key_decoded(self, value):
"""Store the encrypted API key."""
if not value:
raise ModelValidationError(
detail="API key is required",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
# Validate OpenAI API key format
openai_key_pattern = r"^sk-[\w-]+T3BlbkFJ[\w-]+$"
if not re.match(openai_key_pattern, value):
raise ModelValidationError(
detail="Invalid OpenAI API key format.",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
self.api_key = fernet.encrypt(value.encode())
def save(self, *args, **kwargs):
self.full_clean()
super().save(*args, **kwargs)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "lighthouse_configurations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
# Only one Lighthouse configuration is allowed per tenant
models.UniqueConstraint(
fields=["tenant_id"], name="unique_lighthouse_config_per_tenant"
),
]
class JSONAPIMeta:
resource_name = "lighthouse-configurations"

View File

@@ -1,4 +1,4 @@
from drf_spectacular_jsonapi.schemas.pagination import JsonApiPageNumberPagination
from rest_framework_json_api.pagination import JsonApiPageNumberPagination
class ComplianceOverviewPagination(JsonApiPageNumberPagination):

File diff suppressed because it is too large.

View File

@@ -1,56 +0,0 @@
from unittest.mock import MagicMock
import pytest
from allauth.socialaccount.models import SocialLogin
from django.contrib.auth import get_user_model
from api.adapters import ProwlerSocialAccountAdapter
User = get_user_model()
@pytest.mark.django_db
class TestProwlerSocialAccountAdapter:
def test_get_user_by_email_returns_user(self, create_test_user):
adapter = ProwlerSocialAccountAdapter()
user = adapter.get_user_by_email(create_test_user.email)
assert user == create_test_user
def test_get_user_by_email_returns_none_for_unknown_email(self):
adapter = ProwlerSocialAccountAdapter()
assert adapter.get_user_by_email("notfound@example.com") is None
def test_pre_social_login_links_existing_user(self, create_test_user, rf):
adapter = ProwlerSocialAccountAdapter()
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = MagicMock()
sociallogin.provider = MagicMock()
sociallogin.provider.id = "saml"
sociallogin.account.extra_data = {}
sociallogin.user = create_test_user
sociallogin.connect = MagicMock()
adapter.pre_social_login(rf.get("/"), sociallogin)
call_args = sociallogin.connect.call_args
assert call_args is not None
called_request, called_user = call_args[0]
assert called_request.path == "/"
assert called_user.email == create_test_user.email
def test_pre_social_login_no_link_if_email_missing(self, rf):
adapter = ProwlerSocialAccountAdapter()
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = MagicMock()
sociallogin.provider = MagicMock()
sociallogin.user = MagicMock()
sociallogin.provider.id = "saml"
sociallogin.account.extra_data = {}
sociallogin.connect = MagicMock()
adapter.pre_social_login(rf.get("/"), sociallogin)
sociallogin.connect.assert_not_called()
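The deleted tests above pin down two behaviors: look up an existing user by email, and connect the incoming social login to that user when one exists (and do nothing when no email is available). A sketch consistent with those tests; the real ProwlerSocialAccountAdapter in api/adapters.py may handle provider-specific details differently:

from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from django.contrib.auth import get_user_model


class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
    def get_user_by_email(self, email):
        # Return the existing user for this email, or None.
        return get_user_model().objects.filter(email=email).first()

    def pre_social_login(self, request, sociallogin):
        # Link the incoming social login to an existing local account
        # with the same email address, if there is one.
        email = getattr(sociallogin.user, "email", None)
        if not email:
            return
        user = self.get_user_by_email(email)
        if user is not None:
            sociallogin.connect(request, user)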

View File

@@ -1,12 +1,12 @@
from unittest.mock import MagicMock, patch
from unittest.mock import patch, MagicMock
from api.compliance import (
generate_compliance_overview_template,
generate_scan_compliance,
get_prowler_provider_checks,
get_prowler_provider_compliance,
load_prowler_checks,
load_prowler_compliance,
load_prowler_checks,
generate_scan_compliance,
generate_compliance_overview_template,
)
from api.models import Provider
@@ -69,7 +69,7 @@ class TestCompliance:
load_prowler_compliance()
from api.compliance import PROWLER_CHECKS, PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE
from api.compliance import PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE, PROWLER_CHECKS
assert PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE == {
"template_key": "template_value"
@@ -218,10 +218,6 @@ class TestCompliance:
Description="Description of requirement 1",
Attributes=[],
Checks=["check1", "check2"],
Tactics=["tactic1"],
SubTechniques=["subtechnique1"],
Platforms=["platform1"],
TechniqueURL="https://example.com",
)
requirement2 = MagicMock(
Id="requirement2",
@@ -229,10 +225,6 @@ class TestCompliance:
Description="Description of requirement 2",
Attributes=[],
Checks=[],
Tactics=[],
SubTechniques=[],
Platforms=[],
TechniqueURL="",
)
compliance1 = MagicMock(
Requirements=[requirement1, requirement2],
@@ -255,10 +247,6 @@ class TestCompliance:
"requirement1": {
"name": "Requirement 1",
"description": "Description of requirement 1",
"tactics": ["tactic1"],
"subtechniques": ["subtechnique1"],
"platforms": ["platform1"],
"technique_url": "https://example.com",
"attributes": [],
"checks": {"check1": None, "check2": None},
"checks_status": {
@@ -272,10 +260,6 @@ class TestCompliance:
"requirement2": {
"name": "Requirement 2",
"description": "Description of requirement 2",
"tactics": [],
"subtechniques": [],
"platforms": [],
"technique_url": "",
"attributes": [],
"checks": {},
"checks_status": {
@@ -284,7 +268,7 @@ class TestCompliance:
"manual": 0,
"total": 0,
},
"status": "MANUAL",
"status": "PASS",
},
},
"requirements_status": {

View File

@@ -3,13 +3,9 @@ from enum import Enum
from unittest.mock import patch
import pytest
from django.conf import settings
from freezegun import freeze_time
from api.db_utils import (
_should_create_index_on_partition,
batch_delete,
create_objects_in_batches,
enum_to_choices,
generate_random_token,
one_week_from_now,
@@ -142,88 +138,3 @@ class TestBatchDelete:
)
assert Provider.objects.all().count() == 0
assert summary == {"api.Provider": create_test_providers}
class TestShouldCreateIndexOnPartition:
@freeze_time("2025-05-15 00:00:00Z")
@pytest.mark.parametrize(
"partition_name, all_partitions, expected",
[
("any_name", True, True),
("findings_default", True, True),
("findings_2022_jan", True, True),
("foo_bar", False, True),
("findings_2025_MAY", False, True),
("findings_2025_may", False, True),
("findings_2025_jun", False, True),
("findings_2025_apr", False, False),
("findings_2025_xyz", False, True),
],
)
def test_partition_inclusion_logic(self, partition_name, all_partitions, expected):
assert (
_should_create_index_on_partition(partition_name, all_partitions)
is expected
)
@freeze_time("2025-05-15 00:00:00Z")
def test_invalid_date_components(self):
# even if the regex matches but the int conversion fails, we fall back to True
# (e.g. year too big, month name not parseable)
bad_name = "findings_99999_jan"
assert _should_create_index_on_partition(bad_name, False) is True
bad_name2 = "findings_2025_abc"
# abc not in month_map → fallback True
assert _should_create_index_on_partition(bad_name2, False) is True
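These parametrized cases fully characterize the skip logic: with all_partitions the answer is always True; otherwise only partitions named <table>_<year>_<month> that start strictly before the current month are skipped, and anything unparsable falls back to True. A sketch consistent with that behavior (the real helper lives in api/db_utils.py and may differ in details):

import re
from datetime import datetime, timezone

_MONTHS = {"jan": 1, "feb": 2, "mar": 3, "apr": 4, "may": 5, "jun": 6,
           "jul": 7, "aug": 8, "sep": 9, "oct": 10, "nov": 11, "dec": 12}


def _should_create_index_on_partition(partition_name: str, all_partitions: bool) -> bool:
    if all_partitions:
        return True  # indexing every partition: never skip
    match = re.match(r"^.+_(\d+)_([A-Za-z]+)$", partition_name)
    if not match:
        return True  # name is not date-shaped: fall back to True
    try:
        start = datetime(
            int(match.group(1)), _MONTHS[match.group(2).lower()], 1, tzinfo=timezone.utc
        )
    except (KeyError, ValueError):
        return True  # unknown month name or out-of-range year: fall back to True
    now = datetime.now(timezone.utc)
    return start >= datetime(now.year, now.month, 1, tzinfo=timezone.utc)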
@pytest.mark.django_db
class TestCreateObjectsInBatches:
@pytest.fixture
def tenant(self, tenants_fixture):
return tenants_fixture[0]
def make_provider_instances(self, tenant, count):
"""
Return a list of `count` unsaved Provider instances for the given tenant.
"""
base_uid = 1000
return [
Provider(
tenant=tenant,
uid=str(base_uid + i),
provider=Provider.ProviderChoices.AWS,
)
for i in range(count)
]
def test_exact_multiple_of_batch(self, tenant):
total = 6
batch_size = 3
objs = self.make_provider_instances(tenant, total)
create_objects_in_batches(str(tenant.id), Provider, objs, batch_size=batch_size)
qs = Provider.objects.filter(tenant=tenant)
assert qs.count() == total
def test_non_multiple_of_batch(self, tenant):
total = 7
batch_size = 3
objs = self.make_provider_instances(tenant, total)
create_objects_in_batches(str(tenant.id), Provider, objs, batch_size=batch_size)
qs = Provider.objects.filter(tenant=tenant)
assert qs.count() == total
def test_batch_size_default(self, tenant):
default_size = settings.DJANGO_DELETION_BATCH_SIZE
total = default_size + 2
objs = self.make_provider_instances(tenant, total)
create_objects_in_batches(str(tenant.id), Provider, objs)
qs = Provider.objects.filter(tenant=tenant)
assert qs.count() == total

View File

@@ -1,379 +0,0 @@
import json
from uuid import uuid4
import pytest
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.response import Response
from api.exceptions import (
TaskFailedException,
TaskInProgressException,
TaskNotFoundException,
)
from api.models import Task, User
from api.rls import Tenant
from api.v1.mixins import PaginateByPkMixin, TaskManagementMixin
@pytest.mark.django_db
class TestPaginateByPkMixin:
@pytest.fixture
def tenant(self):
return Tenant.objects.create(name="Test Tenant")
@pytest.fixture
def users(self, tenant):
# Create 5 users with proper email field
users = []
for i in range(5):
user = User.objects.create(email=f"user{i}@example.com", name=f"User {i}")
users.append(user)
return users
class DummyView(PaginateByPkMixin):
def __init__(self, page):
self._page = page
def paginate_queryset(self, qs):
return self._page
def get_serializer(self, queryset, many):
class S:
def __init__(self, data):
# serialize to list of ids
self.data = [obj.id for obj in data] if many else queryset.id
return S(queryset)
def get_paginated_response(self, data):
return Response({"results": data}, status=status.HTTP_200_OK)
def test_no_pagination(self, users):
base_qs = User.objects.all().order_by("id")
view = self.DummyView(page=None)
resp = view.paginate_by_pk(
request=None, base_queryset=base_qs, manager=User.objects
)
# since no pagination, should return all ids in order
expected = [u.id for u in base_qs]
assert isinstance(resp, Response)
assert resp.data == expected
def test_with_pagination(self, users):
base_qs = User.objects.all().order_by("id")
# simulate a page containing two of the ids
page = [base_qs[1].id, base_qs[3].id]
view = self.DummyView(page=page)
resp = view.paginate_by_pk(
request=None, base_queryset=base_qs, manager=User.objects
)
# should fetch only those two users, in the same order as page
assert resp.status_code == status.HTTP_200_OK
assert resp.data == {"results": page}
@pytest.mark.django_db
class TestTaskManagementMixin:
class DummyView(TaskManagementMixin):
pass
@pytest.fixture
def tenant(self):
return Tenant.objects.create(name="Test Tenant")
@pytest.fixture(autouse=True)
def cleanup(self):
Task.objects.all().delete()
TaskResult.objects.all().delete()
def test_no_task_and_no_taskresult_raises_not_found(self):
view = self.DummyView()
with pytest.raises(TaskNotFoundException):
view.check_task_status("task_xyz", {"foo": "bar"})
def test_no_task_and_no_taskresult_returns_none_when_not_raising(self):
view = self.DummyView()
result = view.check_task_status(
"task_xyz", {"foo": "bar"}, raise_on_not_found=False
)
assert result is None
def test_taskresult_pending_raises_in_progress(self):
task_kwargs = {"foo": "bar"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_xyz",
task_kwargs=json.dumps(task_kwargs),
status="PENDING",
)
view = self.DummyView()
with pytest.raises(TaskInProgressException) as excinfo:
view.check_task_status("task_xyz", task_kwargs, raise_on_not_found=False)
assert hasattr(excinfo.value, "task_result")
assert excinfo.value.task_result == tr
def test_taskresult_started_raises_in_progress(self):
task_kwargs = {"foo": "bar"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_xyz",
task_kwargs=json.dumps(task_kwargs),
status="STARTED",
)
view = self.DummyView()
with pytest.raises(TaskInProgressException) as excinfo:
view.check_task_status("task_xyz", task_kwargs, raise_on_not_found=False)
assert hasattr(excinfo.value, "task_result")
assert excinfo.value.task_result == tr
def test_taskresult_progress_raises_in_progress(self):
task_kwargs = {"foo": "bar"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_xyz",
task_kwargs=json.dumps(task_kwargs),
status="PROGRESS",
)
view = self.DummyView()
with pytest.raises(TaskInProgressException) as excinfo:
view.check_task_status("task_xyz", task_kwargs, raise_on_not_found=False)
assert hasattr(excinfo.value, "task_result")
assert excinfo.value.task_result == tr
def test_taskresult_failure_raises_failed(self):
task_kwargs = {"a": 1}
TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_fail",
task_kwargs=json.dumps(task_kwargs),
status="FAILURE",
)
view = self.DummyView()
with pytest.raises(TaskFailedException):
view.check_task_status("task_fail", task_kwargs, raise_on_not_found=False)
def test_taskresult_failure_returns_none_when_not_raising(self):
task_kwargs = {"a": 1}
TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_fail",
task_kwargs=json.dumps(task_kwargs),
status="FAILURE",
)
view = self.DummyView()
result = view.check_task_status(
"task_fail", task_kwargs, raise_on_failed=False, raise_on_not_found=False
)
assert result is None
def test_taskresult_success_returns_none(self):
task_kwargs = {"x": 2}
TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_ok",
task_kwargs=json.dumps(task_kwargs),
status="SUCCESS",
)
view = self.DummyView()
# should not raise, and returns None
assert (
view.check_task_status("task_ok", task_kwargs, raise_on_not_found=False)
is None
)
def test_taskresult_revoked_returns_none(self):
task_kwargs = {"x": 2}
TaskResult.objects.create(
task_id=str(uuid4()),
task_name="task_revoked",
task_kwargs=json.dumps(task_kwargs),
status="REVOKED",
)
view = self.DummyView()
# should not raise, and returns None
assert (
view.check_task_status(
"task_revoked", task_kwargs, raise_on_not_found=False
)
is None
)
def test_task_with_failed_status_raises_failed(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="FAILURE",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
with pytest.raises(TaskFailedException) as excinfo:
view.check_task_status("scan_task", task_kwargs)
# Check that the exception contains the expected task
assert hasattr(excinfo.value, "task")
assert excinfo.value.task == task
def test_task_with_cancelled_status_raises_failed(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="REVOKED",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
with pytest.raises(TaskFailedException) as excinfo:
view.check_task_status("scan_task", task_kwargs)
# Check that the exception contains the expected task
assert hasattr(excinfo.value, "task")
assert excinfo.value.task == task
def test_task_with_failed_status_returns_task_when_not_raising(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="FAILURE",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.check_task_status("scan_task", task_kwargs, raise_on_failed=False)
assert result == task
def test_task_with_completed_status_returns_none(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="SUCCESS",
)
Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.check_task_status("scan_task", task_kwargs)
assert result is None
def test_task_with_executing_status_returns_task(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="STARTED",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.check_task_status("scan_task", task_kwargs)
assert result is not None
assert result.pk == task.pk
def test_task_with_pending_status_returns_task(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="PENDING",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.check_task_status("scan_task", task_kwargs)
assert result is not None
assert result.pk == task.pk
def test_get_task_response_if_running_returns_none_for_completed_task(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="SUCCESS",
)
Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.get_task_response_if_running("scan_task", task_kwargs)
assert result is None
def test_get_task_response_if_running_returns_none_for_no_task(self):
view = self.DummyView()
result = view.get_task_response_if_running(
"nonexistent", {"foo": "bar"}, raise_on_not_found=False
)
assert result is None
def test_get_task_response_if_running_returns_202_for_executing_task(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="STARTED",
)
task = Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.get_task_response_if_running("scan_task", task_kwargs)
assert isinstance(result, Response)
assert result.status_code == status.HTTP_202_ACCEPTED
assert "Content-Location" in result.headers
# The response should contain the serialized task data
assert result.data is not None
assert "id" in result.data
assert str(result.data["id"]) == str(task.id)
def test_get_task_response_if_running_returns_none_for_available_task(self, tenant):
task_kwargs = {"provider_id": "test"}
tr = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs),
status="PENDING",
)
Task.objects.create(tenant=tenant, task_runner_task=tr)
view = self.DummyView()
result = view.get_task_response_if_running("scan_task", task_kwargs)
# PENDING maps to AVAILABLE, which is not EXECUTING, so should return None
assert result is None
def test_kwargs_filtering_works_correctly(self, tenant):
# Create tasks with different kwargs
task_kwargs_1 = {"provider_id": "test1", "scan_type": "full"}
task_kwargs_2 = {"provider_id": "test2", "scan_type": "quick"}
tr1 = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs_1),
status="STARTED",
)
tr2 = TaskResult.objects.create(
task_id=str(uuid4()),
task_name="scan_task",
task_kwargs=json.dumps(task_kwargs_2),
status="STARTED",
)
task1 = Task.objects.create(tenant=tenant, task_runner_task=tr1)
task2 = Task.objects.create(tenant=tenant, task_runner_task=tr2)
view = self.DummyView()
# Should find task1 when searching for its kwargs
result1 = view.check_task_status("scan_task", {"provider_id": "test1"})
assert result1 is not None
assert result1.pk == task1.pk
# Should find task2 when searching for its kwargs
result2 = view.check_task_status("scan_task", {"provider_id": "test2"})
assert result2 is not None
assert result2.pk == task2.pk
# Should not find anything when searching for non-existent kwargs
result3 = view.check_task_status(
"scan_task", {"provider_id": "test3"}, raise_on_not_found=False
)
assert result3 is None

View File

@@ -1,9 +1,6 @@
import pytest
from allauth.socialaccount.models import SocialApp
from django.core.exceptions import ValidationError
from api.db_router import MainRouter
from api.models import Resource, ResourceTag, SAMLConfiguration, Tenant
from api.models import Resource, ResourceTag
@pytest.mark.django_db
@@ -95,177 +92,3 @@ class TestResourceModel:
assert len(resource.tags.filter(tenant_id=tenant_id)) == 0
assert resource.get_tags(tenant_id=tenant_id) == {}
# @pytest.mark.django_db
# class TestFindingModel:
# def test_add_finding_with_long_uid(
# self, providers_fixture, scans_fixture, resources_fixture
# ):
# provider, *_ = providers_fixture
# tenant_id = provider.tenant_id
# long_uid = "1" * 500
# _ = Finding.objects.create(
# tenant_id=tenant_id,
# uid=long_uid,
# delta=Finding.DeltaChoices.NEW,
# check_metadata={},
# status=StatusChoices.PASS,
# status_extended="",
# severity="high",
# impact="high",
# raw_result={},
# check_id="test_check",
# scan=scans_fixture[0],
# first_seen_at=None,
# muted=False,
# compliance={},
# )
# assert Finding.objects.filter(uid=long_uid).exists()
@pytest.mark.django_db
class TestSAMLConfigurationModel:
VALID_METADATA = """<?xml version='1.0' encoding='UTF-8'?>
<md:EntityDescriptor entityID='TEST' xmlns:md='urn:oasis:names:tc:SAML:2.0:metadata'>
<md:IDPSSODescriptor WantAuthnRequestsSigned='false' protocolSupportEnumeration='urn:oasis:names:tc:SAML:2.0:protocol'>
<md:KeyDescriptor use='signing'>
<ds:KeyInfo xmlns:ds='http://www.w3.org/2000/09/xmldsig#'>
<ds:X509Data>
<ds:X509Certificate>FAKECERTDATA</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST' Location='https://idp.test/sso'/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>
"""
def test_creates_valid_configuration(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant A")
config = SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="ssoexample.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
assert config.email_domain == "ssoexample.com"
assert SocialApp.objects.filter(client_id="ssoexample.com").exists()
def test_email_domain_with_at_symbol_fails(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant B")
config = SAMLConfiguration(
email_domain="invalid@domain.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "email_domain" in errors
assert "Domain must not contain @" in errors["email_domain"][0]
def test_duplicate_email_domain_fails(self):
tenant1 = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant C1")
tenant2 = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant C2")
SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="duplicate.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant1,
)
config = SAMLConfiguration(
email_domain="duplicate.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant2,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "tenant" in errors
assert "There is a problem with your email domain." in errors["tenant"][0]
def test_duplicate_tenant_config_fails(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant D")
SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="unique1.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
config = SAMLConfiguration(
email_domain="unique2.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "tenant" in errors
assert (
"A SAML configuration already exists for this tenant."
in errors["tenant"][0]
)
def test_invalid_metadata_xml_fails(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant E")
config = SAMLConfiguration(
email_domain="brokenxml.com",
metadata_xml="<bad<xml>",
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "Invalid XML" in errors["metadata_xml"][0]
assert "not well-formed" in errors["metadata_xml"][0]
def test_metadata_missing_sso_fails(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant F")
xml = """<md:EntityDescriptor entityID="x" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
<md:IDPSSODescriptor></md:IDPSSODescriptor>
</md:EntityDescriptor>"""
config = SAMLConfiguration(
email_domain="nosso.com",
metadata_xml=xml,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "Missing SingleSignOnService" in errors["metadata_xml"][0]
def test_metadata_missing_certificate_fails(self):
tenant = Tenant.objects.using(MainRouter.admin_db).create(name="Tenant G")
xml = """<md:EntityDescriptor entityID="x" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
<md:IDPSSODescriptor>
<md:SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location="https://example.com/sso"/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>"""
config = SAMLConfiguration(
email_domain="nocert.com",
metadata_xml=xml,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "X509Certificate" in errors["metadata_xml"][0]

View File

@@ -1,80 +0,0 @@
import logging
from unittest.mock import MagicMock
from config.settings.sentry import before_send
def test_before_send_ignores_log_with_ignored_exception():
"""Test that before_send ignores logs containing ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_ignores_exception_with_ignored_exception():
"""Test that before_send ignores exceptions containing ignored exceptions."""
exc_info = (Exception, Exception("Provider kubernetes is not connected"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_passes_through_non_ignored_log():
"""Test that before_send passes through logs that don't contain ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Some other error message"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_passes_through_non_ignored_exception():
"""Test that before_send passes through exceptions that don't contain ignored exceptions."""
exc_info = (Exception, Exception("Some other error message"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_handles_warning_level():
"""Test that before_send handles warning level logs."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.WARNING # 30
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
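Taken together, these deleted tests describe a filter that drops Sentry events whose log message (at WARNING or above) or exception text contains an ignored pattern, and passes everything else through. A sketch consistent with them; the actual pattern list in config/settings/sentry.py is assumed here:

import logging

IGNORED_PATTERNS = ("is not connected",)  # assumed; the real list lives in the Sentry settings


def before_send(event, hint):
    log_record = hint.get("log_record")
    if log_record is not None and log_record.levelno >= logging.WARNING:
        if any(pattern in str(log_record.msg) for pattern in IGNORED_PATTERNS):
            return None  # drop the event
    exc_info = hint.get("exc_info")
    if exc_info is not None and any(
        pattern in str(exc_info[1]) for pattern in IGNORED_PATTERNS
    ):
        return None  # drop the event
    return event  # pass the event through to Sentry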

View File

@@ -19,7 +19,6 @@ from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
class TestMergeDicts:
@@ -105,7 +104,6 @@ class TestReturnProwlerProvider:
(Provider.ProviderChoices.GCP.value, GcpProvider),
(Provider.ProviderChoices.AZURE.value, AzureProvider),
(Provider.ProviderChoices.KUBERNETES.value, KubernetesProvider),
(Provider.ProviderChoices.M365.value, M365Provider),
],
)
def test_return_prowler_provider(self, provider_type, expected_provider):
@@ -178,10 +176,6 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.KUBERNETES.value,
{"context": "provider_uid"},
),
(
Provider.ProviderChoices.M365.value,
{},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):

File diff suppressed because it is too large Load Diff

View File

@@ -1,20 +1,16 @@
from datetime import datetime, timezone
from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import Subquery
from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
from api.models import Invitation, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from api.models import Invitation, Provider
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
class CustomOAuth2Client(OAuth2Client):
@@ -55,14 +51,14 @@ def merge_dicts(default_dict: dict, replacement_dict: dict) -> dict:
def return_prowler_provider(
provider: Provider,
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider:
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider:
"""Return the Prowler provider class based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: The corresponding provider class.
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: The corresponding provider class.
Raises:
ValueError: If the provider type specified in `provider.provider` is not supported.
@@ -76,8 +72,6 @@ def return_prowler_provider(
prowler_provider = AzureProvider
case Provider.ProviderChoices.KUBERNETES.value:
prowler_provider = KubernetesProvider
case Provider.ProviderChoices.M365.value:
prowler_provider = M365Provider
case _:
raise ValueError(f"Provider type {provider.provider} not supported")
return prowler_provider
@@ -110,15 +104,15 @@ def get_prowler_provider_kwargs(provider: Provider) -> dict:
def initialize_prowler_provider(
provider: Provider,
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider:
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider:
"""Initialize a Prowler provider instance based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, `KubernetesProvider` or `M365Provider`) initialized with the
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, or `KubernetesProvider`) initialized with the
provider's secrets.
"""
prowler_provider = return_prowler_provider(provider)
@@ -136,12 +130,10 @@ def prowler_provider_connection_test(provider: Provider) -> Connection:
Connection: A connection object representing the result of the connection test for the specified provider.
"""
prowler_provider = return_prowler_provider(provider)
try:
prowler_provider_kwargs = provider.secret.secret
except Provider.secret.RelatedObjectDoesNotExist as secret_error:
return Connection(is_connected=False, error=secret_error)
return prowler_provider.test_connection(
**prowler_provider_kwargs, provider_id=provider.uid, raise_on_exception=False
)
@@ -208,33 +200,3 @@ def validate_invitation(
)
return invitation
# ToRemove after removing the fallback mechanism in /findings/metadata
def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
filtered_ids = filtered_queryset.order_by().values("id")
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = FindingMetadataSerializer(data=result)
serializer.is_valid(raise_exception=True)
return serializer.data

View File

@@ -1,222 +0,0 @@
from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.response import Response
from api.exceptions import (
TaskFailedException,
TaskInProgressException,
TaskNotFoundException,
)
from api.models import StateChoices, Task
from api.v1.serializers import TaskSerializer
class PaginateByPkMixin:
"""
Mixin to paginate on a list of PKs (cheaper than heavy JOINs),
re-fetch the full objects with the desired select/prefetch,
re-sort them to preserve DB ordering, then serialize + return.
"""
def paginate_by_pk(
self,
request, # noqa: F841
base_queryset,
manager,
select_related: list[str] | None = None,
prefetch_related: list[str] | None = None,
) -> Response:
pk_list = base_queryset.values_list("id", flat=True)
page = self.paginate_queryset(pk_list)
if page is None:
return Response(self.get_serializer(base_queryset, many=True).data)
queryset = manager.filter(id__in=page)
if select_related:
queryset = queryset.select_related(*select_related)
if prefetch_related:
queryset = queryset.prefetch_related(*prefetch_related)
queryset = sorted(queryset, key=lambda obj: page.index(obj.id))
serialized = self.get_serializer(queryset, many=True).data
return self.get_paginated_response(serialized)
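A hypothetical caller, to make the contract concrete: the mixin relies on the standard DRF pagination and serializer hooks, plus a manager to re-fetch the page of objects from. FindingViewSet, Finding.all_objects, and the related names below are illustrative, not taken from this diff:

from rest_framework import viewsets


class FindingViewSet(PaginateByPkMixin, viewsets.ReadOnlyModelViewSet):
    def list(self, request, *args, **kwargs):
        base_queryset = self.filter_queryset(self.get_queryset())
        return self.paginate_by_pk(
            request,
            base_queryset,
            manager=Finding.all_objects,  # hypothetical manager
            select_related=["scan"],
            prefetch_related=["resources"],
        )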
class TaskManagementMixin:
"""
Mixin to manage task status checking.
This mixin provides functionality to check if a task with specific parameters
is running, completed, failed, or doesn't exist. It returns the task when running
and raises specific exceptions for failed/not found scenarios that can be handled
at the view level.
"""
def check_task_status(
self,
task_name: str,
task_kwargs: dict,
raise_on_failed: bool = True,
raise_on_not_found: bool = True,
) -> Task | None:
"""
Check the status of a task with given name and kwargs.
This method first checks for a related Task object, and if not found,
checks TaskResult directly. If a TaskResult is found and running but
there's no related Task, it raises TaskInProgressException.
Args:
task_name (str): The name of the task to check
task_kwargs (dict): The kwargs to match against the task
raise_on_failed (bool): Whether to raise exception if task failed
raise_on_not_found (bool): Whether to raise exception if task not found
Returns:
Task | None: The task instance if found (regardless of state), None if not found and raise_on_not_found=False
Raises:
TaskFailedException: If task failed and raise_on_failed=True
TaskNotFoundException: If task not found and raise_on_not_found=True
TaskInProgressException: If task is running but no related Task object exists
"""
# First, try to find a Task object with related TaskResult
try:
# Build the filter for task kwargs
task_filter = {
"task_runner_task__task_name": task_name,
}
# Add kwargs filters - every expected value must appear in the stored task kwargs.
# Chain one .filter() per value; writing each into the same dict key would
# silently keep only the last value.
task_queryset = Task.objects.filter(**task_filter)
for value in task_kwargs.values():
task_queryset = task_queryset.filter(
task_runner_task__task_kwargs__contains=str(value)
)
task = (
task_queryset.select_related("task_runner_task")
.order_by("-inserted_at")
.first()
)
if task:
# Get task state using the same logic as TaskSerializer
task_state_mapping = {
"PENDING": StateChoices.AVAILABLE,
"STARTED": StateChoices.EXECUTING,
"PROGRESS": StateChoices.EXECUTING,
"SUCCESS": StateChoices.COMPLETED,
"FAILURE": StateChoices.FAILED,
"REVOKED": StateChoices.CANCELLED,
}
celery_status = (
task.task_runner_task.status if task.task_runner_task else None
)
task_state = task_state_mapping.get(
celery_status or "", StateChoices.AVAILABLE
)
# Check task state and raise exceptions accordingly
if task_state in (StateChoices.FAILED, StateChoices.CANCELLED):
if raise_on_failed:
raise TaskFailedException(task=task)
return task
elif task_state == StateChoices.COMPLETED:
return None
return task
except Task.DoesNotExist:
pass
# If no Task found, check TaskResult directly
try:
# Build the filter for TaskResult
task_result_filter = {
"task_name": task_name,
}
# Add kwargs filters - chain one .filter() per value, for the same reason as
# above; a single dict key would be overwritten on every iteration.
result_queryset = TaskResult.objects.filter(**task_result_filter)
for value in task_kwargs.values():
result_queryset = result_queryset.filter(task_kwargs__contains=str(value))
task_result = result_queryset.order_by("-date_created").first()
if task_result:
# Check if the TaskResult indicates a running task
if task_result.status in ["PENDING", "STARTED", "PROGRESS"]:
# Task is running but no related Task object exists
raise TaskInProgressException(task_result=task_result)
elif task_result.status == "FAILURE":
if raise_on_failed:
raise TaskFailedException(task=None)
# For other statuses (SUCCESS, REVOKED), we don't have a Task to return,
# so we treat it as not found
except TaskResult.DoesNotExist:
pass
# No task found at all
if raise_on_not_found:
raise TaskNotFoundException()
return None
def get_task_response_if_running(
self,
task_name: str,
task_kwargs: dict,
raise_on_failed: bool = True,
raise_on_not_found: bool = True,
) -> Response | None:
"""
Get a 202 response with task details if the task is currently running.
This method is useful for endpoints that should return task status when
a background task is in progress, similar to the compliance overview endpoints.
Args:
task_name (str): The name of the task to check
task_kwargs (dict): The kwargs to match against the task
raise_on_failed (bool): Whether to raise an exception if the task failed
raise_on_not_found (bool): Whether to raise an exception if the task is not found
Returns:
Response | None: 202 response with task details if running, None otherwise
"""
task = self.check_task_status(
task_name=task_name,
task_kwargs=task_kwargs,
raise_on_failed=raise_on_failed,
raise_on_not_found=raise_on_not_found,
)
if not task:
return None
# Get task state
task_state_mapping = {
"PENDING": StateChoices.AVAILABLE,
"STARTED": StateChoices.EXECUTING,
"PROGRESS": StateChoices.EXECUTING,
"SUCCESS": StateChoices.COMPLETED,
"FAILURE": StateChoices.FAILED,
"REVOKED": StateChoices.CANCELLED,
}
celery_status = task.task_runner_task.status if task.task_runner_task else None
task_state = task_state_mapping.get(celery_status or "", StateChoices.AVAILABLE)
if task_state == StateChoices.EXECUTING:
self.response_serializer_class = TaskSerializer
serializer = TaskSerializer(task)
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse("task-detail", kwargs={"pk": task.id})
},
)
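A hypothetical view action wiring the two mixin methods together: answer 202 with the serialized task while the background job is still executing, otherwise serve the result. Everything except the mixin methods is illustrative:

from rest_framework import status, viewsets
from rest_framework.response import Response


class ScanViewSet(TaskManagementMixin, viewsets.ReadOnlyModelViewSet):
    def report(self, request, pk=None):
        running = self.get_task_response_if_running(
            task_name="generate-scan-report",  # hypothetical task name
            task_kwargs={"scan_id": pk},
            raise_on_not_found=False,
        )
        if running is not None:
            return running  # 202 Accepted with the serialized task
        # build_report is a hypothetical helper standing in for the real result
        return Response(build_report(pk), status=status.HTTP_200_OK)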

View File

@@ -1,183 +0,0 @@
from drf_spectacular.utils import extend_schema_field
from rest_framework_json_api import serializers
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn", "external_id"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "M365 Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
"user": {
"type": "email",
"description": "User microsoft email address.",
},
"password": {
"type": "string",
"description": "User password.",
},
},
"required": [
"client_id",
"client_secret",
"tenant_id",
"user",
"password",
],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "GCP Service Account Key",
"properties": {
"service_account_key": {
"type": "object",
"description": "The service account key for GCP.",
}
},
"required": ["service_account_key"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
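For reference, a secret payload matching the "AWS Assume Role" branch of the schema above would look like the following; every value is a placeholder:

aws_assume_role_secret = {
    "role_arn": "arn:aws:iam::123456789012:role/ProwlerScan",  # required
    "external_id": "example-external-id",  # required
    "session_duration": 3600,  # optional, 900-43200 seconds, defaults to 3600
    "role_session_name": "MySession123",  # optional, must match ^[a-zA-Z0-9=,.@_-]+$
}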

View File

@@ -14,12 +14,12 @@ from rest_framework_simplejwt.serializers import TokenObtainPairSerializer
from rest_framework_simplejwt.tokens import RefreshToken
from api.models import (
ComplianceOverview,
Finding,
Integration,
IntegrationProviderRelationship,
Invitation,
InvitationRoleRelationship,
LighthouseConfiguration,
Membership,
Provider,
ProviderGroup,
@@ -29,10 +29,8 @@ from api.models import (
ResourceTag,
Role,
RoleProviderGroupRelationship,
SAMLConfiguration,
Scan,
StateChoices,
StatusChoices,
Task,
User,
UserRoleRelationship,
@@ -44,7 +42,6 @@ from api.v1.serializer_utils.integrations import (
IntegrationCredentialField,
S3ConfigSerializer,
)
from api.v1.serializer_utils.providers import ProviderSecretField
# Tokens
@@ -130,12 +127,6 @@ class TokenSerializer(BaseTokenSerializer):
class TokenSocialLoginSerializer(BaseTokenSerializer):
email = serializers.EmailField(write_only=True)
tenant_id = serializers.UUIDField(
write_only=True,
required=False,
help_text="If not provided, the tenant ID of the first membership that was added"
" to the user will be used.",
)
# Output tokens
refresh = serializers.CharField(read_only=True)
@@ -968,15 +959,6 @@ class ScanReportSerializer(serializers.Serializer):
fields = ["id"]
class ScanComplianceReportSerializer(serializers.Serializer):
id = serializers.CharField(source="scan")
name = serializers.CharField()
class Meta:
resource_name = "scan-reports"
fields = ["id", "name"]
class ResourceTagSerializer(RLSSerializer):
"""
Serializer for the ResourceTag model
@@ -1159,16 +1141,12 @@ class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
serializer = GCPProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.KUBERNETES.value:
serializer = KubernetesProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.M365.value:
serializer = M365ProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"provider": f"Provider type not supported {provider_type}"}
)
elif secret_type == ProviderSecret.TypeChoices.ROLE:
serializer = AWSRoleAssumptionProviderSecret(data=secret)
elif secret_type == ProviderSecret.TypeChoices.SERVICE_ACCOUNT:
serializer = GCPServiceAccountProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"secret_type": f"Secret type not supported: {secret_type}"}
@@ -1202,17 +1180,6 @@ class AzureProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
class M365ProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
tenant_id = serializers.CharField()
user = serializers.EmailField(required=False)
password = serializers.CharField(required=False)
class Meta:
resource_name = "provider-secrets"
class GCPProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
@@ -1222,13 +1189,6 @@ class GCPProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
class GCPServiceAccountProviderSecret(serializers.Serializer):
service_account_key = serializers.JSONField()
class Meta:
resource_name = "provider-secrets"
class KubernetesProviderSecret(serializers.Serializer):
kubeconfig_content = serializers.CharField()
@@ -1251,6 +1211,141 @@ class AWSRoleAssumptionProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn", "external_id"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
class ProviderSecretSerializer(RLSSerializer):
"""
Serializer for the ProviderSecret model.
@@ -1315,13 +1410,12 @@ class ProviderSecretUpdateSerializer(BaseWriteProviderSecretSerializer):
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
"provider": {"read_only": True},
"secret_type": {"required": False},
"secret_type": {"read_only": True},
}
def validate(self, attrs):
provider = self.instance.provider
# To allow updating a secret with the same type without making the `secret_type` mandatory
secret_type = attrs.get("secret_type") or self.instance.secret_type
secret_type = self.instance.secret_type
secret = attrs.get("secret")
validated_attrs = super().validate(attrs)
@@ -1688,63 +1782,130 @@ class RoleProviderGroupRelationshipSerializer(RLSSerializer, BaseWriteSerializer
# Compliance overview
class ComplianceOverviewSerializer(serializers.Serializer):
class ComplianceOverviewSerializer(RLSSerializer):
"""
Serializer for compliance requirement status aggregated by compliance framework.
This serializer is used to format aggregated compliance framework data,
providing counts of passed, failed, and manual requirements along with
an overall global status for each framework.
Serializer for the ComplianceOverview model.
"""
# Add ID field which will be used for resource identification
id = serializers.CharField()
framework = serializers.CharField()
version = serializers.CharField()
requirements_passed = serializers.IntegerField()
requirements_failed = serializers.IntegerField()
requirements_manual = serializers.IntegerField()
total_requirements = serializers.IntegerField()
requirements_status = serializers.SerializerMethodField(
read_only=True, method_name="get_requirements_status"
)
provider_type = serializers.SerializerMethodField(read_only=True)
class JSONAPIMeta:
resource_name = "compliance-overviews"
class Meta:
model = ComplianceOverview
fields = [
"id",
"inserted_at",
"compliance_id",
"framework",
"version",
"requirements_status",
"region",
"provider_type",
"scan",
"url",
]
@extend_schema_field(
{
"type": "object",
"properties": {
"passed": {"type": "integer"},
"failed": {"type": "integer"},
"manual": {"type": "integer"},
"total": {"type": "integer"},
},
}
)
def get_requirements_status(self, obj):
return {
"passed": obj.requirements_passed,
"failed": obj.requirements_failed,
"manual": obj.requirements_manual,
"total": obj.total_requirements,
}
@extend_schema_field(serializers.CharField(allow_null=True))
def get_provider_type(self, obj):
"""
Retrieves the provider type from scan.provider.provider.
"""
try:
return obj.scan.provider.provider
except AttributeError:
return None
class ComplianceOverviewDetailSerializer(serializers.Serializer):
"""
Serializer for detailed compliance requirement information.
class ComplianceOverviewFullSerializer(ComplianceOverviewSerializer):
requirements = serializers.SerializerMethodField(read_only=True)
This serializer formats the aggregated requirement data, showing detailed status
and counts for each requirement across all regions.
"""
class Meta(ComplianceOverviewSerializer.Meta):
fields = ComplianceOverviewSerializer.Meta.fields + [
"description",
"requirements",
]
id = serializers.CharField()
framework = serializers.CharField()
version = serializers.CharField()
description = serializers.CharField()
status = serializers.ChoiceField(choices=StatusChoices.choices)
class JSONAPIMeta:
resource_name = "compliance-requirements-details"
class ComplianceOverviewAttributesSerializer(serializers.Serializer):
id = serializers.CharField()
framework_description = serializers.CharField()
name = serializers.CharField()
framework = serializers.CharField()
version = serializers.CharField()
description = serializers.CharField()
attributes = serializers.JSONField()
class JSONAPIMeta:
resource_name = "compliance-requirements-attributes"
@extend_schema_field(
{
"type": "object",
"properties": {
"requirement_id": {
"type": "object",
"properties": {
"name": {"type": "string"},
"checks": {
"type": "object",
"properties": {
"check_name": {
"type": "object",
"properties": {
"status": {
"type": "string",
"enum": ["PASS", "FAIL", None],
},
},
}
},
"description": "Each key in the 'checks' object is a check name, with values as "
"'PASS', 'FAIL', or null.",
},
"status": {
"type": "string",
"enum": ["PASS", "FAIL", "MANUAL"],
},
"attributes": {
"type": "array",
"items": {
"type": "object",
},
},
"description": {"type": "string"},
"checks_status": {
"type": "object",
"properties": {
"total": {"type": "integer"},
"pass": {"type": "integer"},
"fail": {"type": "integer"},
"manual": {"type": "integer"},
},
},
},
}
},
}
)
def get_requirements(self, obj):
"""
Returns the detailed structure of requirements.
"""
return obj.requirements
class ComplianceOverviewMetadataSerializer(serializers.Serializer):
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
class JSONAPIMeta:
class Meta:
resource_name = "compliance-overviews-metadata"
@@ -2070,156 +2231,3 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
return super().update(instance, validated_data)
# SSO
class SamlInitiateSerializer(serializers.Serializer):
email_domain = serializers.CharField()
class JSONAPIMeta:
resource_name = "saml-initiate"
class SamlMetadataSerializer(serializers.Serializer):
class JSONAPIMeta:
resource_name = "saml-meta"
class SAMLConfigurationSerializer(RLSSerializer):
class Meta:
model = SAMLConfiguration
fields = ["id", "email_domain", "metadata_xml", "created_at", "updated_at"]
read_only_fields = ["id", "created_at", "updated_at"]
class LighthouseConfigSerializer(RLSSerializer):
"""
Serializer for the LighthouseConfig model.
"""
api_key = serializers.CharField(required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
"url",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def to_representation(self, instance):
data = super().to_representation(instance)
# Check if api_key is specifically requested in fields param
fields_param = self.context.get("request", None) and self.context[
"request"
].query_params.get("fields[lighthouse-config]", "")
if fields_param == "api_key":
# Return decrypted key if specifically requested
data["api_key"] = instance.api_key_decoded if instance.api_key else None
else:
# Return masked key for general requests
data["api_key"] = "*" * len(instance.api_key) if instance.api_key else None
return data
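In other words, assuming the lighthouse-configurations route registered with the router (the /api/v1 prefix is illustrative):

# GET /api/v1/lighthouse-configurations/<id>
#   -> "api_key": "************"   (masked, one '*' per stored byte)
# GET /api/v1/lighthouse-configurations/<id>?fields[lighthouse-config]=api_key
#   -> "api_key": "sk-..."         (decrypted via api_key_decoded)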
class LighthouseConfigCreateSerializer(RLSSerializer, BaseWriteSerializer):
"""Serializer for creating new Lighthouse configurations."""
api_key = serializers.CharField(write_only=True, required=True)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def validate(self, attrs):
tenant_id = self.context.get("request").tenant_id
if LighthouseConfiguration.objects.filter(tenant_id=tenant_id).exists():
raise serializers.ValidationError(
{
"tenant_id": "Lighthouse configuration already exists for this tenant."
}
)
return super().validate(attrs)
def create(self, validated_data):
api_key = validated_data.pop("api_key")
instance = super().create(validated_data)
instance.api_key_decoded = api_key
instance.save()
return instance
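# Note: `api_key` is popped before `super().create()`, so the raw key never
# passes through `validated_data`; it is applied afterwards via the model's
# `api_key_decoded` setter, which presumably stores it encrypted.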
def to_representation(self, instance):
data = super().to_representation(instance)
# Always mask the API key in the response
data["api_key"] = "*" * len(instance.api_key) if instance.api_key else None
return data
class LighthouseConfigUpdateSerializer(BaseWriteSerializer):
"""
Serializer for updating LighthouseConfig instances.
"""
api_key = serializers.CharField(write_only=True, required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"name": {"required": False},
"model": {"required": False},
"temperature": {"required": False},
"max_tokens": {"required": False},
}
def update(self, instance, validated_data):
api_key = validated_data.pop("api_key", None)
instance = super().update(instance, validated_data)
if api_key:
instance.api_key_decoded = api_key
instance.save()
return instance

View File

@@ -1,11 +1,9 @@
from allauth.socialaccount.providers.saml.views import ACSView, MetadataView, SLSView
from django.urls import include, path
from drf_spectacular.views import SpectacularRedocView
from rest_framework_nested import routers
from api.v1.views import (
ComplianceOverviewViewSet,
CustomSAMLLoginView,
CustomTokenObtainView,
CustomTokenRefreshView,
CustomTokenSwitchTenantView,
@@ -15,7 +13,6 @@ from api.v1.views import (
IntegrationViewSet,
InvitationAcceptViewSet,
InvitationViewSet,
LighthouseConfigViewSet,
MembershipViewSet,
OverviewViewSet,
ProviderGroupProvidersRelationshipView,
@@ -25,14 +22,10 @@ from api.v1.views import (
ResourceViewSet,
RoleProviderGroupRelationshipView,
RoleViewSet,
SAMLConfigurationViewSet,
SAMLInitiateAPIView,
SAMLTokenValidateView,
ScanViewSet,
ScheduleViewSet,
SchemaView,
TaskViewSet,
TenantFinishACSView,
TenantMembersViewSet,
TenantViewSet,
UserRoleRelationshipView,
@@ -56,12 +49,6 @@ router.register(
router.register(r"overviews", OverviewViewSet, basename="overview")
router.register(r"schedules", ScheduleViewSet, basename="schedule")
router.register(r"integrations", IntegrationViewSet, basename="integration")
router.register(r"saml-config", SAMLConfigurationViewSet, basename="saml-config")
router.register(
r"lighthouse-configurations",
LighthouseConfigViewSet,
basename="lighthouseconfiguration",
)
tenants_router = routers.NestedSimpleRouter(router, r"tenants", lookup="tenant")
tenants_router.register(
@@ -125,36 +112,6 @@ urlpatterns = [
),
name="provider_group-providers-relationship",
),
# API endpoint to start SAML SSO flow
path(
"auth/saml/initiate/", SAMLInitiateAPIView.as_view(), name="api_saml_initiate"
),
path(
"accounts/saml/<organization_slug>/login/",
CustomSAMLLoginView.as_view(),
name="saml_login",
),
path(
"accounts/saml/<organization_slug>/acs/",
ACSView.as_view(),
name="saml_acs",
),
path(
"accounts/saml/<organization_slug>/acs/finish/",
TenantFinishACSView.as_view(),
name="saml_finish_acs",
),
path(
"accounts/saml/<organization_slug>/sls/",
SLSView.as_view(),
name="saml_sls",
),
path(
"accounts/saml/<organization_slug>/metadata/",
MetadataView.as_view(),
name="saml_metadata",
),
path("tokens/saml", SAMLTokenValidateView.as_view(), name="token-saml"),
path("tokens/google", GoogleSocialLoginView.as_view(), name="token-google"),
path("tokens/github", GithubSocialLoginView.as_view(), name="token-github"),
path("", include(router.urls)),

File diff suppressed because it is too large

View File

@@ -1,13 +1,6 @@
import warnings
from celery import Celery, Task
from config.env import env
# Suppress specific warnings from dj-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
warnings.filterwarnings(
"ignore", category=UserWarning, module="dj_rest_auth.registration.serializers"
)
BROKER_VISIBILITY_TIMEOUT = env.int("DJANGO_BROKER_VISIBILITY_TIMEOUT", default=86400)
celery_app = Celery("tasks")

View File

@@ -10,8 +10,6 @@ from config.settings.social_login import * # noqa
SECRET_KEY = env("SECRET_KEY", default="secret")
DEBUG = env.bool("DJANGO_DEBUG", default=False)
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
USE_X_FORWARDED_HOST = True
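# Note: trusting X-Forwarded-Proto is only safe behind a reverse proxy that
# strips or overwrites this header on incoming requests; otherwise clients
# could spoof an HTTPS connection.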
# Application definition
@@ -28,19 +26,16 @@ INSTALLED_APPS = [
"rest_framework",
"corsheaders",
"drf_spectacular",
"drf_spectacular_jsonapi",
"django_guid",
"rest_framework_json_api",
"django_celery_results",
"django_celery_beat",
"rest_framework_simplejwt.token_blacklist",
"allauth",
"django.contrib.sites",
"allauth.account",
"allauth.socialaccount",
"allauth.socialaccount.providers.google",
"allauth.socialaccount.providers.github",
"allauth.socialaccount.providers.saml",
"dj_rest_auth.registration",
"rest_framework.authtoken",
]
@@ -116,7 +111,6 @@ SPECTACULAR_SETTINGS = {
"PREPROCESSING_HOOKS": [
"drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
],
"TITLE": "API Reference - Prowler",
}
WSGI_APPLICATION = "config.wsgi.application"
@@ -243,13 +237,8 @@ DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# SAML requirement
CSRF_COOKIE_SECURE = True
SESSION_COOKIE_SECURE = True

View File

@@ -39,9 +39,6 @@ IGNORED_EXCEPTIONS = [
"RequestExpired",
"ConnectionClosedError",
"MaxRetryError",
"AWSAccessKeyIDInvalidError",
"AWSSessionTokenExpiredError",
"EndpointConnectionError", # AWS Service is not available in a region
"Pool is closed", # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
# Authentication Errors from GCP
"ClientAuthenticationError",
@@ -66,6 +63,8 @@ IGNORED_EXCEPTIONS = [
"AzureClientIdAndClientSecretNotBelongingToTenantIdError",
"AzureHTTPResponseError",
"Error with credentials provided",
# AWS Service is not available in a region
"EndpointConnectionError",
]
@@ -79,16 +78,9 @@ def before_send(event, hint):
log_msg = hint["log_record"].msg
log_lvl = hint["log_record"].levelno
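# Note: levelno 40 is logging.ERROR; `log_lvl <= 40` matches ERROR and below,
# while CRITICAL (50) is not filtered by that branch.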
# Handle Error and Critical events and discard the rest
if log_lvl <= 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
# Ignore exceptions with the ignored_exceptions
if "exc_info" in hint and hint["exc_info"]:
exc_value = str(hint["exc_info"][1])
if any(ignored in exc_value for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
# Handle Error events and discard the rest
if log_lvl == 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return
return event
@@ -104,6 +96,4 @@ sentry_sdk.init(
# possible.
"continuous_profiling_auto_start": True,
},
attach_stacktrace=True,
ignore_errors=IGNORED_EXCEPTIONS,
)

View File

@@ -11,7 +11,8 @@ GITHUB_OAUTH_CALLBACK_URL = env("SOCIAL_GITHUB_OAUTH_CALLBACK_URL", default="")
# Allauth settings
ACCOUNT_LOGIN_METHODS = {"email"} # Use Email / Password authentication
ACCOUNT_SIGNUP_FIELDS = ["email*", "password1*", "password2*"]
ACCOUNT_USERNAME_REQUIRED = False
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_EMAIL_VERIFICATION = "none" # Do not require email confirmation
ACCOUNT_USER_MODEL_USERNAME_FIELD = None
REST_AUTH = {
@@ -24,20 +25,6 @@ SOCIALACCOUNT_EMAIL_AUTHENTICATION = True
# Connect local account and social account if local account with that email address already exists
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = True
SOCIALACCOUNT_ADAPTER = "api.adapters.ProwlerSocialAccountAdapter"
# def inline(pem: str) -> str:
# return "".join(
# line.strip()
# for line in pem.splitlines()
# if "CERTIFICATE" not in line and "KEY" not in line
# )
# # SAML keys (TODO: Validate certificates)
# SAML_PUBLIC_CERT = inline(env("SAML_PUBLIC_CERT", default=""))
# SAML_PRIVATE_KEY = inline(env("SAML_PRIVATE_KEY", default=""))
SOCIALACCOUNT_PROVIDERS = {
"google": {
"APP": {
@@ -63,20 +50,4 @@ SOCIALACCOUNT_PROVIDERS = {
"read:org",
],
},
"saml": {
"use_nameid_for_email": True,
"sp": {
"entity_id": "urn:prowler.com:sp",
},
"advanced": {
# TODO: Validate certificates
# "x509cert": SAML_PUBLIC_CERT,
# "private_key": SAML_PRIVATE_KEY,
# "authn_request_signed": True,
# "want_message_signed": True,
# "want_assertion_signed": True,
"reject_idp_initiated_sso": False,
"name_id_format": "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress",
},
},
}

View File

@@ -1,9 +1,8 @@
import logging
from datetime import datetime, timedelta, timezone
from unittest.mock import MagicMock, patch
from unittest.mock import patch
import pytest
from allauth.socialaccount.models import SocialLogin
from django.conf import settings
from django.db import connection as django_connection
from django.db import connections as django_connections
@@ -11,17 +10,14 @@ from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.test import APIClient
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.db_utils import rls_transaction
from api.models import (
ComplianceOverview,
ComplianceRequirementOverview,
Finding,
Integration,
IntegrationProviderRelationship,
Invitation,
LighthouseConfiguration,
Membership,
Provider,
ProviderGroup,
@@ -29,12 +25,9 @@ from api.models import (
Resource,
ResourceTag,
Role,
SAMLConfiguration,
SAMLDomainIndex,
Scan,
ScanSummary,
StateChoices,
StatusChoices,
Task,
User,
UserRoleRelationship,
@@ -381,14 +374,8 @@ def providers_fixture(tenants_fixture):
tenant_id=tenant.id,
scanner_args={"key1": "value1", "key2": {"key21": "value21"}},
)
provider6 = Provider.objects.create(
provider="m365",
uid="m365.test.com",
alias="m365_testing",
tenant_id=tenant.id,
)
return provider1, provider2, provider3, provider4, provider5, provider6
return provider1, provider2, provider3, provider4, provider5
@pytest.fixture
@@ -789,131 +776,6 @@ def compliance_overviews_fixture(scans_fixture, tenants_fixture):
return compliance_overview1, compliance_overview2
@pytest.fixture
def compliance_requirements_overviews_fixture(scans_fixture, tenants_fixture):
"""Fixture for ComplianceRequirementOverview objects used by the new ComplianceOverviewViewSet."""
tenant = tenants_fixture[0]
scan1, scan2, scan3 = scans_fixture
# Create ComplianceRequirementOverview objects for scan1
requirement_overview1 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="aws_account_security_onboarding_aws",
framework="AWS-Account-Security-Onboarding",
version="1.0",
description="Description for AWS Account Security Onboarding",
region="eu-west-1",
requirement_id="requirement1",
requirement_status=StatusChoices.PASS,
passed_checks=2,
failed_checks=0,
total_checks=2,
)
requirement_overview2 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="aws_account_security_onboarding_aws",
framework="AWS-Account-Security-Onboarding",
version="1.0",
description="Description for AWS Account Security Onboarding",
region="eu-west-1",
requirement_id="requirement2",
requirement_status=StatusChoices.PASS,
passed_checks=2,
failed_checks=0,
total_checks=2,
)
requirement_overview3 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="aws_account_security_onboarding_aws",
framework="AWS-Account-Security-Onboarding",
version="1.0",
description="Description for AWS Account Security Onboarding",
region="eu-west-2",
requirement_id="requirement1",
requirement_status=StatusChoices.PASS,
passed_checks=2,
failed_checks=0,
total_checks=2,
)
requirement_overview4 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="aws_account_security_onboarding_aws",
framework="AWS-Account-Security-Onboarding",
version="1.0",
description="Description for AWS Account Security Onboarding",
region="eu-west-2",
requirement_id="requirement2",
requirement_status=StatusChoices.FAIL,
passed_checks=1,
failed_checks=1,
total_checks=2,
)
requirement_overview5 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="aws_account_security_onboarding_aws",
framework="AWS-Account-Security-Onboarding",
version="1.0",
description="Description for AWS Account Security Onboarding (MANUAL)",
region="eu-west-2",
requirement_id="requirement3",
requirement_status=StatusChoices.MANUAL,
passed_checks=0,
failed_checks=0,
total_checks=0,
)
# Create a different compliance framework for testing
requirement_overview6 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="cis_1.4_aws",
framework="CIS-1.4-AWS",
version="1.4",
description="CIS AWS Foundations Benchmark v1.4.0",
region="eu-west-1",
requirement_id="cis_requirement1",
requirement_status=StatusChoices.FAIL,
passed_checks=0,
failed_checks=3,
total_checks=3,
)
# Create another compliance framework for testing MITRE ATT&CK
requirement_overview7 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="mitre_attack_aws",
framework="MITRE-ATTACK",
version="1.0",
description="MITRE ATT&CK",
region="eu-west-1",
requirement_id="mitre_requirement1",
requirement_status=StatusChoices.FAIL,
passed_checks=0,
failed_checks=0,
total_checks=0,
)
return (
requirement_overview1,
requirement_overview2,
requirement_overview3,
requirement_overview4,
requirement_overview5,
requirement_overview6,
requirement_overview7,
)
def get_api_tokens(
api_client, user_email: str, user_password: str, tenant_id: str = None
) -> tuple[str, str]:
@@ -1058,127 +920,6 @@ def integrations_fixture(providers_fixture):
return integration1, integration2
@pytest.fixture
def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
for scan_instance in scans_fixture:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@pytest.fixture
def lighthouse_config_fixture(authenticated_client, tenants_fixture):
return LighthouseConfiguration.objects.create(
tenant_id=tenants_fixture[0].id,
name="OpenAI",
api_key_decoded="sk-test1234567890T3BlbkFJtest1234567890",
model="gpt-4o",
temperature=0,
max_tokens=4000,
business_context="Test business context",
is_active=True,
)
@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
provider = providers_fixture[0]
tenant_id = str(providers_fixture[0].tenant_id)
resource = resources_fixture[0]
scan = Scan.objects.create(
name="latest completed scan",
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.COMPLETED,
tenant_id=tenant_id,
)
finding = Finding.objects.create(
tenant_id=tenant_id,
uid="test_finding_uid_1",
scan=scan,
delta="new",
status=Status.FAIL,
status_extended="test status extended ",
impact=Severity.critical,
impact_extended="test impact extended one",
severity=Severity.critical,
raw_result={
"status": Status.FAIL,
"impact": Severity.critical,
"severity": Severity.critical,
},
tags={"test": "dev-qa"},
check_id="test_check_id",
check_metadata={
"CheckId": "test_check_id",
"Description": "test description apple sauce",
},
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
return finding
@pytest.fixture
def saml_setup(tenants_fixture):
tenant_id = tenants_fixture[0].id
domain = "prowler.com"
SAMLDomainIndex.objects.create(email_domain=domain, tenant_id=tenant_id)
metadata_xml = """<?xml version='1.0' encoding='UTF-8'?>
<md:EntityDescriptor entityID='TEST' xmlns:md='urn:oasis:names:tc:SAML:2.0:metadata'>
<md:IDPSSODescriptor WantAuthnRequestsSigned='false' protocolSupportEnumeration='urn:oasis:names:tc:SAML:2.0:protocol'>
<md:KeyDescriptor use='signing'>
<ds:KeyInfo xmlns:ds='http://www.w3.org/2000/09/xmldsig#'>
<ds:X509Data>
<ds:X509Certificate>TEST</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress</md:NameIDFormat>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST' Location='https://TEST/sso/saml'/>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' Location='https://TEST/sso/saml'/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>
"""
SAMLConfiguration.objects.create(
tenant_id=str(tenant_id),
email_domain=domain,
metadata_xml=metadata_xml,
)
return {
"email": f"user@{domain}",
"domain": domain,
"tenant_id": tenant_id,
}
@pytest.fixture
def saml_sociallogin(users_fixture):
user = users_fixture[0]
user.email = "samlsso@acme.com"
extra_data = {
"firstName": ["Test"],
"lastName": ["User"],
"organization": ["Prowler"],
"userType": ["member"],
}
account = MagicMock()
account.provider = "saml"
account.extra_data = extra_data
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = account
sociallogin.user = user
return sociallogin
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}

View File

@@ -3,12 +3,6 @@
import os
import sys
import warnings
# Suppress specific warnings from dj-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
warnings.filterwarnings(
"ignore", category=UserWarning, module="dj_rest_auth.registration.serializers"
)
def main():

View File

@@ -1,61 +0,0 @@
from api.db_utils import rls_transaction
from api.models import (
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StateChoices,
)
def backfill_resource_scan_summaries(tenant_id: str, scan_id: str):
with rls_transaction(tenant_id):
if ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).exists():
return {"status": "already backfilled"}
with rls_transaction(tenant_id):
if not Scan.objects.filter(
tenant_id=tenant_id,
id=scan_id,
state__in=(StateChoices.COMPLETED, StateChoices.FAILED),
).exists():
return {"status": "scan is not completed"}
resource_ids_qs = (
ResourceFindingMapping.objects.filter(
tenant_id=tenant_id, finding__scan_id=scan_id
)
.values_list("resource_id", flat=True)
.distinct()
)
resource_ids = list(resource_ids_qs)
if not resource_ids:
return {"status": "no resources to backfill"}
resources_qs = Resource.objects.filter(
tenant_id=tenant_id, id__in=resource_ids
).only("id", "service", "region", "type")
summaries = []
for resource in resources_qs.iterator():
summaries.append(
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=str(resource.id),
service=resource.service,
region=resource.region,
resource_type=resource.type,
)
)
for i in range(0, len(summaries), 500):
ResourceScanSummary.objects.bulk_create(
summaries[i : i + 500], ignore_conflicts=True
)
return {"status": "backfilled", "inserted": len(summaries)}
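# A minimal usage sketch: the function is idempotent per (tenant, scan) and
# returns a small status dict, e.g.
#
#   result = backfill_resource_scan_summaries(tenant_id, scan_id)
#   assert result["status"] in {
#       "already backfilled",
#       "scan is not completed",
#       "no resources to backfill",
#       "backfilled",
#   }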

View File

@@ -1,9 +1,8 @@
from datetime import datetime, timezone
import openai
from celery.utils.log import get_task_logger
from api.models import LighthouseConfiguration, Provider
from api.models import Provider
from api.utils import prowler_provider_connection_test
logger = get_task_logger(__name__)
@@ -40,46 +39,3 @@ def check_provider_connection(provider_id: str):
connection_error = f"{connection_result.error}" if connection_result.error else None
return {"connected": connection_result.is_connected, "error": connection_error}
def check_lighthouse_connection(lighthouse_config_id: str):
"""
Business logic to check the connection status of a Lighthouse configuration.
Args:
lighthouse_config_id (str): The primary key of the LighthouseConfiguration instance to check.
Returns:
dict: A dictionary containing:
- 'connected' (bool): Indicates whether the connection is successful.
- 'error' (str or None): The error message if the connection failed, otherwise `None`.
- 'available_models' (list): List of available models if connection is successful.
Raises:
Model.DoesNotExist: If the lighthouse configuration does not exist.
"""
lighthouse_config = LighthouseConfiguration.objects.get(pk=lighthouse_config_id)
if not lighthouse_config.api_key_decoded:
lighthouse_config.is_active = False
lighthouse_config.save()
return {
"connected": False,
"error": "API key is invalid or missing.",
"available_models": [],
}
try:
client = openai.OpenAI(api_key=lighthouse_config.api_key_decoded)
models = client.models.list()
lighthouse_config.is_active = True
lighthouse_config.save()
return {
"connected": True,
"error": None,
"available_models": [model.id for model in models.data],
}
except Exception as e:
lighthouse_config.is_active = False
lighthouse_config.save()
return {"connected": False, "error": str(e), "available_models": []}
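# Usage sketch: callers only need the returned dict, e.g.
#
#   result = check_lighthouse_connection(str(config.id))
#   if result["connected"]:
#       print(result["available_models"])
#   else:
#       print(result["error"])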

View File

@@ -1,5 +1,4 @@
import os
import re
import zipfile
import boto3
@@ -14,42 +13,6 @@ from prowler.config.config import (
json_ocsf_file_suffix,
output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
from prowler.lib.outputs.compliance.ens.ens_gcp import GCPENS
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_azure import AzureISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
KubernetesISO27001,
)
from prowler.lib.outputs.compliance.iso27001.iso27001_m365 import M365ISO27001
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_aws import (
ProwlerThreatScoreAWS,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_azure import (
ProwlerThreatScoreAzure,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_gcp import (
ProwlerThreatScoreGCP,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_m365 import (
ProwlerThreatScoreM365,
)
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
@@ -57,45 +20,6 @@ from prowler.lib.outputs.ocsf.ocsf import OCSF
logger = get_task_logger(__name__)
COMPLIANCE_CLASS_MAP = {
"aws": [
(lambda name: name.startswith("cis_"), AWSCIS),
(lambda name: name == "mitre_attack_aws", AWSMitreAttack),
(lambda name: name.startswith("ens_"), AWSENS),
(
lambda name: name.startswith("aws_well_architected_framework"),
AWSWellArchitected,
),
(lambda name: name.startswith("iso27001_"), AWSISO27001),
(lambda name: name.startswith("kisa"), AWSKISAISMSP),
(lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
],
"azure": [
(lambda name: name.startswith("cis_"), AzureCIS),
(lambda name: name == "mitre_attack_azure", AzureMitreAttack),
(lambda name: name.startswith("ens_"), AzureENS),
(lambda name: name.startswith("iso27001_"), AzureISO27001),
(lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
],
"gcp": [
(lambda name: name.startswith("cis_"), GCPCIS),
(lambda name: name == "mitre_attack_gcp", GCPMitreAttack),
(lambda name: name.startswith("ens_"), GCPENS),
(lambda name: name.startswith("iso27001_"), GCPISO27001),
(lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
],
"kubernetes": [
(lambda name: name.startswith("cis_"), KubernetesCIS),
(lambda name: name.startswith("iso27001_"), KubernetesISO27001),
],
"m365": [
(lambda name: name.startswith("cis_"), M365CIS),
(lambda name: name == "prowler_threatscore_m365", ProwlerThreatScoreM365),
(lambda name: name.startswith("iso27001_"), M365ISO27001),
],
}
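# A small sketch of how this map is consumed (mirroring the lookup in the
# compliance CSV loop further below): the first matching predicate wins,
# otherwise a generic writer is used. `_resolve_compliance_class` is a
# hypothetical helper, and it assumes GenericCompliance is importable here.
def _resolve_compliance_class(provider_type: str, framework_name: str):
    for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
        if condition(framework_name):
            return cls
    return GenericCompliance  # fallback, as in the export loop below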
# Predefined mapping for output formats and their configurations
OUTPUT_FORMATS_MAPPING = {
"csv": {
@@ -119,17 +43,13 @@ def _compress_output_files(output_directory: str) -> str:
str: The full path to the newly created ZIP archive.
"""
zip_path = f"{output_directory}.zip"
parent_dir = os.path.dirname(output_directory)
zip_path_abs = os.path.abspath(zip_path)
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
for foldername, _, filenames in os.walk(parent_dir):
for filename in filenames:
file_path = os.path.join(foldername, filename)
if os.path.abspath(file_path) == zip_path_abs:
continue
arcname = os.path.relpath(file_path, start=parent_dir)
zipf.write(file_path, arcname)
for suffix in [config["suffix"] for config in OUTPUT_FORMATS_MAPPING.values()]:
zipf.write(
f"{output_directory}{suffix}",
f"output/{output_directory.split('/')[-1]}{suffix}",
)
return zip_path
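# Resulting archive layout: one entry per configured format, each stored as
# output/<basename><suffix>, e.g. (suffix names illustrative)
#   output/prowler-output-<provider>-<timestamp>.csv
#   output/prowler-output-<provider>-<timestamp>.ocsf.json
# where the suffixes come from OUTPUT_FORMATS_MAPPING.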
@@ -182,38 +102,25 @@ def _upload_to_s3(tenant_id: str, zip_path: str, scan_id: str) -> str:
Note:
Upload errors (e.g. botocore.exceptions.ClientError) are caught and logged rather than raised.
"""
bucket = base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET
if not bucket:
return None
if not base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET:
return
try:
s3 = get_s3_client()
# Upload the ZIP file (outputs) to the S3 bucket
zip_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3.upload_file(
Filename=zip_path,
Bucket=bucket,
Key=zip_key,
Bucket=base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET,
Key=s3_key,
)
# Upload the compliance directory to the S3 bucket
compliance_dir = os.path.join(os.path.dirname(zip_path), "compliance")
for filename in os.listdir(compliance_dir):
local_path = os.path.join(compliance_dir, filename)
if not os.path.isfile(local_path):
continue
file_key = f"{tenant_id}/{scan_id}/compliance/{filename}"
s3.upload_file(Filename=local_path, Bucket=bucket, Key=file_key)
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{zip_key}"
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{s3_key}"
except (ClientError, NoCredentialsError, ParamValidationError, ValueError) as e:
logger.error(f"S3 upload failed: {str(e)}")
def _generate_output_directory(
output_directory, prowler_provider: object, tenant_id: str, scan_id: str
) -> tuple[str, str]:
) -> str:
"""
Generate a file system path for the output directory of a prowler scan.
@@ -238,22 +145,12 @@ def _generate_output_directory(
Example:
>>> _generate_output_directory("/tmp", "aws", "tenant-1234", "scan-5678")
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56',
'/tmp/tenant-1234/aws/scan-5678/compliance/prowler-output-2023-02-15T12:34:56'
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56'
"""
# Sanitize the prowler provider name to ensure it is a valid directory name
prowler_provider_sanitized = re.sub(r"[^\w\-]", "-", prowler_provider)
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
f"{prowler_provider}-{output_file_timestamp}"
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
return path, compliance_path
return path

View File

@@ -13,20 +13,19 @@ from api.compliance import (
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE,
generate_scan_compliance,
)
from api.db_utils import create_objects_in_batches, rls_transaction
from api.db_utils import rls_transaction
from api.models import (
ComplianceRequirementOverview,
ComplianceOverview,
Finding,
Provider,
Resource,
ResourceScanSummary,
ResourceTag,
Scan,
ScanSummary,
StateChoices,
)
from api.models import StatusChoices as FindingStatus
from api.utils import initialize_prowler_provider, return_prowler_provider
from api.utils import initialize_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.outputs.finding import Finding as ProwlerFinding
from prowler.lib.scan.scan import Scan as ProwlerScan
@@ -119,11 +118,10 @@ def perform_prowler_scan(
ValueError: If the provider cannot be connected.
"""
check_status_by_region = {}
exception = None
unique_resources = set()
scan_resource_cache: set[tuple[str, str, str, str]] = set()
start_time = time.time()
exc = None
with rls_transaction(tenant_id):
provider_instance = Provider.objects.get(pk=provider_id)
@@ -139,7 +137,7 @@ def perform_prowler_scan(
provider_instance.connected = True
except Exception as e:
provider_instance.connected = False
exc = ValueError(
raise ValueError(
f"Provider {provider_instance.provider} is not connected: {e}"
)
finally:
@@ -148,12 +146,6 @@ def perform_prowler_scan(
)
provider_instance.save()
# If the provider is not connected, raise an exception outside the transaction.
# If raised within the transaction, the transaction will be rolled back and the provider will not be marked
# as not connected.
if exc:
raise exc
prowler_scan = ProwlerScan(provider=prowler_provider, checks=checks_to_execute)
resource_cache = {}
@@ -293,15 +285,15 @@ def perform_prowler_scan(
)
finding_instance.add_resources([resource_instance])
# Update scan resource summaries
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
)
)
# Update compliance data if applicable
if finding.status.value == "MUTED":
continue
region_dict = check_status_by_region.setdefault(finding.region, {})
current_status = region_dict.get(finding.check_id)
if current_status == "FAIL":
continue
region_dict[finding.check_id] = finding.status.value
# Update scan progress
with rls_transaction(tenant_id):
@@ -322,33 +314,66 @@ def perform_prowler_scan(
scan_instance.unique_resource_count = len(unique_resources)
scan_instance.save()
if exception is None:
try:
regions = prowler_provider.get_regions()
except AttributeError:
regions = set()
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance overview objects
compliance_overview_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
compliance_overview_objects.append(
ComplianceOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
description=compliance["description"],
requirements=compliance["requirements"],
requirements_passed=compliance["requirements_status"]["passed"],
requirements_failed=compliance["requirements_status"]["failed"],
requirements_manual=compliance["requirements_status"]["manual"],
total_requirements=compliance["total_requirements"],
)
)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=100
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
if exception is not None:
raise exception
try:
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries, batch_size=500, ignore_conflicts=True
)
except Exception as filter_exception:
import sentry_sdk
sentry_sdk.capture_exception(filter_exception)
logger.error(
f"Error storing filter values for scan {scan_id}: {filter_exception}"
)
serializer = ScanTaskSerializer(instance=scan_instance)
return serializer.data
@@ -503,114 +528,3 @@ def aggregate_findings(tenant_id: str, scan_id: str):
for agg in aggregation
}
ScanSummary.objects.bulk_create(scan_aggregations, batch_size=3000)
def create_compliance_requirements(tenant_id: str, scan_id: str):
"""
Create detailed compliance requirement overview records for a scan.
This function processes the compliance data collected during a scan and creates
individual records for each compliance requirement in each region. These detailed
records provide a granular view of compliance status.
Args:
tenant_id (str): The ID of the tenant for which to create records.
scan_id (str): The ID of the scan for which to create records.
Returns:
dict: A dictionary containing the number of requirements created and the regions processed.
Raises:
ValidationError: If tenant_id is not a valid UUID.
"""
try:
with rls_transaction(tenant_id):
scan_instance = Scan.objects.get(pk=scan_id)
provider_instance = scan_instance.provider
prowler_provider = return_prowler_provider(provider_instance)
# Get check status data by region from findings
check_status_by_region = {}
with rls_transaction(tenant_id):
findings = Finding.objects.filter(scan_id=scan_id, muted=False)
for finding in findings:
# Get region from resources
for resource in finding.resources.all():
region = resource.region
region_dict = check_status_by_region.setdefault(region, {})
current_status = region_dict.get(finding.check_id)
if current_status == "FAIL":
continue
region_dict[finding.check_id] = finding.status
try:
# Try to get regions from provider
regions = prowler_provider.get_regions()
except (AttributeError, Exception):
# If not available, use regions from findings
regions = set(check_status_by_region.keys())
# Get compliance template for the provider
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
# Create compliance data by region
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
# Apply check statuses to compliance data
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance requirement objects
compliance_requirement_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
# Create an overview record for each requirement within each compliance framework
for requirement_id, requirement in compliance["requirements"].items():
compliance_requirement_objects.append(
ComplianceRequirementOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
requirement_id=requirement_id,
description=requirement["description"],
passed_checks=requirement["checks_status"]["pass"],
failed_checks=requirement["checks_status"]["fail"],
total_checks=requirement["checks_status"]["total"],
requirement_status=requirement["status"],
)
)
# Bulk create requirement records
create_objects_in_batches(
tenant_id, ComplianceRequirementOverview, compliance_requirement_objects
)
return {
"requirements_created": len(compliance_requirement_objects),
"regions_processed": list(regions),
"compliance_frameworks": (
list(compliance_overview_by_region.get(list(regions)[0], {}).keys())
if regions
else []
),
}
except Exception as e:
logger.error(f"Error creating compliance requirements for scan {scan_id}: {e}")
raise e

View File

@@ -7,31 +7,22 @@ from celery.utils.log import get_task_logger
from config.celery import RLSTask
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
from django_celery_beat.models import PeriodicTask
from tasks.jobs.backfill import backfill_resource_scan_summaries
from tasks.jobs.connection import check_lighthouse_connection, check_provider_connection
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.export import (
COMPLIANCE_CLASS_MAP,
OUTPUT_FORMATS_MAPPING,
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
)
from tasks.jobs.scan import (
aggregate_findings,
create_compliance_requirements,
perform_prowler_scan,
)
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
from tasks.utils import batched, get_next_execution_datetime
from api.compliance import get_compliance_frameworks
from api.db_utils import rls_transaction
from api.decorators import set_tenant
from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
from api.utils import initialize_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
@@ -105,7 +96,6 @@ def perform_scan_task(
chain(
perform_scan_summary_task.si(tenant_id, scan_id),
create_compliance_requirements_task.si(tenant_id=tenant_id, scan_id=scan_id),
generate_outputs.si(
scan_id=scan_id, provider_id=provider_id, tenant_id=tenant_id
),
@@ -216,9 +206,6 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
chain(
perform_scan_summary_task.si(tenant_id, scan_instance.id),
create_compliance_requirements_task.si(
tenant_id=tenant_id, scan_id=str(scan_instance.id)
),
generate_outputs.si(
scan_id=str(scan_instance.id), provider_id=provider_id, tenant_id=tenant_id
),
@@ -264,153 +251,84 @@ def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
logger.info(f"No findings found for scan {scan_id}")
return {"upload": False}
provider_obj = Provider.objects.get(id=provider_id)
prowler_provider = initialize_prowler_provider(provider_obj)
provider_uid = provider_obj.uid
provider_type = provider_obj.provider
# Initialize the prowler provider
prowler_provider = initialize_prowler_provider(Provider.objects.get(id=provider_id))
frameworks_bulk = Compliance.get_bulk(provider_type)
frameworks_avail = get_compliance_frameworks(provider_type)
out_dir, comp_dir = _generate_output_directory(
# Get the provider UID
provider_uid = Provider.objects.get(id=provider_id).uid
# Generate and ensure the output directory exists
output_directory = _generate_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY, provider_uid, tenant_id, scan_id
)
def get_writer(writer_map, name, factory, is_last):
"""
Return existing writer_map[name] or create via factory().
In both cases set `.close_file = is_last`.
"""
initialization = False
if name not in writer_map:
writer_map[name] = factory()
initialization = True
w = writer_map[name]
w.close_file = is_last
return w, initialization
# Define auxiliary variables
output_writers = {}
compliance_writers = {}
scan_summary = FindingOutput._transform_findings_stats(
ScanSummary.objects.filter(scan_id=scan_id)
)
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Retrieve findings queryset
findings_qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid")
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
# Process findings in batches
for batch, is_last_batch in batched(
findings_qs.iterator(), DJANGO_FINDINGS_BATCH_SIZE
):
finding_outputs = [
FindingOutput.transform_api_finding(finding, prowler_provider)
for finding in batch
]
# Generate output files
for mode, config in OUTPUT_FORMATS_MAPPING.items():
kwargs = dict(config.get("kwargs", {}))
if mode == "html":
extra.update(provider=prowler_provider, stats=scan_summary)
kwargs["provider"] = prowler_provider
kwargs["stats"] = scan_summary
writer, initialization = get_writer(
output_writers,
cls,
lambda cls=cls, fos=fos, suffix=suffix: cls(
findings=fos,
file_path=out_dir,
file_extension=suffix,
writer_class = config["class"]
if writer_class in output_writers:
writer = output_writers[writer_class]
writer.transform(finding_outputs)
writer.close_file = is_last_batch
else:
writer = writer_class(
findings=finding_outputs,
file_path=output_directory,
file_extension=config["suffix"],
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos)
writer.batch_write_data_to_file(**extra)
writer._data.clear()
)
writer.close_file = is_last_batch
output_writers[writer_class] = writer
# Compliance CSVs
for name in frameworks_avail:
compliance_obj = frameworks_bulk[name]
# Write the current batch using the writer
writer.batch_write_data_to_file(**kwargs)
klass = GenericCompliance
for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
if condition(name):
klass = cls
break
# TODO: Refactor the output classes to avoid this manual reset
writer._data = []
filename = f"{comp_dir}_{name}.csv"
# Compress output files
output_directory = _compress_output_files(output_directory)
writer, initialization = get_writer(
compliance_writers,
name,
lambda klass=klass, fos=fos: klass(
findings=fos,
compliance=compliance_obj,
file_path=filename,
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos, compliance_obj, name)
writer.batch_write_data_to_file()
writer._data.clear()
# Save to configured storage
uploaded = _upload_to_s3(tenant_id, output_directory, scan_id)
compressed = _compress_output_files(out_dir)
upload_uri = _upload_to_s3(tenant_id, compressed, scan_id)
if upload_uri:
if uploaded:
# Remove the local files after upload
try:
rmtree(Path(compressed).parent, ignore_errors=True)
except Exception as e:
rmtree(Path(output_directory).parent, ignore_errors=True)
except FileNotFoundError as e:
logger.error(f"Error deleting output files: {e}")
final_location, did_upload = upload_uri, True
output_directory = uploaded
uploaded = True
else:
final_location, did_upload = compressed, False
uploaded = False
Scan.all_objects.filter(id=scan_id).update(output_location=final_location)
logger.info(f"Scan outputs at {final_location}")
return {"upload": did_upload}
# Update the scan instance with the output path
Scan.all_objects.filter(id=scan_id).update(output_location=output_directory)
logger.info(f"Scan output files generated, output location: {output_directory}")
@shared_task(name="backfill-scan-resource-summaries", queue="backfill")
def backfill_scan_resource_summaries_task(tenant_id: str, scan_id: str):
"""
Tries to backfill the resource scan summaries table for a given scan.
Args:
tenant_id (str): The tenant identifier.
scan_id (str): The scan identifier.
"""
return backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(base=RLSTask, name="scan-compliance-overviews")
def create_compliance_requirements_task(tenant_id: str, scan_id: str):
"""
Creates detailed compliance requirement records for a scan.
This task processes the compliance data collected during a scan and creates
individual records for each compliance requirement in each region. These detailed
records provide a granular view of compliance status.
Args:
tenant_id (str): The tenant ID for which to create records.
scan_id (str): The ID of the scan for which to create records.
"""
return create_compliance_requirements(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(base=RLSTask, name="lighthouse-connection-check")
@set_tenant
def check_lighthouse_connection_task(lighthouse_config_id: str, tenant_id: str = None):
"""
Task to check the connection status of a Lighthouse configuration.
Args:
lighthouse_config_id (str): The primary key of the LighthouseConfiguration instance to check.
tenant_id (str): The tenant ID for the task.
Returns:
dict: A dictionary containing:
- 'connected' (bool): Indicates whether the connection is successful.
- 'error' (str or None): The error message if the connection failed, otherwise `None`.
- 'available_models' (list): List of available models if connection is successful.
"""
return check_lighthouse_connection(lighthouse_config_id=lighthouse_config_id)
return {"upload": uploaded}

View File

@@ -1,79 +0,0 @@
from uuid import uuid4
import pytest
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.models import ResourceScanSummary, Scan, StateChoices
@pytest.mark.django_db
class TestBackfillResourceScanSummaries:
@pytest.fixture(scope="function")
def resource_scan_summary_data(self, scans_fixture):
scan = scans_fixture[0]
return ResourceScanSummary.objects.create(
tenant_id=scan.tenant_id,
scan_id=scan.id,
resource_id=str(uuid4()),
service="aws",
region="us-east-1",
resource_type="instance",
)
@pytest.fixture(scope="function")
def get_not_completed_scans(self, providers_fixture):
provider_id = providers_fixture[0].id
tenant_id = providers_fixture[0].tenant_id
scan_1 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.EXECUTING,
provider_id=provider_id,
)
scan_2 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.AVAILABLE,
provider_id=provider_id,
)
return scan_1, scan_2
def test_already_backfilled(self, resource_scan_summary_data):
tenant_id = resource_scan_summary_data.tenant_id
scan_id = resource_scan_summary_data.scan_id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "already backfilled"}
def test_not_completed_scan(self, get_not_completed_scans):
for scan_instance in get_not_completed_scans:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "scan is not completed"}
def test_successful_backfill_inserts_one_summary(
self, resources_fixture, findings_fixture
):
tenant_id = findings_fixture[0].tenant_id
scan_id = findings_fixture[0].scan_id
# This scan affects the first two resources
resources = resources_fixture[:2]
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "backfilled", "inserted": len(resources)}
# Verify correct values
summaries = ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
)
assert summaries.count() == len(resources)
for resource in resources:
summary = summaries.get(resource_id=resource.id)
assert summary.resource_id == resource.id
assert summary.service == resource.service
assert summary.region == resource.region
assert summary.resource_type == resource.type

View File

@@ -55,22 +55,3 @@ class TestScheduleProviderScan:
assert "There is already a scheduled scan for this provider." in str(
exc_info.value
)
def test_remove_periodic_task(self, providers_fixture):
provider_instance = providers_fixture[0]
assert Scan.objects.count() == 0
with patch("tasks.tasks.perform_scheduled_scan_task.apply_async"):
schedule_provider_scan(provider_instance)
assert Scan.objects.count() == 1
scan = Scan.objects.first()
periodic_task = scan.scheduler_task
assert periodic_task is not None
periodic_task.delete()
scan.refresh_from_db()
# Assert the scan still exists but its scheduler_task is set to None
# Otherwise, Scan.DoesNotExist would be raised
assert Scan.objects.get(id=scan.id).scheduler_task is None

View File

@@ -1,10 +1,10 @@
from datetime import datetime, timezone
from unittest.mock import MagicMock, patch
from unittest.mock import patch, MagicMock
import pytest
from tasks.jobs.connection import check_lighthouse_connection, check_provider_connection
from api.models import LighthouseConfiguration, Provider
from api.models import Provider
from tasks.jobs.connection import check_provider_connection
@pytest.mark.parametrize(
@@ -70,60 +70,3 @@ def test_check_provider_connection_exception(
mock_provider_instance.save.assert_called_once()
assert mock_provider_instance.connected is False
@pytest.mark.parametrize(
"lighthouse_data",
[
{
"name": "OpenAI",
"api_key_decoded": "sk-test1234567890T3BlbkFJtest1234567890",
"model": "gpt-4o",
"temperature": 0,
"max_tokens": 4000,
"business_context": "Test business context",
"is_active": True,
},
],
)
@patch("tasks.jobs.connection.openai.OpenAI")
@pytest.mark.django_db
def test_check_lighthouse_connection(
mock_openai_client, tenants_fixture, lighthouse_data
):
lighthouse_config = LighthouseConfiguration.objects.create(
**lighthouse_data, tenant_id=tenants_fixture[0].id
)
mock_models = MagicMock()
mock_models.data = [MagicMock(id="gpt-4o"), MagicMock(id="gpt-4o-mini")]
mock_openai_client.return_value.models.list.return_value = mock_models
result = check_lighthouse_connection(
lighthouse_config_id=str(lighthouse_config.id),
)
lighthouse_config.refresh_from_db()
mock_openai_client.assert_called_once_with(
api_key=lighthouse_data["api_key_decoded"]
)
assert lighthouse_config.is_active is True
assert result["connected"] is True
assert result["error"] is None
assert result["available_models"] == ["gpt-4o", "gpt-4o-mini"]
@patch("tasks.jobs.connection.LighthouseConfiguration.objects.get")
@pytest.mark.django_db
def test_check_lighthouse_connection_missing_api_key(mock_lighthouse_get):
mock_lighthouse_instance = MagicMock()
mock_lighthouse_instance.api_key_decoded = None
mock_lighthouse_get.return_value = mock_lighthouse_instance
result = check_lighthouse_connection("lighthouse_config_id")
assert result["connected"] is False
assert result["error"] == "API key is invalid or missing."
assert result["available_models"] == []
assert mock_lighthouse_instance.is_active is False
mock_lighthouse_instance.save.assert_called_once()

View File

@@ -1,166 +0,0 @@
import os
import zipfile
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError
from tasks.jobs.export import (
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
get_s3_client,
)
@pytest.mark.django_db
class TestOutputs:
def test_compress_output_files_creates_zip(self, tmpdir):
base_tmp = Path(str(tmpdir.mkdir("compress_output")))
output_dir = base_tmp / "output"
output_dir.mkdir()
file_path = output_dir / "result.csv"
file_path.write_text("data")
zip_path = _compress_output_files(str(output_dir))
assert zip_path.endswith(".zip")
assert os.path.exists(zip_path)
with zipfile.ZipFile(zip_path, "r") as zipf:
assert "output/result.csv" in zipf.namelist()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_success(self, mock_settings, mock_boto_client):
mock_settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = "token"
mock_settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = "eu-west-1"
client_mock = MagicMock()
mock_boto_client.return_value = client_mock
client = get_s3_client()
assert client is not None
client_mock.list_buckets.assert_called()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_fallback(self, mock_settings, mock_boto_client):
mock_boto_client.side_effect = [
ClientError({"Error": {"Code": "403"}}, "ListBuckets"),
MagicMock(),
]
client = get_s3_client()
assert client is not None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_success(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_success")))
zip_path = base_tmp / "outputs.zip"
zip_path.write_bytes(b"dummy")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("ok")
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant-id", str(zip_path), "scan-id")
expected_uri = "s3://test-bucket/tenant-id/scan-id/outputs.zip"
assert result == expected_uri
assert client_mock.upload_file.call_count == 2
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_missing_bucket(self, mock_base, mock_get_client):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = ""
result = _upload_to_s3("tenant", "/tmp/fake.zip", "scan")
assert result is None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_skips_non_files(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_skips_non_files")))
zip_path = base_tmp / "results.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "subdir").mkdir()
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant", str(zip_path), "scan")
expected_uri = "s3://test-bucket/tenant/scan/results.zip"
assert result == expected_uri
client_mock.upload_file.assert_called_once()
@patch(
"tasks.jobs.export.get_s3_client",
side_effect=ClientError({"Error": {}}, "Upload"),
)
@patch("tasks.jobs.export.base")
@patch("tasks.jobs.export.logger.error")
def test_upload_to_s3_failure_logs_error(
self, mock_logger, mock_base, mock_get_client, tmpdir
):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_failure_logs")))
zip_path = base_tmp / "zipfile.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("csv")
_upload_to_s3("tenant", str(zip_path), "scan")
mock_logger.assert_called()
def test_generate_output_directory_creates_paths(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")
def test_generate_output_directory_invalid_character(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws/test@check"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"aws-test-check-{output_file_timestamp}")
assert compliance.endswith(f"aws-test-check-{output_file_timestamp}")

View File

@@ -1,19 +1,16 @@
import json
import uuid
from datetime import datetime
from unittest.mock import MagicMock, patch
import pytest
from tasks.jobs.scan import (
_create_finding_delta,
_store_resources,
create_compliance_requirements,
perform_prowler_scan,
)
from tasks.utils import CustomEncoder
from api.models import (
ComplianceRequirementOverview,
Finding,
Provider,
Resource,
@@ -209,10 +206,6 @@ class TestPerformScan:
scan.refresh_from_db()
assert scan.state == StateChoices.FAILED
provider.refresh_from_db()
assert provider.connected is False
assert isinstance(provider.connection_last_checked_at, datetime)
@pytest.mark.parametrize(
"last_status, new_status, expected_delta",
[
@@ -237,7 +230,7 @@ class TestPerformScan:
):
tenant_id = uuid.uuid4()
provider_instance = MagicMock()
provider_instance.id = "provider123"
provider_instance.id = "provider456"
finding = MagicMock()
finding.resource_uid = "resource_uid_123"
@@ -252,16 +245,15 @@ class TestPerformScan:
resource_instance.region = finding.region
mock_get_or_create_resource.return_value = (resource_instance, True)
tag_instance = MagicMock()
mock_get_or_create_tag.return_value = (tag_instance, True)
resource, resource_uid_tuple = _store_resources(
- finding, str(tenant_id), provider_instance
+ finding, tenant_id, provider_instance
)
mock_get_or_create_resource.assert_called_once_with(
- tenant_id=str(tenant_id),
+ tenant_id=tenant_id,
provider=provider_instance,
uid=finding.resource_uid,
defaults={
@@ -308,11 +300,11 @@ class TestPerformScan:
mock_get_or_create_tag.return_value = (tag_instance, True)
resource, resource_uid_tuple = _store_resources(
- finding, str(tenant_id), provider_instance
+ finding, tenant_id, provider_instance
)
mock_get_or_create_resource.assert_called_once_with(
- tenant_id=str(tenant_id),
+ tenant_id=tenant_id,
provider=provider_instance,
uid=finding.resource_uid,
defaults={
@@ -366,14 +358,14 @@ class TestPerformScan:
]
resource, resource_uid_tuple = _store_resources(
- finding, str(tenant_id), provider_instance
+ finding, tenant_id, provider_instance
)
mock_get_or_create_tag.assert_any_call(
tenant_id=str(tenant_id), key="tag1", value="value1"
tenant_id=tenant_id, key="tag1", value="value1"
)
mock_get_or_create_tag.assert_any_call(
tenant_id=str(tenant_id), key="tag2", value="value2"
tenant_id=tenant_id, key="tag2", value="value2"
)
resource_instance.upsert_or_delete_tags.assert_called_once()
tags_passed = resource_instance.upsert_or_delete_tags.call_args[1]["tags"]
@@ -385,782 +377,3 @@ class TestPerformScan:
# TODO Add tests for aggregations
@pytest.mark.django_db
class TestCreateComplianceRequirements:
def test_create_compliance_requirements_success(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
findings_fixture,
resources_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch("tasks.jobs.scan.generate_scan_compliance"),
patch("tasks.jobs.scan.create_objects_in_batches") as mock_create_objects,
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = [
"us-east-1",
"us-west-2",
]
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
"framework": "CIS AWS Foundations Benchmark",
"version": "1.4.0",
"requirements": {
"1.1": {
"description": "Ensure root access key does not exist",
"checks_status": {
"pass": 0,
"fail": 0,
"manual": 0,
"total": 1,
},
"status": "PASS",
},
"1.2": {
"description": "Ensure MFA is enabled for root account",
"checks_status": {
"pass": 0,
"fail": 1,
"manual": 0,
"total": 1,
},
"status": "FAIL",
},
},
},
"aws_account_security_onboarding_aws": {
"framework": "AWS Account Security Onboarding",
"version": "1.0",
"requirements": {
"requirement1": {
"description": "Basic security requirement",
"checks_status": {
"pass": 1,
"fail": 0,
"manual": 0,
"total": 1,
},
"status": "PASS",
},
},
},
}
mock_findings_filter.return_value = []
result = create_compliance_requirements(tenant_id, scan_id)
assert "requirements_created" in result
assert "regions_processed" in result
assert "compliance_frameworks" in result
assert result["regions_processed"] == ["us-east-1", "us-west-2"]
assert result["requirements_created"] == 6
assert len(result["compliance_frameworks"]) == 2
mock_create_objects.assert_called_once()
call_args = mock_create_objects.call_args[0]
assert call_args[0] == tenant_id
assert call_args[1] == ComplianceRequirementOverview
assert len(call_args[2]) == 6
compliance_objects = call_args[2]
for obj in compliance_objects:
assert isinstance(obj, ComplianceRequirementOverview)
assert obj.tenant.id == tenant.id
assert obj.scan == scan
assert obj.region in ["us-east-1", "us-west-2"]
assert obj.compliance_id in [
"cis_1.4_aws",
"aws_account_security_onboarding_aws",
]
def test_create_compliance_requirements_with_findings(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches"),
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_finding1 = MagicMock()
mock_finding1.check_id = "check1"
mock_finding1.status = "PASS"
mock_resource1 = MagicMock()
mock_resource1.region = "us-east-1"
mock_finding1.resources.all.return_value = [mock_resource1]
mock_finding2 = MagicMock()
mock_finding2.check_id = "check2"
mock_finding2.status = "FAIL"
mock_resource2 = MagicMock()
mock_resource2.region = "us-west-2"
mock_finding2.resources.all.return_value = [mock_resource2]
mock_findings_filter.return_value = [mock_finding1, mock_finding2]
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = [
"us-east-1",
"us-west-2",
]
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {"check_1": None},
"checks_status": {
"pass": 2,
"fail": 1,
"manual": 0,
"total": 3,
},
"status": "FAIL",
},
"req_2": {
"description": "Test Requirement 2",
"checks": {"check_2": None},
"checks_status": {
"pass": 2,
"fail": 0,
"manual": 0,
"total": 2,
},
"status": "PASS",
},
},
}
}
result = create_compliance_requirements(tenant_id, scan_id)
mock_findings_filter.assert_called_once_with(scan_id=scan_id, muted=False)
assert mock_generate_compliance.call_count == 2
assert result["requirements_created"] == 4
assert set(result["regions_processed"]) == {"us-east-1", "us-west-2"}
def test_create_compliance_requirements_no_provider_regions(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch("tasks.jobs.scan.generate_scan_compliance"),
patch("tasks.jobs.scan.create_objects_in_batches"),
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.KUBERNETES
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_finding = MagicMock()
mock_finding.check_id = "check1"
mock_finding.status = "PASS"
mock_resource = MagicMock()
mock_resource.region = "default"
mock_finding.resources.all.return_value = [mock_resource]
mock_findings_filter.return_value = [mock_finding]
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.side_effect = AttributeError(
"No get_regions method"
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"kubernetes_cis": {
"framework": "CIS Kubernetes Benchmark",
"version": "1.6.0",
"requirements": {
"1.1": {
"description": "Test requirement",
"checks_status": {
"pass": 0,
"fail": 0,
"manual": 0,
"total": 1,
},
"status": "PASS",
},
},
},
}
result = create_compliance_requirements(tenant_id, scan_id)
assert result["regions_processed"] == ["default"]
def test_create_compliance_requirements_empty_findings(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches"),
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_findings_filter.return_value = []
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
"framework": "CIS AWS Foundations Benchmark",
"version": "1.4.0",
"requirements": {
"1.1": {
"description": "Test requirement",
"checks_status": {
"pass": 0,
"fail": 0,
"manual": 0,
"total": 1,
},
"status": "PASS",
},
},
},
}
mock_findings_filter.return_value = []
result = create_compliance_requirements(tenant_id, scan_id)
assert result["regions_processed"] == ["us-east-1"]
assert result["requirements_created"] == 1
mock_generate_compliance.assert_not_called()
def test_create_compliance_requirements_error_handling(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_prowler_provider.side_effect = Exception(
"Provider initialization failed"
)
with pytest.raises(Exception, match="Provider initialization failed"):
create_compliance_requirements(tenant_id, scan_id)
def test_create_compliance_requirements_muted_findings_excluded(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch("tasks.jobs.scan.generate_scan_compliance"),
patch("tasks.jobs.scan.create_objects_in_batches"),
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_findings_filter.return_value = []
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {}
mock_findings_filter.return_value = []
create_compliance_requirements(tenant_id, scan_id)
mock_findings_filter.assert_called_once_with(scan_id=scan_id, muted=False)
def test_create_compliance_requirements_check_status_priority(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches"),
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
provider.provider = Provider.ProviderChoices.AWS
provider.save()
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_finding1 = MagicMock()
mock_finding1.check_id = "check1"
mock_finding1.status = "PASS"
mock_resource1 = MagicMock()
mock_resource1.region = "us-east-1"
mock_finding1.resources.all.return_value = [mock_resource1]
mock_finding2 = MagicMock()
mock_finding2.check_id = "check1"
mock_finding2.status = "FAIL"
mock_resource2 = MagicMock()
mock_resource2.region = "us-east-1"
mock_finding2.resources.all.return_value = [mock_resource2]
mock_findings_filter.return_value = [mock_finding1, mock_finding2]
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_return_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
"framework": "CIS AWS Foundations Benchmark",
"version": "1.4.0",
"requirements": {
"1.1": {
"description": "Test requirement",
"checks_status": {
"pass": 0,
"fail": 0,
"manual": 0,
"total": 1,
},
"status": "PASS",
},
},
},
}
create_compliance_requirements(tenant_id, scan_id)
assert mock_generate_compliance.call_count == 1
def test_compliance_overview_aggregation_requirement_fail_priority(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches") as mock_create_objects,
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
mock_findings_filter.return_value = []
mock_prowler_provider = MagicMock()
mock_prowler_provider.get_regions.return_value = [
"us-east-1",
"us-west-2",
"eu-west-1",
]
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {"check_1": None},
"checks_status": {
"pass": 2,
"fail": 1,
"manual": 0,
"total": 3,
},
"status": "FAIL",
}
},
}
}
mock_generate_compliance.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {
"check_1": {
"us-east-1": {"status": "PASS"},
"us-west-2": {"status": "FAIL"},
"eu-west-1": {"status": "PASS"},
}
},
"checks_status": {
"pass": 2,
"fail": 1,
"manual": 0,
"total": 3,
},
"status": "FAIL",
}
},
}
}
created_objects = []
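# Capture the objects handed to create_objects_in_batches so the test can inspect them without a real DB write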
mock_create_objects.side_effect = (
lambda tenant_id, model, objs, batch_size=500: created_objects.extend(
objs
)
)
create_compliance_requirements(str(tenant.id), str(scan.id))
assert len(created_objects) == 3
assert all(obj.requirement_status == "FAIL" for obj in created_objects)
def test_compliance_overview_aggregation_requirement_pass_all_regions(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches") as mock_create_objects,
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
providers_fixture[0]
mock_findings_filter.return_value = []
mock_prowler_provider = MagicMock()
mock_prowler_provider.get_regions.return_value = ["us-east-1", "us-west-2"]
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {"check_1": None},
"checks_status": {
"pass": 2,
"fail": 0,
"manual": 0,
"total": 2,
},
"status": "PASS",
}
},
}
}
mock_generate_compliance.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {
"check_1": {
"us-east-1": {"status": "PASS"},
"us-west-2": {"status": "PASS"},
}
},
"checks_status": {
"pass": 2,
"fail": 0,
"manual": 0,
"total": 2,
},
"status": "PASS",
}
},
}
}
created_objects = []
mock_create_objects.side_effect = (
lambda tenant_id, model, objs, batch_size=500: created_objects.extend(
objs
)
)
create_compliance_requirements(str(tenant.id), str(scan.id))
assert len(created_objects) == 2
assert all(obj.requirement_status == "PASS" for obj in created_objects)
def test_compliance_overview_aggregation_multiple_requirements_mixed_status(
self,
tenants_fixture,
scans_fixture,
providers_fixture,
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
patch(
"tasks.jobs.scan.generate_scan_compliance"
) as mock_generate_compliance,
patch("tasks.jobs.scan.create_objects_in_batches") as mock_create_objects,
patch("api.models.Finding.objects.filter") as mock_findings_filter,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
mock_findings_filter.return_value = []
mock_prowler_provider = MagicMock()
mock_prowler_provider.get_regions.return_value = ["us-east-1", "us-west-2"]
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {"check_1": None},
"checks_status": {
"pass": 2,
"fail": 0,
"manual": 0,
"total": 2,
},
"status": "PASS",
},
"req_2": {
"description": "Test Requirement 2",
"checks": {"check_2": None},
"checks_status": {
"pass": 1,
"fail": 1,
"manual": 0,
"total": 2,
},
"status": "FAIL",
},
},
}
}
mock_generate_compliance.return_value = {
"test_compliance": {
"framework": "Test Framework",
"version": "1.0",
"requirements": {
"req_1": {
"description": "Test Requirement 1",
"checks": {
"check_1": {
"us-east-1": {"status": "PASS"},
"us-west-2": {"status": "PASS"},
}
},
"checks_status": {
"pass": 2,
"fail": 0,
"manual": 0,
"total": 2,
},
"status": "PASS",
},
"req_2": {
"description": "Test Requirement 2",
"checks": {
"check_2": {
"us-east-1": {"status": "PASS"},
"us-west-2": {"status": "FAIL"},
}
},
"checks_status": {
"pass": 1,
"fail": 1,
"manual": 0,
"total": 2,
},
"status": "FAIL",
},
},
}
}
created_objects = []
mock_create_objects.side_effect = (
lambda tenant_id, model, objs, batch_size=500: created_objects.extend(
objs
)
)
create_compliance_requirements(str(tenant.id), str(scan.id))
assert len(created_objects) == 4
req_1_objects = [
obj for obj in created_objects if obj.requirement_id == "req_1"
]
req_2_objects = [
obj for obj in created_objects if obj.requirement_id == "req_2"
]
assert len(req_1_objects) == 2
assert len(req_2_objects) == 2
assert all(obj.requirement_status == "PASS" for obj in req_1_objects)
assert all(obj.requirement_status == "FAIL" for obj in req_2_objects)

View File

@@ -1,415 +0,0 @@
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from tasks.tasks import generate_outputs
@pytest.mark.django_db
class TestGenerateOutputs:
def setup_method(self):
self.scan_id = str(uuid.uuid4())
self.provider_id = str(uuid.uuid4())
self.tenant_id = str(uuid.uuid4())
def test_no_findings_returns_early(self):
with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
mock_filter.return_value.exists.return_value = False
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_filter.assert_called_once_with(scan_id=self.scan_id)
@patch("tasks.tasks.rmtree")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Finding.all_objects.filter")
def test_generate_outputs_happy_path(
self,
mock_finding_filter,
mock_scan_summary_filter,
mock_provider_get,
mock_initialize_provider,
mock_compliance_get_bulk,
mock_get_available_frameworks,
mock_compress,
mock_upload,
mock_rmtree,
):
mock_scan_summary_filter.return_value.exists.return_value = True
mock_provider = MagicMock()
mock_provider.uid = "provider-uid"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
prowler_provider = MagicMock()
mock_initialize_provider.return_value = prowler_provider
mock_compliance_get_bulk.return_value = {"cis": MagicMock()}
mock_get_available_frameworks.return_value = ["cis"]
dummy_finding = MagicMock(uid="f1")
mock_finding_filter.return_value.order_by.return_value.iterator.return_value = [
[dummy_finding],
True,
]
mock_transformed_stats = {"some": "stats"}
with (
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value=mock_transformed_stats,
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value={"transformed": "f1"},
),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="JSONWriter"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock(name="CSVCompliance"))]},
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("out-dir", "comp-dir"),
),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_compress.return_value = "/tmp/zipped.zip"
mock_upload.return_value = "s3://bucket/zipped.zip"
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
mock_scan_update.return_value.update.assert_called_once_with(
output_location="s3://bucket/zipped.zip"
)
mock_rmtree.assert_called_once_with(
Path("/tmp/zipped.zip").parent, ignore_errors=True
)
def test_generate_outputs_fails_upload(self):
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks"),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch("tasks.tasks.FindingOutput.transform_api_finding"),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="Writer"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value=None),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
result = generate_outputs(
scan_id="scan",
provider_id="provider",
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_scan_update.return_value.update.assert_called_once()
def test_generate_outputs_triggers_html_extra_update(self):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/f.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
html_writer_mock = MagicMock()
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"html": {
"class": lambda *args, **kwargs: html_writer_mock,
"suffix": ".html",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
html_writer_mock.batch_write_data_to_file.assert_called_once()
def test_transform_called_only_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
tf1 = MagicMock()
tf1.compliance = {}
tf2 = MagicMock()
tf2.compliance = {}
writer_instances = []
class TrackingWriter:
def __init__(self, findings, file_path, file_extension, from_cli):
self.transform_called = 0
self.batch_write_data_to_file = MagicMock()
self._data = []
self.close_file = False
writer_instances.append(self)
def transform(self, fos):
self.transform_called += 1
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks", return_value=[]),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=[tf1, tf2],
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch(
"tasks.tasks.batched",
return_value=[
([raw1], False),
([raw2], True),
],
),
):
mock_summary.return_value.exists.return_value = True
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": TrackingWriter,
"suffix": ".json",
"kwargs": {},
}
},
):
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_called == 1
def test_compliance_transform_called_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
compliance_obj = MagicMock()
writer_instances = []
class TrackingComplianceWriter:
def __init__(self, *args, **kwargs):
self.transform_calls = []
self._data = []
writer_instances.append(self)
def transform(self, fos, comp_obj, name):
self.transform_calls.append((fos, comp_obj, name))
def batch_write_data_to_file(self):
pass
two_batches = [
([raw1], False),
([raw2], True),
]
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch(
"tasks.tasks.Provider.objects.get",
return_value=MagicMock(uid="UID", provider="aws"),
),
patch("tasks.tasks.initialize_prowler_provider"),
patch(
"tasks.tasks.Compliance.get_bulk", return_value={"cis": compliance_obj}
),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=lambda f, prov: f,
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch(
"tasks.tasks.Scan.all_objects.filter",
return_value=MagicMock(update=lambda **kw: None),
),
patch("tasks.tasks.batched", return_value=two_batches),
patch("tasks.tasks.OUTPUT_FORMATS_MAPPING", {}),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda name: True, TrackingComplianceWriter)]},
),
):
mock_summary.return_value.exists.return_value = True
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_calls == [([raw2], compliance_obj, "cis")]
assert result == {"upload": True}
def test_generate_outputs_logs_rmtree_exception(self, caplog):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/file.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch("tasks.tasks.rmtree", side_effect=Exception("Test deletion error")),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": lambda *args, **kwargs: MagicMock(),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
with caplog.at_level("ERROR"):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert "Error deleting output files" in caplog.text

View File

@@ -1,156 +0,0 @@
#!/usr/bin/env python3
import argparse
import os
import re
import subprocess
import sys
from pathlib import Path
import matplotlib.pyplot as plt
import pandas as pd
plt.style.use("ggplot")
def run_locust(
locust_file: str,
host: str,
users: int,
hatch_rate: int,
run_time: str,
csv_prefix: Path,
) -> Path:
artifacts_dir = Path("artifacts")
artifacts_dir.mkdir(parents=True, exist_ok=True)
cmd = [
"locust",
"-f",
f"scenarios/{locust_file}",
"--headless",
"-u",
str(users),
"-r",
str(hatch_rate),
"-t",
run_time,
"--host",
host,
"--csv",
str(artifacts_dir / csv_prefix.name),
]
print(f"Running Locust: {' '.join(cmd)}")
process = subprocess.run(cmd)
if process.returncode:
sys.exit("Locust execution failed")
stats_file = artifacts_dir / f"{csv_prefix.stem}_stats.csv"
if not stats_file.exists():
sys.exit(f"Stats CSV not found: {stats_file}")
return stats_file
def load_percentiles(csv_path: Path) -> pd.DataFrame:
df = pd.read_csv(csv_path)
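# Locust names its percentile columns "50%", "75%", "90%" and "95%"; rename them to p50..p95 for plotting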
mapping = {"50%": "p50", "75%": "p75", "90%": "p90", "95%": "p95"}
available = [col for col in mapping if col in df.columns]
renamed = {col: mapping[col] for col in available}
df = df.rename(columns=renamed).set_index("Name")[renamed.values()]
return df.drop(index=["Aggregated"], errors="ignore")
def sanitize_label(label: str) -> str:
text = re.sub(r"[^\w]+", "_", label.strip().lower())
return text.strip("_")
def plot_multi_comparison(metrics: dict[str, pd.DataFrame]) -> None:
common = sorted(set.intersection(*(set(df.index) for df in metrics.values())))
percentiles = list(next(iter(metrics.values())).columns)
groups = len(metrics)
width = 0.8 / groups
for endpoint in common:
fig, ax = plt.subplots(figsize=(10, 5), dpi=100)
for idx, (label, df) in enumerate(metrics.items()):
series = df.loc[endpoint]
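# Offset each label's bar so the whole group of bars is centered on the percentile tick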
positions = [
i + (idx - groups / 2) * width + width / 2
for i in range(len(percentiles))
]
bars = ax.bar(positions, series.values, width, label=label)
for bar in bars:
height = bar.get_height()
ax.annotate(
f"{int(height)}",
xy=(bar.get_x() + bar.get_width() / 2, height),
xytext=(0, 3),
textcoords="offset points",
ha="center",
va="bottom",
fontsize=8,
)
ax.set_xticks(range(len(percentiles)))
ax.set_xticklabels(percentiles)
ax.set_ylabel("Latency (ms)")
ax.set_title(endpoint, fontsize=12)
ax.grid(True, axis="y", linestyle="--", alpha=0.7)
fig.tight_layout()
fig.subplots_adjust(right=0.75)
ax.legend(loc="center left", bbox_to_anchor=(1, 0.5), framealpha=0.9)
output = Path("artifacts") / f"comparison_{sanitize_label(endpoint)}.png"
plt.savefig(output)
plt.close(fig)
print(f"Saved chart: {output}")
def main() -> None:
parser = argparse.ArgumentParser(description="Run Locust and compare metrics")
parser.add_argument("--locustfile", required=True, help="Locust file in scenarios/")
parser.add_argument("--host", required=True, help="Target host URL")
parser.add_argument(
"--users", type=int, default=10, help="Number of simulated users"
)
parser.add_argument("--rate", type=int, default=1, help="Hatch rate per second")
parser.add_argument("--time", default="1m", help="Test duration (e.g. 30s, 1m)")
parser.add_argument(
"--metrics-dir", default="baselines", help="Directory with CSV baselines"
)
parser.add_argument("--version", default="current", help="Test version")
args = parser.parse_args()
metrics_dir = Path(args.metrics_dir)
os.makedirs(metrics_dir, exist_ok=True)
metrics_data: dict[str, pd.DataFrame] = {}
for csv_file in sorted(metrics_dir.glob("*.csv")):
metrics_data[csv_file.stem] = load_percentiles(csv_file)
current_prefix = Path(args.version)
current_csv = run_locust(
locust_file=args.locustfile,
host=args.host,
users=args.users,
hatch_rate=args.rate,
run_time=args.time,
csv_prefix=current_prefix,
)
metrics_data[args.version] = load_percentiles(current_csv)
for endpoint in sorted(
set.intersection(*(set(df.index) for df in metrics_data.values()))
):
parts = [endpoint]
for label, df in metrics_data.items():
s = df.loc[endpoint]
parts.append(f"{label}: p50 {s.p50}, p75 {s.p75}, p90 {s.p90}, p95 {s.p95}")
print(" | ".join(parts))
plot_multi_comparison(metrics_data)
if __name__ == "__main__":
main()

View File

@@ -1,3 +0,0 @@
locust==2.34.1
matplotlib==3.10.1
pandas==2.2.3

View File

@@ -1,128 +0,0 @@
import random
from collections import defaultdict
import requests
from locust import events, task
from utils.helpers import APIUserBase, get_api_token, get_auth_headers
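# Shared state populated once in on_test_start and read by every simulated user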
GLOBAL = {
"token": None,
"available_scans_info": {},
}
SUPPORTED_COMPLIANCE_IDS = {
"aws": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"gcp": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"azure": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"m365": ["cis_4.0", "iso27001_2022", "prowler_threatscore"],
}
def _get_random_scan() -> tuple:
provider_type = random.choice(list(GLOBAL["available_scans_info"].keys()))
scan_info = random.choice(GLOBAL["available_scans_info"][provider_type])
return provider_type, scan_info
def _get_random_compliance_id(provider: str) -> str:
return f"{random.choice(SUPPORTED_COMPLIANCE_IDS[provider])}_{provider}"
def _get_compliance_available_scans_by_provider_type(host: str, token: str) -> dict:
excluded_providers = ["kubernetes"]
response_dict = defaultdict(list)
provider_response = requests.get(
f"{host}/providers?fields[providers]=id,provider&filter[connected]=true",
headers=get_auth_headers(token),
)
for provider in provider_response.json()["data"]:
provider_id = provider["id"]
provider_type = provider["attributes"]["provider"]
if provider_type in excluded_providers:
continue
scan_response = requests.get(
f"{host}/scans?fields[scans]=id&filter[provider]={provider_id}&filter[state]=completed",
headers=get_auth_headers(token),
)
scan_data = scan_response.json()["data"]
if not scan_data:
continue
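# Use the first completed scan the API returns for this provider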
scan_id = scan_data[0]["id"]
response_dict[provider_type].append(scan_id)
return response_dict
def _get_compliance_regions_from_scan(host: str, token: str, scan_id: str) -> list:
response = requests.get(
f"{host}/compliance-overviews/metadata?filter[scan_id]={scan_id}",
headers=get_auth_headers(token),
)
assert response.status_code == 200, f"Failed to get scan: {response.text}"
return response.json()["data"]["attributes"]["regions"]
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
scans_by_provider = _get_compliance_available_scans_by_provider_type(
environment.host, GLOBAL["token"]
)
scan_info = defaultdict(list)
for provider, scans in scans_by_provider.items():
for scan in scans:
scan_info[provider].append(
{
"scan_id": scan,
"regions": _get_compliance_regions_from_scan(
environment.host, GLOBAL["token"], scan
),
}
)
GLOBAL["available_scans_info"] = scan_info
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
@task(3)
def compliance_overviews_default(self):
provider_type, scan_info = _get_random_scan()
name = f"/compliance-overviews ({provider_type})"
endpoint = f"/compliance-overviews?" f"filter[scan_id]={scan_info['scan_id']}"
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def compliance_overviews_region(self):
provider_type, scan_info = _get_random_scan()
name = f"/compliance-overviews?filter[region] ({provider_type})"
endpoint = (
f"/compliance-overviews"
f"?filter[scan_id]={scan_info['scan_id']}"
f"&filter[region]={random.choice(scan_info['regions'])}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def compliance_overviews_requirements(self):
provider_type, scan_info = _get_random_scan()
compliance_id = _get_random_compliance_id(provider_type)
name = f"/compliance-overviews/requirements ({compliance_id})"
endpoint = (
f"/compliance-overviews/requirements"
f"?filter[scan_id]={scan_info['scan_id']}"
f"&filter[compliance_id]={compliance_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def compliance_overviews_attributes(self):
provider_type, _ = _get_random_scan()
compliance_id = _get_random_compliance_id(provider_type)
name = f"/compliance-overviews/attributes ({compliance_id})"
endpoint = (
f"/compliance-overviews/attributes"
f"?filter[compliance_id]={compliance_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

View File

@@ -1,216 +0,0 @@
from locust import events, task
from utils.config import (
FINDINGS_UI_SORT_VALUES,
L_PROVIDER_NAME,
M_PROVIDER_NAME,
S_PROVIDER_NAME,
TARGET_INSERTED_AT,
)
from utils.helpers import (
APIUserBase,
get_api_token,
get_auth_headers,
get_next_resource_filter,
get_resource_filters_pairs,
get_scan_id_from_provider_name,
get_sort_value,
)
GLOBAL = {
"token": None,
"scan_ids": {},
"resource_filters": None,
"large_resource_filters": None,
}
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
GLOBAL["scan_ids"]["small"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], S_PROVIDER_NAME
)
GLOBAL["scan_ids"]["medium"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], M_PROVIDER_NAME
)
GLOBAL["scan_ids"]["large"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], L_PROVIDER_NAME
)
GLOBAL["resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"]
)
GLOBAL["large_resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"], GLOBAL["scan_ids"]["large"]
)
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
self.s_scan_id = GLOBAL["scan_ids"]["small"]
self.m_scan_id = GLOBAL["scan_ids"]["medium"]
self.l_scan_id = GLOBAL["scan_ids"]["large"]
self.available_resource_filters = GLOBAL["resource_filters"]
self.available_resource_filters_large_scan = GLOBAL["large_resource_filters"]
@task
def findings_default(self):
name = "/findings"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_default_include(self):
name = "/findings?include"
page = self._next_page(name)
endpoint = (
f"/findings?page[number]={page}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata(self):
endpoint = f"/findings/metadata?" f"filter[inserted_at]={TARGET_INSERTED_AT}"
self.client.get(
endpoint, headers=get_auth_headers(self.token), name="/findings/metadata"
)
@task
def findings_scan_small(self):
name = "/findings?filter[scan_id] - 50k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.s_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_small(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.s_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 50k",
)
@task(2)
def findings_scan_medium(self):
name = "/findings?filter[scan_id] - 250k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.m_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_medium(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.m_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 250k",
)
@task
def findings_scan_large(self):
name = "/findings?filter[scan_id] - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_scan_large_include(self):
name = "/findings?filter[scan_id]&include - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_large(self):
endpoint = f"/findings/metadata?" f"&filter[scan]={self.l_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 500k",
)
@task(2)
def findings_resource_filter(self):
name = "/findings?filter[resource_filter]&include"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter(self):
name = "/findings/metadata?filter[resource_filter]"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter_scan_large(self):
name = "/findings/metadata?filter[resource_filter]&filter[scan_id] - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[scan]={self.l_scan_id}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def findings_resource_filter_large_scan_include(self):
name = "/findings?filter[resource_filter][scan]&include - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)

View File

@@ -1,19 +0,0 @@
import os
USER_EMAIL = os.environ.get("USER_EMAIL")
USER_PASSWORD = os.environ.get("USER_PASSWORD")
BASE_HEADERS = {"Content-Type": "application/vnd.api+json"}
FINDINGS_UI_SORT_VALUES = ["severity", "status", "-inserted_at"]
TARGET_INSERTED_AT = os.environ.get("TARGET_INSERTED_AT", "2025-04-22")
FINDINGS_RESOURCE_METADATA = {
"regions": "region",
"resource_types": "resource_type",
"services": "service",
}
S_PROVIDER_NAME = "provider-50k"
M_PROVIDER_NAME = "provider-250k"
L_PROVIDER_NAME = "provider-500k"

View File

@@ -1,168 +0,0 @@
import random
from collections import defaultdict
from threading import Lock
import requests
from locust import HttpUser, between
from utils.config import (
BASE_HEADERS,
FINDINGS_RESOURCE_METADATA,
TARGET_INSERTED_AT,
USER_EMAIL,
USER_PASSWORD,
)
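# Module-level pagination counters shared by all users so page numbers advance globally across the test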
_global_page_counters = defaultdict(int)
_page_lock = Lock()
class APIUserBase(HttpUser):
"""
Base class for API user simulation in Locust performance tests.
Attributes:
abstract (bool): Indicates this is an abstract user class.
wait_time: Time between task executions, randomized between 1 and 5 seconds.
"""
abstract = True
wait_time = between(1, 5)
def _next_page(self, endpoint_name: str) -> int:
"""
Returns the next page number for a given endpoint. Thread-safe.
Args:
endpoint_name (str): Name of the API endpoint being paginated.
Returns:
int: The next page number for the given endpoint.
"""
with _page_lock:
_global_page_counters[endpoint_name] += 1
return _global_page_counters[endpoint_name]
def get_next_resource_filter(available_values: dict) -> tuple:
"""
Randomly selects a filter type and value from available options.
Args:
available_values (dict): Dictionary with filter types as keys and list of possible values.
Returns:
tuple: A (filter_type, filter_value) pair randomly selected.
"""
filter_type = random.choice(list(available_values.keys()))
filter_value = random.choice(available_values[filter_type])
return filter_type, filter_value
def get_auth_headers(token: str) -> dict:
"""
Returns the headers for the API requests.
Args:
token (str): The token to be included in the headers.
Returns:
dict: The headers for the API requests.
"""
return {
"Authorization": f"Bearer {token}",
**BASE_HEADERS,
}
def get_api_token(host: str) -> str:
"""
Authenticates with the API and retrieves a bearer token.
Args:
host (str): The host URL of the API.
Returns:
str: The access token for authenticated requests.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
login_payload = {
"data": {
"type": "tokens",
"attributes": {"email": USER_EMAIL, "password": USER_PASSWORD},
}
}
response = requests.post(f"{host}/tokens", json=login_payload, headers=BASE_HEADERS)
assert response.status_code == 200, f"Failed to get token: {response.text}"
return response.json()["data"]["attributes"]["access"]
def get_scan_id_from_provider_name(host: str, token: str, provider_name: str) -> str:
"""
Retrieves the scan ID associated with a specific provider name.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
provider_name (str): Name of the provider to filter scans by.
Returns:
str: The ID of the scan.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
response = requests.get(
f"{host}/scans?fields[scans]=id&filter[provider_alias]={provider_name}",
headers=get_auth_headers(token),
)
assert response.status_code == 200, f"Failed to get scan: {response.text}"
return response.json()["data"][0]["id"]
def get_resource_filters_pairs(host: str, token: str, scan_id: str = "") -> dict:
"""
Retrieves and maps resource metadata filter values from the findings endpoint.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
scan_id (str, optional): Optional scan ID to filter metadata. Defaults to using inserted_at timestamp.
Returns:
dict: A dictionary of resource filter metadata.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
metadata_filters = (
f"filter[scan]={scan_id}"
if scan_id
else f"filter[inserted_at]={TARGET_INSERTED_AT}"
)
response = requests.get(
f"{host}/findings/metadata?{metadata_filters}", headers=get_auth_headers(token)
)
assert (
response.status_code == 200
), f"Failed to get resource filters values: {response.text}"
attributes = response.json()["data"]["attributes"]
return {
FINDINGS_RESOURCE_METADATA[key]: values
for key, values in attributes.items()
if key in FINDINGS_RESOURCE_METADATA
}
def get_sort_value(sort_values: list) -> str:
"""
Constructs a sort query string from a list of sort keys.
Args:
sort_values (list): The list of sort values to include in the query.
Returns:
str: A formatted sort query string (e.g., "sort=created_at,-severity").
"""
return f"sort={','.join(sort_values)}"

View File

@@ -1,117 +0,0 @@
# Prowler Multicloud CIS Benchmarks PowerBI Template
![Prowler Report](https://github.com/user-attachments/assets/560f7f83-1616-4836-811a-16963223c72f)
## Getting Started
1. Install Microsoft PowerBI Desktop
This report requires Microsoft PowerBI Desktop, which can be downloaded for free from Microsoft.
2. Run compliance scans in Prowler
The report uses compliance csv outputs from Prowler. Compliance scans can be run using either [Prowler CLI](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli) or [Prowler Cloud/App](https://cloud.prowler.com/sign-in)
1. Prowler CLI -> Run a Prowler scan using the --compliance option (see the example command after the table below)
2. Prowler Cloud/App -> Navigate to the compliance section to download csv outputs
![Download Compliance Scan](https://github.com/user-attachments/assets/42c11a60-8ce8-4c60-a663-2371199c052b)
The template supports the following CIS Benchmarks only:
| Compliance Framework | Version |
| ---------------------------------------------- | ------- |
| CIS Amazon Web Services Foundations Benchmark | v4.0.1 |
| CIS Google Cloud Platform Foundation Benchmark | v3.0.0 |
| CIS Microsoft Azure Foundations Benchmark | v3.0.0 |
| CIS Kubernetes Benchmark | v1.10.0 |
Ensure you run or download the correct benchmark versions.
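For the CLI route, a minimal example follows; the framework IDs shown are illustrative and vary by provider and version, so list the available ones first:

```
# List the compliance framework IDs Prowler supports for a provider
prowler aws --list-compliance

# Run a compliance scan; the csv output is what the template ingests
prowler aws --compliance cis_4.0_aws
```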
3. Create a local directory to store Prowler csv outputs
Once downloaded, place your csv outputs in a directory on your local machine. If you rename the files, they must keep the provider name in the filename.
To use time-series capabilities such as "compliance percent over time", you'll need scans from multiple dates.
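For example, a hypothetical layout (the filenames are illustrative; what matters is that each file keeps its provider and that older scans stay in place for the time series):

```
prowler-csv/
├── cis_4.0_aws_2025-04-01.csv
├── cis_4.0_aws_2025-05-01.csv
├── cis_3.0_azure_2025-05-01.csv
└── cis_1.10_kubernetes_2025-05-01.csv
```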
4. Download and run the PowerBI template file (.pbit)
Running the .pbit file will open PowerBI Desktop and prompt you for the full filepath to the local directory.
5. Enter the full filepath to the directory created in step 3
Provide the full filepath from the root directory.
Ensure that the filepath is not wrapped in quotation marks (""). If you use Windows' "Copy as path" feature, it will automatically include quotation marks.
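For instance, with a hypothetical path:

```
C:\Users\me\prowler-csv      <- correct
"C:\Users\me\prowler-csv"    <- incorrect, remove the quotation marks
```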
6. Save the report as a PowerBI file (.pbix)
Once the filepath is entered, the template will automatically ingest and populate the report. You can then save this file as a new PowerBI report. If you'd like to generate another report, simply re-run the template file (.pbit) from step 4.
## Validation
After setting up your dashboard, you may want to validate the Prowler csv files were ingested correctly. To do this, navigate to the "Configuration" tab.
The "loaded CIS Benchmarks" table shows the supported benchmarks and versions. This is defined by the template file and not editable by the user. All benchmarks will be loaded regardless of which providers you provided csv outputs for.
The "Prowler CSV Folder" shows the path to the local directory you provided.
The "Loaded Prowler Exports" table shows the ingested csv files from the local directory. It will mark files that are treated as the latest assessment with a green checkmark.
![Prowler Validation](https://github.com/user-attachments/assets/a543ca9b-6cbe-4ad1-b32a-d4ac2163d447)
## Report Sections
The PowerBI Report is broken into three main report pages
| Report Page | Description |
| ----------- | ----------------------------------------------------------------------------------- |
| Overview | Provides a general CIS Benchmark overview across AWS, Azure, GCP, and Kubernetes |
| Benchmark | Provides overview of a single CIS Benchmark |
| Requirement | Drill-through page to view details of a single requirement |
### Overview Page
The overview page is a general CIS Benchmark overview across AWS, Azure, GCP, and Kubernetes.
![image](https://github.com/user-attachments/assets/94164fa9-36a4-4bb9-890d-e9a9a63a3e7d)
The page has the following components:
| Component | Description |
| ---------------------------------------- | ------------------------------------------------------------------------ |
| CIS Benchmark Overview | Table with benchmark name, version, and overall compliance percentage |
| Provider by Requirement Status | Bar chart showing benchmark requirements by status by provider |
| Compliance Percent Heatmap | Heatmap showing compliance percent by benchmark and profile level |
| Profile level by Requirement Status | Bar chart showing requirements by status and profile level |
| Compliance Percent Over Time by Provider | Line chart showing overall compliance percentage over time by provider |
### Benchmark Page
The benchmark page provides an overview of a single CIS Benchmark. You can select the benchmark from the dropdown as well as scope down to specific profile levels or regions.
![image](https://github.com/user-attachments/assets/34498ee8-317b-4b81-b241-c561451d8def)
The page has the following components:
| Component | Description |
| --------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------ |
| Compliance Percent Heatmap | Heatmap showing compliance percent by region and profile level |
| Benchmark Section by Requirement Status | Bar chart showing benchmark requirements by benchmark section and status |
| Compliance Percent Over Time by Region | Line chart showing overall compliance percentage over time by region |
| Benchmark Requirements | Table showing requirement section, requirement number, requirement title, number of resources tested, status, and number of failing checks |
### Requirement Page
The requirement page is a drill-through page to view details of a single requirement. To populate it, right-click a requirement in the "Benchmark Requirements" table on the benchmark page and select "Drill through" -> "Requirement".
![image](https://github.com/user-attachments/assets/5c9172d9-56fe-4514-b341-7e708863fad6)
The requirement page has the following components:
| Component | Description |
| ------------------------------------------ | --------------------------------------------------------------------------------- |
| Title | Title of the requirement |
| Rationale | Rationale of the requirement |
| Remediation | Remediation guidance for the requirement |
| Region by Check Status | Bar chart showing Prowler checks by region and status |
| Resource Checks for Benchmark Requirements | Table showing Resource ID, Resource Name, Status, Description, and Prowler Check |
## Walkthrough Video
[![image](https://github.com/user-attachments/assets/866642c6-43ac-4aac-83d3-bb625002da0b)](https://www.youtube.com/watch?v=lfKFkTqBxjU)

View File

@@ -1,9 +1,6 @@
#!/bin/bash
# Run Prowler against All AWS Accounts in an AWS Organization
# Activate Poetry Environment
eval "$(poetry env activate)"
# Show Prowler Version
prowler -v

Some files were not shown because too many files have changed in this diff.