Compare commits


783 Commits

Author SHA1 Message Date
Prowler Bot
a0ca1f5124 fix(gcp): handle case sensitivity in block-project-ssh-keys (#8124)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-27 19:11:20 +08:00
Prowler Bot
a60b981526 fix: checks with no resource name (#8121)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-06-27 18:46:29 +08:00
Prowler Bot
25da83276f fix(compliance): handle latest assessment date for each account (#8109)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-26 17:54:02 +08:00
Prowler Bot
5a50b5d38f fix(aws): fix logic in VPC and ELBv2 checks (#8092)
Co-authored-by: crr <42739372+55002ghals@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-26 00:26:14 +08:00
Prowler Bot
eb3e4fab85 fix(aws): correctly retrieve ECS Container Insights settings (#8100)
Co-authored-by: Jack Holloway <MrPrimate@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-25 22:28:49 +08:00
Prowler Bot
9ac45c08a0 fix(organizations): Key Error: Statement in check organizations_scp_deny_regions (#8099)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-25 21:34:41 +08:00
Prowler Bot
bed9bfaab5 chore(gha): avoid comment on PRs for check-changelog workflow (#8105)
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-25 14:15:11 +05:45
Prowler Bot
a29d626552 chore(gha): avoid comment on PRs for check-changelog workflow (#8104)
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-25 14:14:41 +05:45
Prowler Bot
c7ff32b513 chore(gha): check changelog when label is added or deleted (#8103)
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-25 14:13:53 +05:45
Prowler Bot
b86e2139e5 chore(gha): add permissions on check-changelog workflow (#8102)
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-25 14:12:43 +05:45
Prowler Bot
798b74e6a2 chore(gha): check changelog changes on pull request (#8101)
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-25 14:10:10 +05:45
Prowler Bot
1e2ab0e617 chore(release): Bump version to v5.7.6 (#8065)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-19 14:32:53 +05:45
Prowler Bot
f96b7f5958 fix: add missing changelog compliance timestamps (#8062)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 16:32:48 +02:00
Prowler Bot
2b335c47c7 fix(compliance): use unified timestamp for all requirements (#8059)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 22:15:42 +08:00
Prowler Bot
a73fcd2a0d chore: add pr to changelog (#8057)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-06-18 14:44:54 +02:00
Prowler Bot
7d7ee68fc2 fix(m365): avoid user requests in setup_identity app context and user auth log enhancement (#8055)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2025-06-18 13:46:03 +02:00
Prowler Bot
52b68451d1 chore(metadata): add validator for ResourceType (#8036)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-17 16:41:14 +08:00
Prowler Bot
0df95e9088 chore(sentry): handle exceptions ignores not based in ClassNames (#8038)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-17 13:41:02 +05:45
Prowler Bot
ff09c1f4c6 fix(metadata): add missing ResourceType values (#8031)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-17 01:01:14 +08:00
Prowler Bot
3b1f625d10 fix(eks): add EKS to service without subservices (#8025)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-16 23:17:36 +08:00
Prowler Bot
bc8bca5f34 chore: add missing init file to check repository_secret_scanning_enabled (#8032)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-06-16 21:54:56 +08:00
Prowler Bot
95314dfc2b fix(findings): exclude blank resource types from metadata endpoints (#8030)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-16 15:14:53 +02:00
Prowler Bot
63458cd976 fix(azure): add new way to auth against App Insight (#8024)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-16 19:19:51 +08:00
Prowler Bot
9d5f66ced3 fix(network): allow 0 as compliant value (#8021)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-16 19:05:48 +08:00
Prowler Bot
6e85fe26ba fix(app): change api call for ftps_state (#8020)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-06-16 09:27:02 +02:00
Prowler Bot
fc4536dca8 fix(iam): always check if root credentials are present (#8013)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-13 17:20:12 +08:00
Prowler Bot
ddd3a292f8 chore: update CHANGELOG (#8016)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-13 10:17:06 +02:00
Prowler Bot
abd5fc181f fix(export): add name sanitization (#8012)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-06-12 16:45:05 +02:00
Prowler Bot
0e45467502 fix(dashboard): handle account uids with 0 at start and end (#8006)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-12 17:42:07 +08:00
Prowler Bot
69ce74f831 fix(kubernetes): change object type to set for apiserver check (#8005)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-12 17:08:55 +08:00
Prowler Bot
9d428b5e3b fix(k8s): remove typo for PCI 4.0 compliance framework (#8004)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-12 16:46:42 +08:00
Prowler Bot
ba7dafd518 fix(gcp): test connection by verifying token (#7883)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-12 15:20:01 +08:00
Prowler Bot
300936f52f chore(release): Bump version to v5.7.5 (#7997)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-11 17:44:28 +05:45
Prowler Bot
a517c575dd revert: RLS transactions handling and DB custom backend (#7995)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-11 11:19:43 +02:00
Prowler Bot
0eec05bade chore(release): Bump version to v5.7.4 (#7958)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-06 14:31:34 +05:45
Pepe Fagoaga
a59d985b2d chore(ui): Changelog for v5.7.3 (#7951) 2025-06-05 20:16:08 +05:45
Prowler Bot
62bc5bbf76 chore: update API changelog for v5.7.3 (#7950)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-05 16:07:20 +02:00
Prowler Bot
1c7d6f697b fix(changelog): add entries for password encryption in v5.7.3 (#7944)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-05 19:12:46 +05:45
Prowler Bot
f7e20fc008 fix(formSchemas): encrypted password typo (#7940)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-05 19:04:42 +05:45
Prowler Bot
8831572c5f fix(m365): remove last encrypted password appearances (#7941)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-05 19:00:49 +05:45
Prowler Bot
6355be4ede chore(m365powershell): manage encryption from plaintext password (#7942)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-05 14:48:44 +02:00
Prowler Bot
35e0efb36b feat(database): handle already closed connections (#7936)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-04 16:20:40 +02:00
Prowler Bot
928dfe078c fix(rls): Apply persistent RLS transactions (#7917)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-04 15:23:31 +02:00
Prowler Bot
d45c686429 revert: remove get_with_retry (#7933)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-06-04 15:11:10 +02:00
Prowler Bot
b51a07e490 chore(release): Bump version to v5.7.3 (#7911)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-04 12:41:58 +05:45
Prowler Bot
8afc016312 feat(changelog): update version with fixes (#7905)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-02 16:26:51 +05:45
Prowler Bot
3a51a455e1 fix: set user_id for tenant operations (#7899)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-02 12:17:45 +02:00
Prowler Bot
16b008588c docs: update the changelog (#7902)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-02 11:55:35 +02:00
Prowler Bot
0140fd0b0a fix: add new get method to avoid race conditions when creating async tasks (#7887)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-05-30 10:21:28 +02:00
Prowler Bot
7546dd8dd1 fix(api): connection correctly reflected (#7885)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-05-29 20:37:18 +05:45
Prowler Bot
475ee8689c fix(awslambda): aws service awslambda not working (#7881)
Co-authored-by: Alison Vilela <me@alison.dev>
2025-05-29 09:44:52 +02:00
Prowler Bot
dec6a29086 fix(k8s): UID validation for valid context names (#7880)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-05-29 12:40:00 +05:45
Prowler Bot
563652fd68 fix(ui): increase limit to retrieve more than 10 scan list (#7879)
Co-authored-by: sumit-tft <70506234+sumit-tft@users.noreply.github.com>
2025-05-29 07:58:35 +02:00
Prowler Bot
a60cd1f24d fix(admincenter): service and group visibility (#7875)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-05-28 16:55:13 +02:00
Prowler Bot
17df834add chore: enhance CustomDropdownFilter (#7873)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2025-05-28 16:35:36 +02:00
Prowler Bot
0e44bebc79 fix(m365): add powershell.close() to msgraph services (#7817)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2025-05-28 16:03:49 +02:00
Prowler Bot
e05a9381e6 fix(admincenter): admincenter_users_admins_reduced_license_footprint logic (#7810)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-28 11:46:14 +02:00
Prowler Bot
20e88e5b09 fix(defender): update defender_ensure_notify_alerts_severity_is_high logic (#7863)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-28 10:49:01 +02:00
Prowler Bot
dedd77758a fix(ui): Added missing icons (kisa, prowlerThreat) on compliance page (#7864)
Co-authored-by: sumit-tft <70506234+sumit-tft@users.noreply.github.com>
2025-05-28 10:36:54 +02:00
Pablo Lara
94ea5ef546 docs: fix conflict changelog (#7849) 2025-05-27 12:11:17 +02:00
Prowler Bot
ad662fe245 fix(ui): updated to use the correct message when download report clicked (#7847)
Co-authored-by: sumit-tft <70506234+sumit-tft@users.noreply.github.com>
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-27 11:12:22 +02:00
Prowler Bot
9ea6043006 fix(reports): change invalid search term for tasks (#7835)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-05-27 10:19:03 +02:00
Prowler Bot
532cd973d1 feat(app): split SDK App service calls (#7846)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-05-27 10:15:41 +02:00
Prowler Bot
b15cd6cc2c fix(vpc): change the ServiceName from EC2 to VPC (#7841)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-05-26 19:55:52 +02:00
Prowler Bot
f8c05c8741 fix(m365powershell): add sanitize to test_credentials (#7818)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-23 13:43:45 +02:00
Prowler Bot
60f36226c8 chore(release): Bump version to v5.7.2 (#7812)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-23 11:53:47 +02:00
Prowler Bot
830c4fd706 docs: update changelog UI (#7809)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-21 12:52:50 +02:00
Prowler Bot
00a349dadd fix: retrieve more than 10 providers (#7806)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-21 12:44:50 +02:00
Prowler Bot
0929fff027 chore: tweak some wording for consistency (#7807)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-21 12:40:53 +02:00
Prowler Bot
ec7368f1bc fix: AWS IAM role validation when field is empty (#7805)
Co-authored-by: sumit-tft <70506234+sumit-tft@users.noreply.github.com>
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-21 12:35:45 +02:00
Prowler Bot
1ad6878575 feat(findings): Add new index for finding UID lookup (#7802)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-05-21 12:05:11 +02:00
Prowler Bot
8b0a62b42c fix(dashboard): remove typo from subscribe cards (#7798)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-05-21 11:23:21 +02:00
Prowler Bot
d9c8b1ea66 chore(release): Bump version to v5.7.1 (#7789)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-20 22:27:32 +05:45
Pedro Martín
ec7a95ebba feat(export): support m365 - prowler threatscore (#7783) 2025-05-19 16:08:35 +02:00
Víctor Fernández Poyatos
99dec659d6 fix(providers): Fix m365 UID validation (#7781) 2025-05-19 13:35:23 +02:00
Adrián Jesús Peña Rodríguez
1b7630cbe3 chore: update api changelog (#7775) 2025-05-19 11:08:15 +02:00
Pepe Fagoaga
740ab266fe chore(api): Use Prowler from v5.7 2025-05-19 11:05:54 +02:00
Pablo Lara
a3b606fc71 docs: update changelog (#7773) 2025-05-19 10:49:58 +02:00
Hugo Pereira Brito
f6bb6efbf1 chore(m365): accept all tenant domains in authentication (#7746) 2025-05-19 10:47:40 +02:00
Pedro Martín
91b1feffcb fix(cis): rename and add sections and subsections (#7738) 2025-05-19 10:47:40 +02:00
Pedro Martín
da5e11e08e feat(aws): add CIS 5.0 compliance framework (#7766) 2025-05-19 10:47:36 +02:00
Pedro Martín
3ca8aacbfa docs(checks): improve docs related with checks (#7768) 2025-05-19 10:47:03 +02:00
Hugo Pereira Brito
f88d45535c feat(repository): add new check repository_branch_delete_on_merge_enabled (#6209)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:47:03 +02:00
Hugo Pereira Brito
6d6864d9c5 feat(repository): add new check repository_default_branch_requires_conversation_resolution (#6208)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:47:03 +02:00
Víctor Fernández Poyatos
03ec12a2ca fix(findings): Fix latest metadata backfill condition and optimization (#7765) 2025-05-19 10:47:03 +02:00
Víctor Fernández Poyatos
0fc4e23b81 fix(findings): Fix latest metadata backfill condition (#7762) 2025-05-19 10:47:03 +02:00
sumit-tft
45780de281 fix(ui): Removed the alias if not available in findings detail page (#7751) 2025-05-19 10:47:03 +02:00
sumit-tft
ab80f3be18 fix: Updated the high risk section provider icons to make it consistent (#7706) 2025-05-19 10:47:03 +02:00
Hugo Pereira Brito
98da709202 feat(repository): add new check repository_default_branch_protection_applies_to_admins (#6205)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:47:03 +02:00
Pablo Lara
6cb1acd93d feat: use getFindingsLatest when no scan or date filters are applied (#7756) 2025-05-19 10:47:03 +02:00
Víctor Fernández Poyatos
d4e5f37894 feat(findings): Add /findings/latest and /findings/metadata/latest endpoints (#7743) 2025-05-19 10:47:03 +02:00
Hugo Pereira Brito
e0b1036f22 feat(repository): add new check repository_default_branch_status_checks_required (#6204)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:47:03 +02:00
Hugo Pereira Brito
d2dde4fbbd fix(check): add missing __init__.py files (#7748) 2025-05-19 10:46:58 +02:00
Hugo Pereira Brito
5572c675d8 feat(repository): add new check repository_default_branch_deletion_disabled (#6200)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
ace0dab20d feat(repository): add new check repository_default_branch_disallows_force_push (#6197)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:40 +02:00
Pablo Lara
8ef42bd1d6 fix: force z-index component select provider (#7744)
Co-authored-by: StylusFrost <pm.diaz.pena@gmail.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
75a5496459 feat(repository): add new check repository_default_branch_requires_linear_history (#6162)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
2b4bb7f805 feat(repository): add new check repository_default_branch_protection_enabled (#6161)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
f1c165a89d feat(organization): add new check organization_members_mfa_required (#6304)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:40 +02:00
Pablo Lara
a0e72eab0b fix: UID Filter Improvement (#7741)
Co-authored-by: sumit_chaturvedi <chaturvedi.sumit@tftus.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
e539b1ab4d feat: add GitHub provider documentation and CIS v1.0.0 compliance (#6116)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
ddc7fe649d feat(github): add new service Organization (#6300)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:40 +02:00
Hugo Pereira Brito
d80ddc5107 feat(github): add new check repository_code_changes_multi_approval_requirement (#6160)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:40 +02:00
Adrián Jesús Peña Rodríguez
04d3138a22 fix: ensure proper folder creation (#7729) 2025-05-19 10:46:40 +02:00
Pepe Fagoaga
caaadbbc26 feat(ui): Add AWS CloudFormation Quick Link to deploy the IAM Role (#7735) 2025-05-19 10:46:40 +02:00
César Arroba
6a104141f3 chore: add ref on checkout step (#7740) 2025-05-19 10:46:39 +02:00
Hugo Pereira Brito
7399815aa4 feat(github): add GitHub provider (#5787)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:46:39 +02:00
Pablo Lara
94e5902553 docs: update changelog (#7731) 2025-05-19 10:46:39 +02:00
Sergio Garcia
a0b1838bd2 fix(deps): solve h11 package vulnerability (#7696) 2025-05-19 10:46:39 +02:00
sumit-tft
1300cf0ed2 fix: Added filter to get connected providers only for banner to show (#7723) 2025-05-19 10:46:35 +02:00
dependabot[bot]
c2cd5bcb30 chore(deps): bump h11 from 0.14.0 to 0.16.0 in /api (#7610)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-19 10:46:10 +02:00
Pablo Lara
eb304023b7 chore: bump tailwind-merge from 2.5.3 to 3.2.0 (#7722) 2025-05-19 10:45:47 +02:00
Pablo Lara
f6d78770e5 chore: add M365 to scan page filters (#7704) 2025-05-19 10:45:47 +02:00
Pablo Lara
7e3ee14741 chore(deps): upgrade recharts from 2.13.0-alpha.4 to 2.15.2 (#7717) 2025-05-19 10:45:47 +02:00
Sergio Garcia
cfa3ba2c94 chore(docs): quality redrive to README.md (#7616)
Co-authored-by: dcanotrad <168282715+dcanotrad@users.noreply.github.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-05-19 10:45:47 +02:00
Alejandro Bailo
fc069d9eb1 feat: scan label validation (#7693) 2025-05-19 10:45:47 +02:00
Hugo Pereira Brito
071030c00d chore(findings): enhance m365 authentication method information (#7681) 2025-05-19 10:45:46 +02:00
Víctor Fernández Poyatos
e2c93a0ba8 feat(findings): Improve performance on /findings/metadata, /overviews and filters (#7690) 2025-05-19 10:45:46 +02:00
Hugo Pereira Brito
deed6c0b5e chore(compliance): update CIS 4.0 for M365 (#7699)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:45:41 +02:00
Pedro Martín
fef5d3d0e4 docs(compliance): update compliance page with latest changes (#7694)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-19 10:44:59 +02:00
Prowler Bot
52ba89ebdd chore(regions_update): Changes in regions for AWS services (#7709)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-19 10:44:02 +02:00
Pepe Fagoaga
4cdbc551d5 chore(api): Set tab name for API reference (#7713) 2025-05-19 10:44:02 +02:00
Andoni Alonso
7f582c8098 fix(typo): rename generate_compliance_json_from_csv_threatscore (#7698) 2025-05-19 10:44:02 +02:00
Pedro Martín
f4c797e9d4 feat(m365): add Prowler Threatscore (#7692)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-19 10:44:02 +02:00
Sergio Garcia
b3b88ebd68 feat(kubernetes): allow setting cluster name in in-cluster mode (#7695) 2025-05-19 10:44:02 +02:00
César Arroba
a01ff0f7cc chore: add pass PR url (#7711) 2025-05-19 10:44:02 +02:00
Alejandro Bailo
804c2cf058 feat: Horizontal bar chart (#7680) 2025-05-19 10:44:02 +02:00
Adrián Jesús Peña Rodríguez
9354cfac9e docs: update the download export documentation (#7682) 2025-05-19 10:44:02 +02:00
Prowler Bot
13b3cf8fee chore(release): Bump version to v5.7.0 (#7697)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-19 10:43:54 +02:00
Alejandro Bailo
98b46a94d3 feat: accordion component (#7700) 2025-05-19 10:40:56 +02:00
Prowler Bot
575bb8cd2c fix: move ProviderType to shared types and update usages (#7714)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-05-19 14:11:52 +05:45
Prowler Bot
fcc25451d8 chore(ec2): improve severity logic in SG all ports open check (#7769)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-16 16:06:51 +02:00
Prowler Bot
6c04592e7e fix(check): Add support for condition with restriction on SNS endpoint (#7757)
Co-authored-by: Ogonna Iwunze <1915636+wunzeco@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-15 16:53:12 +02:00
Prowler Bot
73f9811f42 fix(check): add missing __init__.py files (#7754)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-05-15 15:01:42 +02:00
Prowler Bot
c89673a01e fix(deps): solve h11 package vulnerability (#7730)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-14 10:21:01 +02:00
Prowler Bot
2c7b10d71a fix: Added filter to get connected providers only for banner to show (#7726)
Co-authored-by: sumit-tft <70506234+sumit-tft@users.noreply.github.com>
2025-05-13 13:02:03 +02:00
Prowler Bot
9ea500009b fix(bump-version): bump for fix also in minors (#7715)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-13 13:35:25 +05:45
Prowler Bot
5f92e33a54 fix(defender): enhance policies checks logic (#7719)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-12 19:12:11 +02:00
Prowler Bot
d6b5d8a919 chore(compliance): update CIS 4.0 for M365 (#7716)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-12 13:10:07 +02:00
Prowler Bot
b3d5f7b848 fix(m365): invalid user credentials exception (#7707)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-12 12:46:02 +02:00
Pepe Fagoaga
a96aa9a3f6 chore(release): Bump version to v5.6.1 (#7701) 2025-05-12 14:41:03 +05:45
Pepe Fagoaga
ff3cd0f51b fix(sdk): Set v5.6.0 in config 2025-05-08 14:18:51 +02:00
Pepe Fagoaga
c208948521 chore(deps): v5.6.0 (#7689) 2025-05-08 18:00:44 +05:45
Pepe Fagoaga
f1d5f73d40 chore(changelog): prepare for v5.6.0 (#7688) 2025-05-08 13:11:22 +02:00
Pedro Martín
1cc09b81f9 fix(prowler_threatscore): fine-tune LevelOfRisk (#7667) 2025-05-08 11:39:14 +02:00
Pedro Martín
56ef1a4f87 fix(dashboard): drop duplicates for rows (#7686) 2025-05-08 11:36:25 +02:00
Sergio Garcia
0f75c2a24f fix(mutelist): properly handle wildcards and regex (#7685) 2025-05-08 11:36:25 +02:00
Pedro Martín
b7c317bf23 fix(dashboard): remove muted findings on compliance page (#7683) 2025-05-08 11:36:25 +02:00
Adrián Jesús Peña Rodríguez
54aa1a4507 feat: add compliance to API report files and its endpoint (#7653)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-05-08 11:36:13 +02:00
Hugo Pereira Brito
0d4bd8c0a0 fix(metadata): typo in defender_chat_report_policy_configured (#7678) 2025-05-08 11:33:56 +02:00
Alejandro Bailo
1c040b8e41 feat: add DeltaIndicator in new findings (#7676) 2025-05-08 11:33:56 +02:00
Daniel Barranquero
c3b36720dc feat(docs): add snapshots to M365 docs (#7673) 2025-05-08 11:33:56 +02:00
Hugo Pereira Brito
af6b0833a1 fix(powershell): remove platform-specific execution (#7675) 2025-05-08 11:33:56 +02:00
Alejandro Bailo
8817d08a92 refactor(finding-detail): remove "Next Scan" field (#7674) 2025-05-08 11:33:56 +02:00
Pablo Lara
78d9508862 docs: update changelog (#7672) 2025-05-08 11:33:56 +02:00
Alejandro Bailo
dc543b2c89 feat: diff between providers actions depending on their secrets (#7669) 2025-05-08 11:33:56 +02:00
Sergio Garcia
0025c99fb9 chore(actions): run tests in dependabot updates (#7671) 2025-05-08 11:33:56 +02:00
Pedro Martín
d3f12075e9 feat(aws): add static credentials for S3 and SH (#7322) 2025-05-08 11:33:56 +02:00
Pablo Lara
513bf6bca7 chore: tweaks for m365 provider (#7668) 2025-05-08 11:33:56 +02:00
Alejandro Bailo
7459fe9556 feat: add delta attribute in findings detail view and finding id to the url (#7654)
Pablo Lara
2b06f0115e feat(compliance): add a button to download the report in compliance card (#7665) 2025-05-08 11:33:56 +02:00
Andoni Alonso
2e134815d3 feat(teams): add new checks teams_security_reporting_enabled and defender_chat_report_policy_configured (#7614)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-05-08 11:33:56 +02:00
Daniel Barranquero
118a1c1138 feat(defender): add new check defender_malware_policy_comprehensive_attachments_filter_applied (#7661) 2025-05-08 11:33:55 +02:00
Daniel Barranquero
1b35a72915 feat(exchange): make exchange_user_mailbox_auditing_enabled check configurable (#7662)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:33:55 +02:00
Hugo Pereira Brito
2d7b110b1b feat(m365): ensure all forms of mail forwarding are blocked or disabled (#7658)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
7c00c949e6 docs(m365): add documentation for m365 (#7622)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:33:55 +02:00
Pedro Martín
8fcbcda15c chore(changelog): update with latest PR (#7628)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:33:55 +02:00
Pedro Martín
fb33506f4a feat(dashboard): support m365 provider (#7633)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:33:55 +02:00
dependabot[bot]
e43b572d5e chore(deps): bump docker/build-push-action from 6.15.0 to 6.16.0 (#7650)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Prowler Bot
ff22e13e24 chore(regions_update): Changes in regions for AWS services (#7657)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
dependabot[bot]
c4946a8938 chore(deps): bump github/codeql-action from 3.28.15 to 3.28.16 (#7649)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
dependabot[bot]
0e51ee9c7c chore(deps): bump trufflesecurity/trufflehog from 3.88.23 to 3.88.26 (#7648)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
dependabot[bot]
22a5776ec0 chore(deps): bump actions/setup-python from 5.5.0 to 5.6.0 (#7647)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
sumit-tft
108983e959 feat(ui): Page size for datatables (#7634) 2025-05-08 11:33:55 +02:00
Alejandro Bailo
635b3f1978 fix: error about page number persistence when filters change (#7655) 2025-05-08 11:33:55 +02:00
Andoni Alonso
dc3d5149e9 chore(sentry): attach stacktrace to logging events (#7598)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
cffa560c9d feat(exchange): add new check exchange_organization_modern_authentication_enabled (#7636)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
80c8cb9b6c feat(exchange): add new check exchange_roles_assignment_policy_addins_disabled (#7644)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
dbffcedc49 feat(exchange): add new check exchange_mailbox_properties_auditing_e3_enabled (#7642)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
5d4191a7fc feat(exchange): add new check exchange_transport_config_smtp_auth_disabled (#7640)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Daniel Barranquero
ab6d05637d feat(exchange): add new check exchange_organization_mailtips_enabled (#7637)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:55 +02:00
Adrián Jesús Peña Rodríguez
75b3c02811 feat: add m365 to API (#7563)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:33:43 +02:00
Hugo Pereira Brito
e25ff209b3 feat(m365): automate PowerShell modules installation (#7618)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-05-08 11:30:40 +02:00
Pablo Lara
81cf5303a1 fix: set correct default value for session duration (#7639) 2025-05-08 11:30:40 +02:00
Víctor Fernández Poyatos
087ac5b53a test(performance): Add base framework for API performance tests (#7632) 2025-05-08 11:30:40 +02:00
Daniel Barranquero
fb429c9e23 feat(exchange): add new check exchange_mailbox_policy_additional_storage_restricted (#7638)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:30:40 +02:00
Pedro Martín
284cd66ed6 feat(sharepoint): add new check related with OneDrive Sync (#7589)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-08 11:30:40 +02:00
Pedro Martín
3ae2c4a225 fix(typos): remove unneeded files (#7627) 2025-05-08 11:30:40 +02:00
Erlend Ekern
2d270ace7f chore(dockerfile): add image source as docker label (#7617) 2025-05-08 11:30:40 +02:00
Pedro Martín
aaeb71a563 feat(compliance): add new Prowler Threat Score Compliance Framework (#7603)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
dependabot[bot]
737abe83c8 chore(deps): bump @babel/runtime from 7.24.7 to 7.27.0 in /ui (#7502)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:30:39 +02:00
Andoni Alonso
a78c5499c9 feat(teams): add new check teams_meeting_presenters_restricted (#7613)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Andoni Alonso
e50d779e34 feat(teams): add new check teams_meeting_recording_disabled (#7607)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Andoni Alonso
be2965d274 feat(teams): add new check teams_meeting_external_chat_disabled (#7605)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Andoni Alonso
0bdaeff745 feat(teams): add new check teams_meeting_external_control_disabled (#7604)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Hugo Pereira Brito
ad25a8fe82 fix(powershell): handle m365 provider execution and logging (#7602)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Hugo Pereira Brito
aebc89c17c feat(teams): add new check teams_meeting_chat_anonymous_users_disabled (#7579)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:30:39 +02:00
Pablo Lara
8fac7ad44d feat: add new M365 to the provider overview table (#7615) 2025-05-08 11:30:39 +02:00
dependabot[bot]
b0f5d6718f chore(deps): bump h11 from 0.14.0 to 0.16.0 (#7609)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:30:35 +02:00
Hugo Pereira Brito
83dfa8ae45 feat(teams): add new check teams_meeting_dial_in_lobby_bypass_disabled (#7571)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
d929e293b5 feat(teams): add new check teams_meeting_external_lobby_bypass_disabled (#7568)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Pepe Fagoaga
87c4361559 chore(actions): Bump Prowler version on release (#7560) 2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
de36bddccf chore(m365): add test_connection function (#7541)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
97dac23d39 feat(exchange): add new check exchange_external_email_tagging_enabled (#7580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
64082b5038 feat(exchange): add new check exchange_transport_rules_whitelist_disabled (#7569)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
2f6e83ad0c feat(defender): Add new check defender_antispam_policy_inbound_no_allowed_domains (#7500)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
24e0a175b5 feat(teams): add new check teams_meeting_anonymous_user_start_disabled (#7567) 2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
7c1ae956d5 fix(docs): overview m365 auth (#7588) 2025-05-08 11:28:19 +02:00
Pablo Lara
896c466889 chore: remove deprecated launch scan page from old 4-step workflow (#7592) 2025-05-08 11:28:19 +02:00
Pablo Lara
e2fd3f14ed feat(m365): add the new provider m365 - UI part (#7591) 2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
8239e2cd09 feat(teams): add new check teams_meeting_anonymous_user_join_disabled (#7565)
Co-authored-by: Andoni A <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
9e21348cdd feat(teams): add new check teams_external_users_cannot_start_conversations (#7562)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
62672a98f2 feat(teams): add new check teams_unmanaged_communication_disabled (#7561)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
08f52ee668 feat(teams): add new check teams_external_domains_restricted (#7557)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Hugo Pereira Brito
253847e3cd fix(teams): teams_email_sending_to_channel_disabled docstrings (#7559) 2025-05-08 11:28:19 +02:00
Daniel Barranquero
8449728df1 feat(defender): add new check defender_antispam_connection_filter_policy_safe_list_off (#7494)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
ba9d0cd9c2 feat(defender): add new check defender_antispam_connection_filter_policy_empty_ip_allowlist (#7492)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
1981173e75 feat(defender): add new check defender_domain_dkim_enabled (#7485)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Daniel Barranquero
1f250dccb7 feat(defender): add new check defender_antispam_outbound_policy_configured (#7480)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:19 +02:00
Prowler Bot
4a8f6070d3 chore(regions_update): Changes in regions for AWS services (#7550)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-08 11:28:19 +02:00
César Arroba
87f89e68eb chore: pass labels on PR merge trigger (#7558) 2025-05-08 11:28:19 +02:00
César Arroba
87cb33ba57 chore: revert pass labels (#7556) 2025-05-08 11:28:19 +02:00
César Arroba
83cb2ed297 chore: pass labels as json is required (#7555) 2025-05-08 11:28:18 +02:00
César Arroba
57ff41db3e chore: fix merged PR action, incorrect order on payload (#7554) 2025-05-08 11:28:18 +02:00
César Arroba
3ad376fe07 chore: pass labels (#7553) 2025-05-08 11:28:18 +02:00
César Arroba
9f76c47c85 chore: fix json body (#7552) 2025-05-08 11:28:18 +02:00
César Arroba
c6c46b0f23 chore: fix trigger (#7551) 2025-05-08 11:28:18 +02:00
César Arroba
acb98372c7 chore(gha): trigger cloud pull-request when a PR is merged (#7212) 2025-05-08 11:28:18 +02:00
Daniel Barranquero
1998054680 feat(defender): add new check defender_antiphishing_policy_configured (#7453) 2025-05-08 11:28:18 +02:00
Daniel Barranquero
bb94ede69f feat(defender): add new check defender_malware_policy_notifications_internal_users_malware_enabled (#7435)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Daniel Barranquero
676133c14d feat(defender): add service and new check defender_malware_policy_common_attachments_filter_enabled (#7425)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Daniel Barranquero
00e33d39bb feat(exchange): add new check exchange_mailbox_audit_bypass_disabled (#7418)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Daniel Barranquero
311c9a41ff feat(exchange): add service and new check exchange_organization_mailbox_auditing_enabled (#7408)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Hugo Pereira Brito
86b6732013 feat(teams): add new check teams_email_sending_to_channel_disabled (#7533)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Sergio Garcia
f851a90cb0 feat(gcp): support CLOUDSDK_AUTH_ACCESS_TOKEN (#7495) 2025-05-08 11:28:18 +02:00
Sergio Garcia
c8983440f1 chore(regions): change interval to weekly (#7539) 2025-05-08 11:28:18 +02:00
Prowler Bot
ca21d8ceae chore(regions_update): Changes in regions for AWS services (#7538)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-08 11:28:18 +02:00
Sergio Garcia
c07a531c3b chore(dependabot): change settings (#7536) 2025-05-08 11:28:18 +02:00
Hugo Pereira Brito
b0e3511351 feat: adapt Microsoft365 provider to use PowerShell (#7331)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-05-08 11:28:18 +02:00
Bogdan A
1c3d5b5f69 docs(gcp): update required permissions for GCP (#7488) 2025-05-08 11:28:18 +02:00
dependabot[bot]
e38a2f47b3 chore(deps): bump python from 3.12.9-alpine3.20 to 3.12.10-alpine3.20 (#7520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:28:18 +02:00
dependabot[bot]
b4dab02f5a chore(deps): bump codecov/codecov-action from 5.4.0 to 5.4.2 (#7522)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:28:18 +02:00
dependabot[bot]
a48fafc277 chore(deps): bump actions/setup-node from 4.3.0 to 4.4.0 (#7521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-08 11:28:18 +02:00
Prowler Bot
109c23ba69 chore(regions_update): Changes in regions for AWS services (#7527)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-05-08 11:28:18 +02:00
Pepe Fagoaga
2fe98ae6ce chore(action): Remove cache in PyPI release (#7532) 2025-05-08 11:28:06 +02:00
Prowler Bot
d56744844a fix(inspector2): handle error when getting active findings (#7679)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-05-07 09:30:05 -04:00
Prowler Bot
cd21c6980b fix(run-sh): Use poetry's env (#7626)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-06 10:55:20 -04:00
Prowler Bot
ddc0c84fb2 fix(azure): CIS v2.0 4.4.1 Uses Wrong Check (#7660)
Co-authored-by: drewadwade <32397380+drewadwade@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-05-05 10:33:39 -04:00
Prowler Bot
26040f04ad fix(s3): add ContentType in upload_file (#7643)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-05-05 10:23:33 -04:00
Prowler Bot
bf7b2d7c8f fix(compliance): improve compliance and dashboard (#7611)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-24 13:37:46 -04:00
Prowler Bot
5df1e163ee fix(html): remove first empty line (#7608)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-24 11:27:31 -04:00
Prowler Bot
a1326022e5 fix(nhn): remove unneeded parameter (#7601)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-24 09:23:25 -04:00
Prowler Bot
4f29743604 fix(scan): handle cloud provider errors and ignore expected sentry noise (#7593)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-23 10:51:55 -04:00
Prowler Bot
ff9c992bf8 fix(aws): use correct ports in ec2_instance_port_cifs_exposed_to_internet recommendation (#7581)
Co-authored-by: Matt Keeler <19890779+mattkeeler@users.noreply.github.com>
2025-04-22 13:17:47 -04:00
Prowler Bot
8eb9d17006 fix(aws): update bucket naming validation to accept dots (#7576)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 11:32:01 -04:00
Prowler Bot
7a6ed613d5 fix(azure): handle new FlowLog properties (#7572)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 10:42:51 -04:00
Prowler Bot
98e3c4a105 fix(k8s): Remove command as it is not needed (#7573)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-22 10:36:23 -04:00
Prowler Bot
010457ef09 fix(actions): Include files within providers for SDK tests (#7578)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-22 10:32:38 -04:00
Prowler Bot
5d5e4530b3 chore(tests): Split by provider in the SDK (#7575)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-22 10:02:56 -04:00
Prowler Bot
fe5694fd39 fix(aws): remove SHA-1 from ACM insecure key algorithms (#7548)
Co-authored-by: Felix Dreissig <f30@f30.me>
2025-04-18 16:38:06 -05:00
Prowler Bot
7c614ef160 fix(iam): change some logger.info values (#7537)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-15 15:49:12 -04:00
Prowler Bot
36df42cf64 fix(pypi): package name location in pyproject.toml while replicating for prowler-cloud (#7534)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-15 13:21:56 -04:00
Prowler Bot
09be60948d revert: fix(findings): increase uid max length to 600 (#7529)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-15 16:02:43 +05:45
Prowler Bot
a8b102794c chore(changelog): prepare for 5.5.1 (#7524)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-15 11:59:04 +05:45
Prowler Bot
01ed65fcf8 fix(pyproject): Restore packages location (#7511)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-14 17:47:53 -04:00
Pepe Fagoaga
d7b024b460 chore(release): bump for 5.5.1 (#7504)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-14 16:51:34 -04:00
Prowler Bot
45f1b4aeda fix(gcp): handle projects without ID (#7506)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-14 14:08:07 -04:00
Prowler Bot
1e79112ea0 fix(defender): add default name to contacts (#7505)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-14 11:58:18 -04:00
Prowler Bot
7a27a8ddf5 fix(findings): increase uid max length to 600 (#7501)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-14 19:03:08 +05:45
Pablo Lara
d8e575e6dc fix: update redirect URL for SSO (#7493) 2025-04-11 14:42:30 +02:00
Pablo Lara
f7ed5bd365 fix: resolve social login issue in AuthForm on sign-up page (#7490) 2025-04-11 10:02:14 +02:00
dependabot[bot]
1c5348d846 chore(deps): bump tj-actions/changed-files from 46.0.4 to 46.0.5 (#7486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-11 10:02:08 +02:00
Pepe Fagoaga
2bde25a471 fix(api): poetry.lock 2025-04-10 12:32:59 +02:00
Pepe Fagoaga
c1b70ed0fc fix(api): build Prowler from v5 branch 2025-04-09 17:20:47 +02:00
Pepe Fagoaga
575ef41d4f fix(api): 1.6.0 in pyproject 2025-04-09 17:15:30 +02:00
Pepe Fagoaga
8f19bfda0d chore(changelog): Prepare for v5.5.0 (#7484) 2025-04-09 17:07:13 +02:00
Sergio Garcia
fefdce129b fix: handle errors in AWS and Azure (#7482) 2025-04-09 16:35:20 +02:00
Pepe Fagoaga
f85abf90fb fix: conflict in poetry.lock 2025-04-09 16:17:29 +02:00
Pedro Martín
855cc17a23 fix(aws): add default session_duration (#7479) 2025-04-09 16:14:23 +02:00
eeche
d489c80857 feat(NHN): add NHN cloud provider with 6 checks (#6870)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:14:22 +02:00
Prowler Bot
b81e12f697 chore(regions_update): Changes in regions for AWS services (#7478)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Pablo Lara
c02b9073d1 fix: fix TS type for session duration (#7481) 2025-04-09 16:14:22 +02:00
Pedro Martín
2ce4d72111 feat(gcp): add SOC2 compliance framework (#7476) 2025-04-09 16:14:22 +02:00
Drew Kerrigan
eec0b45b32 fix(ui): Remove UTC from timestamps in app (#7474) 2025-04-09 16:14:22 +02:00
Pablo Lara
f4746a1b09 feat: update the NextJS version to the latest (#7473) 2025-04-09 16:14:22 +02:00
Prowler Bot
d4c23efee3 chore(regions_update): Changes in regions for AWS services (#7467)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
dependabot[bot]
6d7711ca1c chore(deps): bump github/codeql-action from 3.28.13 to 3.28.15 (#7463)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Pepe Fagoaga
a7f733522c fix(action): Use poetry > v2 (#7472) 2025-04-09 16:14:22 +02:00
Pablo Lara
e2dfc1f383 feat: add link with the service status using static icon (#7468) 2025-04-09 16:14:22 +02:00
dependabot[bot]
163ae5d79d chore(deps): bump tj-actions/changed-files from 46.0.3 to 46.0.4 (#7443)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:14:22 +02:00
Prowler Bot
ff8c33ecf8 chore(regions_update): Changes in regions for AWS services (#7446)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:14:18 +02:00
dependabot[bot]
cf4e8e940d chore(deps): bump trufflesecurity/trufflehog from 3.88.22 to 3.88.23 (#7444)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Prowler Bot
763e88edb5 chore(regions_update): Changes in regions for AWS services (#7445)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Pablo Lara
c301fbd8b3 refactor: extract common auth headers into reusable helper (#7439) 2025-04-09 16:13:36 +02:00
Sergio Garcia
5121015586 fix(docs): solve broken links (#7432) 2025-04-09 16:13:36 +02:00
Adrián Jesús Peña Rodríguez
b750b6492d feat: add missing SDK fields to API findings and resources (#7318) 2025-04-09 16:13:36 +02:00
Prowler Bot
da656b5086 chore(regions_update): Changes in regions for AWS services (#7434)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
dependabot[bot]
c82437c32c chore(deps): bump trufflesecurity/trufflehog from 3.88.20 to 3.88.22 (#7433)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:13:36 +02:00
Pedro Martín
22c4216ed6 docs: add onboarding information step by step for each provider (#7362) 2025-04-09 16:13:36 +02:00
Pablo Lara
07d0f72239 fix: correct fetch variable name from invitations to roles (#7437) 2025-04-09 16:13:36 +02:00
Prowler Bot
6d14b5619c chore(regions_update): Changes in regions for AWS services (#7424)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:12:19 +02:00
Prowler Bot
f12c7000bb chore(regions_update): Changes in regions for AWS services (#7417)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:12:19 +02:00
dependabot[bot]
ec06fd0bf4 chore(deps): bump azure-identity from 1.19.0 to 1.21.0 (#7192)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:12:19 +02:00
Daniel Barranquero
92911d2c0b feat(entra): add new check entra_admin_users_cloud_only (#7286) 2025-04-09 16:12:19 +02:00
dependabot[bot]
985bfc1618 chore(deps): bump azure-mgmt-applicationinsights from 4.0.0 to 4.1.0 (#7161)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:12:03 +02:00
dependabot[bot]
d34c1556a1 chore(deps): bump azure-mgmt-containerregistry from 10.3.0 to 12.0.0 (#7025)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:11:44 +02:00
Pedro Martín
c0d4f43310 docs(python): add annotations about Python version (#7402) 2025-04-09 16:10:47 +02:00
Bogdan A
0b2dac83bd feat(gcp): add check for dormant (unused) SA keys (#7348)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
b3d7bb4e8d feat(entra): add new check entra_legacy_authentication_blocked (#7240) 2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
71f8bcefa8 feat(entra): add new check entra_users_mfa_enabled (#7228) 2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
a11b338994 feat(entra): add new check entra_admin_users_phishing_resistant_mfa_enabled (#7211)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
c7c8fc90b2 fix(entra): check name and logic of entra_admin_users_have_mfa_enabled (#7230) 2025-04-09 16:10:47 +02:00
Daniel Barranquero
2bd32f6e8f feat(entra): add new check entra_policy_guest_invite_only_for_admin_roles (#7241)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
32c8cc723e chore(deps): bump azure-mgmt-resource from 23.2.0 to 23.3.0 (#7054)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:10:47 +02:00
Daniel Barranquero
c298541d8d feat(entra): add new check entra_policy_guest_users_access_restrictions (#7234) 2025-04-09 16:10:47 +02:00
Daniel Barranquero
dcf53ea357 feat(entra): add new check entra_policy_restricts_user_consent_for_apps (#7225) 2025-04-09 16:10:47 +02:00
Víctor Fernández Poyatos
3c9ae06086 feat(findings): Handle muted findings in API and UI (#7378)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-04-09 16:10:47 +02:00
Hugo Pereira Brito
735a8fbb95 feat(entra): add new check entra_managed_device_required_for_mfa_registration (#7203)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
7442eb398c chore(regions_update): Changes in regions for AWS services (#7395)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
509f185890 chore(deps): bump trufflesecurity/trufflehog from 3.88.18 to 3.88.20 (#7394)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
7712968eeb chore(regions_update): Changes in regions for AWS services (#7391)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
dependabot[bot]
fc3ee5534d chore(deps): bump actions/setup-python from 5.4.0 to 5.5.0 (#7390)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Prowler Bot
376ec4c73b chore(regions_update): Changes in regions for AWS services (#7382)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:47 +02:00
Pablo Lara
3db0a0ed6f chore: tweak for button see findings (#7369) 2025-04-09 16:10:46 +02:00
Pablo Lara
c3e5980eb0 chore(scans): properly enable link to findings when scan is completed (#7368) 2025-04-09 16:10:46 +02:00
dependabot[bot]
c830829b55 chore(deps): bump github/codeql-action from 3.28.12 to 3.28.13 (#7367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:46 +02:00
dependabot[bot]
3b21b7ecf0 chore(deps): bump tj-actions/changed-files from 46.0.1 to 46.0.3 (#7363)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:46 +02:00
Víctor Fernández Poyatos
a3810f3fda build(api): Force django-allauth==65.4.1 (#7358) 2025-04-09 16:10:46 +02:00
Pablo Lara
18fc405d15 docs: update readme (#7357) 2025-04-09 16:10:46 +02:00
Pablo Lara
aeb3bdc779 chore(findings): apply default filter to show failed findings (#7356) 2025-04-09 16:10:46 +02:00
Pablo Lara
b32f0997f5 docs(changelog): document addition of download column in scans table … (#7354) 2025-04-09 16:10:46 +02:00
Pablo Lara
a72d8d7c83 feat(scans): add download button column for completed scans in table (#7353) 2025-04-09 16:10:46 +02:00
Víctor Fernández Poyatos
a4f5566589 feat(compliance): Add endpoint to retrieve compliance overviews metadata (#7333) 2025-04-09 16:10:46 +02:00
Pablo Lara
bd839e1398 docs: update changelog with Next.js security patch (#7339) (#7341) 2025-04-09 16:10:46 +02:00
Prowler Bot
7e9b1630f0 chore(regions_update): Changes in regions for AWS services (#7219)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:29 +02:00
Prowler Bot
e2330acbff chore(regions_update): Changes in regions for AWS services (#7246)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:13 +02:00
Prowler Bot
0f15a20128 chore(regions_update): Changes in regions for AWS services (#7250)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:10:06 +02:00
Prowler Bot
237ec20513 chore(regions_update): Changes in regions for AWS services (#7334)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
497adbb4e3 fix(action): Use Poetry v2 (#7329) 2025-04-09 16:10:01 +02:00
Prowler Bot
1bb939c609 chore(regions_update): Changes in regions for AWS services (#7323)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
8ee73af0c0 chore(aws-regions): remove backport to v3 (#7319) 2025-04-09 16:10:01 +02:00
dependabot[bot]
6ead888399 chore(deps): bump github/codeql-action from 3.28.11 to 3.28.12 (#7321)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
c20f9edb0b chore(dependabot): disable for v3 (#7316) 2025-04-09 16:10:01 +02:00
Pablo Lara
747d62393e docs: add social login images and update documentation (#7314)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
e757bde7ca chore(dependabot): Disable for API and UI (#7300) 2025-04-09 16:10:01 +02:00
Pedro Martín
29ff19eead fix(k8s): remove typos from PCI 4.0 (#7294) 2025-04-09 16:10:01 +02:00
Pepe Fagoaga
c516773cff chore(social-login): improve copy when not enabled (#7295) 2025-04-09 16:10:01 +02:00
dependabot[bot]
e4a682e23e chore(deps): bump trufflesecurity/trufflehog from 3.88.17 to 3.88.18 (#7297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
d427455db5 chore(security): Configure HTTP Security Headers (#7220)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-04-09 16:10:01 +02:00
Pepe Fagoaga
f0cbcacbe4 chore(security): Add HTTP Security Headers (#7289) 2025-04-09 16:09:55 +02:00
Pablo Lara
df11afbcf0 fix: prevent SSR mismatch in OAuth URL generation (#7288) 2025-04-09 16:07:31 +02:00
dependabot[bot]
cc6db6b680 chore(deps): bump azure-mgmt-containerservice from 34.0.0 to 34.1.0 (#6989)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:07:31 +02:00
Pablo Lara
0eb51dc147 chore(providers): change wording when adding a new provider (#7280) 2025-04-09 16:07:31 +02:00
Pepe Fagoaga
aeb3dc9ac2 fix(aws-regions): Use @prowler-bot as author (#7285) 2025-04-09 16:07:31 +02:00
Pablo Lara
36c8240f2b chore: add env vars for social login (#7257)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-04-09 16:07:31 +02:00
Prowler Bot
a0852c397d chore(regions_update): Changes in regions for AWS services (#7281)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:07:31 +02:00
Pablo Lara
32f1d1a713 feat(providers): add component to render a link to the documentation (#7282) 2025-04-09 16:07:31 +02:00
dependabot[bot]
76be3bd4ae chore(deps): bump azure-mgmt-storage from 21.2.1 to 22.1.1 (#7098)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-04-09 16:07:31 +02:00
Adrián Jesús Peña Rodríguez
93b59e15e8 chore: add api reference to download report section (#7243) 2025-04-09 16:07:30 +02:00
Pablo Lara
f559b87018 chore: Rename keyServer and extract to helper (#7256) 2025-04-09 16:07:30 +02:00
Pedro Martín
84b50f2a3f fix(.env): remove spaces (#7255) 2025-04-09 16:07:30 +02:00
Pedro Martín
69c7dc339f fix(prowler): change from prowler.py to prowler-cli.py (#7253) 2025-04-09 16:07:30 +02:00
Pablo Lara
ea396a90e3 chore: update git ignore file (#7254) 2025-04-09 16:07:30 +02:00
Pedro Martín
d460509262 feat(jira): add basic auth method (#7233) 2025-04-09 16:07:30 +02:00
Pepe Fagoaga
4d2bb93a8c fix(backport): Use container tagged version (#7252) 2025-04-09 16:07:30 +02:00
Pepe Fagoaga
d936c3f934 chore(security): Pin actions to the Full-Length Commit SHA (#7249) 2025-04-09 16:07:30 +02:00
Pablo Lara
c1e36f0df7 chore: add env var for social login (#7251) 2025-04-09 16:07:30 +02:00
Prowler Bot
d1f04fefb0 chore(regions_update): Changes in regions for AWS services (#7237)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:07:27 +02:00
Prowler Bot
8703c55b5a chore(regions_update): Changes in regions for AWS services (#7245)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:06:57 +02:00
Pablo Lara
0af5c9c9c2 chore: improve UX when social login is not enabled (#7242) 2025-04-09 16:06:57 +02:00
Pablo Lara
57717467c7 chore(social-login): disable social login buttons when env vars are not set (#7238) 2025-04-09 16:06:57 +02:00
Pablo Lara
b59930803c chore(social-login): rename env.vars for social login (#7232) 2025-04-09 16:06:57 +02:00
Pablo Lara
835272b3ab chore: social auth is also in sign-up page (#7231) 2025-04-09 16:06:57 +02:00
Pablo Lara
6a67d8d93a chore: remove unused regions (#7229) 2025-04-09 16:06:57 +02:00
Pablo Lara
f87fd35c95 chore: change wording for launching a single scan (#7226) 2025-04-09 16:06:57 +02:00
Pablo Lara
7a00750e0a chore: update changelog (#7223) 2025-04-09 16:06:57 +02:00
Pablo Lara
412697c418 feat(social-login): social login with Google is working (#7218)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-09 16:06:57 +02:00
Víctor Fernández Poyatos
3cd4e4cbbf fix(migrations): add through parameter to integration.providers (#7222) 2025-04-09 16:06:57 +02:00
Pepe Fagoaga
e18cace3bb fix(pyproject): Rename prowler.py (#7217) 2025-04-09 16:06:57 +02:00
Víctor Fernández Poyatos
a1252e7afc feat(integrations): Added new endpoints to allow configuring integrations (#7167) 2025-04-09 16:06:34 +02:00
Daniel Barranquero
7bc9aa2424 feat(entra): add new check entra_admin_mfa_enabled_for_administrative_roles (#7181)
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Pedro Martín
c4f87f45d4 feat(kubernetes): add ISO 27001 2022 compliance framework (#7204) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
93f3a094fc feat(entra): add new check entra_identity_protection_sign_in_risk_enabled (#7171)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Andoni Alonso
881133d6b9 refactor(check): add docstrings and improve report handling (#7113) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
ba02a26cc1 feat(docs): add microsoft365 configurable checks (#7200) 2025-04-09 16:04:42 +02:00
Hugo Pereira Brito
bb1b81455c feat(entra): add new check entra_identity_protection_user_risk_enabled (#7126)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:04:42 +02:00
Pepe Fagoaga
3116fef89a chore(poetry): Upgrade to v2 (#7112) 2025-04-09 16:04:33 +02:00
Hugo Pereira Brito
f6dbf3ef5f feat(entra): add new check entra_managed_device_required_for_authentication (#7115)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Daniel Barranquero
35a9c03467 feat(entra): add new check entra_password_hash_sync_enabled (#7061) 2025-04-09 16:01:40 +02:00
dependabot[bot]
5fc95a879c chore(deps): bump google-api-python-client from 2.162.0 to 2.163.0 (#7191)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Prowler Bot
8ffa958c20 chore(regions_update): Changes in regions for AWS services (#7197)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Pablo Lara
476cc59ca1 chore: update changelog (#7199) 2025-04-09 16:01:40 +02:00
Pablo Lara
26390cdcd9 feat(invitations): Disable editing for accepted invites (#7198) 2025-04-09 16:01:40 +02:00
Pablo Lara
95b8390ddd chore(scans): rename type to trigger (#7196) 2025-04-09 16:01:40 +02:00
Pablo Lara
e9aebbd269 chore: auto refresh if the state is also available (#7195) 2025-04-09 16:01:40 +02:00
Pablo Lara
11ffbd86eb styles: tweaks styles (#7194) 2025-04-09 16:01:40 +02:00
Pablo Lara
9f32ff0c10 chore(launch-scan): update wording (#7193) 2025-04-09 16:01:40 +02:00
Pablo Lara
33046eb95d chore: update the changelog (#7190) 2025-04-09 16:01:40 +02:00
Hugo Pereira Brito
a77d6ebaa2 feat(microsoft365): add new check entra_admin_users_sign_in_frequency_enabled (#7020)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Pablo Lara
f1fa79b7ae feat(scans): allow running a scan once (#7188) 2025-04-09 16:01:40 +02:00
Adrián Jesús Peña Rodríguez
72874201c1 docs: add users, invitations and RBAC (#7109) 2025-04-09 16:01:40 +02:00
Daniel Barranquero
d979076814 feat(entra): add new check entra_dynamic_group_for_guests_created (#7168)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-09 16:01:40 +02:00
Daniel Barranquero
2bdc6fef26 chore(providers): enhance Remediation.Code.CLI field from check's metadata (#7094)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-04-09 16:01:40 +02:00
Pedro Martín
5bfcdbb0b9 feat(gcp): add ISO 27001 2022 compliance framework (#7185) 2025-04-09 16:01:39 +02:00
Pedro Martín
8812566c83 fix(azure): add remaining checks for reqA.5.25 (#7182) 2025-04-09 16:01:39 +02:00
Daniel Barranquero
fab7a107fa feat(entra): add new check entra_admin_consent_workflow_enabled (#7110) 2025-04-09 16:01:39 +02:00
Adrián Jesús Peña Rodríguez
cbd5197ddc docs: add generate_output documentation (#7122)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-09 16:01:39 +02:00
Hugo Pereira Brito
0b87b63cdd refactor(microsoft365): resource metadata assertions (#7169) 2025-04-09 16:01:39 +02:00
Pedro Martín
18d7ce3607 feat(azure): add ISO 27001 2022 compliance framework (#7170) 2025-04-09 16:01:39 +02:00
dependabot[bot]
8f20eb958d chore(deps): bump tzlocal from 5.3 to 5.3.1 (#7162)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
Prowler Bot
33ab90e119 chore(regions_update): Changes in regions for AWS services (#7177)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
dependabot[bot]
d4c59bf4fe chore(deps): bump trufflesecurity/trufflehog from 3.88.15 to 3.88.16 (#7174)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:39 +02:00
Harshit Raj Singh
5e45b30823 feat(aws): AWS Found Sec Best Practices & PCI DSS v3.2.1 upgrade (#7017)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-04-09 16:01:39 +02:00
Pablo Lara
77f15a8749 fix: tweak z-index for custom inputs (#7166) 2025-04-09 16:01:39 +02:00
Pablo Lara
953fc0590b feat(scans): improve scan launch provider selection (#7164) 2025-04-09 16:01:39 +02:00
dependabot[bot]
8ed6e497a3 chore(deps): bump django from 5.1.5 to 5.1.7 in /api (#7145)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 16:01:01 +02:00
dependabot[bot]
10c3a1e3ce chore(deps-dev): bump mock from 5.1.0 to 5.2.0 (#7099)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:58:53 +02:00
Kay Agahd
32dec0b235 fix(doc): event_time has been changed to time_dt but was not documented (#7136) 2025-04-09 15:58:53 +02:00
dependabot[bot]
bf50c5e7b0 chore(deps): bump jinja2 from 3.1.5 to 3.1.6 (#7151)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:58:53 +02:00
Prowler Bot
a660cf7024 chore(regions_update): Changes in regions for AWS services (#7108)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:58:49 +02:00
Prowler Bot
50851aec64 chore(regions_update): Changes in regions for AWS services (#7119)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:58:35 +02:00
dependabot[bot]
b931a81692 chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.3.0 to 1.4.1 (#7129)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:57:04 +02:00
Prowler Bot
7ce291d0d3 chore(regions_update): Changes in regions for AWS services (#7131)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:56:58 +02:00
dependabot[bot]
d257e6368e chore(deps): bump trufflesecurity/trufflehog from 3.88.14 to 3.88.15 (#7127)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-09 15:55:06 +02:00
César Arroba
9f43391521 chore: increase release to 5.5.0 (#7143) 2025-04-09 15:54:59 +02:00
Prowler Bot
02099b793d chore(regions_update): Changes in regions for AWS services (#7146)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-04-09 15:53:07 +02:00
Pablo Lara
56b760208e chore: update changelog (#7149) 2025-04-09 15:53:07 +02:00
Pablo Lara
042f138f56 feat: add changelog (#7141) 2025-04-09 15:53:07 +02:00
Pepe Fagoaga
f347080dbd chore(release): bump for 5.4.5 (#7475) 2025-04-08 13:14:51 -04:00
Prowler Bot
2cc8363697 fix: handle errors in AWS, Azure, and GCP (#7471)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-08 19:03:42 +05:45
Prowler Bot
154467e7c8 fix(provider): disable periodic task on views before deleting (#7470)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-08 15:50:39 +05:45
Prowler Bot
16c71f3c04 fix(soc2_aws): update compliance and remove some requirements (#7455)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-07 16:53:52 -04:00
Prowler Bot
849707166a fix(gcp): handle logic for empty project names (#7450)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 15:21:54 -04:00
Prowler Bot
59513d777c fix(aws): add resource arn for transit gateways (#7448)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 13:38:28 -04:00
Prowler Bot
f52929673d fix(gcp): ignore redirect balancers and add regional ones (#7449)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-07 12:43:45 -04:00
Prowler Bot
611681488d fix(defender): add default resource name in contacts (#7441)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-04 11:49:58 -04:00
Prowler Bot
9630f23585 fix(aws): solve multiple errors (#7440)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-04 10:47:08 -04:00
Prowler Bot
23cfd91708 fix(azure): remove resource_name inside the Check_Report (#7430)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-04-03 12:34:53 -04:00
Prowler Bot
0c44f28bf7 fix(gcp): make logging sink check at project level (#7428)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-03 11:34:03 -04:00
Pepe Fagoaga
09c529d810 chore(release): bump for 5.4.4 (#7427) 2025-04-03 10:19:07 -04:00
Prowler Bot
5be859d86c chore(deletion): Add environment variable for batch size (#7426)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-04-03 15:42:22 +05:45
Prowler Bot
5340008c8f chore(sentry): ignore exception when aws service not available in a region (#7398)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-04-03 13:01:14 +05:45
Prowler Bot
a5aa4c30d7 fix(scans): Handle duplicated scan tasks (#7409)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-04-03 13:00:58 +05:45
Prowler Bot
56d0c2fbea fix(resources): add the correct id and names for resources (#7414)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-01 22:24:33 +02:00
Prowler Bot
6b7ef199e0 fix(report): log as error when Resource ID or Name do not exist (#7412)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-04-01 21:23:16 +02:00
Prowler Bot
933c5063ee fix(redshift): validation error for Cluster.multi_az (#7400)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-31 14:51:49 +02:00
Prowler Bot
72a998a692 fix(rds): handle Certificate rds-ca-2019 not found (#7392)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-27 12:04:19 +01:00
Prowler Bot
a4d1e3bb69 fix(stepfunctions): NoneType object has no attribute level (#7389)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-27 11:09:17 +01:00
Prowler Bot
2a8b04cced fix(fms): resource metadata could not be converted to dict (#7388)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-27 09:13:16 +01:00
Prowler Bot
cfc02186d4 fix(vm): handle NoneType is not iterable for extensions (#7377)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-26 00:31:57 +01:00
Prowler Bot
7e432a3b69 fix(s3): handle None S3 account public access block (#7376)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 17:55:54 +01:00
Prowler Bot
fed8de314f fix(storagegateway): describe smb/nfs share per region (#7375)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 16:56:25 +01:00
Prowler Bot
ce23c4b5aa fix(network): handle NoneType is not iterable for security groups (#7372)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-25 15:53:55 +01:00
Prowler Bot
33e99ef628 fix(vm): handle NoneType accessing security_profile (#7373)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 14:43:49 +01:00
Prowler Bot
72761a6ef6 fix(iam): handle none SAML Providers (#7371)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-25 11:18:10 +01:00
Prowler Bot
2a4fdff827 fix(iam): handle UnboundLocalError cannot access local variable 'report' (#7370)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-25 10:20:49 +01:00
Pepe Fagoaga
8e264313bc chore(release): bump for 5.4.3 2025-03-24 18:08:59 +01:00
Víctor Fernández Poyatos
21654b0bc0 chore: bump API version (#7355) 2025-03-24 20:20:52 +05:45
Prowler Bot
7fa57ba3e2 ref(providers): Refactor provider deletion functions (#7351)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-03-24 19:43:39 +05:45
Prowler Bot
ea9b6bb191 chore(awslambda): update obsolete lambda runtimes (#7345)
Co-authored-by: Jonny <106528116+jonathanbro@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-24 13:18:05 +01:00
Prowler Bot
95acb7b0c8 chore(next): Remove x-powered-by header (#7347)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-24 16:44:50 +05:45
Prowler Bot
d23edfa337 chore: upgrade Next.js to 14.2.25 to fix auth middleware vulnerability (#7340)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-24 09:52:42 +01:00
Pepe Fagoaga
40678a5863 chore(release): bump for 5.4.2 (#7328) 2025-03-20 18:49:13 +01:00
Prowler Bot
23aded92a3 chore(api): Update CHANGELOG (#7327)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-20 15:24:16 +05:45
Prowler Bot
6e56d3862d fix(scan_id): Read the ID from the Scan object (#7326)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-20 15:22:57 +05:45
Prowler Bot
d95fccd163 fix(gcp): make provider id mandatory in test_connection (#7315)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-19 20:38:37 +05:45
Prowler Bot
7ddf860a55 fix: add a handled response in case local files are missing (#7227)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-19 11:58:25 +01:00
Prowler Bot
3f41c75a45 fix(route53): solve false positive in route53_public_hosted_zones_cloudwatch_logging_enabled (#7293)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-03-19 13:39:56 +05:45
Prowler Bot
04b6dbf639 fix(microsoft365): typo Microsoft365NotTenantIdButClientIdAndClienSecretError (#7258)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-03-19 13:38:08 +05:45
Prowler Bot
ff4d16deb5 fix(scan): add compliance info inside finding (#7247)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-19 13:37:16 +05:45
Prowler Bot
562921cd5e fix(test-connection): Handle provider without secret (#7290)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-19 13:36:38 +05:45
Prowler Bot
8f061e4fed fix(exports): change the way to remove the local export files after s3 upload (#7224)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-03-17 17:30:57 +05:45
Prowler Bot
3fb86d754a fix(cloudwatch): handle None metric alarms (#7207)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-12 16:18:44 +01:00
Prowler Bot
7874707310 chore(sentry): ignore new exceptions in Sentry (#7189)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-12 11:35:32 +01:00
Prowler Bot
1c934e37c7 chore(sentry): ignore expected errors in GCP API (#7186)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-11 17:27:07 +01:00
Prowler Bot
8459cff16d fix(ens): remove and change duplicated ids (#7180)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-03-11 12:46:31 +01:00
Prowler Bot
57ae096395 fix(azure): correct check title for SQL Server Unrestricted (#7160)
Co-authored-by: Gary Mclean <gary.mclean@krrv.io>
2025-03-07 19:22:35 +01:00
Prowler Bot
200185de25 fix(metadata): match type with check results (#7155)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-07 18:16:32 +01:00
Prowler Bot
f8447b0f79 fix(metadata): typo in ec2_securitygroup_allow_wide_open_public_ipv4 (#7158)
Co-authored-by: ryan-stavella <71134114+ryan-stavella@users.noreply.github.com>
2025-03-07 16:32:36 +01:00
Prowler Bot
19289bbe20 fix(aws): ecs_task_definitions_no_environment_secrets.metadata.json (#7153)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-03-07 15:27:59 +01:00
César Arroba
b5b371fa0c chore: increase release to 5.4.1 (#7144)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-07 14:20:28 +01:00
Prowler Bot
939a623cec fix: tweaks for compliance cards (#7148)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-07 11:40:55 +01:00
Víctor Fernández Poyatos
926f449ae6 fix(overviews): manage overview exceptions and use batch_size with bulk (#7140) 2025-03-06 15:39:51 +01:00
César Arroba
646668c6ae chore(ui-gha): delete double quotes on prowler version (#7139) 2025-03-06 15:39:40 +01:00
Prowler Bot
8e6b92792b fix(groups): display uid if alias is missing (#7138)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-06 14:41:57 +01:00
Prowler Bot
65c081ce38 fix(credentials): adjust helper links to fit width (#7134)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-03-06 13:01:10 +01:00
Pepe Fagoaga
5600131d6a revert(findings): change uid from varchar to text (#7132) 2025-03-06 11:43:07 +01:00
César Arroba
dbd271980f chore(ui-gha): add version prefix (#7125) 2025-03-05 16:34:42 +01:00
Víctor Fernández Poyatos
ff532a899e fix(reports): Fix task kwargs and result (#7124) 2025-03-05 16:34:36 +01:00
César Arroba
0c9675ec70 chore(ui): add prowler version on build (#7120) 2025-03-05 16:34:26 +01:00
Pablo Lara
d45eda2b2b feat(compliance): new compliance selector (#7118) 2025-03-05 16:34:07 +01:00
dependabot[bot]
0abcf80d19 chore(deps-dev): bump pytest from 8.3.4 to 8.3.5 (#7097)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-05 16:33:35 +01:00
Pablo Lara
c0e10fd395 chore(ui): update label from 'Select a scan job' to 'Select a cloud p… (#7107) 2025-03-05 16:32:20 +01:00
Pepe Fagoaga
a80e9b26a8 chore(api): Use Prowler from v5.4 (#7092) 2025-03-03 22:30:58 +05:45
Sergio Garcia
2a9cd57fb8 fix(deps): update vulnerable cryptography dependency (#6993) 2025-03-03 17:38:23 +01:00
Prowler Bot
a0ad1a5f49 chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7003)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-03-03 17:21:15 +01:00
César Arroba
dff22dd166 Revert "chore(api): update prowler version to 5.4 (#7091)"
This reverts commit 58138810b9.
2025-03-03 17:11:20 +01:00
César Arroba
58138810b9 chore(api): update prowler version to 5.4 (#7091)
Co-authored-by: Prowler Bot <bot@prowler.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-03-03 17:08:45 +01:00
Víctor Fernández Poyatos
1ecc272fe4 chore: update changelog 2025-03-03 16:27:05 +01:00
Pablo Lara
b784167006 fix(roles): show the correct error message (#7089) 2025-03-03 15:50:28 +01:00
Pablo Lara
cf0ec8dea0 fix: bug with create role and unlimited visibility checkbox (#7088) 2025-03-03 15:50:23 +01:00
Sergio Garcia
96cae5e961 feat(aws): add fixers for threat detection checks (#7085) 2025-03-03 15:50:18 +01:00
Pablo Lara
a48e5cb15f feat(version): add prowler version to the sidebar (#7086) 2025-03-03 15:50:14 +01:00
Pablo Lara
5a9ff007e0 chore: Update the latest table findings with the most recent changes (#7084) 2025-03-03 15:50:06 +01:00
Prowler Bot
24c45f894c chore(regions_update): Changes in regions for AWS services (#7034)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 15:47:09 +01:00
dependabot[bot]
5d03c85629 chore(deps): bump cryptography from 43.0.1 to 44.0.1 in /api (#7001)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 15:46:15 +01:00
Pablo Lara
41dc397a7a feat(sidebar): sidebar with new functionalities (#7018) 2025-03-03 15:01:38 +01:00
Prowler Bot
237a9adce9 chore(regions_update): Changes in regions for AWS services (#7067)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 15:01:29 +01:00
Sergio Garcia
a06167f1c2 fix(threat detection): run single threat detection check (#7065) 2025-03-03 15:01:21 +01:00
Pepe Fagoaga
a7d58c40dd refactor(stats): Use Finding instead of Check_Report (#7053)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-03-03 15:01:12 +01:00
Pepe Fagoaga
e260c46389 chore(examples): Scan AWS (#7064) 2025-03-03 15:01:02 +01:00
Sergio Garcia
115169a596 chore(gcp): enhance GCP APIs logic (#7046) 2025-03-03 15:00:49 +01:00
dependabot[bot]
5b19173c1d chore(deps): bump trufflesecurity/trufflehog from 3.88.13 to 3.88.14 (#7063)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 15:00:40 +01:00
Daniel Barranquero
d3dd1644e6 feat(m365): add sharepoint service with 4 checks (#7057)
Co-authored-by: MarioRgzLpz <mariorgzlpz1809@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 15:00:29 +01:00
Pedro Martín
8ff0c59964 feat(docs): add info related with sts assume role and regions (#7062) 2025-03-03 15:00:19 +01:00
Daniel Barranquero
285939c389 fix(azure): handle account not supporting Blob (#7060) 2025-03-03 15:00:04 +01:00
Sergio Garcia
a62ae8af51 fix(ecs): ensure unique finding id in ECS checks (#7059) 2025-03-03 14:59:56 +01:00
Prowler Bot
5d78b9e439 chore(regions_update): Changes in regions for AWS services (#7056)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:59:45 +01:00
Hugo Pereira Brito
1056c270ca feat(microsoft365): add new check entra_policy_ensure_default_user_cannot_create_tenants (#6918)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:59:35 +01:00
Pablo Lara
eeef6600b7 feat(exports): download scan exports (#7006) 2025-03-03 14:59:14 +01:00
Pepe Fagoaga
e142f17abe fix(env): UI version must be stable (#7055) 2025-03-03 14:58:45 +01:00
Víctor Fernández Poyatos
a65d858dac fix(migrations): Fix migration dependency order (#7051) 2025-03-03 14:58:32 +01:00
Víctor Fernández Poyatos
6235a1ba41 feat(labeler): apply label on migration changes (#7052) 2025-03-03 14:58:22 +01:00
Pepe Fagoaga
05007d03ee fix(findings): change uid from varchar to text (#7048) 2025-03-03 14:58:13 +01:00
Víctor Fernández Poyatos
102d099947 feat(findings): Add Django management command to populate database with dummy data (#7049) 2025-03-03 14:58:03 +01:00
Adrián Jesús Peña Rodríguez
3194675a5c feat(export): add API export system (#6878) 2025-03-03 14:57:40 +01:00
dependabot[bot]
14e6e4aa68 chore(deps-dev): bump black from 24.10.0 to 25.1.0 (#6733)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:57:22 +01:00
Pedro Martín
b24c3665b5 feat(aws): add ISO 27001 2022 compliance framework (#7035)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-03-03 14:56:31 +01:00
Prowler Bot
1f60878867 chore(regions_update): Changes in regions for AWS services (#7015)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:54:16 +01:00
dependabot[bot]
2dd18662d8 chore(deps): bump google-api-python-client from 2.161.0 to 2.162.0 (#7037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:54:02 +01:00
Hugo Pereira Brito
175360dbe6 refactor(microsoft365): CheckReportMicrosoft365 and resource metadata (#6952) 2025-03-03 14:53:42 +01:00
Víctor Fernández Poyatos
80e24b971f feat(findings): Optimize findings endpoint (#7019) 2025-03-03 14:53:22 +01:00
Pepe Fagoaga
78877c470a chore(action): Conventional Commit Check (#7033) 2025-03-03 14:53:08 +01:00
dependabot[bot]
9c9b100359 chore(deps): bump trufflesecurity/trufflehog from 3.88.12 to 3.88.13 (#7026)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:52:51 +01:00
Pedro Martín
10f3232294 feat(outputs): add sample outputs (#6945) 2025-03-03 14:52:37 +01:00
Pedro Martín
a2e5f70f36 fix(cis): show report table on the CLI (#6979) 2025-03-03 14:52:23 +01:00
Pedro Martín
8d8b31c757 feat(azure): add PCI DSS 4.0 (#6982) 2025-03-03 14:52:03 +01:00
Pedro Martín
cba1e718b9 feat(kubernetes): add PCI DSS 4.0 (#7013) 2025-03-03 14:51:42 +01:00
Pedro Martín
6c3c37fc26 feat(dashboard): take the latest finding uid by timestamp (#6987) 2025-03-03 14:51:29 +01:00
Víctor Fernández Poyatos
b610cacd0c feat(tasks): add deletion queue for deletion tasks (#7022) 2025-03-03 14:51:14 +01:00
Pedro Martín
027a5705cb feat(gcp): add PCI DSS 4.0 (#7010) 2025-03-03 14:51:00 +01:00
Prowler Bot
b7fbfb4360 chore(regions_update): Changes in regions for AWS services (#7011)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:50:41 +01:00
dependabot[bot]
5acf0a7e3d chore(deps-dev): bump mkdocs-material from 9.6.4 to 9.6.5 (#7007)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:50:26 +01:00
Raj Chowdhury
3a25e86e30 fix(typo): solve typo in dashboard.md (#7009) 2025-03-03 14:50:06 +01:00
dependabot[bot]
50f1592eb3 chore(deps): bump trufflesecurity/trufflehog from 3.88.11 to 3.88.12 (#7008)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:49:46 +01:00
César Arroba
0f2927cb88 feat(api): setup sentry for OSS API (#6874) 2025-03-03 14:49:32 +01:00
Pablo Lara
b4e1434052 chore(users): renaming the account now triggers a re-render in the sidebar (#7005) 2025-03-03 14:49:16 +01:00
dependabot[bot]
43710783f9 chore(deps): bump python from 3.12.8-alpine3.20 to 3.12.9-alpine3.20 (#6882)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:49:04 +01:00
dependabot[bot]
16f767e7b9 chore(deps): bump tzlocal from 5.2 to 5.3 (#6932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:48:41 +01:00
Hugo Pereira Brito
42818217a0 docs(tutorials): update all deprecated poetry shell references (#7002) 2025-03-03 14:44:50 +01:00
Prowler Bot
13405594b2 chore(regions_update): Changes in regions for AWS services (#6998)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:44:33 +01:00
Pedro Martín
b5a3852334 chore(github): add compliance to PR labeler (#6996) 2025-03-03 14:44:16 +01:00
Hugo Pereira Brito
181ff1acb3 docs(installation): add warning for poetry shell deprecation in README (#6983) 2025-03-03 14:43:35 +01:00
Pablo Lara
91e59a3279 chore(findings): add 'Status Extended' attribute to finding details (#6997) 2025-03-03 14:43:20 +01:00
Pedro Martín
b3fad1a765 feat(aws): add PCI DSS 4.0 (#6949) 2025-03-03 14:42:41 +01:00
dependabot[bot]
6f90927a79 chore(deps): bump trufflesecurity/trufflehog from 3.88.9 to 3.88.11 (#6988)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:42:24 +01:00
dependabot[bot]
0bb4a9a3e9 chore(deps): bump kubernetes from 32.0.0 to 32.0.1 (#6992)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:42:10 +01:00
Pablo Lara
80d9cde60b feat(scans): update the progress for executing scans (#6972) 2025-03-03 14:41:39 +01:00
César Arroba
11196c2f83 chore(gha): trigger API or UI deployment when push to master (#6946) 2025-03-03 14:41:21 +01:00
Prowler Bot
55a0d0a1b5 chore(regions_update): Changes in regions for AWS services (#6978) 2025-03-03 14:41:06 +01:00
Pedro Martín
4e5e1d7bd4 feat(aws): add compliance CIS 4.0 (#6937) 2025-03-03 14:40:17 +01:00
dependabot[bot]
06a0a434ab chore(deps-dev): bump flake8 from 7.1.1 to 7.1.2 (#6954)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:39:58 +01:00
Pepe Fagoaga
153833fc55 fix(ocsf): Adapt for 1.4.0 (#6971) 2025-03-03 14:38:57 +01:00
Prowler Bot
fc4877975f chore(regions_update): Changes in regions for AWS services (#6968)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:38:21 +01:00
dependabot[bot]
0797efd4fd chore(deps-dev): bump bandit from 1.8.2 to 1.8.3 (#6955)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:38:07 +01:00
Prowler Bot
fbec99a0b7 chore(regions_update): Changes in regions for AWS services (#6944)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:37:28 +01:00
dependabot[bot]
b2cb1de95e chore(deps): bump trufflesecurity/trufflehog from 3.88.8 to 3.88.9 (#6943)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:37:06 +01:00
dependabot[bot]
190c2316d7 chore(deps): bump google-api-python-client from 2.160.0 to 2.161.0 (#6933)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:36:16 +01:00
César Arroba
f6c352281a fix(gha): fix short sha step (#6939) 2025-03-03 14:36:00 +01:00
César Arroba
66dfe89936 chore(gha): add tag for api and ui images on push to master (#6920) 2025-03-03 14:35:41 +01:00
dependabot[bot]
8b3942ca49 chore(deps): bump trufflesecurity/trufflehog from 3.88.7 to 3.88.8 (#6931)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:35:06 +01:00
dependabot[bot]
9d35213bd5 chore(deps-dev): bump mkdocs-material from 9.6.3 to 9.6.4 (#6913)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:34:13 +01:00
Víctor Fernández Poyatos
3e586e615d feat(social-login): Add social login integration for Google and Github OAuth providers (#6906) 2025-03-03 14:33:12 +01:00
Sergio Garcia
a4f950e093 chore(docs): external K8s cluster Prowler App credentials (#6921) 2025-03-03 14:32:30 +01:00
Pedro Martín
7c2441f6ff fix(gcp): remove typos on CIS 3.0 (#6917) 2025-03-03 14:31:48 +01:00
dependabot[bot]
0a92af3eb2 chore(deps): bump trufflesecurity/trufflehog from 3.88.6 to 3.88.7 (#6915)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:31:25 +01:00
César Arroba
666f3a0e20 fix(gha): fix test build containers on pull requests actions (#6909) 2025-03-03 14:30:35 +01:00
dependabot[bot]
06ef98b5cc chore(deps-dev): bump coverage from 7.6.11 to 7.6.12 (#6897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:29:39 +01:00
Prowler Bot
79125bdd40 chore(regions_update): Changes in regions for AWS services (#6900)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-03-03 14:28:42 +01:00
César Arroba
e8e8b085ac chore(api): test build container image on pull request (#6850) 2025-03-03 14:28:24 +01:00
César Arroba
c9b81d003a chore(ui): test build container image on pull request (#6849) 2025-03-03 14:27:33 +01:00
Pepe Fagoaga
23fa3c1e38 chore(version): Update version to 5.4.0 (#6894) 2025-03-03 14:26:51 +01:00
dependabot[bot]
03fbd0baca chore(deps-dev): bump coverage from 7.6.10 to 7.6.11 (#6887)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 14:24:50 +01:00
dependabot[bot]
d4fe24ef47 chore(deps): bump trufflesecurity/trufflehog from 3.88.5 to 3.88.6 (#6888)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-03 13:05:51 +01:00
Prowler Bot
9c5220ee98 fix(elasticache): improve logic in elasticache_redis_cluster_backup_enabled (#7045)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-26 12:12:44 +01:00
Prowler Bot
6491bce5a6 fix(azure): migrate resource models to avoid using SDK defaults (#7043)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-02-26 10:49:33 +01:00
Prowler Bot
ca375dd79c chore(iam): enhance iam_role_cross_service_confused_deputy_prevention recommendation (#7041)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-26 08:35:26 +01:00
Prowler Bot
e807573b54 fix(soc2_aws): remove duplicated checks (#6999)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 17:26:50 +05:30
Prowler Bot
c0f4c9743f fix(deps): update vulnerable cryptography dependency (#6994)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 16:04:14 +05:30
Prowler Bot
5974d0b5da fix(report): remove invalid resources in report (#6984)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-20 11:05:58 +05:30
Prowler Bot
6244a8a5f7 fix(roles): handle empty response in deleteRole and ensure revalidation (#6977)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-19 09:06:56 +01:00
Prowler Bot
5b9dae4529 test(cloudfront): add name retrieval test for cloudfront bucket domains (#6975)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-19 09:12:17 +05:30
Prowler Bot
a424374c44 fix(cloudfront): Incorrect bucket name retrieval (#6967)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-19 08:08:31 +05:30
Prowler Bot
b7fc2542e8 fix(gcp): Correct false positive when sslMode=ENCRYPTED_ONLY in CloudSQL (#6942)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-02-14 16:36:26 -05:00
Prowler Bot
83a1598a1e fix(issue pages): apply sorting by default in issue pages (#6935)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-14 10:35:49 +01:00
Prowler Bot
b22b56a06b fix(gcp): handle DNS Managed Zone with no DNSSEC (#6928)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-13 14:50:05 -05:00
Prowler Bot
5020e4713c fix(aws): codebuild service threw KeyError for projects type CODEPIPELINE (#6930)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-13 13:53:05 -05:00
Prowler Bot
ee534a740e fix(aws): SNS threw IndexError if SubscriptionArn is PendingConfirmation (#6923)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-13 10:35:19 -05:00
Prowler Bot
48cb45b7a8 fix(aws): handle AccessDenied when retrieving resource policy (#6912)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-12 19:15:51 -05:00
Prowler Bot
91b74822e9 fix(kms): Amazon KMS API call error handling (#6904)
Co-authored-by: Ogonna Iwunze <1915636+wunzeco@users.noreply.github.com>
2025-02-12 11:08:47 -05:00
Pepe Fagoaga
287eef5085 chore(version): Update version to 5.3.1 (#6895) 2025-02-12 16:39:09 +05:45
Mario Rodriguez Lopez
45d359c84a feat(microsoft365): Add documentation and compliance file (#6195)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-10 17:18:43 +01:00
Pablo Lara
6049e5d4e8 chore: Update prowler api version (#6877)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 17:39:08 +05:45
Víctor Fernández Poyatos
dfd377f89e chore(api): Update changelog and specs (#6876) 2025-02-10 12:25:04 +01:00
Víctor Fernández Poyatos
37e6c52c14 chore: Add needed steps for API in PR template (#6875) 2025-02-10 12:24:51 +01:00
Pepe Fagoaga
d6a7f4d88f fix(kubernetes): Change UID validation (#6869)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-10 11:24:34 +01:00
Pepe Fagoaga
239cda0a90 chore: Rename dashboard table latest findings (#6873)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-10 11:24:27 +01:00
dependabot[bot]
4a821e425b chore(deps-dev): bump mkdocs-material from 9.6.2 to 9.6.3 (#6871)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:24:21 +01:00
Sergio Garcia
e1a2f0c204 docs(eks): add documentation about EKS onboarding (#6853)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:16 +01:00
Prowler Bot
c70860c733 chore(regions_update): Changes in regions for AWS services (#6858)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:24:09 +01:00
Víctor Fernández Poyatos
05e71e033f feat(findings): Use ArrayAgg and subqueries on metadata endpoint (#6863)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-02-10 11:24:03 +01:00
Pablo Lara
5164ec2eb9 feat: implement new functionality with inserted_at__gte in findings a… (#6864) 2025-02-10 11:23:58 +01:00
Víctor Fernández Poyatos
be18dac4f9 docs: Add details about user creation in Prowler app (#6862) 2025-02-10 11:23:52 +01:00
dependabot[bot]
bb126c242f chore(deps): bump microsoft-kiota-abstractions from 1.9.1 to 1.9.2 (#6856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:47 +01:00
Víctor Fernández Poyatos
e27780a856 feat(findings): Require date filters for findings endpoints (#6800) 2025-02-10 11:23:41 +01:00
Pranay Girase
196ec51751 fix(typo): typos in Dashboard and Report in HTML (#6847) 2025-02-10 11:23:33 +01:00
Prowler Bot
86abf9e64c chore(regions_update): Changes in regions for AWS services (#6848)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:23:28 +01:00
dependabot[bot]
9d8be578e3 chore(deps): bump trufflesecurity/trufflehog from 3.88.4 to 3.88.5 (#6844)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:23:19 +01:00
Mario Rodriguez Lopez
b3aa800082 feat(entra): add new check entra_thirdparty_integrated_apps_not_allowed (#6357)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:13 +01:00
Ogonna Iwunze
501674a778 feat(kms): add kms_cmk_not_multi_region AWS check (#6794)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:23:01 +01:00
Prowler Bot
5ff6ae79d8 chore(regions_update): Changes in regions for AWS services (#6821)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-02-10 11:21:08 +01:00
Mario Rodriguez Lopez
e518a869ab feat(entra): add new entra service for Microsoft365 (#6326)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:58 +01:00
Mario Rodriguez Lopez
43927a62f3 feat(microsoft365): add new check admincenter_settings_password_never_expire (#6023)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:20:52 +01:00
dependabot[bot]
335980c8d8 chore(deps): bump kubernetes from 31.0.0 to 32.0.0 (#6678)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:20:42 +01:00
Pablo Lara
ca3ee378db style(forms): improve spacing consistency (#6814) 2025-02-10 11:17:11 +01:00
Pablo Lara
c05bc1068a chore(forms): improvements to the sign-in and sign-up forms (#6813) 2025-02-10 11:17:03 +01:00
Drew Kerrigan
2e3164636d docs(): add description of changed and new delta values to prowler app tutorial (#6801) 2025-02-10 11:16:56 +01:00
dependabot[bot]
c34e07fc40 chore(deps): bump pytz from 2024.2 to 2025.1 (#6765)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:50 +01:00
dependabot[bot]
6022122a61 chore(deps-dev): bump mkdocs-material from 9.5.50 to 9.6.2 (#6799)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:44 +01:00
dependabot[bot]
f65f5e4b46 chore(deps-dev): bump pylint from 3.3.3 to 3.3.4 (#6721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:37 +01:00
Pablo Lara
dee17733a0 feat(scans): show scan details right after launch (#6791) 2025-02-10 11:16:30 +01:00
dependabot[bot]
cddda1e64e chore(deps): bump trufflesecurity/trufflehog from 3.88.2 to 3.88.4 (#6760)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:23 +01:00
dependabot[bot]
f7b873db03 chore(deps): bump google-api-python-client from 2.159.0 to 2.160.0 (#6720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:16:17 +01:00
Víctor Fernández Poyatos
792bc70d0a feat(schedules): Rework daily schedule to always show the next scan (#6700) 2025-02-10 11:16:11 +01:00
Hugo Pereira Brito
185491b061 fix: microsoft365 mutelist (#6724) 2025-02-10 11:16:05 +01:00
dependabot[bot]
3af8a43480 chore(deps): bump microsoft-kiota-abstractions from 1.6.8 to 1.9.1 (#6734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:57 +01:00
Pablo Lara
fd78406b29 fix: Enable hot reloading when using Docker Compose for UI (#6750) 2025-02-10 11:15:49 +01:00
Matt Johnson
4758b258a3 chore: Update Google Analytics ID across all docs.prowler.com sites. (#6730) 2025-02-10 11:15:41 +01:00
dependabot[bot]
015e2b3b88 chore(deps): bump uuid from 10.0.0 to 11.0.5 in /ui (#6516)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:33 +01:00
Mario Rodriguez Lopez
e184c9cb61 feat(m365): add Microsoft 365 provider (#5902)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:15:24 +01:00
dependabot[bot]
9004a01183 chore(deps): bump azure-mgmt-web from 7.3.1 to 8.0.0 (#6680)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:18 +01:00
dependabot[bot]
dd65ba3d9e chore(deps): bump msgraph-sdk from 1.17.0 to 1.18.0 (#6679)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:15:08 +01:00
dependabot[bot]
bba616a18f chore(deps): bump azure-storage-blob from 12.24.0 to 12.24.1 (#6666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-10 11:14:33 +01:00
Pepe Fagoaga
aa0f8d2981 chore: bump for next minor (#6672) 2025-02-10 11:13:42 +01:00
Paolo Frigo
2511d6ffa9 docs: update # of checks, services, frameworks and categories (#6528)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-10 11:12:02 +01:00
Prowler Bot
27329457be fix(dashboard): adjust the bar chart display (#6868)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-07 11:04:23 -05:00
Prowler Bot
7189f3d526 fix(aws): key error for detect-secrets (#6865)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-07 10:04:54 -05:00
Prowler Bot
58e7589c9d fix(kms): handle error in DescribeKey function (#6842)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-02-05 15:27:43 -05:00
Prowler Bot
d60f4b5ded fix(cloudfront): fix false positive in s3 origins (#6838)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 13:36:49 -05:00
Prowler Bot
4c2ec094f6 fix(findings): Spelling mistakes correction (#6834)
Co-authored-by: Gary Mclean <gary.mclean@krrv.io>
2025-02-05 11:53:17 -05:00
Prowler Bot
395ecaff5b fix(directoryservice): handle ClientException (#6828)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-02-05 11:20:13 -05:00
Prowler Bot
c39506ef7d fix(aws): wording of report.status_extended in awslambda_function_not_publicly_accessible (#6831)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-02-05 11:18:27 -05:00
Prowler Bot
eb90d479e2 chore(aws_audit_manager_control_tower_guardrails): add checks to reqs (#6803)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 19:47:04 -05:00
Prowler Bot
b92a73f5ea fix(elasticache): InvalidReplicationGroupStateFault error (#6820)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-02-04 16:08:26 -05:00
Prowler Bot
ad121f3059 chore(deps-dev): bump moto from 5.0.27 to 5.0.28 (#6817)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-02-04 14:25:04 -05:00
Pepe Fagoaga
70e4c5a44e chore: bump for next patch (#6764) 2025-02-03 15:25:23 -05:00
Prowler Bot
b5a46b7b59 fix(cis_1.5_aws): add checks to needed reqs (#6798)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:53 -05:00
Prowler Bot
f1a97cd166 fix(cis_1.4_aws): add checks to needed reqs (#6796)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:39 -05:00
Prowler Bot
0774508093 fix(cis_2.0_aws): add checks to needed reqs (#6787)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-02-03 11:37:24 -05:00
Prowler Bot
0664ce6b94 fix(gcp): fix wrong provider value in check (#6789)
Co-authored-by: secretcod3r <101349794+secretcod3r@users.noreply.github.com>
2025-02-03 10:32:53 -05:00
Prowler Bot
407c779c52 fix(findings): remove default status filtering (#6785)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 15:25:11 +01:00
Prowler Bot
c60f13f23f fix(findings): order findings by inserted_at DESC (#6783)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-02-03 11:57:45 +01:00
Prowler Bot
37d912ef01 fix(celery): Kill celery worker process after every task to release memory (#6763)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-31 19:37:34 +05:45
Pepe Fagoaga
d3de89c017 chore: bump for next patch (#6762) 2025-01-31 19:34:54 +05:45
Prowler Bot
cb22af25c6 fix(db_event): Handle other events (#6757)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 21:48:42 +05:45
Prowler Bot
a534b94495 fix(set_report_color): Add more details to error (#6755)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 21:46:34 +05:45
Prowler Bot
6262b4ff0b feat(scans): Optimize read queries during scans (#6756)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-30 20:56:11 +05:45
Prowler Bot
84ecd7ab2c feat(findings): Improve /findings/metadata performance (#6749)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-30 18:39:25 +05:45
Prowler Bot
1a5428445a fix(neptune): correct service name (#6747)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:52:00 +05:45
Prowler Bot
ac8e991ca0 fix(aws): iam_user_with_temporary_credentials resource in OCSF (#6741)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-30 17:17:20 +05:45
Prowler Bot
83a0331472 fix(acm): Key Error DomainName (#6744)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:17:05 +05:45
Prowler Bot
cce31e2971 fix(finding): raise when generating invalid findings (#6745)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 17:16:50 +05:45
Prowler Bot
0adf7d6e77 fix(sns): Add region to subscriptions (#6740)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-30 15:46:55 +05:45
Pepe Fagoaga
295f8b557e chore: bump for next patch (#6727) 2025-01-29 20:00:47 +05:45
Prowler Bot
bb2c5c3161 fix(scans): change label for next scan (#6726)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-29 10:49:35 +01:00
Prowler Bot
0018f36a36 fix(migrations): Use indexes instead of constraints to define an index (#6723)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-29 14:29:03 +05:45
Prowler Bot
857de84f49 fix(defender): add field to SecurityContacts (#6715)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-29 13:53:14 +05:45
Prowler Bot
9630f2242a revert: Update Django DB manager to use psycopg3 and connection pooling (#6719)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-28 22:33:32 +05:45
Prowler Bot
1fe125867c fix(scan-summaries): Improve efficiency on providers overview (#6718)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-28 22:15:22 +05:45
Prowler Bot
0737893240 fix(scans): filters and sorting for scan table (#6714)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-28 13:29:39 +01:00
Prowler Bot
282fe3d348 fix(scans, findings): Improve API performance ordering by inserted_at instead of id (#6712)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-28 17:28:26 +05:45
Prowler Bot
b5d83640ae fix: fixed bug when opening finding details while a scan is in progress (#6709)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-28 07:01:43 +01:00
Prowler Bot
2823d3ad21 fix(cloudsql): add trusted client certificates case for cloudsql_instance_ssl_connections (#6687)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-24 12:19:05 -05:00
Prowler Bot
00b93bfe86 fix(cloudwatch): NoneType object is not iterable (#6677)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-01-23 13:25:02 -05:00
Pepe Fagoaga
84c253d887 chore: bump for next patch (#6673) 2025-01-23 13:13:05 -05:00
Pepe Fagoaga
2ab5a702c9 chore(api): Bump to v1.3.0 (#6670) 2025-01-23 16:40:59 +01:00
Sergio Garcia
11d9cdf24e feat(resource metadata): add resource metadata to JSON OCSF (#6592)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-01-23 16:08:13 +01:00
Rubén De la Torre Vico
b1de41619b fix: add missing Check_Report_Azure parameters (#6583) 2025-01-23 16:07:23 +01:00
Rubén De la Torre Vico
18a4881a51 feat(network): extract Network resource metadata automated (#6555)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:44 +01:00
Rubén De la Torre Vico
de76a168c0 feat(storage): extract Storage resource metadata automated (#6563)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:40 +01:00
Rubén De la Torre Vico
bcc13a742d feat(vm): extract VM resource metadata automated (#6564) 2025-01-23 16:06:32 +01:00
Rubén De la Torre Vico
23b584f4bf feat(sqlserver): extract SQL Server resource metadata automated (#6562) 2025-01-23 16:06:25 +01:00
Daniel Barranquero
074b7c1ff5 feat(aws): include resource metadata to remaining checks (#6551)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:06:18 +01:00
Rubén De la Torre Vico
c04e9ed914 feat(postgresql): extract PostgreSQL resource metadata automated (#6560) 2025-01-23 16:06:11 +01:00
Rubén De la Torre Vico
1f1b126e79 feat(policy): extract Policy resource metadata automated (#6558) 2025-01-23 16:06:06 +01:00
Rubén De la Torre Vico
9b0dd80f13 feat(entra): extract Entra resource metadata automated (#6542)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:06:01 +01:00
Rubén De la Torre Vico
b0807478f2 feat(monitor): extract monitor resource metadata automated (#6554)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:05:54 +01:00
Rubén De la Torre Vico
11dfa58ecd feat(mysql): extract MySQL resource metadata automated (#6556) 2025-01-23 16:05:48 +01:00
Rubén De la Torre Vico
412f396e0a feat(keyvault): extract KeyVault resource metadata automated (#6553) 2025-01-23 16:05:44 +01:00
Rubén De la Torre Vico
8100d43ff2 feat(iam): extract IAM resource metadata automated (#6552) 2025-01-23 16:05:38 +01:00
Sergio Garcia
bc96acef48 fix(gcp): iterate through service projects (#6549)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-01-23 16:05:24 +01:00
Hugo Pereira Brito
6e9876b61a feat(aws): include resource metadata in services from r* to s* (#6536)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:05:17 +01:00
Pedro Martín
32253ca4f7 feat(gcp): add resource metadata to report (#6500)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 16:05:08 +01:00
Hugo Pereira Brito
4c6be5e283 feat(aws): include resource metadata in services from a* to b* (#6504)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:05:03 +01:00
Daniel Barranquero
69178fd7bd chore(aws): add resource metadata to services from t to w (#6546) 2025-01-23 16:04:49 +01:00
Daniel Barranquero
cd4432c14b chore(aws): add resource metadata to services from f to o (#6545) 2025-01-23 16:04:40 +01:00
Rubén De la Torre Vico
0e47172aec feat(defender): extract Defender resource metadata in automated way (#6538) 2025-01-23 16:04:30 +01:00
Rubén De la Torre Vico
efe18ae8b2 feat(appinsights): extract App Insights resource metadata in automated way (#6540) 2025-01-23 16:04:18 +01:00
Hugo Pereira Brito
d5e13df4fe feat: add resource metadata to emr_cluster_account_public_block_enabled (#6539) 2025-01-23 16:04:13 +01:00
Sergio Garcia
b0e84d74f2 feat(kubernetes): add resource metadata to report (#6479) 2025-01-23 16:04:05 +01:00
Hugo Pereira Brito
28526b591c feat(aws): include resource metadata in services from d* to e* (#6532)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-01-23 16:03:59 +01:00
Daniel Barranquero
51e6a6edb1 feat(aws): add resource metadata to all services starting with c (#6493) 2025-01-23 16:03:50 +01:00
Rubén De la Torre Vico
c1b132a29e feat(cosmosdb): extract CosmosDB resource metadata in automated way (#6533) 2025-01-23 16:03:43 +01:00
Rubén De la Torre Vico
e2530e7d57 feat(containerregistry): extract Container Registry resource metadata in automated way (#6530) 2025-01-23 16:03:37 +01:00
Rubén De la Torre Vico
f2fcb1599b feat(azure-app): extract Web App resource metadata in automated way (#6529) 2025-01-23 16:03:32 +01:00
Rubén De la Torre Vico
160eafa0c9 feat(aks): use Check_Report_Azure constructor properly in AKS checks (#6509) 2025-01-23 16:03:17 +01:00
Rubén De la Torre Vico
c2bc0f1368 feat(aisearch): use Check_Report_Azure constructor properly in AISearch checks (#6506) 2025-01-23 16:03:10 +01:00
Rubén De la Torre Vico
27d27fff81 feat(azure): include resource metadata in Check_Report_Azure (#6505) 2025-01-23 16:02:00 +01:00
Pepe Fagoaga
66e8f0ce18 chore(scan): Remove ._findings (#6667) 2025-01-23 15:58:37 +01:00
Pedro Martín
0ceae8361b feat(kubernetes): add CIS 1.10 compliance (#6508)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
# Conflicts:
#	prowler/compliance/kubernetes/cis_1.10_kubernetes.json
2025-01-23 15:39:40 +01:00
Pedro Martín
5e52fded83 feat(azure): add CIS 3.0 for Azure (#5226)
# Conflicts:
#	prowler/compliance/azure/cis_3.0_azure.json
2025-01-23 15:35:35 +01:00
Pablo Lara
0a7d07b4b6 chore: adjust DateWithTime component height when used with InfoField (#6669) 2025-01-23 15:21:42 +01:00
Pablo Lara
34b5f483d7 chore(scans): improve scan details (#6665) 2025-01-23 14:16:35 +01:00
Pedro Martín
9d04a1bc52 feat(detect-secrets): get secrets plugins from config.yaml (#6544)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-01-23 14:16:15 +01:00
dependabot[bot]
b25c0aaa00 chore(deps): bump azure-mgmt-containerservice from 33.0.0 to 34.0.0 (#6630)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:16:03 +01:00
dependabot[bot]
652b93ea45 chore(deps): bump azure-mgmt-compute from 33.1.0 to 34.0.0 (#6628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:15:54 +01:00
Pepe Fagoaga
ccb7726511 fix(templates): Customize principals and add validation (#6655) 2025-01-23 14:15:41 +01:00
Anton Rubets
c514a4e451 chore(helm): Add prowler helm support (#6580)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:15:33 +01:00
Prowler Bot
d841bd6890 chore(regions_update): Changes in regions for AWS services (#6652)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:15:24 +01:00
dependabot[bot]
ff54e10ab2 chore(deps): bump boto3 from 1.35.94 to 1.35.99 (#6651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:15:03 +01:00
Pepe Fagoaga
718e562741 chore(pre-commit): poetry checks for API and SDK (#6658) 2025-01-23 14:14:10 +01:00
Pablo Lara
ce05e2a939 feat(providers): show the cloud formation and terraform template links on the form (#6660) 2025-01-23 14:13:37 +01:00
Pablo Lara
90831d3084 feat(providers): make external id field mandatory in the aws role secret form (#6656) 2025-01-23 14:13:30 +01:00
dependabot[bot]
2c928dead5 chore(deps-dev): bump moto from 5.0.16 to 5.0.27 (#6632)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:13:23 +01:00
dependabot[bot]
3ffe147664 chore(deps): bump botocore from 1.35.94 to 1.35.99 (#6520)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 14:13:04 +01:00
dependabot[bot]
3373b0f47c chore(deps-dev): bump mkdocs-material from 9.5.49 to 9.5.50 (#6631)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:12:52 +01:00
Prowler Bot
b29db04560 chore(regions_update): Changes in regions for AWS services (#6599)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:12:38 +01:00
Hugo Pereira Brito
f964a0362e chore(services): delete all comment headers (#6585) 2025-01-23 14:11:58 +01:00
Prowler Bot
537b23dfae chore(regions_update): Changes in regions for AWS services (#6577)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:11:46 +01:00
Pablo Lara
48cf3528b4 feat(findings): add first seen in findings details (#6575) 2025-01-23 14:11:08 +01:00
dependabot[bot]
3c4b9d32c9 chore(deps): bump msgraph-sdk from 1.16.0 to 1.17.0 (#6547)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:09:46 +01:00
Víctor Fernández Poyatos
fd8361ae2c feat(db): Update Django DB manager to use psycopg3 and connection pooling (#6541)
# Conflicts:
#	api/poetry.lock
#	api/pyproject.toml
2025-01-23 14:07:34 +01:00
Prowler Bot
33cadaa932 chore(regions_update): Changes in regions for AWS services (#6526)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
2025-01-23 14:02:14 +01:00
dependabot[bot]
cc126eb8b4 chore(deps): bump google-api-python-client from 2.158.0 to 2.159.0 (#6521)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 14:02:07 +01:00
Pedro Martín
c996b3f6aa docs(readme): update pr template to add check for readme (#6531) 2025-01-23 14:01:53 +01:00
Adrián Jesús Peña Rodríguez
e2b406a300 feat(finding): add first_seen attribute (#6460) 2025-01-23 14:01:46 +01:00
dependabot[bot]
c2c69da603 chore(deps): bump django from 5.1.4 to 5.1.5 in /api (#6519)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
# Conflicts:
#	api/poetry.lock
2025-01-23 14:01:33 +01:00
Adrián Jesús Peña Rodríguez
4e51348ff2 feat(provider-secret): make existing external_id field mandatory (#6510) 2025-01-23 14:00:42 +01:00
dependabot[bot]
3257b82706 chore(deps-dev): bump eslint-config-prettier from 9.1.0 to 10.0.1 in /ui (#6518)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:59:26 +01:00
Pepe Fagoaga
c98d764daa chore(version): set next minor (#6511)
# Conflicts:
#	prowler/config/config.py
#	pyproject.toml
2025-01-23 13:57:20 +01:00
Prowler Bot
0448429a9f chore(regions_update): Changes in regions for AWS services (#6495)
Co-authored-by: MrCloudSec <38561120+MrCloudSec@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-01-23 13:56:43 +01:00
Pepe Fagoaga
b948ac6125 feat(prowler-role): Add templates to deploy it in AWS (#6499) 2025-01-23 13:56:28 +01:00
dependabot[bot]
3acc09ea16 chore(deps): bump jinja2 from 3.1.4 to 3.1.5 (#6457)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:56:09 +01:00
dependabot[bot]
82d0d0de9d chore(deps-dev): bump bandit from 1.8.0 to 1.8.2 (#6485)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-23 13:56:00 +01:00
Prowler Bot
f9cfc4d087 fix: add detector and line number of potential secret (#6663)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-22 10:55:49 -05:00
Pepe Fagoaga
7d68ff455b chore(api): Use prowler from v5.1 (#6659) 2025-01-22 20:04:55 +05:45
Pepe Fagoaga
ddf4881971 chore(version): Update version to 5.1.6 (#6645) 2025-01-21 13:28:44 -05:00
Prowler Bot
9ad4944142 fix(filters): fix dynamic filters (#6643)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-21 13:40:18 +01:00
Prowler Bot
7f33ea76a4 fix(OCSF): fix OCSF output when timestamp is UNIX format (#6627)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-20 18:03:51 -05:00
Prowler Bot
1140c29384 fix(aws): list tags for DocumentDB clusters (#6622)
Co-authored-by: Kay Agahd <kagahd@users.noreply.github.com>
2025-01-20 17:22:06 -05:00
Prowler Bot
2441a62f39 fix: update Azure CIS with existing App checks (#6625)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-01-20 16:27:14 -05:00
Pepe Fagoaga
c26a231fc1 chore(version): Update version to 5.1.5 (#6618) 2025-01-20 15:07:58 -05:00
Prowler Bot
2fb2315037 chore(RBAC): add permission's info (#6617)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 17:31:52 +01:00
Prowler Bot
a9e475481a fix(snippet-id): improve provider ID readability in tables (#6616)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 17:28:04 +01:00
Prowler Bot
826d7c4dc3 fix(rbac): remove invalid required permission (#6614)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-01-20 17:02:31 +01:00
Prowler Bot
b7f4b37f66 feat(api): restrict the deletion of users, only the user of the request can be deleted (#6613)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-01-20 17:02:21 +01:00
Prowler Bot
193d691bfe fix(RBAC): tweaks for edit role form (#6610)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 14:12:43 +01:00
Prowler Bot
a359bc581c fix(RBAC): restore manage_account permission for roles (#6603)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-20 11:53:08 +01:00
Prowler Bot
9a28ff025a fix(sqs): fix flaky test (#6595)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-01-17 12:40:11 -05:00
Prowler Bot
f1c7050700 fix(apigatewayv2): managed exception NotFoundException (#6590)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-01-17 09:27:19 -05:00
Pepe Fagoaga
9391c27b9e chore(version): Update version to 5.1.4 (#6591) 2025-01-17 09:25:35 -05:00
Prowler Bot
4c54de092f feat(findings): Add resource_tag filters for findings endpoint (#6587)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-17 19:01:51 +05:45
Prowler Bot
690c482a43 fix(gcp): fix flaky tests from dns service (#6571)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2025-01-17 08:15:34 -05:00
Prowler Bot
ad2d857c6f feat(findings): add /findings/metadata to retrieve dynamic filters information (#6586)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-01-17 18:47:59 +05:45
Pepe Fagoaga
07ee59d2ef chore(version): Update version to 5.1.3 (#6584) 2025-01-17 18:46:08 +05:45
Prowler Bot
bec4617d0a fix(providers): update the label and placeholder based on the cloud provider (#6582)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-17 12:33:25 +01:00
Prowler Bot
94916f8305 fix(findings): remove filter delta_in applied by default (#6579)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-17 11:06:07 +01:00
Prowler Bot
44de651be3 fix(cis): add subsections if needed (#6568)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-16 14:49:49 -05:00
Prowler Bot
bdcba9c642 fix(detect_secrets): refactor logic for detect-secrets (#6566)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-01-16 13:07:18 -05:00
Prowler Bot
c172f75f1a fix(dep): address compatibility issues (#6557)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-16 14:35:06 +01:00
Prowler Bot
ec492fa13a feat(filters): add resource type filter for findings (#6525)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-01-15 08:43:49 +01:00
Prowler Bot
702659959c fix(Azure TDE): add filter for master DB (#6514)
Co-authored-by: johannes-engler-mw <132657752+johannes-engler-mw@users.noreply.github.com>
2025-01-14 15:25:27 -05:00
Pepe Fagoaga
fef332a591 chore(version): set next fixes 2025-01-14 18:05:04 +01:00
1062 changed files with 63580 additions and 9422 deletions

.env (6 changed lines)
View File

@@ -24,6 +24,10 @@ POSTGRES_USER=prowler
POSTGRES_PASSWORD=postgres
POSTGRES_DB=prowler_db
# Celery-Prowler task settings
TASK_RETRY_DELAY_SECONDS=0.1
TASK_RETRY_ATTEMPTS=5
# Valkey settings
# If running Valkey and celery on host, use localhost, else use 'valkey'
VALKEY_HOST=valkey
@@ -123,7 +127,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.5.0
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.5.1
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"

View File

@@ -9,8 +9,8 @@ updates:
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "daily"
open-pull-requests-limit: 10
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
@@ -31,8 +31,8 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "daily"
open-pull-requests-limit: 10
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
@@ -53,46 +53,47 @@ updates:
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
interval: "monthly"
open-pull-requests-limit: 25
target-branch: master
labels:
- "dependencies"
- "docker"
# Dependabot Updates are temporary disabled - 2025/04/15
# v4.6
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "pip"
- "v4"
# - package-ecosystem: "pip"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "pip"
# - "v4"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "github_actions"
- "v4"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "github_actions"
# - "v4"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "weekly"
open-pull-requests-limit: 10
target-branch: v4.6
labels:
- "dependencies"
- "docker"
- "v4"
# - package-ecosystem: "docker"
# directory: "/"
# schedule:
# interval: "weekly"
# open-pull-requests-limit: 10
# target-branch: v4.6
# labels:
# - "dependencies"
# - "docker"
# - "v4"
# Dependabot Updates are temporary disabled - 2025/03/19
# v3
@@ -107,13 +108,13 @@ updates:
# - "pip"
# - "v3"
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"
open-pull-requests-limit: 10
target-branch: v3
labels:
- "dependencies"
- "github_actions"
- "v3"
# - package-ecosystem: "github-actions"
# directory: "/"
# schedule:
# interval: "monthly"
# open-pull-requests-limit: 10
# target-branch: v3
# labels:
# - "dependencies"
# - "github_actions"
# - "v3"

View File

@@ -81,7 +81,7 @@ jobs:
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
@@ -94,7 +94,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true

View File

@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
category: "/language:${{matrix.language}}"

View File

@@ -75,7 +75,7 @@ jobs:
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@2f7c5bfce28377bc069a65ba478de0a74aa0ca32 # v46.0.1
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: |
@@ -85,6 +85,14 @@ jobs:
api/README.md
api/mkdocs.yml
- name: Replace @master with current branch in pyproject.toml
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
echo "Using branch: $BRANCH_NAME"
sed -i "s|@master|@$BRANCH_NAME|g" pyproject.toml
- name: Install poetry
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -92,9 +100,15 @@ jobs:
python -m pip install --upgrade pip
pipx install poetry==2.1.1
- name: Update poetry.lock after the branch name change
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry lock
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -145,7 +159,7 @@ jobs:
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 70612,66963
poetry run safety check --ignore 70612,66963,74429,77119
- name: Vulture
working-directory: ./api
@@ -167,7 +181,7 @@ jobs:
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@0565863a31f2c772f9f0395002a31e3f06189574 # v5.4.0
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -179,7 +193,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false

View File

@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@ded5f45b92c00939718787ce586b520bbe795f3b # v3.88.18
uses: trufflesecurity/trufflehog@b06f6d72a3791308bb7ba59c2b8cb7a083bd17e4 # v3.88.26
with:
path: ./
base: ${{ github.event.repository.default_branch }}

View File

@@ -0,0 +1,86 @@
name: Check Changelog
on:
pull_request:
types: [opened, synchronize, reopened, labeled, unlabeled]
jobs:
check-changelog:
if: contains(github.event.pull_request.labels.*.name, 'no-changelog') == false
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read
pull-requests: write
env:
MONITORED_FOLDERS: "api ui prowler"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- name: Get list of changed files
id: changed_files
run: |
git fetch origin ${{ github.base_ref }}
git diff --name-only origin/${{ github.base_ref }}...HEAD > changed_files.txt
cat changed_files.txt
- name: Check for folder changes and changelog presence
id: check_folders
run: |
missing_changelogs=""
for folder in $MONITORED_FOLDERS; do
if grep -q "^${folder}/" changed_files.txt; then
echo "Detected changes in ${folder}/"
if ! grep -q "^${folder}/CHANGELOG.md$" changed_files.txt; then
echo "No changelog update found for ${folder}/"
missing_changelogs="${missing_changelogs}- \`${folder}\`\n"
fi
fi
done
echo "missing_changelogs<<EOF" >> $GITHUB_OUTPUT
echo -e "${missing_changelogs}" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
- name: Find existing changelog comment
if: github.event.pull_request.head.repo.full_name == github.repository
id: find_comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e #v3.1.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-includes: '<!-- changelog-check -->'
- name: Comment on PR if changelog is missing
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs != ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
⚠️ **Changes detected in the following folders without a corresponding update to the `CHANGELOG.md`:**
${{ steps.check_folders.outputs.missing_changelogs }}
Please add an entry to the corresponding `CHANGELOG.md` file to maintain a clear history of changes.
- name: Comment on PR if all changelogs are present
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs == ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
✅ All necessary `CHANGELOG.md` files have been updated. Great job! 🎉
- name: Fail if changelog is missing
if: steps.check_folders.outputs.missing_changelogs != ''
run: |
echo "ERROR: Missing changelog updates in some folders."
exit 1
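
The folder-gating logic above is straightforward to reproduce locally before pushing; a minimal sketch, assuming the same monitored folders and `origin/master` as the base branch:

```bash
#!/usr/bin/env bash
# Local approximation of the check-changelog job (assumes origin/master is the base branch).
MONITORED_FOLDERS="api ui prowler"
git fetch origin master
git diff --name-only origin/master...HEAD > changed_files.txt

missing=""
for folder in $MONITORED_FOLDERS; do
  if grep -q "^${folder}/" changed_files.txt; then
    # Any change under the folder must come with an update to that folder's CHANGELOG.md.
    grep -q "^${folder}/CHANGELOG.md$" changed_files.txt || missing="${missing} ${folder}"
  fi
done

if [ -n "$missing" ]; then
  echo "Missing CHANGELOG.md updates in:${missing}"
  exit 1
fi
echo "All required CHANGELOG.md files are updated."
```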

View File

@@ -0,0 +1,37 @@
name: Prowler - Merged Pull Request
on:
pull_request_target:
branches: ['master']
types: ['closed']
jobs:
trigger-cloud-pull-request:
name: Trigger Cloud Pull Request
if: github.event.pull_request.merged == true && github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ github.event.pull_request.merge_commit_sha }}
- name: Set short git commit SHA
id: vars
run: |
shortSha=$(git rev-parse --short ${{ github.sha }})
echo "SHORT_SHA=${shortSha}" >> $GITHUB_ENV
- name: Trigger pull request
uses: peter-evans/repository-dispatch@ff45666b9427631e3450c54a1bcbee4d9ff4d7c0 # v3.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
repository: ${{ secrets.CLOUD_DISPATCH }}
event-type: prowler-pull-request-merged
client-payload: '{
"PROWLER_COMMIT_SHA": "${{ github.sha }}",
"PROWLER_COMMIT_SHORT_SHA": "${{ env.SHORT_SHA }}",
"PROWLER_PR_TITLE": "${{ github.event.pull_request.title }}",
"PROWLER_PR_LABELS": ${{ toJson(github.event.pull_request.labels.*.name) }},
"PROWLER_PR_BODY": ${{ toJson(github.event.pull_request.body) }},
"PROWLER_PR_URL":${{ toJson(github.event.pull_request.html_url) }}
}'
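
The repository-dispatch action above boils down to a standard GitHub `repository_dispatch` API call; an equivalent manual trigger (OWNER/REPO and the token are placeholders, payload trimmed for illustration) would be:

```bash
# Hypothetical manual equivalent of the repository-dispatch step.
# OWNER/REPO and GITHUB_TOKEN are placeholders; the real target repository comes from secrets.CLOUD_DISPATCH.
curl -sS -X POST \
  -H "Accept: application/vnd.github+json" \
  -H "Authorization: Bearer ${GITHUB_TOKEN}" \
  https://api.github.com/repos/OWNER/REPO/dispatches \
  -d '{
        "event_type": "prowler-pull-request-merged",
        "client_payload": {
          "PROWLER_COMMIT_SHA": "abc1234def5678",
          "PROWLER_COMMIT_SHORT_SHA": "abc1234"
        }
      }'
```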

View File

@@ -62,13 +62,13 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Python
uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: Install Poetry
run: |
pipx install poetry==1.8.5
pipx install poetry==2.*
pipx inject poetry poetry-bumpversion
- name: Get Prowler version
@@ -127,7 +127,7 @@ jobs:
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
push: true
tags: |
@@ -140,7 +140,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context

.github/workflows/sdk-bump-version.yml (new file, 145 lines)
View File

@@ -0,0 +1,145 @@
name: SDK - Bump Version
on:
release:
types: [published]
env:
PROWLER_VERSION: ${{ github.event.release.tag_name }}
BASE_BRANCH: master
jobs:
bump-version:
name: Bump Version
if: github.repository == 'prowler-cloud/prowler'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Prowler version
shell: bash
run: |
if [[ $PROWLER_VERSION =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
MAJOR_VERSION=${BASH_REMATCH[1]}
MINOR_VERSION=${BASH_REMATCH[2]}
FIX_VERSION=${BASH_REMATCH[3]}
# Export version components to GitHub environment
echo "MAJOR_VERSION=${MAJOR_VERSION}" >> "${GITHUB_ENV}"
echo "MINOR_VERSION=${MINOR_VERSION}" >> "${GITHUB_ENV}"
echo "FIX_VERSION=${FIX_VERSION}" >> "${GITHUB_ENV}"
if (( MAJOR_VERSION == 5 )); then
if (( FIX_VERSION == 0 )); then
echo "Minor Release: $PROWLER_VERSION"
# Set up next minor version for master
BUMP_VERSION_TO=${MAJOR_VERSION}.$((MINOR_VERSION + 1)).${FIX_VERSION}
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=${BASE_BRANCH}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
# Set up patch version for version branch
PATCH_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.1
echo "PATCH_VERSION_TO=${PATCH_VERSION_TO}" >> "${GITHUB_ENV}"
VERSION_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "VERSION_BRANCH=${VERSION_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next minor version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
echo "Bumping to next patch version: ${PATCH_VERSION_TO} in branch ${VERSION_BRANCH}"
else
echo "Patch Release: $PROWLER_VERSION"
BUMP_VERSION_TO=${MAJOR_VERSION}.${MINOR_VERSION}.$((FIX_VERSION + 1))
echo "BUMP_VERSION_TO=${BUMP_VERSION_TO}" >> "${GITHUB_ENV}"
TARGET_BRANCH=v${MAJOR_VERSION}.${MINOR_VERSION}
echo "TARGET_BRANCH=${TARGET_BRANCH}" >> "${GITHUB_ENV}"
echo "Bumping to next patch version: ${BUMP_VERSION_TO} in branch ${TARGET_BRANCH}"
fi
else
echo "Releasing another Prowler major version, aborting..."
exit 1
fi
else
echo "Invalid version syntax: '$PROWLER_VERSION' (must be N.N.N)" >&2
exit 1
fi
- name: Bump versions in files
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using BUMP_VERSION_TO=$BUMP_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${BUMP_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${BUMP_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${BUMP_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.TARGET_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.BUMP_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
- name: Handle patch version for minor release
if: env.FIX_VERSION == '0'
run: |
echo "Using PROWLER_VERSION=$PROWLER_VERSION"
echo "Using PATCH_VERSION_TO=$PATCH_VERSION_TO"
set -e
echo "Bumping version in pyproject.toml ..."
sed -i "s|version = \"${PROWLER_VERSION}\"|version = \"${PATCH_VERSION_TO}\"|" pyproject.toml
echo "Bumping version in prowler/config/config.py ..."
sed -i "s|prowler_version = \"${PROWLER_VERSION}\"|prowler_version = \"${PATCH_VERSION_TO}\"|" prowler/config/config.py
echo "Bumping version in .env ..."
sed -i "s|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PROWLER_VERSION}|NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${PATCH_VERSION_TO}|" .env
git --no-pager diff
- name: Create Pull Request for patch version
if: env.FIX_VERSION == '0'
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
base: ${{ env.VERSION_BRANCH }}
commit-message: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
branch: "version-bump-to-v${{ env.PATCH_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
body: |
### Description
Bump Prowler version to v${{ env.PATCH_VERSION_TO }}
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
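
The release-classification logic in the new workflow hinges on the semver regex near the top; a standalone sketch of the same parsing, with sample version values only, behaves like this:

```bash
# Standalone check of the version parsing used by the bump job (sample values only).
parse_version() {
  local version="$1"
  if [[ $version =~ ^([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
    local major=${BASH_REMATCH[1]} minor=${BASH_REMATCH[2]} fix=${BASH_REMATCH[3]}
    if (( fix == 0 )); then
      echo "$version -> minor release: master bumps to ${major}.$((minor + 1)).0, v${major}.${minor} starts at ${major}.${minor}.1"
    else
      echo "$version -> patch release: v${major}.${minor} bumps to ${major}.${minor}.$((fix + 1))"
    fi
  else
    echo "Invalid version syntax: '$version' (must be N.N.N)" >&2
    return 1
  fi
}

parse_version "5.7.0"   # minor release path
parse_version "5.7.6"   # patch release path
```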

View File

@@ -21,6 +21,7 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
pull_request:
branches:
- "master"
@@ -30,6 +31,7 @@ on:
paths-ignore:
- 'ui/**'
- 'api/**'
- '.github/**'
schedule:
- cron: '00 12 * * *'
@@ -54,12 +56,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
category: "/language:${{matrix.language}}"

View File

@@ -25,7 +25,7 @@ jobs:
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@2f7c5bfce28377bc069a65ba478de0a74aa0ca32 # v46.0.1
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: ./**
files_ignore: |
@@ -34,6 +34,7 @@ jobs:
permissions/**
api/**
ui/**
prowler/CHANGELOG.md
README.md
mkdocs.yml
.backportrc.json
@@ -50,7 +51,7 @@ jobs:
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ matrix.python-version }}
cache: "poetry"
@@ -106,15 +107,128 @@ jobs:
run: |
/tmp/hadolint Dockerfile --ignore=DL3013
- name: Test with pytest
# Test AWS
- name: AWS - Check if any file has changed
id: aws-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/aws/**
./tests/providers/aws/**
.poetry.lock
- name: AWS - Test
if: steps.aws-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/aws --cov-report=xml:aws_coverage.xml tests/providers/aws
# Test Azure
- name: Azure - Check if any file has changed
id: azure-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/azure/**
./tests/providers/azure/**
.poetry.lock
- name: Azure - Test
if: steps.azure-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/azure --cov-report=xml:azure_coverage.xml tests/providers/azure
# Test GCP
- name: GCP - Check if any file has changed
id: gcp-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/gcp/**
./tests/providers/gcp/**
.poetry.lock
- name: GCP - Test
if: steps.gcp-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/gcp --cov-report=xml:gcp_coverage.xml tests/providers/gcp
# Test Kubernetes
- name: Kubernetes - Check if any file has changed
id: kubernetes-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/kubernetes/**
./tests/providers/kubernetes/**
.poetry.lock
- name: Kubernetes - Test
if: steps.kubernetes-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/kubernetes --cov-report=xml:kubernetes_coverage.xml tests/providers/kubernetes
# Test GitHub
- name: GitHub - Check if any file has changed
id: github-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/github/**
./tests/providers/github/**
.poetry.lock
- name: GitHub - Test
if: steps.github-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/github --cov-report=xml:github_coverage.xml tests/providers/github
# Test NHN
- name: NHN - Check if any file has changed
id: nhn-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/nhn/**
./tests/providers/nhn/**
.poetry.lock
- name: NHN - Test
if: steps.nhn-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/nhn --cov-report=xml:nhn_coverage.xml tests/providers/nhn
# Test M365
- name: M365 - Check if any file has changed
id: m365-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/m365/**
./tests/providers/m365/**
.poetry.lock
- name: M365 - Test
if: steps.m365-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Common Tests
- name: Lib - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler --cov-report=xml tests
poetry run pytest -n auto --cov=./prowler/lib --cov-report=xml:lib_coverage.xml tests/lib
- name: Config - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/config --cov-report=xml:config_coverage.xml tests/config
# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@0565863a31f2c772f9f0395002a31e3f06189574 # v5.4.0
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
flags: prowler
files: ./aws_coverage.xml,./azure_coverage.xml,./gcp_coverage.xml,./kubernetes_coverage.xml,./github_coverage.xml,./nhn_coverage.xml,./m365_coverage.xml,./lib_coverage.xml,./config_coverage.xml
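
With coverage now split per provider, a single suite can be exercised locally with the same flags the workflow uses; a sketch assuming the test dependencies (pytest, pytest-xdist, pytest-cov) are installed through Poetry:

```bash
# Run only the AWS provider tests and produce the per-provider coverage file CI uploads to Codecov.
poetry install            # assumes pytest, pytest-xdist and pytest-cov come with the dev dependencies
poetry run pytest -n auto \
  --cov=./prowler/providers/aws \
  --cov-report=xml:aws_coverage.xml \
  tests/providers/aws
```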

View File

@@ -7,7 +7,7 @@ on:
env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
CACHE: "poetry"
# CACHE: "poetry"
jobs:
repository-check:
@@ -68,13 +68,13 @@ jobs:
- name: Install dependencies
run: |
pipx install poetry==1.8.5
pipx install poetry==2.1.1
- name: Setup Python
uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: ${{ env.CACHE }}
# cache: ${{ env.CACHE }}
- name: Build Prowler package
run: |

View File

@@ -4,7 +4,7 @@ name: SDK - Refresh AWS services' regions
on:
schedule:
- cron: "0 9 * * *" #runs at 09:00 UTC everyday
- cron: "0 9 * * 1" # runs at 09:00 UTC every Monday
env:
GITHUB_BRANCH: "master"
@@ -28,7 +28,7 @@ jobs:
ref: ${{ env.GITHUB_BRANCH }}
- name: setup python
uses: actions/setup-python@42375524e23c412d93fb67b49958b491fce71c38 # v5.4.0
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.9 #install the python needed

View File

@@ -81,7 +81,7 @@ jobs:
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
@@ -96,7 +96,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |

View File

@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@5f8171a638ada777af81d42b55959a643bb29017 # v3.28.12
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
with:
category: "/language:${{matrix.language}}"

View File

@@ -31,7 +31,7 @@ jobs:
with:
persist-credentials: false
- name: Setup Node.js ${{ matrix.node-version }}
uses: actions/setup-node@cdca7365b2dadb8aad0a33bc7601856ffabcc48e # v4.3.0
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
@@ -50,7 +50,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Build Container
uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target

.gitignore (3 changed lines)
View File

@@ -42,6 +42,9 @@ junit-reports/
# VSCode files
.vscode/
# Cursor files
.cursorignore
# Terraform
.terraform*
*.tfstate

View File

@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963'
entry: bash -c 'safety check --ignore 70612,66963,74429'
language: system
- id: vulture

View File

@@ -1,24 +1,43 @@
FROM python:3.12.9-alpine3.20
FROM python:3.12.10-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
# Update system dependencies and install essential tools
#hadolint ignore=DL3018
RUN apk --no-cache upgrade && apk --no-cache add curl git gcc python3-dev musl-dev linux-headers
ARG POWERSHELL_VERSION=7.5.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
# Create non-root user
RUN mkdir -p /home/prowler && \
echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
echo 'prowler:x:1000:' > /etc/group && \
chown -R prowler:prowler /home/prowler
USER prowler
# Copy necessary files
WORKDIR /home/prowler
# Copy necessary files
COPY prowler/ /home/prowler/prowler/
COPY dashboard/ /home/prowler/dashboard/
COPY pyproject.toml /home/prowler
COPY README.md /home/prowler/
COPY prowler/providers/m365/lib/powershell/m365_powershell.py /home/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py
# Install Python dependencies
ENV HOME='/home/prowler'
@@ -34,6 +53,9 @@ RUN pip install --no-cache-dir --upgrade pip && \
RUN poetry install --compile && \
rm -rf ~/.cache/pip
# Install PowerShell modules
RUN poetry run python prowler/providers/m365/lib/powershell/m365_powershell.py
# Remove deprecated dash dependencies
RUN pip uninstall dash-html-components -y && \
pip uninstall dash-core-components -y
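
A quick sanity check of the new PowerShell layer is to build the image and call pwsh directly; a sketch with an illustrative local tag:

```bash
# Build the SDK image and confirm PowerShell 7.5 is available inside it (tag is illustrative).
docker build -t prowler-sdk:local .
docker run --rm --entrypoint pwsh prowler-sdk:local -Command '$PSVersionTable.PSVersion'
```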

README.md (164 changed lines)
View File

@@ -3,7 +3,7 @@
<img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
</p>
<p align="center">
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment they're meant to protect. Trusted by the leaders in security.
<b><i>Prowler Open Source</b> is as dynamic and adaptable as the environment it secures. It is trusted by the industry leaders to uphold the highest standards in security.
</p>
<p align="center">
<b>Learn more at <a href="https://prowler.com">prowler.com</i></b>
@@ -43,15 +43,29 @@
# Description
**Prowler** is an Open Source security tool to perform AWS, Azure, Google Cloud and Kubernetes security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness, and also remediations! We have Prowler CLI (Command Line Interface) that we call Prowler Open Source and a service on top of it that we call <a href="https://prowler.com">Prowler Cloud</a>.
**Prowler** is an open-source security tool designed to assess and enforce security best practices across AWS, Azure, Google Cloud, and Kubernetes. It supports tasks such as security audits, incident response, continuous monitoring, system hardening, forensic readiness, and remediation processes.
Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:
- **Industry Standards:** CIS, NIST 800, NIST CSF, and CISA
- **Regulatory Compliance and Governance:** RBI, FedRAMP, and PCI-DSS
- **Frameworks for Sensitive Data and Privacy:** GDPR, HIPAA, and FFIEC
- **Frameworks for Organizational Governance and Quality Control:** SOC2 and GXP
- **AWS-Specific Frameworks:** AWS Foundational Technical Review (FTR) and AWS Well-Architected Framework (Security Pillar)
- **National Security Standards:** ENS (Spanish National Security Scheme)
- **Custom Security Frameworks:** Tailored to your needs
## Prowler CLI and Prowler Cloud
Prowler offers a Command Line Interface (CLI), known as Prowler Open Source, and an additional service built on top of it, called <a href="https://prowler.com">Prowler Cloud</a>.
## Prowler App
Prowler App is a web application that allows you to run Prowler in your cloud provider accounts and visualize the results in a user-friendly interface.
Prowler App is a web-based application that simplifies running Prowler across your cloud provider accounts. It provides a user-friendly interface to visualize the results and streamline your security assessments.
![Prowler App](docs/img/overview.png)
>More details at [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
>For more details, refer to the [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
## Prowler CLI
@@ -60,6 +74,7 @@ prowler <provider>
```
![Prowler CLI Execution](docs/img/short-display.png)
## Prowler Dashboard
```console
@@ -67,25 +82,28 @@ prowler dashboard
```
![Prowler Dashboard](docs/img/dashboard.png)
It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks.
# Prowler at a Glance
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 564 | 82 | 33 | 10 |
| GCP | 77 | 13 | 6 | 3 |
| Azure | 140 | 18 | 7 | 3 |
| AWS | 564 | 82 | 34 | 10 |
| GCP | 79 | 13 | 7 | 3 |
| Azure | 140 | 18 | 8 | 3 |
| Kubernetes | 83 | 7 | 4 | 7 |
| Microsoft365 | 5 | 2 | 1 | 0 |
| GitHub | 3 | 2 | 1 | 0 |
| M365 | 44 | 2 | 2 | 0 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> You can list the checks, services, compliance frameworks and categories with `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories: `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
# 💻 Installation
## Prowler App
Prowler App can be installed in different ways, depending on your environment:
Installing Prowler App
Prowler App offers flexible installation methods tailored to various environments:
> See how to use Prowler App in the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
> For detailed instructions on using Prowler App, refer to the [Prowler App Usage Guide](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/prowler-app/).
### Docker Compose
@@ -101,8 +119,16 @@ curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/mast
docker compose up -d
```
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
> Containers are built for `linux/amd64`.
### Configuring Your Workstation for Prowler App
If your workstation's architecture is incompatible, you can resolve this by:
- **Setting the environment variable**: `DOCKER_DEFAULT_PLATFORM=linux/amd64`
- **Using the following flag in your Docker command**: `--platform linux/amd64`
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
### From GitHub
@@ -128,12 +154,12 @@ python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
> [!IMPORTANT]
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
>
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
> After completing the setup, access the API documentation at http://localhost:8080/api/v1/docs.
**Commands to run the API Worker**
@@ -171,29 +197,31 @@ npm run build
npm start
```
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
> Once configured, access the Prowler App at http://localhost:3000. Sign up using your email and password to get started.
## Prowler CLI
### Pip package
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/), thus can be installed using pip with Python > 3.9.1, < 3.13:
Prowler CLI is available as a project in [PyPI](https://pypi.org/project/prowler-cloud/). Consequently, it can be installed using pip with Python >3.9.1, <3.13:
```console
pip install prowler
prowler -v
```
>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
>For further guidance, refer to [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli-installation)
### Containers
The available versions of Prowler CLI are the following:
**Available Versions of Prowler CLI**
- `latest`: in sync with `master` branch (bear in mind that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (bear in mind that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always point to the latest release.
- `v4-stable`: this tag always point to the latest release for v4.
- `v3-stable`: this tag always point to the latest release for v3.
The following versions of Prowler CLI are available, depending on your requirements:
- `latest`: Synchronizes with the `master` branch. Note that this version is not stable.
- `v4-latest`: Synchronizes with the `v4` branch. Note that this version is not stable.
- `v3-latest`: Synchronizes with the `v3` branch. Note that this version is not stable.
- `<x.y.z>` (release): Stable releases corresponding to specific versions. You can find the complete list of releases [here](https://github.com/prowler-cloud/prowler/releases).
- `stable`: Always points to the latest release.
- `v4-stable`: Always points to the latest release for v4.
- `v3-stable`: Always points to the latest release for v3.
The container images are available here:
- Prowler CLI:
@@ -205,7 +233,7 @@ The container images are available here:
### From GitHub
Python > 3.9.1, < 3.13 is required with pip and poetry:
Python >3.9.1, <3.13 is required with pip and Poetry:
``` console
git clone https://github.com/prowler-cloud/prowler
@@ -215,25 +243,46 @@ poetry install
python prowler-cli.py -v
```
> [!IMPORTANT]
> Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
>
> If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
> In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> To clone Prowler on Windows, configure Git to support long file paths by running the following command: `git config core.longpaths true`.
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
> [!IMPORTANT]
> As of Poetry v2.0.0, the `poetry shell` command has been deprecated. Use `poetry env activate` instead for environment activation.
>
> If your Poetry version is below v2.0.0, continue using `poetry shell` to activate your environment.
> For further guidance, refer to the Poetry Environment Activation Guide https://python-poetry.org/docs/managing-environments/#activating-the-environment.
# ✏️ High level architecture
## Prowler App
The **Prowler App** consists of three main components:
**Prowler App** is composed of three key components:
- **Prowler UI**: A user-friendly web interface for running Prowler and viewing results, powered by Next.js.
- **Prowler API**: The backend API that executes Prowler scans and stores the results, built with Django REST Framework.
- **Prowler SDK**: A Python SDK that integrates with the Prowler CLI for advanced functionality.
- **Prowler UI**: A web-based interface, built with Next.js, providing a user-friendly experience for executing Prowler scans and visualizing results.
- **Prowler API**: A backend service, developed with Django REST Framework, responsible for running Prowler scans and storing the generated results.
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
![Prowler App Architecture](docs/img/prowler-app-architecture.png)
## Prowler CLI
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.
**Running Prowler**
Prowler can be executed across various environments, offering flexibility to meet your needs. It can be run from:
- Your own workstation
- A Kubernetes Job
- Google Compute Engine
- Azure Virtual Machines (VMs)
- Amazon EC2 instances
- AWS Fargate or other container platforms
- CloudShell
And many more environments.
![Architecture](docs/img/architecture.png)
@@ -241,23 +290,36 @@ You can run Prowler from your workstation, a Kubernetes Job, a Google Compute En
## General
- `Allowlist` now is called `Mutelist`.
- The `--quiet` option has been deprecated, now use the `--status` flag to select the finding's status you want to get from PASS, FAIL or MANUAL.
- All `INFO` finding's status has changed to `MANUAL`.
- The CSV output format is common for all the providers.
- The `--quiet` option has been deprecated. Use the `--status` flag to filter findings by status: PASS, FAIL, or MANUAL (see the example after this list).
- All findings with an `INFO` status have been reclassified as `MANUAL`.
- The CSV output format is standardized across all providers.
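A minimal sketch of the replacement flag, assuming an already configured AWS provider (the provider argument and chosen statuses are illustrative):
```console
# Formerly: prowler aws --quiet
# Now: return only failed findings
prowler aws --status FAIL

# Several statuses can be requested at once
prowler aws --status FAIL MANUAL
```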
We have deprecated some of our outputs formats:
- The native JSON is replaced for the JSON [OCSF](https://schema.ocsf.io/) v1.1.0, common for all the providers.
**Deprecated Output Formats**
The following formats are now deprecated:
- Native JSON has been replaced with JSON in [OCSF](https://schema.ocsf.io/) v1.1.0 format, which is standardized across all providers.
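A hedged example of selecting the OCSF-based JSON output. The `--output-formats` flag name reflects recent Prowler versions and is an assumption here, so verify it with `prowler --help` on your installed version:
```console
# Produce the common CSV output plus the OCSF v1.1.0 JSON output
prowler aws --output-formats csv json-ocsf
```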
## AWS
- Deprecate the AWS flag --sts-endpoint-region since we use AWS STS regional tokens.
- To send only FAILS to AWS Security Hub, now use either `--send-sh-only-fails` or `--security-hub --status FAIL`.
**AWS Flag Deprecation**
The flag `--sts-endpoint-region` has been deprecated due to the adoption of AWS STS regional tokens.
**Sending FAIL Results to AWS Security Hub**
- To send only FAIL findings to AWS Security Hub, use one of the following options: `--send-sh-only-fails` or `--security-hub --status FAIL`.
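For instance, either of the following invocations forwards only failed findings to Security Hub (AWS provider shown for illustration; `--send-sh-only-fails` is assumed to be combined with `--security-hub`):
```console
prowler aws --security-hub --send-sh-only-fails
# or
prowler aws --security-hub --status FAIL
```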
# 📖 Documentation
Install, Usage, Tutorials and Developer Guide is at https://docs.prowler.com/
**Documentation Resources**
For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
# 📃 License
Prowler is licensed as Apache License 2.0 as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>
**Prowler License Information**
Prowler is licensed under the Apache License 2.0, as indicated in each file within the repository.
A copy of the License is available at <http://www.apache.org/licenses/LICENSE-2.0>

View File

@@ -53,3 +53,6 @@ DJANGO_GOOGLE_OAUTH_CALLBACK_URL=""
DJANGO_GITHUB_OAUTH_CLIENT_ID=""
DJANGO_GITHUB_OAUTH_CLIENT_SECRET=""
DJANGO_GITHUB_OAUTH_CALLBACK_URL=""
# Deletion Task Batch Size
DJANGO_DELETION_BATCH_SIZE=5000

View File

@@ -80,7 +80,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963'
entry: bash -c 'poetry run safety check --ignore 70612,66963,74429,77119'
language: system
- id: vulture

View File

@@ -2,14 +2,106 @@
All notable changes to the **Prowler API** are documented in this file.
## [v1.8.5] (Prowler v5.7.5)
### Fixed
- Normalize provider UID to ensure safe and unique export directory paths [(#8007)](https://github.com/prowler-cloud/prowler/pull/8007).
- Blank resource types in `/metadata` endpoints [(#8027)](https://github.com/prowler-cloud/prowler/pull/8027)
---
## [v1.6.0] (Prowler UNRELEASED)
## [v1.8.4] (Prowler v5.7.4)
### Removed
- Reverted RLS transaction handling and DB custom backend [(#7994)](https://github.com/prowler-cloud/prowler/pull/7994).
---
## [v1.8.3] (Prowler v5.7.3)
### Added
- Database backend to handle already closed connections [(#7935)](https://github.com/prowler-cloud/prowler/pull/7935).
### Changed
- Renamed field encrypted_password to password for M365 provider [(#7784)](https://github.com/prowler-cloud/prowler/pull/7784)
### Fixed
- Fixed transaction persistence with RLS operations [(#7916)](https://github.com/prowler-cloud/prowler/pull/7916).
- Reverted the change `get_with_retry` to use the original `get` method for retrieving tasks [(#7932)](https://github.com/prowler-cloud/prowler/pull/7932).
- Fixed the connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
---
## [v1.8.2] (Prowler v5.7.2)
### Fixed
- Fixed task lookup to use task_kwargs instead of task_args for scan report resolution. [(#7830)](https://github.com/prowler-cloud/prowler/pull/7830)
- Fixed Kubernetes UID validation to allow valid context names [(#7871)](https://github.com/prowler-cloud/prowler/pull/7871)
- Fixed the connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
- Fixed a race condition when creating background tasks [(#7876)](https://github.com/prowler-cloud/prowler/pull/7876).
- Fixed an error when modifying or retrieving tenants due to missing user UUID in transaction context [(#7890)](https://github.com/prowler-cloud/prowler/pull/7890).
---
## [v1.8.1] (Prowler v5.7.1)
### Fixed
- Added database index to improve performance on finding lookup [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800).
---
## [v1.8.0] (Prowler v5.7.0)
### Added
- Added huge improvements to `/findings/metadata` and resource related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743).
- Added export support for Prowler ThreatScore in M365 [(#7783)](https://github.com/prowler-cloud/prowler/pull/7783)
---
## [v1.7.0] (Prowler v5.6.0)
### Added
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
- Added a `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
- Added a new API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
---
## [v1.6.0] (Prowler v5.5.0)
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
---
## [v1.5.4] (Prowler v5.4.4)
### Fixed
- Fixed a bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466).
---
## [v1.5.3] (Prowler v5.4.3)
### Fixed
- Added handling for duplicated scheduled scans [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401).
- Added environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423).
---
## [v1.5.2] (Prowler v5.4.2)
### Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
---
@@ -20,6 +112,9 @@ All notable changes to the **Prowler API** are documented in this file.
- Fixed a race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172).
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
---

View File

@@ -1,13 +1,33 @@
FROM python:3.12.8-alpine3.20 AS build
FROM python:3.12.10-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/api"
# hadolint ignore=DL3018
RUN apk --no-cache add gcc python3-dev musl-dev linux-headers curl-dev
ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
RUN ARCH=$(uname -m) && \
if [ "$ARCH" = "x86_64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-x64.tar.gz -O /tmp/powershell.tar.gz ; \
elif [ "$ARCH" = "aarch64" ]; then \
wget --progress=dot:giga https://github.com/PowerShell/PowerShell/releases/download/v${POWERSHELL_VERSION}/powershell-${POWERSHELL_VERSION}-linux-arm64.tar.gz -O /tmp/powershell.tar.gz ; \
else \
echo "Unsupported architecture: $ARCH" && exit 1 ; \
fi && \
mkdir -p /opt/microsoft/powershell/7 && \
tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7 && \
chmod +x /opt/microsoft/powershell/7/pwsh && \
ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh && \
rm /tmp/powershell.tar.gz
# Add prowler user
RUN addgroup --gid 1000 prowler && \
adduser --uid 1000 --gid 1000 --disabled-password --gecos "" prowler
RUN apk --no-cache upgrade && \
addgroup -g 1000 prowler && \
adduser -D -u 1000 -G prowler prowler
USER prowler
WORKDIR /home/prowler
@@ -17,7 +37,7 @@ COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
COPY src/backend/ ./backend/
COPY src/backend/ ./backend/
ENV PATH="/home/prowler/.local/bin:$PATH"
@@ -27,18 +47,13 @@ RUN poetry install --no-root && \
COPY docker-entrypoint.sh ./docker-entrypoint.sh
RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
WORKDIR /home/prowler/backend
# Development image
# hadolint ignore=DL3006
FROM build AS dev
USER 0
# hadolint ignore=DL3018
RUN apk --no-cache add curl vim
USER prowler
ENTRYPOINT ["../docker-entrypoint.sh", "dev"]
# Production image

View File

@@ -235,6 +235,7 @@ To view the logs for any component (e.g., Django, Celery worker), you can use th
```console
docker logs -f $(docker ps --format "{{.Names}}" | grep 'api-')
```
## Applying migrations

View File

@@ -28,7 +28,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill -E --max-tasks-per-child 1
}
start_worker_beat() {

api/poetry.lock (generated): 1954 changes
File diff suppressed because it is too large.

View File

@@ -7,7 +7,8 @@ authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.7",
"django==5.1.8",
"django-allauth==65.4.1",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
@@ -22,7 +23,7 @@ dependencies = [
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.7",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
@@ -34,7 +35,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.6.0"
version = "1.8.5"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -45,6 +46,7 @@ coverage = "7.5.4"
django-silk = "5.3.2"
docker = "7.1.0"
freezegun = "1.5.1"
marshmallow = ">=3.15.0,<4.0.0"
mypy = "1.10.1"
pylint = "3.2.5"
pytest = "8.2.2"

View File

@@ -109,16 +109,6 @@ class BaseTenantViewset(BaseViewSet):
pass # Tenant might not exist, handle gracefully
def initial(self, request, *args, **kwargs):
if (
request.resolver_match.url_name != "tenant-detail"
and request.method != "DELETE"
):
user_id = str(request.user.id)
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
return super().initial(request, *args, **kwargs)
# TODO: DRY this when we have time
if request.auth is None:
raise NotAuthenticated
@@ -126,8 +116,8 @@ class BaseTenantViewset(BaseViewSet):
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
user_id = str(request.user.id)
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
return super().initial(request, *args, **kwargs)

View File

@@ -1,12 +1,38 @@
from types import MappingProxyType
from api.models import Provider
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.models import CheckMetadata
from api.models import Provider
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE = {}
PROWLER_CHECKS = {}
AVAILABLE_COMPLIANCE_FRAMEWORKS = {}
def get_compliance_frameworks(provider_type: Provider.ProviderChoices) -> list[str]:
"""
Retrieve and cache the list of available compliance frameworks for a specific cloud provider.
This function lazily loads and caches the available compliance frameworks (e.g., CIS, MITRE, ISO)
for each provider type (AWS, Azure, GCP, etc.) on first access. Subsequent calls for the same
provider will return the cached result.
Args:
provider_type (Provider.ProviderChoices): The cloud provider type for which to retrieve
available compliance frameworks (e.g., "aws", "azure", "gcp", "m365").
Returns:
list[str]: A list of framework identifiers (e.g., "cis_1.4_aws", "mitre_attack_azure") available
for the given provider.
"""
global AVAILABLE_COMPLIANCE_FRAMEWORKS
if provider_type not in AVAILABLE_COMPLIANCE_FRAMEWORKS:
AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type] = (
get_available_compliance_frameworks(provider_type)
)
return AVAILABLE_COMPLIANCE_FRAMEWORKS[provider_type]
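A short usage sketch for the caching helper above (illustrative only, not part of the module; it assumes `from api.compliance import get_compliance_frameworks` and `from api.models import Provider` as seen elsewhere in this diff):
```python
# First call computes and caches the framework list for the provider type;
# subsequent calls return the cached value.
aws_frameworks = get_compliance_frameworks(Provider.ProviderChoices.AWS)
print(aws_frameworks)  # e.g. ["cis_1.4_aws", ...] per the docstring example
```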
def get_prowler_provider_checks(provider_type: Provider.ProviderChoices):

View File

@@ -6,6 +6,7 @@ from datetime import datetime, timedelta, timezone
from django.conf import settings
from django.contrib.auth.models import BaseUserManager
from django.db import connection, models, transaction
from django_celery_beat.models import PeriodicTask
from psycopg2 import connect as psycopg2_connect
from psycopg2.extensions import AsIs, new_type, register_adapter, register_type
from rest_framework_json_api.serializers import ValidationError
@@ -105,11 +106,12 @@ def generate_random_token(length: int = 14, symbols: str | None = None) -> str:
return "".join(secrets.choice(symbols or _symbols) for _ in range(length))
def batch_delete(queryset, batch_size=5000):
def batch_delete(tenant_id, queryset, batch_size=settings.DJANGO_DELETION_BATCH_SIZE):
"""
Deletes objects in batches and returns the total number of deletions and a summary.
Args:
tenant_id (str): Tenant ID the queryset belongs to.
queryset (QuerySet): The queryset of objects to delete.
batch_size (int): The number of objects to delete in each batch.
@@ -120,15 +122,16 @@ def batch_delete(queryset, batch_size=5000):
deletion_summary = {}
while True:
# Get a batch of IDs to delete
batch_ids = set(
queryset.values_list("id", flat=True).order_by("id")[:batch_size]
)
if not batch_ids:
# No more objects to delete
break
with rls_transaction(tenant_id, POSTGRES_TENANT_VAR):
# Get a batch of IDs to delete
batch_ids = set(
queryset.values_list("id", flat=True).order_by("id")[:batch_size]
)
if not batch_ids:
# No more objects to delete
break
deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
deleted_count, deleted_info = queryset.filter(id__in=batch_ids).delete()
total_deleted += deleted_count
for model_label, count in deleted_info.items():
@@ -137,6 +140,18 @@ def batch_delete(queryset, batch_size=5000):
return total_deleted, deletion_summary
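A usage sketch for the updated signature (it mirrors the test shown later in this diff; `tenant_id` is assumed to be the tenant owning the queryset, and the default batch size now comes from `settings.DJANGO_DELETION_BATCH_SIZE`):
```python
# Deletes all providers of the tenant in RLS-scoped batches and returns the total
# number of deleted rows plus a per-model summary, e.g. {"api.Provider": 10}.
total_deleted, summary = batch_delete(tenant_id, Provider.objects.all(), batch_size=500)
```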
def delete_related_daily_task(provider_id: str):
"""
Deletes the periodic task associated with a specific provider.
Args:
provider_id (str): The unique identifier for the provider
whose related periodic task should be deleted.
"""
task_name = f"scan-perform-scheduled-{provider_id}"
PeriodicTask.objects.filter(name=task_name).delete()
# Postgres Enums
@@ -212,6 +227,77 @@ def register_enum(apps, schema_editor, enum_class): # noqa: F841
register_adapter(enum_class, enum_adapter)
def create_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
columns: str,
method: str = "BTREE",
where: str = "",
):
"""
Create an index on every existing partition of `parent_table`.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: A short name for the index (will be prefixed per-partition).
columns: The parenthesized column list, e.g. "tenant_id, scan_id, status".
method: The index method—BTREE, GIN, etc. Defaults to BTREE.
where: Optional WHERE clause (without the leading "WHERE"), e.g. "status = 'FAIL'".
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
where_sql = f" WHERE {where}" if where else ""
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = (
f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {idx_name} "
f"ON {partition} USING {method} ({columns})"
f"{where_sql};"
)
schema_editor.execute(sql)
def drop_index_on_partitions(
apps, # noqa: F841
schema_editor,
parent_table: str,
index_name: str,
):
"""
Drop the per-partition indexes that were created by create_index_on_partitions.
Args:
parent_table: The name of the root table (e.g. "findings").
index_name: The same short name used when creating them.
"""
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT inhrelid::regclass::text
FROM pg_inherits
WHERE inhparent = %s::regclass
""",
[parent_table],
)
partitions = [row[0] for row in cursor.fetchall()]
for partition in partitions:
idx_name = f"{partition.replace('.', '_')}_{index_name}"
sql = f"DROP INDEX CONCURRENTLY IF EXISTS {idx_name};"
schema_editor.execute(sql)
# Postgres enum definition for member role

View File

@@ -81,6 +81,114 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class CommonFindingFilters(FilterSet):
# We filter providers from the scan in findings
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
muted = BooleanFilter(
help_text="If this filter is not provided, muted and non-muted findings will be returned."
)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(method="filter_resource_region")
region__in = CharInFilter(field_name="resource_regions", lookup_expr="overlap")
region__icontains = CharFilter(
field_name="resource_regions", lookup_expr="icontains"
)
service = CharFilter(method="filter_resource_service")
service__in = CharInFilter(field_name="resource_services", lookup_expr="overlap")
service__icontains = CharFilter(
field_name="resource_services", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(method="filter_resource_type")
resource_type__in = CharInFilter(field_name="resource_types", lookup_expr="overlap")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
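Assuming the JSON:API `filter[...]` query syntax used by this API, a findings request exercising the new array-backed filters might look like the following (endpoint path and values are illustrative):
```console
GET /api/v1/findings?filter[scan]=<scan-uuid>&filter[region__in]=eu-west-1,us-east-1&filter[service]=ec2&filter[muted]=false
```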
class TenantFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -257,91 +365,7 @@ class ResourceFilter(ProviderRelationshipFilterSet):
return queryset.filter(tags__text_search=value)
class FindingFilter(FilterSet):
# We filter providers from the scan in findings
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_type__in = ChoiceInFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
provider_uid = CharFilter(field_name="scan__provider__uid", lookup_expr="exact")
provider_uid__in = CharInFilter(field_name="scan__provider__uid", lookup_expr="in")
provider_uid__icontains = CharFilter(
field_name="scan__provider__uid", lookup_expr="icontains"
)
provider_alias = CharFilter(field_name="scan__provider__alias", lookup_expr="exact")
provider_alias__in = CharInFilter(
field_name="scan__provider__alias", lookup_expr="in"
)
provider_alias__icontains = CharFilter(
field_name="scan__provider__alias", lookup_expr="icontains"
)
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
uid = CharFilter(field_name="uid")
delta = ChoiceFilter(choices=Finding.DeltaChoices.choices)
status = ChoiceFilter(choices=StatusChoices.choices)
severity = ChoiceFilter(choices=SeverityChoices)
impact = ChoiceFilter(choices=SeverityChoices)
resources = UUIDInFilter(field_name="resource__id", lookup_expr="in")
region = CharFilter(field_name="resources__region")
region__in = CharInFilter(field_name="resources__region", lookup_expr="in")
region__icontains = CharFilter(
field_name="resources__region", lookup_expr="icontains"
)
service = CharFilter(field_name="resources__service")
service__in = CharInFilter(field_name="resources__service", lookup_expr="in")
service__icontains = CharFilter(
field_name="resources__service", lookup_expr="icontains"
)
resource_uid = CharFilter(field_name="resources__uid")
resource_uid__in = CharInFilter(field_name="resources__uid", lookup_expr="in")
resource_uid__icontains = CharFilter(
field_name="resources__uid", lookup_expr="icontains"
)
resource_name = CharFilter(field_name="resources__name")
resource_name__in = CharInFilter(field_name="resources__name", lookup_expr="in")
resource_name__icontains = CharFilter(
field_name="resources__name", lookup_expr="icontains"
)
resource_type = CharFilter(field_name="resources__type")
resource_type__in = CharInFilter(field_name="resources__type", lookup_expr="in")
resource_type__icontains = CharFilter(
field_name="resources__type", lookup_expr="icontains"
)
# Temporarily disabled until we implement tag filtering in the UI
# resource_tag_key = CharFilter(field_name="resources__tags__key")
# resource_tag_key__in = CharInFilter(
# field_name="resources__tags__key", lookup_expr="in"
# )
# resource_tag_key__icontains = CharFilter(
# field_name="resources__tags__key", lookup_expr="icontains"
# )
# resource_tag_value = CharFilter(field_name="resources__tags__value")
# resource_tag_value__in = CharInFilter(
# field_name="resources__tags__value", lookup_expr="in"
# )
# resource_tag_value__icontains = CharFilter(
# field_name="resources__tags__value", lookup_expr="icontains"
# )
# resource_tags = CharInFilter(
# method="filter_resource_tag",
# lookup_expr="in",
# help_text="Filter by resource tags `key:value` pairs.\nMultiple values may be "
# "separated by commas.",
# )
class FindingFilter(CommonFindingFilters):
scan = UUIDFilter(method="filter_scan_id")
scan__in = UUIDInFilter(method="filter_scan_id_in")
@@ -382,6 +406,15 @@ class FindingFilter(FilterSet):
},
}
def filter_resource_type(self, queryset, name, value):
return queryset.filter(resource_types__contains=[value])
def filter_resource_region(self, queryset, name, value):
return queryset.filter(resource_regions__contains=[value])
def filter_resource_service(self, queryset, name, value):
return queryset.filter(resource_services__contains=[value])
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("inserted_at")
@@ -500,16 +533,6 @@ class FindingFilter(FilterSet):
return queryset.filter(id__lt=end)
def filter_resource_tag(self, queryset, name, value):
overall_query = Q()
for key_value_pair in value:
tag_key, tag_value = key_value_pair.split(":", 1)
overall_query |= Q(
resources__tags__key__icontains=tag_key,
resources__tags__value__icontains=tag_value,
)
return queryset.filter(overall_query).distinct()
@staticmethod
def maybe_date_to_datetime(value):
dt = value
@@ -518,6 +541,31 @@ class FindingFilter(FilterSet):
return dt
class LatestFindingFilter(CommonFindingFilters):
class Meta:
model = Finding
fields = {
"id": ["exact", "in"],
"uid": ["exact", "in"],
"delta": ["exact", "in"],
"status": ["exact", "in"],
"severity": ["exact", "in"],
"impact": ["exact", "in"],
"check_id": ["exact", "in", "icontains"],
}
filter_overrides = {
FindingDeltaEnumField: {
"filter_class": CharFilter,
},
StatusEnumField: {
"filter_class": CharFilter,
},
SeverityEnumField: {
"filter_class": CharFilter,
},
}
class ProviderSecretFilter(FilterSet):
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
@@ -614,12 +662,6 @@ class ScanSummaryFilter(FilterSet):
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
region = CharFilter(field_name="region")
muted_findings = BooleanFilter(method="filter_muted_findings")
def filter_muted_findings(self, queryset, name, value):
if not value:
return queryset.exclude(muted__gt=0)
return queryset
class Meta:
model = ScanSummary
@@ -630,8 +672,6 @@ class ScanSummaryFilter(FilterSet):
class ServiceOverviewFilter(ScanSummaryFilter):
muted_findings = None
def is_valid(self):
# Check if at least one of the inserted_at filters is present
inserted_at_filters = [

View File

@@ -12,6 +12,7 @@ from api.models import (
Provider,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StatusChoices,
)
@@ -133,6 +134,7 @@ class Command(BaseCommand):
region=random.choice(possible_regions),
service=random.choice(possible_services),
type=random.choice(possible_types),
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -181,6 +183,10 @@ class Command(BaseCommand):
"servicename": assigned_resource.service,
"resourcetype": assigned_resource.type,
},
resource_types=[assigned_resource.type],
resource_regions=[assigned_resource.region],
resource_services=[assigned_resource.service],
inserted_at="2024-10-01T00:00:00Z",
)
)
@@ -197,12 +203,22 @@ class Command(BaseCommand):
# Create ResourceFindingMapping
mappings = []
for index, f in enumerate(findings):
scan_resource_cache: set[tuple] = set()
for index, finding_instance in enumerate(findings):
resource_instance = resources[findings_resources_mapping[index]]
mappings.append(
ResourceFindingMapping(
tenant_id=tenant_id,
resource=resources[findings_resources_mapping[index]],
finding=f,
resource=resource_instance,
finding=finding_instance,
)
)
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
)
)
@@ -220,6 +236,38 @@ class Command(BaseCommand):
"Resource-finding mappings created successfully.\n\n"
)
)
with rls_transaction(tenant_id):
scan.progress = 99
scan.save()
self.stdout.write(self.style.WARNING("Creating finding filter values..."))
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=str(scan.id),
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
num_batches = ceil(len(resource_scan_summaries) / batch_size)
with rls_transaction(tenant_id):
for i in tqdm(
range(0, len(resource_scan_summaries), batch_size),
total=num_batches,
):
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries[i : i + batch_size],
ignore_conflicts=True,
)
self.stdout.write(
self.style.SUCCESS("Finding filter values created successfully.\n\n")
)
except Exception as e:
self.stdout.write(self.style.ERROR(f"Failed to populate test data: {e}"))
scan_state = "failed"

View File

@@ -0,0 +1,26 @@
# Generated by Django 5.1.5 on 2025-03-25 11:29
from django.db import migrations, models
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0014_integrations"),
]
operations = [
migrations.AddField(
model_name="finding",
name="muted",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="finding",
name="status",
field=api.db_utils.StatusEnumField(
choices=[("FAIL", "Fail"), ("PASS", "Pass"), ("MANUAL", "Manual")]
),
),
]

View File

@@ -0,0 +1,32 @@
# Generated by Django 5.1.5 on 2025-03-31 10:46
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0015_finding_muted"),
]
operations = [
migrations.AddField(
model_name="finding",
name="compliance",
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AddField(
model_name="resource",
name="details",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="metadata",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="resource",
name="partition",
field=models.TextField(blank=True, null=True),
),
]

View File

@@ -0,0 +1,32 @@
# Generated by Django 5.1.7 on 2025-04-16 08:47
from django.db import migrations
import api.db_utils
class Migration(migrations.Migration):
dependencies = [
("api", "0016_finding_compliance_resource_details_and_more"),
]
operations = [
migrations.AlterField(
model_name="provider",
name="provider",
field=api.db_utils.ProviderEnumField(
choices=[
("aws", "AWS"),
("azure", "Azure"),
("gcp", "GCP"),
("kubernetes", "Kubernetes"),
("m365", "M365"),
],
default="aws",
),
),
migrations.RunSQL(
"ALTER TYPE provider ADD VALUE IF NOT EXISTS 'm365';",
reverse_sql=migrations.RunSQL.noop,
),
]

View File

@@ -0,0 +1,81 @@
# Generated by Django 5.1.7 on 2025-05-05 10:01
import uuid
import django.db.models.deletion
import uuid6
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0017_m365_provider"),
]
operations = [
migrations.CreateModel(
name="ResourceScanSummary",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("scan_id", models.UUIDField(db_index=True, default=uuid6.uuid7)),
("resource_id", models.UUIDField(db_index=True, default=uuid.uuid4)),
("service", models.CharField(max_length=100)),
("region", models.CharField(max_length=100)),
("resource_type", models.CharField(max_length=100)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "resource_scan_summaries",
"indexes": [
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
],
"unique_together": {("tenant_id", "scan_id", "resource_id")},
},
),
migrations.AddConstraint(
model_name="resourcescansummary",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_resourcescansummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]

View File

@@ -0,0 +1,42 @@
import django.contrib.postgres.fields
import django.contrib.postgres.indexes
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0018_resource_scan_summaries"),
]
operations = [
migrations.AddField(
model_name="finding",
name="resource_regions",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_services",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
migrations.AddField(
model_name="finding",
name="resource_types",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(max_length=100),
blank=True,
null=True,
size=None,
),
),
]

View File

@@ -0,0 +1,86 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0019_finding_denormalize_resource_fields"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
columns="resource_services",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_service_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
columns="resource_regions",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_region_idx",
),
),
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
columns="resource_types",
method="GIN",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="gin_find_rtype_idx",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_uid_idx",
columns="uid",
method="BTREE",
),
),
migrations.RunPython(
partial(
drop_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
),
reverse_code=partial(
create_index_on_partitions,
parent_table="findings",
index_name="findings_filter_idx",
columns="scan_id, impact, severity, status, check_id, delta",
method="BTREE",
),
),
]

View File

@@ -0,0 +1,37 @@
import django.contrib.postgres.indexes
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0020_findings_new_performance_indexes_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_services"], name="gin_find_service_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_regions"], name="gin_find_region_idx"
),
),
migrations.AddIndex(
model_name="finding",
index=django.contrib.postgres.indexes.GinIndex(
fields=["resource_types"], name="gin_find_rtype_idx"
),
),
migrations.RemoveIndex(
model_name="finding",
name="findings_uid_idx",
),
migrations.RemoveIndex(
model_name="finding",
name="findings_filter_idx",
),
]

View File

@@ -0,0 +1,38 @@
# Generated by Django 5.1.8 on 2025-05-12 10:04
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0021_findings_new_performance_indexes_parent"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
AddIndexConcurrently(
model_name="scan",
index=models.Index(
condition=models.Q(("state", "completed")),
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
name="scans_prov_state_ins_desc_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
),
AddIndexConcurrently(
model_name="scansummary",
index=models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
),
]

View File

@@ -0,0 +1,28 @@
# Generated by Django 5.1.8 on 2025-05-12 10:18
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0022_scan_summaries_performance_indexes"),
]
operations = [
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "id"], name="resources_tenant_id_idx"
),
),
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
),
]

View File

@@ -0,0 +1,29 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0023_resources_lookup_optimization"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
columns="tenant_id, uid, inserted_at DESC",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="findings",
index_name="find_tenant_uid_inserted_idx",
),
)
]

View File

@@ -0,0 +1,17 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0024_findings_uid_index_partitions"),
]
operations = [
migrations.AddIndex(
model_name="finding",
index=models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
),
]

View File

@@ -5,6 +5,7 @@ from uuid import UUID, uuid4
from cryptography.fernet import Fernet
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.core.validators import MinLengthValidator
@@ -59,7 +60,6 @@ class StatusChoices(models.TextChoices):
FAIL = "FAIL", _("Fail")
PASS = "PASS", _("Pass")
MANUAL = "MANUAL", _("Manual")
MUTED = "MUTED", _("Muted")
class StateChoices(models.TextChoices):
@@ -192,6 +192,7 @@ class Provider(RowLevelSecurityProtectedModel):
AZURE = "azure", _("Azure")
GCP = "gcp", _("GCP")
KUBERNETES = "kubernetes", _("Kubernetes")
M365 = "m365", _("M365")
@staticmethod
def validate_aws_uid(value):
@@ -215,6 +216,19 @@ class Provider(RowLevelSecurityProtectedModel):
pointer="/data/attributes/uid",
)
@staticmethod
def validate_m365_uid(value):
if not re.match(
r"""^(?!-)[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?(?:\.(?!-)[A-Za-z0-9]"""
r"""(?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?)*\.[A-Za-z]{2,}$""",
value,
):
raise ModelValidationError(
detail="M365 domain ID must be a valid domain.",
code="m365-uid",
pointer="/data/attributes/uid",
)
@staticmethod
def validate_gcp_uid(value):
if not re.match(r"^[a-z][a-z0-9-]{5,29}$", value):
@@ -228,7 +242,7 @@ class Provider(RowLevelSecurityProtectedModel):
@staticmethod
def validate_kubernetes_uid(value):
if not re.match(
r"^[a-z0-9][A-Za-z0-9_.:\/-]{1,250}$",
r"^[a-zA-Z0-9][a-zA-Z0-9._@:\/-]{1,250}$",
value,
):
raise ModelValidationError(
@@ -416,6 +430,7 @@ class Scan(RowLevelSecurityProtectedModel):
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
)
output_location = models.CharField(blank=True, null=True, max_length=200)
# TODO: mutelist foreign key
class Meta(RowLevelSecurityProtectedModel.Meta):
@@ -438,6 +453,11 @@ class Scan(RowLevelSecurityProtectedModel):
fields=["tenant_id", "provider_id", "state", "inserted_at"],
name="scans_prov_state_insert_idx",
),
models.Index(
fields=["tenant_id", "provider_id", "state", "-inserted_at"],
condition=Q(state=StateChoices.COMPLETED),
name="scans_prov_state_ins_desc_idx",
),
]
class JSONAPIMeta:
@@ -519,6 +539,11 @@ class Resource(RowLevelSecurityProtectedModel):
editable=False,
)
metadata = models.TextField(blank=True, null=True)
details = models.TextField(blank=True, null=True)
partition = models.TextField(blank=True, null=True)
# Relationships
tags = models.ManyToManyField(
ResourceTag,
verbose_name="Tags associated with the resource, by provider",
@@ -559,6 +584,11 @@ class Resource(RowLevelSecurityProtectedModel):
name="resource_tenant_metadata_idx",
),
GinIndex(fields=["text_search"], name="gin_resources_search_idx"),
models.Index(fields=["tenant_id", "id"], name="resources_tenant_id_idx"),
models.Index(
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
]
constraints = [
@@ -656,6 +686,23 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
tags = models.JSONField(default=dict, null=True, blank=True)
check_id = models.CharField(max_length=100, blank=False, null=False)
check_metadata = models.JSONField(default=dict, null=False)
muted = models.BooleanField(default=False, null=False)
compliance = models.JSONField(default=dict, null=True, blank=True)
# Denormalize resource data for performance
resource_regions = ArrayField(
models.CharField(max_length=100), blank=True, null=True
)
resource_services = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
resource_types = ArrayField(
models.CharField(max_length=100),
blank=True,
null=True,
)
# Relationships
scan = models.ForeignKey(to=Scan, related_name="findings", on_delete=models.CASCADE)
@@ -697,18 +744,6 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
]
indexes = [
models.Index(fields=["uid"], name="findings_uid_idx"),
models.Index(
fields=[
"scan_id",
"impact",
"severity",
"status",
"check_id",
"delta",
],
name="findings_filter_idx",
),
models.Index(fields=["tenant_id", "id"], name="findings_tenant_and_id_idx"),
GinIndex(fields=["text_search"], name="gin_findings_search_idx"),
models.Index(fields=["tenant_id", "scan_id"], name="find_tenant_scan_idx"),
@@ -720,19 +755,42 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
condition=Q(delta="new"),
name="find_delta_new_idx",
),
models.Index(
fields=["tenant_id", "uid", "-inserted_at"],
name="find_tenant_uid_inserted_idx",
),
GinIndex(fields=["resource_services"], name="gin_find_service_idx"),
GinIndex(fields=["resource_regions"], name="gin_find_region_idx"),
GinIndex(fields=["resource_types"], name="gin_find_rtype_idx"),
]
class JSONAPIMeta:
resource_name = "findings"
def add_resources(self, resources: list[Resource] | None):
# Add new relationships with the tenant_id field
if not resources:
return
self.resource_regions = self.resource_regions or []
self.resource_services = self.resource_services or []
self.resource_types = self.resource_types or []
# Deduplication
regions = set(self.resource_regions)
services = set(self.resource_services)
types = set(self.resource_types)
for resource in resources:
ResourceFindingMapping.objects.update_or_create(
resource=resource, finding=self, tenant_id=self.tenant_id
)
regions.add(resource.region)
services.add(resource.service)
types.add(resource.type)
# Save the instance
self.resource_regions = list(regions)
self.resource_services = list(services)
self.resource_types = list(types)
self.save()
@@ -1134,7 +1192,15 @@ class ScanSummary(RowLevelSecurityProtectedModel):
models.Index(
fields=["tenant_id", "scan_id"],
name="scan_summaries_tenant_scan_idx",
)
),
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="ss_tenant_scan_service_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "severity"],
name="ss_tenant_scan_severity_idx",
),
]
class JSONAPIMeta:
@@ -1216,3 +1282,52 @@ class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class ResourceScanSummary(RowLevelSecurityProtectedModel):
scan_id = models.UUIDField(default=uuid7, db_index=True)
resource_id = models.UUIDField(default=uuid4, db_index=True)
service = models.CharField(max_length=100)
region = models.CharField(max_length=100)
resource_type = models.CharField(max_length=100)
class Meta:
db_table = "resource_scan_summaries"
unique_together = (("tenant_id", "scan_id", "resource_id"),)
indexes = [
# Single-dimension lookups:
models.Index(
fields=["tenant_id", "scan_id", "service"],
name="rss_tenant_scan_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region"],
name="rss_tenant_scan_reg_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "resource_type"],
name="rss_tenant_scan_type_idx",
),
# Two-dimension cross-filters:
models.Index(
fields=["tenant_id", "scan_id", "region", "service"],
name="rss_tenant_scan_reg_svc_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "service", "resource_type"],
name="rss_tenant_scan_svc_type_idx",
),
models.Index(
fields=["tenant_id", "scan_id", "region", "resource_type"],
name="rss_tenant_scan_reg_type_idx",
),
]
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
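A hedged sketch of how these per-scan summaries might be queried to build filter values (field names come from the model above; the query itself is illustrative):
```python
# Distinct services observed in a given scan for the current tenant.
services = (
    ResourceScanSummary.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
    .values_list("service", flat=True)
    .distinct()
)
```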

View File

@@ -1,12 +1,12 @@
from celery import states
from celery.signals import before_task_publish
from config.celery import celery_app
from django.db.models.signals import post_delete
from django.dispatch import receiver
from django_celery_beat.models import PeriodicTask
from django_celery_results.backends.database import DatabaseBackend
from api.db_utils import delete_related_daily_task
from api.models import Provider
from config.celery import celery_app
def create_task_result_on_publish(sender=None, headers=None, **kwargs): # noqa: F841
@@ -31,5 +31,4 @@ before_task_publish.connect(
@receiver(post_delete, sender=Provider)
def delete_provider_scan_task(sender, instance, **kwargs): # noqa: F841
# Delete the associated periodic task when the provider is deleted
task_name = f"scan-perform-scheduled-{instance.id}"
PeriodicTask.objects.filter(name=task_name).delete()
delete_related_daily_task(instance.id)

File diff suppressed because it is too large.

View File

@@ -131,9 +131,10 @@ class TestBatchDelete:
return provider_count
@pytest.mark.django_db
def test_batch_delete(self, create_test_providers):
def test_batch_delete(self, tenants_fixture, create_test_providers):
tenant_id = str(tenants_fixture[0].id)
_, summary = batch_delete(
Provider.objects.all(), batch_size=create_test_providers // 2
tenant_id, Provider.objects.all(), batch_size=create_test_providers // 2
)
assert Provider.objects.all().count() == 0
assert summary == {"api.Provider": create_test_providers}

View File

@@ -92,3 +92,31 @@ class TestResourceModel:
assert len(resource.tags.filter(tenant_id=tenant_id)) == 0
assert resource.get_tags(tenant_id=tenant_id) == {}
# @pytest.mark.django_db
# class TestFindingModel:
# def test_add_finding_with_long_uid(
# self, providers_fixture, scans_fixture, resources_fixture
# ):
# provider, *_ = providers_fixture
# tenant_id = provider.tenant_id
# long_uid = "1" * 500
# _ = Finding.objects.create(
# tenant_id=tenant_id,
# uid=long_uid,
# delta=Finding.DeltaChoices.NEW,
# check_metadata={},
# status=StatusChoices.PASS,
# status_extended="",
# severity="high",
# impact="high",
# raw_result={},
# check_id="test_check",
# scan=scans_fixture[0],
# first_seen_at=None,
# muted=False,
# compliance={},
# )
# assert Finding.objects.filter(uid=long_uid).exists()

View File

@@ -0,0 +1,80 @@
import logging
from unittest.mock import MagicMock
from config.settings.sentry import before_send
def test_before_send_ignores_log_with_ignored_exception():
"""Test that before_send ignores logs containing ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_ignores_exception_with_ignored_exception():
"""Test that before_send ignores exceptions containing ignored exceptions."""
exc_info = (Exception, Exception("Provider kubernetes is not connected"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_passes_through_non_ignored_log():
"""Test that before_send passes through logs that don't contain ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Some other error message"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_passes_through_non_ignored_exception():
"""Test that before_send passes through exceptions that don't contain ignored exceptions."""
exc_info = (Exception, Exception("Some other error message"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_handles_warning_level():
"""Test that before_send handles warning level logs."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.WARNING # 30
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None

View File

@@ -19,6 +19,7 @@ from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
class TestMergeDicts:
@@ -104,6 +105,7 @@ class TestReturnProwlerProvider:
(Provider.ProviderChoices.GCP.value, GcpProvider),
(Provider.ProviderChoices.AZURE.value, AzureProvider),
(Provider.ProviderChoices.KUBERNETES.value, KubernetesProvider),
(Provider.ProviderChoices.M365.value, M365Provider),
],
)
def test_return_prowler_provider(self, provider_type, expected_provider):
@@ -176,6 +178,10 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.KUBERNETES.value,
{"context": "provider_uid"},
),
(
Provider.ProviderChoices.M365.value,
{},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):

View File

@@ -2,18 +2,23 @@ import glob
import io
import json
import os
import tempfile
from datetime import datetime, timedelta, timezone
from unittest.mock import ANY, Mock, patch
from pathlib import Path
from unittest.mock import ANY, MagicMock, Mock, patch
import jwt
import pytest
from botocore.exceptions import NoCredentialsError
from botocore.exceptions import ClientError, NoCredentialsError
from conftest import API_JSON_CONTENT_TYPE, TEST_PASSWORD, TEST_USER
from django.conf import settings
from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from api.compliance import get_compliance_frameworks
from api.models import (
ComplianceOverview,
Integration,
Invitation,
Membership,
@@ -909,11 +914,41 @@ class TestProviderViewSet:
"uid": "gke_aaaa-dev_europe-test1_dev-aaaa-test-cluster-long-name-123456789",
"alias": "GKE",
},
{
"provider": "kubernetes",
"uid": "gke_project/cluster-name",
"alias": "GKE",
},
{
"provider": "kubernetes",
"uid": "admin@k8s-demo",
"alias": "test",
},
{
"provider": "azure",
"uid": "8851db6b-42e5-4533-aa9e-30a32d67e875",
"alias": "test",
},
{
"provider": "m365",
"uid": "TestingPro.onMirosoft.com",
"alias": "test",
},
{
"provider": "m365",
"uid": "subdomain.domain.es",
"alias": "test",
},
{
"provider": "m365",
"uid": "microsoft.net",
"alias": "test",
},
{
"provider": "m365",
"uid": "subdomain1.subdomain2.subdomain3.subdomain4.domain.net",
"alias": "test",
},
]
),
)
@@ -982,6 +1017,51 @@ class TestProviderViewSet:
"invalid_choice",
"provider",
),
(
{
"provider": "m365",
"uid": "https://test.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": "thisisnotadomain",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": "http://test.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": f"{'a' * 64}.domain.com",
"alias": "test",
},
"m365-uid",
"uid",
),
(
{
"provider": "m365",
"uid": f"subdomain.{'a' * 64}.com",
"alias": "test",
},
"m365-uid",
"uid",
),
]
),
)
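# The invalid M365 UID cases above reduce to standard domain-name rules: no URL
# scheme, at least one dot, and labels of at most 63 characters. A rough,
# self-contained approximation of such a check (not the serializer's actual
# validator) would be:
import re

DOMAIN_RE = re.compile(
    r"^(?!-)[A-Za-z0-9-]{1,63}(?<!-)(\.(?!-)[A-Za-z0-9-]{1,63}(?<!-))+$"
)
assert DOMAIN_RE.match("subdomain.domain.es")
assert not DOMAIN_RE.match("https://test.com")  # URL schemes are rejected
assert not DOMAIN_RE.match("thisisnotadomain")  # at least one dot is required
assert not DOMAIN_RE.match("a" * 64 + ".domain.com")  # label longer than 63 chars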
@@ -2234,7 +2314,10 @@ class TestScanViewSet:
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
assert response.json()["errors"]["detail"] == "The scan has no reports."
assert (
response.json()["errors"]["detail"]
== "The scan has no reports, or the report generation task has not started yet."
)
def test_report_s3_no_credentials(
self, authenticated_client, scans_fixture, monkeypatch
@@ -2276,7 +2359,8 @@ class TestScanViewSet:
scan.save()
monkeypatch.setattr(
"api.v1.views.env", type("env", (), {"str": lambda self, key: bucket})()
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
)
class FakeS3Client:
@@ -2301,7 +2385,7 @@ class TestScanViewSet:
):
"""
When output_location is a local path and glob.glob returns an empty list,
the view should return HTTP 404 with detail "The scan has no reports."
the view should return HTTP 404 with detail "The scan has no reports, or the report generation task has not started yet."
"""
scan = scans_fixture[0]
scan.output_location = "/tmp/nonexistent_report_pattern.zip"
@@ -2313,37 +2397,337 @@ class TestScanViewSet:
response = authenticated_client.get(url)
assert response.status_code == 404
assert response.json()["errors"]["detail"] == "The scan has no reports."
assert (
response.json()["errors"]["detail"]
== "The scan has no reports, or the report generation task has not started yet."
)
def test_report_local_file(
self, authenticated_client, scans_fixture, tmp_path, monkeypatch
):
"""
When output_location is a local file path, the view should read the file from disk
and return it with proper headers.
"""
def test_report_local_file(self, authenticated_client, scans_fixture, monkeypatch):
scan = scans_fixture[0]
file_content = b"local zip file content"
file_path = tmp_path / "report.zip"
file_path.write_bytes(file_content)
with tempfile.TemporaryDirectory() as tmp:
tmp_path = Path(tmp)
base_tmp = tmp_path / "report_local_file"
base_tmp.mkdir(parents=True, exist_ok=True)
scan.output_location = str(file_path)
file_content = b"local zip file content"
file_path = base_tmp / "report.zip"
file_path.write_bytes(file_content)
scan.output_location = str(file_path)
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
glob,
"glob",
lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
)
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == 200
assert response.content == file_content
content_disposition = response.get("Content-Disposition")
assert content_disposition.startswith('attachment; filename="')
assert f'filename="{file_path.name}"' in content_disposition
def test_compliance_invalid_framework(self, authenticated_client, scans_fixture):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = "dummy"
scan.save()
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "invalid"})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert resp.json()["errors"]["detail"] == "Compliance 'invalid' not found."
def test_compliance_executing(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
scan.state = StateChoices.EXECUTING
scan.save()
task = Task.objects.create(tenant_id=scan.tenant_id)
scan.task = task
scan.save()
dummy = {"id": str(task.id), "state": StateChoices.EXECUTING}
monkeypatch.setattr(
"api.v1.views.TaskSerializer",
lambda *args, **kwargs: type("S", (), {"data": dummy}),
)
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_202_ACCEPTED
assert "Content-Location" in resp
assert dummy["id"] in resp["Content-Location"]
def test_compliance_no_output(self, authenticated_client, scans_fixture):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = ""
scan.save()
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert (
resp.json()["errors"]["detail"]
== "The scan has no reports, or the report generation task has not started yet."
)
def test_compliance_s3_no_credentials(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
key = "file.zip"
scan.output_location = f"s3://{bucket}/{key}"
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
glob,
"glob",
lambda pattern: [str(file_path)] if pattern == str(file_path) else [],
"api.v1.views.get_s3_client",
lambda: (_ for _ in ()).throw(NoCredentialsError()),
)
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_403_FORBIDDEN
assert resp.json()["errors"]["detail"] == "There is a problem with credentials."
def test_compliance_s3_success(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
prefix = "path/scan.zip"
scan.output_location = f"s3://{bucket}/{prefix}"
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
)
match_key = "path/compliance/mitre_attack_aws.csv"
class FakeS3Client:
def list_objects_v2(self, Bucket, Prefix):
return {"Contents": [{"Key": match_key}]}
def get_object(self, Bucket, Key):
return {"Body": io.BytesIO(b"ignored")}
monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())
framework = match_key.split("/")[-1].split(".")[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_200_OK
cd = resp["Content-Disposition"]
assert cd.startswith('attachment; filename="')
assert cd.endswith('filename="mitre_attack_aws.csv"')
def test_compliance_s3_not_found(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
bucket = "bucket"
scan.output_location = f"s3://{bucket}/x/scan.zip"
scan.state = StateChoices.COMPLETED
scan.save()
monkeypatch.setattr(
"api.v1.views.env",
type("env", (), {"str": lambda self, *args, **kwargs: "test-bucket"})(),
)
class FakeS3Client:
def list_objects_v2(self, Bucket, Prefix):
return {"Contents": []}
def get_object(self, Bucket, Key):
return {"Body": io.BytesIO(b"ignored")}
monkeypatch.setattr("api.v1.views.get_s3_client", lambda: FakeS3Client())
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"})
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_404_NOT_FOUND
assert (
resp.json()["errors"]["detail"]
== "No compliance file found for name 'cis_1.4_aws'."
)
def test_compliance_local_file(
self, authenticated_client, scans_fixture, monkeypatch
):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
with tempfile.TemporaryDirectory() as tmp:
tmp_path = Path(tmp)
base = tmp_path / "reports"
comp_dir = base / "compliance"
comp_dir.mkdir(parents=True, exist_ok=True)
fname = comp_dir / "scan_cis.csv"
fname.write_bytes(b"ignored")
scan.output_location = str(base / "scan.zip")
scan.save()
monkeypatch.setattr(
glob,
"glob",
lambda p: [str(fname)] if p.endswith("*_cis_1.4_aws.csv") else [],
)
url = reverse(
"scan-compliance", kwargs={"pk": scan.id, "name": "cis_1.4_aws"}
)
resp = authenticated_client.get(url)
assert resp.status_code == status.HTTP_200_OK
cd = resp["Content-Disposition"]
assert cd.startswith('attachment; filename="')
assert cd.endswith(f'filename="{fname.name}"')
@patch("api.v1.views.Task.objects.get")
@patch("api.v1.views.TaskSerializer")
def test__get_task_status_returns_none_if_task_not_executing(
self, mock_task_serializer, mock_task_get, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = "dummy"
scan.save()
task = Task.objects.create(tenant_id=scan.tenant_id)
mock_task_get.return_value = task
mock_task_serializer.return_value.data = {
"id": str(task.id),
"state": StateChoices.COMPLETED,
}
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == 200
assert response.content == file_content
content_disposition = response.get("Content-Disposition")
assert content_disposition.startswith('attachment; filename="')
assert f'filename="{file_path.name}"' in content_disposition
assert response.status_code == status.HTTP_404_NOT_FOUND
@patch("api.v1.views.TaskSerializer")
def test__get_task_status_finds_task_using_kwargs(
self, mock_task_serializer, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.state = StateChoices.COMPLETED
scan.output_location = "dummy"
scan.save()
task_result = TaskResult.objects.create(
task_name="scan-report",
task_kwargs={"scan_id": str(scan.id)},
)
task = Task.objects.create(
tenant_id=scan.tenant_id,
task_runner_task=task_result,
)
mock_task_serializer.return_value.data = {
"id": str(task.id),
"state": StateChoices.EXECUTING,
}
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_202_ACCEPTED
assert response.data["id"] == str(task.id)
@patch("api.v1.views.get_s3_client")
@patch("api.v1.views.sentry_sdk.capture_exception")
def test_compliance_list_objects_client_error(
self,
mock_sentry_capture,
mock_get_s3_client,
authenticated_client,
scans_fixture,
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/path/to/scan.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.list_objects_v2.side_effect = ClientError(
{"Error": {"Code": "InternalError"}}, "ListObjectsV2"
)
mock_get_s3_client.return_value = fake_client
framework = get_compliance_frameworks(scan.provider.provider)[0]
url = reverse("scan-compliance", kwargs={"pk": scan.id, "name": framework})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_502_BAD_GATEWAY
assert (
response.json()["errors"]["detail"]
== "Unable to list compliance files in S3: encountered an AWS error."
)
mock_sentry_capture.assert_called()
@patch("api.v1.views.get_s3_client")
def test_report_s3_nosuchkey(
self, mock_get_s3_client, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/report.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.get_object.side_effect = ClientError(
{"Error": {"Code": "NoSuchKey"}}, "GetObject"
)
mock_get_s3_client.return_value = fake_client
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
assert (
response.json()["errors"]["detail"]
== "The scan has no reports, or the report generation task has not started yet."
)
@patch("api.v1.views.get_s3_client")
def test_report_s3_client_error_other(
self, mock_get_s3_client, authenticated_client, scans_fixture
):
scan = scans_fixture[0]
scan.output_location = "s3://test-bucket/report.zip"
scan.state = StateChoices.COMPLETED
scan.save()
fake_client = MagicMock()
fake_client.get_object.side_effect = ClientError(
{"Error": {"Code": "AccessDenied"}}, "GetObject"
)
mock_get_s3_client.return_value = fake_client
url = reverse("scan-report", kwargs={"pk": scan.id})
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert (
response.json()["errors"]["detail"]
== "There is a problem with credentials."
)
@pytest.mark.django_db
@@ -2692,6 +3076,8 @@ class TestFindingViewSet:
# ("resource_tags", "key:value", 2),
# ("resource_tags", "not:exists", 0),
# ("resource_tags", "not:exists,key:value", 2),
("muted", True, 1),
("muted", False, 1),
]
),
)
@@ -2834,7 +3220,9 @@ class TestFindingViewSet:
)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_findings_metadata_retrieve(self, authenticated_client, findings_fixture):
def test_findings_metadata_retrieve(
self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
):
finding_1, *_ = findings_fixture
response = authenticated_client.get(
reverse("finding-metadata"),
@@ -2857,14 +3245,14 @@ class TestFindingViewSet:
)
# assert data["data"]["attributes"]["tags"] == expected_tags
def test_findings_metadata_severity_retrieve(
self, authenticated_client, findings_fixture
def test_findings_metadata_resource_filter_retrieve(
self, authenticated_client, findings_fixture, backfill_scan_metadata_fixture
):
finding_1, *_ = findings_fixture
response = authenticated_client.get(
reverse("finding-metadata"),
{
"filter[severity__in]": ["low", "medium"],
"filter[region]": "eu-west-1",
"filter[inserted_at]": finding_1.inserted_at.strftime("%Y-%m-%d"),
},
)
@@ -2915,6 +3303,29 @@ class TestFindingViewSet:
]
}
def test_findings_latest(self, authenticated_client, latest_scan_finding):
response = authenticated_client.get(
reverse("finding-latest"),
)
assert response.status_code == status.HTTP_200_OK
# The latest scan only has one finding, in comparison with `GET /findings`
assert len(response.json()["data"]) == 1
assert (
response.json()["data"][0]["attributes"]["status"]
== latest_scan_finding.status
)
def test_findings_metadata_latest(self, authenticated_client, latest_scan_finding):
response = authenticated_client.get(
reverse("finding-metadata_latest"),
)
assert response.status_code == status.HTTP_200_OK
attributes = response.json()["data"]["attributes"]
assert attributes["services"] == latest_scan_finding.resource_services
assert attributes["regions"] == latest_scan_finding.resource_regions
assert attributes["resource_types"] == latest_scan_finding.resource_types
@pytest.mark.django_db
class TestJWTFields:
@@ -4508,6 +4919,33 @@ class TestComplianceOverviewViewSet:
assert len(response.json()["data"]) == 1
assert response.json()["data"][0]["id"] == str(compliance_overview1.id)
def test_compliance_overview_metadata(
self, authenticated_client, compliance_overviews_fixture
):
response = authenticated_client.get(
reverse("complianceoverview-metadata"),
{"filter[scan_id]": str(compliance_overviews_fixture[0].scan_id)},
)
data = response.json()
expected_regions = set(
ComplianceOverview.objects.all()
.values_list("region", flat=True)
.distinct("region")
)
assert response.status_code == status.HTTP_200_OK
assert data["data"]["type"] == "compliance-overviews-metadata"
assert data["data"]["id"] is None
assert set(data["data"]["attributes"]["regions"]) == expected_regions
def test_compliance_overview_metadata_missing_scan_id(self, authenticated_client):
# Attempt to list compliance overviews without providing filter[scan_id]
response = authenticated_client.get(reverse("complianceoverview-metadata"))
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.json()["errors"][0]["source"]["pointer"] == "filter[scan_id]"
assert response.json()["errors"][0]["code"] == "required"
@pytest.mark.django_db
class TestOverviewViewSet:
@@ -4525,9 +4963,8 @@ class TestOverviewViewSet:
assert response.json()["data"][0]["attributes"]["findings"]["pass"] == 2
assert response.json()["data"][0]["attributes"]["findings"]["fail"] == 1
assert response.json()["data"][0]["attributes"]["findings"]["muted"] == 1
assert response.json()["data"][0]["attributes"]["resources"]["total"] == len(
resources_fixture
)
# Since we rely on completed scans, there are only 2 resources now
assert response.json()["data"][0]["attributes"]["resources"]["total"] == 2
def test_overview_services_list_no_required_filters(
self, authenticated_client, scan_summaries_fixture

View File

@@ -1,16 +1,20 @@
from datetime import datetime, timezone
from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import Subquery
from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
from api.models import Invitation, Provider
from api.models import Invitation, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
from prowler.providers.m365.m365_provider import M365Provider
class CustomOAuth2Client(OAuth2Client):
@@ -51,14 +55,14 @@ def merge_dicts(default_dict: dict, replacement_dict: dict) -> dict:
def return_prowler_provider(
provider: Provider,
) -> [AwsProvider | AzureProvider | GcpProvider | KubernetesProvider]:
) -> [AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider]:
"""Return the Prowler provider class based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: The corresponding provider class.
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: The corresponding provider class.
Raises:
ValueError: If the provider type specified in `provider.provider` is not supported.
@@ -72,6 +76,8 @@ def return_prowler_provider(
prowler_provider = AzureProvider
case Provider.ProviderChoices.KUBERNETES.value:
prowler_provider = KubernetesProvider
case Provider.ProviderChoices.M365.value:
prowler_provider = M365Provider
case _:
raise ValueError(f"Provider type {provider.provider} not supported")
return prowler_provider
@@ -104,15 +110,15 @@ def get_prowler_provider_kwargs(provider: Provider) -> dict:
def initialize_prowler_provider(
provider: Provider,
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider:
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider:
"""Initialize a Prowler provider instance based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, or `KubernetesProvider`) initialized with the
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: An instance of the corresponding provider class
(`AwsProvider`, `AzureProvider`, `GcpProvider`, `KubernetesProvider` or `M365Provider`) initialized with the
provider's secrets.
"""
prowler_provider = return_prowler_provider(provider)
@@ -130,10 +136,12 @@ def prowler_provider_connection_test(provider: Provider) -> Connection:
Connection: A connection object representing the result of the connection test for the specified provider.
"""
prowler_provider = return_prowler_provider(provider)
try:
prowler_provider_kwargs = provider.secret.secret
except Provider.secret.RelatedObjectDoesNotExist as secret_error:
return Connection(is_connected=False, error=secret_error)
return prowler_provider.test_connection(
**prowler_provider_kwargs, provider_id=provider.uid, raise_on_exception=False
)
@@ -200,3 +208,33 @@ def validate_invitation(
)
return invitation
# ToRemove after removing the fallback mechanism in /findings/metadata
def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
filtered_ids = filtered_queryset.order_by().values("id")
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = FindingMetadataSerializer(data=result)
serializer.is_valid(raise_exception=True)
return serializer.data
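A plain-Python sketch of the dedup-and-sort step above, with made-up values, showing how empty or null aggregations are handled:

# ArrayAgg may return None or include duplicates and empty regions.
aggregation = {
    "services": ["ec2", "ec2", "s3"],
    "regions": ["eu-west-1", "", "eu-west-1"],
    "resource_types": None,
}
services = sorted(set(aggregation["services"] or []))  # ['ec2', 's3']
regions = sorted({region for region in aggregation["regions"] or [] if region})  # ['eu-west-1']
resource_types = sorted(set(aggregation["resource_types"] or []))  # []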

View File

@@ -0,0 +1,33 @@
from rest_framework.response import Response
class PaginateByPkMixin:
"""
Mixin to paginate on a list of PKs (cheaper than heavy JOINs),
re-fetch the full objects with the desired select/prefetch,
re-sort them to preserve DB ordering, then serialize + return.
"""
def paginate_by_pk(
self,
request, # noqa: F841
base_queryset,
manager,
select_related: list[str] | None = None,
prefetch_related: list[str] | None = None,
) -> Response:
pk_list = base_queryset.values_list("id", flat=True)
page = self.paginate_queryset(pk_list)
if page is None:
return Response(self.get_serializer(base_queryset, many=True).data)
queryset = manager.filter(id__in=page)
if select_related:
queryset = queryset.select_related(*select_related)
if prefetch_related:
queryset = queryset.prefetch_related(*prefetch_related)
queryset = sorted(queryset, key=lambda obj: page.index(obj.id))
serialized = self.get_serializer(queryset, many=True).data
return self.get_paginated_response(serialized)
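The essential trick is re-sorting the re-fetched objects by the position of their primary key in the paginated page, since an id__in query does not preserve that order. A minimal stand-alone illustration with plain dicts instead of model instances:

page = [3, 1, 2]  # ids in the order the paginator returned them
objects = [{"id": 1}, {"id": 2}, {"id": 3}]  # re-fetched rows arrive in arbitrary DB order
ordered = sorted(objects, key=lambda obj: page.index(obj["id"]))
assert [obj["id"] for obj in ordered] == [3, 1, 2]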

View File

@@ -0,0 +1,172 @@
from drf_spectacular.utils import extend_schema_field
from rest_framework_json_api import serializers
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn", "external_id"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "M365 Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
"user": {
"type": "email",
"description": "User microsoft email address.",
},
"password": {
"type": "string",
"description": "User password.",
},
},
"required": [
"client_id",
"client_secret",
"tenant_id",
"user",
"password",
],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
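As a concrete illustration of the new M365 entry in this schema, a payload of the following shape would satisfy it; every value below is a placeholder, not a real credential:

m365_secret = {
    "client_id": "00000000-0000-0000-0000-000000000000",
    "client_secret": "<application-secret>",
    "tenant_id": "11111111-1111-1111-1111-111111111111",
    "user": "user@example.onmicrosoft.com",
    "password": "<user-password>",
}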

View File

@@ -42,6 +42,7 @@ from api.v1.serializer_utils.integrations import (
IntegrationCredentialField,
S3ConfigSerializer,
)
from api.v1.serializer_utils.providers import ProviderSecretField
# Tokens
@@ -851,6 +852,10 @@ class ScanSerializer(RLSSerializer):
"url",
]
included_serializers = {
"provider": "api.v1.serializers.ProviderIncludeSerializer",
}
class ScanIncludeSerializer(RLSSerializer):
trigger = serializers.ChoiceField(
@@ -955,6 +960,15 @@ class ScanReportSerializer(serializers.Serializer):
fields = ["id"]
class ScanComplianceReportSerializer(serializers.Serializer):
id = serializers.CharField(source="scan")
name = serializers.CharField()
class Meta:
resource_name = "scan-reports"
fields = ["id", "name"]
class ResourceTagSerializer(RLSSerializer):
"""
Serializer for the ResourceTag model
@@ -1087,6 +1101,7 @@ class FindingSerializer(RLSSerializer):
"inserted_at",
"updated_at",
"first_seen_at",
"muted",
"url",
# Relationships
"scan",
@@ -1136,6 +1151,8 @@ class BaseWriteProviderSecretSerializer(BaseWriteSerializer):
serializer = GCPProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.KUBERNETES.value:
serializer = KubernetesProviderSecret(data=secret)
elif provider_type == Provider.ProviderChoices.M365.value:
serializer = M365ProviderSecret(data=secret)
else:
raise serializers.ValidationError(
{"provider": f"Provider type not supported {provider_type}"}
@@ -1175,6 +1192,17 @@ class AzureProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
class M365ProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
tenant_id = serializers.CharField()
user = serializers.EmailField()
password = serializers.CharField()
class Meta:
resource_name = "provider-secrets"
class GCPProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
@@ -1206,141 +1234,6 @@ class AWSRoleAssumptionProviderSecret(serializers.Serializer):
resource_name = "provider-secrets"
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "AWS Static Credentials",
"properties": {
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Required for environments where no IAM role is being "
"assumed and direct AWS access is needed.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Must accompany 'aws_access_key_id' to authorize "
"access to AWS resources.",
},
"aws_session_token": {
"type": "string",
"description": "The session token associated with temporary credentials. Only needed for "
"session-based or temporary AWS access.",
},
},
"required": ["aws_access_key_id", "aws_secret_access_key"],
},
{
"type": "object",
"title": "AWS Assume Role",
"properties": {
"role_arn": {
"type": "string",
"description": "The Amazon Resource Name (ARN) of the role to assume. Required for AWS role "
"assumption.",
},
"external_id": {
"type": "string",
"description": "An identifier to enhance security for role assumption.",
},
"aws_access_key_id": {
"type": "string",
"description": "The AWS access key ID. Only required if the environment lacks pre-configured "
"AWS credentials.",
},
"aws_secret_access_key": {
"type": "string",
"description": "The AWS secret access key. Required if 'aws_access_key_id' is provided or if "
"no AWS credentials are pre-configured.",
},
"aws_session_token": {
"type": "string",
"description": "The session token for temporary credentials, if applicable.",
},
"session_duration": {
"type": "integer",
"minimum": 900,
"maximum": 43200,
"default": 3600,
"description": "The duration (in seconds) for the role session.",
},
"role_session_name": {
"type": "string",
"description": "An identifier for the role session, useful for tracking sessions in AWS logs. "
"The regex used to validate this parameter is a string of characters consisting of "
"upper- and lower-case alphanumeric characters with no spaces. You can also include "
"underscores or any of the following characters: =,.@-\n\n"
"Examples:\n"
"- MySession123\n"
"- User_Session-1\n"
"- Test.Session@2",
"pattern": "^[a-zA-Z0-9=,.@_-]+$",
},
},
"required": ["role_arn", "external_id"],
},
{
"type": "object",
"title": "Azure Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The Azure application (client) ID for authentication in Azure AD.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the application (client) ID, providing "
"secure access.",
},
"tenant_id": {
"type": "string",
"description": "The Azure tenant ID, representing the directory where the application is "
"registered.",
},
},
"required": ["client_id", "client_secret", "tenant_id"],
},
{
"type": "object",
"title": "GCP Static Credentials",
"properties": {
"client_id": {
"type": "string",
"description": "The client ID from Google Cloud, used to identify the application for GCP "
"access.",
},
"client_secret": {
"type": "string",
"description": "The client secret associated with the GCP client ID, required for secure "
"access.",
},
"refresh_token": {
"type": "string",
"description": "A refresh token that allows the application to obtain new access tokens for "
"extended use.",
},
},
"required": ["client_id", "client_secret", "refresh_token"],
},
{
"type": "object",
"title": "Kubernetes Static Credentials",
"properties": {
"kubeconfig_content": {
"type": "string",
"description": "The content of the Kubernetes kubeconfig file, encoded as a string.",
}
},
"required": ["kubeconfig_content"],
},
]
}
)
class ProviderSecretField(serializers.JSONField):
pass
class ProviderSecretSerializer(RLSSerializer):
"""
Serializer for the ProviderSecret model.
@@ -1897,6 +1790,13 @@ class ComplianceOverviewFullSerializer(ComplianceOverviewSerializer):
return obj.requirements
class ComplianceOverviewMetadataSerializer(serializers.Serializer):
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
class Meta:
resource_name = "compliance-overviews-metadata"
# Overviews

View File

@@ -1,5 +1,6 @@
import glob
import os
from datetime import datetime, timedelta, timezone
import sentry_sdk
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
@@ -20,8 +21,10 @@ from django.db.models import Count, Exists, F, OuterRef, Prefetch, Q, Subquery,
from django.db.models.functions import Coalesce
from django.http import HttpResponse
from django.urls import reverse
from django.utils.dateparse import parse_date
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django_celery_beat.models import PeriodicTask
from drf_spectacular.settings import spectacular_settings
from drf_spectacular.utils import (
OpenApiParameter,
@@ -47,6 +50,7 @@ from rest_framework_simplejwt.exceptions import InvalidToken, TokenError
from tasks.beat import schedule_provider_scan
from tasks.jobs.export import get_s3_client
from tasks.tasks import (
backfill_scan_resource_summaries_task,
check_provider_connection_task,
delete_provider_task,
delete_tenant_task,
@@ -54,12 +58,14 @@ from tasks.tasks import (
)
from api.base_views import BaseRLSViewSet, BaseTenantViewset, BaseUserViewset
from api.compliance import get_compliance_frameworks
from api.db_router import MainRouter
from api.filters import (
ComplianceOverviewFilter,
FindingFilter,
IntegrationFilter,
InvitationFilter,
LatestFindingFilter,
MembershipFilter,
ProviderFilter,
ProviderGroupFilter,
@@ -85,6 +91,7 @@ from api.models import (
ProviderSecret,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Role,
RoleProviderGroupRelationship,
Scan,
@@ -98,9 +105,16 @@ from api.models import (
from api.pagination import ComplianceOverviewPagination
from api.rbac.permissions import Permissions, get_providers, get_role
from api.rls import Tenant
from api.utils import CustomOAuth2Client, validate_invitation
from api.utils import (
CustomOAuth2Client,
get_findings_metadata_no_aggregations,
validate_invitation,
)
from api.uuid_utils import datetime_to_uuid7, uuid7_start
from api.v1.mixins import PaginateByPkMixin
from api.v1.serializers import (
ComplianceOverviewFullSerializer,
ComplianceOverviewMetadataSerializer,
ComplianceOverviewSerializer,
FindingDynamicFilterSerializer,
FindingMetadataSerializer,
@@ -132,6 +146,7 @@ from api.v1.serializers import (
RoleProviderGroupRelationshipSerializer,
RoleSerializer,
RoleUpdateSerializer,
ScanComplianceReportSerializer,
ScanCreateSerializer,
ScanReportSerializer,
ScanSerializer,
@@ -245,7 +260,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.6.0"
spectacular_settings.VERSION = "1.8.5"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -1087,6 +1102,8 @@ class ProviderViewSet(BaseRLSViewSet):
provider = get_object_or_404(Provider, pk=pk)
provider.is_deleted = True
provider.save()
task_name = f"scan-perform-scheduled-{pk}"
PeriodicTask.objects.filter(name=task_name).update(enabled=False)
with transaction.atomic():
task = delete_provider_task.delay(
@@ -1143,9 +1160,32 @@ class ProviderViewSet(BaseRLSViewSet):
200: OpenApiResponse(description="Report obtained successfully"),
202: OpenApiResponse(description="The task is in progress"),
403: OpenApiResponse(description="There is a problem with credentials"),
404: OpenApiResponse(description="The scan has no reports"),
404: OpenApiResponse(
description="The scan has no reports, or the report generation task has not started yet"
),
},
),
compliance=extend_schema(
tags=["Scan"],
summary="Retrieve compliance report as CSV",
description="Download a specific compliance report (e.g., 'cis_1.4_aws') as a CSV file.",
parameters=[
OpenApiParameter(
name="name",
type=str,
location=OpenApiParameter.PATH,
required=True,
description="The compliance report name, like 'cis_1.4_aws'",
),
],
responses={
200: OpenApiResponse(
description="CSV file containing the compliance report"
),
404: OpenApiResponse(description="Compliance report not found"),
},
request=None,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -1198,6 +1238,10 @@ class ScanViewSet(BaseRLSViewSet):
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanReportSerializer
elif self.action == "compliance":
if hasattr(self, "response_serializer_class"):
return self.response_serializer_class
return ScanComplianceReportSerializer
return super().get_serializer_class()
def partial_update(self, request, *args, **kwargs):
@@ -1215,100 +1259,220 @@ class ScanViewSet(BaseRLSViewSet):
)
return Response(data=read_serializer.data, status=status.HTTP_200_OK)
@action(detail=True, methods=["get"], url_name="report")
def report(self, request, pk=None):
scan_instance = self.get_object()
def _get_task_status(self, scan_instance):
"""
Returns task status if the scan or its associated report-generation task is still executing.
if scan_instance.state == StateChoices.EXECUTING:
# If the scan is still running, return the task
prowler_task = Task.objects.get(id=scan_instance.task.id)
self.response_serializer_class = TaskSerializer
output_serializer = self.get_serializer(prowler_task)
return Response(
data=output_serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": output_serializer.data["id"]}
)
},
)
If the scan is in an EXECUTING state or if a background task related to report generation
is found and also executing, this method returns a 202 Accepted response with the task
metadata and a `Content-Location` header pointing to the task detail endpoint.
try:
output_celery_task = Task.objects.get(
task_runner_task__task_name="scan-report",
task_runner_task__task_args__contains=pk,
)
self.response_serializer_class = TaskSerializer
output_serializer = self.get_serializer(output_celery_task)
if output_serializer.data["state"] == StateChoices.EXECUTING:
# If the task is still running, return the task
return Response(
data=output_serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": output_serializer.data["id"]}
)
},
)
except Task.DoesNotExist:
# If the task does not exist, it means that the task is removed from the database
pass
Args:
scan_instance (Scan): The scan instance for which the task status is being checked.
output_location = scan_instance.output_location
if not output_location:
return Response(
{"detail": "The scan has no reports."},
status=status.HTTP_404_NOT_FOUND,
)
Returns:
Response or None:
- A `Response` with HTTP 202 status and serialized task data if the task is executing.
- `None` if no running task is found or if the task has already completed.
"""
task = None
if scan_instance.output_location.startswith("s3://"):
if scan_instance.state == StateChoices.EXECUTING and scan_instance.task:
task = scan_instance.task
else:
try:
s3_client = get_s3_client()
task = Task.objects.get(
task_runner_task__task_name="scan-report",
task_runner_task__task_kwargs__contains=str(scan_instance.id),
)
except Task.DoesNotExist:
return None
self.response_serializer_class = TaskSerializer
serializer = self.get_serializer(task)
if serializer.data.get("state") != StateChoices.EXECUTING:
return None
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": serializer.data["id"]}
)
},
)
def _load_file(self, path_pattern, s3=False, bucket=None, list_objects=False):
"""
Loads a binary file (e.g., ZIP or CSV) and returns its content and filename.
Depending on the input parameters, this method supports loading:
- From S3 using a direct key.
- From S3 by listing objects under a prefix and matching suffix.
- From the local filesystem using glob pattern matching.
Args:
path_pattern (str): The key or glob pattern representing the file location.
s3 (bool, optional): Whether the file is stored in S3. Defaults to False.
bucket (str, optional): The name of the S3 bucket, required if `s3=True`. Defaults to None.
list_objects (bool, optional): If True and `s3=True`, list objects by prefix to find the file. Defaults to False.
Returns:
tuple[bytes, str]: A tuple containing the file content as bytes and the filename if successful.
Response: A DRF `Response` object with an appropriate status and error detail if an error occurs.
"""
if s3:
try:
client = get_s3_client()
except (ClientError, NoCredentialsError, ParamValidationError):
return Response(
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
bucket_name = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET")
key = output_location[len(f"s3://{bucket_name}/") :]
try:
s3_object = s3_client.get_object(Bucket=bucket_name, Key=key)
except ClientError as e:
error_code = e.response.get("Error", {}).get("Code")
if error_code == "NoSuchKey":
if list_objects:
# list keys under prefix then match suffix
prefix = os.path.dirname(path_pattern)
suffix = os.path.basename(path_pattern)
try:
resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
except ClientError as e:
sentry_sdk.capture_exception(e)
return Response(
{"detail": "The scan has no reports."},
{
"detail": "Unable to list compliance files in S3: encountered an AWS error."
},
status=status.HTTP_502_BAD_GATEWAY,
)
contents = resp.get("Contents", [])
keys = [obj["Key"] for obj in contents if obj["Key"].endswith(suffix)]
if not keys:
return Response(
{
"detail": f"No compliance file found for name '{os.path.splitext(suffix)[0]}'."
},
status=status.HTTP_404_NOT_FOUND,
)
# path_pattern already carries the expected "<name>.csv" suffix (built by the compliance action), so the first matching key is the requested file
key = keys[0]
else:
# path_pattern is exact key
key = path_pattern
try:
s3_obj = client.get_object(Bucket=bucket, Key=key)
except ClientError as e:
code = e.response.get("Error", {}).get("Code")
if code == "NoSuchKey":
return Response(
{
"detail": "The scan has no reports, or the report generation task has not started yet."
},
status=status.HTTP_404_NOT_FOUND,
)
return Response(
{"detail": "There is a problem with credentials."},
status=status.HTTP_403_FORBIDDEN,
)
file_content = s3_object["Body"].read()
filename = os.path.basename(output_location.split("/")[-1])
content = s3_obj["Body"].read()
filename = os.path.basename(key)
else:
zip_files = glob.glob(output_location)
try:
file_path = zip_files[0]
except IndexError as e:
sentry_sdk.capture_exception(e)
files = glob.glob(path_pattern)
if not files:
return Response(
{"detail": "The scan has no reports."},
{
"detail": "The scan has no reports, or the report generation task has not started yet."
},
status=status.HTTP_404_NOT_FOUND,
)
with open(file_path, "rb") as f:
file_content = f.read()
filename = os.path.basename(file_path)
filepath = files[0]
with open(filepath, "rb") as f:
content = f.read()
filename = os.path.basename(filepath)
response = HttpResponse(
file_content, content_type="application/x-zip-compressed"
)
return content, filename
def _serve_file(self, content, filename, content_type):
response = HttpResponse(content, content_type=content_type)
response["Content-Disposition"] = f'attachment; filename="{filename}"'
return response
@action(detail=True, methods=["get"], url_name="report")
def report(self, request, pk=None):
scan = self.get_object()
# Check for executing tasks
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{
"detail": "The scan has no reports, or the report generation task has not started yet."
},
status=status.HTTP_404_NOT_FOUND,
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
loader = self._load_file(
key_prefix, s3=True, bucket=bucket, list_objects=False
)
else:
loader = self._load_file(scan.output_location, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "application/x-zip-compressed")
@action(
detail=True,
methods=["get"],
url_path="compliance/(?P<name>[^/]+)",
url_name="compliance",
)
def compliance(self, request, pk=None, name=None):
scan = self.get_object()
if name not in get_compliance_frameworks(scan.provider.provider):
return Response(
{"detail": f"Compliance '{name}' not found."},
status=status.HTTP_404_NOT_FOUND,
)
running_resp = self._get_task_status(scan)
if running_resp:
return running_resp
if not scan.output_location:
return Response(
{
"detail": "The scan has no reports, or the report generation task has not started yet."
},
status=status.HTTP_404_NOT_FOUND,
)
if scan.output_location.startswith("s3://"):
bucket = env.str("DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET", "")
key_prefix = scan.output_location.removeprefix(f"s3://{bucket}/")
prefix = os.path.join(
os.path.dirname(key_prefix), "compliance", f"{name}.csv"
)
loader = self._load_file(prefix, s3=True, bucket=bucket, list_objects=True)
else:
base = os.path.dirname(scan.output_location)
pattern = os.path.join(base, "compliance", f"*_{name}.csv")
loader = self._load_file(pattern, s3=False)
if isinstance(loader, Response):
return loader
content, filename = loader
return self._serve_file(content, filename, "text/csv")
def create(self, request, *args, **kwargs):
input_serializer = self.get_serializer(data=request.data)
input_serializer.is_valid(raise_exception=True)
@@ -1325,10 +1489,10 @@ class ScanViewSet(BaseRLSViewSet):
},
)
prowler_task = Task.objects.get(id=task.id)
scan.task_id = task.id
scan.save(update_fields=["task_id"])
prowler_task = Task.objects.get(id=task.id)
self.response_serializer_class = TaskSerializer
output_serializer = self.get_serializer(prowler_task)
@@ -1521,10 +1685,24 @@ class ResourceViewSet(BaseRLSViewSet):
],
filters=True,
),
latest=extend_schema(
tags=["Finding"],
summary="List the latest findings",
description="Retrieve a list of the latest findings from the latest scans for each provider with options for "
"filtering by various criteria.",
filters=True,
),
metadata_latest=extend_schema(
tags=["Finding"],
summary="Retrieve metadata values from the latest findings",
description="Fetch unique metadata values from a set of findings from the latest scans for each provider. "
"This is useful for dynamic filtering.",
filters=True,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class FindingViewSet(BaseRLSViewSet):
class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
queryset = Finding.all_objects.all()
serializer_class = FindingSerializer
filterset_class = FindingFilter
@@ -1556,11 +1734,16 @@ class FindingViewSet(BaseRLSViewSet):
def get_serializer_class(self):
if self.action == "findings_services_regions":
return FindingDynamicFilterSerializer
elif self.action == "metadata":
elif self.action in ["metadata", "metadata_latest"]:
return FindingMetadataSerializer
return super().get_serializer_class()
def get_filterset_class(self):
if self.action in ["latest", "metadata_latest"]:
return LatestFindingFilter
return FindingFilter
def get_queryset(self):
tenant_id = self.request.tenant_id
user_roles = get_role(self.request.user)
@@ -1600,21 +1783,14 @@ class FindingViewSet(BaseRLSViewSet):
return super().filter_queryset(queryset)
def list(self, request, *args, **kwargs):
base_qs = self.filter_queryset(self.get_queryset())
paginated_ids = self.paginate_queryset(base_qs.values_list("id", flat=True))
if paginated_ids is not None:
ids = list(paginated_ids)
findings = (
Finding.all_objects.filter(tenant_id=self.request.tenant_id, id__in=ids)
.select_related("scan")
.prefetch_related("resources")
)
# Re-sort in Python to preserve ordering:
findings = sorted(findings, key=lambda x: ids.index(x.id))
serializer = self.get_serializer(findings, many=True)
return self.get_paginated_response(serializer.data)
serializer = self.get_serializer(base_qs, many=True)
return Response(serializer.data)
filtered_queryset = self.filter_queryset(self.get_queryset())
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Finding.all_objects,
select_related=["scan"],
prefetch_related=["resources"],
)
@action(detail=False, methods=["get"], url_name="findings_services_regions")
def findings_services_regions(self, request):
@@ -1639,25 +1815,105 @@ class FindingViewSet(BaseRLSViewSet):
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
tenant_id = self.request.tenant_id
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
# Force filter validation
filtered_queryset = self.filter_queryset(self.get_queryset())
filtered_ids = filtered_queryset.order_by().values("id")
tenant_id = request.tenant_id
query_params = request.query_params
relevant_resources = Resource.all_objects.filter(
tenant_id=tenant_id, findings__id__in=Subquery(filtered_ids)
).only("service", "region", "type")
queryset = ResourceScanSummary.objects.filter(tenant_id=tenant_id)
scan_based_filters = {}
aggregation = relevant_resources.aggregate(
services=ArrayAgg("service", flat=True),
regions=ArrayAgg("region", flat=True),
resource_types=ArrayAgg("type", flat=True),
if scans := query_params.get("filter[scan__in]") or query_params.get(
"filter[scan]"
):
queryset = queryset.filter(scan_id__in=scans.split(","))
scan_based_filters = {"id__in": scans.split(",")}
else:
exact = query_params.get("filter[inserted_at]")
gte = query_params.get("filter[inserted_at__gte]")
lte = query_params.get("filter[inserted_at__lte]")
date_filters = {}
if exact:
date = parse_date(exact)
datetime_start = datetime.combine(
date, datetime.min.time(), tzinfo=timezone.utc
)
datetime_end = datetime_start + timedelta(days=1)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
else:
if gte:
date_start = parse_date(gte)
datetime_start = datetime.combine(
date_start, datetime.min.time(), tzinfo=timezone.utc
)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
if lte:
date_end = parse_date(lte)
datetime_end = datetime.combine(
date_end + timedelta(days=1),
datetime.min.time(),
tzinfo=timezone.utc,
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
if date_filters:
queryset = queryset.filter(**date_filters)
scan_based_filters = {
key.lstrip("scan_"): value for key, value in date_filters.items()
}
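# Worked example of the date-to-UUIDv7 translation above (values illustrative):
#   filter[inserted_at]=2025-06-01 becomes the half-open window
#     scan_id >= uuid7_start(datetime_to_uuid7(2025-06-01T00:00:00Z))
#     scan_id <  uuid7_start(datetime_to_uuid7(2025-06-02T00:00:00Z))
# Because scan ids are time-ordered UUIDv7 values, the date filter applies
# directly to scan_id on ResourceScanSummary without joining against Scan.
# (lstrip("scan_") strips a character set rather than a prefix; here it happens
# to map scan_id__gte/scan_id__lt to id__gte/id__lt for the Scan fallback query.)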
# ToRemove: Temporary fallback mechanism
if not queryset.exists():
scan_ids = Scan.objects.filter(
tenant_id=tenant_id, **scan_based_filters
).values_list("id", flat=True)
for scan_id in scan_ids:
backfill_scan_resource_summaries_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
return Response(
get_findings_metadata_no_aggregations(tenant_id, filtered_queryset)
)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get(
"filter[resource_type]"
) or query_params.get("filter[resource_type__in]"):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
services = sorted(set(aggregation["services"] or []))
regions = sorted({region for region in aggregation["regions"] or [] if region})
resource_types = sorted(set(aggregation["resource_types"] or []))
result = {
"services": services,
@@ -1667,7 +1923,110 @@ class FindingViewSet(BaseRLSViewSet):
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.data)
@action(detail=False, methods=["get"], url_name="latest")
def latest(self, request):
tenant_id = request.tenant_id
filtered_queryset = self.filter_queryset(self.get_queryset())
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
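# Note: order_by("provider_id", "-inserted_at") followed by distinct("provider_id")
# relies on PostgreSQL's DISTINCT ON to keep only the most recent COMPLETED scan
# per provider; roughly equivalent SQL (illustrative, table and value names approximate):
#   SELECT DISTINCT ON (provider_id) id
#   FROM scans WHERE state = 'completed'
#   ORDER BY provider_id, inserted_at DESC;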
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Finding.all_objects,
select_related=["scan"],
prefetch_related=["resources"],
)
@action(
detail=False,
methods=["get"],
url_name="metadata_latest",
url_path="metadata/latest",
)
def metadata_latest(self, request):
tenant_id = request.tenant_id
query_params = request.query_params
latest_scans_queryset = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
)
latest_scans_ids = list(latest_scans_queryset.values_list("id", flat=True))
queryset = ResourceScanSummary.objects.filter(
tenant_id=tenant_id,
scan_id__in=latest_scans_queryset.values_list("id", flat=True),
)
# ToRemove: Temporary fallback mechanism
present_ids = set(
ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id__in=latest_scans_ids
)
.values_list("scan_id", flat=True)
.distinct()
)
missing_scan_ids = [sid for sid in latest_scans_ids if sid not in present_ids]
if missing_scan_ids:
for scan_id in missing_scan_ids:
backfill_scan_resource_summaries_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
return Response(
get_findings_metadata_no_aggregations(
tenant_id, self.filter_queryset(self.get_queryset())
)
)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get(
"filter[resource_type]"
) or query_params.get("filter[resource_type__in]"):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
result = {
"services": services,
"regions": regions,
"resource_types": resource_types,
}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data)
@extend_schema_view(
@@ -2054,6 +2413,21 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Fetch detailed information about a specific compliance overview by its ID, including detailed "
"requirement information and check's status.",
),
metadata=extend_schema(
tags=["Compliance Overview"],
summary="Retrieve metadata values from compliance overviews",
description="Fetch unique metadata values from a set of compliance overviews. This is useful for dynamic "
"filtering.",
parameters=[
OpenApiParameter(
name="filter[scan_id]",
required=True,
type=OpenApiTypes.UUID,
location=OpenApiParameter.QUERY,
description="Related scan ID.",
),
],
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
@@ -2115,6 +2489,8 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
def get_serializer_class(self):
if self.action == "retrieve":
return ComplianceOverviewFullSerializer
elif self.action == "metadata":
return ComplianceOverviewMetadataSerializer
return super().get_serializer_class()
def list(self, request, *args, **kwargs):
@@ -2131,6 +2507,35 @@ class ComplianceOverviewViewSet(BaseRLSViewSet):
)
return super().list(request, *args, **kwargs)
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
scan_id = request.query_params.get("filter[scan_id]")
if not scan_id:
raise ValidationError(
[
{
"detail": "This query parameter is required.",
"status": 400,
"source": {"pointer": "filter[scan_id]"},
"code": "required",
}
]
)
tenant_id = self.request.tenant_id
regions = list(
ComplianceOverview.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.values_list("region", flat=True)
.order_by("region")
.distinct()
)
result = {"regions": regions}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@extend_schema(tags=["Overview"])
@extend_schema_view(
@@ -2188,8 +2593,8 @@ class OverviewViewSet(BaseRLSViewSet):
def _get_filtered_queryset(model):
if role.unlimited_visibility:
return model.objects.filter(tenant_id=self.request.tenant_id)
return model.objects.filter(
return model.all_objects.filter(tenant_id=self.request.tenant_id)
return model.all_objects.filter(
tenant_id=self.request.tenant_id, scan__provider__in=providers
)
@@ -2233,51 +2638,38 @@ class OverviewViewSet(BaseRLSViewSet):
tenant_id = self.request.tenant_id
latest_scan_ids = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
)
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
findings_aggregated = (
ScanSummary.objects.filter(tenant_id=tenant_id, scan_id__in=latest_scan_ids)
.values("scan__provider__provider")
resource_count_queryset = (
Resource.all_objects.filter(
tenant_id=tenant_id,
provider_id=OuterRef("scan__provider_id"),
)
.order_by()
.values("provider_id")
.annotate(cnt=Count("id"))
.values("cnt")
)
overview_queryset = (
ScanSummary.all_objects.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
.values(provider=F("scan__provider__provider"))
.annotate(
findings_passed=Coalesce(Sum("_pass"), 0),
findings_failed=Coalesce(Sum("fail"), 0),
findings_muted=Coalesce(Sum("muted"), 0),
total_findings=Coalesce(Sum("total"), 0),
total_resources=Coalesce(Subquery(resource_count_queryset), 0),
)
)
resources_aggregated = (
Resource.objects.filter(tenant_id=tenant_id)
.values("provider__provider")
.annotate(total_resources=Count("id"))
)
resources_dict = {
row["provider__provider"]: row["total_resources"]
for row in resources_aggregated
}
overview = []
for row in findings_aggregated:
provider_type = row["scan__provider__provider"]
overview.append(
{
"provider": provider_type,
"total_resources": resources_dict.get(provider_type, 0),
"total_findings": row["total_findings"],
"findings_passed": row["findings_passed"],
"findings_failed": row["findings_failed"],
"findings_muted": row["findings_muted"],
}
)
serializer = OverviewProviderSerializer(overview, many=True)
serializer = OverviewProviderSerializer(overview_queryset, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@action(detail=False, methods=["get"], url_name="findings")
@@ -2286,22 +2678,16 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
aggregated_totals = filtered_queryset.aggregate(
_pass=Sum("_pass") or 0,
fail=Sum("fail") or 0,
@@ -2331,22 +2717,16 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
severity_counts = (
filtered_queryset.values("severity")
.annotate(count=Sum("total"))
@@ -2367,22 +2747,16 @@ class OverviewViewSet(BaseRLSViewSet):
queryset = self.get_queryset()
filtered_queryset = self.filter_queryset(queryset)
latest_scan_subquery = (
Scan.objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
provider_id=OuterRef("scan__provider_id"),
)
.order_by("-inserted_at")
.values("id")[:1]
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
annotated_queryset = filtered_queryset.annotate(
latest_scan_id=Subquery(latest_scan_subquery)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
filtered_queryset = annotated_queryset.filter(scan_id=F("latest_scan_id"))
services_data = (
filtered_queryset.values("service")
.annotate(_pass=Sum("_pass"))

View File

@@ -50,9 +50,9 @@ class RLSTask(Task):
tenant_id = kwargs.get("tenant_id")
with rls_transaction(tenant_id):
APITask.objects.create(
APITask.objects.update_or_create(
id=task_result_instance.task_id,
tenant_id=tenant_id,
task_runner_task=task_result_instance,
defaults={"task_runner_task": task_result_instance},
)
return result

View File

@@ -111,6 +111,7 @@ SPECTACULAR_SETTINGS = {
"PREPROCESSING_HOOKS": [
"drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
],
"TITLE": "API Reference - Prowler",
}
WSGI_APPLICATION = "config.wsgi.application"
@@ -237,6 +238,7 @@ DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = env.str(
DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = env.str("DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN", "")
DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = env.str("DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION", "")
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# HTTP Security Headers
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"

View File

@@ -12,6 +12,8 @@ IGNORED_EXCEPTIONS = [
"UnauthorizedOperation",
"AuthFailure",
"InvalidClientTokenId",
"AWSInvalidProviderIdError",
"InternalServerErrorException",
"AccessDenied",
"No Shodan API Key", # Shodan Check
"RequestLimitExceeded", # For now we don't want to log the RequestLimitExceeded errors
@@ -33,6 +35,13 @@ IGNORED_EXCEPTIONS = [
"ValidationException",
"AWSSecretAccessKeyInvalidError",
"InvalidAction",
"InvalidRequestException",
"RequestExpired",
"ConnectionClosedError",
"MaxRetryError",
"AWSAccessKeyIDInvalidError",
"AWSSessionTokenExpiredError",
"EndpointConnectionError", # AWS Service is not available in a region
"Pool is closed", # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
# Authentication Errors from GCP
"ClientAuthenticationError",
@@ -41,6 +50,8 @@ IGNORED_EXCEPTIONS = [
"Permission denied to get service",
"API has not been used in project",
"HttpError 404 when requesting",
"HttpError 403 when requesting",
"HttpError 400 when requesting",
"GCPNoAccesibleProjectsError",
# Authentication Errors from Azure
"ClientAuthenticationError",
@@ -49,6 +60,7 @@ IGNORED_EXCEPTIONS = [
"AzureNotValidClientIdError",
"AzureNotValidClientSecretError",
"AzureNotValidTenantIdError",
"AzureInvalidProviderIdError",
"AzureTenantIdAndClientSecretNotBelongingToClientIdError",
"AzureTenantIdAndClientIdNotBelongingToClientSecretError",
"AzureClientIdAndClientSecretNotBelongingToTenantIdError",
@@ -67,9 +79,16 @@ def before_send(event, hint):
log_msg = hint["log_record"].msg
log_lvl = hint["log_record"].levelno
# Handle Error events and discard the rest
if log_lvl == 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return
# Handle Error and Critical events and discard the rest
if log_lvl <= 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
# Ignore exceptions with the ignored_exceptions
if "exc_info" in hint and hint["exc_info"]:
exc_value = str(hint["exc_info"][1])
if any(ignored in exc_value for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
return event
@@ -85,4 +104,6 @@ sentry_sdk.init(
# possible.
"continuous_profiling_auto_start": True,
},
attach_stacktrace=True,
ignore_errors=IGNORED_EXCEPTIONS,
)
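The hunk above only shows the options added to sentry_sdk.init; the registration of the before_send filter itself is not visible here. A minimal sketch of the usual wiring, under the assumption that the hook is passed at initialization:

# Assumed wiring (not shown in this hunk): the filter runs on every event via before_send.
import sentry_sdk

sentry_sdk.init(
    dsn="<sentry-dsn>",  # placeholder
    before_send=before_send,
    attach_stacktrace=True,
    ignore_errors=IGNORED_EXCEPTIONS,
)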

View File

@@ -10,6 +10,7 @@ from django.urls import reverse
from django_celery_results.models import TaskResult
from rest_framework import status
from rest_framework.test import APIClient
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.db_utils import rls_transaction
from api.models import (
@@ -655,6 +656,7 @@ def findings_fixture(scans_fixture, resources_fixture):
"Description": "test description orange juice",
},
first_seen_at="2024-01-02T00:00:00Z",
muted=True,
)
finding2.add_resources([resource2])
@@ -919,6 +921,55 @@ def integrations_fixture(providers_fixture):
return integration1, integration2
@pytest.fixture
def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
for scan_instance in scans_fixture:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
provider = providers_fixture[0]
tenant_id = str(providers_fixture[0].tenant_id)
resource = resources_fixture[0]
scan = Scan.objects.create(
name="latest completed scan",
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.COMPLETED,
tenant_id=tenant_id,
)
finding = Finding.objects.create(
tenant_id=tenant_id,
uid="test_finding_uid_1",
scan=scan,
delta="new",
status=Status.FAIL,
status_extended="test status extended ",
impact=Severity.critical,
impact_extended="test impact extended one",
severity=Severity.critical,
raw_result={
"status": Status.FAIL,
"impact": Severity.critical,
"severity": Severity.critical,
},
tags={"test": "dev-qa"},
check_id="test_check_id",
check_metadata={
"CheckId": "test_check_id",
"Description": "test description apple sauce",
},
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
return finding
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}

View File

@@ -0,0 +1,61 @@
from api.db_utils import rls_transaction
from api.models import (
Resource,
ResourceFindingMapping,
ResourceScanSummary,
Scan,
StateChoices,
)
def backfill_resource_scan_summaries(tenant_id: str, scan_id: str):
    """
    Populate the ResourceScanSummary table for a scan that predates it.

    Returns a status dict indicating whether rows were inserted, the scan was
    already backfilled, the scan is not in a final state, or there were no
    resources to summarize.
    """
with rls_transaction(tenant_id):
if ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
).exists():
return {"status": "already backfilled"}
with rls_transaction(tenant_id):
if not Scan.objects.filter(
tenant_id=tenant_id,
id=scan_id,
state__in=(StateChoices.COMPLETED, StateChoices.FAILED),
).exists():
return {"status": "scan is not completed"}
resource_ids_qs = (
ResourceFindingMapping.objects.filter(
tenant_id=tenant_id, finding__scan_id=scan_id
)
.values_list("resource_id", flat=True)
.distinct()
)
resource_ids = list(resource_ids_qs)
if not resource_ids:
return {"status": "no resources to backfill"}
resources_qs = Resource.objects.filter(
tenant_id=tenant_id, id__in=resource_ids
).only("id", "service", "region", "type")
summaries = []
for resource in resources_qs.iterator():
summaries.append(
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=str(resource.id),
service=resource.service,
region=resource.region,
resource_type=resource.type,
)
)
for i in range(0, len(summaries), 500):
ResourceScanSummary.objects.bulk_create(
summaries[i : i + 500], ignore_conflicts=True
)
return {"status": "backfilled", "inserted": len(summaries)}

View File

@@ -1,5 +1,5 @@
from celery.utils.log import get_task_logger
from django.db import DatabaseError, transaction
from django.db import DatabaseError
from api.db_router import MainRouter
from api.db_utils import batch_delete, rls_transaction
@@ -8,11 +8,12 @@ from api.models import Finding, Provider, Resource, Scan, ScanSummary, Tenant
logger = get_task_logger(__name__)
def delete_provider(pk: str):
def delete_provider(tenant_id: str, pk: str):
"""
Gracefully deletes an instance of a provider along with its related data.
Args:
tenant_id (str): Tenant ID the resources belong to.
pk (str): The primary key of the Provider instance to delete.
Returns:
@@ -21,37 +22,32 @@ def delete_provider(pk: str):
Raises:
Provider.DoesNotExist: If no instance with the provided primary key exists.
DatabaseError: If any deletion step fails.
"""
instance = Provider.all_objects.get(pk=pk)
deletion_summary = {}
deletion_steps = [
("Scan Summaries", ScanSummary.all_objects.filter(scan__provider=instance)),
("Findings", Finding.all_objects.filter(scan__provider=instance)),
("Resources", Resource.all_objects.filter(provider=instance)),
("Scans", Scan.all_objects.filter(provider=instance)),
]
with rls_transaction(tenant_id):
instance = Provider.all_objects.get(pk=pk)
deletion_summary = {}
deletion_steps = [
("Scan Summaries", ScanSummary.all_objects.filter(scan__provider=instance)),
("Findings", Finding.all_objects.filter(scan__provider=instance)),
("Resources", Resource.all_objects.filter(provider=instance)),
("Scans", Scan.all_objects.filter(provider=instance)),
]
for step_name, queryset in deletion_steps:
try:
with transaction.atomic():
_, step_summary = batch_delete(queryset)
deletion_summary.update(step_summary)
except DatabaseError as error:
logger.error(f"Error deleting {step_name}: {error}")
_, step_summary = batch_delete(tenant_id, queryset)
deletion_summary.update(step_summary)
except DatabaseError as db_error:
logger.error(f"Error deleting {step_name}: {db_error}")
raise
# Delete the provider itself
try:
with transaction.atomic():
with rls_transaction(tenant_id):
_, provider_summary = instance.delete()
deletion_summary.update(provider_summary)
except DatabaseError as error:
logger.error(f"Error deleting Provider: {error}")
deletion_summary.update(provider_summary)
except DatabaseError as db_error:
logger.error(f"Error deleting Provider: {db_error}")
raise
return deletion_summary
@@ -69,9 +65,8 @@ def delete_tenant(pk: str):
deletion_summary = {}
for provider in Provider.objects.using(MainRouter.admin_db).filter(tenant_id=pk):
with rls_transaction(pk):
summary = delete_provider(provider.id)
deletion_summary.update(summary)
summary = delete_provider(pk, provider.id)
deletion_summary.update(summary)
Tenant.objects.using(MainRouter.admin_db).filter(id=pk).delete()

View File

@@ -1,4 +1,5 @@
import os
import re
import zipfile
import boto3
@@ -13,6 +14,41 @@ from prowler.config.config import (
json_ocsf_file_suffix,
output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.cis.cis_m365 import M365CIS
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.ens.ens_azure import AzureENS
from prowler.lib.outputs.compliance.ens.ens_gcp import GCPENS
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_azure import AzureISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
KubernetesISO27001,
)
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_aws import (
ProwlerThreatScoreAWS,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_azure import (
ProwlerThreatScoreAzure,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_gcp import (
ProwlerThreatScoreGCP,
)
from prowler.lib.outputs.compliance.prowler_threatscore.prowler_threatscore_m365 import (
ProwlerThreatScoreM365,
)
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
@@ -20,6 +56,44 @@ from prowler.lib.outputs.ocsf.ocsf import OCSF
logger = get_task_logger(__name__)
COMPLIANCE_CLASS_MAP = {
"aws": [
(lambda name: name.startswith("cis_"), AWSCIS),
(lambda name: name == "mitre_attack_aws", AWSMitreAttack),
(lambda name: name.startswith("ens_"), AWSENS),
(
lambda name: name.startswith("aws_well_architected_framework"),
AWSWellArchitected,
),
(lambda name: name.startswith("iso27001_"), AWSISO27001),
(lambda name: name.startswith("kisa"), AWSKISAISMSP),
(lambda name: name == "prowler_threatscore_aws", ProwlerThreatScoreAWS),
],
"azure": [
(lambda name: name.startswith("cis_"), AzureCIS),
(lambda name: name == "mitre_attack_azure", AzureMitreAttack),
(lambda name: name.startswith("ens_"), AzureENS),
(lambda name: name.startswith("iso27001_"), AzureISO27001),
(lambda name: name == "prowler_threatscore_azure", ProwlerThreatScoreAzure),
],
"gcp": [
(lambda name: name.startswith("cis_"), GCPCIS),
(lambda name: name == "mitre_attack_gcp", GCPMitreAttack),
(lambda name: name.startswith("ens_"), GCPENS),
(lambda name: name.startswith("iso27001_"), GCPISO27001),
(lambda name: name == "prowler_threatscore_gcp", ProwlerThreatScoreGCP),
],
"kubernetes": [
(lambda name: name.startswith("cis_"), KubernetesCIS),
(lambda name: name.startswith("iso27001_"), KubernetesISO27001),
],
"m365": [
(lambda name: name.startswith("cis_"), M365CIS),
(lambda name: name == "prowler_threatscore_m365", ProwlerThreatScoreM365),
],
}
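COMPLIANCE_CLASS_MAP pairs predicates on the compliance framework name with writer classes. A minimal sketch of how a name resolves to a class, mirroring the selection loop that appears later in this diff inside generate_outputs (GenericCompliance is the fallback imported there; the example framework name is purely illustrative):

# Sketch only: mirrors the lookup loop in generate_outputs further down in this diff.
def resolve_compliance_class(provider_type: str, framework_name: str):
    for matches, compliance_class in COMPLIANCE_CLASS_MAP.get(provider_type, []):
        if matches(framework_name):
            return compliance_class
    return GenericCompliance  # fallback when no predicate matches

# e.g. resolve_compliance_class("aws", "cis_3.0_aws") would return AWSCIS.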
# Predefined mapping for output formats and their configurations
OUTPUT_FORMATS_MAPPING = {
"csv": {
@@ -43,13 +117,17 @@ def _compress_output_files(output_directory: str) -> str:
str: The full path to the newly created ZIP archive.
"""
zip_path = f"{output_directory}.zip"
parent_dir = os.path.dirname(output_directory)
zip_path_abs = os.path.abspath(zip_path)
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
for suffix in [config["suffix"] for config in OUTPUT_FORMATS_MAPPING.values()]:
zipf.write(
f"{output_directory}{suffix}",
f"output/{output_directory.split('/')[-1]}{suffix}",
)
for foldername, _, filenames in os.walk(parent_dir):
for filename in filenames:
file_path = os.path.join(foldername, filename)
if os.path.abspath(file_path) == zip_path_abs:
continue
arcname = os.path.relpath(file_path, start=parent_dir)
zipf.write(file_path, arcname)
return zip_path
@@ -102,25 +180,38 @@ def _upload_to_s3(tenant_id: str, zip_path: str, scan_id: str) -> str:
Raises:
botocore.exceptions.ClientError: If the upload attempt to S3 fails for any reason.
"""
if not base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET:
return
bucket = base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET
if not bucket:
return None
try:
s3 = get_s3_client()
s3_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
# Upload the ZIP file (outputs) to the S3 bucket
zip_key = f"{tenant_id}/{scan_id}/{os.path.basename(zip_path)}"
s3.upload_file(
Filename=zip_path,
Bucket=base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET,
Key=s3_key,
Bucket=bucket,
Key=zip_key,
)
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{s3_key}"
# Upload the compliance directory to the S3 bucket
compliance_dir = os.path.join(os.path.dirname(zip_path), "compliance")
for filename in os.listdir(compliance_dir):
local_path = os.path.join(compliance_dir, filename)
if not os.path.isfile(local_path):
continue
file_key = f"{tenant_id}/{scan_id}/compliance/{filename}"
s3.upload_file(Filename=local_path, Bucket=bucket, Key=file_key)
return f"s3://{base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET}/{zip_key}"
except (ClientError, NoCredentialsError, ParamValidationError, ValueError) as e:
logger.error(f"S3 upload failed: {str(e)}")
def _generate_output_directory(
output_directory, prowler_provider: object, tenant_id: str, scan_id: str
) -> str:
) -> tuple[str, str]:
"""
Generate a file system path for the output directory of a prowler scan.
@@ -145,12 +236,22 @@ def _generate_output_directory(
Example:
>>> _generate_output_directory("/tmp", "aws", "tenant-1234", "scan-5678")
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56'
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56',
'/tmp/tenant-1234/aws/scan-5678/compliance/prowler-output-2023-02-15T12:34:56'
"""
# Sanitize the prowler provider name to ensure it is a valid directory name
prowler_provider_sanitized = re.sub(r"[^\w\-]", "-", prowler_provider)
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
return path
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
return path, compliance_path

View File

@@ -1,3 +1,4 @@
import json
import time
from copy import deepcopy
from datetime import datetime, timezone
@@ -6,6 +7,7 @@ from celery.utils.log import get_task_logger
from config.settings.celery import CELERY_DEADLOCK_ATTEMPTS
from django.db import IntegrityError, OperationalError
from django.db.models import Case, Count, IntegerField, Sum, When
from tasks.utils import CustomEncoder
from api.compliance import (
PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE,
@@ -17,6 +19,7 @@ from api.models import (
Finding,
Provider,
Resource,
ResourceScanSummary,
ResourceTag,
Scan,
ScanSummary,
@@ -119,7 +122,9 @@ def perform_prowler_scan(
check_status_by_region = {}
exception = None
unique_resources = set()
scan_resource_cache: set[tuple[str, str, str, str]] = set()
start_time = time.time()
exc = None
with rls_transaction(tenant_id):
provider_instance = Provider.objects.get(pk=provider_id)
@@ -135,7 +140,7 @@ def perform_prowler_scan(
provider_instance.connected = True
except Exception as e:
provider_instance.connected = False
raise ValueError(
exc = ValueError(
f"Provider {provider_instance.provider} is not connected: {e}"
)
finally:
@@ -144,6 +149,11 @@ def perform_prowler_scan(
)
provider_instance.save()
# If the provider is not connected, raise an exception outside the transaction.
# If raised within the transaction, the transaction will be rolled back and the provider will not be marked as not connected.
if exc:
raise exc
prowler_scan = ProwlerScan(provider=prowler_provider, checks=checks_to_execute)
resource_cache = {}
@@ -191,6 +201,17 @@ def perform_prowler_scan(
if resource_instance.type != finding.resource_type:
resource_instance.type = finding.resource_type
updated_fields.append("type")
if resource_instance.metadata != finding.resource_metadata:
resource_instance.metadata = json.dumps(
finding.resource_metadata, cls=CustomEncoder
)
updated_fields.append("metadata")
if resource_instance.details != finding.resource_details:
resource_instance.details = finding.resource_details
updated_fields.append("details")
if resource_instance.partition != finding.partition:
resource_instance.partition = finding.partition
updated_fields.append("partition")
if updated_fields:
with rls_transaction(tenant_id):
resource_instance.save(update_fields=updated_fields)
@@ -267,6 +288,8 @@ def perform_prowler_scan(
check_id=finding.check_id,
scan=scan_instance,
first_seen_at=last_first_seen_at,
muted=finding.muted,
compliance=finding.compliance,
)
finding_instance.add_resources([resource_instance])
@@ -280,6 +303,16 @@ def perform_prowler_scan(
continue
region_dict[finding.check_id] = finding.status.value
# Update scan resource summaries
scan_resource_cache.add(
(
str(resource_instance.id),
resource_instance.service,
resource_instance.region,
resource_instance.type,
)
)
# Update scan progress
with rls_transaction(tenant_id):
scan_instance.progress = progress
@@ -299,66 +332,90 @@ def perform_prowler_scan(
scan_instance.unique_resource_count = len(unique_resources)
scan_instance.save()
if exception is None:
try:
regions = prowler_provider.get_regions()
except AttributeError:
regions = set()
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance overview objects
compliance_overview_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
compliance_overview_objects.append(
ComplianceOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
description=compliance["description"],
requirements=compliance["requirements"],
requirements_passed=compliance["requirements_status"]["passed"],
requirements_failed=compliance["requirements_status"]["failed"],
requirements_manual=compliance["requirements_status"]["manual"],
total_requirements=compliance["total_requirements"],
)
)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=100
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
if exception is not None:
raise exception
try:
regions = prowler_provider.get_regions()
except AttributeError:
regions = set()
compliance_template = PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE[
provider_instance.provider
]
compliance_overview_by_region = {
region: deepcopy(compliance_template) for region in regions
}
for region, check_status in check_status_by_region.items():
compliance_data = compliance_overview_by_region.setdefault(
region, deepcopy(compliance_template)
)
for check_name, status in check_status.items():
generate_scan_compliance(
compliance_data,
provider_instance.provider,
check_name,
status,
)
# Prepare compliance overview objects
compliance_overview_objects = []
for region, compliance_data in compliance_overview_by_region.items():
for compliance_id, compliance in compliance_data.items():
compliance_overview_objects.append(
ComplianceOverview(
tenant_id=tenant_id,
scan=scan_instance,
region=region,
compliance_id=compliance_id,
framework=compliance["framework"],
version=compliance["version"],
description=compliance["description"],
requirements=compliance["requirements"],
requirements_passed=compliance["requirements_status"]["passed"],
requirements_failed=compliance["requirements_status"]["failed"],
requirements_manual=compliance["requirements_status"]["manual"],
total_requirements=compliance["total_requirements"],
)
)
try:
with rls_transaction(tenant_id):
ComplianceOverview.objects.bulk_create(
compliance_overview_objects, batch_size=500
)
except Exception as overview_exception:
import sentry_sdk
sentry_sdk.capture_exception(overview_exception)
logger.error(
f"Error storing compliance overview for scan {scan_id}: {overview_exception}"
)
try:
resource_scan_summaries = [
ResourceScanSummary(
tenant_id=tenant_id,
scan_id=scan_id,
resource_id=resource_id,
service=service,
region=region,
resource_type=resource_type,
)
for resource_id, service, region, resource_type in scan_resource_cache
]
with rls_transaction(tenant_id):
ResourceScanSummary.objects.bulk_create(
resource_scan_summaries, batch_size=500, ignore_conflicts=True
)
except Exception as filter_exception:
import sentry_sdk
sentry_sdk.capture_exception(filter_exception)
logger.error(
f"Error storing filter values for scan {scan_id}: {filter_exception}"
)
serializer = ScanTaskSerializer(instance=scan_instance)
return serializer.data
@@ -402,21 +459,21 @@ def aggregate_findings(tenant_id: str, scan_id: str):
).annotate(
fail=Sum(
Case(
When(status="FAIL", then=1),
When(status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
_pass=Sum(
Case(
When(status="PASS", then=1),
When(status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
muted=Sum(
muted_count=Sum(
Case(
When(status="MUTED", then=1),
When(muted=True, then=1),
default=0,
output_field=IntegerField(),
)
@@ -424,63 +481,63 @@ def aggregate_findings(tenant_id: str, scan_id: str):
total=Count("id"),
new=Sum(
Case(
When(delta="new", then=1),
When(delta="new", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
changed=Sum(
Case(
When(delta="changed", then=1),
When(delta="changed", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
unchanged=Sum(
Case(
When(delta__isnull=True, then=1),
When(delta__isnull=True, muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
fail_new=Sum(
Case(
When(delta="new", status="FAIL", then=1),
When(delta="new", status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
fail_changed=Sum(
Case(
When(delta="changed", status="FAIL", then=1),
When(delta="changed", status="FAIL", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
pass_new=Sum(
Case(
When(delta="new", status="PASS", then=1),
When(delta="new", status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
pass_changed=Sum(
Case(
When(delta="changed", status="PASS", then=1),
When(delta="changed", status="PASS", muted=False, then=1),
default=0,
output_field=IntegerField(),
)
),
muted_new=Sum(
Case(
When(delta="new", status="MUTED", then=1),
When(delta="new", muted=True, then=1),
default=0,
output_field=IntegerField(),
)
),
muted_changed=Sum(
Case(
When(delta="changed", status="MUTED", then=1),
When(delta="changed", muted=True, then=1),
default=0,
output_field=IntegerField(),
)
@@ -498,7 +555,7 @@ def aggregate_findings(tenant_id: str, scan_id: str):
region=agg["resources__region"],
fail=agg["fail"],
_pass=agg["_pass"],
muted=agg["muted"],
muted=agg["muted_count"],
total=agg["total"],
new=agg["new"],
changed=agg["changed"],

View File

@@ -1,3 +1,4 @@
from datetime import datetime, timedelta, timezone
from pathlib import Path
from shutil import rmtree
@@ -6,9 +7,11 @@ from celery.utils.log import get_task_logger
from config.celery import RLSTask
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
from django_celery_beat.models import PeriodicTask
from tasks.jobs.backfill import backfill_resource_scan_summaries
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.export import (
COMPLIANCE_CLASS_MAP,
OUTPUT_FORMATS_MAPPING,
_compress_output_files,
_generate_output_directory,
@@ -17,10 +20,14 @@ from tasks.jobs.export import (
from tasks.jobs.scan import aggregate_findings, perform_prowler_scan
from tasks.utils import batched, get_next_execution_datetime
from api.compliance import get_compliance_frameworks
from api.db_utils import rls_transaction
from api.decorators import set_tenant
from api.models import Finding, Provider, Scan, ScanSummary, StateChoices
from api.utils import initialize_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
@@ -43,9 +50,10 @@ def check_provider_connection_task(provider_id: str):
return check_provider_connection(provider_id=provider_id)
@shared_task(base=RLSTask, name="provider-deletion", queue="deletion")
@set_tenant
def delete_provider_task(provider_id: str):
@shared_task(
base=RLSTask, name="provider-deletion", queue="deletion", autoretry_for=(Exception,)
)
def delete_provider_task(provider_id: str, tenant_id: str):
"""
Task to delete a specific Provider instance.
@@ -53,6 +61,7 @@ def delete_provider_task(provider_id: str):
Args:
provider_id (str): The primary key of the `Provider` instance to be deleted.
tenant_id (str): Tenant ID the provider belongs to.
Returns:
tuple: A tuple containing:
@@ -60,7 +69,7 @@ def delete_provider_task(provider_id: str):
- A dictionary with the count of deleted instances per model,
including related models if cascading deletes were triggered.
"""
return delete_provider(pk=provider_id)
return delete_provider(tenant_id=tenant_id, pk=provider_id)
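Because tenant_id is now part of the task signature, callers must pass it alongside the provider primary key. An illustrative dispatch (the kwargs come from this diff; everything else, such as having a provider instance at hand, is assumed):

# Illustrative only: 'provider' stands for a Provider instance obtained elsewhere.
delete_provider_task.apply_async(
    kwargs={"provider_id": str(provider.id), "tenant_id": str(provider.tenant_id)}
)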
@shared_task(base=RLSTask, name="scan-perform", queue="scans")
@@ -126,6 +135,43 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
periodic_task_instance = PeriodicTask.objects.get(
name=f"scan-perform-scheduled-{provider_id}"
)
executed_scan = Scan.objects.filter(
tenant_id=tenant_id,
provider_id=provider_id,
task__task_runner_task__task_id=task_id,
).order_by("completed_at")
if (
Scan.objects.filter(
tenant_id=tenant_id,
provider_id=provider_id,
trigger=Scan.TriggerChoices.SCHEDULED,
state=StateChoices.EXECUTING,
scheduler_task_id=periodic_task_instance.id,
scheduled_at__date=datetime.now(timezone.utc).date(),
).exists()
or executed_scan.exists()
):
        # Duplicated task execution caused by the visibility timeout, or the scan is already running
logger.warning(f"Duplicated scheduled scan for provider {provider_id}.")
try:
affected_scan = executed_scan.first()
if not affected_scan:
raise ValueError(
"Error retrieving affected scan details after detecting duplicated scheduled "
"scan."
)
# Return the affected scan details to avoid losing data
serializer = ScanTaskSerializer(instance=affected_scan)
except Exception as duplicated_scan_exception:
logger.error(
f"Duplicated scheduled scan for provider {provider_id}. Error retrieving affected scan details: "
f"{str(duplicated_scan_exception)}"
)
raise duplicated_scan_exception
return serializer.data
next_scan_datetime = get_next_execution_datetime(task_id, provider_id)
scan_instance, _ = Scan.objects.get_or_create(
tenant_id=tenant_id,
@@ -133,7 +179,11 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
trigger=Scan.TriggerChoices.SCHEDULED,
state__in=(StateChoices.SCHEDULED, StateChoices.AVAILABLE),
scheduler_task_id=periodic_task_instance.id,
defaults={"state": StateChoices.SCHEDULED},
defaults={
"state": StateChoices.SCHEDULED,
"name": "Daily scheduled scan",
"scheduled_at": next_scan_datetime - timedelta(days=1),
},
)
scan_instance.task_id = task_id
@@ -174,7 +224,7 @@ def perform_scan_summary_task(tenant_id: str, scan_id: str):
return aggregate_findings(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(name="tenant-deletion", queue="deletion")
@shared_task(name="tenant-deletion", queue="deletion", autoretry_for=(Exception,))
def delete_tenant_task(tenant_id: str):
return delete_tenant(pk=tenant_id)
@@ -201,84 +251,123 @@ def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
scan_id (str): The scan identifier.
        provider_id (str): The provider ID used when generating the outputs.
"""
# Initialize the prowler provider
prowler_provider = initialize_prowler_provider(Provider.objects.get(id=provider_id))
# Check if the scan has findings
if not ScanSummary.objects.filter(scan_id=scan_id).exists():
logger.info(f"No findings found for scan {scan_id}")
return {"upload": False}
# Get the provider UID
provider_uid = Provider.objects.get(id=provider_id).uid
provider_obj = Provider.objects.get(id=provider_id)
prowler_provider = initialize_prowler_provider(provider_obj)
provider_uid = provider_obj.uid
provider_type = provider_obj.provider
# Generate and ensure the output directory exists
output_directory = _generate_output_directory(
frameworks_bulk = Compliance.get_bulk(provider_type)
frameworks_avail = get_compliance_frameworks(provider_type)
out_dir, comp_dir = _generate_output_directory(
DJANGO_TMP_OUTPUT_DIRECTORY, provider_uid, tenant_id, scan_id
)
# Define auxiliary variables
def get_writer(writer_map, name, factory, is_last):
"""
        Return the existing writer_map[name] entry or create one via factory().
        In both cases, set `.close_file = is_last` on the writer.
"""
initialization = False
if name not in writer_map:
writer_map[name] = factory()
initialization = True
w = writer_map[name]
w.close_file = is_last
return w, initialization
output_writers = {}
compliance_writers = {}
scan_summary = FindingOutput._transform_findings_stats(
ScanSummary.objects.filter(scan_id=scan_id)
)
# Retrieve findings queryset
findings_qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid")
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Process findings in batches
for batch, is_last_batch in batched(
findings_qs.iterator(), DJANGO_FINDINGS_BATCH_SIZE
):
finding_outputs = [
FindingOutput.transform_api_finding(finding, prowler_provider)
for finding in batch
]
# Generate output files
for mode, config in OUTPUT_FORMATS_MAPPING.items():
kwargs = dict(config.get("kwargs", {}))
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
if mode == "html":
kwargs["provider"] = prowler_provider
kwargs["stats"] = scan_summary
extra.update(provider=prowler_provider, stats=scan_summary)
writer_class = config["class"]
if writer_class in output_writers:
writer = output_writers[writer_class]
writer.transform(finding_outputs)
writer.close_file = is_last_batch
else:
writer = writer_class(
findings=finding_outputs,
file_path=output_directory,
file_extension=config["suffix"],
writer, initialization = get_writer(
output_writers,
cls,
lambda cls=cls, fos=fos, suffix=suffix: cls(
findings=fos,
file_path=out_dir,
file_extension=suffix,
from_cli=False,
)
writer.close_file = is_last_batch
output_writers[writer_class] = writer
),
is_last,
)
if not initialization:
writer.transform(fos)
writer.batch_write_data_to_file(**extra)
writer._data.clear()
# Write the current batch using the writer
writer.batch_write_data_to_file(**kwargs)
# Compliance CSVs
for name in frameworks_avail:
compliance_obj = frameworks_bulk[name]
# TODO: Refactor the output classes to avoid this manual reset
writer._data = []
klass = GenericCompliance
for condition, cls in COMPLIANCE_CLASS_MAP.get(provider_type, []):
if condition(name):
klass = cls
break
# Compress output files
output_directory = _compress_output_files(output_directory)
filename = f"{comp_dir}_{name}.csv"
# Save to configured storage
uploaded = _upload_to_s3(tenant_id, output_directory, scan_id)
writer, initialization = get_writer(
compliance_writers,
name,
lambda klass=klass, fos=fos: klass(
findings=fos,
compliance=compliance_obj,
file_path=filename,
from_cli=False,
),
is_last,
)
if not initialization:
writer.transform(fos, compliance_obj, name)
writer.batch_write_data_to_file()
writer._data.clear()
if uploaded:
# Remove the local files after upload
compressed = _compress_output_files(out_dir)
upload_uri = _upload_to_s3(tenant_id, compressed, scan_id)
if upload_uri:
try:
rmtree(Path(output_directory).parent, ignore_errors=True)
except FileNotFoundError as e:
rmtree(Path(compressed).parent, ignore_errors=True)
except Exception as e:
logger.error(f"Error deleting output files: {e}")
output_directory = uploaded
uploaded = True
final_location, did_upload = upload_uri, True
else:
uploaded = False
final_location, did_upload = compressed, False
# Update the scan instance with the output path
Scan.all_objects.filter(id=scan_id).update(output_location=output_directory)
Scan.all_objects.filter(id=scan_id).update(output_location=final_location)
logger.info(f"Scan outputs at {final_location}")
return {"upload": did_upload}
logger.info(f"Scan output files generated, output location: {output_directory}")
return {"upload": uploaded}
@shared_task(name="backfill-scan-resource-summaries", queue="backfill")
def backfill_scan_resource_summaries_task(tenant_id: str, scan_id: str):
"""
Tries to backfill the resource scan summaries table for a given scan.
Args:
tenant_id (str): The tenant identifier.
scan_id (str): The scan identifier.
"""
return backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)

View File

@@ -0,0 +1,79 @@
from uuid import uuid4
import pytest
from tasks.jobs.backfill import backfill_resource_scan_summaries
from api.models import ResourceScanSummary, Scan, StateChoices
@pytest.mark.django_db
class TestBackfillResourceScanSummaries:
@pytest.fixture(scope="function")
def resource_scan_summary_data(self, scans_fixture):
scan = scans_fixture[0]
return ResourceScanSummary.objects.create(
tenant_id=scan.tenant_id,
scan_id=scan.id,
resource_id=str(uuid4()),
service="aws",
region="us-east-1",
resource_type="instance",
)
@pytest.fixture(scope="function")
def get_not_completed_scans(self, providers_fixture):
provider_id = providers_fixture[0].id
tenant_id = providers_fixture[0].tenant_id
scan_1 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.EXECUTING,
provider_id=provider_id,
)
scan_2 = Scan.objects.create(
tenant_id=tenant_id,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.AVAILABLE,
provider_id=provider_id,
)
return scan_1, scan_2
def test_already_backfilled(self, resource_scan_summary_data):
tenant_id = resource_scan_summary_data.tenant_id
scan_id = resource_scan_summary_data.scan_id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "already backfilled"}
def test_not_completed_scan(self, get_not_completed_scans):
for scan_instance in get_not_completed_scans:
tenant_id = scan_instance.tenant_id
scan_id = scan_instance.id
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "scan is not completed"}
def test_successful_backfill_inserts_one_summary(
self, resources_fixture, findings_fixture
):
tenant_id = findings_fixture[0].tenant_id
scan_id = findings_fixture[0].scan_id
# This scan affects the first two resources
resources = resources_fixture[:2]
result = backfill_resource_scan_summaries(tenant_id, scan_id)
assert result == {"status": "backfilled", "inserted": len(resources)}
# Verify correct values
summaries = ResourceScanSummary.objects.filter(
tenant_id=tenant_id, scan_id=scan_id
)
assert summaries.count() == len(resources)
for resource in resources:
summary = summaries.get(resource_id=resource.id)
assert summary.resource_id == resource.id
assert summary.service == resource.service
assert summary.region == resource.region
assert summary.resource_type == resource.type

View File

@@ -9,17 +9,19 @@ from api.models import Provider, Tenant
class TestDeleteProvider:
def test_delete_provider_success(self, providers_fixture):
instance = providers_fixture[0]
result = delete_provider(instance.id)
tenant_id = str(instance.tenant_id)
result = delete_provider(tenant_id, instance.id)
assert result
with pytest.raises(ObjectDoesNotExist):
Provider.objects.get(pk=instance.id)
def test_delete_provider_does_not_exist(self):
def test_delete_provider_does_not_exist(self, tenants_fixture):
tenant_id = str(tenants_fixture[0].id)
non_existent_pk = "babf6796-cfcc-4fd3-9dcf-88d012247645"
with pytest.raises(ObjectDoesNotExist):
delete_provider(non_existent_pk)
delete_provider(tenant_id, non_existent_pk)
@pytest.mark.django_db

View File

@@ -0,0 +1,166 @@
import os
import zipfile
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError
from tasks.jobs.export import (
_compress_output_files,
_generate_output_directory,
_upload_to_s3,
get_s3_client,
)
@pytest.mark.django_db
class TestOutputs:
def test_compress_output_files_creates_zip(self, tmpdir):
base_tmp = Path(str(tmpdir.mkdir("compress_output")))
output_dir = base_tmp / "output"
output_dir.mkdir()
file_path = output_dir / "result.csv"
file_path.write_text("data")
zip_path = _compress_output_files(str(output_dir))
assert zip_path.endswith(".zip")
assert os.path.exists(zip_path)
with zipfile.ZipFile(zip_path, "r") as zipf:
assert "output/result.csv" in zipf.namelist()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_success(self, mock_settings, mock_boto_client):
mock_settings.DJANGO_OUTPUT_S3_AWS_ACCESS_KEY_ID = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SECRET_ACCESS_KEY = "test"
mock_settings.DJANGO_OUTPUT_S3_AWS_SESSION_TOKEN = "token"
mock_settings.DJANGO_OUTPUT_S3_AWS_DEFAULT_REGION = "eu-west-1"
client_mock = MagicMock()
mock_boto_client.return_value = client_mock
client = get_s3_client()
assert client is not None
client_mock.list_buckets.assert_called()
@patch("tasks.jobs.export.boto3.client")
@patch("tasks.jobs.export.settings")
def test_get_s3_client_fallback(self, mock_settings, mock_boto_client):
mock_boto_client.side_effect = [
ClientError({"Error": {"Code": "403"}}, "ListBuckets"),
MagicMock(),
]
client = get_s3_client()
assert client is not None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_success(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_success")))
zip_path = base_tmp / "outputs.zip"
zip_path.write_bytes(b"dummy")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("ok")
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant-id", str(zip_path), "scan-id")
expected_uri = "s3://test-bucket/tenant-id/scan-id/outputs.zip"
assert result == expected_uri
assert client_mock.upload_file.call_count == 2
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_missing_bucket(self, mock_base, mock_get_client):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = ""
result = _upload_to_s3("tenant", "/tmp/fake.zip", "scan")
assert result is None
@patch("tasks.jobs.export.get_s3_client")
@patch("tasks.jobs.export.base")
def test_upload_to_s3_skips_non_files(self, mock_base, mock_get_client, tmpdir):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "test-bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_skips_non_files")))
zip_path = base_tmp / "results.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "subdir").mkdir()
client_mock = MagicMock()
mock_get_client.return_value = client_mock
result = _upload_to_s3("tenant", str(zip_path), "scan")
expected_uri = "s3://test-bucket/tenant/scan/results.zip"
assert result == expected_uri
client_mock.upload_file.assert_called_once()
@patch(
"tasks.jobs.export.get_s3_client",
side_effect=ClientError({"Error": {}}, "Upload"),
)
@patch("tasks.jobs.export.base")
@patch("tasks.jobs.export.logger.error")
def test_upload_to_s3_failure_logs_error(
self, mock_logger, mock_base, mock_get_client, tmpdir
):
mock_base.DJANGO_OUTPUT_S3_AWS_OUTPUT_BUCKET = "bucket"
base_tmp = Path(str(tmpdir.mkdir("upload_failure_logs")))
zip_path = base_tmp / "zipfile.zip"
zip_path.write_bytes(b"zip")
compliance_dir = base_tmp / "compliance"
compliance_dir.mkdir()
(compliance_dir / "report.csv").write_text("csv")
_upload_to_s3("tenant", str(zip_path), "scan")
mock_logger.assert_called()
def test_generate_output_directory_creates_paths(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")
def test_generate_output_directory_invalid_character(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws/test@check"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"aws-test-check-{output_file_timestamp}")
assert compliance.endswith(f"aws-test-check-{output_file_timestamp}")

View File

@@ -1,4 +1,6 @@
import json
import uuid
from datetime import datetime
from unittest.mock import MagicMock, patch
import pytest
@@ -7,6 +9,7 @@ from tasks.jobs.scan import (
_store_resources,
perform_prowler_scan,
)
from tasks.utils import CustomEncoder
from api.models import (
Finding,
@@ -107,7 +110,13 @@ class TestPerformScan:
finding.service_name = "service_name"
finding.resource_type = "resource_type"
finding.resource_tags = {"tag1": "value1", "tag2": "value2"}
finding.muted = False
finding.raw = {}
finding.resource_metadata = {"test": "metadata"}
finding.resource_details = {"details": "test"}
finding.partition = "partition"
finding.muted = True
finding.compliance = {"compliance1": "PASS"}
# Mock the ProwlerScan instance
mock_prowler_scan_instance = MagicMock()
@@ -145,6 +154,8 @@ class TestPerformScan:
assert scan_finding.severity == finding.severity
assert scan_finding.check_id == finding.check_id
assert scan_finding.raw_result == finding.raw
assert scan_finding.muted
assert scan_finding.compliance == finding.compliance
assert scan_resource.tenant == tenant
assert scan_resource.uid == finding.resource_uid
@@ -152,6 +163,11 @@ class TestPerformScan:
assert scan_resource.service == finding.service_name
assert scan_resource.type == finding.resource_type
assert scan_resource.name == finding.resource_name
assert scan_resource.metadata == json.dumps(
finding.resource_metadata, cls=CustomEncoder
)
assert scan_resource.details == f"{finding.resource_details}"
assert scan_resource.partition == finding.partition
# Assert that the resource tags have been created and associated
tags = scan_resource.tags.all()
@@ -191,6 +207,10 @@ class TestPerformScan:
scan.refresh_from_db()
assert scan.state == StateChoices.FAILED
provider.refresh_from_db()
assert provider.connected is False
assert isinstance(provider.connection_last_checked_at, datetime)
@pytest.mark.parametrize(
"last_status, new_status, expected_delta",
[

View File

@@ -0,0 +1,415 @@
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from tasks.tasks import generate_outputs
@pytest.mark.django_db
class TestGenerateOutputs:
def setup_method(self):
self.scan_id = str(uuid.uuid4())
self.provider_id = str(uuid.uuid4())
self.tenant_id = str(uuid.uuid4())
def test_no_findings_returns_early(self):
with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
mock_filter.return_value.exists.return_value = False
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_filter.assert_called_once_with(scan_id=self.scan_id)
@patch("tasks.tasks.rmtree")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Finding.all_objects.filter")
def test_generate_outputs_happy_path(
self,
mock_finding_filter,
mock_scan_summary_filter,
mock_provider_get,
mock_initialize_provider,
mock_compliance_get_bulk,
mock_get_available_frameworks,
mock_compress,
mock_upload,
mock_rmtree,
):
mock_scan_summary_filter.return_value.exists.return_value = True
mock_provider = MagicMock()
mock_provider.uid = "provider-uid"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
prowler_provider = MagicMock()
mock_initialize_provider.return_value = prowler_provider
mock_compliance_get_bulk.return_value = {"cis": MagicMock()}
mock_get_available_frameworks.return_value = ["cis"]
dummy_finding = MagicMock(uid="f1")
mock_finding_filter.return_value.order_by.return_value.iterator.return_value = [
[dummy_finding],
True,
]
mock_transformed_stats = {"some": "stats"}
with (
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value=mock_transformed_stats,
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value={"transformed": "f1"},
),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="JSONWriter"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock(name="CSVCompliance"))]},
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("out-dir", "comp-dir"),
),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_compress.return_value = "/tmp/zipped.zip"
mock_upload.return_value = "s3://bucket/zipped.zip"
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
mock_scan_update.return_value.update.assert_called_once_with(
output_location="s3://bucket/zipped.zip"
)
mock_rmtree.assert_called_once_with(
Path("/tmp/zipped.zip").parent, ignore_errors=True
)
def test_generate_outputs_fails_upload(self):
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks"),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch("tasks.tasks.FindingOutput.transform_api_finding"),
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": MagicMock(name="Writer"),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value=None),
patch("tasks.tasks.Scan.all_objects.filter") as mock_scan_update,
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
result = generate_outputs(
scan_id="scan",
provider_id="provider",
tenant_id=self.tenant_id,
)
assert result == {"upload": False}
mock_scan_update.return_value.update.assert_called_once()
def test_generate_outputs_triggers_html_extra_update(self):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/f.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
html_writer_mock = MagicMock()
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"html": {
"class": lambda *args, **kwargs: html_writer_mock,
"suffix": ".html",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
html_writer_mock.batch_write_data_to_file.assert_called_once()
def test_transform_called_only_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
tf1 = MagicMock()
tf1.compliance = {}
tf2 = MagicMock()
tf2.compliance = {}
writer_instances = []
class TrackingWriter:
def __init__(self, findings, file_path, file_extension, from_cli):
self.transform_called = 0
self.batch_write_data_to_file = MagicMock()
self._data = []
self.close_file = False
writer_instances.append(self)
def transform(self, fos):
self.transform_called += 1
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk"),
patch("tasks.tasks.get_compliance_frameworks", return_value=[]),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=[tf1, tf2],
),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch(
"tasks.tasks.batched",
return_value=[
([raw1], False),
([raw2], True),
],
),
):
mock_summary.return_value.exists.return_value = True
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": TrackingWriter,
"suffix": ".json",
"kwargs": {},
}
},
):
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert result == {"upload": True}
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_called == 1
def test_compliance_transform_called_on_second_batch(self):
raw1 = MagicMock()
raw2 = MagicMock()
compliance_obj = MagicMock()
writer_instances = []
class TrackingComplianceWriter:
def __init__(self, *args, **kwargs):
self.transform_calls = []
self._data = []
writer_instances.append(self)
def transform(self, fos, comp_obj, name):
self.transform_calls.append((fos, comp_obj, name))
def batch_write_data_to_file(self):
pass
two_batches = [
([raw1], False),
([raw2], True),
]
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_summary,
patch(
"tasks.tasks.Provider.objects.get",
return_value=MagicMock(uid="UID", provider="aws"),
),
patch("tasks.tasks.initialize_prowler_provider"),
patch(
"tasks.tasks.Compliance.get_bulk", return_value={"cis": compliance_obj}
),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch(
"tasks.tasks._generate_output_directory",
return_value=("outdir", "compdir"),
),
patch("tasks.tasks.FindingOutput._transform_findings_stats"),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
side_effect=lambda f, prov: f,
),
patch("tasks.tasks._compress_output_files", return_value="outdir.zip"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/outdir.zip"),
patch("tasks.tasks.rmtree"),
patch(
"tasks.tasks.Scan.all_objects.filter",
return_value=MagicMock(update=lambda **kw: None),
),
patch("tasks.tasks.batched", return_value=two_batches),
patch("tasks.tasks.OUTPUT_FORMATS_MAPPING", {}),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda name: True, TrackingComplianceWriter)]},
),
):
mock_summary.return_value.exists.return_value = True
result = generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert len(writer_instances) == 1
writer = writer_instances[0]
assert writer.transform_calls == [([raw2], compliance_obj, "cis")]
assert result == {"upload": True}
def test_generate_outputs_logs_rmtree_exception(self, caplog):
mock_finding_output = MagicMock()
mock_finding_output.compliance = {"cis": ["requirement-1", "requirement-2"]}
with (
patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter,
patch("tasks.tasks.Provider.objects.get"),
patch("tasks.tasks.initialize_prowler_provider"),
patch("tasks.tasks.Compliance.get_bulk", return_value={"cis": MagicMock()}),
patch("tasks.tasks.get_compliance_frameworks", return_value=["cis"]),
patch("tasks.tasks.Finding.all_objects.filter") as mock_findings,
patch(
"tasks.tasks._generate_output_directory", return_value=("out", "comp")
),
patch(
"tasks.tasks.FindingOutput._transform_findings_stats",
return_value={"some": "stats"},
),
patch(
"tasks.tasks.FindingOutput.transform_api_finding",
return_value=mock_finding_output,
),
patch("tasks.tasks._compress_output_files", return_value="/tmp/compressed"),
patch("tasks.tasks._upload_to_s3", return_value="s3://bucket/file.zip"),
patch("tasks.tasks.Scan.all_objects.filter"),
patch("tasks.tasks.rmtree", side_effect=Exception("Test deletion error")),
):
mock_filter.return_value.exists.return_value = True
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[MagicMock()],
True,
]
with (
patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"json": {
"class": lambda *args, **kwargs: MagicMock(),
"suffix": ".json",
"kwargs": {},
}
},
),
patch(
"tasks.tasks.COMPLIANCE_CLASS_MAP",
{"aws": [(lambda x: True, MagicMock())]},
),
):
with caplog.at_level("ERROR"):
generate_outputs(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert "Error deleting output files" in caplog.text


@@ -1,9 +1,32 @@
import json
from datetime import datetime, timedelta, timezone
from enum import Enum
from django_celery_beat.models import PeriodicTask
from django_celery_results.models import TaskResult
class CustomEncoder(json.JSONEncoder):
def default(self, o):
# Enum serialization
if isinstance(o, Enum):
return o.value
# Datetime and timedelta serialization
if isinstance(o, datetime):
return o.isoformat(timespec="seconds")
if isinstance(o, timedelta):
return o.total_seconds()
# Custom object serialization
try:
return super().default(o)
except TypeError:
try:
return o.__dict__
except AttributeError:
return str(o)
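# A minimal usage sketch (illustrative only; the values below are hypothetical):
# passing CustomEncoder via `cls` lets json.dumps serialize enums, datetimes and
# timedeltas without per-call handling, e.g.
#   json.dumps(
#       {"when": datetime(2025, 6, 27, tzinfo=timezone.utc), "wait": timedelta(minutes=5)},
#       cls=CustomEncoder,
#   )
#   # -> '{"when": "2025-06-27T00:00:00+00:00", "wait": 300.0}'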
def get_next_execution_datetime(task_id: int, provider_id: str) -> datetime:
task_instance = TaskResult.objects.get(task_id=task_id)
try:


@@ -0,0 +1,156 @@
#!/usr/bin/env python3
import argparse
import os
import re
import subprocess
import sys
from pathlib import Path
import matplotlib.pyplot as plt
import pandas as pd
plt.style.use("ggplot")
def run_locust(
locust_file: str,
host: str,
users: int,
hatch_rate: int,
run_time: str,
csv_prefix: Path,
) -> Path:
artifacts_dir = Path("artifacts")
artifacts_dir.mkdir(parents=True, exist_ok=True)
cmd = [
"locust",
"-f",
f"scenarios/{locust_file}",
"--headless",
"-u",
str(users),
"-r",
str(hatch_rate),
"-t",
run_time,
"--host",
host,
"--csv",
str(artifacts_dir / csv_prefix.name),
]
print(f"Running Locust: {' '.join(cmd)}")
process = subprocess.run(cmd)
if process.returncode:
sys.exit("Locust execution failed")
stats_file = artifacts_dir / f"{csv_prefix.stem}_stats.csv"
if not stats_file.exists():
sys.exit(f"Stats CSV not found: {stats_file}")
return stats_file
def load_percentiles(csv_path: Path) -> pd.DataFrame:
df = pd.read_csv(csv_path)
mapping = {"50%": "p50", "75%": "p75", "90%": "p90", "95%": "p95"}
available = [col for col in mapping if col in df.columns]
renamed = {col: mapping[col] for col in available}
df = df.rename(columns=renamed).set_index("Name")[renamed.values()]
return df.drop(index=["Aggregated"], errors="ignore")
def sanitize_label(label: str) -> str:
text = re.sub(r"[^\w]+", "_", label.strip().lower())
return text.strip("_")
def plot_multi_comparison(metrics: dict[str, pd.DataFrame]) -> None:
common = sorted(set.intersection(*(set(df.index) for df in metrics.values())))
percentiles = list(next(iter(metrics.values())).columns)
groups = len(metrics)
width = 0.8 / groups
for endpoint in common:
fig, ax = plt.subplots(figsize=(10, 5), dpi=100)
for idx, (label, df) in enumerate(metrics.items()):
series = df.loc[endpoint]
positions = [
i + (idx - groups / 2) * width + width / 2
for i in range(len(percentiles))
]
bars = ax.bar(positions, series.values, width, label=label)
for bar in bars:
height = bar.get_height()
ax.annotate(
f"{int(height)}",
xy=(bar.get_x() + bar.get_width() / 2, height),
xytext=(0, 3),
textcoords="offset points",
ha="center",
va="bottom",
fontsize=8,
)
ax.set_xticks(range(len(percentiles)))
ax.set_xticklabels(percentiles)
ax.set_ylabel("Latency (ms)")
ax.set_title(endpoint, fontsize=12)
ax.grid(True, axis="y", linestyle="--", alpha=0.7)
fig.tight_layout()
fig.subplots_adjust(right=0.75)
ax.legend(loc="center left", bbox_to_anchor=(1, 0.5), framealpha=0.9)
output = Path("artifacts") / f"comparison_{sanitize_label(endpoint)}.png"
plt.savefig(output)
plt.close(fig)
print(f"Saved chart: {output}")
def main() -> None:
parser = argparse.ArgumentParser(description="Run Locust and compare metrics")
parser.add_argument("--locustfile", required=True, help="Locust file in scenarios/")
parser.add_argument("--host", required=True, help="Target host URL")
parser.add_argument(
"--users", type=int, default=10, help="Number of simulated users"
)
parser.add_argument("--rate", type=int, default=1, help="Hatch rate per second")
parser.add_argument("--time", default="1m", help="Test duration (e.g. 30s, 1m)")
parser.add_argument(
"--metrics-dir", default="baselines", help="Directory with CSV baselines"
)
parser.add_argument("--version", default="current", help="Test version")
args = parser.parse_args()
metrics_dir = Path(args.metrics_dir)
os.makedirs(metrics_dir, exist_ok=True)
metrics_data: dict[str, pd.DataFrame] = {}
for csv_file in sorted(metrics_dir.glob("*.csv")):
metrics_data[csv_file.stem] = load_percentiles(csv_file)
current_prefix = Path(args.version)
current_csv = run_locust(
locust_file=args.locustfile,
host=args.host,
users=args.users,
hatch_rate=args.rate,
run_time=args.time,
csv_prefix=current_prefix,
)
metrics_data[args.version] = load_percentiles(current_csv)
for endpoint in sorted(
set.intersection(*(set(df.index) for df in metrics_data.values()))
):
parts = [endpoint]
for label, df in metrics_data.items():
s = df.loc[endpoint]
parts.append(f"{label}: p50 {s.p50}, p75 {s.p75}, p90 {s.p90}, p95 {s.p95}")
print(" | ".join(parts))
plot_multi_comparison(metrics_data)
if __name__ == "__main__":
main()
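As a rough sketch (the CSV paths here are assumed, not part of the script), the helpers above can also be driven directly to re-plot existing stats files without launching Locust:
from pathlib import Path
baseline = load_percentiles(Path("baselines/v5.7.csv"))  # assumed baseline export in --metrics-dir
current = load_percentiles(Path("artifacts/current_stats.csv"))  # stats CSV produced by run_locust
plot_multi_comparison({"v5.7": baseline, "current": current})  # writes artifacts/comparison_<endpoint>.png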


@@ -0,0 +1,3 @@
locust==2.34.1
matplotlib==3.10.1
pandas==2.2.3


@@ -0,0 +1,216 @@
from locust import events, task
from utils.config import (
FINDINGS_UI_SORT_VALUES,
L_PROVIDER_NAME,
M_PROVIDER_NAME,
S_PROVIDER_NAME,
TARGET_INSERTED_AT,
)
from utils.helpers import (
APIUserBase,
get_api_token,
get_auth_headers,
get_next_resource_filter,
get_resource_filters_pairs,
get_scan_id_from_provider_name,
get_sort_value,
)
GLOBAL = {
"token": None,
"scan_ids": {},
"resource_filters": None,
"large_resource_filters": None,
}
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
GLOBAL["scan_ids"]["small"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], S_PROVIDER_NAME
)
GLOBAL["scan_ids"]["medium"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], M_PROVIDER_NAME
)
GLOBAL["scan_ids"]["large"] = get_scan_id_from_provider_name(
environment.host, GLOBAL["token"], L_PROVIDER_NAME
)
GLOBAL["resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"]
)
GLOBAL["large_resource_filters"] = get_resource_filters_pairs(
environment.host, GLOBAL["token"], GLOBAL["scan_ids"]["large"]
)
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
self.s_scan_id = GLOBAL["scan_ids"]["small"]
self.m_scan_id = GLOBAL["scan_ids"]["medium"]
self.l_scan_id = GLOBAL["scan_ids"]["large"]
self.available_resource_filters = GLOBAL["resource_filters"]
self.available_resource_filters_large_scan = GLOBAL["large_resource_filters"]
@task
def findings_default(self):
name = "/findings"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_default_include(self):
name = "/findings?include"
page = self._next_page(name)
endpoint = (
f"/findings?page[number]={page}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata(self):
endpoint = f"/findings/metadata?filter[inserted_at]={TARGET_INSERTED_AT}"
self.client.get(
endpoint, headers=get_auth_headers(self.token), name="/findings/metadata"
)
@task
def findings_scan_small(self):
name = "/findings?filter[scan_id] - 50k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.s_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_small(self):
endpoint = f"/findings/metadata?filter[scan]={self.s_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 50k",
)
@task(2)
def findings_scan_medium(self):
name = "/findings?filter[scan_id] - 250k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.m_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_medium(self):
endpoint = f"/findings/metadata?filter[scan]={self.m_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 250k",
)
@task
def findings_scan_large(self):
name = "/findings?filter[scan_id] - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_scan_large_include(self):
name = "/findings?filter[scan_id]&include - 500k"
page_number = self._next_page(name)
endpoint = (
f"/findings?page[number]={page_number}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def findings_metadata_scan_large(self):
endpoint = f"/findings/metadata?filter[scan]={self.l_scan_id}"
self.client.get(
endpoint,
headers=get_auth_headers(self.token),
name="/findings/metadata?filter[scan_id] - 500k",
)
@task(2)
def findings_resource_filter(self):
name = "/findings?filter[resource_filter]&include"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter(self):
name = "/findings/metadata?filter[resource_filter]"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[inserted_at]={TARGET_INSERTED_AT}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(3)
def findings_metadata_resource_filter_scan_large(self):
name = "/findings/metadata?filter[resource_filter]&filter[scan_id] - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings/metadata?filter[{filter_name}]={filter_value}"
f"&filter[scan]={self.l_scan_id}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def findings_resource_filter_large_scan_include(self):
name = "/findings?filter[resource_filter][scan]&include - 500k"
filter_name, filter_value = get_next_resource_filter(
self.available_resource_filters
)
endpoint = (
f"/findings?filter[{filter_name}]={filter_value}"
f"&{get_sort_value(FINDINGS_UI_SORT_VALUES)}"
f"&filter[scan]={self.l_scan_id}"
f"&include=scan.provider,resources"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)


@@ -0,0 +1,19 @@
import os
USER_EMAIL = os.environ.get("USER_EMAIL")
USER_PASSWORD = os.environ.get("USER_PASSWORD")
BASE_HEADERS = {"Content-Type": "application/vnd.api+json"}
FINDINGS_UI_SORT_VALUES = ["severity", "status", "-inserted_at"]
TARGET_INSERTED_AT = os.environ.get("TARGET_INSERTED_AT", "2025-04-22")
FINDINGS_RESOURCE_METADATA = {
"regions": "region",
"resource_types": "resource_type",
"services": "service",
}
S_PROVIDER_NAME = "provider-50k"
M_PROVIDER_NAME = "provider-250k"
L_PROVIDER_NAME = "provider-500k"


@@ -0,0 +1,168 @@
import random
from collections import defaultdict
from threading import Lock
import requests
from locust import HttpUser, between
from utils.config import (
BASE_HEADERS,
FINDINGS_RESOURCE_METADATA,
TARGET_INSERTED_AT,
USER_EMAIL,
USER_PASSWORD,
)
_global_page_counters = defaultdict(int)
_page_lock = Lock()
class APIUserBase(HttpUser):
"""
Base class for API user simulation in Locust performance tests.
Attributes:
abstract (bool): Indicates this is an abstract user class.
wait_time: Time between task executions, randomized between 1 and 5 seconds.
"""
abstract = True
wait_time = between(1, 5)
def _next_page(self, endpoint_name: str) -> int:
"""
Returns the next page number for a given endpoint. Thread-safe.
Args:
endpoint_name (str): Name of the API endpoint being paginated.
Returns:
int: The next page number for the given endpoint.
"""
with _page_lock:
_global_page_counters[endpoint_name] += 1
return _global_page_counters[endpoint_name]
def get_next_resource_filter(available_values: dict) -> tuple:
"""
Randomly selects a filter type and value from available options.
Args:
available_values (dict): Dictionary with filter types as keys and list of possible values.
Returns:
tuple: A (filter_type, filter_value) pair randomly selected.
"""
filter_type = random.choice(list(available_values.keys()))
filter_value = random.choice(available_values[filter_type])
return filter_type, filter_value
def get_auth_headers(token: str) -> dict:
"""
Returns the headers for the API requests.
Args:
token (str): The token to be included in the headers.
Returns:
dict: The headers for the API requests.
"""
return {
"Authorization": f"Bearer {token}",
**BASE_HEADERS,
}
def get_api_token(host: str) -> str:
"""
Authenticates with the API and retrieves a bearer token.
Args:
host (str): The host URL of the API.
Returns:
str: The access token for authenticated requests.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
login_payload = {
"data": {
"type": "tokens",
"attributes": {"email": USER_EMAIL, "password": USER_PASSWORD},
}
}
response = requests.post(f"{host}/tokens", json=login_payload, headers=BASE_HEADERS)
assert response.status_code == 200, f"Failed to get token: {response.text}"
return response.json()["data"]["attributes"]["access"]
def get_scan_id_from_provider_name(host: str, token: str, provider_name: str) -> str:
"""
Retrieves the scan ID associated with a specific provider name.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
provider_name (str): Name of the provider to filter scans by.
Returns:
str: The ID of the scan.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
response = requests.get(
f"{host}/scans?fields[scans]=id&filter[provider_alias]={provider_name}",
headers=get_auth_headers(token),
)
assert response.status_code == 200, f"Failed to get scan: {response.text}"
return response.json()["data"][0]["id"]
def get_resource_filters_pairs(host: str, token: str, scan_id: str = "") -> dict:
"""
Retrieves and maps resource metadata filter values from the findings endpoint.
Args:
host (str): The host URL of the API.
token (str): Bearer token for authentication.
scan_id (str, optional): Optional scan ID to filter metadata. Defaults to using inserted_at timestamp.
Returns:
dict: A dictionary of resource filter metadata.
Raises:
AssertionError: If the request fails or does not return a 200 status code.
"""
metadata_filters = (
f"filter[scan]={scan_id}"
if scan_id
else f"filter[inserted_at]={TARGET_INSERTED_AT}"
)
response = requests.get(
f"{host}/findings/metadata?{metadata_filters}", headers=get_auth_headers(token)
)
assert (
response.status_code == 200
), f"Failed to get resource filters values: {response.text}"
attributes = response.json()["data"]["attributes"]
return {
FINDINGS_RESOURCE_METADATA[key]: values
for key, values in attributes.items()
if key in FINDINGS_RESOURCE_METADATA.keys()
}
def get_sort_value(sort_values: list) -> str:
"""
Constructs a sort query string from a list of sort keys.
Args:
sort_values (list): The list of sort values to include in the query.
Returns:
str: A formatted sort query string (e.g., "sort=created_at,-severity").
"""
return f"sort={','.join(sort_values)}"


@@ -1,6 +1,9 @@
#!/bin/bash
# Run Prowler against All AWS Accounts in an AWS Organization
# Activate Poetry Environment
eval "$(poetry env activate)"
# Show Prowler Version
prowler -v


@@ -399,7 +399,6 @@ mainConfig:
[
"RSA-1024",
"P-192",
"SHA-1",
]
# AWS EKS Configuration


@@ -16,7 +16,6 @@ spec:
containers:
- name: prowler
image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
command: ["prowler"]
args: ["kubernetes", "-z", "-b"]
imagePullPolicy: {{ .Values.image.pullPolicy }}
volumeMounts:


@@ -161,7 +161,7 @@ def update_nav_bar(pathname):
html.Span(
[
html.Img(src="assets/favicon.ico", className="w-5"),
"Subscribe to prowler SaaS",
"Subscribe to Prowler Cloud",
],
className="flex items-center gap-x-3 text-white",
),

Binary image file (68 KiB); preview not shown.


@@ -1361,6 +1361,9 @@ video {
.lg\:grid-cols-4 {
grid-template-columns: repeat(4, minmax(0, 1fr));
}
.lg\:grid-cols-5 {
grid-template-columns: repeat(5, minmax(0, 1fr));
}
.lg\:justify-normal {
justify-content: normal;
@@ -1403,4 +1406,4 @@ video {
.\32xl\:w-\[9\%\] {
width: 9%;
}
}
}


@@ -2228,13 +2228,356 @@ def get_section_containers_ens(data, section_1, section_2, section_3, section_4)
return html.Div(section_containers, className="compliance-data-layout")
def get_section_containers_3_levels(data, section_1, section_2, section_3):
data["STATUS"] = data["STATUS"].apply(map_status_to_icon)
findings_counts_marco = (
data.groupby([section_1, "STATUS"]).size().unstack(fill_value=0)
)
section_containers = []
data[section_1] = data[section_1].astype(str)
data[section_2] = data[section_2].astype(str)
data[section_3] = data[section_3].astype(str)
data.sort_values(
by=section_3,
key=lambda x: x.map(extract_numeric_values),
ascending=True,
inplace=True,
)
for marco in data[section_1].unique():
success_marco = findings_counts_marco.loc[marco].get(pass_emoji, 0)
failed_marco = findings_counts_marco.loc[marco].get(fail_emoji, 0)
fig_name = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_marco],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_marco],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_name.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_marco + failed_marco,
y=0,
xref="x",
yref="y",
text=str(success_marco),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_marco),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_name.add_annotation(
x=failed_marco,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div = html.Div(
dcc.Graph(
figure=fig_name, config={"staticPlot": True}, className="info-bar"
),
className="graph-section",
)
direct_internal_items = []
for categoria in data[data[section_1] == marco][section_2].unique():
specific_data = data[
(data[section_1] == marco) & (data[section_2] == categoria)
]
findings_counts_categoria = (
specific_data.groupby([section_2, "STATUS"])
.size()
.unstack(fill_value=0)
)
success_categoria = findings_counts_categoria.loc[categoria].get(
pass_emoji, 0
)
failed_categoria = findings_counts_categoria.loc[categoria].get(
fail_emoji, 0
)
fig_section = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_categoria],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_categoria],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_section.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_categoria + failed_categoria,
y=0,
xref="x",
yref="y",
text=str(success_categoria),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_categoria),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_section.add_annotation(
x=failed_categoria,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div_section = html.Div(
dcc.Graph(
figure=fig_section,
config={"staticPlot": True},
className="info-bar-child",
),
className="graph-section-req",
)
direct_internal_items_idgrupocontrol = []
for idgrupocontrol in specific_data[section_3].unique():
specific_data2 = specific_data[
specific_data[section_3] == idgrupocontrol
]
findings_counts_idgrupocontrol = (
specific_data2.groupby([section_3, "STATUS"])
.size()
.unstack(fill_value=0)
)
success_idgrupocontrol = findings_counts_idgrupocontrol.loc[
idgrupocontrol
].get(pass_emoji, 0)
failed_idgrupocontrol = findings_counts_idgrupocontrol.loc[
idgrupocontrol
].get(fail_emoji, 0)
fig_idgrupocontrol = go.Figure(
[
go.Bar(
name="Failed",
x=[failed_idgrupocontrol],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_idgrupocontrol],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_idgrupocontrol.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_idgrupocontrol + failed_idgrupocontrol,
y=0,
xref="x",
yref="y",
text=str(success_idgrupocontrol),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_idgrupocontrol),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
fig_idgrupocontrol.add_annotation(
x=failed_idgrupocontrol,
y=0.3,
text="|",
showarrow=False,
font=dict(size=20),
xanchor="center",
yanchor="middle",
)
graph_div_idgrupocontrol = html.Div(
dcc.Graph(
figure=fig_idgrupocontrol,
config={"staticPlot": True},
className="info-bar-child",
),
className="graph-section-req",
)
data_table = dash_table.DataTable(
data=specific_data2.to_dict("records"),
columns=[
{"name": i, "id": i}
for i in [
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
],
style_table={"overflowX": "auto"},
style_as_list_view=True,
style_cell={"textAlign": "left", "padding": "5px"},
)
internal_accordion_item_2 = dbc.AccordionItem(
title=idgrupocontrol,
children=[
graph_div_idgrupocontrol,
html.Div([data_table], className="inner-accordion-content"),
],
)
direct_internal_items_idgrupocontrol.append(
html.Div(
[
graph_div_idgrupocontrol,
dbc.Accordion(
[internal_accordion_item_2],
start_collapsed=True,
flush=True,
),
],
className="accordion-inner--child",
)
)
internal_accordion_item = dbc.AccordionItem(
title=categoria,
children=direct_internal_items_idgrupocontrol,
)
internal_section_container = html.Div(
[
graph_div_section,
dbc.Accordion(
[internal_accordion_item], start_collapsed=True, flush=True
),
],
className="accordion-inner--child",
)
direct_internal_items.append(internal_section_container)
accordion_item = dbc.AccordionItem(title=marco, children=direct_internal_items)
section_container = html.Div(
[
graph_div,
dbc.Accordion([accordion_item], start_collapsed=True, flush=True),
],
className="accordion-inner",
)
section_containers.append(section_container)
return html.Div(section_containers, className="compliance-data-layout")
# This function extracts up to three numeric values from a string so that version-like identifiers sort numerically.
def extract_numeric_values(value):
numbers = re.findall(r"\d+", str(value))
if len(numbers) >= 3:
return int(numbers[0]), int(numbers[1]), int(numbers[2])
if len(numbers) == 2:
return int(numbers[0]), int(numbers[1])
if len(numbers) == 1:
return int(numbers[0]), 0
return 0, 0
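For illustration, with hypothetical requirement identifiers the key sorts version-like strings numerically rather than lexicographically:
ids = ["1.10", "1.2", "2.1.3"]
sorted(ids)  # ['1.10', '1.2', '2.1.3'] (plain string sort puts 1.10 before 1.2)
sorted(ids, key=extract_numeric_values)  # ['1.2', '1.10', '2.1.3']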


@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_format2
from dashboard.common_methods import get_section_containers_3_levels
warnings.filterwarnings("ignore")
@@ -10,6 +10,7 @@ def get_table(data):
[
"REQUIREMENTS_ATTRIBUTES_NAME",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"CHECKID",
"STATUS",
"REGION",
@@ -17,6 +18,10 @@ def get_table(data):
"RESOURCEID",
]
]
return get_section_containers_format2(
aux, "REQUIREMENTS_ATTRIBUTES_NAME", "REQUIREMENTS_ATTRIBUTES_SECTION"
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"REQUIREMENTS_ATTRIBUTES_NAME",
)


@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_format2
from dashboard.common_methods import get_section_containers_3_levels
warnings.filterwarnings("ignore")
@@ -10,6 +10,7 @@ def get_table(data):
[
"REQUIREMENTS_ATTRIBUTES_NAME",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"CHECKID",
"STATUS",
"REGION",
@@ -18,6 +19,9 @@ def get_table(data):
]
]
return get_section_containers_format2(
aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ATTRIBUTES_NAME"
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"REQUIREMENTS_ATTRIBUTES_NAME",
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_kisa_ismsp
from dashboard.common_methods import get_section_containers_3_levels
warnings.filterwarnings("ignore")
@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
@@ -20,6 +20,9 @@ def get_table(data):
]
].copy()
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
)


@@ -1,6 +1,6 @@
import warnings
from dashboard.common_methods import get_section_containers_kisa_ismsp
from dashboard.common_methods import get_section_containers_3_levels
warnings.filterwarnings("ignore")
@@ -8,7 +8,7 @@ warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
@@ -20,6 +20,9 @@ def get_table(data):
]
].copy()
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_DOMAIN",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_format3
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_format3(
aux, "REQUIREMENTS_ATTRIBUTES_SECTION", "REQUIREMENTS_ID"
)


@@ -57,8 +57,9 @@ def create_layout_overview(
html.Div(className="flex", id="azure_card", n_clicks=0),
html.Div(className="flex", id="gcp_card", n_clicks=0),
html.Div(className="flex", id="k8s_card", n_clicks=0),
html.Div(className="flex", id="m365_card", n_clicks=0),
],
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-4",
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-5",
),
html.H4(
"Count of Findings by severity",
@@ -131,7 +132,7 @@ def create_layout_compliance(
html.A(
[
html.Img(src="assets/favicon.ico", className="w-5 mr-3"),
html.Span("Subscribe to prowler SaaS"),
html.Span("Subscribe to Prowler Cloud"),
],
href="https://prowler.pro/",
target="_blank",

Some files were not shown because too many files have changed in this diff.