Compare commits

...

256 Commits

Author SHA1 Message Date
Daniel Barranquero 6bc1eafd66 Merge branch 'PRWLR-5093-design-the-fixer-class' into update-fixers-docs 2025-07-15 12:34:36 +02:00
Daniel Barranquero e45e4ae0fe feat(docs): add snapshots 2025-07-15 12:32:58 +02:00
Daniel Barranquero ef4718d16c Merge branch 'master' into update-fixers-docs 2025-07-15 12:20:55 +02:00
Daniel Barranquero 933ba4c3be Merge branch 'master' into PRWLR-5093-design-the-fixer-class 2025-07-15 12:17:55 +02:00
Sergio Garcia 7da6d7b5dd chore(github): add test_connection function (#8248) 2025-07-15 17:01:40 +08:00
Víctor Fernández Poyatos db6a27d1f5 feat(resources): latest and metadata endpoints and performance (#8112) 2025-07-14 18:02:06 +02:00
Alejandro Bailo e07c833cab feat: SAML toast error (#8267)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-07-14 17:04:23 +02:00
Adrián Jesús Peña Rodríguez 728fc9d6ff fix(saml): remove user in case of error (#8260) 2025-07-14 14:07:27 +02:00
Daniel Barranquero 877471783e feat(docs): add new version of fixer docs 2025-07-14 13:57:45 +02:00
Prowler Bot cf9ff78605 chore(regions_update): Changes in regions for AWS services (#8263)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-07-14 19:45:38 +08:00
Daniel Barranquero 55e9695915 Merge branch 'master' into update-fixers-docs 2025-07-14 09:41:53 +02:00
Adrián Jesús Peña Rodríguez a2faf548af chore: update changelog (#8255) 2025-07-11 12:06:03 +02:00
Adrián Jesús Peña Rodríguez 8bcec4926b fix: set lxml version (#8253) 2025-07-11 11:43:31 +02:00
Hugo Pereira Brito a4e96f809b fix(docs): GitHub provider mkdocs and -h (#8246) 2025-07-11 16:32:15 +08:00
Adrián Jesús Peña Rodríguez fa27255dd7 chore(saml): redirect to login page on fail (#8247) 2025-07-11 09:22:38 +02:00
Pepe Fagoaga 05360e469f chore(bump): add no-changelog label (#8240) 2025-07-10 19:14:37 +08:00
Hugo Pereira Brito 9d405ddcbd fix: changelog entries with new specification (#8232) 2025-07-10 14:40:33 +05:45
Víctor Fernández Poyatos 430f831543 feat(exceptions): add custom error for provider connection during scans (#8234) 2025-07-10 14:13:19 +05:45
Pepe Fagoaga da9d7199b7 chore(changelog): add missing entry from the password policy (#8236) 2025-07-10 09:07:04 +02:00
Pepe Fagoaga d63a383ec6 feat(security): password strength (#8225)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-07-10 11:50:22 +05:45
Víctor Fernández Poyatos 55c226029e feat(resources): optimize include parameters for resources view (#8229) 2025-07-09 16:16:56 +02:00
Alejandro Bailo 8d2f6aa30c feat: Include/exclude muted findings (#8228) 2025-07-09 16:06:05 +02:00
Rubén De la Torre Vico a319f80701 feat(storage): add new check storage_smb_protocol_version_is_latest (#8128)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-09 17:28:00 +08:00
Adrián Jesús Peña Rodríguez 15a8671f0d feat(saml): prevent duplicate SAML entityID configuration (#8224) 2025-07-09 09:50:22 +02:00
Rubén De la Torre Vico d34e709d91 fix(azure/storage): use BaseModel for all Storage models (#8222) 2025-07-09 15:49:17 +08:00
Hugo Pereira Brito ddc53c3c6d fix(firehose): list all streams and fix firehose_stream_encrypted_at_rest logic (#8213) 2025-07-09 15:38:54 +08:00
Alejandro Bailo a3aef18cfe feat: Mutelist implementation (#8190)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: Drew Kerrigan <drew@prowler.com>
2025-07-09 08:15:23 +02:00
Alejandro Bailo 49ca3ca325 fix: SAML 403 message (#8221) 2025-07-09 08:10:14 +02:00
Drew Kerrigan 89c67079a3 feat: Processors API endpoint, implement MuteList (#7993)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-07-08 21:33:28 +05:45
Pepe Fagoaga 2de8075d87 fix(overview): use findings latest to get new (#8219) 2025-07-08 15:48:19 +02:00
Adrián Jesús Peña Rodríguez e124275dbf fix(saml): ensure SocialApp and SAMLDomainIndex are deleted with SAMLConfiguration (#8210) 2025-07-08 13:57:23 +02:00
Rubén De la Torre Vico 760d28e752 chore(deps): update dash libs (#8215) 2025-07-08 19:55:50 +08:00
Víctor Fernández Poyatos 3fb0733887 feat(tasks): create overview queue for summaries and overviews (#8214) 2025-07-08 13:53:23 +02:00
Pablo Lara 7de9a37edb fix(api): make invitation email comparison case-insensitive (#8206)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-07-08 16:39:27 +05:45
Pepe Fagoaga fe00b788cc fix: Remove type validation while updating provider credentials (#8197)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-07-08 15:27:02 +05:45
Rubén De la Torre Vico 4c50f4d811 feat(azure/vm): add new check vm_backup_enabled (#8182)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-08 17:01:22 +08:00
Rubén De la Torre Vico c0c736bffe chore: ignore some files from AI editors (#8209) 2025-07-08 10:43:38 +02:00
dependabot[bot] a3aa7d0a63 chore(deps): bump python from 3.12.10-slim-bookworm to 3.12.11-slim-bookworm (#8157)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-08 16:43:13 +08:00
Rubén De la Torre Vico 3ceb86c4d9 feat(azure/vm): add new check vm_scaleset_associated_load_balancer (#8181) 2025-07-08 16:40:43 +08:00
Rubén De la Torre Vico 3628e7b3e8 feat(azure/vm): add new check vm_ensure_using_approved_images (#8168) 2025-07-08 16:40:33 +08:00
Chandrapal Badshah f29c2ac9f0 docs(lighthouse): Add Lighthouse Docs (#8196)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-07-08 11:56:23 +05:45
Pablo Lara b4927c3ad1 chore: Update CHANGELOG UI (#8204) 2025-07-07 17:54:44 +02:00
Adrián Jesús Peña Rodríguez 19f3c1d310 chore(saml): restore SAML button (#8203) 2025-07-07 17:34:05 +02:00
Adrián Jesús Peña Rodríguez cd97e57521 fix(saml): restore SAML, deactivate urls, enable idp-initiate (#8175) 2025-07-07 16:42:11 +02:00
Hugo Pereira Brito b38207507a chore(docs): enhance M365 auth documentation (#8199)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-07 22:01:41 +08:00
Rubén De la Torre Vico ab96e0aac0 feat(azure/vm): add new check vm_linux_enforce_ssh_authentication (#8149) 2025-07-07 22:01:11 +08:00
Prowler Bot 4477cecc59 chore(regions_update): Changes in regions for AWS services (#8198)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-07-07 18:04:49 +08:00
Pablo Lara 641d671312 chore: upgrade to Next.js 14.2.30 and lock TypeScript to 5.5.4 for ES… (#8189) 2025-07-04 13:20:30 +02:00
Víctor Fernández Poyatos e7c2fa0699 fix(findings): avoid backfill on empty scans (#8183) 2025-07-04 12:24:49 +02:00
Pedro Martín 7eb08b0f14 fix(ec2): allow empty values for http_endpoint in templates (#8184) 2025-07-04 18:03:51 +08:00
Rubén De la Torre Vico 6f3112f754 feat(storage): add new check storage_smb_channel_encryption_with_secure_algorithm (#8123)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-04 15:26:33 +08:00
Kay Agahd f5ecae6da1 fix(iam): detect wildcarded ARNs in sts:AssumeRole policy resources (#8164)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-03 23:09:48 +08:00
Prowler Bot 1c75f6b804 chore(release): Bump version to v5.9.0 (#8178)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-07-03 23:08:37 +08:00
Daniel Barranquero 91b64d8572 chore(docs): update m365 docs for app auth in cloud (#8147)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-07-03 23:08:15 +08:00
Pablo Lara 233ae74560 fix: disable dynamic filters for now (#8177) 2025-07-03 14:17:02 +02:00
Alejandro Bailo fac97f9785 fix: remove duplicated calls during promise all resolving (#8176) 2025-07-03 14:02:57 +02:00
Pablo Lara e81c7a3893 fix: bug when updating credentials for m365 (#8173) 2025-07-03 11:31:40 +02:00
Adrián Jesús Peña Rodríguez d6f26df2e8 refactor(migrations): remove saml migrations (#8167)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-07-02 17:23:08 +02:00
Sergio Garcia ece74e15fd chore(sdk): update changelog (#8166) 2025-07-02 16:11:48 +02:00
sumit-tft eea6d07259 chore(ui): update capitalization of Sign In and Sign Up to match UI s… (#8136)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-07-02 16:01:29 +02:00
Víctor Fernández Poyatos 4a6d7a5be2 chore: bump API changelog to v5.8.0 (#8165) 2025-07-02 16:00:43 +02:00
Alejandro Bailo 883c5d4e56 feat: client side validation (#8161) 2025-07-02 15:43:20 +02:00
Adrián Jesús Peña Rodríguez f1f998c2fa chore: update spec (#8162) 2025-07-02 13:19:57 +02:00
Adrián Jesús Peña Rodríguez 5276e38f1d chore: disable SAML endpoints (#8160) 2025-07-02 12:51:57 +02:00
Pablo Lara ad98a4747f chore: Hide all SAML config for v5.8 (#8159) 2025-07-02 12:46:04 +02:00
Alejandro Bailo 5798321dc6 feat: saml e2e improvements (#8158) 2025-07-02 11:57:56 +02:00
dependabot[bot] bf58728d29 chore(deps-dev): bump brace-expansion from 1.1.11 to 1.1.12 in /ui (#8003)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 11:25:17 +02:00
Sergio Garcia fcea3b6570 docs(iac): add documentation for IaC (#8150)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-07-02 17:20:34 +08:00
Neil Millard 965111245a feat(aws): add new check for Codebuild projects visibility (#8127)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-07-02 17:20:15 +08:00
Rubén De la Torre Vico f78a29206c fix(azure): use Pydantic models in VM service and fix managed disk logic (#8151) 2025-07-02 16:23:51 +08:00
dependabot[bot] c719d705e0 chore(deps): bump trufflesecurity/trufflehog from 3.88.35 to 3.89.2 (#8156)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 15:36:10 +08:00
dependabot[bot] 8948ee6868 chore(deps): bump docker/setup-buildx-action from 3.10.0 to 3.11.1 (#8153)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 15:29:21 +08:00
dependabot[bot] 24fb31e98f chore(deps): bump github/codeql-action from 3.28.18 to 3.29.2 (#8155)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 14:24:12 +08:00
Adrián Jesús Peña Rodríguez c8b193e658 fix(saml): add user to SAML tenant (#8152) 2025-07-01 18:41:16 +02:00
Alejandro Bailo 6d27738c4d fix: HotFIX related with ACS SAML url (#8148) 2025-07-01 13:10:46 +02:00
Adrián Jesús Peña Rodríguez 17b7becfdf fix(saml): limit attributes length to satisfy the socialapp restriction (#8145) 2025-07-01 12:03:20 +02:00
Alejandro Bailo cfa7f271d2 fix: Minor changes detected while SAML E2E (#8146) 2025-07-01 11:50:47 +02:00
Pedro Martín e61a97cb65 fix(api): handle ISO27001 - M365 in exports (#8143) 2025-07-01 10:19:56 +02:00
Pablo Lara cd4a1ad8a7 chore: clarify M365 context due to credential changes (#8144)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-07-01 09:01:17 +02:00
Alejandro Bailo e650d19a30 feat: enhance getScans API to support fields and include parameters; … (#8140) 2025-07-01 08:13:48 +02:00
Pedro Martín f930739a3d fix(ui): remove typo from compliance detailed view (#8142) 2025-06-30 18:03:45 +02:00
Sergio Garcia 89fc698a0e fix(m365): handle none attribute in exchange transport rule (#8141) 2025-06-30 23:13:18 +08:00
Pablo Lara 6acb6bbf8e docs: update changelog (#8139) 2025-06-30 16:34:03 +02:00
Alejandro Bailo 971424f822 fix: ACS dynamic url and password input visible in sign up (#8131)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-06-30 16:17:34 +02:00
Adrián Jesús Peña Rodríguez 9ba1ae1ced restore: change api redirect (#8138) 2025-06-30 16:15:25 +02:00
dependabot[bot] 062db4cc70 chore(deps): bump protobuf from 6.30.2 to 6.31.1 in /api (#8053)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-30 21:05:10 +08:00
Pepe Fagoaga dc4db10c41 fix(version): only for master branch (#7850) 2025-06-30 16:50:32 +05:45
Rubén De la Torre Vico 68a542ef64 chore(CHANGELOG): put all checks entries in same format (#8134) 2025-06-30 16:50:12 +05:45
Hugo Pereira Brito 32f3787e18 feat(m365powershell): add pwsh authentication via service principal (#7992)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-30 18:42:18 +08:00
Víctor Fernández Poyatos 6792bea319 fix(compliance): Avoid initializing Prowler provider (#8133) 2025-06-30 12:14:03 +02:00
Prowler Bot ae4b43c137 chore(regions_update): Changes in regions for AWS services (#8132)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-30 17:53:21 +08:00
Rubén De la Torre Vico d576c4f1c4 docs(developer-guide): add configurable checks documentation (#8122) 2025-06-30 16:47:27 +08:00
Pablo Lara ddc0596aa2 chore: tweaks for SAML config in profile page (#8130) 2025-06-30 09:40:02 +02:00
Rubén De la Torre Vico 636bdb6d0a docs(prowler-app): add new auth method for GCP (#8129) 2025-06-30 15:21:03 +08:00
Alejandro Bailo 4a839b0146 feat: update SAML login URL handling and redirect logic (#8095)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-06-27 14:44:04 +02:00
Pablo Lara 73e244dce5 docs: update changelog (#8125) 2025-06-27 13:51:56 +02:00
Adrián Jesús Peña Rodríguez d8ed70236b refactor(s3): adapt test_connection to match AwsProvider (#8088) 2025-06-27 13:23:59 +02:00
Sergio Garcia bcc96ab4f2 fix(gcp): handle case sensitivity in block-project-ssh-keys (#8115)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-27 19:03:51 +08:00
Alejandro Bailo fd53a8c9d0 feat: Playright setup (#8107)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
Co-authored-by: César Arroba <19954079+cesararroba@users.noreply.github.com>
2025-06-27 11:47:21 +02:00
Daniel Barranquero 7b58d1dd56 fix: checks with no resource name (#8120) 2025-06-27 17:40:43 +08:00
Víctor Fernández Poyatos 7858c147f7 fix(spec): API specification (#8119) 2025-06-27 10:49:36 +02:00
Alejandro Bailo 8e635b3bd4 feat: saml sso ui integration (#8094) 2025-06-27 10:45:21 +02:00
Pedro Martín 2e97e37316 feat(dashboard): improve overview page (#8118) 2025-06-27 15:41:48 +08:00
Pedro Martín cd804836a1 docs(dev): add info about installing prowler for a branch (#8116) 2025-06-26 23:00:31 +08:00
Víctor Fernández Poyatos d102ee2fd5 chore: ignore Flask Safety alert in API (#8114) 2025-06-26 16:02:39 +02:00
Pedro Martín 325e5739a2 fix(compliance): handle latest assessment date for each account (#8108) 2025-06-26 17:48:35 +08:00
Sergio Garcia 98da3059b4 refactor(iac): import checkov python library (#8093) 2025-06-25 21:36:21 +08:00
Chandrapal Badshah 80fd5d1ba6 fix: update lighthouse chat page name (#8106)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-06-25 12:48:20 +02:00
Jack Holloway 85242c7909 fix(aws): retrieve correctly ECS Container insights settings (#8097)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-25 15:54:20 +08:00
Daniel Barranquero ea6ab406c8 fix(organizations): Key Error: Statement in check organizations_scp_deny_regions (#8091)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-25 15:23:38 +08:00
Rubén De la Torre Vico cbf2a28bac feat(azure): add new check keyvault_access_only_through_private_endpoints (#8072)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-24 22:04:02 +08:00
Adrián Jesús Peña Rodríguez 5b1e7bb7f9 fix(saml): avoid IndexError when some attributes are not specified (#8089) 2025-06-24 15:55:01 +02:00
crr e108b2caed fix(aws): fix logic in VPC and ELBv2 checks (#8077)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-24 19:13:54 +08:00
Rubén De la Torre Vico df1abb2152 feat(azure): add new check monitor_alert_service_health_exists (#8067)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-24 18:04:20 +08:00
Rubén De la Torre Vico e0465f2aa2 fix(azure): consolidate file share properties to the storage account level (#8087) 2025-06-24 17:37:05 +08:00
Daniel Barranquero 82ab20deec feat(compute): add tests for gcp fixer 2025-06-24 10:01:07 +02:00
Drew Kerrigan 51467767cd fix: allow raising exceptions from validate_mutelist (#8086) 2025-06-24 13:14:46 +05:45
Pablo Lara bc71e7fb3b chore: set filters panel to be always open by default (#8085) 2025-06-23 15:05:53 +02:00
sumit-tft 6a331c05e8 fix(ui): resolve accessibility warnings for Sheet and SVG elements (#8019)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-23 13:25:05 +02:00
César Arroba 7ab503a096 chore(gha): avoid comment on PRs for check-changelog workflow (#8084) 2025-06-23 13:17:03 +02:00
Daniel Barranquero d7e3b1c760 feat(gcp): working version of gcp fixer 2025-06-23 13:14:25 +02:00
César Arroba b368190c9f chore(gha): avoid comment on PRs for check-changelog workflow (#8083) 2025-06-23 19:13:13 +08:00
Víctor Fernández Poyatos 8915fdff18 fix(scan): set scheduler_task to null when removing periodic tasks (#8082) 2025-06-23 12:53:58 +02:00
Daniel Barranquero 166e07939d feat(gcp): add first version of gcp tests 2025-06-23 12:33:51 +02:00
Víctor Fernández Poyatos 9bf108e9cc tests(compliance): add performance tests for compliance (#8073) 2025-06-23 12:09:30 +02:00
Prowler Bot 87708e39cf chore(regions_update): Changes in regions for AWS services (#8079)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-23 17:54:27 +08:00
César Arroba 44927c44e9 chore(gha): add permissions on check-changelog workflow (#8080) 2025-06-23 11:49:48 +02:00
dependabot[bot] 71aa29cf24 chore(deps): bump urllib3 from 1.26.20 to 2.5.0 (#8063)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-23 17:49:20 +08:00
Víctor Fernández Poyatos aa14daf0db fix(schema): API reference documentation (#8078) 2025-06-23 11:04:25 +02:00
Daniel Barranquero c5cf1c4bfb merge branch 'master' into PRWLR-5093-design-the-fixer-class 2025-06-23 10:20:06 +02:00
Daniel Barranquero eb5dbab86e feat(docs): update Azure and M365 docs with needed permissions (#8075) 2025-06-23 10:12:11 +02:00
Víctor Fernández Poyatos 223aab8ece chore(API): skip safety vulnerabilities related to asteval (#8076) 2025-06-20 14:28:23 +02:00
César Arroba 3ec57340a0 chore(gha): check changelog when label is added or deleted (#8071) 2025-06-20 16:35:19 +05:45
Pablo Lara 80d73cc05b feat: integrate Google Tag Manager manually to avoid ORB blocking (#8070) 2025-06-20 12:47:17 +02:00
César Arroba 94f02df11e chore(gha): check changelog changes on pull request (#7991)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-19 14:51:59 +05:45
Pepe Fagoaga c454ceb296 fix(changelog): Add missing entries (#8066) 2025-06-19 14:12:39 +05:45
Pepe Fagoaga 76ec13a1d6 chore(ocsf): remove version number and point to the latest (#8064) 2025-06-19 13:33:28 +05:45
Pepe Fagoaga 783b6ea982 chore(api): clean up old files (#8051) 2025-06-19 11:57:48 +05:45
Alejandro Bailo 6b7b700a98 feat: filters relationships in findings and scans page (#8046)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-18 17:19:41 +02:00
César Arroba b3f2a1c532 chore(ui): add NEXT_PUBLIC_GOOGLE_TAG_MANAGER_ID variable on Dockerfile (#8061) 2025-06-18 16:31:55 +02:00
Sergio Garcia c4e1bd3ed2 fix: add missing changelog compliance timestamps (#8060) 2025-06-18 16:28:48 +02:00
Sergio Garcia d0d4e0d483 fix(compliance): use unified timestampt for all requirements (#8052) 2025-06-18 22:00:51 +08:00
Pablo Lara 14a9f0e765 feat: add Google Tag Manager integration (#8058) 2025-06-18 15:47:48 +02:00
Rubén De la Torre Vico b572575c8d feat(azure): add new check iam_role_user_access_admin_restricted (#8040)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-18 21:24:23 +08:00
Rubén De la Torre Vico a626e41162 docs: add provider-specific developer guide sections (#7996)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-18 21:20:33 +08:00
Hugo Pereira Brito 22343faa1e feat(storage): add new check storage_default_to_entra_authorization_enabled (#7981) 2025-06-18 21:16:07 +08:00
Hugo Pereira Brito c5b37887ef chore: add pr to changelog (#8054) 2025-06-18 14:32:21 +02:00
Rubén De la Torre Vico f9aed36d0b feat(azure): add new check databricks_workspace_cmk_encryption_enabled (#8017) 2025-06-18 18:36:37 +08:00
Hugo Pereira Brito facc0627d7 feat(azure): add new check storage_geo_redundant_enabled (#7980)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 18:10:02 +08:00
Rubén De la Torre Vico 76f0d890e9 feat(azure): add Databricks service and check for workspace VNet injection (#8008)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-18 17:38:09 +08:00
Hugo Pereira Brito 7de7122c3b fix(m365): avoid user requests in setup_identity app context and user auth log enhancement (#8043) 2025-06-18 11:27:11 +02:00
Hugo Pereira Brito 1b73ab2fe4 feat(storage): add new check storage_cross_tenant_replication_disabled (#7977)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 15:54:13 +08:00
Rubén De la Torre Vico cc8f6131e6 feat(azure): add new check storage_blob_versioning_is_enabled (#7927)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 15:46:38 +08:00
Andoni Alonso dfd5c9aee7 feat(aws): add check to ensure Codebuild Github projects are only use allowed Github orgs (#7595)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-18 00:17:18 +08:00
dependabot[bot] 3986bf3f42 chore(deps): bump asteval from 1.0.5 to 1.0.6 (#8049)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-18 00:11:18 +08:00
Sergio Garcia c45ef1e286 chore(deps): update requests dependency (#8048) 2025-06-18 00:04:09 +08:00
dependabot[bot] 8d8f498dc2 chore(deps): bump asteval from 1.0.5 to 1.0.6 (#8047)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-17 23:32:13 +08:00
Sergio Garcia c4bd9122d4 feat(IaC): PoC for IaC Security Scanner (#7852)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-17 23:23:25 +08:00
dependabot[bot] 644cdc81b9 chore(deps): bump requests from 2.32.3 to 2.32.4 in /api (#7986)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-17 16:46:29 +02:00
Pablo Lara e5584f21b3 feat: make user and password fields optional but mutually required fo… (#8044) 2025-06-17 14:46:00 +02:00
Rubén De la Torre Vico b868d39bef chore(deps): add pre-commit as a dev dependency (#8042) 2025-06-17 18:54:32 +08:00
Alejandro Bailo ef9809f61f fix: correct parenthesis around the render condition (#8041) 2025-06-17 12:22:17 +02:00
Alejandro Bailo 9a04ca3611 feat: touching up compliances views (#8022)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-17 11:23:14 +02:00
Pedro Martín 1c9b3a1394 feat(m365): add ISO 27001 2022 compliance framework (#7985)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-17 17:04:36 +08:00
dependabot[bot] 5ee7bd6459 chore(deps): bump protobuf from 6.30.2 to 6.31.1 (#8037)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-17 16:31:04 +08:00
Chandrapal Badshah 05d2b86ba8 feat(lighthouse): update NextJS logic to work with latest APIs (#8033)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-06-17 10:25:37 +02:00
Andoni Alonso 84c30af6f8 chore(sentry): handle exceptions ignores not based in ClassNames (#8034) 2025-06-17 09:42:24 +02:00
dcanotrad e8a829b75e docs(dev-guide): improve quality redrive (#7718)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2025-06-17 09:28:22 +02:00
Sergio Garcia a0d169470d chore(metadata): add validator for ResourceType (#8035) 2025-06-17 00:06:32 +08:00
Rubén De la Torre Vico 1fd6046511 chore: add missing init file to check repository_secret_scanning_enabled (#8029) 2025-06-16 21:31:18 +08:00
Sergio Garcia 524455b0f3 fix(metadata): add missing ResourceType values (#8028)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-16 21:30:55 +08:00
Víctor Fernández Poyatos e6e1e37c1e fix(findings): exclude blank resource types from metadata endpoints (#8027) 2025-06-16 18:19:21 +05:45
Prowler Bot 2914510735 chore(regions_update): Changes in regions for AWS services (#8026)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-16 19:00:06 +08:00
Rubén De la Torre Vico 7e43c7797f fix(eks): add EKS to service without subservices (#7959)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-16 16:46:48 +08:00
Rubén De la Torre Vico 6954ef880e fix(azure): add new way to auth against App Insight (#7763) 2025-06-16 16:46:36 +08:00
Chandrapal Badshah 5f5e7015a9 feat(lighthouse): Add django endpoints to store config (#7848)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Víctor Fernández Poyatos <vicferpoy@gmail.com>
2025-06-16 10:11:57 +02:00
Andoni Alonso bfafa518b1 feat(aws): avoid bypassing IAM check using wildcards (#7708) 2025-06-16 07:42:01 +02:00
Hugo Pereira Brito e34e59ff2d fix(network): allow 0 as compliant value (#7926)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-13 19:50:19 +08:00
Daniel Barranquero 7f80d2db46 fix(app): change api call for ftps_state (#7923) 2025-06-13 19:28:55 +08:00
sumit-tft 4a2a3921da feat(UI): Add Provider detail component in Findings, Scan details (#7968)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-13 12:17:18 +02:00
Pedro Martín e26b2e6527 feat(api): handle MitreAttack compliance requirements (#7987)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-13 10:26:34 +02:00
Mitchell @ Securemetrics 954814c1d7 feat(contrib): add PowerBI integration (#7826)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2025-06-13 09:55:07 +02:00
Andoni Alonso 113224cbd9 chore: update CHANGELOG (#8015) 2025-06-13 15:38:56 +08:00
Andoni Alonso f5f1fce779 fix(iam): check always if root credentials are present (#7967) 2025-06-12 17:48:09 +02:00
Pepe Fagoaga 0ba9383202 chore(changelog): make all consistent (#8010)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-12 20:09:01 +05:45
Adrián Jesús Peña Rodríguez 8e9a9797c7 fix(export): add name sanitization (#8007) 2025-06-12 20:02:18 +05:45
Pablo Lara 2b4e6bffae chore: update package-lock after lighthouse was merged (#8011) 2025-06-12 15:32:58 +02:00
Chandrapal Badshah 74f7a86c2b feat(lighthouse): Add chat interface (#7878)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-06-12 15:19:41 +02:00
Pablo Lara e218435b2f fix: improve error handling in UpdateViaCredentialsForm with early re… (#7988) 2025-06-12 11:39:49 +02:00
Prowler Bot 5ec34ad5e7 chore(regions_update): Changes in regions for AWS services (#7973)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-06-12 17:24:15 +08:00
Pedro Martín c4b0859efd fix(dashboard): handle account uids with 0 at start and end (#7955)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-12 17:21:52 +08:00
Pedro Martín 1241a490f9 fix(kubernetes): change object type to set for apiserver check (#7952) 2025-06-12 17:02:48 +08:00
Pedro Martín 4ec498a612 fix(k8s): remove typo for PCI 4.0 compliance framework (#7971) 2025-06-12 16:41:58 +08:00
Pedro Martín 119c5e80a9 feat(gcp): add NIS 2 compliance framework (#7912)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-06-12 16:40:33 +08:00
sumit-tft d393bc48a2 fix(PRWLR-7380): button nesting hydration error (#7998) 2025-06-12 10:02:20 +02:00
Daniel Barranquero e09e3855b1 fix(gcp): remove azure video from gcp docs (#8001) 2025-06-12 09:54:25 +02:00
Alejandro Bailo 8751615faa feat: MittreAtack compliance detailed view (#8002) 2025-06-12 09:27:47 +02:00
Prowler Bot e7c17ab0b3 chore(regions_update): Changes in regions for AWS services (#7898)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-06-12 15:14:28 +08:00
dependabot[bot] f05d3eb334 chore(deps): bump trufflesecurity/trufflehog from 3.88.26 to 3.88.35 (#7896)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-12 15:14:02 +08:00
dependabot[bot] cf449d4607 chore(deps): bump aws-actions/configure-aws-credentials from 4.1.0 to 4.2.1 (#7895)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-12 15:13:35 +08:00
dependabot[bot] b338ac9add chore(deps): bump codecov/codecov-action from 5.4.2 to 5.4.3 (#7894)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-12 15:13:12 +08:00
dependabot[bot] 366d2b392a chore(deps): bump docker/build-push-action from 6.16.0 to 6.18.0 (#7893)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-12 15:12:52 +08:00
dependabot[bot] 41fc536b44 chore(deps): bump github/codeql-action from 3.28.16 to 3.28.18 (#7892)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-12 15:12:28 +08:00
Adrián Jesús Peña Rodríguez e042445ecf fix(migration): create site stuff before socialaccount (#7999) 2025-06-11 13:34:21 +02:00
Daniel Barranquero 09b33d05a3 fix vulture 2025-06-11 12:52:36 +02:00
Daniel Barranquero 6a7cfd175c chore(tests): improve tests 2025-06-11 12:47:42 +02:00
Víctor Fernández Poyatos c17129afe3 revert: RLS transactions handling and DB custom backend (#7994) 2025-06-11 14:47:10 +05:45
Daniel Barranquero 82543c0d63 fix(azure): change azure fixer tests 2025-06-11 09:45:32 +02:00
Alejandro Bailo 4876d8435c feat: generic compliance detailed view (#7990) 2025-06-11 09:40:53 +02:00
Pedro Martín 1bd0d774e5 feat(mutelist): make validate_mutelist method static (#7811) 2025-06-11 11:33:49 +05:45
Daniel Barranquero 7360395263 feat(tests): add tests for new fixers 2025-06-10 19:15:16 +02:00
Daniel Barranquero 4ae790ee73 fix: tests with function apps 2025-06-10 11:17:54 +02:00
Alejandro Bailo c119cece89 feat: ThreatScore compliance detailed view (#7979) 2025-06-10 10:43:27 +02:00
Adrián Jesús Peña Rodríguez e24b211d22 feat(sso): add sso with saml to API (#7822) 2025-06-10 10:17:54 +02:00
Hugo Pereira Brito c589c95727 feat(storage): add new check storage_account_key_access_disabled (#7974)
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2025-06-10 08:23:09 +02:00
Hugo Pereira Brito 7e4f1a73bf feat(storage): add new check storage_ensure_file_shares_soft_delete_is_enabled (#7966) 2025-06-10 08:09:11 +02:00
Pepe Fagoaga 4d00aece45 chore(changelog): move entry for their version (#7969) 2025-06-09 21:50:13 +05:45
Hugo Pereira Brito 49aaf011aa fix(parser): add GitHub provider to prowler -h usage section (#7906) 2025-06-09 17:47:29 +02:00
Adrián Jesús Peña Rodríguez 898934c7f8 chore: update django version (#7984) 2025-06-09 17:33:16 +02:00
Pepe Fagoaga 81c4b5a9c1 chore(api): Delete old docker compose file (#7982) 2025-06-09 21:01:52 +05:45
Daniel Barranquero 7a2d3db082 chore(app): fix app service tests 2025-06-09 16:35:11 +02:00
Pepe Fagoaga fe31656ffe fix(k8s): return a session if using kubeconfig_content (#7953) 2025-06-09 19:11:59 +05:45
Daniel Barranquero 40934d34b2 fix: flake8 2025-06-09 13:38:16 +02:00
Daniel Barranquero 5c93372210 chore(tests): add tests for azure and m365 fixers 2025-06-09 13:32:47 +02:00
Hugo Pereira Brito 359059dee6 fix(docs): add Organization.Read.All to M365 provider requirements (#7961) 2025-06-09 12:11:14 +02:00
Daniel Barranquero ffcc516f00 chore(kms): modify fixer test 2025-06-09 11:30:44 +02:00
Alejandro Bailo 2eaa37921d feat: KISA detailed view (#7965) 2025-06-09 09:29:34 +02:00
Pablo Lara 3a99909b75 chore: align Next.js version to 14.2.29 across Prowler and Cloud (#7962) 2025-06-06 13:54:42 +02:00
Pablo Lara 2ecd9ad2c5 docs: update changelog (#7960) 2025-06-06 13:17:38 +02:00
Alejandro Bailo 50dc396aa3 feat: scan id filter drowpdown (#7949)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-06-06 12:38:14 +02:00
Andoni Alonso acf333493a chore(api): reorder docker layers to speed up build times (#7957) 2025-06-06 10:42:14 +02:00
Pedro Martín bd6272f5a7 feat(docs): add information about tenants and read-only roles (#7956) 2025-06-06 10:14:33 +02:00
Pepe Fagoaga 8c95e1efaf chore: update API changelog for v5.7.3 (#7948) 2025-06-05 15:54:36 +02:00
Hugo Pereira Brito 845a0aa0d5 fix(changelog): add entries for password encryption in v5.7.3 (#7939)
Co-authored-by: Víctor Fernández Poyatos <victor@prowler.com>
2025-06-05 14:23:12 +02:00
Hugo Pereira Brito 75a11be9e6 fix(docs): add final permission assignments example (#7943) 2025-06-05 18:07:43 +05:45
Hugo Pereira Brito a778d005b6 fix(docs): add mfa warning for users (#7924)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-06-05 17:55:27 +05:45
Pedro Martín 1281f4ec5e chore(changelog): update following the correct format (#7908) 2025-06-05 17:52:36 +05:45
Daniel Barranquero 9d4094e19e fix: remove unnecessary changes 2025-06-04 13:21:25 +02:00
Daniel Barranquero 00e491415f chore(app): new version of the fixer 2025-06-04 12:51:39 +02:00
Daniel Barranquero e17cbed4b3 Merge branch 'PRWLR-7353-fix-app-function-ftps-deployment-disabled-check' into PRWLR-5093-design-the-fixer-class 2025-06-04 12:34:01 +02:00
Daniel Barranquero d1e41f16ef fix: solve comments 2025-06-04 12:32:32 +02:00
Daniel Barranquero a17c3f94fc chore(azure): add permissions to azure fixer info 2025-06-04 10:48:55 +02:00
Daniel Barranquero 70f8232747 Merge branch 'PRWLR-7353-fix-app-function-ftps-deployment-disabled-check' into PRWLR-5093-design-the-fixer-class 2025-06-04 09:47:06 +02:00
Daniel Barranquero 31189f0d11 chore(app): mantain none by default 2025-06-04 09:43:17 +02:00
Daniel Barranquero 5aaf6e4858 feat(app): add changelog 2025-06-04 09:29:14 +02:00
Daniel Barranquero e05cc4cfab fix(app): change api call for app function ftps check 2025-06-03 17:57:52 +02:00
Daniel Barranquero 18a6f29593 feat(gcp): add first version of gcp fixers 2025-06-03 17:40:05 +02:00
Daniel Barranquero fc826da50c chore(azure): add changes to azure fixers 2025-06-03 17:38:09 +02:00
Daniel Barranquero b30ee077da merge branch 'master' into PRWLR-5093-design-the-fixer-class 2025-06-02 10:38:00 +02:00
Daniel Barranquero efdd967763 feat(fixers): add first version of azure fixers 2025-05-23 11:07:38 +02:00
Daniel Barranquero ee146cd43e feat(m365): add first fixer for m365 2025-05-21 14:00:34 +02:00
Daniel Barranquero f40aea757e feat(fixers): add first version of M365 fixers 2025-05-21 10:24:59 +02:00
Daniel Barranquero 7db24f8cb7 Merge branch 'master' into PRWLR-5093-design-the-fixer-class 2025-05-20 13:14:59 +02:00
Daniel Barranquero f78e5c9e33 feat(fixers): change classes structure 2025-05-20 09:41:14 +02:00
Daniel Barranquero d91bbe1ef4 feat(fixer): add fixing and modify errors from the v1 2025-05-15 16:40:09 +02:00
Daniel Barranquero c0d211492e feat(fixer): add poc for Fixer class 2025-05-15 13:57:29 +02:00
783 changed files with 48321 additions and 13171 deletions
+14 -1
@@ -6,11 +6,15 @@
PROWLER_UI_VERSION="stable"
AUTH_URL=http://localhost:3000
API_BASE_URL=http://prowler-api:8080/api/v1
NEXT_PUBLIC_API_BASE_URL=${API_BASE_URL}
NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
AUTH_TRUST_HOST=true
UI_PORT=3000
# openssl rand -base64 32
AUTH_SECRET="N/c6mnaS5+SWq81+819OrzQZlmx1Vxtp/orjttJSmw8="
# Google Tag Manager ID
NEXT_PUBLIC_GOOGLE_TAG_MANAGER_ID=""
#### Prowler API Configuration ####
PROWLER_API_VERSION="stable"
@@ -127,7 +131,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.6.0
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.7.5
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
@@ -137,3 +141,12 @@ SOCIAL_GOOGLE_OAUTH_CLIENT_SECRET=""
SOCIAL_GITHUB_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/github"
SOCIAL_GITHUB_OAUTH_CLIENT_ID=""
SOCIAL_GITHUB_OAUTH_CLIENT_SECRET=""
# Single Sign-On (SSO)
SAML_SSO_CALLBACK_URL="${AUTH_URL}/api/auth/callback/saml"
# Lighthouse tracing
LANGSMITH_TRACING=false
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_API_KEY=""
LANGCHAIN_PROJECT=""
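The AUTH_SECRET above is only a sample; as the inline "# openssl rand -base64 32" comment notes, a fresh value comes from openssl. A shell sketch for rotating it in place (assumptions: GNU sed, and that this env file is named .env — the filename is not captured in this view):

NEW_SECRET="$(openssl rand -base64 32)"
# Safe to splice with sed: the base64 alphabet (A-Z a-z 0-9 + / =) contains no
# "|" delimiter, "&", or backslash, so the replacement needs no extra escaping.
sed -i "s|^AUTH_SECRET=.*|AUTH_SECRET=\"${NEW_SECRET}\"|" .env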
+5
@@ -27,6 +27,11 @@ provider/github:
- any-glob-to-any-file: "prowler/providers/github/**"
- any-glob-to-any-file: "tests/providers/github/**"
provider/iac:
- changed-files:
- any-glob-to-any-file: "prowler/providers/iac/**"
- any-glob-to-any-file: "tests/providers/iac/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
@@ -76,12 +76,12 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.WORKING_DIRECTORY }}
# Set push: false for testing
@@ -94,7 +94,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.WORKING_DIRECTORY }}
push: true
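The bumps above follow this repo's pin-by-SHA convention: each "uses:" line pins a full commit SHA and keeps the human-readable tag in a trailing comment. One way to confirm a pinned SHA still matches its advertised tag (a sketch; any of the bumped actions can be checked the same way):

# Lists the tag ref; for annotated tags the "^{}" line carries the commit SHA
# that the pin in the workflow should match.
git ls-remote https://github.com/docker/build-push-action v6.18.0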
+2 -2
@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+25 -10
@@ -28,6 +28,10 @@ env:
VALKEY_DB: 0
API_WORKING_DIR: ./api
IMAGE_NAME: prowler-api
IGNORE_FILES: |
api/docs/**
api/README.md
api/CHANGELOG.md
jobs:
test:
@@ -78,12 +82,7 @@ jobs:
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: |
api/.github/**
api/docs/**
api/permissions/**
api/README.md
api/mkdocs.yml
files_ignore: ${{ env.IGNORE_FILES }}
- name: Replace @master with current branch in pyproject.toml
working-directory: ./api
@@ -113,6 +112,12 @@ jobs:
python-version: ${{ matrix.python-version }}
cache: "poetry"
- name: Install system dependencies for xmlsec
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
sudo apt-get update
sudo apt-get install -y libxml2-dev libxmlsec1-dev libxmlsec1-openssl pkg-config
- name: Install dependencies
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -158,8 +163,9 @@ jobs:
- name: Safety
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
# 76352, 76353, 77323 come from SDK, but they cannot upgrade it yet. It does not affect API
run: |
poetry run safety check --ignore 70612,66963,74429
poetry run safety check --ignore 70612,66963,74429,76352,76353,77323
- name: Vulture
working-directory: ./api
@@ -181,7 +187,7 @@ jobs:
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -190,10 +196,19 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: api/**
files_ignore: ${{ env.IGNORE_FILES }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.API_WORKING_DIR }}
push: false
+1 -1
@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@b06f6d72a3791308bb7ba59c2b8cb7a083bd17e4 # v3.88.26
uses: trufflesecurity/trufflehog@6641d4ba5b684fffe195b9820345de1bf19f3181 # v3.89.2
with:
path: ./
base: ${{ github.event.repository.default_branch }}
@@ -0,0 +1,86 @@
name: Check Changelog
on:
pull_request:
types: [opened, synchronize, reopened, labeled, unlabeled]
jobs:
check-changelog:
if: contains(github.event.pull_request.labels.*.name, 'no-changelog') == false
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read
pull-requests: write
env:
MONITORED_FOLDERS: "api ui prowler"
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- name: Get list of changed files
id: changed_files
run: |
git fetch origin ${{ github.base_ref }}
git diff --name-only origin/${{ github.base_ref }}...HEAD > changed_files.txt
cat changed_files.txt
- name: Check for folder changes and changelog presence
id: check_folders
run: |
missing_changelogs=""
for folder in $MONITORED_FOLDERS; do
if grep -q "^${folder}/" changed_files.txt; then
echo "Detected changes in ${folder}/"
if ! grep -q "^${folder}/CHANGELOG.md$" changed_files.txt; then
echo "No changelog update found for ${folder}/"
missing_changelogs="${missing_changelogs}- \`${folder}\`\n"
fi
fi
done
echo "missing_changelogs<<EOF" >> $GITHUB_OUTPUT
echo -e "${missing_changelogs}" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
- name: Find existing changelog comment
if: github.event.pull_request.head.repo.full_name == github.repository
id: find_comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e #v3.1.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-includes: '<!-- changelog-check -->'
- name: Comment on PR if changelog is missing
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs != ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
⚠️ **Changes detected in the following folders without a corresponding update to the `CHANGELOG.md`:**
${{ steps.check_folders.outputs.missing_changelogs }}
Please add an entry to the corresponding `CHANGELOG.md` file to maintain a clear history of changes.
- name: Comment on PR if all changelogs are present
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs == ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
✅ All necessary `CHANGELOG.md` files have been updated. Great job! 🎉
- name: Fail if changelog is missing
if: steps.check_folders.outputs.missing_changelogs != ''
run: |
echo "ERROR: Missing changelog updates in some folders."
exit 1
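The changelog gate above can be reproduced locally before pushing with plain git and grep (a sketch mirroring the workflow's steps; it assumes origin/master as the base branch and the same MONITORED_FOLDERS list, and exits on the first missing entry instead of collecting them all):

git fetch origin master
git diff --name-only origin/master...HEAD > changed_files.txt
for folder in api ui prowler; do
  if grep -q "^${folder}/" changed_files.txt; then
    # Any change under this folder must come with a CHANGELOG.md update.
    grep -q "^${folder}/CHANGELOG.md$" changed_files.txt \
      || { echo "No changelog update found for ${folder}/"; exit 1; }
  fi
done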
@@ -123,11 +123,11 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
push: true
tags: |
@@ -140,7 +140,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context
+2
@@ -97,6 +97,7 @@ jobs:
commit-message: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
branch: "version-bump-to-v${{ env.BUMP_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.BUMP_VERSION_TO }}"
labels: no-changelog
body: |
### Description
@@ -135,6 +136,7 @@ jobs:
commit-message: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
branch: "version-bump-to-v${{ env.PATCH_VERSION_TO }}"
title: "chore(release): Bump version to v${{ env.PATCH_VERSION_TO }}"
labels: no-changelog
body: |
### Description
+2 -2
@@ -56,12 +56,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+16 -1
@@ -212,6 +212,21 @@ jobs:
run: |
poetry run pytest -n auto --cov=./prowler/providers/m365 --cov-report=xml:m365_coverage.xml tests/providers/m365
# Test IaC
- name: IaC - Check if any file has changed
id: iac-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/iac/**
./tests/providers/iac/**
.poetry.lock
- name: IaC - Test
if: steps.iac-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
# Common Tests
- name: Lib - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
@@ -226,7 +241,7 @@ jobs:
# Codecov
- name: Upload coverage reports to Codecov
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: codecov/codecov-action@ad3126e916f78f00edff4ed0317cf185271ccc2d # v5.4.2
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -38,7 +38,7 @@ jobs:
pip install boto3
- name: Configure AWS Credentials -- DEV
uses: aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722 # v4.1.0
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
with:
aws-region: ${{ env.AWS_REGION_DEV }}
role-to-assume: ${{ secrets.DEV_IAM_ROLE_ARN }}
@@ -30,6 +30,7 @@ env:
# Container Registries
PROWLERCLOUD_DOCKERHUB_REPOSITORY: prowlercloud
PROWLERCLOUD_DOCKERHUB_IMAGE: prowler-ui
NEXT_PUBLIC_API_BASE_URL: http://prowler-api:8080/api/v1
jobs:
repository-check:
@@ -76,16 +77,17 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Build and push container image (latest)
# Comment the following line for testing
if: github.event_name == 'push'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=${{ env.SHORT_SHA }}
NEXT_PUBLIC_API_BASE_URL=${{ env.NEXT_PUBLIC_API_BASE_URL }}
# Set push: false for testing
push: true
tags: |
@@ -96,11 +98,12 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.WORKING_DIRECTORY }}
build-args: |
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v${{ env.RELEASE_TAG }}
NEXT_PUBLIC_API_BASE_URL=${{ env.NEXT_PUBLIC_API_BASE_URL }}
push: true
tags: |
${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}:${{ env.RELEASE_TAG }}
+2 -2
@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@28deaeda66b76a05916b6923827895f2b14ab387 # v3.28.16
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+52 -3
@@ -34,23 +34,72 @@ jobs:
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
cache-dependency-path: './ui/package-lock.json'
- name: Install dependencies
working-directory: ./ui
run: npm install
run: npm ci
- name: Run Healthcheck
working-directory: ./ui
run: npm run healthcheck
- name: Build the application
working-directory: ./ui
run: npm run build
e2e-tests:
runs-on: ubuntu-latest
env:
AUTH_SECRET: 'fallback-ci-secret-for-testing'
AUTH_TRUST_HOST: true
NEXTAUTH_URL: http://localhost:3000
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
persist-credentials: false
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: '20.x'
cache: 'npm'
cache-dependency-path: './ui/package-lock.json'
- name: Install dependencies
working-directory: ./ui
run: npm ci
- name: Cache Playwright browsers
uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
id: playwright-cache
with:
path: ~/.cache/ms-playwright
key: ${{ runner.os }}-playwright-${{ hashFiles('ui/package-lock.json') }}
restore-keys: |
${{ runner.os }}-playwright-
- name: Install Playwright browsers
working-directory: ./ui
if: steps.playwright-cache.outputs.cache-hit != 'true'
run: npm run test:e2e:install
- name: Build the application
working-directory: ./ui
run: npm run build
- name: Run Playwright tests
working-directory: ./ui
run: npm run test:e2e
- name: Upload Playwright report
uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b # v4.5.0
if: failure()
with:
name: playwright-report
path: ui/playwright-report/
retention-days: 30
test-container-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Build Container
uses: docker/build-push-action@14487ce63c7a62a4a324b0bfb37086795e31c6c1 # v6.16.0
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: ${{ env.UI_WORKING_DIR }}
# Always build using `prod` target
+10
@@ -44,6 +44,16 @@ junit-reports/
# Cursor files
.cursorignore
.cursor/
# RooCode files
.roo/
.rooignore
.roomodes
# Cline files
.cline/
.clineignore
# Terraform
.terraform*
+1 -1
@@ -115,7 +115,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 70612,66963,74429'
entry: bash -c 'safety check --ignore 70612,66963,74429,76352,76353'
language: system
- id: vulture
+3 -6
@@ -1,4 +1,4 @@
FROM python:3.12.10-slim-bookworm AS build
FROM python:3.12.11-slim-bookworm AS build
LABEL maintainer="https://github.com/prowler-cloud/prowler"
LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
@@ -6,7 +6,8 @@ LABEL org.opencontainers.image.source="https://github.com/prowler-cloud/prowler"
ARG POWERSHELL_VERSION=7.5.0
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
RUN apt-get update && apt-get install -y --no-install-recommends \
wget libicu72 libunwind8 libssl3 libcurl4 ca-certificates apt-transport-https gnupg \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
@@ -46,10 +47,6 @@ ENV PATH="${HOME}/.local/bin:${PATH}"
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
# By default poetry does not compile Python source files to bytecode during installation.
# This speeds up the installation process, but the first execution may take a little more
# time because Python then compiles source files to bytecode automatically. If you want to
# compile source files to bytecode during installation, you can use the --compile option
RUN poetry install --compile && \
rm -rf ~/.cache/pip
+2 -2
@@ -87,11 +87,11 @@ prowler dashboard
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 567 | 82 | 36 | 10 |
| GCP | 79 | 13 | 9 | 3 |
| GCP | 79 | 13 | 10 | 3 |
| Azure | 142 | 18 | 10 | 3 |
| Kubernetes | 83 | 7 | 5 | 7 |
| GitHub | 16 | 2 | 1 | 0 |
| M365 | 69 | 7 | 2 | 2 |
| M365 | 69 | 7 | 3 | 2 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> [!Note]
-168
@@ -1,168 +0,0 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.pyc
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
/_data/
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
*.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/
# VSCode
.vscode/
-91
@@ -1,91 +0,0 @@
repos:
## GENERAL
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
- id: check-merge-conflict
- id: check-yaml
args: ["--unsafe"]
- id: check-json
- id: end-of-file-fixer
- id: trailing-whitespace
- id: no-commit-to-branch
- id: pretty-format-json
args: ["--autofix", "--no-sort-keys", "--no-ensure-ascii"]
exclude: 'src/backend/api/fixtures/dev/.*\.json$'
## TOML
- repo: https://github.com/macisamuele/language-formatters-pre-commit-hooks
rev: v2.13.0
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
rev: v0.10.0
hooks:
- id: shellcheck
exclude: contrib
## PYTHON
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.5.0
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format
- repo: https://github.com/python-poetry/poetry
rev: 1.8.0
hooks:
- id: poetry-check
args: ["--directory=src"]
- id: poetry-lock
args: ["--no-update", "--directory=src"]
- repo: https://github.com/hadolint/hadolint
rev: v2.13.0-beta
hooks:
- id: hadolint
args: ["--ignore=DL3013", "Dockerfile"]
- repo: local
hooks:
- id: pylint
name: pylint
entry: bash -c 'poetry run pylint --disable=W,C,R,E -j 0 -rn -sn src/'
language: system
files: '.*\.py'
- id: trufflehog
name: TruffleHog
description: Detect secrets in your data.
entry: bash -c 'trufflehog --no-update git file://. --only-verified --fail'
# For running trufflehog in docker, use the following entry instead:
# entry: bash -c 'docker run -v "$(pwd):/workdir" -i --rm trufflesecurity/trufflehog:latest git file:///workdir --only-verified --fail'
language: system
stages: ["commit", "push"]
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'poetry run bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
language: system
files: '.*\.py'
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'poetry run safety check --ignore 70612,66963,74429'
language: system
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'poetry run vulture --exclude "contrib,.venv,tests,conftest.py" --min-confidence 100 .'
language: system
files: '.*\.py'
+97 -41
@@ -2,57 +2,114 @@
All notable changes to the **Prowler API** are documented in this file.
## [v1.9.0] (Prowler UNRELEASED)
## [v1.10.0] (Prowler UNRELEASED)
### Added
- Support GCP Service Account key. [(#7824)](https://github.com/prowler-cloud/prowler/pull/7824)
- Added new `GET /compliance-overviews` endpoints to retrieve compliance metadata and specific requirements statuses [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877).
- SSO with SAML support [(#8175)](https://github.com/prowler-cloud/prowler/pull/8175)
- `GET /resources/metadata`, `GET /resources/metadata/latest` and `GET /resources/latest` to expose resource metadata and latest scan results [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
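As a quick, hedged illustration of consuming these new read-only endpoints (the base URL, port, and bearer token below are placeholders, and the JSON:API `data` envelope is an assumption, not documented here):

```python
import requests

API = "http://localhost:8080/api/v1"  # placeholder base URL and port
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

# Hit each of the endpoints added in #8112 and report how many items came back.
for path in ("/resources/metadata", "/resources/metadata/latest", "/resources/latest"):
    resp = requests.get(f"{API}{path}", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    print(path, "->", len(resp.json().get("data", [])), "items")
```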
### Changed
- Renamed field encrypted_password to password for M365 provider [(#7784)](https://github.com/prowler-cloud/prowler/pull/7784)
- Reworked `GET /compliance-overviews` to return proper requirement metrics [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877).
- `/processors` endpoints to post-process findings. Currently, only the Mutelist processor is supported, allowing findings to be muted.
- Optimized the underlying queries for resources endpoints [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
- Optimized include parameters for resources view [(#8229)](https://github.com/prowler-cloud/prowler/pull/8229)
### Fixed
- Fixed the connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
- Search filter for findings and resources [(#8112)](https://github.com/prowler-cloud/prowler/pull/8112)
### Security
- Enhanced password validation to enforce 12+ character passwords with special characters, uppercase, lowercase, and numbers [(#8225)](https://github.com/prowler-cloud/prowler/pull/8225)
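A minimal sketch of what the enforced policy amounts to — illustrative only, not the validator actually shipped in #8225:

```python
import re

# Illustrative rules matching the documented policy: 12+ characters with
# lowercase, uppercase, digits, and special characters.
PASSWORD_RULES = [
    (r".{12,}", "at least 12 characters"),
    (r"[a-z]", "a lowercase letter"),
    (r"[A-Z]", "an uppercase letter"),
    (r"\d", "a digit"),
    (r"[^a-zA-Z0-9]", "a special character"),
]

def unmet_requirements(password: str) -> list[str]:
    """Return the unmet rules; an empty list means the password passes."""
    return [msg for pattern, msg in PASSWORD_RULES if not re.search(pattern, password)]

assert unmet_requirements("thisisapassword123")       # fails: no uppercase/special
assert not unmet_requirements("Thisisapassword123@")  # passes all rules
```

This is also why the development fixture credentials further down this diff change from `thisisapassword123` to `Thisisapassword123@`.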
---
## [v1.9.1] (Prowler v5.8.1)
### Added
- Custom exception for provider connection errors during scans [(#8234)](https://github.com/prowler-cloud/prowler/pull/8234)
### Changed
- Summary and overview tasks now use a dedicated queue and no longer propagate errors to compliance tasks [(#8214)](https://github.com/prowler-cloud/prowler/pull/8214)
### Fixed
- Scan with no resources will not trigger legacy code for findings metadata [(#8183)](https://github.com/prowler-cloud/prowler/pull/8183)
- Invitation email comparison case-insensitive [(#8206)](https://github.com/prowler-cloud/prowler/pull/8206)
### Removed
- Validation of the provider's secret type during updates [(#8197)](https://github.com/prowler-cloud/prowler/pull/8197)
---
## [v1.9.0] (Prowler v5.8.0)
### Added
- Support GCP Service Account key [(#7824)](https://github.com/prowler-cloud/prowler/pull/7824)
- `GET /compliance-overviews` endpoints to retrieve compliance metadata and specific requirements statuses [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Lighthouse configuration support [(#7848)](https://github.com/prowler-cloud/prowler/pull/7848)
### Changed
- Reworked `GET /compliance-overviews` to return proper requirement metrics [(#7877)](https://github.com/prowler-cloud/prowler/pull/7877)
- Optional `user` and `password` for M365 provider [(#7992)](https://github.com/prowler-cloud/prowler/pull/7992)
### Fixed
- Scheduled scans are no longer deleted when their daily schedule run is disabled [(#8082)](https://github.com/prowler-cloud/prowler/pull/8082)
---
## [v1.8.5] (Prowler v5.7.5)
### Fixed
- Normalize provider UID to ensure safe and unique export directory paths [(#8007)](https://github.com/prowler-cloud/prowler/pull/8007).
- Blank resource types in `/metadata` endpoints [(#8027)](https://github.com/prowler-cloud/prowler/pull/8027)
---
## [v1.8.4] (Prowler v5.7.4)
### Removed
- Reverted RLS transaction handling and DB custom backend [(#7994)](https://github.com/prowler-cloud/prowler/pull/7994)
---
## [v1.8.3] (Prowler v5.7.3)
### Added
- Database backend to handle already closed connections [(#7935)](https://github.com/prowler-cloud/prowler/pull/7935).
- Database backend to handle already closed connections [(#7935)](https://github.com/prowler-cloud/prowler/pull/7935)
### Changed
- Renamed field encrypted_password to password for M365 provider [(#7784)](https://github.com/prowler-cloud/prowler/pull/7784)
### Fixed
- Fixed transaction persistence with RLS operations [(#7916)](https://github.com/prowler-cloud/prowler/pull/7916).
- Reverted the change `get_with_retry` to use the original `get` method for retrieving tasks [(#7932)](https://github.com/prowler-cloud/prowler/pull/7932).
- Transaction persistence with RLS operations [(#7916)](https://github.com/prowler-cloud/prowler/pull/7916)
- Reverted the change `get_with_retry` to use the original `get` method for retrieving tasks [(#7932)](https://github.com/prowler-cloud/prowler/pull/7932)
---
## [v1.8.2] (Prowler v5.7.2)
### Fixed
- Fixed task lookup to use task_kwargs instead of task_args for scan report resolution. [(#7830)](https://github.com/prowler-cloud/prowler/pull/7830)
- Fixed Kubernetes UID validation to allow valid context names [(#7871)](https://github.com/prowler-cloud/prowler/pull/7871)
- Fixed a race condition when creating background tasks [(#7876)](https://github.com/prowler-cloud/prowler/pull/7876).
- Fixed an error when modifying or retrieving tenants due to missing user UUID in transaction context [(#7890)](https://github.com/prowler-cloud/prowler/pull/7890).
- Task lookup to use task_kwargs instead of task_args for scan report resolution [(#7830)](https://github.com/prowler-cloud/prowler/pull/7830)
- Kubernetes UID validation to allow valid context names [(#7871)](https://github.com/prowler-cloud/prowler/pull/7871)
- Connection status verification before launching a scan [(#7831)](https://github.com/prowler-cloud/prowler/pull/7831)
- Race condition when creating background tasks [(#7876)](https://github.com/prowler-cloud/prowler/pull/7876)
- Error when modifying or retrieving tenants due to missing user UUID in transaction context [(#7890)](https://github.com/prowler-cloud/prowler/pull/7890)
---
## [v1.8.1] (Prowler v5.7.1)
### Fixed
- Added database index to improve performance on finding lookup [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800).
- Added database index to improve performance on finding lookup [(#7800)](https://github.com/prowler-cloud/prowler/pull/7800)
---
## [v1.8.0] (Prowler v5.7.0)
### Added
- Added huge improvements to `/findings/metadata` and resource related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690).
- Added new endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743).
- Added export support for Prowler ThreatScore in M365 [(7783)](https://github.com/prowler-cloud/prowler/pull/7783)
- Huge improvements to `/findings/metadata` and resource-related filters for findings [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Improvements to `/overviews` endpoints [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- Queue to perform backfill background tasks [(#7690)](https://github.com/prowler-cloud/prowler/pull/7690)
- New endpoints to retrieve latest findings and metadata [(#7743)](https://github.com/prowler-cloud/prowler/pull/7743)
- Export support for Prowler ThreatScore in M365 [(#7783)](https://github.com/prowler-cloud/prowler/pull/7783)
---
@@ -60,9 +117,9 @@ All notable changes to the **Prowler API** are documented in this file.
### Added
- Added M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563).
- Added a `compliance/` folder and ZIPexport functionality for all compliance reports.[(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
- Added a new API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653).
- M365 as a new provider [(#7563)](https://github.com/prowler-cloud/prowler/pull/7563)
- `compliance/` folder and ZIP export functionality for all compliance reports [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653)
- API endpoint to fetch and download any specific compliance file by name [(#7653)](https://github.com/prowler-cloud/prowler/pull/7653)
---
@@ -70,43 +127,42 @@ All notable changes to the **Prowler API** are documented in this file.
### Added
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167).
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289).
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333).
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378).
- Added missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318).
- Support for developing new integrations [(#7167)](https://github.com/prowler-cloud/prowler/pull/7167)
- HTTP Security Headers [(#7289)](https://github.com/prowler-cloud/prowler/pull/7289)
- New endpoint to get the compliance overviews metadata [(#7333)](https://github.com/prowler-cloud/prowler/pull/7333)
- Support for muted findings [(#7378)](https://github.com/prowler-cloud/prowler/pull/7378)
- Missing fields to API findings and resources [(#7318)](https://github.com/prowler-cloud/prowler/pull/7318)
---
## [v1.5.4] (Prowler v5.4.4)
### Fixed
- Fixed a bug with periodic tasks when trying to delete a provider ([#7466])(https://github.com/prowler-cloud/prowler/pull/7466).
- Bug with periodic tasks when trying to delete a provider [(#7466)](https://github.com/prowler-cloud/prowler/pull/7466)
---
## [v1.5.3] (Prowler v5.4.3)
### Fixed
- Added duplicated scheduled scans handling ([#7401])(https://github.com/prowler-cloud/prowler/pull/7401).
- Added environment variable to configure the deletion task batch size ([#7423])(https://github.com/prowler-cloud/prowler/pull/7423).
- Duplicated scheduled scans handling [(#7401)](https://github.com/prowler-cloud/prowler/pull/7401)
- Environment variable to configure the deletion task batch size [(#7423)](https://github.com/prowler-cloud/prowler/pull/7423)
---
## [v1.5.2] (Prowler v5.4.2)
### Changed
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349).
- Refactored deletion logic and implemented retry mechanism for deletion tasks [(#7349)](https://github.com/prowler-cloud/prowler/pull/7349)
---
## [v1.5.1] (Prowler v5.4.1)
### Fixed
- Added a handled response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183).
- Fixed a race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172).
- Handled exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283).
- Handle response in case local files are missing [(#7183)](https://github.com/prowler-cloud/prowler/pull/7183)
- Race condition when deleting export files after the S3 upload [(#7172)](https://github.com/prowler-cloud/prowler/pull/7172)
- Handle exception when a provider has no secret in test connection [(#7283)](https://github.com/prowler-cloud/prowler/pull/7283)
---
@@ -114,20 +170,20 @@ All notable changes to the **Prowler API** are documented in this file.
### Added
- Social login integration with Google and GitHub [(#6906)](https://github.com/prowler-cloud/prowler/pull/6906)
- Add API scan report system, now all scans launched from the API will generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878).
- API scan report system; all scans launched from the API now generate a compressed file with the report in OCSF, CSV and HTML formats [(#6878)](https://github.com/prowler-cloud/prowler/pull/6878)
- Configurable Sentry integration [(#6874)](https://github.com/prowler-cloud/prowler/pull/6874)
### Changed
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019).
- Optimized `GET /findings` endpoint to improve response time and size [(#7019)](https://github.com/prowler-cloud/prowler/pull/7019)
---
## [v1.4.0] (Prowler v5.3.0)
### Changed
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700).
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800).
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863).
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869).
- Daily scheduled scan instances are now created beforehand with `SCHEDULED` state [(#6700)](https://github.com/prowler-cloud/prowler/pull/6700)
- Findings endpoints now require at least one date filter [(#6800)](https://github.com/prowler-cloud/prowler/pull/6800)
- Findings metadata endpoint received a performance improvement [(#6863)](https://github.com/prowler-cloud/prowler/pull/6863)
- Increased the allowed length of the provider UID for Kubernetes providers [(#6869)](https://github.com/prowler-cloud/prowler/pull/6869)
---
+16 -5
@@ -6,7 +6,19 @@ ARG POWERSHELL_VERSION=7.5.0
ENV POWERSHELL_VERSION=${POWERSHELL_VERSION}
# hadolint ignore=DL3008
RUN apt-get update && apt-get install -y --no-install-recommends wget libicu72 \
RUN apt-get update && apt-get install -y --no-install-recommends \
wget \
libicu72 \
gcc \
g++ \
make \
libxml2-dev \
libxmlsec1-dev \
libxmlsec1-openssl \
pkg-config \
libtool \
libxslt1-dev \
python3-dev \
&& rm -rf /var/lib/apt/lists/*
# Install PowerShell
@@ -37,18 +49,17 @@ COPY pyproject.toml ./
RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir poetry
COPY src/backend/ ./backend/
ENV PATH="/home/prowler/.local/bin:$PATH"
# Add `--no-root` to avoid installing the current project as a package
RUN poetry install --no-root && \
rm -rf ~/.cache/pip
COPY docker-entrypoint.sh ./docker-entrypoint.sh
RUN poetry run python "$(poetry env info --path)/src/prowler/prowler/providers/m365/lib/powershell/m365_powershell.py"
COPY src/backend/ ./backend/
COPY docker-entrypoint.sh ./docker-entrypoint.sh
WORKDIR /home/prowler/backend
# Development image
+1 -1
@@ -257,7 +257,7 @@ cd src/backend
python manage.py loaddata api/fixtures/0_dev_users.json --database admin
```
> The default credentials are `dev@prowler.com:thisisapassword123` or `dev2@prowler.com:thisisapassword123`
> The default credentials are `dev@prowler.com:Thisisapassword123@` or `dev2@prowler.com:Thisisapassword123@`
## Run tests
-125
@@ -1,125 +0,0 @@
services:
api:
build:
dockerfile: Dockerfile
image: prowler-api
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8000}:${DJANGO_PORT:-8000}"
profiles:
- prod
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "prod"
api-dev:
build:
dockerfile: Dockerfile
target: dev
image: prowler-api-dev
environment:
- DJANGO_SETTINGS_MODULE=config.django.devel
- DJANGO_LOGGING_FORMATTER=human_readable
env_file:
- path: ./.env
required: false
ports:
- "${DJANGO_PORT:-8080}:${DJANGO_PORT:-8080}"
volumes:
- "./src/backend:/home/prowler/backend"
- "./pyproject.toml:/home/prowler/pyproject.toml"
profiles:
- dev
depends_on:
postgres:
condition: service_healthy
valkey:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "dev"
postgres:
image: postgres:16.3-alpine
ports:
- "${POSTGRES_PORT:-5432}:${POSTGRES_PORT:-5432}"
hostname: "postgres-db"
volumes:
- ./_data/postgres:/var/lib/postgresql/data
environment:
- POSTGRES_USER=${POSTGRES_ADMIN_USER:-prowler}
- POSTGRES_PASSWORD=${POSTGRES_ADMIN_PASSWORD:-S3cret}
- POSTGRES_DB=${POSTGRES_DB:-prowler_db}
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'pg_isready -U ${POSTGRES_ADMIN_USER:-prowler} -d ${POSTGRES_DB:-prowler_db}'"]
interval: 5s
timeout: 5s
retries: 5
valkey:
image: valkey/valkey:7-alpine3.19
ports:
- "${VALKEY_PORT:-6379}:6379"
hostname: "valkey"
volumes:
- ./_data/valkey:/data
env_file:
- path: ./.env
required: false
healthcheck:
test: ["CMD-SHELL", "sh -c 'valkey-cli ping'"]
interval: 10s
timeout: 5s
retries: 3
worker:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "worker"
worker-beat:
build:
dockerfile: Dockerfile
image: prowler-worker
environment:
- DJANGO_SETTINGS_MODULE=${DJANGO_SETTINGS_MODULE:-config.django.production}
env_file:
- path: ./.env
required: false
profiles:
- dev
- prod
depends_on:
valkey:
condition: service_healthy
postgres:
condition: service_healthy
entrypoint:
- "../docker-entrypoint.sh"
- "beat"
+5 -1
@@ -3,6 +3,10 @@
apply_migrations() {
echo "Applying database migrations..."
# Fix inconsistent migration history after adding the sites app
poetry run python manage.py check_and_fix_socialaccount_sites_migration --database admin
poetry run python manage.py migrate --database admin
}
@@ -28,7 +32,7 @@ start_prod_server() {
start_worker() {
echo "Starting the worker..."
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill -E --max-tasks-per-child 1
poetry run python -m celery -A config.celery worker -l "${DJANGO_LOGGING_LEVEL:-info}" -Q celery,scans,scan-reports,deletion,backfill,overview -E --max-tasks-per-child 1
}
start_worker_beat() {
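The extra `overview` entry in the worker's `-Q` list above only makes the worker consume that queue; the summary and overview tasks must also be routed to it. A hedged Celery sketch of that wiring — the task name is illustrative, not Prowler's actual one:

```python
from celery import Celery

app = Celery("config")

# Send overview/summary work to its own queue so a backlog or failure there
# no longer spills into the compliance tasks (see the v1.9.1 entry above).
app.conf.task_routes = {
    "tasks.aggregate_scan_overview": {"queue": "overview"},  # illustrative task name
}
```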
+448 -26
@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -880,7 +880,6 @@ description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
@@ -950,6 +949,7 @@ files = [
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[package.dependencies]
pycparser = "*"
@@ -1448,6 +1448,18 @@ files = [
graph = ["objgraph (>=1.7.2)"]
profile = ["gprof2dot (>=2022.7.29)"]
[[package]]
name = "distro"
version = "1.9.0"
description = "Distro - an OS platform information API"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2"},
{file = "distro-1.9.0.tar.gz", hash = "sha256:2fa77c6fd8940f116ee1d6b94a2f90b13b5ea8d019b98bc8bafdcabcdd9bdbed"},
]
[[package]]
name = "dj-rest-auth"
version = "7.0.1"
@@ -1469,14 +1481,14 @@ with-social = ["django-allauth[socialaccount] (>=64.0.0)"]
[[package]]
name = "django"
version = "5.1.8"
version = "5.1.10"
description = "A high-level Python web framework that encourages rapid development and clean, pragmatic design."
optional = false
python-versions = ">=3.10"
groups = ["main", "dev"]
files = [
{file = "Django-5.1.8-py3-none-any.whl", hash = "sha256:11b28fa4b00e59d0def004e9ee012fefbb1065a5beb39ee838983fd24493ad4f"},
{file = "Django-5.1.8.tar.gz", hash = "sha256:42e92a1dd2810072bcc40a39a212b693f94406d0ba0749e68eb642f31dc770b4"},
{file = "django-5.1.10-py3-none-any.whl", hash = "sha256:19c9b771e9cf4de91101861aadd2daaa159bcf10698ca909c5755c88e70ccb84"},
{file = "django-5.1.10.tar.gz", hash = "sha256:73e5d191421d177803dbd5495d94bc7d06d156df9561f4eea9e11b4994c07137"},
]
[package.dependencies]
@@ -1490,19 +1502,20 @@ bcrypt = ["bcrypt"]
[[package]]
name = "django-allauth"
version = "65.4.1"
version = "65.8.0"
description = "Integrated set of Django applications addressing authentication, registration, account management as well as 3rd party (social) account authentication."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "django_allauth-65.4.1.tar.gz", hash = "sha256:60b32aef7dbbcc213319aa4fd8f570e985266ea1162ae6ef7a26a24efca85c8c"},
{file = "django_allauth-65.8.0.tar.gz", hash = "sha256:9da589d99d412740629333a01865a90c95c97e0fae0cde789aa45a8fda90e83b"},
]
[package.dependencies]
asgiref = ">=3.8.1"
Django = ">=4.2.16"
pyjwt = {version = ">=1.7", extras = ["crypto"], optional = true, markers = "extra == \"socialaccount\""}
python3-saml = {version = ">=1.15.0,<2.0.0", optional = true, markers = "extra == \"saml\""}
requests = {version = ">=2.0.0", optional = true, markers = "extra == \"socialaccount\""}
requests-oauthlib = {version = ">=0.3.0", optional = true, markers = "extra == \"socialaccount\""}
@@ -2470,6 +2483,93 @@ MarkupSafe = ">=2.0"
[package.extras]
i18n = ["Babel (>=2.7)"]
[[package]]
name = "jiter"
version = "0.10.0"
description = "Fast iterable JSON parser."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "jiter-0.10.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:cd2fb72b02478f06a900a5782de2ef47e0396b3e1f7d5aba30daeb1fce66f303"},
{file = "jiter-0.10.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:32bb468e3af278f095d3fa5b90314728a6916d89ba3d0ffb726dd9bf7367285e"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa8b3e0068c26ddedc7abc6fac37da2d0af16b921e288a5a613f4b86f050354f"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:286299b74cc49e25cd42eea19b72aa82c515d2f2ee12d11392c56d8701f52224"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6ed5649ceeaeffc28d87fb012d25a4cd356dcd53eff5acff1f0466b831dda2a7"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2ab0051160cb758a70716448908ef14ad476c3774bd03ddce075f3c1f90a3d6"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03997d2f37f6b67d2f5c475da4412be584e1cec273c1cfc03d642c46db43f8cf"},
{file = "jiter-0.10.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c404a99352d839fed80d6afd6c1d66071f3bacaaa5c4268983fc10f769112e90"},
{file = "jiter-0.10.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:66e989410b6666d3ddb27a74c7e50d0829704ede652fd4c858e91f8d64b403d0"},
{file = "jiter-0.10.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b532d3af9ef4f6374609a3bcb5e05a1951d3bf6190dc6b176fdb277c9bbf15ee"},
{file = "jiter-0.10.0-cp310-cp310-win32.whl", hash = "sha256:da9be20b333970e28b72edc4dff63d4fec3398e05770fb3205f7fb460eb48dd4"},
{file = "jiter-0.10.0-cp310-cp310-win_amd64.whl", hash = "sha256:f59e533afed0c5b0ac3eba20d2548c4a550336d8282ee69eb07b37ea526ee4e5"},
{file = "jiter-0.10.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3bebe0c558e19902c96e99217e0b8e8b17d570906e72ed8a87170bc290b1e978"},
{file = "jiter-0.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:558cc7e44fd8e507a236bee6a02fa17199ba752874400a0ca6cd6e2196cdb7dc"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d613e4b379a07d7c8453c5712ce7014e86c6ac93d990a0b8e7377e18505e98d"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f62cf8ba0618eda841b9bf61797f21c5ebd15a7a1e19daab76e4e4b498d515b2"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:919d139cdfa8ae8945112398511cb7fca58a77382617d279556b344867a37e61"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13ddbc6ae311175a3b03bd8994881bc4635c923754932918e18da841632349db"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c440ea003ad10927a30521a9062ce10b5479592e8a70da27f21eeb457b4a9c5"},
{file = "jiter-0.10.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:dc347c87944983481e138dea467c0551080c86b9d21de6ea9306efb12ca8f606"},
{file = "jiter-0.10.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:13252b58c1f4d8c5b63ab103c03d909e8e1e7842d302473f482915d95fefd605"},
{file = "jiter-0.10.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d1bbf3c465de4a24ab12fb7766a0003f6f9bce48b8b6a886158c4d569452dc5"},
{file = "jiter-0.10.0-cp311-cp311-win32.whl", hash = "sha256:db16e4848b7e826edca4ccdd5b145939758dadf0dc06e7007ad0e9cfb5928ae7"},
{file = "jiter-0.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:9c9c1d5f10e18909e993f9641f12fe1c77b3e9b533ee94ffa970acc14ded3812"},
{file = "jiter-0.10.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:1e274728e4a5345a6dde2d343c8da018b9d4bd4350f5a472fa91f66fda44911b"},
{file = "jiter-0.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7202ae396446c988cb2a5feb33a543ab2165b786ac97f53b59aafb803fef0744"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23ba7722d6748b6920ed02a8f1726fb4b33e0fd2f3f621816a8b486c66410ab2"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:371eab43c0a288537d30e1f0b193bc4eca90439fc08a022dd83e5e07500ed026"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c675736059020365cebc845a820214765162728b51ab1e03a1b7b3abb70f74c"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0c5867d40ab716e4684858e4887489685968a47e3ba222e44cde6e4a2154f959"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:395bb9a26111b60141757d874d27fdea01b17e8fac958b91c20128ba8f4acc8a"},
{file = "jiter-0.10.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6842184aed5cdb07e0c7e20e5bdcfafe33515ee1741a6835353bb45fe5d1bd95"},
{file = "jiter-0.10.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:62755d1bcea9876770d4df713d82606c8c1a3dca88ff39046b85a048566d56ea"},
{file = "jiter-0.10.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:533efbce2cacec78d5ba73a41756beff8431dfa1694b6346ce7af3a12c42202b"},
{file = "jiter-0.10.0-cp312-cp312-win32.whl", hash = "sha256:8be921f0cadd245e981b964dfbcd6fd4bc4e254cdc069490416dd7a2632ecc01"},
{file = "jiter-0.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:a7c7d785ae9dda68c2678532a5a1581347e9c15362ae9f6e68f3fdbfb64f2e49"},
{file = "jiter-0.10.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e0588107ec8e11b6f5ef0e0d656fb2803ac6cf94a96b2b9fc675c0e3ab5e8644"},
{file = "jiter-0.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cafc4628b616dc32530c20ee53d71589816cf385dd9449633e910d596b1f5c8a"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:520ef6d981172693786a49ff5b09eda72a42e539f14788124a07530f785c3ad6"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:554dedfd05937f8fc45d17ebdf298fe7e0c77458232bcb73d9fbbf4c6455f5b3"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5bc299da7789deacf95f64052d97f75c16d4fc8c4c214a22bf8d859a4288a1c2"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5161e201172de298a8a1baad95eb85db4fb90e902353b1f6a41d64ea64644e25"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e2227db6ba93cb3e2bf67c87e594adde0609f146344e8207e8730364db27041"},
{file = "jiter-0.10.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:15acb267ea5e2c64515574b06a8bf393fbfee6a50eb1673614aa45f4613c0cca"},
{file = "jiter-0.10.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:901b92f2e2947dc6dfcb52fd624453862e16665ea909a08398dde19c0731b7f4"},
{file = "jiter-0.10.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d0cb9a125d5a3ec971a094a845eadde2db0de85b33c9f13eb94a0c63d463879e"},
{file = "jiter-0.10.0-cp313-cp313-win32.whl", hash = "sha256:48a403277ad1ee208fb930bdf91745e4d2d6e47253eedc96e2559d1e6527006d"},
{file = "jiter-0.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:75f9eb72ecb640619c29bf714e78c9c46c9c4eaafd644bf78577ede459f330d4"},
{file = "jiter-0.10.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:28ed2a4c05a1f32ef0e1d24c2611330219fed727dae01789f4a335617634b1ca"},
{file = "jiter-0.10.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:14a4c418b1ec86a195f1ca69da8b23e8926c752b685af665ce30777233dfe070"},
{file = "jiter-0.10.0-cp313-cp313t-win_amd64.whl", hash = "sha256:d7bfed2fe1fe0e4dda6ef682cee888ba444b21e7a6553e03252e4feb6cf0adca"},
{file = "jiter-0.10.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:5e9251a5e83fab8d87799d3e1a46cb4b7f2919b895c6f4483629ed2446f66522"},
{file = "jiter-0.10.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:023aa0204126fe5b87ccbcd75c8a0d0261b9abdbbf46d55e7ae9f8e22424eeb8"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c189c4f1779c05f75fc17c0c1267594ed918996a231593a21a5ca5438445216"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15720084d90d1098ca0229352607cd68256c76991f6b374af96f36920eae13c4"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e4f2fb68e5f1cfee30e2b2a09549a00683e0fde4c6a2ab88c94072fc33cb7426"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ce541693355fc6da424c08b7edf39a2895f58d6ea17d92cc2b168d20907dee12"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31c50c40272e189d50006ad5c73883caabb73d4e9748a688b216e85a9a9ca3b9"},
{file = "jiter-0.10.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fa3402a2ff9815960e0372a47b75c76979d74402448509ccd49a275fa983ef8a"},
{file = "jiter-0.10.0-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:1956f934dca32d7bb647ea21d06d93ca40868b505c228556d3373cbd255ce853"},
{file = "jiter-0.10.0-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:fcedb049bdfc555e261d6f65a6abe1d5ad68825b7202ccb9692636c70fcced86"},
{file = "jiter-0.10.0-cp314-cp314-win32.whl", hash = "sha256:ac509f7eccca54b2a29daeb516fb95b6f0bd0d0d8084efaf8ed5dfc7b9f0b357"},
{file = "jiter-0.10.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5ed975b83a2b8639356151cef5c0d597c68376fc4922b45d0eb384ac058cfa00"},
{file = "jiter-0.10.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3aa96f2abba33dc77f79b4cf791840230375f9534e5fac927ccceb58c5e604a5"},
{file = "jiter-0.10.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:bd6292a43c0fc09ce7c154ec0fa646a536b877d1e8f2f96c19707f65355b5a4d"},
{file = "jiter-0.10.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:39de429dcaeb6808d75ffe9effefe96a4903c6a4b376b2f6d08d77c1aaee2f18"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:52ce124f13a7a616fad3bb723f2bfb537d78239d1f7f219566dc52b6f2a9e48d"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:166f3606f11920f9a1746b2eea84fa2c0a5d50fd313c38bdea4edc072000b0af"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:28dcecbb4ba402916034fc14eba7709f250c4d24b0c43fc94d187ee0580af181"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86c5aa6910f9bebcc7bc4f8bc461aff68504388b43bfe5e5c0bd21efa33b52f4"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ceeb52d242b315d7f1f74b441b6a167f78cea801ad7c11c36da77ff2d42e8a28"},
{file = "jiter-0.10.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ff76d8887c8c8ee1e772274fcf8cc1071c2c58590d13e33bd12d02dc9a560397"},
{file = "jiter-0.10.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a9be4d0fa2b79f7222a88aa488bd89e2ae0a0a5b189462a12def6ece2faa45f1"},
{file = "jiter-0.10.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9ab7fd8738094139b6c1ab1822d6f2000ebe41515c537235fd45dabe13ec9324"},
{file = "jiter-0.10.0-cp39-cp39-win32.whl", hash = "sha256:5f51e048540dd27f204ff4a87f5d79294ea0aa3aa552aca34934588cf27023cf"},
{file = "jiter-0.10.0-cp39-cp39-win_amd64.whl", hash = "sha256:1b28302349dc65703a9e4ead16f163b1c339efffbe1049c30a44b001a2a4fff9"},
{file = "jiter-0.10.0.tar.gz", hash = "sha256:07a7142c38aacc85194391108dc91b5b57093c978a9932bd86a36862759d9500"},
]
[[package]]
name = "jmespath"
version = "1.0.1"
@@ -2582,6 +2682,161 @@ websocket-client = ">=0.32.0,<0.40.0 || >0.40.0,<0.41.dev0 || >=0.43.dev0"
[package.extras]
adal = ["adal (>=1.0.2)"]
[[package]]
name = "lxml"
version = "5.3.2"
description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API."
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "lxml-5.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:c4b84d6b580a9625dfa47269bf1fd7fbba7ad69e08b16366a46acb005959c395"},
{file = "lxml-5.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b4c08ecb26e4270a62f81f81899dfff91623d349e433b126931c9c4577169666"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef926e9f11e307b5a7c97b17c5c609a93fb59ffa8337afac8f89e6fe54eb0b37"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:017ceeabe739100379fe6ed38b033cd244ce2da4e7f6f07903421f57da3a19a2"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dae97d9435dc90590f119d056d233c33006b2fd235dd990d5564992261ee7ae8"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:910f39425c6798ce63c93976ae5af5fff6949e2cb446acbd44d6d892103eaea8"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9780de781a0d62a7c3680d07963db3048b919fc9e3726d9cfd97296a65ffce1"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:1a06b0c6ba2e3ca45a009a78a4eb4d6b63831830c0a83dcdc495c13b9ca97d3e"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_28_ppc64le.whl", hash = "sha256:4c62d0a34d1110769a1bbaf77871a4b711a6f59c4846064ccb78bc9735978644"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_28_s390x.whl", hash = "sha256:8f961a4e82f411b14538fe5efc3e6b953e17f5e809c463f0756a0d0e8039b700"},
{file = "lxml-5.3.2-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:3dfc78f5f9251b6b8ad37c47d4d0bfe63ceb073a916e5b50a3bf5fd67a703335"},
{file = "lxml-5.3.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:10e690bc03214d3537270c88e492b8612d5e41b884f232df2b069b25b09e6711"},
{file = "lxml-5.3.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:aa837e6ee9534de8d63bc4c1249e83882a7ac22bd24523f83fad68e6ffdf41ae"},
{file = "lxml-5.3.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:da4c9223319400b97a2acdfb10926b807e51b69eb7eb80aad4942c0516934858"},
{file = "lxml-5.3.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:dc0e9bdb3aa4d1de703a437576007d366b54f52c9897cae1a3716bb44fc1fc85"},
{file = "lxml-5.3.2-cp310-cp310-win32.win32.whl", hash = "sha256:dd755a0a78dd0b2c43f972e7b51a43be518ebc130c9f1a7c4480cf08b4385486"},
{file = "lxml-5.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:d64ea1686474074b38da13ae218d9fde0d1dc6525266976808f41ac98d9d7980"},
{file = "lxml-5.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9d61a7d0d208ace43986a92b111e035881c4ed45b1f5b7a270070acae8b0bfb4"},
{file = "lxml-5.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:856dfd7eda0b75c29ac80a31a6411ca12209183e866c33faf46e77ace3ce8a79"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7a01679e4aad0727bedd4c9407d4d65978e920f0200107ceeffd4b019bd48529"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b6b37b4c3acb8472d191816d4582379f64d81cecbdce1a668601745c963ca5cc"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3df5a54e7b7c31755383f126d3a84e12a4e0333db4679462ef1165d702517477"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c09a40f28dcded933dc16217d6a092be0cc49ae25811d3b8e937c8060647c353"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a1ef20f1851ccfbe6c5a04c67ec1ce49da16ba993fdbabdce87a92926e505412"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:f79a63289dbaba964eb29ed3c103b7911f2dce28c36fe87c36a114e6bd21d7ad"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_28_ppc64le.whl", hash = "sha256:75a72697d95f27ae00e75086aed629f117e816387b74a2f2da6ef382b460b710"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_28_s390x.whl", hash = "sha256:b9b00c9ee1cc3a76f1f16e94a23c344e0b6e5c10bec7f94cf2d820ce303b8c01"},
{file = "lxml-5.3.2-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:77cbcab50cbe8c857c6ba5f37f9a3976499c60eada1bf6d38f88311373d7b4bc"},
{file = "lxml-5.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:29424058f072a24622a0a15357bca63d796954758248a72da6d512f9bd9a4493"},
{file = "lxml-5.3.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:7d82737a8afe69a7c80ef31d7626075cc7d6e2267f16bf68af2c764b45ed68ab"},
{file = "lxml-5.3.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:95473d1d50a5d9fcdb9321fdc0ca6e1edc164dce4c7da13616247d27f3d21e31"},
{file = "lxml-5.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:2162068f6da83613f8b2a32ca105e37a564afd0d7009b0b25834d47693ce3538"},
{file = "lxml-5.3.2-cp311-cp311-win32.whl", hash = "sha256:f8695752cf5d639b4e981afe6c99e060621362c416058effd5c704bede9cb5d1"},
{file = "lxml-5.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:d1a94cbb4ee64af3ab386c2d63d6d9e9cf2e256ac0fd30f33ef0a3c88f575174"},
{file = "lxml-5.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:16b3897691ec0316a1aa3c6585f61c8b7978475587c5b16fc1d2c28d283dc1b0"},
{file = "lxml-5.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:a8d4b34a0eeaf6e73169dcfd653c8d47f25f09d806c010daf074fba2db5e2d3f"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9cd7a959396da425022e1e4214895b5cfe7de7035a043bcc2d11303792b67554"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cac5eaeec3549c5df7f8f97a5a6db6963b91639389cdd735d5a806370847732b"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29b5f7d77334877c2146e7bb8b94e4df980325fab0a8af4d524e5d43cd6f789d"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:13f3495cfec24e3d63fffd342cc8141355d1d26ee766ad388775f5c8c5ec3932"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e70ad4c9658beeff99856926fd3ee5fde8b519b92c693f856007177c36eb2e30"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:507085365783abd7879fa0a6fa55eddf4bdd06591b17a2418403bb3aff8a267d"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_28_ppc64le.whl", hash = "sha256:5bb304f67cbf5dfa07edad904732782cbf693286b9cd85af27059c5779131050"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_28_s390x.whl", hash = "sha256:3d84f5c093645c21c29a4e972b84cb7cf682f707f8706484a5a0c7ff13d7a988"},
{file = "lxml-5.3.2-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:bdc13911db524bd63f37b0103af014b7161427ada41f1b0b3c9b5b5a9c1ca927"},
{file = "lxml-5.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1ec944539543f66ebc060ae180d47e86aca0188bda9cbfadff47d86b0dc057dc"},
{file = "lxml-5.3.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:59d437cc8a7f838282df5a199cf26f97ef08f1c0fbec6e84bd6f5cc2b7913f6e"},
{file = "lxml-5.3.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:0e275961adbd32e15672e14e0cc976a982075208224ce06d149c92cb43db5b93"},
{file = "lxml-5.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:038aeb6937aa404480c2966b7f26f1440a14005cb0702078c173c028eca72c31"},
{file = "lxml-5.3.2-cp312-cp312-win32.whl", hash = "sha256:3c2c8d0fa3277147bff180e3590be67597e17d365ce94beb2efa3138a2131f71"},
{file = "lxml-5.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:77809fcd97dfda3f399102db1794f7280737b69830cd5c961ac87b3c5c05662d"},
{file = "lxml-5.3.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:77626571fb5270ceb36134765f25b665b896243529eefe840974269b083e090d"},
{file = "lxml-5.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:78a533375dc7aa16d0da44af3cf6e96035e484c8c6b2b2445541a5d4d3d289ee"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a6f62b2404b3f3f0744bbcabb0381c5fe186fa2a9a67ecca3603480f4846c585"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2ea918da00091194526d40c30c4996971f09dacab032607581f8d8872db34fbf"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c35326f94702a7264aa0eea826a79547d3396a41ae87a70511b9f6e9667ad31c"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e3bef90af21d31c4544bc917f51e04f94ae11b43156356aff243cdd84802cbf2"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:52fa7ba11a495b7cbce51573c73f638f1dcff7b3ee23697467dc063f75352a69"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:ad131e2c4d2c3803e736bb69063382334e03648de2a6b8f56a878d700d4b557d"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_28_ppc64le.whl", hash = "sha256:00a4463ca409ceacd20490a893a7e08deec7870840eff33dc3093067b559ce3e"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_28_s390x.whl", hash = "sha256:87e8d78205331cace2b73ac8249294c24ae3cba98220687b5b8ec5971a2267f1"},
{file = "lxml-5.3.2-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:bf6389133bb255e530a4f2f553f41c4dd795b1fbb6f797aea1eff308f1e11606"},
{file = "lxml-5.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b3709fc752b42fb6b6ffa2ba0a5b9871646d97d011d8f08f4d5b3ee61c7f3b2b"},
{file = "lxml-5.3.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:abc795703d0de5d83943a4badd770fbe3d1ca16ee4ff3783d7caffc252f309ae"},
{file = "lxml-5.3.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:98050830bb6510159f65d9ad1b8aca27f07c01bb3884ba95f17319ccedc4bcf9"},
{file = "lxml-5.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6ba465a91acc419c5682f8b06bcc84a424a7aa5c91c220241c6fd31de2a72bc6"},
{file = "lxml-5.3.2-cp313-cp313-win32.whl", hash = "sha256:56a1d56d60ea1ec940f949d7a309e0bff05243f9bd337f585721605670abb1c1"},
{file = "lxml-5.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:1a580dc232c33d2ad87d02c8a3069d47abbcdce974b9c9cc82a79ff603065dbe"},
{file = "lxml-5.3.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:1a59f7fe888d0ec1916d0ad69364c5400cfa2f885ae0576d909f342e94d26bc9"},
{file = "lxml-5.3.2-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d67b50abc2df68502a26ed2ccea60c1a7054c289fb7fc31c12e5e55e4eec66bd"},
{file = "lxml-5.3.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2cb08d2cb047c98d6fbbb2e77d6edd132ad6e3fa5aa826ffa9ea0c9b1bc74a84"},
{file = "lxml-5.3.2-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:495ddb7e10911fb4d673d8aa8edd98d1eadafb3b56e8c1b5f427fd33cadc455b"},
{file = "lxml-5.3.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:884d9308ac7d581b705a3371185282e1b8eebefd68ccf288e00a2d47f077cc51"},
{file = "lxml-5.3.2-cp36-cp36m-musllinux_1_2_x86_64.whl", hash = "sha256:37f3d7cf7f2dd2520df6cc8a13df4c3e3f913c8e0a1f9a875e44f9e5f98d7fee"},
{file = "lxml-5.3.2-cp36-cp36m-win32.whl", hash = "sha256:e885a1bf98a76dff0a0648850c3083b99d9358ef91ba8fa307c681e8e0732503"},
{file = "lxml-5.3.2-cp36-cp36m-win_amd64.whl", hash = "sha256:b45f505d0d85f4cdd440cd7500689b8e95110371eaa09da0c0b1103e9a05030f"},
{file = "lxml-5.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b53cd668facd60b4f0dfcf092e01bbfefd88271b5b4e7b08eca3184dd006cb30"},
{file = "lxml-5.3.2-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e5dea998c891f082fe204dec6565dbc2f9304478f2fc97bd4d7a940fec16c873"},
{file = "lxml-5.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d46bc3e58b01e4f38d75e0d7f745a46875b7a282df145aca9d1479c65ff11561"},
{file = "lxml-5.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:661feadde89159fd5f7d7639a81ccae36eec46974c4a4d5ccce533e2488949c8"},
{file = "lxml-5.3.2-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:43af2a69af2cacc2039024da08a90174e85f3af53483e6b2e3485ced1bf37151"},
{file = "lxml-5.3.2-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:1539f962d82436f3d386eb9f29b2a29bb42b80199c74a695dff51b367a61ec0a"},
{file = "lxml-5.3.2-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:6673920bf976421b5fac4f29b937702eef4555ee42329546a5fc68bae6178a48"},
{file = "lxml-5.3.2-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:9fa722a9cd8845594593cce399a49aa6bfc13b6c83a7ee05e2ab346d9253d52f"},
{file = "lxml-5.3.2-cp37-cp37m-win32.whl", hash = "sha256:2eadd4efa487f4710755415aed3d6ae9ac8b4327ea45226ffccb239766c8c610"},
{file = "lxml-5.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:83d8707b1b08cd02c04d3056230ec3b771b18c566ec35e723e60cdf037064e08"},
{file = "lxml-5.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:bc6e8678bfa5ccba370103976ccfcf776c85c83da9220ead41ea6fd15d2277b4"},
{file = "lxml-5.3.2-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0bed509662f67f719119ad56006cd4a38efa68cfa74383060612044915e5f7ad"},
{file = "lxml-5.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e3925975fadd6fd72a6d80541a6ec75dfbad54044a03aa37282dafcb80fbdfa"},
{file = "lxml-5.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:83c0462dedc5213ac586164c6d7227da9d4d578cf45dd7fbab2ac49b63a008eb"},
{file = "lxml-5.3.2-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:53e3f9ca72858834688afa17278649d62aa768a4b2018344be00c399c4d29e95"},
{file = "lxml-5.3.2-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:32ba634ef3f1b20f781019a91d78599224dc45745dd572f951adbf1c0c9b0d75"},
{file = "lxml-5.3.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:1b16504c53f41da5fcf04868a80ac40a39d3eec5329caf761114caec6e844ad1"},
{file = "lxml-5.3.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:1f9682786138549da44ca4c49b20e7144d063b75f2b2ba611f4cff9b83db1062"},
{file = "lxml-5.3.2-cp38-cp38-win32.whl", hash = "sha256:d8f74ef8aacdf6ee5c07566a597634bb8535f6b53dc89790db43412498cf6026"},
{file = "lxml-5.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:49f1cee0fa27e1ee02589c696a9bdf4027e7427f184fa98e6bef0c6613f6f0fa"},
{file = "lxml-5.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:741c126bcf9aa939e950e64e5e0a89c8e01eda7a5f5ffdfc67073f2ed849caea"},
{file = "lxml-5.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ab6e9e6aca1fd7d725ffa132286e70dee5b9a4561c5ed291e836440b82888f89"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:58e8c9b9ed3c15c2d96943c14efc324b69be6352fe5585733a7db2bf94d97841"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7811828ddfb8c23f4f1fbf35e7a7b2edec2f2e4c793dee7c52014f28c4b35238"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:72968623efb1e12e950cbdcd1d0f28eb14c8535bf4be153f1bfffa818b1cf189"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ebfceaa2ea588b54efb6160e3520983663d45aed8a3895bb2031ada080fb5f04"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d685d458505b2bfd2e28c812749fe9194a2b0ce285a83537e4309a187ffa270b"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:334e0e414dab1f5366ead8ca34ec3148415f236d5660e175f1d640b11d645847"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_28_ppc64le.whl", hash = "sha256:02e56f7de72fa82561eae69628a7d6febd7891d72248c7ff7d3e7814d4031017"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_28_s390x.whl", hash = "sha256:638d06b4e1d34d1a074fa87deed5fb55c18485fa0dab97abc5604aad84c12031"},
{file = "lxml-5.3.2-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:354dab7206d22d7a796fa27c4c5bffddd2393da2ad61835355a4759d435beb47"},
{file = "lxml-5.3.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:d9d9f82ff2c3bf9bb777cb355149f7f3a98ec58f16b7428369dc27ea89556a4c"},
{file = "lxml-5.3.2-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:95ad58340e3b7d2b828efc370d1791856613c5cb62ae267158d96e47b3c978c9"},
{file = "lxml-5.3.2-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:30fe05f4b7f6e9eb32862745512e7cbd021070ad0f289a7f48d14a0d3fc1d8a9"},
{file = "lxml-5.3.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:34c688fef86f73dbca0798e0a61bada114677006afa524a8ce97d9e5fabf42e6"},
{file = "lxml-5.3.2-cp39-cp39-win32.whl", hash = "sha256:4d6d3d1436d57f41984920667ec5ef04bcb158f80df89ac4d0d3f775a2ac0c87"},
{file = "lxml-5.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:2996e1116bbb3ae2a1fbb2ba4da8f92742290b4011e7e5bce2bd33bbc9d9485a"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:521ab9c80b98c30b2d987001c3ede2e647e92eeb2ca02e8cb66ef5122d792b24"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6f1231b0f9810289d41df1eacc4ebb859c63e4ceee29908a0217403cddce38d0"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:271f1a4d5d2b383c36ad8b9b489da5ea9c04eca795a215bae61ed6a57cf083cd"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:6fca8a5a13906ba2677a5252752832beb0f483a22f6c86c71a2bb320fba04f61"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:ea0c3b7922209160faef194a5b6995bfe7fa05ff7dda6c423ba17646b7b9de10"},
{file = "lxml-5.3.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0a006390834603e5952a2ff74b9a31a6007c7cc74282a087aa6467afb4eea987"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:eae4136a3b8c4cf76f69461fc8f9410d55d34ea48e1185338848a888d71b9675"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d48e06be8d8c58e7feaedd8a37897a6122637efb1637d7ce00ddf5f11f9a92ad"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d4b83aed409134093d90e114007034d2c1ebcd92e501b71fd9ec70e612c8b2eb"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7a0e77edfe26d3703f954d46bed52c3ec55f58586f18f4b7f581fc56954f1d84"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:19f6fcfd15b82036b4d235749d78785eb9c991c7812012dc084e0d8853b4c1c0"},
{file = "lxml-5.3.2-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:d49919c95d31ee06eefd43d8c6f69a3cc9bdf0a9b979cc234c4071f0eb5cb173"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2d0a60841410123c533990f392819804a8448853f06daf412c0f383443925e89"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4b7f729e03090eb4e3981f10efaee35e6004b548636b1a062b8b9a525e752abc"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:579df6e20d8acce3bcbc9fb8389e6ae00c19562e929753f534ba4c29cfe0be4b"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:2abcf3f3b8367d6400b908d00d4cd279fc0b8efa287e9043820525762d383699"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:348c06cb2e3176ce98bee8c397ecc89181681afd13d85870df46167f140a305f"},
{file = "lxml-5.3.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:617ecaccd565cbf1ac82ffcaa410e7da5bd3a4b892bb3543fb2fe19bd1c4467d"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c3eb4278dcdb9d86265ed2c20b9ecac45f2d6072e3904542e591e382c87a9c00"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:258b6b53458c5cbd2a88795557ff7e0db99f73a96601b70bc039114cd4ee9e02"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0a9d8d25ed2f2183e8471c97d512a31153e123ac5807f61396158ef2793cb6e"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:73bcb635a848c18a3e422ea0ab0092f2e4ef3b02d8ebe87ab49748ebc8ec03d8"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:1545de0a69a16ced5767bae8cca1801b842e6e49e96f5e4a8a5acbef023d970b"},
{file = "lxml-5.3.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:165fcdc2f40fc0fe88a3c3c06c9c2a097388a90bda6a16e6f7c9199c903c9b8e"},
{file = "lxml-5.3.2.tar.gz", hash = "sha256:773947d0ed809ddad824b7b14467e1a481b8976e87278ac4a730c2f7c7fcddc1"},
]
[package.extras]
cssselect = ["cssselect (>=0.7)"]
html-clean = ["lxml_html_clean"]
html5 = ["html5lib"]
htmlsoup = ["BeautifulSoup4"]
source = ["Cython (>=3.0.11,<3.1.0)"]
[[package]]
name = "markdown-it-py"
version = "3.0.0"
@@ -3221,6 +3476,33 @@ rsa = ["cryptography (>=3.0.0)"]
signals = ["blinker (>=1.4.0)"]
signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
[[package]]
name = "openai"
version = "1.82.0"
description = "The official Python library for the openai API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "openai-1.82.0-py3-none-any.whl", hash = "sha256:8c40647fea1816516cb3de5189775b30b5f4812777e40b8768f361f232b61b30"},
{file = "openai-1.82.0.tar.gz", hash = "sha256:b0a009b9a58662d598d07e91e4219ab4b1e3d8ba2db3f173896a92b9b874d1a7"},
]
[package.dependencies]
anyio = ">=3.5.0,<5"
distro = ">=1.7.0,<2"
httpx = ">=0.23.0,<1"
jiter = ">=0.4.0,<1"
pydantic = ">=1.9.0,<3"
sniffio = "*"
tqdm = ">4"
typing-extensions = ">=4.11,<5"
[package.extras]
datalib = ["numpy (>=1)", "pandas (>=1.2.3)", "pandas-stubs (>=1.1.0.11)"]
realtime = ["websockets (>=13,<16)"]
voice-helpers = ["numpy (>=2.0.2)", "sounddevice (>=0.5.1)"]
[[package]]
name = "opentelemetry-api"
version = "1.32.1"
@@ -3578,26 +3860,26 @@ testing = ["google-api-core (>=1.31.5)"]
[[package]]
name = "protobuf"
version = "6.30.2"
version = "6.31.1"
description = ""
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "protobuf-6.30.2-cp310-abi3-win32.whl", hash = "sha256:b12ef7df7b9329886e66404bef5e9ce6a26b54069d7f7436a0853ccdeb91c103"},
{file = "protobuf-6.30.2-cp310-abi3-win_amd64.whl", hash = "sha256:7653c99774f73fe6b9301b87da52af0e69783a2e371e8b599b3e9cb4da4b12b9"},
{file = "protobuf-6.30.2-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:0eb523c550a66a09a0c20f86dd554afbf4d32b02af34ae53d93268c1f73bc65b"},
{file = "protobuf-6.30.2-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:50f32cc9fd9cb09c783ebc275611b4f19dfdfb68d1ee55d2f0c7fa040df96815"},
{file = "protobuf-6.30.2-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:4f6c687ae8efae6cf6093389a596548214467778146b7245e886f35e1485315d"},
{file = "protobuf-6.30.2-cp39-cp39-win32.whl", hash = "sha256:524afedc03b31b15586ca7f64d877a98b184f007180ce25183d1a5cb230ee72b"},
{file = "protobuf-6.30.2-cp39-cp39-win_amd64.whl", hash = "sha256:acec579c39c88bd8fbbacab1b8052c793efe83a0a5bd99db4a31423a25c0a0e2"},
{file = "protobuf-6.30.2-py3-none-any.whl", hash = "sha256:ae86b030e69a98e08c77beab574cbcb9fff6d031d57209f574a5aea1445f4b51"},
{file = "protobuf-6.30.2.tar.gz", hash = "sha256:35c859ae076d8c56054c25b59e5e59638d86545ed6e2b6efac6be0b6ea3ba048"},
{file = "protobuf-6.31.1-cp310-abi3-win32.whl", hash = "sha256:7fa17d5a29c2e04b7d90e5e32388b8bfd0e7107cd8e616feef7ed3fa6bdab5c9"},
{file = "protobuf-6.31.1-cp310-abi3-win_amd64.whl", hash = "sha256:426f59d2964864a1a366254fa703b8632dcec0790d8862d30034d8245e1cd447"},
{file = "protobuf-6.31.1-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:6f1227473dc43d44ed644425268eb7c2e488ae245d51c6866d19fe158e207402"},
{file = "protobuf-6.31.1-cp39-abi3-manylinux2014_aarch64.whl", hash = "sha256:a40fc12b84c154884d7d4c4ebd675d5b3b5283e155f324049ae396b95ddebc39"},
{file = "protobuf-6.31.1-cp39-abi3-manylinux2014_x86_64.whl", hash = "sha256:4ee898bf66f7a8b0bd21bce523814e6fbd8c6add948045ce958b73af7e8878c6"},
{file = "protobuf-6.31.1-cp39-cp39-win32.whl", hash = "sha256:0414e3aa5a5f3ff423828e1e6a6e907d6c65c1d5b7e6e975793d5590bdeecc16"},
{file = "protobuf-6.31.1-cp39-cp39-win_amd64.whl", hash = "sha256:8764cf4587791e7564051b35524b72844f845ad0bb011704c3736cce762d8fe9"},
{file = "protobuf-6.31.1-py3-none-any.whl", hash = "sha256:720a6c7e6b77288b85063569baae8536671b39f15cc22037ec7045658d80489e"},
{file = "protobuf-6.31.1.tar.gz", hash = "sha256:d8cac4c982f0b957a4dc73a80e2ea24fab08e679c0de9deb835f4a12d69aca9a"},
]
[[package]]
name = "prowler"
version = "5.6.0"
version = "5.8.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">3.9.1,<3.13"
@@ -3645,6 +3927,7 @@ numpy = "2.0.2"
pandas = "2.2.3"
py-ocsf-models = "0.3.1"
pydantic = "1.10.21"
pygithub = "2.5.0"
python-dateutil = ">=2.9.0.post0,<3.0.0"
pytz = "2025.1"
schema = "0.7.7"
@@ -3657,7 +3940,7 @@ tzlocal = "5.3.1"
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "master"
resolved_reference = "9828824b737b8deda61f4a6646b54e0ad45033b9"
resolved_reference = "ea97de7f43a2063476b49f7697bb6c7b51137c11"
[[package]]
name = "psutil"
@@ -3834,11 +4117,11 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[[package]]
name = "pycurl"
@@ -3949,6 +4232,26 @@ typing-extensions = ">=4.2.0"
dotenv = ["python-dotenv (>=0.10.4)"]
email = ["email-validator (>=1.0.3)"]
[[package]]
name = "pygithub"
version = "2.5.0"
description = "Use the full Github API v3"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyGithub-2.5.0-py3-none-any.whl", hash = "sha256:b0b635999a658ab8e08720bdd3318893ff20e2275f6446fcf35bf3f44f2c0fd2"},
{file = "pygithub-2.5.0.tar.gz", hash = "sha256:e1613ac508a9be710920d26eb18b1905ebd9926aa49398e88151c1b526aad3cf"},
]
[package.dependencies]
Deprecated = "*"
pyjwt = {version = ">=2.4.0", extras = ["crypto"]}
pynacl = ">=1.4.0"
requests = ">=2.14.0"
typing-extensions = ">=4.0.0"
urllib3 = ">=1.26.0"
[[package]]
name = "pygments"
version = "2.19.1"
@@ -4013,6 +4316,33 @@ tomlkit = ">=0.10.1"
spelling = ["pyenchant (>=3.2,<4.0)"]
testutils = ["gitpython (>3)"]
[[package]]
name = "pynacl"
version = "1.5.0"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
{file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
{file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
{file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
]
[package.dependencies]
cffi = ">=1.4.1"
[package.extras]
docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
[[package]]
name = "pyparsing"
version = "3.2.3"
@@ -4236,6 +4566,27 @@ files = [
{file = "python_memcached-1.62-py2.py3-none-any.whl", hash = "sha256:1bdd8d2393ff53e80cd5e9442d750e658e0b35c3eebb3211af137303e3b729d1"},
]
[[package]]
name = "python3-saml"
version = "1.16.0"
description = "Saml Python Toolkit. Add SAML support to your Python software using this library"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "python3-saml-1.16.0.tar.gz", hash = "sha256:97c9669aecabc283c6e5fb4eb264f446b6e006f5267d01c9734f9d8bffdac133"},
{file = "python3_saml-1.16.0-py2-none-any.whl", hash = "sha256:c49097863c278ff669a337a96c46dc1f25d16307b4bb2679d2d1733cc4f5176a"},
{file = "python3_saml-1.16.0-py3-none-any.whl", hash = "sha256:20b97d11b04f01ee22e98f4a38242e2fea2e28fbc7fbc9bdd57cab5ac7fc2d0d"},
]
[package.dependencies]
isodate = ">=0.6.1"
lxml = ">=4.6.5,<4.7.0 || >4.7.0"
xmlsec = ">=1.3.9"
[package.extras]
test = ["coverage (>=4.5.2)", "flake8 (>=3.6.0,<=5.0.0)", "freezegun (>=0.3.11,<=1.1.0)", "pytest (>=4.6)"]
[[package]]
name = "pytz"
version = "2025.1"
@@ -4376,19 +4727,19 @@ typing-extensions = {version = ">=4.4.0", markers = "python_version < \"3.13\""}
[[package]]
name = "requests"
version = "2.32.3"
version = "2.32.4"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
{file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"},
{file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"},
{file = "requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c"},
{file = "requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422"},
]
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = ">=2,<4"
charset_normalizer = ">=2,<4"
idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<3"
@@ -5050,7 +5401,7 @@ version = "4.67.1"
description = "Fast, Extensible Progress Meter"
optional = false
python-versions = ">=3.7"
groups = ["dev"]
groups = ["main", "dev"]
files = [
{file = "tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2"},
{file = "tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2"},
@@ -5341,6 +5692,77 @@ files = [
{file = "xlsxwriter-3.2.3.tar.gz", hash = "sha256:ad6fd41bdcf1b885876b1f6b7087560aecc9ae5a9cc2ba97dcac7ab2e210d3d5"},
]
[[package]]
name = "xmlsec"
version = "1.3.14"
description = "Python bindings for the XML Security Library"
optional = false
python-versions = ">=3.5"
groups = ["main"]
files = [
{file = "xmlsec-1.3.14-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4dea6df3ffcb65d0b215678c3a0fe7bbc66785d6eae81291296e372498bad43a"},
{file = "xmlsec-1.3.14-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1fa1311f7489d050dde9028f5a2b5849c2927bb09c9a93491cb2f28fdc563912"},
{file = "xmlsec-1.3.14-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28cd9f513cf01dc0c5b9d9f0728714ecde2e7f46b3b6f63de91f4ae32f3008b3"},
{file = "xmlsec-1.3.14-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:77749b338503fb6e151052c664064b34264f4168e2cb0cca1de78b7e5312a783"},
{file = "xmlsec-1.3.14-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4af81ce8044862ec865782efd353d22abdcd95b92364eef3c934de57ae6d5852"},
{file = "xmlsec-1.3.14-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:cf35a25be3eb6263b2e0544ba26294651113fab79064f994d347a2ca5973e8e2"},
{file = "xmlsec-1.3.14-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:004e8a82e26728bf8a60f8ece1ef3ffafdac30ef538139dfe28870e8503ca64a"},
{file = "xmlsec-1.3.14-cp310-cp310-win32.whl", hash = "sha256:e6cbc914d77678db0c8bc39e723d994174633d18f9d6be4665ec29cce978a96d"},
{file = "xmlsec-1.3.14-cp310-cp310-win_amd64.whl", hash = "sha256:4922afa9234d1c5763950b26c328a5320019e55eb6000272a79dfe54fee8e704"},
{file = "xmlsec-1.3.14-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7799a9ff3593f9dd43464e18b1a621640bffc40456c47c23383727f937dca7fc"},
{file = "xmlsec-1.3.14-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1fe23c2dd5f5dbcb24f40e2c1061e2672a32aabee7cf8ac5337036a485607d72"},
{file = "xmlsec-1.3.14-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0be3b7a28e54a03b87faf07fb3c6dc3e50a2c79b686718c3ad08300b8bf6bb67"},
{file = "xmlsec-1.3.14-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:48e894ad3e7de373f56efc09d6a56f7eae73a8dd4cec8943313134849e9c6607"},
{file = "xmlsec-1.3.14-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:204d3c586b8bd6f02a5d4c59850a8157205569d40c32567f49576fa5795d897d"},
{file = "xmlsec-1.3.14-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6679cec780386d848e7351d4b0de92c4483289ea4f0a2187e216159f939a4c6b"},
{file = "xmlsec-1.3.14-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c4d41c83c8a2b8d8030204391ebeb6174fbdb044f0331653c4b5a4ce4150bcc0"},
{file = "xmlsec-1.3.14-cp311-cp311-win32.whl", hash = "sha256:df4aa0782a53032fd35e18dcd6d328d6126324bfcfdef0cb5c2856f25b4b6f94"},
{file = "xmlsec-1.3.14-cp311-cp311-win_amd64.whl", hash = "sha256:1072878301cb9243a54679e0520e6a5be2266c07a28b0ecef9e029d05a90ffcd"},
{file = "xmlsec-1.3.14-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:1eb3dcf244a52f796377112d8f238dbb522eb87facffb498425dc8582a84a6bf"},
{file = "xmlsec-1.3.14-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:330147ce59fbe56a9be5b2085d739c55a569f112576b3f1b33681f87416eaf33"},
{file = "xmlsec-1.3.14-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed4034939d8566ccdcd3b4e4f23c63fd807fb8763ae5668d59a19e11640a8242"},
{file = "xmlsec-1.3.14-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a98eadfcb0c3b23ccceb7a2f245811f8d784bd287640dcfe696a26b9db1e2fc0"},
{file = "xmlsec-1.3.14-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:86ff7b2711557c1087b72b0a1a88d82eafbf2a6d38b97309a6f7101d4a7041c3"},
{file = "xmlsec-1.3.14-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:774d5d1e45f07f953c1cc14fd055c1063f0725f7248b6b0e681f59fd8638934d"},
{file = "xmlsec-1.3.14-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:bd10ca3201f164482775a7ce61bf7ee9aade2e7d032046044dd0f6f52c91d79d"},
{file = "xmlsec-1.3.14-cp312-cp312-win32.whl", hash = "sha256:19c86bab1498e4c2e56d8e2c878f461ccb6e56b67fd7522b0c8fda46d8910781"},
{file = "xmlsec-1.3.14-cp312-cp312-win_amd64.whl", hash = "sha256:d0762f4232bce2c7f6c0af329db8b821b4460bbe123a2528fb5677d03db7a4b5"},
{file = "xmlsec-1.3.14-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:03ccba7dacf197850de954666af0221c740a5de631a80136362a1559223fab75"},
{file = "xmlsec-1.3.14-cp36-cp36m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c12900e1903e289deb84eb893dca88591d6884d3e3cda4fb711b8812118416e8"},
{file = "xmlsec-1.3.14-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6566434e2e5c58e472362a6187f208601f1627a148683a6f92bd16479f1d9e20"},
{file = "xmlsec-1.3.14-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:2401e162aaab7d9416c3405bac7a270e5f370988a0f1f46f0f29b735edba87e1"},
{file = "xmlsec-1.3.14-cp36-cp36m-win32.whl", hash = "sha256:ba3b39c493e3b04354615068a3218f30897fcc2f42c6d8986d0c1d63aca87782"},
{file = "xmlsec-1.3.14-cp36-cp36m-win_amd64.whl", hash = "sha256:4edd8db4df04bbac9c4a5ab4af855b74fe2bf2c248d07cac2e6d92a485f1a685"},
{file = "xmlsec-1.3.14-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b6dd86f440fec9242515c64f0be93fec8b4289287db1f6de2651eee9995aaecb"},
{file = "xmlsec-1.3.14-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ad1634cabe0915fe2a12e142db0ed2daf5be80cbe3891a2cecbba0750195cc6b"},
{file = "xmlsec-1.3.14-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dba457ff87c39cbae3c5020475a728d24bbd9d00376df9af9724cd3bb59ff07a"},
{file = "xmlsec-1.3.14-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:12d90059308bb0c1b94bde065784e6852999d08b91bcb2048c17e62b954acb07"},
{file = "xmlsec-1.3.14-cp37-cp37m-win32.whl", hash = "sha256:ce4e165a1436697e5e39587c4fba24db4545a5c9801e0d749f1afd09ad3ab901"},
{file = "xmlsec-1.3.14-cp37-cp37m-win_amd64.whl", hash = "sha256:7e8e0171916026cbe8e2022c959558d02086655fd3c3466f2bc0451b09cf9ee8"},
{file = "xmlsec-1.3.14-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c42735cc68fdb4c6065cf0a0701dfff3a12a1734c63a36376349af9a5481f27b"},
{file = "xmlsec-1.3.14-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:38e035bf48300b7dbde2dd01d3b8569f8584fc9c73809be13886e6b6c77b74fb"},
{file = "xmlsec-1.3.14-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:73eabf5ef58189d81655058cf328c1dfa9893d89f1bff5fc941481f08533f338"},
{file = "xmlsec-1.3.14-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bddd2a2328b4e08c8a112e06cf2cd2b4d281f4ad94df15b4cef18f06cdc49d78"},
{file = "xmlsec-1.3.14-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:57fed3bc7943681c9ed4d2221600ab440f060d8d1a8f92f346f2b41effe175b8"},
{file = "xmlsec-1.3.14-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:147934bd39dfd840663fb6b920ea9201455fa886427975713f1b42d9f20b9b29"},
{file = "xmlsec-1.3.14-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:e732a75fcb6b84872b168f972fbbf3749baf76308635f14015d1d35ed0c5719c"},
{file = "xmlsec-1.3.14-cp38-cp38-win32.whl", hash = "sha256:b109cdf717257fd4daa77c1d3ec8a3fb2a81318a6d06a36c55a8a53ae381ae5e"},
{file = "xmlsec-1.3.14-cp38-cp38-win_amd64.whl", hash = "sha256:b7ba2ea38e3d9efa520b14f3c0b7d99a7c055244ae5ba8bc9f4ca73b18f3a215"},
{file = "xmlsec-1.3.14-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1b9b5de6bc69fdec23147e5f712cb05dc86df105462f254f140d743cc680cc7b"},
{file = "xmlsec-1.3.14-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:82ac81deb7d7bf5cc8a748148948e5df5386597ff43fb92ec651cc5c7addb0e7"},
{file = "xmlsec-1.3.14-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0bae37b2920115cf00759ee9fb7841cbdebcef3a8a92734ab93ae8fa41ac581d"},
{file = "xmlsec-1.3.14-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4fac2a787ae3b9fb761f9aec6b9f10f2d1c1b87abb574ebd8ff68435bdc97e3d"},
{file = "xmlsec-1.3.14-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:34c61ec0c0e70fda710290ae74b9efe1928d9242ed82c4eecf97aa696cff68e6"},
{file = "xmlsec-1.3.14-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:995e87acecc263a2f6f2aa3cc204268f651cac8f4d7a2047f11b2cd49979cc38"},
{file = "xmlsec-1.3.14-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:2f84a1c509c52773365645a87949081ee9ea9c535cd452048cc8ca4ad3b45666"},
{file = "xmlsec-1.3.14-cp39-cp39-win32.whl", hash = "sha256:7882963e9cb9c0bd0e8c2715a29159a366417ff4a30d8baf42b05bc5cf249446"},
{file = "xmlsec-1.3.14-cp39-cp39-win_amd64.whl", hash = "sha256:a487c3d144f791c32f5e560aa27a705fba23171728b8a8511f36de053ff6bc93"},
{file = "xmlsec-1.3.14.tar.gz", hash = "sha256:934f804f2f895bcdb86f1eaee236b661013560ee69ec108d29cdd6e5f292a2d9"},
]
[package.dependencies]
lxml = ">=3.8"
[[package]]
name = "yarl"
version = "1.20.0"
@@ -5483,4 +5905,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
content-hash = "051924735a7069c8393fefc18fc2c310b196ea24ad41b8c984dc5852683d0407"
content-hash = "6802b33984c2f8438c9dc02dac0a0c14d5a78af60251bd0c80ca59bc2182c48e"
+7 -4
@@ -7,8 +7,8 @@ authors = [{name = "Prowler Engineering", email = "engineering@prowler.com"}]
dependencies = [
"celery[pytest] (>=5.4.0,<6.0.0)",
"dj-rest-auth[with_social,jwt] (==7.0.1)",
"django==5.1.8",
"django-allauth==65.4.1",
"django==5.1.10",
"django-allauth[saml] (>=65.8.0,<66.0.0)",
"django-celery-beat (>=2.7.0,<3.0.0)",
"django-celery-results (>=2.5.1,<3.0.0)",
"django-cors-headers==4.4.0",
@@ -23,11 +23,14 @@ dependencies = [
"drf-spectacular==0.27.2",
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10"
"uuid6==2024.7.10",
"openai (>=1.82.0,<2.0.0)",
"xmlsec==1.3.14"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -35,7 +38,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.9.0"
version = "1.10.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
+37 -27
@@ -17,6 +17,8 @@ class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
def pre_social_login(self, request, sociallogin):
# Link existing accounts with the same email address
email = sociallogin.account.extra_data.get("email")
if sociallogin.provider.id == "saml":
email = sociallogin.user.email
if email:
existing_user = self.get_user_by_email(email)
if existing_user:
@@ -29,33 +31,41 @@ class ProwlerSocialAccountAdapter(DefaultSocialAccountAdapter):
"""
with transaction.atomic(using=MainRouter.admin_db):
user = super().save_user(request, sociallogin, form)
user.save(using=MainRouter.admin_db)
social_account_name = sociallogin.account.extra_data.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
provider = sociallogin.provider.id
extra = sociallogin.account.extra_data
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
if provider != "saml":
# Handle other providers (e.g., GitHub, Google)
user.save(using=MainRouter.admin_db)
social_account_name = extra.get("name")
if social_account_name:
user.name = social_account_name
user.save(using=MainRouter.admin_db)
tenant = Tenant.objects.using(MainRouter.admin_db).create(
name=f"{user.email.split('@')[0]} default tenant"
)
with rls_transaction(str(tenant.id)):
Membership.objects.using(MainRouter.admin_db).create(
user=user, tenant=tenant, role=Membership.RoleChoices.OWNER
)
role = Role.objects.using(MainRouter.admin_db).create(
name="admin",
tenant_id=tenant.id,
manage_users=True,
manage_account=True,
manage_billing=True,
manage_providers=True,
manage_integrations=True,
manage_scans=True,
unlimited_visibility=True,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
else:
request.session["saml_user_created"] = str(user.id)
return user
+20 -46
@@ -1,4 +1,5 @@
from django.core.exceptions import ObjectDoesNotExist
from django.db import transaction
from rest_framework import permissions
from rest_framework.exceptions import NotAuthenticated
from rest_framework.filters import SearchFilter
@@ -46,9 +47,11 @@ class BaseViewSet(ModelViewSet):
class BaseRLSViewSet(BaseViewSet):
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
def dispatch(self, request, *args, **kwargs):
with transaction.atomic():
return super().dispatch(request, *args, **kwargs)
def initial(self, request, *args, **kwargs):
# Ideally, this logic would be in the `.setup()` method, but DRF viewsets don't call it
# https://docs.djangoproject.com/en/5.1/ref/class-based-views/base/#django.views.generic.base.View.setup
if request.auth is None:
@@ -58,19 +61,9 @@ class BaseRLSViewSet(BaseViewSet):
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
self.request.tenant_id = tenant_id
self._rls_cm = rls_transaction(tenant_id)
self._rls_cm.__enter__()
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if hasattr(self, "_rls_cm"):
self._rls_cm.__exit__(None, None, None)
del self._rls_cm
return response
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)
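# How the refactor above hangs together (a sketch under assumptions, not the
# actual implementation): `rls_transaction` presumably sets a Postgres session
# variable that the row-level-security policies read, roughly:
#
#   from contextlib import contextmanager
#   from django.db import connection
#
#   @contextmanager
#   def rls_transaction(value, parameter="api.tenant_id"):
#       with connection.cursor() as cursor:
#           # A transaction-local setting only survives until the enclosing
#           # transaction ends, which is why dispatch() now wraps the whole
#           # request in transaction.atomic().
#           cursor.execute("SELECT set_config(%s, %s, TRUE)", [parameter, value])
#           yield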
def get_serializer_context(self):
context = super().get_serializer_context()
@@ -80,7 +73,8 @@ class BaseRLSViewSet(BaseViewSet):
class BaseTenantViewset(BaseViewSet):
def dispatch(self, request, *args, **kwargs):
tenant = super().dispatch(request, *args, **kwargs)
with transaction.atomic():
tenant = super().dispatch(request, *args, **kwargs)
try:
# If the request is a POST, create the admin role
@@ -115,8 +109,6 @@ class BaseTenantViewset(BaseViewSet):
pass # Tenant might not exist, handle gracefully
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
if request.auth is None:
raise NotAuthenticated
@@ -125,27 +117,19 @@ class BaseTenantViewset(BaseViewSet):
raise NotAuthenticated("Tenant ID is not present in token")
user_id = str(request.user.id)
self._rls_cm = rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR)
self._rls_cm.__enter__()
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if hasattr(self, "_rls_cm"):
self._rls_cm.__exit__(None, None, None)
del self._rls_cm
return response
with rls_transaction(value=user_id, parameter=POSTGRES_USER_VAR):
return super().initial(request, *args, **kwargs)
class BaseUserViewset(BaseViewSet):
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
def dispatch(self, request, *args, **kwargs):
with transaction.atomic():
return super().dispatch(request, *args, **kwargs)
def initial(self, request, *args, **kwargs):
# TODO refactor after improving RLS on users
if request.stream is not None and request.stream.method == "POST":
return
return super().initial(request, *args, **kwargs)
if request.auth is None:
raise NotAuthenticated
@@ -153,16 +137,6 @@ class BaseUserViewset(BaseViewSet):
if tenant_id is None:
raise NotAuthenticated("Tenant ID is not present in token")
self.request.tenant_id = tenant_id
self._rls_cm = rls_transaction(tenant_id)
self._rls_cm.__enter__()
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if hasattr(self, "_rls_cm"):
self._rls_cm.__exit__(None, None, None)
del self._rls_cm
return response
with rls_transaction(tenant_id):
self.request.tenant_id = tenant_id
return super().initial(request, *args, **kwargs)
+4
@@ -196,6 +196,10 @@ def generate_compliance_overview_template(prowler_compliance: dict):
requirement_dict = {
"name": requirement.Name or requirement.Id,
"description": requirement.Description,
"tactics": getattr(requirement, "Tactics", []),
"subtechniques": getattr(requirement, "SubTechniques", []),
"platforms": getattr(requirement, "Platforms", []),
"technique_url": getattr(requirement, "TechniqueURL", ""),
"attributes": [
dict(attribute) for attribute in requirement.Attributes
],
+12
@@ -529,3 +529,15 @@ class IntegrationTypeEnum(EnumType):
class IntegrationTypeEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("integration_type", *args, **kwargs)
# Postgres enum definition for Processor type
class ProcessorTypeEnum(EnumType):
enum_type_name = "processor_type"
class ProcessorTypeEnumField(PostgresEnumField):
def __init__(self, *args, **kwargs):
super().__init__("processor_type", *args, **kwargs)
+5
@@ -57,6 +57,11 @@ class TaskInProgressException(TaskManagementError):
super().__init__()
# Provider connection errors
class ProviderConnectionError(Exception):
"""Base exception for provider connection errors."""
def custom_exception_handler(exc, context):
if isinstance(exc, django_validation_error):
if hasattr(exc, "error_dict"):
+89
@@ -1,5 +1,6 @@
from datetime import date, datetime, timedelta, timezone
from dateutil.parser import parse
from django.conf import settings
from django.db.models import Q
from django_filters.rest_framework import (
@@ -28,6 +29,7 @@ from api.models import (
Invitation,
Membership,
PermissionChoices,
Processor,
Provider,
ProviderGroup,
ProviderSecret,
@@ -338,6 +340,8 @@ class ResourceFilter(ProviderRelationshipFilterSet):
tags = CharFilter(method="filter_tag")
inserted_at = DateFilter(field_name="inserted_at", lookup_expr="date")
updated_at = DateFilter(field_name="updated_at", lookup_expr="date")
scan = UUIDFilter(field_name="provider__scan", lookup_expr="exact")
scan__in = UUIDInFilter(field_name="provider__scan", lookup_expr="in")
class Meta:
model = Resource
@@ -352,6 +356,82 @@ class ResourceFilter(ProviderRelationshipFilterSet):
"updated_at": ["gte", "lte"],
}
def filter_queryset(self, queryset):
if not (self.data.get("scan") or self.data.get("scan__in")) and not (
self.data.get("updated_at")
or self.data.get("updated_at__date")
or self.data.get("updated_at__gte")
or self.data.get("updated_at__lte")
):
raise ValidationError(
[
{
"detail": "At least one date filter is required: filter[updated_at], filter[updated_at.gte], "
"or filter[updated_at.lte].",
"status": 400,
"source": {"pointer": "/data/attributes/updated_at"},
"code": "required",
}
]
)
gte_date = (
parse(self.data.get("updated_at__gte")).date()
if self.data.get("updated_at__gte")
else datetime.now(timezone.utc).date()
)
lte_date = (
parse(self.data.get("updated_at__lte")).date()
if self.data.get("updated_at__lte")
else datetime.now(timezone.utc).date()
)
if abs(lte_date - gte_date) > timedelta(
days=settings.FINDINGS_MAX_DAYS_IN_RANGE
):
raise ValidationError(
[
{
"detail": f"The date range cannot exceed {settings.FINDINGS_MAX_DAYS_IN_RANGE} days.",
"status": 400,
"source": {"pointer": "/data/attributes/updated_at"},
"code": "invalid",
}
]
)
return super().filter_queryset(queryset)
def filter_tag_key(self, queryset, name, value):
return queryset.filter(Q(tags__key=value) | Q(tags__key__icontains=value))
def filter_tag_value(self, queryset, name, value):
return queryset.filter(Q(tags__value=value) | Q(tags__value__icontains=value))
def filter_tag(self, queryset, name, value):
# We won't know what the user wants to filter on just based on the value,
# and we don't want to build special filtering logic for every possible
# provider tag spec, so we'll just do a full text search
return queryset.filter(tags__text_search=value)
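# Usage sketch (endpoint path assumed; the filter syntax matches the JSON:API
# style used in the error messages above):
#
#   GET /api/v1/resources?filter[updated_at.gte]=2025-07-01&filter[tag]=prod
#
# would pass the scan/date-range guard in filter_queryset() and run the tag
# value through the full-text search on resource tags.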
class LatestResourceFilter(ProviderRelationshipFilterSet):
tag_key = CharFilter(method="filter_tag_key")
tag_value = CharFilter(method="filter_tag_value")
tag = CharFilter(method="filter_tag")
tags = CharFilter(method="filter_tag")
class Meta:
model = Resource
fields = {
"provider": ["exact", "in"],
"uid": ["exact", "icontains"],
"name": ["exact", "icontains"],
"region": ["exact", "icontains", "in"],
"service": ["exact", "icontains", "in"],
"type": ["exact", "icontains", "in"],
}
def filter_tag_key(self, queryset, name, value):
return queryset.filter(Q(tags__key=value) | Q(tags__key__icontains=value))
@@ -704,3 +784,12 @@ class IntegrationFilter(FilterSet):
fields = {
"inserted_at": ["date", "gte", "lte"],
}
class ProcessorFilter(FilterSet):
processor_type = ChoiceFilter(choices=Processor.ProcessorChoices.choices)
processor_type__in = ChoiceInFilter(
choices=Processor.ProcessorChoices.choices,
field_name="processor_type",
lookup_expr="in",
)
@@ -3,7 +3,7 @@
"model": "api.user",
"pk": "8b38e2eb-6689-4f1e-a4ba-95b275130200",
"fields": {
"password": "pbkdf2_sha256$720000$vA62S78kog2c2ytycVQdke$Fp35GVLLMyy5fUq3krSL9I02A+ocQ+RVa4S22LIAO5s=",
"password": "pbkdf2_sha256$870000$Z63pGJ7nre48hfcGbk5S0O$rQpKczAmijs96xa+gPVJifpT3Fetb8DOusl5Eq6gxac=",
"last_login": null,
"name": "Devie Prowlerson",
"email": "dev@prowler.com",
@@ -16,7 +16,7 @@
"model": "api.user",
"pk": "b6493a3a-c997-489b-8b99-278bf74de9f6",
"fields": {
"password": "pbkdf2_sha256$720000$vA62S78kog2c2ytycVQdke$Fp35GVLLMyy5fUq3krSL9I02A+ocQ+RVa4S22LIAO5s=",
"password": "pbkdf2_sha256$870000$Z63pGJ7nre48hfcGbk5S0O$rQpKczAmijs96xa+gPVJifpT3Fetb8DOusl5Eq6gxac=",
"last_login": null,
"name": "Devietoo Prowlerson",
"email": "dev2@prowler.com",
@@ -0,0 +1,80 @@
from django.contrib.sites.models import Site
from django.core.management.base import BaseCommand
from django.db import DEFAULT_DB_ALIAS, connection, connections, transaction
from django.db.migrations.recorder import MigrationRecorder
def table_exists(table_name):
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT EXISTS (
SELECT 1 FROM information_schema.tables
WHERE table_name = %s
)
""",
[table_name],
)
return cursor.fetchone()[0]
class Command(BaseCommand):
help = "Fix migration inconsistency between socialaccount and sites"
def add_arguments(self, parser):
parser.add_argument(
"--database",
default=DEFAULT_DB_ALIAS,
help="Specifies the database to operate on.",
)
def handle(self, *args, **options):
db = options["database"]
connection = connections[db]
recorder = MigrationRecorder(connection)
applied = set(recorder.applied_migrations())
has_social = ("socialaccount", "0001_initial") in applied
with connection.cursor() as cursor:
cursor.execute(
"""
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_name = 'django_site'
);
"""
)
site_table_exists = cursor.fetchone()[0]
if has_social and not site_table_exists:
self.stdout.write(
f"Detected inconsistency in '{db}'. Creating 'django_site' table manually..."
)
with transaction.atomic(using=db):
with connection.schema_editor() as schema_editor:
schema_editor.create_model(Site)
recorder.record_applied("sites", "0001_initial")
recorder.record_applied("sites", "0002_alter_domain_unique")
self.stdout.write(
"Fixed: 'django_site' table created and migrations registered."
)
# Ensure the relationship table also exists
if not table_exists("socialaccount_socialapp_sites"):
self.stdout.write(
"Detected missing 'socialaccount_socialapp_sites' table. Creating manually..."
)
with connection.schema_editor() as schema_editor:
from allauth.socialaccount.models import SocialApp
schema_editor.create_model(
SocialApp._meta.get_field("sites").remote_field.through
)
self.stdout.write(
"Fixed: 'socialaccount_socialapp_sites' table created."
)
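# Assumed invocation (the command name comes from this file's name, which is
# not shown in the diff):
#
#   python manage.py <command_module_name> --database default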
@@ -0,0 +1,107 @@
# Generated by Django 5.1.10 on 2025-06-12 12:45
import uuid
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0029_findings_check_index_parent"),
]
operations = [
migrations.CreateModel(
name="LighthouseConfiguration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"name",
models.CharField(
help_text="Name of the configuration",
max_length=100,
validators=[django.core.validators.MinLengthValidator(3)],
),
),
(
"api_key",
models.BinaryField(
help_text="Encrypted API key for the LLM service"
),
),
(
"model",
models.CharField(
choices=[
("gpt-4o-2024-11-20", "GPT-4o v2024-11-20"),
("gpt-4o-2024-08-06", "GPT-4o v2024-08-06"),
("gpt-4o-2024-05-13", "GPT-4o v2024-05-13"),
("gpt-4o", "GPT-4o Default"),
("gpt-4o-mini-2024-07-18", "GPT-4o Mini v2024-07-18"),
("gpt-4o-mini", "GPT-4o Mini Default"),
],
default="gpt-4o-2024-08-06",
help_text="Must be one of the supported model names",
max_length=50,
),
),
(
"temperature",
models.FloatField(default=0, help_text="Must be between 0 and 1"),
),
(
"max_tokens",
models.IntegerField(
default=4000, help_text="Must be between 500 and 5000"
),
),
(
"business_context",
models.TextField(
blank=True,
default="",
help_text="Additional business context for this AI model configuration",
),
),
("is_active", models.BooleanField(default=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "lighthouse_configurations",
"abstract": False,
"constraints": [
models.UniqueConstraint(
fields=("tenant_id",),
name="unique_lighthouse_config_per_tenant",
),
],
},
),
migrations.AddConstraint(
model_name="lighthouseconfiguration",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_lighthouseconfiguration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -0,0 +1,24 @@
# Generated by Django 5.1.10 on 2025-06-23 10:04
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0030_lighthouseconfiguration"),
("django_celery_beat", "0019_alter_periodictasks_options"),
]
operations = [
migrations.AlterField(
model_name="scan",
name="scheduler_task",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="django_celery_beat.periodictask",
),
),
]
+150
@@ -0,0 +1,150 @@
# Generated by Django 5.1.10 on 2025-07-02 15:47
import uuid
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
import api.db_utils
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0031_scan_disable_on_cascade_periodic_tasks"),
]
operations = [
migrations.AlterField(
model_name="integration",
name="integration_type",
field=api.db_utils.IntegrationTypeEnumField(
choices=[
("amazon_s3", "Amazon S3"),
("aws_security_hub", "AWS Security Hub"),
("jira", "JIRA"),
("slack", "Slack"),
]
),
),
migrations.CreateModel(
name="SAMLToken",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
("expires_at", models.DateTimeField(editable=False)),
("token", models.JSONField(unique=True)),
(
"user",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
],
options={
"db_table": "saml_tokens",
},
),
migrations.CreateModel(
name="SAMLConfiguration",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
(
"email_domain",
models.CharField(
help_text="Email domain used to identify the tenant, e.g. prowlerdemo.com",
max_length=254,
unique=True,
),
),
(
"metadata_xml",
models.TextField(
help_text="Raw IdP metadata XML to configure SingleSignOnService, certificates, etc."
),
),
("created_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "saml_configurations",
},
),
migrations.AddConstraint(
model_name="samlconfiguration",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_samlconfiguration",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddConstraint(
model_name="samlconfiguration",
constraint=models.UniqueConstraint(
fields=("tenant",), name="unique_samlconfig_per_tenant"
),
),
migrations.CreateModel(
name="SAMLDomainIndex",
fields=[
(
"id",
models.BigAutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("email_domain", models.CharField(max_length=254, unique=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "saml_domain_index",
},
),
migrations.AddConstraint(
model_name="samldomainindex",
constraint=models.UniqueConstraint(
fields=("email_domain", "tenant"),
name="unique_resources_by_email_domain",
),
),
migrations.AddConstraint(
model_name="samldomainindex",
constraint=api.rls.BaseSecurityConstraint(
name="statements_on_samldomainindex",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -0,0 +1,34 @@
# Generated by Django 5.1.5 on 2025-03-03 15:46
from functools import partial
from django.db import migrations
from api.db_utils import PostgresEnumMigration, ProcessorTypeEnum, register_enum
from api.models import Processor
ProcessorTypeEnumMigration = PostgresEnumMigration(
enum_name="processor_type",
enum_values=tuple(
processor_type[0] for processor_type in Processor.ProcessorChoices.choices
),
)
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0032_saml"),
]
operations = [
migrations.RunPython(
ProcessorTypeEnumMigration.create_enum_type,
reverse_code=ProcessorTypeEnumMigration.drop_enum_type,
),
migrations.RunPython(
partial(register_enum, enum_class=ProcessorTypeEnum),
reverse_code=migrations.RunPython.noop,
),
]
@@ -0,0 +1,88 @@
# Generated by Django 5.1.5 on 2025-03-26 13:04
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
from api.rls import RowLevelSecurityConstraint
class Migration(migrations.Migration):
dependencies = [
("api", "0033_processors_enum"),
]
operations = [
migrations.CreateModel(
name="Processor",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("inserted_at", models.DateTimeField(auto_now_add=True)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"processor_type",
api.db_utils.ProcessorTypeEnumField(
choices=[("mutelist", "Mutelist")]
),
),
("configuration", models.JSONField(default=dict)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="api.tenant"
),
),
],
options={
"db_table": "processors",
"abstract": False,
"indexes": [
models.Index(
fields=["tenant_id", "id"], name="processor_tenant_id_idx"
),
models.Index(
fields=["tenant_id", "processor_type"],
name="processor_tenant_type_idx",
),
],
},
),
migrations.AddConstraint(
model_name="processor",
constraint=models.UniqueConstraint(
fields=("tenant_id", "processor_type"),
name="unique_processor_types_tenant",
),
),
migrations.AddConstraint(
model_name="processor",
constraint=RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_processor",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddField(
model_name="scan",
name="processor",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="scans",
related_query_name="scan",
to="api.processor",
),
),
]
@@ -0,0 +1,22 @@
import django.core.validators
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0034_processors"),
]
operations = [
migrations.AddField(
model_name="finding",
name="muted_reason",
field=models.TextField(
blank=True,
max_length=500,
null=True,
validators=[django.core.validators.MinLengthValidator(3)],
),
),
]
@@ -0,0 +1,30 @@
from functools import partial
from django.db import migrations
from api.db_utils import create_index_on_partitions, drop_index_on_partitions
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0035_finding_muted_reason"),
]
operations = [
migrations.RunPython(
partial(
create_index_on_partitions,
parent_table="resource_finding_mappings",
index_name="rfm_tenant_finding_idx",
columns="tenant_id, finding_id",
method="BTREE",
),
reverse_code=partial(
drop_index_on_partitions,
parent_table="resource_finding_mappings",
index_name="rfm_tenant_finding_idx",
),
),
]
@@ -0,0 +1,17 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0036_rfm_tenant_finding_index_partitions"),
]
operations = [
migrations.AddIndex(
model_name="resourcefindingmapping",
index=models.Index(
fields=["tenant_id", "finding_id"],
name="rfm_tenant_finding_idx",
),
),
]
@@ -0,0 +1,15 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0037_rfm_tenant_finding_index_parent"),
]
operations = [
migrations.AddField(
model_name="resource",
name="failed_findings_count",
field=models.IntegerField(default=0),
)
]
@@ -0,0 +1,20 @@
from django.contrib.postgres.operations import AddIndexConcurrently
from django.db import migrations, models
class Migration(migrations.Migration):
atomic = False
dependencies = [
("api", "0038_resource_failed_findings_count"),
]
operations = [
AddIndexConcurrently(
model_name="resource",
index=models.Index(
fields=["tenant_id", "-failed_findings_count", "id"],
name="resources_failed_findings_idx",
),
),
]
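# Note: AddIndexConcurrently issues CREATE INDEX CONCURRENTLY, which Postgres
# refuses to run inside a transaction block; that is why this migration sets
# atomic = False.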
+489 -19
@@ -1,13 +1,21 @@
import json
import logging
import re
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone
from uuid import UUID, uuid4
from cryptography.fernet import Fernet
from allauth.socialaccount.models import SocialApp
from config.custom_logging import BackendLogger
from config.settings.social_login import SOCIALACCOUNT_PROVIDERS
from cryptography.fernet import Fernet, InvalidToken
from django.conf import settings
from django.contrib.auth.models import AbstractBaseUser
from django.contrib.postgres.fields import ArrayField
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector, SearchVectorField
from django.contrib.sites.models import Site
from django.core.exceptions import ValidationError
from django.core.validators import MinLengthValidator
from django.db import models
from django.db.models import Q
@@ -19,12 +27,14 @@ from psqlextra.models import PostgresPartitionedModel
from psqlextra.types import PostgresPartitioningMethod
from uuid6 import uuid7
from api.db_router import MainRouter
from api.db_utils import (
CustomUserManager,
FindingDeltaEnumField,
IntegrationTypeEnumField,
InvitationStateEnumField,
MemberRoleEnumField,
ProcessorTypeEnumField,
ProviderEnumField,
ProviderSecretTypeEnumField,
ScanTriggerEnumField,
@@ -49,6 +59,8 @@ fernet = Fernet(settings.SECRETS_ENCRYPTION_KEY.encode())
# Convert Prowler Severity enum to Django TextChoices
SeverityChoices = enum_to_choices(Severity)
logger = logging.getLogger(BackendLogger.API)
class StatusChoices(models.TextChoices):
"""
@@ -398,20 +410,6 @@ class Scan(RowLevelSecurityProtectedModel):
name = models.CharField(
blank=True, null=True, max_length=100, validators=[MinLengthValidator(3)]
)
provider = models.ForeignKey(
Provider,
on_delete=models.CASCADE,
related_name="scans",
related_query_name="scan",
)
task = models.ForeignKey(
Task,
on_delete=models.CASCADE,
related_name="scans",
related_query_name="scan",
null=True,
blank=True,
)
trigger = ScanTriggerEnumField(
choices=TriggerChoices.choices,
)
@@ -427,11 +425,31 @@ class Scan(RowLevelSecurityProtectedModel):
completed_at = models.DateTimeField(null=True, blank=True)
next_scan_at = models.DateTimeField(null=True, blank=True)
scheduler_task = models.ForeignKey(
PeriodicTask, on_delete=models.CASCADE, null=True, blank=True
PeriodicTask, on_delete=models.SET_NULL, null=True, blank=True
)
output_location = models.CharField(blank=True, null=True, max_length=200)
# TODO: mutelist foreign key
provider = models.ForeignKey(
Provider,
on_delete=models.CASCADE,
related_name="scans",
related_query_name="scan",
)
task = models.ForeignKey(
Task,
on_delete=models.CASCADE,
related_name="scans",
related_query_name="scan",
null=True,
blank=True,
)
processor = models.ForeignKey(
"Processor",
on_delete=models.SET_NULL,
related_name="scans",
related_query_name="scan",
null=True,
blank=True,
)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "scans"
@@ -543,6 +561,8 @@ class Resource(RowLevelSecurityProtectedModel):
details = models.TextField(blank=True, null=True)
partition = models.TextField(blank=True, null=True)
failed_findings_count = models.IntegerField(default=0)
# Relationships
tags = models.ManyToManyField(
ResourceTag,
@@ -589,6 +609,10 @@ class Resource(RowLevelSecurityProtectedModel):
fields=["tenant_id", "provider_id"],
name="resources_tenant_provider_idx",
),
models.Index(
fields=["tenant_id", "-failed_findings_count", "id"],
name="resources_failed_findings_idx",
),
]
constraints = [
@@ -687,6 +711,9 @@ class Finding(PostgresPartitionedModel, RowLevelSecurityProtectedModel):
check_id = models.CharField(max_length=100, blank=False, null=False)
check_metadata = models.JSONField(default=dict, null=False)
muted = models.BooleanField(default=False, null=False)
muted_reason = models.TextField(
blank=True, null=True, validators=[MinLengthValidator(3)], max_length=500
)
compliance = models.JSONField(default=dict, null=True, blank=True)
# Denormalize resource data for performance
@@ -828,6 +855,12 @@ class ResourceFindingMapping(PostgresPartitionedModel, RowLevelSecurityProtected
# - tenant_id
# - id
indexes = [
models.Index(
fields=["tenant_id", "finding_id"],
name="rfm_tenant_finding_idx",
),
]
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "resource_id", "finding_id"),
@@ -932,6 +965,11 @@ class Invitation(RowLevelSecurityProtectedModel):
null=True,
)
def save(self, *args, **kwargs):
if self.email:
self.email = self.email.strip().lower()
super().save(*args, **kwargs)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "invitations"
@@ -1287,7 +1325,6 @@ class ScanSummary(RowLevelSecurityProtectedModel):
class Integration(RowLevelSecurityProtectedModel):
class IntegrationChoices(models.TextChoices):
S3 = "amazon_s3", _("Amazon S3")
SAML = "saml", _("SAML")
AWS_SECURITY_HUB = "aws_security_hub", _("AWS Security Hub")
JIRA = "jira", _("JIRA")
SLACK = "slack", _("Slack")
@@ -1361,6 +1398,273 @@ class IntegrationProviderRelationship(RowLevelSecurityProtectedModel):
]
class SAMLToken(models.Model):
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
expires_at = models.DateTimeField(editable=False)
token = models.JSONField(unique=True)
user = models.ForeignKey(User, on_delete=models.CASCADE)
class Meta:
db_table = "saml_tokens"
def save(self, *args, **kwargs):
if not self.expires_at:
self.expires_at = datetime.now(timezone.utc) + timedelta(seconds=15)
super().save(*args, **kwargs)
def is_expired(self) -> bool:
return datetime.now(timezone.utc) >= self.expires_at
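# A short-lived, single-use exchange token: the 15-second default above means
# a token minted at the end of the SAML handshake must be redeemed almost
# immediately. Sketch of the assumed flow (names hypothetical):
#
#   token = SAMLToken.objects.create(user=user, token=jwt_payload)
#   # ...redirect the browser carrying token.id...
#   if not token.is_expired():
#       ...  # exchange it for the real session credentials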
class SAMLDomainIndex(models.Model):
"""
Public index of SAML domains. Not protected by RLS; used for fast lookups during the SAML login flow.
"""
email_domain = models.CharField(max_length=254, unique=True)
tenant = models.ForeignKey("Tenant", on_delete=models.CASCADE)
class Meta:
db_table = "saml_domain_index"
constraints = [
models.UniqueConstraint(
fields=("email_domain", "tenant"),
name="unique_resources_by_email_domain",
),
BaseSecurityConstraint(
name="statements_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class SAMLConfiguration(RowLevelSecurityProtectedModel):
"""
Stores per-tenant SAML settings, including email domain and IdP metadata.
Automatically syncs to a SocialApp instance on save.
Note:
This model exists to provide a tenant-aware abstraction over SAML configuration.
It supports row-level security, custom validation, and metadata parsing, enabling
Prowler to expose a clean API and admin interface for managing SAML integrations.
Although Django Allauth uses the SocialApp model to store provider configuration,
it is not designed for multi-tenant use. SocialApp lacks support for tenant scoping,
email domain mapping, and structured metadata handling.
By managing SAMLConfiguration separately, we ensure:
- Strong isolation between tenants via RLS.
- Ownership of raw IdP metadata and its validation.
- An explicit link between SAML config and business-level identifiers (e.g. email domain).
- Programmatic transformation into the SocialApp format used by Allauth.
In short, this model acts as a secure and user-friendly layer over Allauth's lower-level primitives.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
email_domain = models.CharField(
max_length=254,
unique=True,
help_text="Email domain used to identify the tenant, e.g. prowlerdemo.com",
)
metadata_xml = models.TextField(
help_text="Raw IdP metadata XML to configure SingleSignOnService, certificates, etc."
)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class JSONAPIMeta:
resource_name = "saml-configurations"
class Meta:
db_table = "saml_configurations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
# 1 config per tenant
models.UniqueConstraint(
fields=["tenant"],
name="unique_samlconfig_per_tenant",
),
]
def clean(self, old_email_domain=None, is_create=False):
# Domain must not contain @
if "@" in self.email_domain:
raise ValidationError({"email_domain": "Domain must not contain @"})
# Enforce at most one config per tenant
qs = SAMLConfiguration.objects.filter(tenant=self.tenant)
# Exclude ourselves in case of update
if self.pk:
qs = qs.exclude(pk=self.pk)
if qs.exists():
raise ValidationError(
{"tenant": "A SAML configuration already exists for this tenant."}
)
# The email domain must be unique in the entire system
qs = SAMLConfiguration.objects.using(MainRouter.admin_db).filter(
email_domain__iexact=self.email_domain
)
if qs.exists() and old_email_domain != self.email_domain:
raise ValidationError(
{"tenant": "There is a problem with your email domain."}
)
# The entityID must be unique in the system
idp_settings = self._parsed_metadata
entity_id = idp_settings.get("entity_id")
if entity_id:
# Find any SocialApp with this entityID
q = SocialApp.objects.filter(provider="saml", provider_id=entity_id)
# If updating, exclude our own SocialApp from the check
if not is_create:
q = q.exclude(client_id=old_email_domain)
else:
q = q.exclude(client_id=self.email_domain)
if q.exists():
raise ValidationError(
{"metadata_xml": "There is a problem with your metadata."}
)
def save(self, *args, **kwargs):
self.email_domain = self.email_domain.strip().lower()
is_create = not SAMLConfiguration.objects.filter(pk=self.pk).exists()
if not is_create:
old = SAMLConfiguration.objects.get(pk=self.pk)
old_email_domain = old.email_domain
old_metadata_xml = old.metadata_xml
else:
old_email_domain = None
old_metadata_xml = None
self._parsed_metadata = self._parse_metadata()
self.clean(old_email_domain, is_create)
super().save(*args, **kwargs)
if is_create or (
old_email_domain != self.email_domain
or old_metadata_xml != self.metadata_xml
):
self._sync_social_app(old_email_domain)
# Sync the public index
if not is_create and old_email_domain and old_email_domain != self.email_domain:
SAMLDomainIndex.objects.filter(email_domain=old_email_domain).delete()
# Create/update the new domain index
SAMLDomainIndex.objects.update_or_create(
email_domain=self.email_domain, defaults={"tenant": self.tenant}
)
def delete(self, *args, **kwargs):
super().delete(*args, **kwargs)
SocialApp.objects.filter(provider="saml", client_id=self.email_domain).delete()
SAMLDomainIndex.objects.filter(email_domain=self.email_domain).delete()
def _parse_metadata(self):
"""
Parse the raw IdP metadata XML and extract:
- entity_id
- sso_url
- slo_url (may be None)
- x509cert (required)
"""
ns = {
"md": "urn:oasis:names:tc:SAML:2.0:metadata",
"ds": "http://www.w3.org/2000/09/xmldsig#",
}
try:
root = ET.fromstring(self.metadata_xml)
except ET.ParseError as e:
raise ValidationError({"metadata_xml": f"Invalid XML: {e}"})
# Entity ID
entity_id = root.attrib.get("entityID")
if not entity_id:
raise ValidationError({"metadata_xml": "Missing entityID in metadata."})
# SSO endpoint (must exist)
sso = root.find(".//md:IDPSSODescriptor/md:SingleSignOnService", ns)
if sso is None or "Location" not in sso.attrib:
raise ValidationError(
{"metadata_xml": "Missing SingleSignOnService in metadata."}
)
sso_url = sso.attrib["Location"]
# SLO endpoint (optional)
slo = root.find(".//md:IDPSSODescriptor/md:SingleLogoutService", ns)
slo_url = slo.attrib.get("Location") if slo is not None else None
# X.509 certificate (required)
cert = root.find(
'.//md:KeyDescriptor[@use="signing"]/ds:KeyInfo/ds:X509Data/ds:X509Certificate',
ns,
)
if cert is None or not cert.text or not cert.text.strip():
raise ValidationError(
{
"metadata_xml": 'Metadata must include a <ds:X509Certificate> under <KeyDescriptor use="signing">.'
}
)
x509cert = cert.text.strip()
return {
"entity_id": entity_id,
"sso_url": sso_url,
"slo_url": slo_url,
"x509cert": x509cert,
}
def _sync_social_app(self, previous_email_domain=None):
"""
Create or update the corresponding SocialApp based on email_domain.
If the domain changed, update the matching SocialApp.
"""
settings_dict = SOCIALACCOUNT_PROVIDERS["saml"].copy()
settings_dict["idp"] = self._parsed_metadata
current_site = Site.objects.get(id=settings.SITE_ID)
social_app_qs = SocialApp.objects.filter(
provider="saml", client_id=previous_email_domain or self.email_domain
)
client_id = self.email_domain[:191]
name = f"SAML-{self.email_domain}"[:40]
if social_app_qs.exists():
social_app = social_app_qs.first()
social_app.client_id = client_id
social_app.name = name
social_app.settings = settings_dict
social_app.provider_id = self._parsed_metadata["entity_id"]
social_app.save()
social_app.sites.set([current_site])
else:
social_app = SocialApp.objects.create(
provider="saml",
client_id=client_id,
name=name,
settings=settings_dict,
provider_id=self._parsed_metadata["entity_id"],
)
social_app.sites.set([current_site])
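A minimal sketch of the lifecycle this model implements (illustrative only, not part of the diff; `tenant` and `idp_metadata_xml` are assumed to exist):
# Illustrative sketch: how SAMLConfiguration.save() cascades into the
# Allauth SocialApp and the public SAMLDomainIndex.
config = SAMLConfiguration(
    email_domain="Example.COM",      # normalized to "example.com" on save
    metadata_xml=idp_metadata_xml,   # raw IdP XML; parsed and validated before saving
    tenant=tenant,
)
config.save()
# After save():
#   - SocialApp(provider="saml", client_id="example.com",
#     provider_id=<entityID from the metadata>) exists and is bound to the current site.
#   - SAMLDomainIndex(email_domain="example.com", tenant=tenant) exists.
config.email_domain = "renamed.com"
config.save()
# The SocialApp keyed by the old domain is updated in place, and the stale
# SAMLDomainIndex row for "example.com" is deleted.
config.delete()
# The SocialApp and SAMLDomainIndex rows are removed along with the configuration.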
class ResourceScanSummary(RowLevelSecurityProtectedModel):
scan_id = models.UUIDField(default=uuid7, db_index=True)
resource_id = models.UUIDField(default=uuid4, db_index=True)
@@ -1408,3 +1712,169 @@ class ResourceScanSummary(RowLevelSecurityProtectedModel):
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
class LighthouseConfiguration(RowLevelSecurityProtectedModel):
"""
Stores configuration and API keys for LLM services.
"""
class ModelChoices(models.TextChoices):
GPT_4O_2024_11_20 = "gpt-4o-2024-11-20", _("GPT-4o v2024-11-20")
GPT_4O_2024_08_06 = "gpt-4o-2024-08-06", _("GPT-4o v2024-08-06")
GPT_4O_2024_05_13 = "gpt-4o-2024-05-13", _("GPT-4o v2024-05-13")
GPT_4O = "gpt-4o", _("GPT-4o Default")
GPT_4O_MINI_2024_07_18 = "gpt-4o-mini-2024-07-18", _("GPT-4o Mini v2024-07-18")
GPT_4O_MINI = "gpt-4o-mini", _("GPT-4o Mini Default")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
name = models.CharField(
max_length=100,
validators=[MinLengthValidator(3)],
blank=False,
null=False,
help_text="Name of the configuration",
)
api_key = models.BinaryField(
blank=False, null=False, help_text="Encrypted API key for the LLM service"
)
model = models.CharField(
max_length=50,
choices=ModelChoices.choices,
blank=False,
null=False,
default=ModelChoices.GPT_4O_2024_08_06,
help_text="Must be one of the supported model names",
)
temperature = models.FloatField(default=0, help_text="Must be between 0 and 1")
max_tokens = models.IntegerField(
default=4000, help_text="Must be between 500 and 5000"
)
business_context = models.TextField(
blank=True,
null=False,
default="",
help_text="Additional business context for this AI model configuration",
)
is_active = models.BooleanField(default=True)
def __str__(self):
return self.name
def clean(self):
super().clean()
# Validate temperature
if not 0 <= self.temperature <= 1:
raise ModelValidationError(
detail="Temperature must be between 0 and 1",
code="invalid_temperature",
pointer="/data/attributes/temperature",
)
# Validate max_tokens
if not 500 <= self.max_tokens <= 5000:
raise ModelValidationError(
detail="Max tokens must be between 500 and 5000",
code="invalid_max_tokens",
pointer="/data/attributes/max_tokens",
)
@property
def api_key_decoded(self):
"""Return the decrypted API key, or None if unavailable or invalid."""
if not self.api_key:
return None
try:
decrypted_key = fernet.decrypt(bytes(self.api_key))
return decrypted_key.decode()
except InvalidToken:
logger.warning("Invalid token while decrypting API key.")
except Exception as e:
logger.exception("Unexpected error while decrypting API key: %s", e)
@api_key_decoded.setter
def api_key_decoded(self, value):
"""Store the encrypted API key."""
if not value:
raise ModelValidationError(
detail="API key is required",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
# Validate OpenAI API key format
openai_key_pattern = r"^sk-[\w-]+T3BlbkFJ[\w-]+$"
if not re.match(openai_key_pattern, value):
raise ModelValidationError(
detail="Invalid OpenAI API key format.",
code="invalid_api_key",
pointer="/data/attributes/api_key",
)
self.api_key = fernet.encrypt(value.encode())
def save(self, *args, **kwargs):
self.full_clean()
super().save(*args, **kwargs)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "lighthouse_configurations"
constraints = [
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
# Only one Lighthouse configuration is allowed per tenant
models.UniqueConstraint(
fields=["tenant_id"], name="unique_lighthouse_config_per_tenant"
),
]
class JSONAPIMeta:
resource_name = "lighthouse-configurations"
class Processor(RowLevelSecurityProtectedModel):
class ProcessorChoices(models.TextChoices):
MUTELIST = "mutelist", _("Mutelist")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)
updated_at = models.DateTimeField(auto_now=True, editable=False)
processor_type = ProcessorTypeEnumField(choices=ProcessorChoices.choices)
configuration = models.JSONField(default=dict)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "processors"
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "processor_type"),
name="unique_processor_types_tenant",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
indexes = [
models.Index(
fields=["tenant_id", "id"],
name="processor_tenant_id_idx",
),
models.Index(
fields=["tenant_id", "processor_type"],
name="processor_tenant_type_idx",
),
]
class JSONAPIMeta:
resource_name = "processors"
File diff suppressed because it is too large
@@ -11,7 +11,7 @@ def test_basic_authentication():
client = APIClient()
test_user = "test_email@prowler.com"
test_password = "test_password"
test_password = "Test_password@1"
# Check that a 401 is returned when no basic authentication is provided
no_auth_response = client.get(reverse("provider-list"))
@@ -108,7 +108,7 @@ def test_user_me_when_inviting_users(create_test_user, tenants_fixture, roles_fi
user1_email = "user1@testing.com"
user2_email = "user2@testing.com"
password = "thisisapassword123"
password = "Thisisapassword123@"
user1_response = client.post(
reverse("user-list"),
@@ -187,7 +187,7 @@ class TestTokenSwitchTenant:
client = APIClient()
test_user = "test_email@prowler.com"
test_password = "test_password"
test_password = "Test_password1@"
# Check that we can create a new user without any kind of authentication
user_creation_response = client.post(
@@ -17,7 +17,7 @@ def test_delete_provider_without_executing_task(
client = APIClient()
test_user = "test_email@prowler.com"
test_password = "test_password"
test_password = "Test_password1@"
prowler_task = tasks_fixture[0]
task_mock = Mock()
@@ -0,0 +1,77 @@
from unittest.mock import MagicMock, patch
import pytest
from allauth.socialaccount.models import SocialLogin
from django.contrib.auth import get_user_model
from api.adapters import ProwlerSocialAccountAdapter
User = get_user_model()
@pytest.mark.django_db
class TestProwlerSocialAccountAdapter:
def test_get_user_by_email_returns_user(self, create_test_user):
adapter = ProwlerSocialAccountAdapter()
user = adapter.get_user_by_email(create_test_user.email)
assert user == create_test_user
def test_get_user_by_email_returns_none_for_unknown_email(self):
adapter = ProwlerSocialAccountAdapter()
assert adapter.get_user_by_email("notfound@example.com") is None
def test_pre_social_login_links_existing_user(self, create_test_user, rf):
adapter = ProwlerSocialAccountAdapter()
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = MagicMock()
sociallogin.provider = MagicMock()
sociallogin.provider.id = "saml"
sociallogin.account.extra_data = {}
sociallogin.user = create_test_user
sociallogin.connect = MagicMock()
adapter.pre_social_login(rf.get("/"), sociallogin)
call_args = sociallogin.connect.call_args
assert call_args is not None
called_request, called_user = call_args[0]
assert called_request.path == "/"
assert called_user.email == create_test_user.email
def test_pre_social_login_no_link_if_email_missing(self, rf):
adapter = ProwlerSocialAccountAdapter()
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = MagicMock()
sociallogin.provider = MagicMock()
sociallogin.user = MagicMock()
sociallogin.provider.id = "saml"
sociallogin.account.extra_data = {}
sociallogin.connect = MagicMock()
adapter.pre_social_login(rf.get("/"), sociallogin)
sociallogin.connect.assert_not_called()
def test_save_user_saml_sets_session_flag(self, rf):
adapter = ProwlerSocialAccountAdapter()
request = rf.get("/")
request.session = {}
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.provider = MagicMock()
sociallogin.provider.id = "saml"
sociallogin.account = MagicMock()
sociallogin.account.extra_data = {}
mock_user = MagicMock()
mock_user.id = 123
with patch("api.adapters.super") as mock_super:
with patch("api.adapters.transaction"):
with patch("api.adapters.MainRouter"):
mock_super.return_value.save_user.return_value = mock_user
adapter.save_user(request, sociallogin)
assert request.session["saml_user_created"] == "123"
@@ -218,6 +218,10 @@ class TestCompliance:
Description="Description of requirement 1",
Attributes=[],
Checks=["check1", "check2"],
Tactics=["tactic1"],
SubTechniques=["subtechnique1"],
Platforms=["platform1"],
TechniqueURL="https://example.com",
)
requirement2 = MagicMock(
Id="requirement2",
@@ -225,6 +229,10 @@ class TestCompliance:
Description="Description of requirement 2",
Attributes=[],
Checks=[],
Tactics=[],
SubTechniques=[],
Platforms=[],
TechniqueURL="",
)
compliance1 = MagicMock(
Requirements=[requirement1, requirement2],
@@ -247,6 +255,10 @@ class TestCompliance:
"requirement1": {
"name": "Requirement 1",
"description": "Description of requirement 1",
"tactics": ["tactic1"],
"subtechniques": ["subtechnique1"],
"platforms": ["platform1"],
"technique_url": "https://example.com",
"attributes": [],
"checks": {"check1": None, "check2": None},
"checks_status": {
@@ -260,6 +272,10 @@ class TestCompliance:
"requirement2": {
"name": "Requirement 2",
"description": "Description of requirement 2",
"tactics": [],
"subtechniques": [],
"platforms": [],
"technique_url": "",
"attributes": [],
"checks": {},
"checks_status": {
@@ -1,6 +1,9 @@
import pytest
from allauth.socialaccount.models import SocialApp
from django.core.exceptions import ValidationError
from api.models import Resource, ResourceTag
from api.db_router import MainRouter
from api.models import Resource, ResourceTag, SAMLConfiguration, SAMLDomainIndex
@pytest.mark.django_db
@@ -120,3 +123,204 @@ class TestResourceModel:
# compliance={},
# )
# assert Finding.objects.filter(uid=long_uid).exists()
@pytest.mark.django_db
class TestSAMLConfigurationModel:
VALID_METADATA = """<?xml version='1.0' encoding='UTF-8'?>
<md:EntityDescriptor entityID='TEST' xmlns:md='urn:oasis:names:tc:SAML:2.0:metadata'>
<md:IDPSSODescriptor WantAuthnRequestsSigned='false' protocolSupportEnumeration='urn:oasis:names:tc:SAML:2.0:protocol'>
<md:KeyDescriptor use='signing'>
<ds:KeyInfo xmlns:ds='http://www.w3.org/2000/09/xmldsig#'>
<ds:X509Data>
<ds:X509Certificate>FAKECERTDATA</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST' Location='https://idp.test/sso'/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>
"""
def test_creates_valid_configuration(self, tenants_fixture):
tenant = tenants_fixture[0]
config = SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="ssoexample.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
assert config.email_domain == "ssoexample.com"
assert SocialApp.objects.filter(client_id="ssoexample.com").exists()
def test_email_domain_with_at_symbol_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
config = SAMLConfiguration(
email_domain="invalid@domain.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "email_domain" in errors
assert "Domain must not contain @" in errors["email_domain"][0]
def test_duplicate_email_domain_fails(self, tenants_fixture):
tenant1, tenant2, *_ = tenants_fixture
SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="duplicate.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant1,
)
config = SAMLConfiguration(
email_domain="duplicate.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant2,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "tenant" in errors
assert "There is a problem with your email domain." in errors["tenant"][0]
def test_duplicate_tenant_config_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="unique1.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
config = SAMLConfiguration(
email_domain="unique2.com",
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config.clean()
errors = exc_info.value.message_dict
assert "tenant" in errors
assert (
"A SAML configuration already exists for this tenant."
in errors["tenant"][0]
)
def test_invalid_metadata_xml_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
config = SAMLConfiguration(
email_domain="brokenxml.com",
metadata_xml="<bad<xml>",
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "Invalid XML" in errors["metadata_xml"][0]
assert "not well-formed" in errors["metadata_xml"][0]
def test_metadata_missing_sso_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
xml = """<md:EntityDescriptor entityID="x" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
<md:IDPSSODescriptor></md:IDPSSODescriptor>
</md:EntityDescriptor>"""
config = SAMLConfiguration(
email_domain="nosso.com",
metadata_xml=xml,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "Missing SingleSignOnService" in errors["metadata_xml"][0]
def test_metadata_missing_certificate_fails(self, tenants_fixture):
tenant = tenants_fixture[0]
xml = """<md:EntityDescriptor entityID="x" xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata">
<md:IDPSSODescriptor>
<md:SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" Location="https://example.com/sso"/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>"""
config = SAMLConfiguration(
email_domain="nocert.com",
metadata_xml=xml,
tenant=tenant,
)
with pytest.raises(ValidationError) as exc_info:
config._parse_metadata()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "X509Certificate" in errors["metadata_xml"][0]
def test_deletes_saml_configuration_and_related_objects(self, tenants_fixture):
tenant = tenants_fixture[0]
email_domain = "deleteme.com"
# Create the configuration
config = SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain=email_domain,
metadata_xml=TestSAMLConfigurationModel.VALID_METADATA,
tenant=tenant,
)
# Verify that the SocialApp and SAMLDomainIndex exist
assert SocialApp.objects.filter(client_id=email_domain).exists()
assert (
SAMLDomainIndex.objects.using(MainRouter.admin_db)
.filter(email_domain=email_domain)
.exists()
)
# Delete the configuration
config.delete()
# Verify that the configuration and its related objects are deleted
assert (
not SAMLConfiguration.objects.using(MainRouter.admin_db)
.filter(pk=config.pk)
.exists()
)
assert not SocialApp.objects.filter(client_id=email_domain).exists()
assert (
not SAMLDomainIndex.objects.using(MainRouter.admin_db)
.filter(email_domain=email_domain)
.exists()
)
def test_duplicate_entity_id_fails_on_creation(self, tenants_fixture):
tenant1, tenant2, *_ = tenants_fixture
SAMLConfiguration.objects.using(MainRouter.admin_db).create(
email_domain="first.com",
metadata_xml=self.VALID_METADATA,
tenant=tenant1,
)
config = SAMLConfiguration(
email_domain="second.com",
metadata_xml=self.VALID_METADATA,
tenant=tenant2,
)
with pytest.raises(ValidationError) as exc_info:
config.save()
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "There is a problem with your metadata." in errors["metadata_xml"][0]
@@ -60,7 +60,7 @@ class TestUserViewSet:
def test_create_user_with_all_permissions(self, authenticated_client_rbac):
valid_user_payload = {
"name": "test",
"password": "newpassword123",
"password": "Newpassword123@",
"email": "new_user@test.com",
}
response = authenticated_client_rbac.post(
@@ -74,7 +74,7 @@ class TestUserViewSet:
):
valid_user_payload = {
"name": "test",
"password": "newpassword123",
"password": "Newpassword123@",
"email": "new_user@test.com",
}
response = authenticated_client_no_permissions_rbac.post(
@@ -321,7 +321,7 @@ class TestProviderViewSet:
@pytest.mark.django_db
class TestLimitedVisibility:
TEST_EMAIL = "rbac@rbac.com"
TEST_PASSWORD = "thisisapassword123"
TEST_PASSWORD = "Thisisapassword123@"
@pytest.fixture
def limited_admin_user(
@@ -0,0 +1,80 @@
import logging
from unittest.mock import MagicMock
from config.settings.sentry import before_send
def test_before_send_ignores_log_with_ignored_exception():
"""Test that before_send ignores logs containing ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_ignores_exception_with_ignored_exception():
"""Test that before_send ignores exceptions containing ignored exceptions."""
exc_info = (Exception, Exception("Provider kubernetes is not connected"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
def test_before_send_passes_through_non_ignored_log():
"""Test that before_send passes through logs that don't contain ignored exceptions."""
log_record = MagicMock()
log_record.msg = "Some other error message"
log_record.levelno = logging.ERROR # 40
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_passes_through_non_ignored_exception():
"""Test that before_send passes through exceptions that don't contain ignored exceptions."""
exc_info = (Exception, Exception("Some other error message"), None)
hint = {"exc_info": exc_info}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was passed through
assert result == event
def test_before_send_handles_warning_level():
"""Test that before_send handles warning level logs."""
log_record = MagicMock()
log_record.msg = "Provider kubernetes is not connected"
log_record.levelno = logging.WARNING # 30
hint = {"log_record": log_record}
event = MagicMock()
result = before_send(event, hint)
# Assert that the event was dropped (None returned)
assert result is None
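For orientation, a before_send consistent with these tests could look roughly like the sketch below; this is an inference from the test behavior, not the actual config/settings/sentry implementation:
# Hypothetical sketch; the real implementation may differ.
IGNORED_MESSAGES = ("is not connected",)  # assumed ignore list

def before_send(event, hint):
    log_record = hint.get("log_record")
    if log_record is not None and any(m in str(log_record.msg) for m in IGNORED_MESSAGES):
        return None  # drop the event
    exc_info = hint.get("exc_info")
    if exc_info is not None and any(m in str(exc_info[1]) for m in IGNORED_MESSAGES):
        return None  # drop the event
    return event  # pass everything else through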
@@ -131,6 +131,21 @@ class TestInitializeProwlerProvider:
initialize_prowler_provider(provider)
mock_return_prowler_provider.return_value.assert_called_once_with(key="value")
@patch("api.utils.return_prowler_provider")
def test_initialize_prowler_provider_with_mutelist(
self, mock_return_prowler_provider
):
provider = MagicMock()
provider.secret.secret = {"key": "value"}
mutelist_processor = MagicMock()
mutelist_processor.configuration = {"Mutelist": {"key": "value"}}
mock_return_prowler_provider.return_value = MagicMock()
initialize_prowler_provider(provider, mutelist_processor)
mock_return_prowler_provider.return_value.assert_called_once_with(
key="value", mutelist_content={"key": "value"}
)
class TestProwlerProviderConnectionTest:
@patch("api.utils.return_prowler_provider")
@@ -200,6 +215,25 @@ class TestGetProwlerProviderKwargs:
expected_result = {**secret_dict, **expected_extra_kwargs}
assert result == expected_result
def test_get_prowler_provider_kwargs_with_mutelist(self):
provider_uid = "provider_uid"
secret_dict = {"key": "value"}
secret_mock = MagicMock()
secret_mock.secret = secret_dict
mutelist_processor = MagicMock()
mutelist_processor.configuration = {"Mutelist": {"key": "value"}}
provider = MagicMock()
provider.provider = Provider.ProviderChoices.AWS.value
provider.secret = secret_mock
provider.uid = provider_uid
result = get_prowler_provider_kwargs(provider, mutelist_processor)
expected_result = {**secret_dict, "mutelist_content": {"key": "value"}}
assert result == expected_result
def test_get_prowler_provider_kwargs_unsupported_provider(self):
# Setup
provider_uid = "provider_uid"
@@ -254,7 +288,7 @@ class TestValidateInvitation:
assert result == invitation
mock_db.get.assert_called_once_with(
token="VALID_TOKEN", email="user@example.com"
token="VALID_TOKEN", email__iexact="user@example.com"
)
def test_invitation_not_found_raises_validation_error(self):
@@ -269,7 +303,7 @@ class TestValidateInvitation:
"invitation_token": "Invalid invitation code."
}
mock_db.get.assert_called_once_with(
token="INVALID_TOKEN", email="user@example.com"
token="INVALID_TOKEN", email__iexact="user@example.com"
)
def test_invitation_not_found_raises_not_found(self):
@@ -284,7 +318,7 @@ class TestValidateInvitation:
assert exc_info.value.detail == "Invitation is not valid."
mock_db.get.assert_called_once_with(
token="INVALID_TOKEN", email="user@example.com"
token="INVALID_TOKEN", email__iexact="user@example.com"
)
def test_invitation_expired(self, invitation):
@@ -332,5 +366,27 @@ class TestValidateInvitation:
"invitation_token": "Invalid invitation code."
}
mock_db.get.assert_called_once_with(
token="VALID_TOKEN", email="different@example.com"
token="VALID_TOKEN", email__iexact="different@example.com"
)
def test_valid_invitation_uppercase_email(self):
"""Test that validate_invitation works with case-insensitive email lookup."""
uppercase_email = "USER@example.com"
invitation = MagicMock(spec=Invitation)
invitation.token = "VALID_TOKEN"
invitation.email = uppercase_email
invitation.expires_at = datetime.now(timezone.utc) + timedelta(days=1)
invitation.state = Invitation.State.PENDING
invitation.tenant = MagicMock()
with patch("api.utils.Invitation.objects.using") as mock_using:
mock_db = mock_using.return_value
mock_db.get.return_value = invitation
result = validate_invitation("VALID_TOKEN", "user@example.com")
assert result == invitation
mock_db.get.assert_called_once_with(
token="VALID_TOKEN", email__iexact="user@example.com"
)
File diff suppressed because it is too large
@@ -7,7 +7,7 @@ from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
from api.models import Invitation, Provider, Resource
from api.models import Invitation, Processor, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.azure.azure_provider import AzureProvider
@@ -83,11 +83,14 @@ def return_prowler_provider(
return prowler_provider
def get_prowler_provider_kwargs(provider: Provider) -> dict:
def get_prowler_provider_kwargs(
provider: Provider, mutelist_processor: Processor | None = None
) -> dict:
"""Get the Prowler provider kwargs based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secret.
mutelist_processor (Processor): The mutelist processor object containing the mutelist configuration.
Returns:
dict: The provider kwargs for the corresponding provider class.
@@ -105,16 +108,24 @@ def get_prowler_provider_kwargs(provider: Provider) -> dict:
}
elif provider.provider == Provider.ProviderChoices.KUBERNETES.value:
prowler_provider_kwargs = {**prowler_provider_kwargs, "context": provider.uid}
if mutelist_processor:
mutelist_content = mutelist_processor.configuration.get("Mutelist", {})
if mutelist_content:
prowler_provider_kwargs["mutelist_content"] = mutelist_content
return prowler_provider_kwargs
def initialize_prowler_provider(
provider: Provider,
mutelist_processor: Processor | None = None,
) -> AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider:
"""Initialize a Prowler provider instance based on the given provider type.
Args:
provider (Provider): The provider object containing the provider type and associated secrets.
mutelist_processor (Processor): The mutelist processor object containing the mutelist configuration.
Returns:
AwsProvider | AzureProvider | GcpProvider | KubernetesProvider | M365Provider: An instance of the corresponding provider class
@@ -122,7 +133,7 @@ def initialize_prowler_provider(
provider's secrets.
"""
prowler_provider = return_prowler_provider(provider)
prowler_provider_kwargs = get_prowler_provider_kwargs(provider)
prowler_provider_kwargs = get_prowler_provider_kwargs(provider, mutelist_processor)
return prowler_provider(**prowler_provider_kwargs)
@@ -187,7 +198,7 @@ def validate_invitation(
# Admin DB connector is used to bypass RLS protection since the invitation belongs to a tenant the user
# is not a member of yet
invitation = Invitation.objects.using(MainRouter.admin_db).get(
token=invitation_token, email=email
token=invitation_token, email__iexact=email
)
except Invitation.DoesNotExist:
if raise_not_found:
@@ -24,20 +24,32 @@ class PaginateByPkMixin:
request, # noqa: F841
base_queryset,
manager,
select_related: list[str] | None = None,
prefetch_related: list[str] | None = None,
select_related: list | None = None,
prefetch_related: list | None = None,
) -> Response:
"""
Paginate a queryset by primary key.
Useful when the base queryset carries filters or annotations that would be
lost (or become expensive) under default pagination: the page is computed over
a lightweight id-only query, and only that page of rows is re-fetched with the
requested select_related/prefetch_related optimizations applied.
"""
pk_list = base_queryset.values_list("id", flat=True)
page = self.paginate_queryset(pk_list)
if page is None:
return Response(self.get_serializer(base_queryset, many=True).data)
queryset = manager.filter(id__in=page)
if select_related:
queryset = queryset.select_related(*select_related)
if prefetch_related:
queryset = queryset.prefetch_related(*prefetch_related)
# Optimize tags loading, if applicable
if hasattr(self, "_optimize_tags_loading"):
queryset = self._optimize_tags_loading(queryset)
queryset = sorted(queryset, key=lambda obj: page.index(obj.id))
serialized = self.get_serializer(queryset, many=True).data
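Outside the mixin, the paginate-by-pk pattern reduces to two queries plus a reorder; a sketch with assumed names (heavy_queryset, paginator, Model):
# Illustrative sketch of the pattern, not part of this diff.
pk_list = heavy_queryset.values_list("id", flat=True)    # cheap id-only query
page = paginator.paginate_queryset(pk_list)              # paginate ids, not full rows
rows = Model.objects.filter(id__in=page).select_related("provider")
rows = sorted(rows, key=lambda obj: page.index(obj.id))  # restore the page order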
@@ -0,0 +1,23 @@
import yaml
from rest_framework_json_api import serializers
from rest_framework_json_api.serializers import ValidationError
class BaseValidateSerializer(serializers.Serializer):
def validate(self, data):
if hasattr(self, "initial_data"):
initial_data = set(self.initial_data.keys()) - {"id", "type"}
unknown_keys = initial_data - set(self.fields.keys())
if unknown_keys:
raise ValidationError(f"Invalid fields: {unknown_keys}")
return data
class YamlOrJsonField(serializers.JSONField):
def to_internal_value(self, data):
if isinstance(data, str):
try:
data = yaml.safe_load(data)
except yaml.YAMLError as exc:
raise serializers.ValidationError("Invalid YAML format") from exc
return super().to_internal_value(data)
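Illustrative behavior of YamlOrJsonField (standalone use is an assumption; inside a serializer the field behaves the same way):
# Both calls below return {"Mutelist": {"Accounts": {}}}.
field = YamlOrJsonField()
field.to_internal_value('{"Mutelist": {"Accounts": {}}}')  # JSON string: safe_load handles it
field.to_internal_value("Mutelist:\n  Accounts: {}\n")     # YAML string: parsed, then validated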
@@ -1,19 +1,7 @@
from drf_spectacular.utils import extend_schema_field
from rest_framework_json_api import serializers
from rest_framework_json_api.serializers import ValidationError
class BaseValidateSerializer(serializers.Serializer):
def validate(self, data):
if hasattr(self, "initial_data"):
initial_data = set(self.initial_data.keys()) - {"id", "type"}
unknown_keys = initial_data - set(self.fields.keys())
if unknown_keys:
raise ValidationError(f"Invalid fields: {unknown_keys}")
return data
# Integrations
from api.v1.serializer_utils.base import BaseValidateSerializer
class S3ConfigSerializer(BaseValidateSerializer):
@@ -0,0 +1,21 @@
from drf_spectacular.utils import extend_schema_field
from api.v1.serializer_utils.base import YamlOrJsonField
from prowler.lib.mutelist.mutelist import mutelist_schema
@extend_schema_field(
{
"oneOf": [
{
"type": "object",
"title": "Mutelist",
"properties": {"Mutelist": mutelist_schema},
"additionalProperties": False,
},
]
}
)
class ProcessorConfigField(YamlOrJsonField):
pass
@@ -7,7 +7,9 @@ from django.contrib.auth.models import update_last_login
from django.contrib.auth.password_validation import validate_password
from drf_spectacular.utils import extend_schema_field
from jwt.exceptions import InvalidKeyError
from rest_framework.validators import UniqueTogetherValidator
from rest_framework_json_api import serializers
from rest_framework_json_api.relations import SerializerMethodResourceRelatedField
from rest_framework_json_api.serializers import ValidationError
from rest_framework_simplejwt.exceptions import TokenError
from rest_framework_simplejwt.serializers import TokenObtainPairSerializer
@@ -19,7 +21,9 @@ from api.models import (
IntegrationProviderRelationship,
Invitation,
InvitationRoleRelationship,
LighthouseConfiguration,
Membership,
Processor,
Provider,
ProviderGroup,
ProviderGroupMembership,
@@ -28,6 +32,7 @@ from api.models import (
ResourceTag,
Role,
RoleProviderGroupRelationship,
SAMLConfiguration,
Scan,
StateChoices,
StatusChoices,
@@ -42,7 +47,9 @@ from api.v1.serializer_utils.integrations import (
IntegrationCredentialField,
S3ConfigSerializer,
)
from api.v1.serializer_utils.processors import ProcessorConfigField
from api.v1.serializer_utils.providers import ProviderSecretField
from prowler.lib.mutelist.mutelist import Mutelist
# Tokens
@@ -128,6 +135,12 @@ class TokenSerializer(BaseTokenSerializer):
class TokenSocialLoginSerializer(BaseTokenSerializer):
email = serializers.EmailField(write_only=True)
tenant_id = serializers.UUIDField(
write_only=True,
required=False,
help_text="If not provided, the tenant ID of the first membership that was added"
" to the user will be used.",
)
# Output tokens
refresh = serializers.CharField(read_only=True)
@@ -849,6 +862,7 @@ class ScanSerializer(RLSSerializer):
"completed_at",
"scheduled_at",
"next_scan_at",
"processor",
"url",
]
@@ -986,8 +1000,12 @@ class ResourceSerializer(RLSSerializer):
tags = serializers.SerializerMethodField()
type_ = serializers.CharField(read_only=True)
failed_findings_count = serializers.IntegerField(read_only=True)
findings = serializers.ResourceRelatedField(many=True, read_only=True)
findings = SerializerMethodResourceRelatedField(
many=True,
read_only=True,
)
class Meta:
model = Resource
@@ -1003,6 +1021,7 @@ class ResourceSerializer(RLSSerializer):
"tags",
"provider",
"findings",
"failed_findings_count",
"url",
]
extra_kwargs = {
@@ -1012,8 +1031,8 @@ class ResourceSerializer(RLSSerializer):
}
included_serializers = {
"findings": "api.v1.serializers.FindingSerializer",
"provider": "api.v1.serializers.ProviderSerializer",
"findings": "api.v1.serializers.FindingIncludeSerializer",
"provider": "api.v1.serializers.ProviderIncludeSerializer",
}
@extend_schema_field(
@@ -1024,6 +1043,10 @@ class ResourceSerializer(RLSSerializer):
}
)
def get_tags(self, obj):
# Use prefetched tags if available to avoid N+1 queries
if hasattr(obj, "prefetched_tags"):
return {tag.key: tag.value for tag in obj.prefetched_tags}
# Fallback to the original method if prefetch is not available
return obj.get_tags(self.context.get("tenant_id"))
def get_fields(self):
@@ -1033,10 +1056,17 @@ class ResourceSerializer(RLSSerializer):
fields["type"] = type_
return fields
def get_findings(self, obj):
return (
obj.latest_findings
if hasattr(obj, "latest_findings")
else obj.findings.all()
)
class ResourceIncludeSerializer(RLSSerializer):
"""
Serializer for the Resource model.
Serializer for the included Resource model.
"""
tags = serializers.SerializerMethodField()
@@ -1069,6 +1099,10 @@ class ResourceIncludeSerializer(RLSSerializer):
}
)
def get_tags(self, obj):
# Use prefetched tags if available to avoid N+1 queries
if hasattr(obj, "prefetched_tags"):
return {tag.key: tag.value for tag in obj.prefetched_tags}
# Fallback to the original method if prefetch is not available
return obj.get_tags(self.context.get("tenant_id"))
def get_fields(self):
@@ -1079,6 +1113,17 @@ class ResourceIncludeSerializer(RLSSerializer):
return fields
class ResourceMetadataSerializer(serializers.Serializer):
services = serializers.ListField(child=serializers.CharField(), allow_empty=True)
regions = serializers.ListField(child=serializers.CharField(), allow_empty=True)
types = serializers.ListField(child=serializers.CharField(), allow_empty=True)
# Temporarily disabled until we implement tag filtering in the UI
# tags = serializers.JSONField(help_text="Tags are described as key-value pairs.")
class Meta:
resource_name = "resources-metadata"
class FindingSerializer(RLSSerializer):
"""
Serializer for the Finding model.
@@ -1102,6 +1147,7 @@ class FindingSerializer(RLSSerializer):
"updated_at",
"first_seen_at",
"muted",
"muted_reason",
"url",
# Relationships
"scan",
@@ -1114,6 +1160,28 @@ class FindingSerializer(RLSSerializer):
}
class FindingIncludeSerializer(RLSSerializer):
"""
Serializer for the include Finding model.
"""
class Meta:
model = Finding
fields = [
"id",
"uid",
"status",
"severity",
"check_id",
"check_metadata",
"inserted_at",
"updated_at",
"first_seen_at",
"muted",
"muted_reason",
]
# To be removed when the related endpoint is removed as well
class FindingDynamicFilterSerializer(serializers.Serializer):
services = serializers.ListField(child=serializers.CharField(), allow_empty=True)
@@ -1198,8 +1266,8 @@ class M365ProviderSecret(serializers.Serializer):
client_id = serializers.CharField()
client_secret = serializers.CharField()
tenant_id = serializers.CharField()
user = serializers.EmailField()
password = serializers.CharField()
user = serializers.EmailField(required=False)
password = serializers.CharField(required=False)
class Meta:
resource_name = "provider-secrets"
@@ -1307,12 +1375,13 @@ class ProviderSecretUpdateSerializer(BaseWriteProviderSecretSerializer):
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
"provider": {"read_only": True},
"secret_type": {"read_only": True},
"secret_type": {"required": False},
}
def validate(self, attrs):
provider = self.instance.provider
secret_type = self.instance.secret_type
# To allow updating a secret with the same type without making the `secret_type` mandatory
secret_type = attrs.get("secret_type") or self.instance.secret_type
secret = attrs.get("secret")
validated_attrs = super().validate(attrs)
@@ -1721,6 +1790,8 @@ class ComplianceOverviewDetailSerializer(serializers.Serializer):
class ComplianceOverviewAttributesSerializer(serializers.Serializer):
id = serializers.CharField()
framework_description = serializers.CharField()
name = serializers.CharField()
framework = serializers.CharField()
version = serializers.CharField()
description = serializers.CharField()
@@ -2059,3 +2130,278 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
return super().update(instance, validated_data)
# Processors
class ProcessorSerializer(RLSSerializer):
"""
Serializer for the Processor model.
"""
configuration = ProcessorConfigField()
class Meta:
model = Processor
fields = [
"id",
"inserted_at",
"updated_at",
"processor_type",
"configuration",
"url",
]
class ProcessorCreateSerializer(RLSSerializer, BaseWriteSerializer):
configuration = ProcessorConfigField(required=True)
class Meta:
model = Processor
fields = [
"inserted_at",
"updated_at",
"processor_type",
"configuration",
]
extra_kwargs = {
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
validators = [
UniqueTogetherValidator(
queryset=Processor.objects.all(),
fields=["processor_type"],
message="A processor with the same type already exists.",
)
]
def validate(self, attrs):
validated_attrs = super().validate(attrs)
self.validate_processor_data(attrs)
return validated_attrs
def validate_processor_data(self, attrs):
processor_type = attrs.get("processor_type")
configuration = attrs.get("configuration")
if processor_type == "mutelist":
self.validate_mutelist_configuration(configuration)
def validate_mutelist_configuration(self, configuration):
if not isinstance(configuration, dict):
raise serializers.ValidationError("Invalid Mutelist configuration.")
mutelist_configuration = configuration.get("Mutelist", {})
if not mutelist_configuration:
raise serializers.ValidationError(
"Invalid Mutelist configuration: 'Mutelist' is a required property."
)
try:
Mutelist.validate_mutelist(mutelist_configuration, raise_on_exception=True)
return
except Exception as error:
raise serializers.ValidationError(
f"Invalid Mutelist configuration: {error}"
)
class ProcessorUpdateSerializer(BaseWriteSerializer):
configuration = ProcessorConfigField(required=True)
class Meta:
model = Processor
fields = [
"inserted_at",
"updated_at",
"configuration",
]
extra_kwargs = {
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def validate(self, attrs):
validated_attrs = super().validate(attrs)
self.validate_processor_data(attrs)
return validated_attrs
def validate_processor_data(self, attrs):
processor_type = self.instance.processor_type
configuration = attrs.get("configuration")
if processor_type == "mutelist":
self.validate_mutelist_configuration(configuration)
def validate_mutelist_configuration(self, configuration):
if not isinstance(configuration, dict):
raise serializers.ValidationError("Invalid Mutelist configuration.")
mutelist_configuration = configuration.get("Mutelist", {})
if not mutelist_configuration:
raise serializers.ValidationError(
"Invalid Mutelist configuration: 'Mutelist' is a required property."
)
try:
Mutelist.validate_mutelist(mutelist_configuration, raise_on_exception=True)
return
except Exception as error:
raise serializers.ValidationError(
f"Invalid Mutelist configuration: {error}"
)
# SSO
class SamlInitiateSerializer(serializers.Serializer):
email_domain = serializers.CharField()
class JSONAPIMeta:
resource_name = "saml-initiate"
class SamlMetadataSerializer(serializers.Serializer):
class JSONAPIMeta:
resource_name = "saml-meta"
class SAMLConfigurationSerializer(RLSSerializer):
class Meta:
model = SAMLConfiguration
fields = ["id", "email_domain", "metadata_xml", "created_at", "updated_at"]
read_only_fields = ["id", "created_at", "updated_at"]
class LighthouseConfigSerializer(RLSSerializer):
"""
Serializer for the LighthouseConfig model.
"""
api_key = serializers.CharField(required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
"url",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def to_representation(self, instance):
data = super().to_representation(instance)
# Check if api_key is specifically requested in fields param
fields_param = self.context.get("request", None) and self.context[
"request"
].query_params.get("fields[lighthouse-config]", "")
if fields_param == "api_key":
# Return decrypted key if specifically requested
data["api_key"] = instance.api_key_decoded if instance.api_key else None
else:
# Return masked key for general requests
data["api_key"] = "*" * len(instance.api_key) if instance.api_key else None
return data
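Client-side, the masking behavior looks like this (host, API prefix, and auth_headers are assumptions):
# Illustrative sketch, not part of this diff.
import requests

base = "https://api.example.com/api/v1/lighthouse-configurations"
masked = requests.get(base, headers=auth_headers)  # api_key comes back as "***..."
revealed = requests.get(
    base,
    params={"fields[lighthouse-config]": "api_key"},  # sparse-fieldset request
    headers=auth_headers,
)  # api_key comes back decrypted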
class LighthouseConfigCreateSerializer(RLSSerializer, BaseWriteSerializer):
"""Serializer for creating new Lighthouse configurations."""
api_key = serializers.CharField(write_only=True, required=True)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
"inserted_at",
"updated_at",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
}
def validate(self, attrs):
tenant_id = self.context.get("request").tenant_id
if LighthouseConfiguration.objects.filter(tenant_id=tenant_id).exists():
raise serializers.ValidationError(
{
"tenant_id": "Lighthouse configuration already exists for this tenant."
}
)
return super().validate(attrs)
def create(self, validated_data):
api_key = validated_data.pop("api_key")
instance = super().create(validated_data)
instance.api_key_decoded = api_key
instance.save()
return instance
def to_representation(self, instance):
data = super().to_representation(instance)
# Always mask the API key in the response
data["api_key"] = "*" * len(instance.api_key) if instance.api_key else None
return data
class LighthouseConfigUpdateSerializer(BaseWriteSerializer):
"""
Serializer for updating LighthouseConfig instances.
"""
api_key = serializers.CharField(write_only=True, required=False)
class Meta:
model = LighthouseConfiguration
fields = [
"id",
"name",
"api_key",
"model",
"temperature",
"max_tokens",
"business_context",
"is_active",
]
extra_kwargs = {
"id": {"read_only": True},
"is_active": {"read_only": True},
"name": {"required": False},
"model": {"required": False},
"temperature": {"required": False},
"max_tokens": {"required": False},
}
def update(self, instance, validated_data):
api_key = validated_data.pop("api_key", None)
instance = super().update(instance, validated_data)
if api_key:
instance.api_key_decoded = api_key
instance.save()
return instance
@@ -1,9 +1,11 @@
from allauth.socialaccount.providers.saml.views import ACSView, MetadataView, SLSView
from django.urls import include, path
from drf_spectacular.views import SpectacularRedocView
from rest_framework_nested import routers
from api.v1.views import (
ComplianceOverviewViewSet,
CustomSAMLLoginView,
CustomTokenObtainView,
CustomTokenRefreshView,
CustomTokenSwitchTenantView,
@@ -13,8 +15,10 @@ from api.v1.views import (
IntegrationViewSet,
InvitationAcceptViewSet,
InvitationViewSet,
LighthouseConfigViewSet,
MembershipViewSet,
OverviewViewSet,
ProcessorViewSet,
ProviderGroupProvidersRelationshipView,
ProviderGroupViewSet,
ProviderSecretViewSet,
@@ -22,10 +26,14 @@ from api.v1.views import (
ResourceViewSet,
RoleProviderGroupRelationshipView,
RoleViewSet,
SAMLConfigurationViewSet,
SAMLInitiateAPIView,
SAMLTokenValidateView,
ScanViewSet,
ScheduleViewSet,
SchemaView,
TaskViewSet,
TenantFinishACSView,
TenantMembersViewSet,
TenantViewSet,
UserRoleRelationshipView,
@@ -49,6 +57,13 @@ router.register(
router.register(r"overviews", OverviewViewSet, basename="overview")
router.register(r"schedules", ScheduleViewSet, basename="schedule")
router.register(r"integrations", IntegrationViewSet, basename="integration")
router.register(r"processors", ProcessorViewSet, basename="processor")
router.register(r"saml-config", SAMLConfigurationViewSet, basename="saml-config")
router.register(
r"lighthouse-configurations",
LighthouseConfigViewSet,
basename="lighthouseconfiguration",
)
tenants_router = routers.NestedSimpleRouter(router, r"tenants", lookup="tenant")
tenants_router.register(
@@ -112,6 +127,36 @@ urlpatterns = [
),
name="provider_group-providers-relationship",
),
# API endpoint to start SAML SSO flow
path(
"auth/saml/initiate/", SAMLInitiateAPIView.as_view(), name="api_saml_initiate"
),
path(
"accounts/saml/<organization_slug>/login/",
CustomSAMLLoginView.as_view(),
name="saml_login",
),
path(
"accounts/saml/<organization_slug>/acs/",
ACSView.as_view(),
name="saml_acs",
),
path(
"accounts/saml/<organization_slug>/acs/finish/",
TenantFinishACSView.as_view(),
name="saml_finish_acs",
),
path(
"accounts/saml/<organization_slug>/sls/",
SLSView.as_view(),
name="saml_sls",
),
path(
"accounts/saml/<organization_slug>/metadata/",
MetadataView.as_view(),
name="saml_metadata",
),
path("tokens/saml", SAMLTokenValidateView.as_view(), name="token-saml"),
path("tokens/google", GoogleSocialLoginView.as_view(), name="token-google"),
path("tokens/github", GithubSocialLoginView.as_view(), name="token-github"),
path("", include(router.urls)),
@@ -1,12 +1,17 @@
import glob
import logging
import os
from datetime import datetime, timedelta, timezone
from urllib.parse import urljoin
import sentry_sdk
from allauth.socialaccount.models import SocialAccount, SocialApp
from allauth.socialaccount.providers.github.views import GitHubOAuth2Adapter
from allauth.socialaccount.providers.google.views import GoogleOAuth2Adapter
from allauth.socialaccount.providers.saml.views import FinishACSView, LoginView
from botocore.exceptions import ClientError, NoCredentialsError, ParamValidationError
from celery.result import AsyncResult
from config.custom_logging import BackendLogger
from config.env import env
from config.settings.social_login import (
GITHUB_OAUTH_CALLBACK_URL,
@@ -17,9 +22,10 @@ from django.conf import settings as django_settings
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.search import SearchQuery
from django.db import transaction
from django.db.models import Count, Exists, F, OuterRef, Prefetch, Q, Sum
from django.db.models import Count, F, Prefetch, Q, Sum
from django.db.models.functions import Coalesce
from django.http import HttpResponse
from django.shortcuts import redirect
from django.urls import reverse
from django.utils.dateparse import parse_date
from django.utils.decorators import method_decorator
@@ -51,6 +57,7 @@ from tasks.beat import schedule_provider_scan
from tasks.jobs.export import get_s3_client
from tasks.tasks import (
backfill_scan_resource_summaries_task,
check_lighthouse_connection_task,
check_provider_connection_task,
delete_provider_task,
delete_tenant_task,
@@ -63,6 +70,7 @@ from api.compliance import (
get_compliance_frameworks,
)
from api.db_router import MainRouter
from api.db_utils import rls_transaction
from api.exceptions import TaskFailedException
from api.filters import (
ComplianceOverviewFilter,
@@ -70,7 +78,9 @@ from api.filters import (
IntegrationFilter,
InvitationFilter,
LatestFindingFilter,
LatestResourceFilter,
MembershipFilter,
ProcessorFilter,
ProviderFilter,
ProviderGroupFilter,
ProviderSecretFilter,
@@ -89,7 +99,9 @@ from api.models import (
Finding,
Integration,
Invitation,
LighthouseConfiguration,
Membership,
Processor,
Provider,
ProviderGroup,
ProviderGroupMembership,
@@ -97,8 +109,12 @@ from api.models import (
Resource,
ResourceFindingMapping,
ResourceScanSummary,
ResourceTag,
Role,
RoleProviderGroupRelationship,
SAMLConfiguration,
SAMLDomainIndex,
SAMLToken,
Scan,
ScanSummary,
SeverityChoices,
@@ -132,11 +148,17 @@ from api.v1.serializers import (
InvitationCreateSerializer,
InvitationSerializer,
InvitationUpdateSerializer,
LighthouseConfigCreateSerializer,
LighthouseConfigSerializer,
LighthouseConfigUpdateSerializer,
MembershipSerializer,
OverviewFindingSerializer,
OverviewProviderSerializer,
OverviewServiceSerializer,
OverviewSeveritySerializer,
ProcessorCreateSerializer,
ProcessorSerializer,
ProcessorUpdateSerializer,
ProviderCreateSerializer,
ProviderGroupCreateSerializer,
ProviderGroupMembershipSerializer,
@@ -147,11 +169,14 @@ from api.v1.serializers import (
ProviderSecretUpdateSerializer,
ProviderSerializer,
ProviderUpdateSerializer,
ResourceMetadataSerializer,
ResourceSerializer,
RoleCreateSerializer,
RoleProviderGroupRelationshipSerializer,
RoleSerializer,
RoleUpdateSerializer,
SAMLConfigurationSerializer,
SamlInitiateSerializer,
ScanComplianceReportSerializer,
ScanCreateSerializer,
ScanReportSerializer,
@@ -170,6 +195,8 @@ from api.v1.serializers import (
UserUpdateSerializer,
)
logger = logging.getLogger(BackendLogger.API)
CACHE_DECORATOR = cache_control(
max_age=django_settings.CACHE_MAX_AGE,
stale_while_revalidate=django_settings.CACHE_STALE_WHILE_REVALIDATE,
@@ -266,7 +293,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.9.0"
spectacular_settings.VERSION = "1.10.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -327,6 +354,16 @@ class SchemaView(SpectacularAPIView):
"description": "Endpoints for managing third-party integrations, including registration, configuration,"
" retrieval, and deletion of integrations such as S3, JIRA, or other services.",
},
{
"name": "Lighthouse",
"description": "Endpoints for managing Lighthouse configurations, including creation, retrieval, "
"updating, and deletion of configurations such as OpenAI keys, models, and business context.",
},
{
"name": "Processor",
"description": "Endpoints for managing post-processors used to process Prowler findings, including "
"registration, configuration, and deletion of post-processing actions.",
},
]
return super().get(request, *args, **kwargs)
@@ -383,6 +420,274 @@ class GithubSocialLoginView(SocialLoginView):
return original_response
@extend_schema(exclude=True)
class SAMLTokenValidateView(GenericAPIView):
resource_name = "tokens"
http_method_names = ["post"]
def post(self, request):
token_id = request.query_params.get("id", "invalid")
try:
saml_token = SAMLToken.objects.using(MainRouter.admin_db).get(id=token_id)
except SAMLToken.DoesNotExist:
return Response({"detail": "Invalid token ID."}, status=404)
if saml_token.is_expired():
return Response({"detail": "Token expired."}, status=400)
token_data = saml_token.token
# Tokens are single-use and are not meant to persist, so the row is deleted right after retrieval
saml_token.delete()
return Response(token_data, status=200)
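The exchange is a single one-time POST (host and token id are assumptions; the path comes from the URL configuration above):
# Illustrative sketch, not part of this diff.
import requests

resp = requests.post(
    "https://api.example.com/api/v1/tokens/saml", params={"id": saml_token_id}
)
# 200 -> the stored JWT payload; the SAMLToken row is deleted on read,
# so repeating the same call returns 404.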
@extend_schema(exclude=True)
class CustomSAMLLoginView(LoginView):
def dispatch(self, request, *args, **kwargs):
"""
Convert GET requests to POST to bypass allauth's confirmation screen.
Why this is necessary:
- django-allauth requires POST for social logins to prevent open redirect attacks
- SAML login links typically use GET requests (e.g., <a href="...">)
- This conversion allows seamless login without user-facing confirmation
Security considerations:
1. Preserves CSRF protection: Original POST handling remains intact
2. Avoids global SOCIALACCOUNT_LOGIN_ON_GET=True which would:
- Enable GET logins for ALL providers (security risk)
- Potentially expose open redirect vulnerabilities
3. SAML payloads remain signed/encrypted regardless of HTTP method
4. No sensitive parameters are exposed in URLs (copied to POST body)
This approach maintains security while providing better UX.
"""
if request.method == "GET":
# Convert GET to POST while preserving parameters
request.method = "POST"
return super().dispatch(request, *args, **kwargs)
@extend_schema(exclude=True)
class SAMLInitiateAPIView(GenericAPIView):
serializer_class = SamlInitiateSerializer
permission_classes = []
def post(self, request, *args, **kwargs):
# Validate the input payload and extract the domain
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
email = serializer.validated_data["email_domain"]
domain = email.split("@", 1)[-1].lower()
# Retrieve the SAML configuration for the given email domain
try:
check = SAMLDomainIndex.objects.get(email_domain=domain)
with rls_transaction(str(check.tenant_id)):
config = SAMLConfiguration.objects.get(tenant_id=str(check.tenant_id))
except (SAMLDomainIndex.DoesNotExist, SAMLConfiguration.DoesNotExist):
return Response(
{"detail": "Unauthorized domain."}, status=status.HTTP_403_FORBIDDEN
)
# Check certificates are not empty (TODO: Validate certificates)
# saml_public_cert = os.getenv("SAML_PUBLIC_CERT", "").strip()
# saml_private_key = os.getenv("SAML_PRIVATE_KEY", "").strip()
# if not saml_public_cert or not saml_private_key:
# return Response(
# {"detail": "SAML configuration is invalid: missing certificates."},
# status=status.HTTP_403_FORBIDDEN,
# )
# Build the SAML login URL using the configured API host
api_host = os.getenv("API_BASE_URL")
login_path = reverse(
"saml_login", kwargs={"organization_slug": config.email_domain}
)
login_url = urljoin(api_host, login_path)
return redirect(login_url)
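From a client, initiating the flow is one POST that either 302-redirects to the tenant's SAML login URL or returns 403 (host and JSON:API payload shape are assumptions):
# Illustrative sketch, not part of this diff.
import requests

resp = requests.post(
    "https://api.example.com/api/v1/auth/saml/initiate/",
    json={
        "data": {
            "type": "saml-initiate",
            "attributes": {"email_domain": "user@prowlerdemo.com"},
        }
    },
    allow_redirects=False,
)
# 302 -> Location header points at accounts/saml/<email_domain>/login/;
# 403 -> no SAML configuration exists for that domain.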
@extend_schema_view(
list=extend_schema(
tags=["SAML"],
summary="List all SSO configurations",
description="Returns all the SAML-based SSO configurations associated with the current tenant.",
),
retrieve=extend_schema(
tags=["SAML"],
summary="Retrieve SSO configuration details",
description="Returns the details of a specific SAML configuration belonging to the current tenant.",
),
create=extend_schema(
tags=["SAML"],
summary="Create the SSO configuration",
description="Creates a new SAML SSO configuration for the current tenant, including email domain and metadata XML.",
),
partial_update=extend_schema(
tags=["SAML"],
summary="Update the SSO configuration",
description="Partially updates an existing SAML SSO configuration. Supports changes to email domain and metadata XML.",
),
destroy=extend_schema(
tags=["SAML"],
summary="Delete the SSO configuration",
description="Deletes an existing SAML SSO configuration associated with the current tenant.",
),
)
@method_decorator(CACHE_DECORATOR, name="retrieve")
@method_decorator(CACHE_DECORATOR, name="list")
class SAMLConfigurationViewSet(BaseRLSViewSet):
"""
ViewSet for managing SAML SSO configurations per tenant.
This endpoint allows authorized users to perform CRUD operations on SAMLConfiguration,
which define how a tenant integrates with an external SAML Identity Provider (IdP).
Typical use cases include:
- Listing all existing configurations for auditing or UI display.
- Retrieving a single configuration to show setup details.
- Creating or updating a configuration to onboard or modify SAML integration.
- Deleting a configuration when deactivating SAML for a tenant.
"""
serializer_class = SAMLConfigurationSerializer
required_permissions = [Permissions.MANAGE_INTEGRATIONS]
queryset = SAMLConfiguration.objects.all()
def get_queryset(self):
# If called during schema generation, return an empty queryset
if getattr(self, "swagger_fake_view", False):
return SAMLConfiguration.objects.none()
return SAMLConfiguration.objects.filter(tenant=self.request.tenant_id)
class TenantFinishACSView(FinishACSView):
def _rollback_saml_user(self, request):
"""Helper function to rollback SAML user if it was just created and validation fails"""
saml_user_id = request.session.get("saml_user_created")
if saml_user_id:
User.objects.using(MainRouter.admin_db).filter(id=saml_user_id).delete()
request.session.pop("saml_user_created", None)
def dispatch(self, request, organization_slug):
try:
super().dispatch(request, organization_slug)
except Exception as e:
logger.error(f"SAML dispatch failed: {e}")
self._rollback_saml_user(request)
callback_url = env.str("AUTH_URL")
return redirect(f"{callback_url}?sso_saml_failed=true")
user = getattr(request, "user", None)
if not user or not user.is_authenticated:
self._rollback_saml_user(request)
callback_url = env.str("AUTH_URL")
return redirect(f"{callback_url}?sso_saml_failed=true")
# Defensive check to avoid edge case failures due to inconsistent or incomplete data in the database
# This handles scenarios like partially deleted or missing related objects
try:
check = SAMLDomainIndex.objects.get(email_domain=organization_slug)
with rls_transaction(str(check.tenant_id)):
SAMLConfiguration.objects.get(tenant_id=str(check.tenant_id))
social_app = SocialApp.objects.get(
provider="saml", client_id=organization_slug
)
user_id = User.objects.get(email=str(user)).id
social_account = SocialAccount.objects.get(
user=str(user_id), provider=social_app.provider_id
)
except (
SAMLDomainIndex.DoesNotExist,
SAMLConfiguration.DoesNotExist,
SocialApp.DoesNotExist,
SocialAccount.DoesNotExist,
User.DoesNotExist,
) as e:
logger.error(f"SAML user is not authenticated: {e}")
self._rollback_saml_user(request)
callback_url = env.str("AUTH_URL")
return redirect(f"{callback_url}?sso_saml_failed=true")
extra = social_account.extra_data
user.first_name = (
extra.get("firstName", [""])[0] if extra.get("firstName") else ""
)
user.last_name = extra.get("lastName", [""])[0] if extra.get("lastName") else ""
user.company_name = (
extra.get("organization", [""])[0] if extra.get("organization") else ""
)
user.name = f"{user.first_name} {user.last_name}".strip()
if user.name == "":
user.name = "N/A"
user.save()
email_domain = user.email.split("@")[-1]
tenant = (
SAMLConfiguration.objects.using(MainRouter.admin_db)
.get(email_domain=email_domain)
.tenant
)
role_name = (
extra.get("userType", ["no_permissions"])[0].strip()
if extra.get("userType")
else "no_permissions"
)
try:
role = Role.objects.using(MainRouter.admin_db).get(
name=role_name, tenant=tenant
)
except Role.DoesNotExist:
role = Role.objects.using(MainRouter.admin_db).create(
name=role_name,
tenant=tenant,
manage_users=False,
manage_account=False,
manage_billing=False,
manage_providers=False,
manage_integrations=False,
manage_scans=False,
unlimited_visibility=False,
)
UserRoleRelationship.objects.using(MainRouter.admin_db).filter(
user=user,
tenant_id=tenant.id,
).delete()
UserRoleRelationship.objects.using(MainRouter.admin_db).create(
user=user,
role=role,
tenant_id=tenant.id,
)
membership, _ = Membership.objects.using(MainRouter.admin_db).get_or_create(
user=user,
tenant=tenant,
defaults={
"user": user,
"tenant": tenant,
"role": Membership.RoleChoices.MEMBER,
},
)
serializer = TokenSocialLoginSerializer(
data={"email": user.email, "tenant_id": str(tenant.id)}
)
serializer.is_valid(raise_exception=True)
token_data = serializer.validated_data
saml_token = SAMLToken.objects.using(MainRouter.admin_db).create(
token=token_data, user=user
)
callback_url = env.str("SAML_SSO_CALLBACK_URL")
redirect_url = f"{callback_url}?id={saml_token.id}"
request.session.pop("saml_user_created", None)
return redirect(redirect_url)
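# Shape of the SAML attributes consumed above (illustrative; allauth delivers
# each attribute as a single-element list, as in the saml_sociallogin fixture
# further below):
EXAMPLE_SAML_EXTRA_DATA = {
    "firstName": ["Test"],
    "lastName": ["User"],
    "organization": ["Prowler"],
    "userType": ["member"],  # mapped to a tenant Role; defaults to "no_permissions"
}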
@extend_schema_view(
list=extend_schema(
tags=["User"],
@@ -1581,6 +1886,14 @@ class TaskViewSet(BaseRLSViewSet):
summary="List all resources",
description="Retrieve a list of all resources with options for filtering by various criteria. Resources are "
"objects that are discovered by Prowler. They can be anything from a single host to a whole VPC.",
parameters=[
OpenApiParameter(
name="filter[updated_at]",
description="At least one of the variations of the `filter[updated_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
),
retrieve=extend_schema(
tags=["Resource"],
@@ -1588,15 +1901,43 @@ class TaskViewSet(BaseRLSViewSet):
description="Fetch detailed information about a specific resource by their ID. A Resource is an object that "
"is discovered by Prowler. It can be anything from a single host to a whole VPC.",
),
metadata=extend_schema(
tags=["Resource"],
summary="Retrieve metadata values from resources",
description="Fetch unique metadata values from a set of resources. This is useful for dynamic filtering.",
parameters=[
OpenApiParameter(
name="filter[updated_at]",
description="At least one of the variations of the `filter[updated_at]` filter must be provided.",
required=True,
type=OpenApiTypes.DATE,
)
],
filters=True,
),
latest=extend_schema(
tags=["Resource"],
summary="List the latest resources",
description="Retrieve a list of the latest resources from the latest scans for each provider with options for "
"filtering by various criteria.",
filters=True,
),
metadata_latest=extend_schema(
tags=["Resource"],
summary="Retrieve metadata values from the latest resources",
description="Fetch unique metadata values from a set of resources from the latest scans for each provider. "
"This is useful for dynamic filtering.",
filters=True,
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class ResourceViewSet(BaseRLSViewSet):
queryset = Resource.objects.all()
class ResourceViewSet(PaginateByPkMixin, BaseRLSViewSet):
queryset = Resource.all_objects.all()
serializer_class = ResourceSerializer
http_method_names = ["get"]
filterset_class = ResourceFilter
ordering = ["-inserted_at"]
ordering = ["-failed_findings_count", "-updated_at"]
ordering_fields = [
"provider_uid",
"uid",
@@ -1607,6 +1948,14 @@ class ResourceViewSet(BaseRLSViewSet):
"inserted_at",
"updated_at",
]
prefetch_for_includes = {
"__all__": [],
"provider": [
Prefetch(
"provider", queryset=Provider.all_objects.select_related("resources")
)
],
}
# RBAC required permissions (implicit -> MANAGE_PROVIDERS enable unlimited visibility or check the visibility of
# the provider through the provider group)
required_permissions = []
@@ -1615,41 +1964,257 @@ class ResourceViewSet(BaseRLSViewSet):
user_roles = get_role(self.request.user)
if user_roles.unlimited_visibility:
# User has unlimited visibility, return all scans
queryset = Resource.objects.filter(tenant_id=self.request.tenant_id)
queryset = Resource.all_objects.filter(tenant_id=self.request.tenant_id)
else:
# User lacks permission, filter providers based on provider groups associated with the role
queryset = Resource.objects.filter(
queryset = Resource.all_objects.filter(
tenant_id=self.request.tenant_id, provider__in=get_providers(user_roles)
)
search_value = self.request.query_params.get("filter[search]", None)
if search_value:
# Django's ORM will build a LEFT OUTER JOIN on the "through" table, resulting in duplicates
# The duplicates then require a `distinct()` on the query
search_query = SearchQuery(
search_value, config="simple", search_type="plain"
)
queryset = queryset.filter(
Q(tags__key=search_value)
| Q(tags__value=search_value)
| Q(tags__text_search=search_query)
| Q(tags__key__contains=search_value)
| Q(tags__value__contains=search_value)
| Q(uid=search_value)
| Q(name=search_value)
| Q(region=search_value)
| Q(service=search_value)
| Q(type=search_value)
| Q(text_search=search_query)
| Q(uid__contains=search_value)
| Q(name__contains=search_value)
| Q(region__contains=search_value)
| Q(service__contains=search_value)
| Q(type__contains=search_value)
Q(text_search=search_query) | Q(tags__text_search=search_query)
).distinct()
return queryset
def _optimize_tags_loading(self, queryset):
"""Optimize tags loading with prefetch_related to avoid N+1 queries"""
# Use prefetch_related to load all tags in a single query
return queryset.prefetch_related(
Prefetch(
"tags",
queryset=ResourceTag.objects.filter(
tenant_id=self.request.tenant_id
).select_related(),
to_attr="prefetched_tags",
)
)
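# Illustrative effect of the prefetch above (sketch, not part of this change):
# once `prefetched_tags` is populated via `to_attr`, consumers can read tags
# without issuing one query per resource.
def _example_collect_tags(resources):
    return [
        {tag.key: tag.value for tag in resource.prefetched_tags}  # no extra SQL
        for resource in resources
    ]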
def get_serializer_class(self):
if self.action in ["metadata", "metadata_latest"]:
return ResourceMetadataSerializer
return super().get_serializer_class()
def get_filterset_class(self):
if self.action in ["latest", "metadata_latest"]:
return LatestResourceFilter
return ResourceFilter
def filter_queryset(self, queryset):
# Do not apply filters when retrieving specific resource
if self.action == "retrieve":
return queryset
return super().filter_queryset(queryset)
def list(self, request, *args, **kwargs):
filtered_queryset = self.filter_queryset(self.get_queryset())
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Resource.all_objects,
select_related=["provider"],
prefetch_related=["findings"],
)
def retrieve(self, request, *args, **kwargs):
queryset = self._optimize_tags_loading(self.get_queryset())
instance = get_object_or_404(queryset, pk=kwargs.get("pk"))
mapping_ids = list(
ResourceFindingMapping.objects.filter(
resource=instance, tenant_id=request.tenant_id
).values_list("finding_id", flat=True)
)
latest_findings = (
Finding.all_objects.filter(id__in=mapping_ids, tenant_id=request.tenant_id)
.order_by("uid", "-inserted_at")
.distinct("uid")
)
setattr(instance, "latest_findings", latest_findings)
serializer = self.get_serializer(instance)
return Response(serializer.data, status=status.HTTP_200_OK)
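# The `order_by("uid", "-inserted_at").distinct("uid")` pattern above relies on
# PostgreSQL's DISTINCT ON: one row per uid, keeping the most recent. Roughly
# the following SQL (illustrative, an assumption about the generated query):
#
#   SELECT DISTINCT ON (uid) * FROM findings
#   WHERE id IN (...) AND tenant_id = ... ORDER BY uid, inserted_at DESC;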
@action(detail=False, methods=["get"], url_name="latest")
def latest(self, request):
tenant_id = request.tenant_id
filtered_queryset = self.filter_queryset(self.get_queryset())
latest_scan_ids = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
.values_list("id", flat=True)
)
filtered_queryset = filtered_queryset.filter(
tenant_id=tenant_id, provider__scan__in=latest_scan_ids
)
return self.paginate_by_pk(
request,
filtered_queryset,
manager=Resource.all_objects,
select_related=["provider"],
prefetch_related=["findings"],
)
@action(detail=False, methods=["get"], url_name="metadata")
def metadata(self, request):
# Force filter validation
self.filter_queryset(self.get_queryset())
tenant_id = request.tenant_id
query_params = request.query_params
queryset = ResourceScanSummary.objects.filter(tenant_id=tenant_id)
if scans := query_params.get("filter[scan__in]") or query_params.get(
"filter[scan]"
):
queryset = queryset.filter(scan_id__in=scans.split(","))
else:
exact = query_params.get("filter[inserted_at]")
gte = query_params.get("filter[inserted_at__gte]")
lte = query_params.get("filter[inserted_at__lte]")
date_filters = {}
if exact:
date = parse_date(exact)
datetime_start = datetime.combine(
date, datetime.min.time(), tzinfo=timezone.utc
)
datetime_end = datetime_start + timedelta(days=1)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
else:
if gte:
date_start = parse_date(gte)
datetime_start = datetime.combine(
date_start, datetime.min.time(), tzinfo=timezone.utc
)
date_filters["scan_id__gte"] = uuid7_start(
datetime_to_uuid7(datetime_start)
)
if lte:
date_end = parse_date(lte)
datetime_end = datetime.combine(
date_end + timedelta(days=1),
datetime.min.time(),
tzinfo=timezone.utc,
)
date_filters["scan_id__lt"] = uuid7_start(
datetime_to_uuid7(datetime_end)
)
if date_filters:
queryset = queryset.filter(**date_filters)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get("filter[type]") or query_params.get(
"filter[type__in]"
):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
result = {
"services": services,
"regions": regions,
"types": resource_types,
}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data)
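# Why the date filters above rewrite to scan_id bounds (illustrative sketch):
# scan primary keys are UUIDv7 values whose leading bits encode the creation
# timestamp, so a time window maps to a lexicographic UUID range and the
# filter stays index-friendly.
def _example_uuid7_window(day_start, day_end):
    # uuid7_start/datetime_to_uuid7 are the same helpers used in `metadata`.
    return {
        "scan_id__gte": uuid7_start(datetime_to_uuid7(day_start)),
        "scan_id__lt": uuid7_start(datetime_to_uuid7(day_end)),
    }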
@action(
detail=False,
methods=["get"],
url_name="metadata_latest",
url_path="metadata/latest",
)
def metadata_latest(self, request):
tenant_id = request.tenant_id
query_params = request.query_params
latest_scans_queryset = (
Scan.all_objects.filter(tenant_id=tenant_id, state=StateChoices.COMPLETED)
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
)
queryset = ResourceScanSummary.objects.filter(
tenant_id=tenant_id,
scan_id__in=latest_scans_queryset.values_list("id", flat=True),
)
if service_filter := query_params.get("filter[service]") or query_params.get(
"filter[service__in]"
):
queryset = queryset.filter(service__in=service_filter.split(","))
if region_filter := query_params.get("filter[region]") or query_params.get(
"filter[region__in]"
):
queryset = queryset.filter(region__in=region_filter.split(","))
if resource_type_filter := query_params.get("filter[type]") or query_params.get(
"filter[type__in]"
):
queryset = queryset.filter(
resource_type__in=resource_type_filter.split(",")
)
services = list(
queryset.values_list("service", flat=True).distinct().order_by("service")
)
regions = list(
queryset.values_list("region", flat=True).distinct().order_by("region")
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
result = {
"services": services,
"regions": regions,
"types": resource_types,
}
serializer = self.get_serializer(data=result)
serializer.is_valid(raise_exception=True)
return Response(serializer.data)
@extend_schema_view(
list=extend_schema(
@@ -1768,17 +2333,7 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
search_value, config="simple", search_type="plain"
)
resource_match = Resource.all_objects.filter(
text_search=search_query,
id__in=ResourceFindingMapping.objects.filter(
resource_id=OuterRef("pk"),
tenant_id=tenant_id,
).values("resource_id"),
)
queryset = queryset.filter(
Q(text_search=search_query) | Q(Exists(resource_match))
)
queryset = queryset.filter(text_search=search_query)
return queryset
@@ -1881,9 +2436,12 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
# ToRemove: Temporary fallback mechanism
if not queryset.exists():
scan_ids = Scan.objects.filter(
raw_scans_ids = Scan.objects.filter(
tenant_id=tenant_id, **scan_based_filters
).values_list("id", flat=True)
).values_list("id", "unique_resource_count")
scan_ids = [
scan_id for scan_id, count in raw_scans_ids if count and count > 0
]
for scan_id in scan_ids:
backfill_scan_resource_summaries_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
@@ -1915,6 +2473,8 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
@@ -1967,7 +2527,12 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
.order_by("provider_id", "-inserted_at")
.distinct("provider_id")
)
latest_scans_ids = list(latest_scans_queryset.values_list("id", flat=True))
raw_latest_scans_ids = list(
latest_scans_queryset.values_list("id", "unique_resource_count")
)
latest_scans_ids = [
scan_id for scan_id, count in raw_latest_scans_ids if count and count > 0
]
queryset = ResourceScanSummary.objects.filter(
tenant_id=tenant_id,
@@ -2016,6 +2581,8 @@ class FindingViewSet(PaginateByPkMixin, BaseRLSViewSet):
)
resource_types = list(
queryset.values_list("resource_type", flat=True)
.exclude(resource_type__isnull=True)
.exclude(resource_type__exact="")
.distinct()
.order_by("resource_type")
)
@@ -2827,13 +3394,31 @@ class ComplianceOverviewViewSet(BaseRLSViewSet, TaskManagementMixin):
metadata = requirement.get("attributes", [])
base_attributes = {
"metadata": metadata,
"check_ids": check_ids,
}
# Add technique details for MITRE-ATTACK framework
if "mitre_attack" in compliance_id:
base_attributes["technique_details"] = {
"tactics": requirement.get("tactics", []),
"subtechniques": requirement.get("subtechniques", []),
"platforms": requirement.get("platforms", []),
"technique_url": requirement.get("technique_url", ""),
}
attribute_data.append(
{
"id": requirement_id,
"framework_description": compliance_framework.get(
"description", ""
),
"name": requirement.get("name", ""),
"framework": compliance_framework.get("framework", ""),
"version": compliance_framework.get("version", ""),
"description": requirement.get("description", ""),
"attributes": {"metadata": metadata, "check_ids": check_ids},
"attributes": base_attributes,
}
)
@@ -3208,3 +3793,131 @@ class IntegrationViewSet(BaseRLSViewSet):
context = super().get_serializer_context()
context["allowed_providers"] = self.allowed_providers
return context
@extend_schema_view(
list=extend_schema(
tags=["Lighthouse"],
summary="List all Lighthouse configurations",
description="Retrieve a list of all Lighthouse configurations.",
),
create=extend_schema(
tags=["Lighthouse"],
summary="Create a new Lighthouse configuration",
description="Create a new Lighthouse configuration with the specified details.",
),
partial_update=extend_schema(
tags=["Lighthouse"],
summary="Partially update a Lighthouse configuration",
description="Update certain fields of an existing Lighthouse configuration.",
),
destroy=extend_schema(
tags=["Lighthouse"],
summary="Delete a Lighthouse configuration",
description="Remove a Lighthouse configuration by its ID.",
),
connection=extend_schema(
tags=["Lighthouse"],
summary="Check the connection to the OpenAI API",
description="Verify the connection to the OpenAI API for a specific Lighthouse configuration.",
request=None,
responses={202: OpenApiResponse(response=TaskSerializer)},
),
)
class LighthouseConfigViewSet(BaseRLSViewSet):
"""
API endpoint for managing Lighthouse configuration.
"""
serializer_class = LighthouseConfigSerializer
ordering_fields = ["name", "inserted_at", "updated_at", "is_active"]
ordering = ["-inserted_at"]
def get_queryset(self):
return LighthouseConfiguration.objects.filter(tenant_id=self.request.tenant_id)
def get_serializer_class(self):
if self.action == "create":
return LighthouseConfigCreateSerializer
elif self.action == "partial_update":
return LighthouseConfigUpdateSerializer
elif self.action == "connection":
return TaskSerializer
return super().get_serializer_class()
@extend_schema(exclude=True)
def retrieve(self, request, *args, **kwargs):
raise MethodNotAllowed(method="GET")
@action(detail=True, methods=["post"], url_name="connection")
def connection(self, request, pk=None):
"""
Check the connection to the OpenAI API asynchronously.
"""
instance = self.get_object()
with transaction.atomic():
task = check_lighthouse_connection_task.delay(
lighthouse_config_id=str(instance.id), tenant_id=self.request.tenant_id
)
prowler_task = Task.objects.get(id=task.id)
serializer = TaskSerializer(prowler_task)
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": prowler_task.id}
)
},
)
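# Illustrative client flow for the connection check above (sketch; the route is
# an assumption): the POST returns 202 Accepted with a Content-Location header
# pointing at the task detail, which can be polled until the task finishes.
def _example_poll_lighthouse_connection(client, config_id: str):
    response = client.post(f"/api/v1/lighthouse-configurations/{config_id}/connection")
    assert response.status_code == 202
    task_url = response.headers["Content-Location"]
    return client.get(task_url)  # repeat until the task reaches a final state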
@extend_schema_view(
list=extend_schema(
tags=["Processor"],
summary="List all processors",
description="Retrieve a list of all configured processors with options for filtering by various criteria.",
),
retrieve=extend_schema(
tags=["Processor"],
summary="Retrieve processor details",
description="Fetch detailed information about a specific processor by its ID.",
),
create=extend_schema(
tags=["Processor"],
summary="Create a new processor",
description="Register a new processor with the system, providing necessary configuration details. There can "
"only be one processor of each type per tenant.",
),
partial_update=extend_schema(
tags=["Processor"],
summary="Partially update a processor",
description="Modify certain fields of an existing processor without affecting other settings.",
),
destroy=extend_schema(
tags=["Processor"],
summary="Delete a processor",
description="Remove a processor from the system by its ID.",
),
)
@method_decorator(CACHE_DECORATOR, name="list")
@method_decorator(CACHE_DECORATOR, name="retrieve")
class ProcessorViewSet(BaseRLSViewSet):
queryset = Processor.objects.all()
serializer_class = ProcessorSerializer
http_method_names = ["get", "post", "patch", "delete"]
filterset_class = ProcessorFilter
ordering = ["processor_type", "-inserted_at"]
# RBAC required permissions
required_permissions = [Permissions.MANAGE_ACCOUNT]
def get_queryset(self):
queryset = Processor.objects.filter(tenant_id=self.request.tenant_id)
return queryset
def get_serializer_class(self):
if self.action == "create":
return ProcessorCreateSerializer
elif self.action == "partial_update":
return ProcessorUpdateSerializer
return super().get_serializer_class()
@@ -1,3 +1,5 @@
import string
from django.core.exceptions import ValidationError
from django.utils.translation import gettext as _
@@ -20,3 +22,89 @@ class MaximumLengthValidator:
return _(
f"Your password must contain no more than {self.max_length} characters."
)
class SpecialCharactersValidator:
def __init__(self, special_characters=None, min_special_characters=1):
# Use string.punctuation if no custom characters provided
self.special_characters = special_characters or string.punctuation
self.min_special_characters = min_special_characters
def validate(self, password, user=None):
if (
sum(1 for char in password if char in self.special_characters)
< self.min_special_characters
):
raise ValidationError(
_("This password must contain at least one special character."),
code="password_no_special_characters",
params={
"special_characters": self.special_characters,
"min_special_characters": self.min_special_characters,
},
)
def get_help_text(self):
return _(
f"Your password must contain at least one special character from: {self.special_characters}"
)
class UppercaseValidator:
def __init__(self, min_uppercase=1):
self.min_uppercase = min_uppercase
def validate(self, password, user=None):
if sum(1 for char in password if char.isupper()) < self.min_uppercase:
raise ValidationError(
_(
"This password must contain at least %(min_uppercase)d uppercase letter."
),
code="password_no_uppercase_letters",
params={"min_uppercase": self.min_uppercase},
)
def get_help_text(self):
return _(
f"Your password must contain at least {self.min_uppercase} uppercase letter."
)
class LowercaseValidator:
def __init__(self, min_lowercase=1):
self.min_lowercase = min_lowercase
def validate(self, password, user=None):
if sum(1 for char in password if char.islower()) < self.min_lowercase:
raise ValidationError(
_(
"This password must contain at least %(min_lowercase)d lowercase letter."
),
code="password_no_lowercase_letters",
params={"min_lowercase": self.min_lowercase},
)
def get_help_text(self):
return _(
f"Your password must contain at least {self.min_lowercase} lowercase letter."
)
class NumericValidator:
def __init__(self, min_numeric=1):
self.min_numeric = min_numeric
def validate(self, password, user=None):
if sum(1 for char in password if char.isdigit()) < self.min_numeric:
raise ValidationError(
_(
"This password must contain at least %(min_numeric)d numeric character."
),
code="password_no_numeric_characters",
params={"min_numeric": self.min_numeric},
)
def get_help_text(self):
return _(
f"Your password must contain at least {self.min_numeric} numeric character."
)
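# Illustrative sketch (not part of this change): once registered in
# AUTH_PASSWORD_VALIDATORS (see the settings hunk below), these validators run
# through Django's standard validation entry point.
def _example_validate(password: str) -> None:
    from django.contrib.auth.password_validation import validate_password

    # Raises django.core.exceptions.ValidationError when the password lacks a
    # special character, an uppercase or lowercase letter, or a digit.
    validate_password(password)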
@@ -1,6 +1,13 @@
import warnings
from celery import Celery, Task
from config.env import env
# Suppress specific warnings from django-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
warnings.filterwarnings(
"ignore", category=UserWarning, module="dj_rest_auth.registration.serializers"
)
BROKER_VISIBILITY_TIMEOUT = env.int("DJANGO_BROKER_VISIBILITY_TIMEOUT", default=86400)
celery_app = Celery("tasks")
@@ -10,6 +10,8 @@ from config.settings.social_login import * # noqa
SECRET_KEY = env("SECRET_KEY", default="secret")
DEBUG = env.bool("DJANGO_DEBUG", default=False)
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
USE_X_FORWARDED_HOST = True
# Application definition
@@ -33,10 +35,12 @@ INSTALLED_APPS = [
"django_celery_beat",
"rest_framework_simplejwt.token_blacklist",
"allauth",
"django.contrib.sites",
"allauth.account",
"allauth.socialaccount",
"allauth.socialaccount.providers.google",
"allauth.socialaccount.providers.github",
"allauth.socialaccount.providers.saml",
"dj_rest_auth.registration",
"rest_framework.authtoken",
]
@@ -128,7 +132,6 @@ DJANGO_GUID = {
}
DATABASE_ROUTERS = ["api.db_router.MainRouter"]
POSTGRES_EXTRA_DB_BACKEND_BASE = "database_backend"
# Password validation
@@ -156,6 +159,30 @@ AUTH_PASSWORD_VALIDATORS = [
{
"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator",
},
{
"NAME": "api.validators.SpecialCharactersValidator",
"OPTIONS": {
"min_special_characters": 1,
},
},
{
"NAME": "api.validators.UppercaseValidator",
"OPTIONS": {
"min_uppercase": 1,
},
},
{
"NAME": "api.validators.LowercaseValidator",
"OPTIONS": {
"min_lowercase": 1,
},
},
{
"NAME": "api.validators.NumericValidator",
"OPTIONS": {
"min_numeric": 1,
},
},
]
SIMPLE_JWT = {
@@ -246,3 +273,7 @@ X_FRAME_OPTIONS = "DENY"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"
DJANGO_DELETION_BATCH_SIZE = env.int("DJANGO_DELETION_BATCH_SIZE", 5000)
# SAML requirement
CSRF_COOKIE_SECURE = True
SESSION_COOKIE_SECURE = True
@@ -4,6 +4,7 @@ from config.env import env
IGNORED_EXCEPTIONS = [
# Provider is not connected due to credentials errors
"is not connected",
"ProviderConnectionError",
# Authentication Errors from AWS
"InvalidToken",
"AccessDeniedException",
@@ -16,7 +17,7 @@ IGNORED_EXCEPTIONS = [
"InternalServerErrorException",
"AccessDenied",
"No Shodan API Key", # Shodan Check
"RequestLimitExceeded", # For now we don't want to log the RequestLimitExceeded errors
"RequestLimitExceeded", # For now, we don't want to log the RequestLimitExceeded errors
"ThrottlingException",
"Rate exceeded",
"SubscriptionRequiredException",
@@ -42,7 +43,9 @@ IGNORED_EXCEPTIONS = [
"AWSAccessKeyIDInvalidError",
"AWSSessionTokenExpiredError",
"EndpointConnectionError", # AWS Service is not available in a region
"Pool is closed", # The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
# The following comes from urllib3: eu-west-1 -- HTTPClientError[126]: An HTTP Client raised an
# unhandled exception: AWSHTTPSConnectionPool(host='hostname.s3.eu-west-1.amazonaws.com', port=443): Pool is closed.
"Pool is closed",
# Authentication Errors from GCP
"ClientAuthenticationError",
"AuthorizationFailed",
@@ -71,7 +74,7 @@ IGNORED_EXCEPTIONS = [
def before_send(event, hint):
"""
before_send handles the Sentry events in order to sent them or not
before_send handles the Sentry events in order to send them or not
"""
# Ignore logs with the ignored_exceptions
# https://docs.python.org/3/library/logging.html#logrecord-objects
@@ -79,9 +82,16 @@ def before_send(event, hint):
log_msg = hint["log_record"].msg
log_lvl = hint["log_record"].levelno
# Handle Error events and discard the rest
if log_lvl == 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return
# Handle Error and Critical events and discard the rest
if log_lvl <= 40 and any(ignored in log_msg for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
# Ignore exceptions with the ignored_exceptions
if "exc_info" in hint and hint["exc_info"]:
exc_value = str(hint["exc_info"][1])
if any(ignored in exc_value for ignored in IGNORED_EXCEPTIONS):
return None # Explicitly return None to drop the event
return event
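# Illustrative wiring for the hook above (an assumption about how this module
# is consumed; sentry_sdk accepts a before_send callable at init time):
def _example_sentry_init(dsn: str):
    import sentry_sdk

    sentry_sdk.init(dsn=dsn, before_send=before_send)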
@@ -11,8 +11,7 @@ GITHUB_OAUTH_CALLBACK_URL = env("SOCIAL_GITHUB_OAUTH_CALLBACK_URL", default="")
# Allauth settings
ACCOUNT_LOGIN_METHODS = {"email"} # Use Email / Password authentication
ACCOUNT_USERNAME_REQUIRED = False
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_SIGNUP_FIELDS = ["email*", "password1*", "password2*"]
ACCOUNT_EMAIL_VERIFICATION = "none" # Do not require email confirmation
ACCOUNT_USER_MODEL_USERNAME_FIELD = None
REST_AUTH = {
@@ -25,6 +24,20 @@ SOCIALACCOUNT_EMAIL_AUTHENTICATION = True
# Connect local account and social account if local account with that email address already exists
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = True
SOCIALACCOUNT_ADAPTER = "api.adapters.ProwlerSocialAccountAdapter"
# def inline(pem: str) -> str:
# return "".join(
# line.strip()
# for line in pem.splitlines()
# if "CERTIFICATE" not in line and "KEY" not in line
# )
# # SAML keys (TODO: Validate certificates)
# SAML_PUBLIC_CERT = inline(env("SAML_PUBLIC_CERT", default=""))
# SAML_PRIVATE_KEY = inline(env("SAML_PRIVATE_KEY", default=""))
SOCIALACCOUNT_PROVIDERS = {
"google": {
"APP": {
@@ -50,4 +63,20 @@ SOCIALACCOUNT_PROVIDERS = {
"read:org",
],
},
"saml": {
"use_nameid_for_email": True,
"sp": {
"entity_id": "urn:prowler.com:sp",
},
"advanced": {
# TODO: Validate certificates
# "x509cert": SAML_PUBLIC_CERT,
# "private_key": SAML_PRIVATE_KEY,
# "authn_request_signed": True,
# "want_message_signed": True,
# "want_assertion_signed": True,
"reject_idp_initiated_sso": False,
"name_id_format": "urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress",
},
},
}
@@ -1,8 +1,9 @@
import logging
from datetime import datetime, timedelta, timezone
from unittest.mock import patch
from unittest.mock import MagicMock, patch
import pytest
from allauth.socialaccount.models import SocialLogin
from django.conf import settings
from django.db import connection as django_connection
from django.db import connections as django_connections
@@ -20,13 +21,18 @@ from api.models import (
Integration,
IntegrationProviderRelationship,
Invitation,
LighthouseConfiguration,
Membership,
Processor,
Provider,
ProviderGroup,
ProviderSecret,
Resource,
ResourceTag,
ResourceTagMapping,
Role,
SAMLConfiguration,
SAMLDomainIndex,
Scan,
ScanSummary,
StateChoices,
@@ -377,8 +383,27 @@ def providers_fixture(tenants_fixture):
tenant_id=tenant.id,
scanner_args={"key1": "value1", "key2": {"key21": "value21"}},
)
provider6 = Provider.objects.create(
provider="m365",
uid="m365.test.com",
alias="m365_testing",
tenant_id=tenant.id,
)
return provider1, provider2, provider3, provider4, provider5
return provider1, provider2, provider3, provider4, provider5, provider6
@pytest.fixture
def processor_fixture(tenants_fixture):
tenant, *_ = tenants_fixture
processor = Processor.objects.create(
tenant_id=tenant.id,
processor_type="mutelist",
configuration="Mutelist:\n Accounts:\n *:\n Checks:\n iam_user_hardware_mfa_enabled:\n "
" Regions:\n - *\n Resources:\n - *",
)
return processor
@pytest.fixture
@@ -630,6 +655,7 @@ def findings_fixture(scans_fixture, resources_fixture):
check_metadata={
"CheckId": "test_check_id",
"Description": "test description apple sauce",
"servicename": "ec2",
},
first_seen_at="2024-01-02T00:00:00Z",
)
@@ -656,6 +682,7 @@ def findings_fixture(scans_fixture, resources_fixture):
check_metadata={
"CheckId": "test_check_id",
"Description": "test description orange juice",
"servicename": "s3",
},
first_seen_at="2024-01-02T00:00:00Z",
muted=True,
@@ -877,6 +904,22 @@ def compliance_requirements_overviews_fixture(scans_fixture, tenants_fixture):
total_checks=3,
)
# Create another compliance framework for testing MITRE ATT&CK
requirement_overview7 = ComplianceRequirementOverview.objects.create(
tenant=tenant,
scan=scan1,
compliance_id="mitre_attack_aws",
framework="MITRE-ATTACK",
version="1.0",
description="MITRE ATT&CK",
region="eu-west-1",
requirement_id="mitre_requirement1",
requirement_status=StatusChoices.FAIL,
passed_checks=0,
failed_checks=0,
total_checks=0,
)
return (
requirement_overview1,
requirement_overview2,
@@ -884,6 +927,7 @@ def compliance_requirements_overviews_fixture(scans_fixture, tenants_fixture):
requirement_overview4,
requirement_overview5,
requirement_overview6,
requirement_overview7,
)
@@ -1039,6 +1083,20 @@ def backfill_scan_metadata_fixture(scans_fixture, findings_fixture):
backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@pytest.fixture
def lighthouse_config_fixture(authenticated_client, tenants_fixture):
return LighthouseConfiguration.objects.create(
tenant_id=tenants_fixture[0].id,
name="OpenAI",
api_key_decoded="sk-test1234567890T3BlbkFJtest1234567890",
model="gpt-4o",
temperature=0,
max_tokens=4000,
business_context="Test business context",
is_active=True,
)
@pytest.fixture(scope="function")
def latest_scan_finding(authenticated_client, providers_fixture, resources_fixture):
provider = providers_fixture[0]
@@ -1080,6 +1138,127 @@ def latest_scan_finding(authenticated_client, providers_fixture, resources_fixtu
return finding
@pytest.fixture(scope="function")
def latest_scan_resource(authenticated_client, providers_fixture):
provider = providers_fixture[0]
tenant_id = str(providers_fixture[0].tenant_id)
scan = Scan.objects.create(
name="latest completed scan for resource",
provider=provider,
trigger=Scan.TriggerChoices.MANUAL,
state=StateChoices.COMPLETED,
tenant_id=tenant_id,
)
resource = Resource.objects.create(
tenant_id=tenant_id,
provider=provider,
uid="latest_resource_uid",
name="Latest Resource",
region="us-east-1",
service="ec2",
type="instance",
metadata='{"test": "metadata"}',
details='{"test": "details"}',
)
resource_tag = ResourceTag.objects.create(
tenant_id=tenant_id,
key="environment",
value="test",
)
ResourceTagMapping.objects.create(
tenant_id=tenant_id,
resource=resource,
tag=resource_tag,
)
finding = Finding.objects.create(
tenant_id=tenant_id,
uid="test_finding_uid_latest",
scan=scan,
delta="new",
status=Status.FAIL,
status_extended="test status extended ",
impact=Severity.critical,
impact_extended="test impact extended",
severity=Severity.critical,
raw_result={
"status": Status.FAIL,
"impact": Severity.critical,
"severity": Severity.critical,
},
tags={"test": "latest"},
check_id="test_check_id_latest",
check_metadata={
"CheckId": "test_check_id_latest",
"Description": "test description latest",
},
first_seen_at="2024-01-02T00:00:00Z",
)
finding.add_resources([resource])
backfill_resource_scan_summaries(tenant_id, str(scan.id))
return resource
@pytest.fixture
def saml_setup(tenants_fixture):
tenant_id = tenants_fixture[0].id
domain = "prowler.com"
SAMLDomainIndex.objects.create(email_domain=domain, tenant_id=tenant_id)
metadata_xml = """<?xml version='1.0' encoding='UTF-8'?>
<md:EntityDescriptor entityID='TEST' xmlns:md='urn:oasis:names:tc:SAML:2.0:metadata'>
<md:IDPSSODescriptor WantAuthnRequestsSigned='false' protocolSupportEnumeration='urn:oasis:names:tc:SAML:2.0:protocol'>
<md:KeyDescriptor use='signing'>
<ds:KeyInfo xmlns:ds='http://www.w3.org/2000/09/xmldsig#'>
<ds:X509Data>
<ds:X509Certificate>TEST</ds:X509Certificate>
</ds:X509Data>
</ds:KeyInfo>
</md:KeyDescriptor>
<md:NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress</md:NameIDFormat>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST' Location='https://TEST/sso/saml'/>
<md:SingleSignOnService Binding='urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect' Location='https://TEST/sso/saml'/>
</md:IDPSSODescriptor>
</md:EntityDescriptor>
"""
SAMLConfiguration.objects.create(
tenant_id=str(tenant_id),
email_domain=domain,
metadata_xml=metadata_xml,
)
return {
"email": f"user@{domain}",
"domain": domain,
"tenant_id": tenant_id,
}
@pytest.fixture
def saml_sociallogin(users_fixture):
user = users_fixture[0]
user.email = "samlsso@acme.com"
extra_data = {
"firstName": ["Test"],
"lastName": ["User"],
"organization": ["Prowler"],
"userType": ["member"],
}
account = MagicMock()
account.provider = "saml"
account.extra_data = extra_data
sociallogin = MagicMock(spec=SocialLogin)
sociallogin.account = account
sociallogin.user = user
return sociallogin
def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}
@@ -1,15 +0,0 @@
import django.db
from django.db.backends.postgresql.base import (
DatabaseWrapper as BuiltinPostgresDatabaseWrapper,
)
from psycopg2 import InterfaceError
class DatabaseWrapper(BuiltinPostgresDatabaseWrapper):
def create_cursor(self, name=None):
try:
return super().create_cursor(name=name)
except InterfaceError:
django.db.close_old_connections()
django.db.connection.connect()
return super().create_cursor(name=name)
@@ -3,6 +3,12 @@
import os
import sys
import warnings
# Suppress specific warnings from django-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
warnings.filterwarnings(
"ignore", category=UserWarning, module="dj_rest_auth.registration.serializers"
)
def main():
@@ -1,8 +1,9 @@
from datetime import datetime, timezone
import openai
from celery.utils.log import get_task_logger
from api.models import Provider
from api.models import LighthouseConfiguration, Provider
from api.utils import prowler_provider_connection_test
logger = get_task_logger(__name__)
@@ -39,3 +40,46 @@ def check_provider_connection(provider_id: str):
connection_error = f"{connection_result.error}" if connection_result.error else None
return {"connected": connection_result.is_connected, "error": connection_error}
def check_lighthouse_connection(lighthouse_config_id: str):
"""
Business logic to check the connection status of a Lighthouse configuration.
Args:
lighthouse_config_id (str): The primary key of the LighthouseConfiguration instance to check.
Returns:
dict: A dictionary containing:
- 'connected' (bool): Indicates whether the connection is successful.
- 'error' (str or None): The error message if the connection failed, otherwise `None`.
- 'available_models' (list): List of available models if connection is successful.
Raises:
LighthouseConfiguration.DoesNotExist: If the lighthouse configuration does not exist.
"""
lighthouse_config = LighthouseConfiguration.objects.get(pk=lighthouse_config_id)
if not lighthouse_config.api_key_decoded:
lighthouse_config.is_active = False
lighthouse_config.save()
return {
"connected": False,
"error": "API key is invalid or missing.",
"available_models": [],
}
try:
client = openai.OpenAI(api_key=lighthouse_config.api_key_decoded)
models = client.models.list()
lighthouse_config.is_active = True
lighthouse_config.save()
return {
"connected": True,
"error": None,
"available_models": [model.id for model in models.data],
}
except Exception as e:
lighthouse_config.is_active = False
lighthouse_config.save()
return {"connected": False, "error": str(e), "available_models": []}
@@ -1,4 +1,5 @@
import os
import re
import zipfile
import boto3
@@ -30,6 +31,7 @@ from prowler.lib.outputs.compliance.iso27001.iso27001_gcp import GCPISO27001
from prowler.lib.outputs.compliance.iso27001.iso27001_kubernetes import (
KubernetesISO27001,
)
from prowler.lib.outputs.compliance.iso27001.iso27001_m365 import M365ISO27001
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
@@ -89,6 +91,7 @@ COMPLIANCE_CLASS_MAP = {
"m365": [
(lambda name: name.startswith("cis_"), M365CIS),
(lambda name: name == "prowler_threatscore_m365", ProwlerThreatScoreM365),
(lambda name: name.startswith("iso27001_"), M365ISO27001),
],
}
@@ -238,15 +241,18 @@ def _generate_output_directory(
'/tmp/tenant-1234/aws/scan-5678/prowler-output-2023-02-15T12:34:56',
'/tmp/tenant-1234/aws/scan-5678/compliance/prowler-output-2023-02-15T12:34:56'
"""
# Sanitize the prowler provider name to ensure it is a valid directory name
prowler_provider_sanitized = re.sub(r"[^\w\-]", "-", prowler_provider)
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider}-{output_file_timestamp}"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
@@ -6,7 +6,7 @@ from datetime import datetime, timezone
from celery.utils.log import get_task_logger
from config.settings.celery import CELERY_DEADLOCK_ATTEMPTS
from django.db import IntegrityError, OperationalError
from django.db.models import Case, Count, IntegerField, Sum, When
from django.db.models import Case, Count, IntegerField, OuterRef, Subquery, Sum, When
from tasks.utils import CustomEncoder
from api.compliance import (
@@ -14,9 +14,11 @@ from api.compliance import (
generate_scan_compliance,
)
from api.db_utils import create_objects_in_batches, rls_transaction
from api.exceptions import ProviderConnectionError
from api.models import (
ComplianceRequirementOverview,
Finding,
Processor,
Provider,
Resource,
ResourceScanSummary,
@@ -26,7 +28,7 @@ from api.models import (
StateChoices,
)
from api.models import StatusChoices as FindingStatus
from api.utils import initialize_prowler_provider
from api.utils import initialize_prowler_provider, return_prowler_provider
from api.v1.serializers import ScanTaskSerializer
from prowler.lib.outputs.finding import Finding as ProwlerFinding
from prowler.lib.scan.scan import Scan as ProwlerScan
@@ -132,14 +134,28 @@ def perform_prowler_scan(
scan_instance.started_at = datetime.now(tz=timezone.utc)
scan_instance.save()
# Find the mutelist processor if it exists
with rls_transaction(tenant_id):
try:
mutelist_processor = Processor.objects.get(
tenant_id=tenant_id, processor_type=Processor.ProcessorChoices.MUTELIST
)
except Processor.DoesNotExist:
mutelist_processor = None
except Exception as e:
logger.error(f"Error processing mutelist rules: {e}")
mutelist_processor = None
try:
with rls_transaction(tenant_id):
try:
prowler_provider = initialize_prowler_provider(provider_instance)
prowler_provider = initialize_prowler_provider(
provider_instance, mutelist_processor
)
provider_instance.connected = True
except Exception as e:
provider_instance.connected = False
exc = ValueError(
exc = ProviderConnectionError(
f"Provider {provider_instance.provider} is not connected: {e}"
)
finally:
@@ -149,7 +165,8 @@ def perform_prowler_scan(
provider_instance.save()
# If the provider is not connected, raise an exception outside the transaction.
# If raised within the transaction, the transaction will be rolled back and the provider will not be marked as not connected.
# If raised within the transaction, the transaction will be rolled back and the provider will not be marked
# as not connected.
if exc:
raise exc
@@ -273,6 +290,9 @@ def perform_prowler_scan(
if not last_first_seen_at:
last_first_seen_at = datetime.now(tz=timezone.utc)
# If the finding is muted at this point, the reason must be the configured Mutelist
muted_reason = "Muted by mutelist" if finding.muted else None
# Create the finding
finding_instance = Finding.objects.create(
tenant_id=tenant_id,
@@ -288,6 +308,7 @@ def perform_prowler_scan(
scan=scan_instance,
first_seen_at=last_first_seen_at,
muted=finding.muted,
muted_reason=muted_reason,
compliance=finding.compliance,
)
finding_instance.add_resources([resource_instance])
@@ -355,12 +376,16 @@ def perform_prowler_scan(
def aggregate_findings(tenant_id: str, scan_id: str):
"""
Aggregates findings for a given scan and stores the results in the ScanSummary table.
Also updates the failed_findings_count for each resource based on the latest findings.
This function retrieves all findings associated with a given `scan_id` and calculates various
metrics such as counts of failed, passed, and muted findings, as well as their deltas (new,
changed, unchanged). The results are grouped by `check_id`, `service`, `severity`, and `region`.
These aggregated metrics are then stored in the `ScanSummary` table.
Additionally, it updates the failed_findings_count field for each resource based on the most
recent findings for each finding.uid.
Args:
tenant_id (str): The ID of the tenant to which the scan belongs.
scan_id (str): The ID of the scan for which findings need to be aggregated.
@@ -380,6 +405,8 @@ def aggregate_findings(tenant_id: str, scan_id: str):
- muted_new: Muted findings with a delta of 'new'.
- muted_changed: Muted findings with a delta of 'changed'.
"""
_update_resource_failed_findings_count(tenant_id, scan_id)
with rls_transaction(tenant_id):
findings = Finding.objects.filter(tenant_id=tenant_id, scan_id=scan_id)
@@ -504,6 +531,53 @@ def aggregate_findings(tenant_id: str, scan_id: str):
ScanSummary.objects.bulk_create(scan_aggregations, batch_size=3000)
def _update_resource_failed_findings_count(tenant_id: str, scan_id: str):
"""
Update the failed_findings_count field for resources based on the latest findings.
This function calculates the number of failed findings for each resource by:
1. Getting the latest finding for each finding.uid
2. Counting failed findings per resource
3. Updating the failed_findings_count field for each resource
Args:
tenant_id (str): The ID of the tenant to which the scan belongs.
scan_id (str): The ID of the scan for which to update resource counts.
"""
with rls_transaction(tenant_id):
scan = Scan.objects.get(pk=scan_id)
provider_id = scan.provider_id
resources = list(
Resource.all_objects.filter(tenant_id=tenant_id, provider_id=provider_id)
)
# For each resource, calculate failed findings count based on latest findings
for resource in resources:
with rls_transaction(tenant_id):
# Get the latest finding for each finding.uid that affects this resource
latest_findings_subquery = (
Finding.all_objects.filter(
tenant_id=tenant_id, uid=OuterRef("uid"), resources=resource
)
.order_by("-inserted_at")
.values("id")[:1]
)
# Count failed findings from the latest findings
failed_count = Finding.all_objects.filter(
tenant_id=tenant_id,
resources=resource,
id__in=Subquery(latest_findings_subquery),
status=FindingStatus.FAIL,
muted=False,
).count()
resource.failed_findings_count = failed_count
resource.save(update_fields=["failed_findings_count"])
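# Note (illustrative cross-reference): the denormalized counter written above
# backs the default Resource ordering ("-failed_findings_count", "-updated_at"
# in ResourceViewSet), so resource lists sort by failing findings without
# joining against the findings table at request time.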
def create_compliance_requirements(tenant_id: str, scan_id: str):
"""
Create detailed compliance requirement overview records for a scan.
@@ -526,7 +600,7 @@ def create_compliance_requirements(tenant_id: str, scan_id: str):
with rls_transaction(tenant_id):
scan_instance = Scan.objects.get(pk=scan_id)
provider_instance = scan_instance.provider
prowler_provider = initialize_prowler_provider(provider_instance)
prowler_provider = return_prowler_provider(provider_instance)
# Get check status data by region from findings
check_status_by_region = {}
@@ -8,7 +8,7 @@ from config.celery import RLSTask
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE, DJANGO_TMP_OUTPUT_DIRECTORY
from django_celery_beat.models import PeriodicTask
from tasks.jobs.backfill import backfill_resource_scan_summaries
from tasks.jobs.connection import check_provider_connection
from tasks.jobs.connection import check_lighthouse_connection, check_provider_connection
from tasks.jobs.deletion import delete_provider, delete_tenant
from tasks.jobs.export import (
COMPLIANCE_CLASS_MAP,
@@ -37,6 +37,26 @@ from prowler.lib.outputs.finding import Finding as FindingOutput
logger = get_task_logger(__name__)
def _perform_scan_complete_tasks(tenant_id: str, scan_id: str, provider_id: str):
"""
Helper function to perform tasks after a scan is completed.
Args:
tenant_id (str): The tenant ID under which the scan was performed.
scan_id (str): The ID of the scan that was performed.
provider_id (str): The primary key of the Provider instance that was scanned.
"""
create_compliance_requirements_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
chain(
perform_scan_summary_task.si(tenant_id=tenant_id, scan_id=scan_id),
generate_outputs_task.si(
scan_id=scan_id, provider_id=provider_id, tenant_id=tenant_id
),
).apply_async()
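# Illustrative note on the chain above: `.si()` creates immutable signatures,
# so perform_scan_summary_task's return value is not injected into
# generate_outputs_task; the tasks run sequentially with only their own kwargs.
# A minimal sketch of the same pattern (task names are hypothetical):
#
#   chain(step_one.si(x=1), step_two.si(y=2)).apply_async()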
@shared_task(base=RLSTask, name="provider-connection-check")
@set_tenant
def check_provider_connection_task(provider_id: str):
@@ -103,13 +123,7 @@ def perform_scan_task(
checks_to_execute=checks_to_execute,
)
chain(
perform_scan_summary_task.si(tenant_id, scan_id),
create_compliance_requirements_task.si(tenant_id=tenant_id, scan_id=scan_id),
generate_outputs.si(
scan_id=scan_id, provider_id=provider_id, tenant_id=tenant_id
),
).apply_async()
_perform_scan_complete_tasks(tenant_id, scan_id, provider_id)
return result
@@ -214,20 +228,12 @@ def perform_scheduled_scan_task(self, tenant_id: str, provider_id: str):
scheduler_task_id=periodic_task_instance.id,
)
chain(
perform_scan_summary_task.si(tenant_id, scan_instance.id),
create_compliance_requirements_task.si(
tenant_id=tenant_id, scan_id=str(scan_instance.id)
),
generate_outputs.si(
scan_id=str(scan_instance.id), provider_id=provider_id, tenant_id=tenant_id
),
).apply_async()
_perform_scan_complete_tasks(tenant_id, str(scan_instance.id), provider_id)
return result
@shared_task(name="scan-summary")
@shared_task(name="scan-summary", queue="overview")
def perform_scan_summary_task(tenant_id: str, scan_id: str):
return aggregate_findings(tenant_id=tenant_id, scan_id=scan_id)
@@ -243,7 +249,7 @@ def delete_tenant_task(tenant_id: str):
queue="scan-reports",
)
@set_tenant(keep_tenant=True)
def generate_outputs(scan_id: str, provider_id: str, tenant_id: str):
def generate_outputs_task(scan_id: str, provider_id: str, tenant_id: str):
"""
Process findings in batches and generate output files in multiple formats.
@@ -381,7 +387,7 @@ def backfill_scan_resource_summaries_task(tenant_id: str, scan_id: str):
return backfill_resource_scan_summaries(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(base=RLSTask, name="scan-compliance-overviews")
@shared_task(base=RLSTask, name="scan-compliance-overviews", queue="overview")
def create_compliance_requirements_task(tenant_id: str, scan_id: str):
"""
Creates detailed compliance requirement records for a scan.
@@ -395,3 +401,22 @@ def create_compliance_requirements_task(tenant_id: str, scan_id: str):
scan_id (str): The ID of the scan for which to create records.
"""
return create_compliance_requirements(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(base=RLSTask, name="lighthouse-connection-check")
@set_tenant
def check_lighthouse_connection_task(lighthouse_config_id: str, tenant_id: str = None):
"""
Task to check the connection status of a Lighthouse configuration.
Args:
lighthouse_config_id (str): The primary key of the LighthouseConfiguration instance to check.
tenant_id (str): The tenant ID for the task.
Returns:
dict: A dictionary containing:
- 'connected' (bool): Indicates whether the connection is successful.
- 'error' (str or None): The error message if the connection failed, otherwise `None`.
- 'available_models' (list): List of available models if connection is successful.
"""
return check_lighthouse_connection(lighthouse_config_id=lighthouse_config_id)
@@ -55,3 +55,22 @@ class TestScheduleProviderScan:
assert "There is already a scheduled scan for this provider." in str(
exc_info.value
)
def test_remove_periodic_task(self, providers_fixture):
provider_instance = providers_fixture[0]
assert Scan.objects.count() == 0
with patch("tasks.tasks.perform_scheduled_scan_task.apply_async"):
schedule_provider_scan(provider_instance)
assert Scan.objects.count() == 1
scan = Scan.objects.first()
periodic_task = scan.scheduler_task
assert periodic_task is not None
periodic_task.delete()
scan.refresh_from_db()
# Assert the scan still exists but its scheduler_task is set to None
# Otherwise, Scan.DoesNotExist would be raised
assert Scan.objects.get(id=scan.id).scheduler_task is None
@@ -1,10 +1,10 @@
from datetime import datetime, timezone
from unittest.mock import patch, MagicMock
from unittest.mock import MagicMock, patch
import pytest
from tasks.jobs.connection import check_lighthouse_connection, check_provider_connection
from api.models import Provider
from tasks.jobs.connection import check_provider_connection
from api.models import LighthouseConfiguration, Provider
@pytest.mark.parametrize(
@@ -70,3 +70,60 @@ def test_check_provider_connection_exception(
mock_provider_instance.save.assert_called_once()
assert mock_provider_instance.connected is False
@pytest.mark.parametrize(
"lighthouse_data",
[
{
"name": "OpenAI",
"api_key_decoded": "sk-test1234567890T3BlbkFJtest1234567890",
"model": "gpt-4o",
"temperature": 0,
"max_tokens": 4000,
"business_context": "Test business context",
"is_active": True,
},
],
)
@patch("tasks.jobs.connection.openai.OpenAI")
@pytest.mark.django_db
def test_check_lighthouse_connection(
mock_openai_client, tenants_fixture, lighthouse_data
):
lighthouse_config = LighthouseConfiguration.objects.create(
**lighthouse_data, tenant_id=tenants_fixture[0].id
)
mock_models = MagicMock()
mock_models.data = [MagicMock(id="gpt-4o"), MagicMock(id="gpt-4o-mini")]
mock_openai_client.return_value.models.list.return_value = mock_models
result = check_lighthouse_connection(
lighthouse_config_id=str(lighthouse_config.id),
)
lighthouse_config.refresh_from_db()
mock_openai_client.assert_called_once_with(
api_key=lighthouse_data["api_key_decoded"]
)
assert lighthouse_config.is_active is True
assert result["connected"] is True
assert result["error"] is None
assert result["available_models"] == ["gpt-4o", "gpt-4o-mini"]
@patch("tasks.jobs.connection.LighthouseConfiguration.objects.get")
@pytest.mark.django_db
def test_check_lighthouse_connection_missing_api_key(mock_lighthouse_get):
mock_lighthouse_instance = MagicMock()
mock_lighthouse_instance.api_key_decoded = None
mock_lighthouse_get.return_value = mock_lighthouse_instance
result = check_lighthouse_connection("lighthouse_config_id")
assert result["connected"] is False
assert result["error"] == "API key is invalid or missing."
assert result["available_models"] == []
assert mock_lighthouse_instance.is_active is False
mock_lighthouse_instance.save.assert_called_once()
@@ -145,3 +145,22 @@ class TestOutputs:
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")
def test_generate_output_directory_invalid_character(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = "t1"
scan_id = "s1"
provider = "aws/test@check"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
)
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"aws-test-check-{output_file_timestamp}")
assert compliance.endswith(f"aws-test-check-{output_file_timestamp}")
@@ -7,11 +7,13 @@ import pytest
from tasks.jobs.scan import (
_create_finding_delta,
_store_resources,
_update_resource_failed_findings_count,
create_compliance_requirements,
perform_prowler_scan,
)
from tasks.utils import CustomEncoder
from api.exceptions import ProviderConnectionError
from api.models import (
ComplianceRequirementOverview,
Finding,
@@ -158,6 +160,7 @@ class TestPerformScan:
assert scan_finding.raw_result == finding.raw
assert scan_finding.muted
assert scan_finding.compliance == finding.compliance
assert scan_finding.muted_reason == "Muted by mutelist"
assert scan_resource.tenant == tenant
assert scan_resource.uid == finding.resource_uid
@@ -203,7 +206,7 @@ class TestPerformScan:
provider_id = str(provider.id)
checks_to_execute = ["check1", "check2"]
with pytest.raises(ValueError):
with pytest.raises(ProviderConnectionError):
perform_prowler_scan(tenant_id, scan_id, provider_id, checks_to_execute)
scan.refresh_from_db()
@@ -399,9 +402,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -427,9 +428,7 @@ class TestCreateComplianceRequirements:
"us-east-1",
"us-west-2",
]
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
@@ -512,9 +511,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -557,9 +554,7 @@ class TestCreateComplianceRequirements:
"us-east-1",
"us-west-2",
]
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
@@ -607,9 +602,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -641,9 +634,7 @@ class TestCreateComplianceRequirements:
mock_prowler_provider_instance.get_regions.side_effect = AttributeError(
"No get_regions method"
)
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"kubernetes_cis": {
@@ -676,9 +667,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -704,9 +693,7 @@ class TestCreateComplianceRequirements:
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
@@ -743,9 +730,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
@@ -759,7 +744,7 @@ class TestCreateComplianceRequirements:
tenant_id = str(tenant.id)
scan_id = str(scan.id)
mock_initialize_prowler_provider.side_effect = Exception(
mock_prowler_provider.side_effect = Exception(
"Provider initialization failed"
)
@@ -774,9 +759,7 @@ class TestCreateComplianceRequirements:
):
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
patch("tasks.jobs.scan.return_prowler_provider") as mock_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -800,9 +783,7 @@ class TestCreateComplianceRequirements:
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {}
@@ -821,8 +802,8 @@ class TestCreateComplianceRequirements:
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -862,9 +843,7 @@ class TestCreateComplianceRequirements:
mock_prowler_provider_instance = MagicMock()
mock_prowler_provider_instance.get_regions.return_value = ["us-east-1"]
mock_initialize_prowler_provider.return_value = (
mock_prowler_provider_instance
)
mock_return_prowler_provider.return_value = mock_prowler_provider_instance
mock_compliance_template.__getitem__.return_value = {
"cis_1.4_aws": {
@@ -898,8 +877,8 @@ class TestCreateComplianceRequirements:
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -911,7 +890,6 @@ class TestCreateComplianceRequirements:
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
providers_fixture[0]
mock_findings_filter.return_value = []
@@ -921,7 +899,7 @@ class TestCreateComplianceRequirements:
"us-west-2",
"eu-west-1",
]
mock_initialize_prowler_provider.return_value = mock_prowler_provider
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
@@ -990,8 +968,8 @@ class TestCreateComplianceRequirements:
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -1009,7 +987,7 @@ class TestCreateComplianceRequirements:
mock_prowler_provider = MagicMock()
mock_prowler_provider.get_regions.return_value = ["us-east-1", "us-west-2"]
mock_initialize_prowler_provider.return_value = mock_prowler_provider
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
@@ -1077,8 +1055,8 @@ class TestCreateComplianceRequirements:
with (
patch("api.db_utils.rls_transaction"),
patch(
"tasks.jobs.scan.initialize_prowler_provider"
) as mock_initialize_prowler_provider,
"tasks.jobs.scan.return_prowler_provider"
) as mock_return_prowler_provider,
patch(
"tasks.jobs.scan.PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE"
) as mock_compliance_template,
@@ -1090,13 +1068,12 @@ class TestCreateComplianceRequirements:
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
providers_fixture[0]
mock_findings_filter.return_value = []
mock_prowler_provider = MagicMock()
mock_prowler_provider.get_regions.return_value = ["us-east-1", "us-west-2"]
mock_initialize_prowler_provider.return_value = mock_prowler_provider
mock_return_prowler_provider.return_value = mock_prowler_provider
mock_compliance_template.__getitem__.return_value = {
"test_compliance": {
@@ -1190,3 +1167,85 @@ class TestCreateComplianceRequirements:
assert len(req_2_objects) == 2
assert all(obj.requirement_status == "PASS" for obj in req_1_objects)
assert all(obj.requirement_status == "FAIL" for obj in req_2_objects)
@pytest.mark.django_db
class TestUpdateResourceFailedFindingsCount:
@patch("api.models.Resource.all_objects.filter")
@patch("api.models.Finding.all_objects.filter")
def test_failed_findings_count_update(
self,
mock_finding_filter,
mock_resource_filter,
tenants_fixture,
scans_fixture,
providers_fixture,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
scan.provider = provider
scan.save()
tenant_id = str(tenant.id)
scan_id = str(scan.id)
resource1 = MagicMock()
resource1.uid = "res-1"
resource1.failed_findings_count = None
resource1.save = MagicMock()
resource2 = MagicMock()
resource2.uid = "res-2"
resource2.failed_findings_count = None
resource2.save = MagicMock()
mock_resource_filter.return_value = [resource1, resource2]
fake_subquery_qs = MagicMock()
fake_subquery_qs.order_by.return_value = fake_subquery_qs
fake_subquery_qs.values.return_value = fake_subquery_qs
fake_subquery_qs.__getitem__.return_value = fake_subquery_qs
def finding_filter_side_effect(*args, **kwargs):
if "status" in kwargs:
qs_count = MagicMock()
if kwargs.get("resources") == resource1:
qs_count.count.return_value = 3
else:
qs_count.count.return_value = 0
return qs_count
return fake_subquery_qs
mock_finding_filter.side_effect = finding_filter_side_effect
_update_resource_failed_findings_count(tenant_id, scan_id)
# resource1 should have been updated to 3
assert resource1.failed_findings_count == 3
resource1.save.assert_called_once_with(update_fields=["failed_findings_count"])
# resource2 should have been updated to 0
assert resource2.failed_findings_count == 0
resource2.save.assert_called_once_with(update_fields=["failed_findings_count"])
@patch("api.models.Resource.all_objects.filter", return_value=[])
@patch("api.models.Finding.all_objects.filter")
def test_no_resources_no_error(
self,
mock_finding_filter,
mock_resource_filter,
tenants_fixture,
scans_fixture,
providers_fixture,
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
provider = providers_fixture[0]
scan.provider = provider
scan.save()
_update_resource_failed_findings_count(str(tenant.id), str(scan.id))
mock_finding_filter.assert_not_called()
+31 -8
@@ -3,9 +3,10 @@ from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from tasks.tasks import generate_outputs
from tasks.tasks import _perform_scan_complete_tasks, generate_outputs_task
# TODO Move this to outputs/reports jobs
@pytest.mark.django_db
class TestGenerateOutputs:
def setup_method(self):
@@ -17,7 +18,7 @@ class TestGenerateOutputs:
with patch("tasks.tasks.ScanSummary.objects.filter") as mock_filter:
mock_filter.return_value.exists.return_value = False
result = generate_outputs(
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
@@ -99,7 +100,7 @@ class TestGenerateOutputs:
mock_compress.return_value = "/tmp/zipped.zip"
mock_upload.return_value = "s3://bucket/zipped.zip"
result = generate_outputs(
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
@@ -150,7 +151,7 @@ class TestGenerateOutputs:
True,
]
result = generate_outputs(
result = generate_outputs_task(
scan_id="scan",
provider_id="provider",
tenant_id=self.tenant_id,
@@ -208,7 +209,7 @@ class TestGenerateOutputs:
{"aws": [(lambda x: True, MagicMock())]},
),
):
generate_outputs(
generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
@@ -276,7 +277,7 @@ class TestGenerateOutputs:
}
},
):
result = generate_outputs(
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
@@ -346,7 +347,7 @@ class TestGenerateOutputs:
):
mock_summary.return_value.exists.return_value = True
result = generate_outputs(
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
@@ -407,9 +408,31 @@ class TestGenerateOutputs:
),
):
with caplog.at_level("ERROR"):
generate_outputs(
generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
assert "Error deleting output files" in caplog.text
class TestScanCompleteTasks:
@patch("tasks.tasks.create_compliance_requirements_task.apply_async")
@patch("tasks.tasks.perform_scan_summary_task.si")
@patch("tasks.tasks.generate_outputs_task.si")
def test_scan_complete_tasks(
self, mock_outputs_task, mock_scan_summary_task, mock_compliance_tasks
):
_perform_scan_complete_tasks("tenant-id", "scan-id", "provider-id")
mock_compliance_tasks.assert_called_once_with(
kwargs={"tenant_id": "tenant-id", "scan_id": "scan-id"},
)
mock_scan_summary_task.assert_called_once_with(
scan_id="scan-id",
tenant_id="tenant-id",
)
mock_outputs_task.assert_called_once_with(
scan_id="scan-id",
provider_id="provider-id",
tenant_id="tenant-id",
)
@@ -0,0 +1,128 @@
import random
from collections import defaultdict
import requests
from locust import events, task
from utils.helpers import APIUserBase, get_api_token, get_auth_headers
GLOBAL = {
"token": None,
"available_scans_info": {},
}
SUPPORTED_COMPLIANCE_IDS = {
"aws": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"gcp": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"azure": ["ens_rd2022", "cis_2.0", "prowler_threatscore", "soc2"],
"m365": ["cis_4.0", "iso27001_2022", "prowler_threatscore"],
}
def _get_random_scan() -> tuple:
provider_type = random.choice(list(GLOBAL["available_scans_info"].keys()))
scan_info = random.choice(GLOBAL["available_scans_info"][provider_type])
return provider_type, scan_info
def _get_random_compliance_id(provider: str) -> str:
return f"{random.choice(SUPPORTED_COMPLIANCE_IDS[provider])}_{provider}"
def _get_compliance_available_scans_by_provider_type(host: str, token: str) -> dict:
excluded_providers = ["kubernetes"]
response_dict = defaultdict(list)
provider_response = requests.get(
f"{host}/providers?fields[providers]=id,provider&filter[connected]=true",
headers=get_auth_headers(token),
)
for provider in provider_response.json()["data"]:
provider_id = provider["id"]
provider_type = provider["attributes"]["provider"]
if provider_type in excluded_providers:
continue
scan_response = requests.get(
f"{host}/scans?fields[scans]=id&filter[provider]={provider_id}&filter[state]=completed",
headers=get_auth_headers(token),
)
scan_data = scan_response.json()["data"]
if not scan_data:
continue
scan_id = scan_data[0]["id"]
response_dict[provider_type].append(scan_id)
return response_dict
def _get_compliance_regions_from_scan(host: str, token: str, scan_id: str) -> list:
response = requests.get(
f"{host}/compliance-overviews/metadata?filter[scan_id]={scan_id}",
headers=get_auth_headers(token),
)
assert response.status_code == 200, f"Failed to get scan: {response.text}"
return response.json()["data"]["attributes"]["regions"]
@events.test_start.add_listener
def on_test_start(environment, **kwargs):
GLOBAL["token"] = get_api_token(environment.host)
scans_by_provider = _get_compliance_available_scans_by_provider_type(
environment.host, GLOBAL["token"]
)
scan_info = defaultdict(list)
for provider, scans in scans_by_provider.items():
for scan in scans:
scan_info[provider].append(
{
"scan_id": scan,
"regions": _get_compliance_regions_from_scan(
environment.host, GLOBAL["token"], scan
),
}
)
GLOBAL["available_scans_info"] = scan_info
class APIUser(APIUserBase):
def on_start(self):
self.token = GLOBAL["token"]
@task(3)
def compliance_overviews_default(self):
provider_type, scan_info = _get_random_scan()
name = f"/compliance-overviews ({provider_type})"
endpoint = f"/compliance-overviews?" f"filter[scan_id]={scan_info['scan_id']}"
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def compliance_overviews_region(self):
provider_type, scan_info = _get_random_scan()
name = f"/compliance-overviews?filter[region] ({provider_type})"
endpoint = (
f"/compliance-overviews"
f"?filter[scan_id]={scan_info['scan_id']}"
f"&filter[region]={random.choice(scan_info['regions'])}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task(2)
def compliance_overviews_requirements(self):
provider_type, scan_info = _get_random_scan()
compliance_id = _get_random_compliance_id(provider_type)
name = f"/compliance-overviews/requirements ({compliance_id})"
endpoint = (
f"/compliance-overviews/requirements"
f"?filter[scan_id]={scan_info['scan_id']}"
f"&filter[compliance_id]={compliance_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@task
def compliance_overviews_attributes(self):
provider_type, _ = _get_random_scan()
compliance_id = _get_random_compliance_id(provider_type)
name = f"/compliance-overviews/attributes ({compliance_id})"
endpoint = (
f"/compliance-overviews/attributes"
f"?filter[compliance_id]={compliance_id}"
)
self.client.get(endpoint, headers=get_auth_headers(self.token), name=name)
@@ -0,0 +1,117 @@
# Prowler Multicloud CIS Benchmarks PowerBI Template
![Prowler Report](https://github.com/user-attachments/assets/560f7f83-1616-4836-811a-16963223c72f)
## Getting Started
1. Install Microsoft PowerBI Desktop
This report requires the Microsoft PowerBI Desktop software which can be downloaded for free from Microsoft.
2. Run compliance scans in Prowler
The report uses compliance csv outputs from Prowler. Compliance scans can be run using either [Prowler CLI](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-cli) or [Prowler Cloud/App](https://cloud.prowler.com/sign-in):
1. Prowler CLI -> Run a Prowler scan using the `--compliance` option
2. Prowler Cloud/App -> Navigate to the compliance section to download csv outputs
![Download Compliance Scan](https://github.com/user-attachments/assets/42c11a60-8ce8-4c60-a663-2371199c052b)
The template supports the following CIS Benchmarks only:
| Compliance Framework | Version |
| ---------------------------------------------- | ------- |
| CIS Amazon Web Services Foundations Benchmark | v4.0.1 |
| CIS Google Cloud Platform Foundation Benchmark | v3.0.0 |
| CIS Microsoft Azure Foundations Benchmark | v3.0.0 |
| CIS Kubernetes Benchmark | v1.10.0 |
Ensure you run or download the correct benchmark versions.
3. Create a local directory to store Prowler csv outputs
Once downloaded, place your csv outputs in a directory on your local machine. If you rename the files, they must retain the provider name in the filename.
To use time-series capabilities such as "compliance percent over time" you'll need scans from multiple dates.
4. Download and run the PowerBI template file (.pbit)
Running the .pbit file will open PowerBI Desktop and prompt you for the full filepath to the local directory.
5. Enter the full filepath to the directory created in step 3
Provide the full filepath from the root directory.
Ensure that the filepath is not wrapped in quotation marks (""). If you use Windows' "copy as path" feature, it will automatically include quotation marks.
6. Save the report as a PowerBI file (.pbix)
Once the filepath is entered, the template will automatically ingest and populate the report. You can then save this file as a new PowerBI report. If you'd like to generate another report, simply re-run the template file (.pbit) from step 4.
## Validation
After setting up your dashboard, you may want to validate that the Prowler csv files were ingested correctly. To do this, navigate to the "Configuration" tab.
The "Loaded CIS Benchmarks" table shows the supported benchmarks and versions. This is defined by the template file and not editable by the user. All benchmarks will be loaded regardless of which providers you provided csv outputs for.
The "Prowler CSV Folder" shows the path to the local directory you provided.
The "Loaded Prowler Exports" table shows the ingested csv files from the local directory. It will mark files that are treated as the latest assessment with a green checkmark.
![Prowler Validation](https://github.com/user-attachments/assets/a543ca9b-6cbe-4ad1-b32a-d4ac2163d447)
## Report Sections
The PowerBI Report is broken into three main report pages
| Report Page | Description |
| ----------- | ----------------------------------------------------------------------------------- |
| Overview    | Provides general CIS Benchmark overview across AWS, Azure, GCP, and Kubernetes       |
| Benchmark | Provides overview of a single CIS Benchmark |
| Requirement | Drill-through page to view details of a single requirement |
### Overview Page
The overview page is a general CIS Benchmark overview across AWS, Azure, GCP, and Kubernetes.
![image](https://github.com/user-attachments/assets/94164fa9-36a4-4bb9-890d-e9a9a63a3e7d)
The page has the following components:
| Component | Description |
| ---------------------------------------- | ------------------------------------------------------------------------ |
| CIS Benchmark Overview | Table with benchmark name, Version, and overall compliance percentage |
| Provider by Requirement Status | Bar chart showing benchmark requirements by status by provider |
| Compliance Percent Heatmap | Heatmap showing compliance percent by benchmark and profile level |
| Profile level by Requirement Status | Bar chart showing requirements by status and profile level |
| Compliance Percent Over Time by Provider | Line chart showing overall compliance percentage over time by provider    |
### Benchmark Page
The benchmark page provides an overview of a single CIS Benchmark. You can select the benchmark from the dropdown as well as scope down to specific profile levels or regions.
![image](https://github.com/user-attachments/assets/34498ee8-317b-4b81-b241-c561451d8def)
The page has the following components:
| Component | Description |
| --------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------ |
| Compliance Percent Heatmap | Heatmap showing compliance percent by region and profile level |
| Benchmark Section by Requirement Status | Bar chart showing benchmark requirements by benchmark section and status                                                                    |
| Compliance percent Over Time by Region | Line chart showing overall compliance percentage over time by region |
| Benchmark Requirements                  | Table showing requirement section, requirement number, requirement title, number of resources tested, status, and number of failing checks |
### Requirement Page
The requirement page is a drill-through page to view details of a single requirement. To populate the requirement page, right-click a requirement in the "Benchmark Requirements" table on the benchmark page and select "Drill through" -> "Requirement".
![image](https://github.com/user-attachments/assets/5c9172d9-56fe-4514-b341-7e708863fad6)
The requirement page has the following components:
| Component | Description |
| ------------------------------------------ | --------------------------------------------------------------------------------- |
| Title | Title of the requirement |
| Rationale | Rationale of the requirement |
| Remediation                                 | Remediation guidance for the requirement                                          |
| Region by Check Status | Bar chart showing Prowler checks by region and status |
| Resource Checks for Benchmark Requirements | Table showing Resource ID, Resource Name, Status, Description, and Prowler Check  |
## Walkthrough Video
[![image](https://github.com/user-attachments/assets/866642c6-43ac-4aac-83d3-bb625002da0b)](https://www.youtube.com/watch?v=lfKFkTqBxjU)
@@ -0,0 +1,23 @@
import warnings
from dashboard.common_methods import get_section_container_iso
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ATTRIBUTES_CATEGORY",
"REQUIREMENTS_ATTRIBUTES_OBJETIVE_ID",
"REQUIREMENTS_ATTRIBUTES_OBJETIVE_NAME",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
]
return get_section_container_iso(
aux, "REQUIREMENTS_ATTRIBUTES_CATEGORY", "REQUIREMENTS_ATTRIBUTES_OBJETIVE_ID"
)
+43
@@ -0,0 +1,43 @@
import warnings
from dashboard.common_methods import get_section_containers_3_levels
warnings.filterwarnings("ignore")
def get_table(data):
data["REQUIREMENTS_DESCRIPTION"] = (
data["REQUIREMENTS_ID"] + " - " + data["REQUIREMENTS_DESCRIPTION"]
)
data["REQUIREMENTS_DESCRIPTION"] = data["REQUIREMENTS_DESCRIPTION"].apply(
lambda x: x[:150] + "..." if len(str(x)) > 150 else x
)
data["REQUIREMENTS_ATTRIBUTES_SECTION"] = data[
"REQUIREMENTS_ATTRIBUTES_SECTION"
].apply(lambda x: x[:80] + "..." if len(str(x)) > 80 else x)
data["REQUIREMENTS_ATTRIBUTES_SUBSECTION"] = data[
"REQUIREMENTS_ATTRIBUTES_SUBSECTION"
].apply(lambda x: x[:150] + "..." if len(str(x)) > 150 else x)
aux = data[
[
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
]
return get_section_containers_3_levels(
aux,
"REQUIREMENTS_ATTRIBUTES_SECTION",
"REQUIREMENTS_ATTRIBUTES_SUBSECTION",
"REQUIREMENTS_DESCRIPTION",
)
+4 -1
@@ -4,7 +4,10 @@ from dash import html
def create_provider_card(
provider: str, provider_logo: str, account_type: str, filtered_data
provider: str,
provider_logo: str,
account_type: str,
filtered_data,
) -> List[html.Div]:
"""
Card to display the provider's name and icon.
+25
@@ -245,6 +245,31 @@ def create_service_dropdown(services: list) -> html.Div:
)
def create_provider_dropdown(providers: list) -> html.Div:
"""
Dropdown to select the provider.
Args:
providers (list): List of providers.
Returns:
html.Div: Dropdown to select the provider.
"""
return html.Div(
[
html.Label(
"Provider:", className="text-prowler-stone-900 font-bold text-sm"
),
dcc.Dropdown(
id="provider-filter",
options=[{"label": i, "value": i} for i in providers],
value=["All"],
clearable=False,
multi=True,
style={"color": "#000000"},
),
],
)
def create_status_dropdown(status: list) -> html.Div:
"""
Dropdown to select the status.
+5 -2
@@ -9,9 +9,11 @@ def create_layout_overview(
download_button_xlsx: html.Button,
severity_dropdown: html.Div,
service_dropdown: html.Div,
provider_dropdown: html.Div,
table_row_dropdown: html.Div,
status_dropdown: html.Div,
table_div_header: html.Div,
amount_providers: int,
) -> html.Div:
"""
Create the layout of the dashboard.
@@ -47,9 +49,10 @@ def create_layout_overview(
[
html.Div([severity_dropdown], className=""),
html.Div([service_dropdown], className=""),
html.Div([provider_dropdown], className=""),
html.Div([status_dropdown], className=""),
],
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-3",
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-4",
),
html.Div(
[
@@ -59,7 +62,7 @@ def create_layout_overview(
html.Div(className="flex", id="k8s_card", n_clicks=0),
html.Div(className="flex", id="m365_card", n_clicks=0),
],
className="grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-5",
className=f"grid gap-x-4 mb-[30px] sm:grid-cols-2 lg:grid-cols-{amount_providers}",
),
html.H4(
"Count of Findings by severity",
+13 -20
@@ -346,34 +346,27 @@ def display_data(
if item == "nan" or item.__class__.__name__ != "str":
region_filter_options.remove(item)
# Convert ASSESSMENTDATE to datetime
data["ASSESSMENTDATE"] = pd.to_datetime(data["ASSESSMENTDATE"], errors="coerce")
data["ASSESSMENTDATE"] = data["ASSESSMENTDATE"].dt.strftime("%Y-%m-%d %H:%M:%S")
data["ASSESSMENTDAY"] = data["ASSESSMENTDATE"].dt.date
# Choosing the date that is the most recent
data_values = data["ASSESSMENTDATE"].unique()
data_values.sort()
data_values = data_values[::-1]
aux = []
# Find the latest timestamp per account per day
latest_per_account_day = data.groupby(["ACCOUNTID", "ASSESSMENTDAY"])[
"ASSESSMENTDATE"
].transform("max")
data_values = [str(i) for i in data_values]
for value in data_values:
if value.split(" ")[0] not in [aux[i].split(" ")[0] for i in range(len(aux))]:
aux.append(value)
data_values = [str(i) for i in aux]
# Keep only rows with the latest timestamp for each account and day
data = data[data["ASSESSMENTDATE"] == latest_per_account_day]
data = data[data["ASSESSMENTDATE"].isin(data_values)]
data["ASSESSMENTDATE"] = data["ASSESSMENTDATE"].apply(lambda x: x.split(" ")[0])
# Prepare the date filter options (unique days, as strings)
options_date = sorted(data["ASSESSMENTDAY"].astype(str).unique(), reverse=True)
options_date = data["ASSESSMENTDATE"].unique()
options_date.sort()
options_date = options_date[::-1]
# Filter DATE
# Filter by selected date (as string)
if date_filter_analytics in options_date:
data = data[data["ASSESSMENTDATE"] == date_filter_analytics]
data = data[data["ASSESSMENTDAY"].astype(str) == date_filter_analytics]
else:
date_filter_analytics = options_date[0]
data = data[data["ASSESSMENTDATE"] == date_filter_analytics]
data = data[data["ASSESSMENTDAY"].astype(str) == date_filter_analytics]
if data.empty:
fig = px.pie()
+87 -35
@@ -38,6 +38,7 @@ from dashboard.lib.cards import create_provider_card
from dashboard.lib.dropdowns import (
create_account_dropdown,
create_date_dropdown,
create_provider_dropdown,
create_region_dropdown,
create_service_dropdown,
create_severity_dropdown,
@@ -83,7 +84,18 @@ def load_csv_files(csv_files):
"""Load CSV files into a single pandas DataFrame."""
dfs = []
for file in csv_files:
df = pd.read_csv(file, sep=";", on_bad_lines="skip")
account_columns = ["ACCOUNT_ID", "ACCOUNT_UID", "SUBSCRIPTION"]
df_sample = pd.read_csv(file, sep=";", on_bad_lines="skip", nrows=1)
dtype_dict = {}
for col in account_columns:
if col in df_sample.columns:
dtype_dict[col] = str
# Read the full file with proper dtypes
df = pd.read_csv(file, sep=";", on_bad_lines="skip", dtype=dtype_dict)
if "CHECK_ID" in df.columns:
if "TIMESTAMP" in df.columns or df["PROVIDER"].unique() == "aws":
dfs.append(df.astype(str))
@@ -120,7 +132,6 @@ if data is None:
]
)
else:
# This handles the case where we are using v3 outputs
if "ASSESSMENT_START_TIME" in data.columns:
data["ASSESSMENT_START_TIME"] = data["ASSESSMENT_START_TIME"].str.replace(
@@ -288,6 +299,13 @@ else:
service_dropdown = create_service_dropdown(services)
# Provider Dropdown
providers = ["All"] + list(data["PROVIDER"].unique())
providers = [
x for x in providers if str(x) != "nan" and x.__class__.__name__ == "str"
]
provider_dropdown = create_provider_dropdown(providers)
# Create the download button
download_button_csv = html.Button(
"Download this table as CSV",
@@ -469,9 +487,11 @@ else:
download_button_xlsx,
severity_dropdown,
service_dropdown,
provider_dropdown,
table_row_dropdown,
status_dropdown,
table_div_header,
len(data["PROVIDER"].unique()),
)
@@ -498,6 +518,8 @@ else:
Output("severity-filter", "value"),
Output("severity-filter", "options"),
Output("service-filter", "value"),
Output("provider-filter", "value"),
Output("provider-filter", "options"),
Output("service-filter", "options"),
Output("table-rows", "value"),
Output("table-rows", "options"),
@@ -516,6 +538,7 @@ else:
Input("download_link_xlsx", "n_clicks"),
Input("severity-filter", "value"),
Input("service-filter", "value"),
Input("provider-filter", "value"),
Input("table-rows", "value"),
Input("status-filter", "value"),
Input("search-input", "value"),
@@ -539,6 +562,7 @@ def filter_data(
n_clicks_xlsx,
severity_values,
service_values,
provider_values,
table_row_values,
status_values,
search_value,
@@ -864,6 +888,25 @@ def filter_data(
filtered_data["SERVICE_NAME"].isin(updated_service_values)
]
provider_filter_options = ["All"] + list(filtered_data["PROVIDER"].unique())
# Filter Provider
if provider_values == ["All"]:
updated_provider_values = filtered_data["PROVIDER"].unique()
elif "All" in provider_values and len(provider_values) > 1:
# Remove 'All' from the list
provider_values.remove("All")
updated_provider_values = provider_values
elif len(provider_values) == 0:
updated_provider_values = filtered_data["PROVIDER"].unique()
provider_values = ["All"]
else:
updated_provider_values = provider_values
filtered_data = filtered_data[
filtered_data["PROVIDER"].isin(updated_provider_values)
]
# Filter Status
if status_values == ["All"]:
updated_status_values = filtered_data["STATUS"].unique()
@@ -1084,25 +1127,17 @@ def filter_data(
table_row_options = []
# Take the values from the table_row_values
# Calculate table row options as percentages
percentages = [0.05, 0.10, 0.25, 0.50, 0.75, 1.0]
total_rows = len(filtered_data)
for pct in percentages:
value = max(1, int(total_rows * pct))
label = f"{int(pct * 100)}%"
table_row_options.append({"label": label, "value": value})
# Default to 25% if not set
if table_row_values is None or table_row_values == -1:
if len(filtered_data) < 25:
table_row_values = len(filtered_data)
else:
table_row_values = 25
if len(filtered_data) < 25:
table_row_values = len(filtered_data)
if len(filtered_data) >= 25:
table_row_options.append(25)
if len(filtered_data) >= 50:
table_row_options.append(50)
if len(filtered_data) >= 75:
table_row_options.append(75)
if len(filtered_data) >= 100:
table_row_options.append(100)
table_row_options.append(len(filtered_data))
table_row_values = table_row_options[0]["value"]
# For the values that are nan or none, replace them with ""
filtered_data = filtered_data.replace({np.nan: ""})
@@ -1337,21 +1372,36 @@ def filter_data(
]
# Create Provider Cards
aws_card = create_provider_card(
"aws", aws_provider_logo, "Accounts", full_filtered_data
)
azure_card = create_provider_card(
"azure", azure_provider_logo, "Subscriptions", full_filtered_data
)
gcp_card = create_provider_card(
"gcp", gcp_provider_logo, "Projects", full_filtered_data
)
k8s_card = create_provider_card(
"kubernetes", ks8_provider_logo, "Clusters", full_filtered_data
)
m365_card = create_provider_card(
"m365", m365_provider_logo, "Accounts", full_filtered_data
)
if "aws" in list(data["PROVIDER"].unique()):
aws_card = create_provider_card(
"aws", aws_provider_logo, "Accounts", full_filtered_data
)
else:
aws_card = None
if "azure" in list(data["PROVIDER"].unique()):
azure_card = create_provider_card(
"azure", azure_provider_logo, "Subscriptions", full_filtered_data
)
else:
azure_card = None
if "gcp" in list(data["PROVIDER"].unique()):
gcp_card = create_provider_card(
"gcp", gcp_provider_logo, "Projects", full_filtered_data
)
else:
gcp_card = None
if "kubernetes" in list(data["PROVIDER"].unique()):
k8s_card = create_provider_card(
"kubernetes", ks8_provider_logo, "Clusters", full_filtered_data
)
else:
k8s_card = None
if "m365" in list(data["PROVIDER"].unique()):
m365_card = create_provider_card(
"m365", m365_provider_logo, "Accounts", full_filtered_data
)
else:
m365_card = None
# Subscribe to Prowler Cloud card
subscribe_card = [
@@ -1435,6 +1485,8 @@ def filter_data(
severity_values,
severity_filter_options,
service_values,
provider_values,
provider_filter_options,
service_filter_options,
table_row_values,
table_row_options,
+122
@@ -0,0 +1,122 @@
# AWS Provider
In this page you can find all the details about the [Amazon Web Services (AWS)](https://aws.amazon.com/) provider implementation in Prowler.
By default, Prowler audits a single account and its organization settings per scan. To configure it, follow the [getting started](../index.md#aws) page.
## AWS Provider Classes Architecture
The AWS provider implementation follows the general [Provider structure](./provider.md). This section focuses on the AWS-specific implementation, highlighting how the generic provider concepts are realized for AWS in Prowler. For a full overview of the provider pattern, base classes, and extension guidelines, see the [Provider documentation](./provider.md). The next subsections describe the main classes of the AWS provider.
### `AwsProvider` (Main Class)
- **Location:** [`prowler/providers/aws/aws_provider.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/aws_provider.py)
- **Base Class:** Inherits from `Provider` (see [base class details](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/common/provider.py)).
- **Purpose:** Central orchestrator for AWS-specific logic, session management, credential validation, role assumption, region and organization discovery, and configuration.
- **Key AWS Responsibilities:**
- Initializes and manages AWS sessions (with or without role assumption, MFA, etc.).
- Validates credentials and sets up the AWS identity context.
- Loads and manages configuration, mutelist, and fixer settings.
- Discovers enabled AWS regions and organization metadata.
- Provides properties and methods for downstream AWS service classes to access session, identity, and configuration data.
### Data Models
- **Location:** [`prowler/providers/aws/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/models.py)
- **Purpose:** Define structured data for AWS identity, session, credentials, organization info, and more.
- **Key AWS Models:**
- `AWSOrganizationsInfo`: Holds AWS Organizations metadata, to be used by the checks.
- `AWSCredentials`, `AWSAssumeRoleInfo`, `AWSAssumeRoleConfiguration`: Used for role assumption and session management.
- `AWSIdentityInfo`: Stores account, user, partition, and region context for the scan.
- `AWSSession`: Wraps the current and original [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) sessions and config.
### `AWSService` (Service Base Class)
- **Location:** [`prowler/providers/aws/lib/service/service.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/service/service.py)
- **Purpose:** Abstract base class that all AWS service-specific classes inherit from. This implements the generic service pattern (described in [service page](./services.md#service-base-class)) specifically for AWS.
- **Key AWS Responsibilities:**
- Receives an `AwsProvider` instance to access session, identity, and configuration.
- Manages clients for all services by regions.
- Provides the `__threading_call__` method to make boto3 calls in parallel. By default, these calls are made per region, but an iterable of resources can be passed instead to parallelize per resource.
- Exposes common audit context (`audited_account`, `audited_account_arn`, `audited_partition`, `audited_resources`) to subclasses.
### Exception Handling
- **Location:** [`prowler/providers/aws/exceptions/exceptions.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/exceptions/exceptions.py)
- **Purpose:** Custom exception classes for AWS-specific error handling, such as credential and role errors.
### Session and Utility Helpers
- **Location:** [`prowler/providers/aws/lib/`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/)
- **Purpose:** Helpers for session setup, ARN parsing, mutelist management, and other cross-cutting concerns.
## Specific Patterns in AWS Services
The generic service pattern is described in the [service page](./services.md#service-structure-and-initialisation). You can find all the currently implemented services in the following locations:
- Directly in the code, in location [`prowler/providers/aws/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/aws/services)
- In the [Prowler Hub](https://hub.prowler.com/), for a more human-readable view.
The best reference for implementing a new service is the [service implementation documentation](./services.md#adding-a-new-service), together with existing services. The next subsection lists common patterns used across all AWS services, followed by a minimal sketch that combines them.
### AWS Service Common Patterns
- Services communicate with AWS using boto3; you can find the documentation for all the services [here](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/index.html).
- Every AWS service class inherits from `AWSService`, ensuring access to session, identity, configuration, and threading utilities.
- The constructor (`__init__`) always calls `super().__init__` with the service name and provider (e.g. `super().__init__(__class__.__name__, provider)`). Ensure that the service name used in the constructor matches the boto3 service name. `__class__.__name__` is usually used because the class is named after the service.
- Resource containers **must** be initialized in the constructor. They should be dictionaries, with the key being the resource ARN or equivalent unique identifier and the value being the resource object.
- Resource discovery and attribute collection are parallelized using `self.__threading_call__`, typically by region or resource, for performance. If no iterator is provided, the call is made once per region; passing an array of resources makes the call once per resource instead.
- Resource filtering is consistently enforced using the `self.audit_resources` attribute and the `is_resource_filtered` function, which check whether the user has scoped the audit to specific resources so that out-of-scope resources can be skipped in the service logic. It is normally applied before storing the resource in the service container, as follows: `if not self.audit_resources or (is_resource_filtered(resource["arn"], self.audit_resources)):`.
- All AWS resources are represented as Pydantic `BaseModel` classes, providing type safety and structured access to resource attributes.
- AWS API calls are wrapped in try/except blocks, with specific handling for `ClientError` and generic exceptions, always logging errors.
- If an ARN is not available for a resource, it can be constructed using string interpolation, always including partition, service, region, account, and resource ID.
- Tags and additional attributes that cannot be retrieved from the default call should be collected and stored for each resource using dedicated methods, threaded with the resource object list as the iterator.
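Putting these patterns together, below is a condensed, hypothetical sketch. `WidgetService`, the `Widget` model, and the `list_widgets` call are illustrative only (there is no such AWS service), and details such as the `regional_client.region` attribute and the import paths mirror existing services but should be verified against [`prowler/providers/aws/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/aws/services).
```python
from typing import Optional

from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.lib.service.service import AWSService


class Widget(BaseModel):
    """Illustrative resource model; Pydantic gives type safety."""

    arn: str
    name: str
    region: str
    encrypted: bool = False
    tags: Optional[list] = []


class WidgetService(AWSService):
    def __init__(self, provider):
        # The name passed here must match the boto3 service name;
        # "widgetservice" is hypothetical.
        super().__init__("widgetservice", provider)
        # Resource container keyed by ARN
        self.widgets = {}
        # Parallel discovery; iterates per region by default
        self.__threading_call__(self._list_widgets)

    def _list_widgets(self, regional_client):
        try:
            for item in regional_client.list_widgets()["Widgets"]:
                arn = item["Arn"]
                # Skip resources outside the user-provided audit scope
                if not self.audit_resources or is_resource_filtered(
                    arn, self.audit_resources
                ):
                    self.widgets[arn] = Widget(
                        arn=arn,
                        name=item["Name"],
                        region=regional_client.region,
                        encrypted=item.get("Encrypted", False),
                        tags=item.get("Tags", []),
                    )
        # Real services also handle botocore's ClientError specifically
        except Exception as error:
            logger.error(
                f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
```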
## Specific Patterns in AWS Checks
The AWS checks pattern is described in the [checks page](./checks.md). You can find all the currently implemented checks:
- Directly in the code, within each service folder, where each check has its own folder named after the check (e.g. [`prowler/providers/aws/services/s3/s3_bucket_acl_prohibited/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/aws/services/s3/s3_bucket_acl_prohibited)).
- In the [Prowler Hub](https://hub.prowler.com/), for a more human-readable view.
The best reference for implementing a new check is the [check creation documentation](./checks.md#creating-a-check), together with similar existing checks.
### Check Report Class
The `Check_Report_AWS` class models a single finding for an AWS resource in a check report. It is defined in [`prowler/lib/check/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py) and inherits from the generic `Check_Report` base class.
#### Purpose
`Check_Report_AWS` extends the base report structure with AWS-specific fields, enabling detailed tracking of the resource, ARN, and region associated with each finding.
#### Constructor and Attribute Population
When you instantiate `Check_Report_AWS`, you must provide the check metadata and a resource object. The class will attempt to automatically populate its AWS-specific attributes from the resource, using the following logic (in order of precedence):
- **`resource_id`**:
- Uses `resource.id` if present.
- Otherwise, uses `resource.name` if present.
- Defaults to an empty string if none are available.
- **`resource_arn`**:
- Uses `resource.arn` if present.
- Defaults to an empty string if ARN is not present in the resource object.
- **`region`**:
- Uses `resource.region` if present.
- Defaults to an empty string if region is not present in the resource object.
If the resource object does not contain the required attributes, you must set them manually in the check logic.
Other attributes are inherited from the `Check_Report` class; of those, you **always** have to set the `status` and `status_extended` attributes in the check logic.
#### Example Usage
```python
report = Check_Report_AWS(
metadata=check_metadata,
resource=resource_object
)
report.status = "PASS"
report.status_extended = "Resource is compliant."
```
+121
@@ -0,0 +1,121 @@
# Azure Provider
In this page you can find all the details about the [Microsoft Azure](https://azure.microsoft.com/) provider implementation in Prowler.
By default, Prowler audits all the subscriptions it is able to list in the Microsoft Entra tenant, plus the tenant's Entra ID service. To configure it, follow the [getting started](../index.md#azure) page.
## Azure Provider Classes Architecture
The Azure provider implementation follows the general [Provider structure](./provider.md). This section focuses on the Azure-specific implementation, highlighting how the generic provider concepts are realized for Azure in Prowler. For a full overview of the provider pattern, base classes, and extension guidelines, see the [Provider documentation](./provider.md). The next subsections describe the main classes of the Azure provider.
### `AzureProvider` (Main Class)
- **Location:** [`prowler/providers/azure/azure_provider.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/azure_provider.py)
- **Base Class:** Inherits from `Provider` (see [base class details](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/common/provider.py)).
- **Purpose:** Central orchestrator for Azure-specific logic, session management, credential validation, and configuration.
- **Key Azure Responsibilities:**
- Initializes and manages Azure sessions (supports Service Principal, CLI, Browser, and Managed Identity authentication).
- Validates credentials and sets up the Azure identity context.
- Loads and manages configuration, mutelist, and fixer settings.
- Retrieves subscription(s) metadata.
- Provides properties and methods for downstream Azure service classes to access session, identity, and configuration data.
### Data Models
- **Location:** [`prowler/providers/azure/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/models.py)
- **Purpose:** Define structured data for Azure identity, session, region configuration, and subscription info.
- **Key Azure Models:**
- `AzureIdentityInfo`: Holds Azure identity metadata, including tenant ID, domain, subscription names and IDs, and locations.
- `AzureRegionConfig`: Stores the specific region that will be audited: Global, US Government, or China.
- `AzureSubscription`: Represents a subscription with ID, display name, and state.
### `AzureService` (Service Base Class)
- **Location:** [`prowler/providers/azure/lib/service/service.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/service/service.py)
- **Purpose:** Abstract base class that all Azure service-specific classes inherit from. This implements the generic service pattern (described in [service page](./services.md#service-base-class)) specifically for Azure.
- **Key Azure Responsibilities:**
- Receives an `AzureProvider` instance to access session, identity, and configuration.
- Manages clients for all services by subscription.
- Exposes common audit context (`subscriptions`, `locations`, `audit_config`, `fixer_config`) to subclasses.
### Exception Handling
- **Location:** [`prowler/providers/azure/exceptions/exceptions.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/exceptions/exceptions.py)
- **Purpose:** Custom exception classes for Azure-specific error handling, such as credential, region, and session errors.
### Session and Utility Helpers
- **Location:** [`prowler/providers/azure/lib/`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/)
- **Purpose:** Helpers for argument parsing, region setup, mutelist management, and other cross-cutting concerns.
## Specific Patterns in Azure Services
The generic service pattern is described in [service page](./services.md#service-structure-and-initialisation). You can find all the currently implemented services in the following locations:
- Directly in the code, in location [`prowler/providers/azure/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/azure/services)
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference for implementing a new service is the [service implementation documentation](./services.md#adding-a-new-service), together with existing services. The next subsection lists common patterns used across all Azure services, followed by a minimal sketch that combines them.
### Azure Service Common Patterns
- Services communicate with Azure using the Azure Python SDK, mainly through the Azure Management Clients (except for the Microsoft Entra ID service, which uses the Microsoft Graph API); you can find the documentation for all the management services [here](https://learn.microsoft.com/en-us/python/api/overview/azure/?view=azure-python).
- Every Azure service class inherits from `AzureService`, ensuring access to session, identity, configuration, and client utilities.
- The constructor (`__init__`) always calls `super().__init__` with the service's Azure Management Client and the Prowler provider object (e.g. `super().__init__(WebSiteManagementClient, provider)`).
- Resource containers **must** be initialized in the constructor. They should be dictionaries keyed by subscription ID, whose values are dictionaries mapping resource IDs to resource objects.
- All Azure resources are represented as Pydantic `BaseModel` classes, providing type safety and structured access to resource attributes. Some are represented as dataclasses due to legacy reasons, but new resources should be represented as Pydantic `BaseModel` classes.
- Azure SDK functions are wrapped in try/except blocks, with specific error handling and logging. It is a best practice to create a custom function for every Azure SDK call; that way, errors can be handled more specifically.
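As a condensed illustration of these patterns, below is a hypothetical sketch loosely modeled on a web-apps service, built on the `WebSiteManagementClient` used in the constructor example above. The `Site` model and the exact SDK calls are simplified assumptions, and the sketch assumes `AzureService` exposes a `self.clients` dictionary mapping each subscription to its management client; verify both against the services in [`prowler/providers/azure/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/azure/services).
```python
from azure.mgmt.web import WebSiteManagementClient
from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.providers.azure.lib.service.service import AzureService


class Site(BaseModel):
    """Illustrative resource model."""

    id: str
    name: str
    location: str


class App(AzureService):
    def __init__(self, provider):
        # Pass the Azure Management Client class and the Prowler provider
        super().__init__(WebSiteManagementClient, provider)
        # Container: subscription ID -> {resource ID: resource model}
        self.sites = self._get_sites()

    def _get_sites(self):
        sites = {}
        # One management client is created per subscription
        for subscription, client in self.clients.items():
            sites[subscription] = {}
            try:
                # A dedicated helper per SDK call keeps error handling specific
                for site in client.web_apps.list():
                    sites[subscription][site.id] = Site(
                        id=site.id, name=site.name, location=site.location
                    )
            except Exception as error:
                logger.error(
                    f"Subscription name: {subscription} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
                )
        return sites
```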
## Specific Patterns in Azure Checks
The Azure checks pattern is described in [checks page](./checks.md). You can find all the currently implemented checks:
- Directly in the code, within each service folder, where each check has its own folder named after the check (e.g. [`prowler/providers/azure/services/storage/storage_blob_public_access_level_is_disabled/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/azure/services/storage/storage_blob_public_access_level_is_disabled)).
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference for implementing a new check is the [Azure check implementation documentation](./checks.md#creating-a-check), together with similar existing checks.
### Check Report Class
The `Check_Report_Azure` class models a single finding for an Azure resource in a check report. It is defined in [`prowler/lib/check/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py) and inherits from the generic `Check_Report` base class.
#### Purpose
`Check_Report_Azure` extends the base report structure with Azure-specific fields, enabling detailed tracking of the resource, resource ID, name, subscription, and location associated with each finding.
#### Constructor and Attribute Population
When you instantiate `Check_Report_Azure`, you must provide the check metadata and a resource object. The class will attempt to automatically populate its Azure-specific attributes from the resource, using the following logic (in order of precedence):
- **`resource_id`**:
- Uses `resource.id` if present.
- Otherwise, uses `resource.resource_id` if present.
- Defaults to an empty string if not available.
- **`resource_name`**:
- Uses `resource.name` if present.
- Otherwise, uses `resource.resource_name` if present.
- Defaults to an empty string if not available.
- **`subscription`**:
- Defaults to an empty string, it **must** be set in the check logic.
- **`location`**:
- Uses `resource.location` if present.
- Defaults to an empty string if not available.
If the resource object does not contain the required attributes, you must set them manually in the check logic.
Other attributes are inherited from the `Check_Report` class, from which you **always** have to set the `status` and `status_extended` attributes in the check logic.
#### Example Usage
```python
report = Check_Report_Azure(
metadata=check_metadata,
resource=resource_object
)
report.subscription = subscription_id
report.status = "PASS"
report.status_extended = "Resource is compliant."
```
+222 -278
@@ -1,370 +1,314 @@
# Create a new Check for a Provider
# Prowler Checks
Here you can find how to create new checks for Prowler.
**Creating a check requires an existing Prowler provider service, so if the service is not present, or the attribute you want to audit is not retrieved by the service, please refer to the [Service](./services.md) documentation.**
This guide explains how to create new checks in Prowler.
## Introduction
Checks are the fundamental piece of Prowler. A check is simply a piece of code that verifies whether something is configured in line with cybersecurity best practices. The check then generates a finding with the result and includes the check's metadata to give the user more contextual information about the result, the risk, and how to remediate it.
Checks are the core component of Prowler. A check is a piece of code designed to validate whether a configuration aligns with cybersecurity best practices. Execution of a check yields a finding, which includes the result and contextual metadata (e.g., outcome, risks, remediation).
To create a new check for a supported Prowler provider, you will need to create a folder with the check name inside the specific service for the selected provider.
### Creating a Check
We are going to use the `ec2_ami_public` check from the `AWS` provider as an example. So the folder name will be `prowler/providers/aws/services/ec2/ec2_ami_public` (following the format `prowler/providers/<provider>/services/<service>/<check_name>`), with the name of check following the pattern: `service_subservice_resource_action`.
To create a new check:
- Prerequisites: A Prowler provider and service must exist. Verify support and check for pre-existing checks via [Prowler Hub](https://hub.prowler.com). If the provider or service is not present, please refer to the [Provider](./provider.md) and [Service](./services.md) documentation for creation instructions.
- Navigate to the service directory. The path should be as follows: `prowler/providers/<provider>/services/<service>`.
- Create a check-specific folder. The path should follow this pattern: `prowler/providers/<provider>/services/<service>/<check_name>`. Adhere to the [Naming Format for Checks](#naming-format-for-checks).
- Populate the folder with files as specified in [File Creation](#file-creation).
### Naming Format for Checks
Checks must be named following the format: `service_subservice_resource_action`.
The name components are:
- `service`: The main service being audited (e.g., ec2, entra, iam).
- `subservice`: An individual component or subset of functionality within the service that is being audited. This may correspond to a shortened version of the class attribute accessed within the check. If there is no subservice, omit this component.
- `resource`: The specific resource type being evaluated (e.g., instance, policy, role).
- `action`: The security aspect or configuration being checked (e.g., public, encrypted, enabled).
For example, in `ec2_ami_public` the service is `ec2`, the resource is `ami`, and the action is `public` (there is no subservice).
### File Creation
Each check in Prowler follows a straightforward structure. Within the newly created folder, three files must be added to implement the check logic (see the example layout after this list):
- `__init__.py` (empty file): Ensures Python treats the check folder as a package.
- `<check_name>.py` (code file): Contains the check logic, following the prescribed format. Please refer to [Prowler's check code structure](./checks.md#prowlers-check-code-structure) for more information.
- `<check_name>.metadata.json` (metadata file): Defines the check's metadata for contextual information. Please refer to the [check metadata](./checks.md#check-metadata) for more information.
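For example, a check named `<check_name>` inside `<service>` for `<provider>` ends up with the following layout:
```
prowler/providers/<provider>/services/<service>/<check_name>/
├── __init__.py
├── <check_name>.py
└── <check_name>.metadata.json
```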
## Prowler's Check Code Structure
Prowler's check structure is designed for clarity and maintainability. It follows a dynamic loading approach based on predefined paths, ensuring seamless integration of new checks into a provider's service without additional manual steps.
Below, the code for a generic check is presented. It is strongly recommended to consult other checks from the same provider and service to understand provider-specific details and patterns. This will help ensure consistency and proper implementation of provider-specific requirements.
Report fields are the most dependent on the provider, consult the `CheckReport<Provider>` class for more information on what can be included in the report [here](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py).
???+ note
A subservice is a specific component of a service that is going to be audited. Sometimes it is the shortened name of the class attribute that is accessed in the check.
Legacy providers (AWS, Azure, GCP, Kubernetes) follow the `Check_Report_<Provider>` naming convention. This convention is not recommended for new checks. Newer providers adopt the `CheckReport<Provider>` naming convention. Learn more at [Prowler Code](https://github.com/prowler-cloud/prowler/tree/master/prowler/lib/check/models.py).
```python title="Generic Check Class"
# Required Imports
# Import the base Check class and the provider-specific CheckReport class
from prowler.lib.check.models import Check, CheckReport<Provider>
# Import the provider service client
from prowler.providers.<provider>.services.<service>.<service>_client import <service>_client
# Defining the Check Class
# Each check must be implemented as a Python class with the same name as its corresponding file.
# The class must inherit from the Check base class.
class <check_name>(Check):
"""Short description of what is being checked"""
    # Each check must implement the execute() method, which is enforced by
    # the Check base class and is what Prowler runs to execute the check.
    def execute(self):
        """Execute <check short description>

        Returns:
            List[CheckReport<Provider>]: A list of reports containing the result of the check.
        """
        # Initialize the list of findings
        findings = []
        # Iterate over the target resources using the provider service client
        for resource in <service>_client.<resources>:
            # Initialize the provider-specific report class, passing metadata and resource
            report = CheckReport<Provider>(metadata=self.metadata(), resource=resource)
            # Set required fields and implement check logic
            report.status = "PASS"
            report.status_extended = f"<Description about why the resource is compliant>"
            # If some of the information needed for the report is not inside the resource,
            # it can be set manually here. This depends on the provider and the resource
            # that is being audited.
            # report.region = resource.region
            # report.resource_tags = getattr(resource, "tags", [])
            # ...
            # Example check logic (replace with actual logic):
            if <non_compliant_condition>:
                report.status = "FAIL"
                report.status_extended = f"<Description about why the resource is not compliant>"
            # Append the report to the findings list
            findings.append(report)
        # Finally, return the findings list to Prowler
        return findings
```
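As a concrete example, here is the logic of the `ec2_ami_public` check from the AWS provider (note that it uses the legacy `Check_Report_AWS` report class):

```python title="ec2_ami_public.py"
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


class ec2_ami_public(Check):
    """ec2_ami_public verifies if an EC2 AMI is publicly shared"""

    def execute(self):
        findings = []
        # Iterate over the EC2 AMIs stored by the service in "ec2_client.images"
        for image in ec2_client.images:
            # Initialize the report, passing the check's metadata
            report = Check_Report_AWS(self.metadata())
            report.region = image.region
            report.resource_id = image.id
            report.resource_arn = image.arn
            # Fill resource_tags only if the resource supports tags
            report.resource_tags = image.tags
            # Each "image" object exposes a boolean "public" attribute
            # set by the EC2 service
            report.status = "PASS"
            report.status_extended = f"EC2 AMI {image.id} is not public."
            if image.public:
                report.status = "FAIL"
                report.status_extended = f"EC2 AMI {image.id} is currently public."
            findings.append(report)
        return findings
```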
### Data Requirements for Checks in Prowler
One of the most important aspects when creating a new check is ensuring that all required data is available from the service client. Often, default API calls are insufficient. Extending the service class with new methods or resource attributes may be required to fetch and store requisite data.
### Check Region
Each check **must** populate the `report.region` field according to the following criteria:

- If the audited resource is regional, use the `region` attribute within the resource object (the attribute name changes depending on the provider: `location` in Azure and GCP, `namespace` in Kubernetes).
- If the audited resource is global, use the `service_client.region` attribute within the service client object.
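For instance, in an AWS check:

```python
# Regional resource: take the region from the resource object
report.region = image.region

# Global resource: take the region from the service client
report.region = ec2_client.region
```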
### Statuses for Checks in Prowler
Required fields: `status` and `status_extended`.
Each check **must** populate the `report.status` and `report.status_extended` fields according to the following criteria:
- Status field: `report.status`
- `PASS` Assigned when the check confirms compliance with the configured value.
- `FAIL` Assigned when the check detects non-compliance with the configured value.
- `MANUAL` This status must not be used unless manual verification is necessary to determine whether the status (`report.status`) passes (`PASS`) or fails (`FAIL`).
- Status extended field: `report.status_extended`
- It **must** end with a period (`.`).
- It **must** include the audited service, the resource, and a concise explanation of the check result, for instance: `EC2 AMI ami-0123456789 is not public.`.
### Prowler's Check Severity Levels
The severity of each check is defined in the metadata file using the `Severity` field. Severity values are always lowercase and must be one of the predefined categories below.
- `critical` Issue that must be addressed immediately.
- `high` Issue that should be addressed as soon as possible.
- `medium` Issue that should be addressed within a reasonable timeframe.
- `low` Issue that can be addressed in the future.
- `informational` Not an issue but provides valuable information.
If the check involves multiple scenarios that may alter its severity, adjustments can be made dynamically within the check's logic using the `report.check_metadata.Severity` attribute:
```python
if <generic_condition_1>:
    report.status = "PASS"
    report.check_metadata.Severity = "informational"
    report.status_extended = f"<Resource> is compliant with <requirement>."
elif <generic_condition_2>:
    report.status = "FAIL"
    report.check_metadata.Severity = "low"
    report.status_extended = f"<Resource> is not compliant with <requirement>: <reason>."
elif <generic_condition_3>:
    report.status = "FAIL"
    report.check_metadata.Severity = "medium"
    report.status_extended = f"<Resource> is not compliant with <requirement>: <reason>."
elif <generic_condition_4>:
    report.status = "FAIL"
    report.check_metadata.Severity = "high"
    report.status_extended = f"<Resource> is not compliant with <requirement>: <reason>."
else:
    report.status = "FAIL"
    report.check_metadata.Severity = "critical"
    report.status_extended = f"<Resource> is not compliant with <requirement>: <critical reason>."
```
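As a concrete sketch of this pattern, an RDS certificate-validity check could grade severity by the remaining lifetime (placeholder conditions; `db_instance` is assumed to come from the RDS service client):

```python
if <valid for more than 6 months>:
    report.status = "PASS"
    report.check_metadata.Severity = "informational"
    report.status_extended = f"RDS Instance {db_instance.id} certificate has over 6 months of validity left."
elif <valid for more than 3 months>:
    report.status = "FAIL"
    report.check_metadata.Severity = "low"
    report.status_extended = f"RDS Instance {db_instance.id} certificate has between 3 and 6 months of validity."
elif <valid for more than 1 month>:
    report.status = "FAIL"
    report.check_metadata.Severity = "medium"
    report.status_extended = f"RDS Instance {db_instance.id} certificate has less than 3 months of validity."
elif <valid for less than 1 month>:
    report.status = "FAIL"
    report.check_metadata.Severity = "high"
    report.status_extended = f"RDS Instance {db_instance.id} certificate has less than 1 month of validity."
else:
    report.status = "FAIL"
    report.check_metadata.Severity = "critical"
    report.status_extended = f"RDS Instance {db_instance.id} certificate has expired."
```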
### Resource Identification in Prowler
Each check **must** populate the report with a unique identifier for the audited resource. The identifier or identifiers depend on the provider and the resource being audited. The criteria for each provider are:
- AWS
    - Amazon Resource ID — `report.resource_id`.
        - The resource identifier. This is the name of the resource, the ID of the resource, or a resource path. Some resource identifiers include a parent resource (sub-resource-type/parent-resource/sub-resource) or a qualifier such as a version (resource-type:resource-name:qualifier).
        - If the resource ID cannot be retrieved directly from the audited resource, it can be extracted from the ARN: it is the last part of the ARN after the last slash (`/`) or colon (`:`).
        - If the audited resource is the AWS account itself, use the AWS account number.
        - If no actual resource to audit exists, this format can be used: `<resource_type>/unknown`.
    - Amazon Resource Name — `report.resource_arn`.
        - The [Amazon Resource Name (ARN)](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html) of the audited entity.
        - If the audited resource is the AWS account itself, use the AWS account root ARN.
        - If the ARN cannot be retrieved directly from the audited resource, construct a valid ARN using the `resource_id` component as the audited entity. Examples:
            - Bedrock — `arn:<partition>:bedrock:<region>:<account-id>:model-invocation-logging`.
            - DirectConnect — `arn:<partition>:directconnect:<region>:<account-id>:dxcon`.
        - If no actual resource to audit exists, this format can be used: `arn:<partition>:<service>:<region>:<account-id>:<resource_type>/unknown`. Examples:
            - AWS Security Hub — `arn:<partition>:security-hub:<region>:<account-id>:hub/unknown`.
            - Access Analyzer — `arn:<partition>:access-analyzer:<region>:<account-id>:analyzer/unknown`.
            - GuardDuty — `arn:<partition>:guardduty:<region>:<account-id>:detector/unknown`.
- GCP
- Resource ID — `report.resource_id`.
- Resource ID represents the full, [unambiguous path to a resource](https://google.aip.dev/122#full-resource-names), known as the full resource name. Typically, it follows the format: `//{api_service/resource_path}`.
- If the resource ID cannot be retrieved directly from the audited resource, by default the resource name is used.
- Resource Name — `report.resource_name`.
- Resource Name usually refers to the name of a resource within its service.
- Azure
- Resource ID — `report.resource_id`.
- Resource ID represents the full Azure Resource Manager path to a resource, which follows the format: `/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}`.
- Resource Name — `report.resource_name`.
- Resource Name usually refers to the name of a resource within its service.
- If the [resource name](https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/resource-name-rules) cannot be retrieved directly from the audited resource, the last part of the resource ID can be used.
- Kubernetes
- Resource ID — `report.resource_id`.
- The UID of the Kubernetes object. This is a system-generated string that uniquely identifies the object within the cluster for its entire lifetime. See [Kubernetes Object Names and IDs - UIDs](https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#uids).
- Resource Name — `report.resource_name`.
- The name of the Kubernetes object. This is a client-provided string that must be unique for the resource type within a namespace (for namespaced resources) or cluster (for cluster-scoped resources). Names typically follow DNS subdomain or label conventions. See [Kubernetes Object Names and IDs - Names](https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#names).
- M365
- Resource ID — `report.resource_id`.
- If the audited resource has a globally unique identifier such as a `guid`, use it as the `resource_id`.
- If no `guid` exists, use another unique and relevant identifier for the resource, such as the tenant domain, the internal policy ID, or a representative string following the format `<resource_type>/<name_or_id>`.
- Resource Name — `report.resource_name`.
- Use the visible or descriptive name of the audited resource. If no explicit name is available, use a clear description of the resource or configuration being evaluated.
- Examples:
- For an organization:
- `resource_id`: Organization GUID
- `resource_name`: Organization name
- For a policy:
- `resource_id`: Unique policy ID
- `resource_name`: Policy display name
- For global configurations:
- `resource_id`: Tenant domain or representative string (e.g., "userSettings")
- `resource_name`: Description of the configuration (e.g., "SharePoint Settings")
- GitHub
- Resource ID — `report.resource_id`.
    - The ID of the GitHub resource. This is a system-generated integer that uniquely identifies the resource within the GitHub platform.
- Resource Name — `report.resource_name`.
    - The name of the GitHub resource. In the case of a repository, this is just the repository name. For full repository names, use the resource's `full_name` attribute.
### Configurable Checks in Prowler
See [Configurable Checks](./configurable-checks.md) for detailed information on making checks configurable using the `audit_config` object and configuration file.
## Metadata Structure for Prowler Checks
Each Prowler check must include a metadata file named `<check_name>.metadata.json`, located in the check's directory. This file supplies crucial information for execution, reporting, and context.
### Example Metadata File
Below is a generic example of a check metadata file. **Do not include comments in actual JSON files.**
```json
{
    "Provider": "aws",
    "CheckID": "example_check_id",
    "CheckTitle": "Example Check Title",
    "CheckType": ["Infrastructure Security"],
    "ServiceName": "ec2",
    "SubServiceName": "ami",
    "ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
    "Severity": "critical",
    "ResourceType": "Other",
    "Description": "Example description of the check.",
    "Risk": "Example risk if the check fails.",
    "RelatedUrl": "https://example.com",
    "Remediation": {
        "Code": {
            "CLI": "example CLI command",
            "NativeIaC": "",
            "Other": "",
            "Terraform": ""
        },
        "Recommendation": {
            "Text": "Example recommendation text.",
            "Url": "https://example.com/remediation"
        }
    },
    "Categories": ["example-category"],
    "DependsOn": [],
    "RelatedTo": [],
    "Notes": ""
}
```
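For a filled-in reference, this is the metadata of the `ec2_ami_public` check used as the example above:

```json
{
    "Provider": "aws",
    "CheckID": "ec2_ami_public",
    "CheckTitle": "Ensure there are no EC2 AMIs set as Public.",
    "CheckType": ["Infrastructure Security"],
    "ServiceName": "ec2",
    "SubServiceName": "ami",
    "ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
    "Severity": "critical",
    "ResourceType": "Other",
    "Description": "Ensure there are no EC2 AMIs set as Public.",
    "Risk": "When your AMIs are publicly accessible, they are available in the Community AMIs where everyone with an AWS account can use them to launch EC2 instances. Your AMIs could contain snapshots of your applications (including their data), therefore exposing your snapshots in this manner is not advised.",
    "RelatedUrl": "",
    "Remediation": {
        "Code": {
            "CLI": "aws ec2 modify-image-attribute --region <REGION> --image-id <EC2_AMI_ID> --launch-permission {\"Remove\":[{\"Group\":\"all\"}]}",
            "NativeIaC": "",
            "Other": "https://docs.prowler.com/checks/public_8#aws-console",
            "Terraform": ""
        },
        "Recommendation": {
            "Text": "We recommend your EC2 AMIs are not publicly accessible, or generally available in the Community AMIs.",
            "Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/cancel-sharing-an-AMI.html"
        }
    },
    "Categories": ["internet-exposed"],
    "DependsOn": [],
    "RelatedTo": [],
    "Notes": ""
}
```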
### Metadata Fields and Their Purpose
- **Provider** — The Prowler provider related to the check. The name **must** be lowercase and match the provider folder name. For supported providers refer to [Prowler Hub](https://hub.prowler.com/check) or directly to [Prowler Code](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers).
- **CheckID** — The unique identifier for the check inside the provider, this field **must** match the check's folder and python file and json metadata file name. For more information about the naming refer to the [Naming Format for Checks](#naming-format-for-checks) section.
- **CheckTitle** — A concise, descriptive title for the check.
- **CheckType** — *For now this field is only standardized for the AWS provider*.
    - For AWS this field must follow the [AWS Security Hub Types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-required-attributes.html#Types) format. The common pattern to follow is `namespace/category/classifier`; refer to the linked documentation for the valid values for this field.
- **ServiceName** — The name of the provider service being audited. This field **must** be in lowercase and match with the service folder name. For supported services refer to [Prowler Hub](https://hub.prowler.com/check) or directly to [Prowler Code](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers).
- **SubServiceName** — The subservice or resource within the service, if applicable. For more information refer to the [Naming Format for Checks](#naming-format-for-checks) section.
- **ResourceIdTemplate** — A template for the unique resource identifier. For more information refer to the [Resource Identification in Prowler](#resource-identification-in-prowler) section.
- **Severity** — The severity of the finding if the check fails. Must be one of: `critical`, `high`, `medium`, `low`, or `informational`, this field **must** be in lowercase. To get more information about the severity levels refer to the [Prowler's Check Severity Levels](#prowlers-check-severity-levels) section.
- **ResourceType** — The type of resource being audited. *For now this field is only standardized for the AWS provider*.
    - For AWS use the [Security Hub resource types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-resources.html) or, if not available, the PascalCase version of the [CloudFormation type](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html) (e.g., `AWS::EC2::Instance` becomes `AwsEc2Instance`). Use `Other` if no match exists.
- **Description** — A short description of what the check does.
- **Risk** — The risk or impact if the check fails, explaining why the finding matters.
- **RelatedUrl** — A URL to official documentation or further reading about the check's purpose (e.g., https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/sharingamis-intro.html). If no official documentation is available, use the risk and recommendation text from trusted third-party sources such as [TrendMicro Cloud One Conformity](https://www.trendmicro.com/cloudoneconformity).
- **Remediation** — Guidance for fixing a failed check, including:
- **Code** — Remediation commands or code snippets for CLI, Terraform, native IaC, or other tools like the Web Console.
    - **Recommendation** — A textual, human-readable recommendation. It is not necessary to include actual steps here, but rather a general recommendation about how to fix the check.
- **Categories** — One or more categories for grouping checks in execution (e.g., `internet-exposed`). For the current list of categories, refer to the [Prowler Hub](https://hub.prowler.com/check).
- **DependsOn** — Currently not used.
- **RelatedTo** — Currently not used.
- **Notes** — Any additional information not covered by other fields.
### Remediation Code Guidelines
When providing remediation steps, reference the following sources:
- Official provider documentation.
- [Prowler Checks Remediation Index](https://docs.prowler.com/checks/checks-index)
- [TrendMicro Cloud One Conformity](https://www.trendmicro.com/cloudoneconformity)
- [CloudMatos Remediation Repository](https://github.com/cloudmatos/matos/tree/master/remediations)
### Python Model Reference
The metadata structure is enforced in code using a Pydantic model. For reference, see the [`CheckMetadata`](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py).
# Configurable Checks in Prowler
Prowler empowers users to extend and adapt cloud security coverage by making checks configurable through the use of the `audit_config` object. This approach enables customization of checks to meet specific requirements through a configuration file.
## Understanding the `audit_config` Object
The `audit_config` object is a dictionary attached to each provider's service client (for example, `<service_name>_client.audit_config`). This object loads configuration values from the main configuration file (`prowler/config/config.yaml`). Use `audit_config` to make checks flexible and user-configurable.
## Using `audit_config` to Configure Checks
Retrieve configuration values in a check by using the `.get()` method on the `audit_config` object. For example, to get the minimum number of Availability Zones for Lambda from the configuration file, use the following code. If the value is not set in the configuration, the check defaults to 2:
```python
LAMBDA_MIN_AZS = awslambda_client.audit_config.get("lambda_min_azs", 2)
```
Always provide a default value in `.get()` to ensure the check works even if the configuration is missing the variable.
### Example: Security Group Rule Limit
```python title="ec2_securitygroup_with_many_ingress_egress_rules.py"
class ec2_securitygroup_with_many_ingress_egress_rules(Check):
def execute(self):
findings = []
max_security_group_rules = ec2_client.audit_config.get(
"max_security_group_rules", 50
)
for security_group_arn, security_group in ec2_client.security_groups.items():
# ... check logic ...
```
## Required File Updates for Configurable Variables
When adding a new configurable check to Prowler, update the following files:
- **Configuration File:** Add the new variable under the relevant provider or service section in `prowler/config/config.yaml`.
```yaml
# aws.awslambda_function_vpc_multi_az
lambda_min_azs: 2
```
- **Test Fixtures:** If tests depend on this configuration, add the variable to `tests/config/fixtures/config.yaml`.
- **Documentation:** Document the new variable in the list of configurable checks in `docs/tutorials/configuration_file.md`.
For a complete list of checks that already support configuration, see the [Configuration File Tutorial](../tutorials/configuration_file.md).
This approach ensures that checks are easily configurable, making Prowler highly adaptable to different environments and requirements.
# Debugging in Prowler
Debugging in Prowler simplifies the development process, allowing developers to efficiently inspect and resolve unexpected issues during execution.
## Debugging with Visual Studio Code
Visual Studio Code (also referred to as VSCode) provides an integrated debugger for executing and analyzing Prowler code. Refer to the official VSCode debugger [documentation](https://code.visualstudio.com/docs/editor/debugging) for detailed instructions.
### Debugging Configuration Example
The following file is an example of a [debugging configuration](https://code.visualstudio.com/docs/editor/debugging#_launch-configurations) file for [Visual Studio Code](https://code.visualstudio.com/).
This file must be placed inside the *.vscode* directory and named *launch.json*. Below is a minimal sketch; the `args` and the debugger `type` are illustrative, so adjust them to the provider and flags you want to debug:
```json
{
    "version": "0.2.0",
    "configurations": [
        {
            // Illustrative configuration: debug a Prowler AWS scan
            // ("module" runs the equivalent of python -m prowler)
            "name": "Debug Prowler",
            "type": "debugpy",
            "request": "launch",
            "module": "prowler",
            "args": ["aws"],
            "console": "integratedTerminal",
            "justMyCode": false
        }
    ]
}
```
## Contributing to Documentation
Prowler documentation is built using `mkdocs`, allowing contributors to easily add or enhance documentation.
### Installation and Setup
Install all necessary dependencies using: `poetry install --with docs`.
1. Install `mkdocs` using your preferred package manager.
2. Run the documentation locally: navigate to the `prowler` repository folder, start the local documentation server by running `mkdocs serve`, and open `http://localhost:8000` in your browser to view live updates.
3. Make documentation changes: edit existing Markdown (`.md`) files inside `prowler/docs` or add new documents. To add new sections or files, update the `mkdocs.yaml` file located in the root directory of Prowler's repository.
4. Submit changes: once documentation updates are complete, submit a pull request for review. The Prowler team will assess and merge contributions.
Your efforts help improve Prowler documentation. Thank you for contributing!
# Google Cloud Provider
This page details the [Google Cloud Platform (GCP)](https://cloud.google.com/) provider implementation in Prowler.
By default, Prowler will audit all the GCP projects that the authenticated identity can access. To configure it, follow the [getting started](../index.md#google-cloud) page.
## GCP Provider Classes Architecture
The GCP provider implementation follows the general [Provider structure](./provider.md). This section focuses on the GCP-specific implementation, highlighting how the generic provider concepts are realized for GCP in Prowler. For a full overview of the provider pattern, base classes, and extension guidelines, see [Provider documentation](./provider.md).
### `GcpProvider` (Main Class)
- **Location:** [`prowler/providers/gcp/gcp_provider.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/gcp_provider.py)
- **Base Class:** Inherits from `Provider` (see [base class details](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/common/provider.py)).
- **Purpose:** Central orchestrator for GCP-specific logic, session management, credential validation, project and organization discovery, and configuration.
- **Key GCP Responsibilities:**
- Initializes and manages GCP sessions (supports Application Default Credentials, Service Account, OAuth, and impersonation).
- Validates credentials and sets up the GCP identity context.
- Loads and manages configuration, mutelist, and fixer settings.
- Discovers accessible GCP projects and organization metadata.
- Provides properties and methods for downstream GCP service classes to access session, identity, and configuration data.
### Data Models
- **Location:** [`prowler/providers/gcp/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/models.py)
- **Purpose:** Define structured data for GCP identity, project, and organization info.
- **Key GCP Models:**
- `GCPIdentityInfo`: Holds GCP identity metadata, such as the profile name.
- `GCPOrganization`: Represents a GCP organization with ID, name, and display name.
- `GCPProject`: Represents a GCP project with number, ID, name, organization, labels, and lifecycle state.
### `GCPService` (Service Base Class)
- **Location:** [`prowler/providers/gcp/lib/service/service.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/service/service.py)
- **Purpose:** Abstract base class that all GCP service-specific classes inherit from. This implements the generic service pattern (described in [service page](./services.md#service-base-class)) specifically for GCP.
- **Key GCP Responsibilities:**
- Receives a `GcpProvider` instance to access session, identity, and configuration.
- Manages clients for all services by project.
- Filters projects to only those with the relevant API enabled.
- Provides `__threading_call__` method to make API calls in parallel by project or resource.
- Exposes common audit context (`project_ids`, `projects`, `default_project_id`, `audit_config`, `fixer_config`) to subclasses.
### Exception Handling
- **Location:** [`prowler/providers/gcp/exceptions/exceptions.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/exceptions/exceptions.py)
- **Purpose:** Custom exception classes for GCP-specific error handling, such as credential, session, and project access errors.
### Session and Utility Helpers
- **Location:** [`prowler/providers/gcp/lib/`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/)
- **Purpose:** Helpers for argument parsing, mutelist management, and other cross-cutting concerns.
## Specific Patterns in GCP Services
The generic service pattern is described in [service page](./services.md#service-structure-and-initialisation). You can find all the currently implemented services in the following locations:
- Directly in the code, in location [`prowler/providers/gcp/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/gcp/services)
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference for implementing a new service is the [service implementation documentation](./services.md#adding-a-new-service), taking other already implemented services as reference. In the next subsection you can find a list of common patterns that are used across all GCP services.
### GCP Service Common Patterns
- Services communicate with GCP using the Google Cloud Python SDK, you can find the documentation with all the services [here](https://cloud.google.com/python/docs/reference).
- Every GCP service class inherits from `GCPService`, ensuring access to session, identity, configuration, and client utilities.
- The constructor (`__init__`) always calls `super().__init__` with the service name, provider, region (default "global"), and API version (default "v1"). Usually, the service name is the class name in lowercase, so it is called like `super().__init__(__class__.__name__, provider)`.
- Resource containers **must** be initialized in the constructor, typically as dictionaries keyed by resource ID and the value is the resource object.
- Only projects with the API enabled are included in the audit scope.
- Resource discovery and attribute collection can be parallelized using `self.__threading_call__`, typically by region/zone or resource.
- All GCP resources are represented as Pydantic `BaseModel` classes, providing type safety and structured access to resource attributes.
- All GCP API calls are wrapped in try/except blocks, always logging errors.
- Tags and additional attributes that cannot be retrieved from the default call should be collected and stored for each resource using dedicated methods and threading.
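The following is a minimal sketch of these patterns. The `Compute` service, its `Instance` model, and the exact `__threading_call__` signature are illustrative assumptions rather than the actual implementation; use an existing GCP service in the Prowler repository as the authoritative reference.

```python
from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.providers.gcp.lib.service.service import GCPService


class Instance(BaseModel):
    """Illustrative resource model."""

    id: str
    name: str
    project_id: str
    region: str


class Compute(GCPService):
    def __init__(self, provider):
        # The service name defaults to the lowercased class name
        super().__init__(__class__.__name__, provider)
        # Resource container keyed by resource ID
        self.instances = {}
        # Parallelize discovery across the audited projects
        # (assumed signature: a callable plus the iterable of project IDs)
        self.__threading_call__(self._list_instances, self.project_ids)

    def _list_instances(self, project_id):
        try:
            # Call the Google Cloud API through the per-project client here
            # and store each resource in self.instances keyed by its ID.
            ...
        except Exception as error:
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
```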
## Specific Patterns in GCP Checks
The GCP checks pattern is described in [checks page](./checks.md). You can find all the currently implemented checks:
- Directly in the code, within each service folder, each check has its own folder named after the name of the check. (e.g. [`prowler/providers/gcp/services/iam/iam_sa_user_managed_key_unused/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/gcp/services/iam/iam_sa_user_managed_key_unused))
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference to understand how to implement a new check is following the [GCP check implementation documentation](./checks.md#creating-a-check) and taking other similar checks as reference.
### Check Report Class
The `Check_Report_GCP` class models a single finding for a GCP resource in a check report. It is defined in [`prowler/lib/check/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py) and inherits from the generic `Check_Report` base class.
#### Purpose
`Check_Report_GCP` extends the base report structure with GCP-specific fields, enabling detailed tracking of the resource, project, and location associated with each finding.
#### Constructor and Attribute Population
When you instantiate `Check_Report_GCP`, you must provide the check metadata and a resource object. The class will attempt to automatically populate its GCP-specific attributes from the resource, using the following logic (in order of precedence):
- **`resource_id`**:
- Uses the explicit `resource_id` argument if provided.
- Otherwise, uses `resource.id` if present.
- Otherwise, uses `resource.name` if present.
- Defaults to an empty string if none are available.
- **`resource_name`**:
- Uses the explicit `resource_name` argument if provided.
- Otherwise, uses `resource.name` if present.
- Defaults to an empty string.
- **`project_id`**:
- Uses the explicit `project_id` argument if provided.
- Otherwise, uses `resource.project_id` if present.
- Defaults to an empty string.
- **`location`**:
- Uses the explicit `location` argument if provided.
- Otherwise, uses `resource.location` if present.
- Otherwise, uses `resource.region` if present.
- Defaults to "global" if none are available.
All these attributes can be overridden by passing the corresponding argument to the constructor. If the resource object does not contain the required attributes, you must set them manually.
Other attributes are inherited from the `Check_Report` class; of those, you **always** have to set the `status` and `status_extended` attributes in the check logic.
#### Example Usage
```python
report = Check_Report_GCP(
metadata=check_metadata,
resource=resource_object,
resource_id="custom-id", # Optional override
resource_name="custom-name", # Optional override
project_id="my-gcp-project", # Optional override
location="us-central1" # Optional override
)
report.status = "PASS"
report.status_extended = "Resource is compliant."
```
# GitHub Provider
This page details the [GitHub](https://github.com/) provider implementation in Prowler.
By default, Prowler will audit the GitHub account - scanning all repositories, organizations, and applications that your configured credentials can access. To configure it, follow the [getting started](../index.md#github) page.
## GitHub Provider Classes Architecture
The GitHub provider implementation follows the general [Provider structure](./provider.md). This section focuses on the GitHub-specific implementation, highlighting how the generic provider concepts are realized for GitHub in Prowler. For a full overview of the provider pattern, base classes, and extension guidelines, see [Provider documentation](./provider.md).
### `GithubProvider` (Main Class)
- **Location:** [`prowler/providers/github/github_provider.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/github_provider.py)
- **Base Class:** Inherits from `Provider` (see [base class details](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/common/provider.py)).
- **Purpose:** Central orchestrator for GitHub-specific logic, session management, credential validation, and configuration.
- **Key GitHub Responsibilities:**
- Initializes and manages GitHub sessions (supports Personal Access Token, OAuth App, and GitHub App authentication).
- Validates credentials and sets up the GitHub identity context.
- Loads and manages configuration, mutelist, and fixer settings.
- Provides properties and methods for downstream GitHub service classes to access session, identity, and configuration data.
### Data Models
- **Location:** [`prowler/providers/github/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/models.py)
- **Purpose:** Define structured data for GitHub identity, session, and output options.
- **Key GitHub Models:**
- `GithubSession`: Holds authentication tokens and keys for the session.
- `GithubIdentityInfo`, `GithubAppIdentityInfo`: Store account or app identity metadata.
### `GithubService` (Service Base Class)
- **Location:** [`prowler/providers/github/lib/service/service.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/lib/service/service.py)
- **Purpose:** Abstract base class for all GitHub service-specific classes.
- **Key GitHub Responsibilities:**
- Receives a `GithubProvider` instance to access session, identity, and configuration.
- Manages GitHub API clients for the authenticated user or app.
- Exposes common audit context (`audit_config`, `fixer_config`) to subclasses.
### Exception Handling
- **Location:** [`prowler/providers/github/exceptions/exceptions.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/exceptions/exceptions.py)
- **Purpose:** Custom exception classes for GitHub-specific error handling, such as credential and session errors.
### Session and Utility Helpers
- **Location:** [`prowler/providers/github/lib/`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/lib/)
- **Purpose:** Helpers for argument parsing, mutelist management, and other cross-cutting concerns.
## Specific Patterns in GitHub Services
The generic service pattern is described in [service page](./services.md#service-structure-and-initialisation). You can find all the currently implemented services in the following locations:
- Directly in the code, in location [`prowler/providers/github/services/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/github/services)
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference to understand how to implement a new service is following the [service implementation documentation](./services.md#adding-a-new-service) and by taking other already implemented services as reference.
### GitHub Service Common Patterns
- Services communicate with GitHub using the PyGithub Python SDK. See the [official documentation](https://pygithub.readthedocs.io/).
- Every GitHub service class inherits from `GithubService`, ensuring access to session, identity, configuration, and client utilities.
- The constructor (`__init__`) always calls `super().__init__` with the service name and provider (e.g. `super().__init__(__class__.__name__, provider)`). Usually `__class__.__name__` is used because the service name matches the class name; make sure it is the same service name used by PyGithub.
- Resource containers **must** be initialized in the constructor, typically as dictionaries keyed by resource ID or name.
- All GitHub resources are represented as Pydantic `BaseModel` classes, providing type safety and structured access to resource attributes.
- GitHub API calls are wrapped in try/except blocks, always logging errors.
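A minimal sketch of these patterns follows. The `Repository` service body, the `Repo` model, and the listing helper are illustrative assumptions rather than the actual implementation; use an existing GitHub service in the Prowler repository as the authoritative reference.

```python
from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.providers.github.lib.service.service import GithubService


class Repo(BaseModel):
    """Illustrative resource model."""

    id: int
    name: str
    private: bool


class Repository(GithubService):
    def __init__(self, provider):
        # The service name must match the PyGithub service in use
        super().__init__(__class__.__name__, provider)
        # Resource container keyed by resource ID
        self.repositories = self._list_repositories()

    def _list_repositories(self):
        repositories = {}
        try:
            # Iterate over the PyGithub clients here and store each
            # repository in the container keyed by its ID.
            ...
        except Exception as error:
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
        return repositories
```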
## Specific Patterns in GitHub Checks
The GitHub checks pattern is described in [checks page](./checks.md). You can find all the currently implemented checks in:
- Directly in the code, within each service folder, each check has its own folder named after the name of the check. (e.g. [`prowler/providers/github/services/repository/repository_secret_scanning_enabled/`](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers/github/services/repository/repository_secret_scanning_enabled))
- In the [Prowler Hub](https://hub.prowler.com/) for a more human-readable view.
The best reference to understand how to implement a new check is the [GitHub check implementation documentation](./checks.md#creating-a-check) and by taking other checks as reference.
### Check Report Class
The `CheckReportGithub` class models a single finding for a GitHub resource in a check report. It is defined in [`prowler/lib/check/models.py`](https://github.com/prowler-cloud/prowler/blob/master/prowler/lib/check/models.py) and inherits from the generic `Check_Report` base class.
#### Purpose
`CheckReportGithub` extends the base report structure with GitHub-specific fields, enabling detailed tracking of the resource, name, and owner associated with each finding.
#### Constructor and Attribute Population
When you instantiate `CheckReportGithub`, you must provide the check metadata and a resource object. The class will attempt to automatically populate its GitHub-specific attributes from the resource, using the following logic (in order of precedence):
- **`resource_id`**:
- Uses the explicit `resource_id` argument if provided.
- Otherwise, uses `resource.id` if present.
- Defaults to an empty string if not available.
- **`resource_name`**:
- Uses the explicit `resource_name` argument if provided.
- Otherwise, uses `resource.name` if present.
- Defaults to an empty string if not available.
- **`owner`**:
- Uses the explicit `owner` argument if provided.
- Otherwise, uses `resource.owner` for repositories and `resource.name` for organizations.
- Defaults to an empty string if not available.
If the resource object does not contain the required attributes, you must set them manually in the check logic.
Other attributes are inherited from the `Check_Report` class; of those, you **always** have to set the `status` and `status_extended` attributes in the check logic.
#### Example Usage
```python
report = CheckReportGithub(
metadata=check_metadata,
resource=resource_object
)
report.status = "PASS"
report.status_extended = "Resource is compliant."
```