Compare commits


266 Commits
4.2.2 ... 4.3.2

Author SHA1 Message Date
github-actions
8df7f3ab72 chore(release): 4.3.2 2024-08-05 18:27:17 +00:00
Pedro Martín
8adc72ad57 fix(gcp): check cloudsql sslMode (#4635) 2024-08-05 14:09:34 -04:00
Pepe Fagoaga
9addf86aa5 refactor(mutelist): Remove re.match and improve docs (#4637)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-05 14:01:27 -04:00
Pedro Martín
2913d50a52 fix(gcp): check next rotation time in KMS keys (#4633) 2024-08-05 13:59:24 -04:00
Sergio Garcia
c6c06b3354 refactor(tags): convert tags to a dictionary (#4598)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-05 13:58:01 -04:00
Sergio Garcia
8242fa883e fix(gcp): use KMS key id in checks (#4610) 2024-08-05 13:57:47 -04:00
Pedro Martín
6646bae26c fix(sns): add condition to sns topics (#4498)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-05 13:57:10 -04:00
Pepe Fagoaga
32da86f393 fix(mutelist): Fix tags match (#4606) 2024-08-01 09:01:44 -04:00
Pepe Fagoaga
74d02e1da6 chore(version): Update Prowler version (#4605) 2024-08-01 08:01:45 -04:00
Pepe Fagoaga
8ec6e89e5c chore(regions_update): Changes in regions for AWS services. (#4607)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-01 11:35:08 +02:00
dependabot[bot]
17012ec1a4 chore(deps): bump trufflesecurity/trufflehog from 3.80.3 to 3.80.4 (#4601)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-01 08:06:11 +02:00
Pepe Fagoaga
8461257428 fix(status): Recover status filtering (#4572)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-31 10:10:07 -04:00
Kay Agahd
26a5ffaf82 fix(aws): only check artifacts that can be scanned for vulnerabilities by ecr_repositories_scan_vulnerabilities_in_latest_image (#4507)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-31 09:27:26 -04:00
Sergio Garcia
563ddb3707 chore(permissions): add missing ec2 permission (#4577) 2024-07-31 14:22:21 +02:00
Pedro Martín
2c11c3d6f9 fix(typo): fix typo on PR template (#4596) 2024-07-31 07:58:53 -04:00
cetteup
e050f44d63 fix(aws): Pass backup retention check if retention period is equal to minimum (#4593) 2024-07-31 13:25:53 +02:00
Pepe Fagoaga
4fd3405bbf chore(regions_update): Changes in regions for AWS services. (#4592)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-31 11:33:59 +02:00
dependabot[bot]
a1c2caa745 chore(deps): bump boto3 from 1.34.149 to 1.34.151 (#4587)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:47:41 +02:00
dependabot[bot]
f639dc8bf4 chore(deps): bump trufflesecurity/trufflehog from 3.80.2 to 3.80.3 (#4581)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:16:37 +02:00
dependabot[bot]
35325d9f40 chore(deps): bump google-api-python-client from 2.138.0 to 2.139.0 (#4579)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:16:08 +02:00
Pepe Fagoaga
71503b553a chore(pr-template): Add Checklist (#4586) 2024-07-31 08:31:55 +02:00
dependabot[bot]
d91a240ea8 chore(deps): bump botocore from 1.34.150 to 1.34.151 (#4578)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 08:29:51 +02:00
Sergio Garcia
b9b5f66073 fix(test): solve VPC import in tests (#4574) 2024-07-30 10:34:55 -04:00
Sergio Garcia
e3f66840aa chore(version): update Prowler version (#4565)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-30 10:17:56 +02:00
Rubén De la Torre Vico
0d6c529a46 fix(autoscaling): change unexpected exception to error severity logger (#4569) 2024-07-30 10:07:36 +02:00
dependabot[bot]
5237658047 chore(deps): bump botocore from 1.34.149 to 1.34.150 (#4567)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-30 09:25:07 +02:00
Daniel Barranquero
c00f61ac10 test(GCP): Add remaining GCP tests for KMS checks (#4550) 2024-07-29 13:22:41 -04:00
Rubén De la Torre Vico
2cd840a2b5 fix(autoscaling): Add exception manage while decoding UserData (#4562)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-29 12:03:44 -04:00
dependabot[bot]
7e630ebe27 chore(deps): bump boto3 from 1.34.148 to 1.34.149 (#4556)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-29 08:24:06 +02:00
dependabot[bot]
2f1c0facfd chore(deps): bump trufflesecurity/trufflehog from 3.80.1 to 3.80.2 (#4557)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-29 08:23:45 +02:00
Pepe Fagoaga
603bb03f35 chore(regions_update): Changes in regions for AWS services. (#4560)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-29 08:22:19 +02:00
Rubén De la Torre Vico
b7af1a06e8 fix(organizations): Fix types errors related to policies and json.loads function (#4554) 2024-07-26 10:51:46 -04:00
Kay Agahd
02fc034b1f feat(aws): make check eks_control_plane_logging_all_types_enabled configurable (#4553) 2024-07-26 10:24:01 -04:00
joshua_jebaraj
40522cdc62 fix(gcp): false positive for iam_sa_no_administrative_privilege check (#4500)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-26 10:15:34 -04:00
Rubén De la Torre Vico
dc11d85451 chore(cloudsql): Change default cases for CloudSQL checks and remaining tests (#4537) 2024-07-26 10:09:04 -04:00
Pepe Fagoaga
13c50086eb chore(regions_update): Changes in regions for AWS services. (#4552)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-26 09:38:43 -04:00
Sergio Garcia
f7729381e0 fix(s3): enhance threading in s3 service (#4530) 2024-07-26 09:16:47 -04:00
dependabot[bot]
d244475578 chore(deps): bump azure-mgmt-network from 25.4.0 to 26.0.0 (#4543)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 14:28:46 +02:00
dependabot[bot]
10dcbaea7b chore(deps): bump google-api-python-client from 2.137.0 to 2.138.0 (#4542)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 13:33:43 +02:00
dependabot[bot]
c91bbdcf2b chore(deps): bump azure-mgmt-compute from 31.0.0 to 32.0.0 (#4541)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 13:01:49 +02:00
dependabot[bot]
c7dbcb17d6 chore(deps): bump botocore from 1.34.148 to 1.34.149 (#4539)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 12:08:53 +02:00
dependabot[bot]
5a8a9286db chore(deps): bump boto3 from 1.34.147 to 1.34.148 (#4538)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 09:11:02 +02:00
dependabot[bot]
2476a1275a chore(deps-dev): bump pytest from 8.3.1 to 8.3.2 (#4540)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 08:28:40 +02:00
Hugo Pereira Brito
ac680c58cd docs(services): Fixed changed links (#4536) 2024-07-25 13:14:10 +02:00
Daniel Barranquero
68f0916ce4 test(iam): Add remaining GCP tests for IAM checks (#4519) 2024-07-25 11:21:36 +02:00
dependabot[bot]
dc896fc0af chore(deps): bump botocore from 1.34.147 to 1.34.148 (#4532)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-25 09:28:42 +02:00
dependabot[bot]
76af71d2df chore(deps): bump boto3 from 1.34.146 to 1.34.147 (#4531) 2024-07-25 08:43:22 +02:00
dependabot[bot]
96f761e4ef chore(deps): bump azure-mgmt-containerservice from 30.0.0 to 31.0.0 (#4513)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 11:53:31 +02:00
Pepe Fagoaga
9e16e477e9 chore(CODEOWNERS): update team (#4527) 2024-07-24 09:12:33 +02:00
Sergio Garcia
2038e30d3e fix(checks): ensure CheckID is correct in check's metadata (#4522)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-24 09:08:51 +02:00
dependabot[bot]
a4dc6975b0 chore(deps): bump botocore from 1.34.146 to 1.34.147 (#4526)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 08:41:38 +02:00
dependabot[bot]
a4a89fa581 chore(deps): bump boto3 from 1.34.145 to 1.34.146 (#4525)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 07:38:59 +02:00
Pepe Fagoaga
fc449bfd7b chore(s3): create class and refactor (#4457)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-23 10:03:28 -04:00
Rubén De la Torre Vico
2477948ae9 test(gcp): Test GCP provider new auth and print credentials (#4331) 2024-07-23 09:26:29 -04:00
Rubén De la Torre Vico
ca98584ded test(logging): Add remaining tests for Logging checks (#4481) 2024-07-23 09:24:32 -04:00
Rubén De la Torre Vico
489830f01a docs(azure): Review actual roles necessary to execute Prowler (#4501) 2024-07-23 09:15:23 -04:00
Rubén De la Torre Vico
bd56ca2979 chore(dms): Change checks IDs to match with metadata (#4520) 2024-07-23 06:41:07 -04:00
dependabot[bot]
04483a9a4f chore(deps): bump cryptography from 42.0.6 to 43.0.0 (#4512)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:42:59 +02:00
dependabot[bot]
684f63d398 chore(deps): bump numpy from 2.0.0 to 2.0.1 (#4510)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:02:05 +02:00
dependabot[bot]
b528dd44cd chore(deps): bump botocore from 1.34.145 to 1.34.146 (#4511)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 10:11:50 +02:00
dependabot[bot]
dfdeac0a46 chore(deps-dev): bump pylint from 3.2.5 to 3.2.6 (#4509)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 09:23:33 +02:00
dependabot[bot]
b52b67fd4b chore(deps-dev): bump pytest from 8.2.2 to 8.3.1 (#4508)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-22 18:56:30 -04:00
Sergio Garcia
5cf7d89aab fix(inspector2): add more efficient way to check if any active findings (#4505) 2024-07-22 16:25:23 -04:00
Pedro Martín
f5e6b1e438 docs(developer): improve developers docs with Trufflehog and --no-verify (#4502)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-22 13:12:52 +02:00
Pedro Martín
aa44bde940 chore(deps): update cryptography to 42.0.6 (#4499) 2024-07-22 12:09:55 +02:00
Sergio Garcia
ddc927a4ad chore(test): add missing acm imported certificate test (#4485) 2024-07-22 09:49:37 +02:00
dependabot[bot]
fbc99259e2 chore(deps): bump boto3 from 1.34.144 to 1.34.145 (#4497)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-22 09:11:54 +02:00
Daniel Barranquero
28f6f0abcc test(cloudstorage): Add remaining GCP tests for CloudStorage checks (#4464) 2024-07-19 08:37:22 -04:00
dependabot[bot]
0933a04239 chore(deps): bump azure-storage-blob from 12.20.0 to 12.21.0 (#4490)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 08:36:57 -04:00
Pedro Martín
5185f3a41e chore(output): review report function (#4465)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-07-19 08:36:39 -04:00
Pepe Fagoaga
6d20b11394 chore(CODEOWNERS): protect unauthorized changes (#4493) 2024-07-19 12:37:34 +02:00
dependabot[bot]
a01635e9ea chore(deps): bump botocore from 1.34.144 to 1.34.145 (#4491)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 11:32:35 +02:00
Pedro Martín
3bf9cd3db1 docs(readme): add Prowler animation gif to README (#4492) 2024-07-19 10:56:01 +02:00
dependabot[bot]
e15f0b2d0f chore(deps): bump trufflesecurity/trufflehog from 3.80.0 to 3.80.1 (#4486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 09:59:44 +02:00
Sergio Garcia
f2de059ca1 fix(ssm): add missing ResourceArn to SSM check (#4482) 2024-07-18 18:10:06 +02:00
Ikko Eltociear Ashimine
8c8ac95d9c docs(readme): update README.md (#4483) 2024-07-18 17:31:52 +02:00
Pepe Fagoaga
89159c2111 chore(codeowners): update for sdk and checks (#4480) 2024-07-18 09:52:23 -04:00
Pedro Martín
70eb59185b docs(readme): update dashboard screenshot in README (#4479) 2024-07-18 12:53:03 +02:00
Pepe Fagoaga
f97af19860 chore(regions_update): Changes in regions for AWS services. (#4478)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-18 11:32:53 +02:00
dependabot[bot]
5ccd8af2a2 chore(deps): bump msgraph-sdk from 1.5.2 to 1.5.3 (#4475)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 10:40:07 +02:00
Pedro Martín
b53e8abc87 fix(main): change module name (#4477) 2024-07-18 10:29:47 +02:00
dependabot[bot]
db4c4fdaeb chore(deps): bump azure-mgmt-keyvault from 10.3.0 to 10.3.1 (#4474)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 08:39:20 +02:00
Sergio Garcia
44afe2db3e chore(compliance): simplify ComplianceOutput class (#4467) 2024-07-18 08:36:57 +02:00
Sergio Garcia
204d548cd0 chore(csv): remove old CSV functions (#4469) 2024-07-18 08:30:07 +02:00
dependabot[bot]
3faf80c0d7 chore(deps): bump trufflesecurity/trufflehog from 3.79.0 to 3.80.0 (#4471)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 08:28:01 +02:00
chaipot
5078e4a823 chore(docs): update remediation of custom checks metadata (#4470) 2024-07-17 17:14:35 -04:00
Pepe Fagoaga
d1b57ebd75 feat(output): Add a setter for the file descriptor and include extension (#4468) 2024-07-17 17:09:47 -04:00
Sergio Garcia
fdab3a737a chore(compliance): change compliance model names (#4466) 2024-07-17 11:47:28 -04:00
Rubén De la Torre Vico
b6f01b92dd test(gcp): Add bigquery and half of cloudsql check tests (#4462) 2024-07-17 12:03:22 +02:00
Pepe Fagoaga
c92537c791 chore(regions_update): Changes in regions for AWS services. (#4463)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-17 11:35:53 +02:00
Sergio Garcia
3e7cc2e0a2 chore(compliance): add manual requirements to compliance output (#4449)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-17 08:23:38 +02:00
Rubén De la Torre Vico
b8cfdb590b test(gcp): Add remaining CloudSQL tests (#4380) 2024-07-16 13:51:53 -04:00
Pepe Fagoaga
577afbd521 chore(mutelist): create new class to encapsulate the logic (#4413) 2024-07-16 13:44:43 -04:00
Rubén De la Torre Vico
d01cc51b6d test(compute): Add remaining tests for Compute service in GCP provider (#4458) 2024-07-16 11:43:30 -04:00
dependabot[bot]
ffa60b4ccd chore(deps): bump msgraph-sdk from 1.4.0 to 1.5.2 (#4426)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 06:57:42 -04:00
Rubén De la Torre Vico
d6dd0f7244 fix(entra): Change to correct service in entra_user_with_vm_access_has_mfa metadata (#4454) 2024-07-16 12:06:18 +02:00
Pepe Fagoaga
4df0dc4904 chore(regions_update): Changes in regions for AWS services. (#4455)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-16 11:48:52 +02:00
dependabot[bot]
386a1e1d1a chore(deps): bump boto3 from 1.34.143 to 1.34.144 (#4451)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 08:54:32 +02:00
dependabot[bot]
db9d7a4439 chore(deps): bump setuptools from 69.5.1 to 70.0.0 (#4450)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-15 16:12:56 -04:00
Pedro Martín
5725035e29 chore(GenericCompliance): add Generic Compliance class (#4447)
Co-authored-by: Sergio <sergio@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-15 12:56:22 -04:00
Pedro Martín
96a49e97d2 fix(iam_avoid_root_usage): change timestamp format (#4446) 2024-07-15 17:10:49 +02:00
Sergio Garcia
2a95750525 chore(iso27001): add ISO27001 output class (#4441)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-15 09:43:26 -04:00
Pedro Martín
b868d1a7fe fix(glue): add getters for connection attributes (#4445) 2024-07-15 14:51:01 +02:00
Pepe Fagoaga
37ade2a722 chore(revert): PR #4067 (#4440)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-07-15 10:25:00 +02:00
dependabot[bot]
c67032e07f chore(deps): bump botocore from 1.34.143 to 1.34.144 (#4442)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-15 09:25:25 +02:00
Pepe Fagoaga
0de8ef032a chore(regions_update): Changes in regions for AWS services. (#4444)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-15 09:24:37 +02:00
Sergio Garcia
027aa9796d chore(aws): add AWS Well-Architected output class (#4439) 2024-07-12 11:27:21 -04:00
Sergio Garcia
a505776227 chore(ens): add ENS output class (#4435) 2024-07-12 10:50:41 -04:00
Sergio Garcia
3be9de376a chore(mitre): add MITRE ATT&CK output class (#4425) 2024-07-12 10:08:32 -04:00
dependabot[bot]
bd26d74b28 chore(deps): bump boto3 from 1.34.142 to 1.34.143 (#4437)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-12 09:36:43 +02:00
dependabot[bot]
ca27854ff0 chore(deps-dev): bump coverage from 7.5.4 to 7.6.0 (#4438)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-12 08:55:33 +02:00
Pepe Fagoaga
abd18dc14d chore(regions_update): Changes in regions for AWS services. (#4433) 2024-07-11 09:27:52 -04:00
Pepe Fagoaga
297f506fd3 docs(gcp): Fix typo in title (#4434) 2024-07-11 09:27:04 -04:00
dependabot[bot]
78ca4b93a5 chore(deps): bump botocore from 1.34.142 to 1.34.143 (#4428)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 10:04:33 +02:00
dependabot[bot]
c80d51b585 chore(deps): bump boto3 from 1.34.141 to 1.34.142 (#4427)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 08:47:57 +02:00
Sergio Garcia
cf9b23c302 fix(cis): add missing fields and reorder (#4424) 2024-07-10 13:11:55 -04:00
Sergio Garcia
ef4b9e8d6a fix(templates): solve broken GitHub issues templates (#4423) 2024-07-10 16:55:51 +02:00
Sergio Garcia
a5a8c2a769 chore(cis): add CIS output class (#4400)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-10 09:26:08 -04:00
Pepe Fagoaga
64b21ae2b9 chore(labeler): add outputs and integrations (#4422) 2024-07-10 09:25:07 -04:00
Pepe Fagoaga
3da4824a1d chore(regions_update): Changes in regions for AWS services. (#4420)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-10 09:24:05 -04:00
Pepe Fagoaga
2247296cf9 chore(templates): update to remove titles (#4421) 2024-07-10 09:22:13 -04:00
dependabot[bot]
615127f790 chore(deps): bump botocore from 1.34.141 to 1.34.142 (#4416)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 11:54:21 +02:00
dependabot[bot]
42f21a52c9 chore(deps): bump google-api-python-client from 2.136.0 to 2.137.0 (#4415)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 08:36:13 +02:00
dependabot[bot]
e9442b2f89 chore(deps): bump zipp from 3.18.1 to 3.19.1 (#4414)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 08:32:34 +02:00
Pepe Fagoaga
6336b1c0d9 refactor(SecurityHub): create class to handle integration (#4397)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-09 11:47:47 -04:00
Pepe Fagoaga
a0603b972e chore(regions_update): Changes in regions for AWS services. (#4412)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-09 09:18:57 -04:00
dependabot[bot]
f319884532 chore(deps): bump boto3 from 1.34.139 to 1.34.141 (#4410)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 13:27:29 +02:00
dependabot[bot]
d49139c4f4 chore(deps-dev): bump moto from 5.0.10 to 5.0.11 (#4404)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 11:33:00 +02:00
dependabot[bot]
046c82232d chore(deps): bump botocore from 1.34.140 to 1.34.141 (#4403)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:39:19 +02:00
dependabot[bot]
027aafd9ea chore(deps): bump jsonschema from 4.22.0 to 4.23.0 (#4402)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:01:13 +02:00
Sergio Garcia
215d5dabd7 fix(docs): update deprecated command (#4401) 2024-07-09 08:40:25 +02:00
Pepe Fagoaga
f5e2ac7486 chore(regions_update): Changes in regions for AWS services. (#4396)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-08 09:56:03 -04:00
Pepe Fagoaga
6fc24b5435 chore: rename test function in the HTML test class (#4395) 2024-07-08 09:51:44 -04:00
dependabot[bot]
3d99e6ea28 chore(deps): bump botocore from 1.34.139 to 1.34.140 (#4391)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 09:27:57 +02:00
dependabot[bot]
b23aefadc1 chore(deps): bump certifi from 2024.2.2 to 2024.7.4 (#4392)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 09:21:56 +02:00
dependabot[bot]
b585a31a14 chore(deps): bump boto3 from 1.34.138 to 1.34.139 (#4383) 2024-07-05 19:03:20 -04:00
Pepe Fagoaga
9c817ae8a9 tests: add for empty findings and little renamings (#4388)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-05 15:09:23 -04:00
JackyCCChen
cd7f19c00e fix(gcp): Not all gcp projects have name (#4387) 2024-07-05 11:08:31 -04:00
dependabot[bot]
d1a7d19799 chore(deps-dev): bump safety from 3.2.3 to 3.2.4 (#4385)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-05 11:20:41 +02:00
Pedro Martín
d7dffbc44b chore(test): enhance OCSF tests (#4386) 2024-07-05 11:19:53 +02:00
dependabot[bot]
0402cc7e2d chore(deps): bump slack-sdk from 3.30.0 to 3.31.0 (#4384)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-05 08:47:41 +02:00
Sergio Garcia
bf83f38c89 chore(html): add HTML class (#4360) 2024-07-04 13:28:09 -04:00
Pepe Fagoaga
673619c8a1 refactor(ASFF): create class (#4368)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-04 12:04:36 -04:00
Pedro Martín
2345a7384b chore(ocsf): add OCSF class for outputs (#4355) 2024-07-04 17:08:01 +02:00
Oleksii
e387c591c3 chore(k8s): Add helm-chart (#4370)
Co-authored-by: Oleksii Tsyganov <otsyganov@magicleap.com>
2024-07-04 10:30:45 -04:00
Rubén De la Torre Vico
47a37c7d0d chore(iam): Improve status extended adding the resource type (#4378) 2024-07-04 09:32:35 -04:00
dependabot[bot]
7b359cf1eb chore(deps): bump botocore from 1.34.138 to 1.34.139 (#4373)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 14:32:16 +02:00
Pepe Fagoaga
35d525b903 chore(regions_update): Changes in regions for AWS services. (#4379)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-04 11:48:04 +02:00
Pedro Martín
b5b193427d docs(readme): update check number on readme (#4377) 2024-07-04 08:54:12 +02:00
Rubén De la Torre Vico
e6ae539323 feat(IAM): Add inline policies checks and improve custom policy checks (#4255) 2024-07-03 15:51:19 -04:00
Pepe Fagoaga
541b907038 chore(regions_update): Changes in regions for AWS services. (#4369)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-03 09:56:15 -04:00
dependabot[bot]
040e1eaa5e chore(deps): bump boto3 from 1.34.136 to 1.34.138 (#4367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 08:43:03 +02:00
dependabot[bot]
e23a674277 chore(deps): bump google-api-python-client from 2.135.0 to 2.136.0 (#4362)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 08:09:39 +02:00
dependabot[bot]
e73cefdf1a chore(deps): bump botocore from 1.34.137 to 1.34.138 (#4361)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 07:37:08 +02:00
Rubén De la Torre Vico
9ed4e89c60 chore(iam): Remove unnecesary attached policy in a inline policy (#4359) 2024-07-02 12:38:00 -04:00
Pedro Martín
da547b2bbe fix(test-csv): fix test using tempfile (#4356) 2024-07-02 09:16:12 -04:00
Pedro Martín
ca033745c9 chore(csv): add CSVOutput class (#4315)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-02 13:12:43 +02:00
dependabot[bot]
fb49fb83ae chore(deps): bump botocore from 1.34.136 to 1.34.137 (#4351)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-02 09:30:49 +02:00
dependabot[bot]
76e0b23365 chore(deps): bump boto3 from 1.34.132 to 1.34.136 (#4352)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-02 08:52:10 +02:00
Sergio Garcia
82ccdc45d2 chore(elasticache): enhance service and checks (#4329) 2024-07-01 10:06:24 -04:00
dependabot[bot]
de777a6417 chore(deps): bump azure-mgmt-storage from 21.2.0 to 21.2.1 (#4339)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 09:55:17 -04:00
dependabot[bot]
87d8cda745 chore(deps-dev): bump moto from 5.0.9 to 5.0.10 (#4343)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 09:19:22 -04:00
dependabot[bot]
64abd0a6d0 chore(deps-dev): bump pylint from 3.2.3 to 3.2.5 (#4347)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 13:58:42 +02:00
dependabot[bot]
096d7c6304 chore(deps): bump botocore from 1.34.132 to 1.34.136 (#4337)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 12:58:50 +02:00
dependabot[bot]
4908e06544 chore(deps): bump google-api-python-client from 2.134.0 to 2.135.0 (#4345)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 12:28:08 +02:00
dependabot[bot]
d42cc66d9f chore(deps): bump trufflesecurity/trufflehog from 3.78.2 to 3.79.0 (#4335)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 11:48:30 +02:00
Pepe Fagoaga
7a5318b936 chore(dependabot): Run daily (#4334) 2024-07-01 11:43:50 +02:00
Pepe Fagoaga
ffb494f9a4 chore(regions_update): Changes in regions for AWS services. (#4332) 2024-07-01 08:57:03 +02:00
Sergio Garcia
f515b2b53b fix(aws): parallelize functions per resource (#4323) 2024-06-28 09:27:47 -04:00
Pepe Fagoaga
a3cf7665ac chore(regions_update): Changes in regions for AWS services. (#4330)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-28 11:43:29 +02:00
Rubén De la Torre Vico
dbaf72958e doc(requirements): Add management group for multiple subscriptions (#4282)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2024-06-28 10:06:16 +02:00
Sergio Garcia
169d1686d2 fix(s3): handle empty Action in bucket policy (#4328) 2024-06-28 08:25:40 +02:00
sansns-aws
ba726b205d feat(Elasticache): Additional Elasticache checks (#4317)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-27 18:07:22 -04:00
sansns-aws
630d980861 feat(NetworkFirewall): Add Deletion Protection Check (#4318)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-27 10:08:31 -04:00
Pedro Martín
7d81040eae fix(docs): Rewrite dashboard docs (#4327) 2024-06-27 12:55:02 +02:00
Pepe Fagoaga
4009d96f8a chore(regions_update): Changes in regions for AWS services. (#4326)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-27 12:33:45 +02:00
Pepe Fagoaga
cee5064b11 chore(tests): Improve CloudTrail tests checking for multiregional trails (#4177)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-26 17:33:50 -04:00
Sergio Garcia
e5c911abef chore(python): update vulnerable anyio library (#4322) 2024-06-26 16:57:57 -04:00
Sergio Garcia
ff5c41f363 fix(codebuild): enhance service functions (#4319)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-06-26 11:27:50 -04:00
Sergio Garcia
cf84875355 feat(gcp): add service account impersonation (#4291) 2024-06-26 15:31:47 +02:00
Pepe Fagoaga
fc23eccc7b chore(regions_update): Changes in regions for AWS services. (#4320)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-26 11:47:28 +02:00
Pedro Martín
c5fb11e815 docs(kubernetes): add docs about kubernetes in tutorials page (#4288)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-25 11:41:13 -04:00
dependabot[bot]
fdab1edd3e chore(deps): bump boto3 from 1.34.123 to 1.34.132 (#4316)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 16:15:42 +02:00
dependabot[bot]
ea74d82c48 chore(deps): bump azure-mgmt-web from 7.2.0 to 7.3.0 (#4301)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 09:21:12 -04:00
Rubén De la Torre Vico
093738c65f chore(s3): reduce false positive in s3 public check (#4281) 2024-06-25 08:55:42 -04:00
Pedro Martín
bae224c891 fix(csv-outputs): compliance outputs not showing consistents values (#4287) 2024-06-25 14:50:17 +02:00
dependabot[bot]
32cded949d chore(deps): bump azure-mgmt-cosmosdb from 9.5.0 to 9.5.1 (#4298)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 14:46:25 +02:00
dependabot[bot]
6463dcdde0 chore(deps): bump azure-identity from 1.16.1 to 1.17.1 (#4300)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 14:07:39 +02:00
dependabot[bot]
0b16dab2ad chore(deps): bump azure-mgmt-storage from 21.1.0 to 21.2.0 (#4297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 13:34:12 +02:00
dependabot[bot]
825c620e6f chore(deps): bump botocore from 1.34.128 to 1.34.132 (#4296)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 12:30:26 +02:00
dependabot[bot]
819a5597a3 chore(deps-dev): bump coverage from 7.5.3 to 7.5.4 (#4295)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 10:35:50 +02:00
dependabot[bot]
4bae3d2600 chore(deps): bump slack-sdk from 3.29.0 to 3.30.0 (#4294)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 09:51:31 +02:00
Sergio Garcia
131cb82751 chore(readme): update checks number (#4290) 2024-06-25 08:56:04 +02:00
dependabot[bot]
029caf3b10 chore(deps): bump google-api-python-client from 2.133.0 to 2.134.0 (#4293)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 08:38:08 +02:00
dependabot[bot]
9ee23a39b5 chore(deps): bump trufflesecurity/trufflehog from 3.78.1 to 3.78.2 (#4292) 2024-06-25 07:57:24 +02:00
Pedro Martín
4837df4352 chore(aws): handle new permissions (#4289) 2024-06-24 12:14:20 -04:00
sansns-aws
d173d58a93 feat(DMS): Add Database Migration Service (DMS) (#4249) 2024-06-24 11:41:33 -04:00
sansns-aws
af29570fe9 feat(DocumentDB): New DocumentDB checks (#4247) 2024-06-24 11:40:39 -04:00
sansns-aws
9253cd42dd feat(neptune): Additional Neptune checks (#4243) 2024-06-24 11:38:41 -04:00
Sergio Garcia
836b4ba2cc fix(rds): handle not existing endpoint (#4285) 2024-06-24 09:38:26 +02:00
Pepe Fagoaga
f28c0578aa chore(regions_update): Changes in regions for AWS services. (#4286)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-24 07:53:01 +02:00
Rubén De la Torre Vico
536f0df9d3 feat(app): Add new Azure functions checks (#4189)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-21 11:32:31 -04:00
Pepe Fagoaga
465261e1df chore(regions_update): Changes in regions for AWS services. (#4283)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-21 10:54:24 -04:00
Sergio Garcia
3667370604 chore(safety): update vulnerable library version (#4284) 2024-06-21 10:23:17 -04:00
sansns-aws
9ca64e7bdb feat(RDS): Additional RDS checks (#4233)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-20 13:41:08 -04:00
dependabot[bot]
95a9f1c458 chore(deps): bump kubernetes from 29.0.0 to 30.1.0 (#4226)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-20 11:34:35 -04:00
Pepe Fagoaga
9fbd627f9a chore(regions_update): Changes in regions for AWS services. (#4280)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-20 08:57:32 -04:00
Pepe Fagoaga
7203fcf4f1 chore(regions_update): Changes in regions for AWS services. (#4278)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-20 08:57:05 -04:00
Rubén De la Torre Vico
f10bb343a6 doc(debugging): Improve actual VSCode debugging file (#4279) 2024-06-20 09:11:01 +02:00
John Mastron
9147a45e2f fix(aws): aws check and metadata fixes (#4251)
Co-authored-by: John Mastron <jmastron@jpl.nasa.gov>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-06-19 10:21:50 +02:00
dependabot[bot]
5353d515b6 chore(deps): bump dash from 2.17.0 to 2.17.1 (#4272)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 09:37:44 -04:00
Pepe Fagoaga
e8a94733bf fix(aws): Assume role for Gov Cloud (#4254)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-18 09:37:23 -04:00
Pepe Fagoaga
625be45742 chore(regions_update): Changes in regions for AWS services. (#4277)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-18 09:09:43 -04:00
dependabot[bot]
ecb6cb897f chore(deps): bump numpy from 1.26.4 to 2.0.0 (#4275) 2024-06-18 14:53:38 +02:00
dependabot[bot]
f07bd79442 chore(deps-dev): bump flake8 from 7.0.0 to 7.1.0 (#4269)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 13:43:11 +02:00
dependabot[bot]
b7c1fabae1 chore(deps-dev): bump bandit from 1.7.8 to 1.7.9 (#4271)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 11:44:02 +02:00
dependabot[bot]
59d3b2f33e chore(deps): bump google-api-python-client from 2.132.0 to 2.133.0 (#4274)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 11:04:25 +02:00
dependabot[bot]
6c098e98e3 chore(deps): bump botocore from 1.34.123 to 1.34.128 (#4273)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 10:15:13 +02:00
dependabot[bot]
380011fd1e chore(deps): bump urllib3 from 1.26.18 to 1.26.19 (#4276)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 09:06:35 +02:00
dependabot[bot]
e97bf32a90 chore(deps): bump slack-sdk from 3.28.0 to 3.29.0 (#4270)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:50:52 +02:00
dependabot[bot]
ed18ea0ec4 chore(deps): bump docker/build-push-action from 5 to 6 (#4260)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:49:47 +02:00
dependabot[bot]
dc897986bc chore(deps): bump trufflesecurity/trufflehog from 3.78.0 to 3.78.1 (#4259)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:49:36 +02:00
Pepe Fagoaga
e296d6e5c1 fix: Some minor fixes in several parts (#4237)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-17 16:54:54 -04:00
Andoni Alonso
1252e6163b chore(docs): update checks reference link (#4258) 2024-06-17 15:30:39 -04:00
Pepe Fagoaga
8ad14c7833 fix(custom_checks): workaround to fix execution (#4256) 2024-06-17 14:13:18 -04:00
Pepe Fagoaga
61b9ecc214 chore(regions_update): Changes in regions for AWS services. (#4252)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-14 11:07:22 -04:00
Sergio Garcia
f8f2c19454 fix(readme): update note syntax (#4250) 2024-06-13 16:05:10 -04:00
Rubén De la Torre Vico
922438a7a0 chore(network): Reduce network watchers azure check findings (#4242) 2024-06-13 15:57:44 -04:00
Pepe Fagoaga
920f98c9ef chore(regions_update): Changes in regions for AWS services. (#4248)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-13 15:12:10 -04:00
Pepe Fagoaga
9b1ad5dd2e chore(regions_update): Changes in regions for AWS services. (#4246)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-12 07:56:53 -04:00
dependabot[bot]
d7a97b6e1d chore(deps): bump azure-identity from 1.16.0 to 1.16.1 (#4230)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 17:49:44 -04:00
dependabot[bot]
07db051d14 chore(deps): bump azure-identity from 1.16.0 to 1.16.1 (#4245)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 17:49:30 -04:00
dependabot[bot]
6fec85589d chore(deps-dev): bump pylint from 3.2.2 to 3.2.3 (#4229) 2024-06-11 12:59:21 -04:00
dependabot[bot]
f82aa1c3e1 chore(deps-dev): bump pytest from 8.2.1 to 8.2.2 (#4223) 2024-06-11 12:10:27 -04:00
Pepe Fagoaga
ee9faedbbe docs(developer-guide): How to fork the repo (#4238) 2024-06-11 12:08:54 -04:00
Pepe Fagoaga
e5dec1251d fix(s3): Send HTML also (#4240) 2024-06-11 12:08:13 -04:00
Pepe Fagoaga
692a39b08f chore(regions_update): Changes in regions for AWS services. (#4241) 2024-06-11 12:04:51 -04:00
Pepe Fagoaga
60b3523def chore(release): 4.2.4 (#4236) 2024-06-11 09:46:33 -04:00
Rubén De la Torre Vico
e1428bc1ff chore(iam): improve iam user console access check (#4211) 2024-06-11 12:45:29 +02:00
dependabot[bot]
0ff8b7e02a chore(deps): bump boto3 from 1.34.113 to 1.34.123 (#4235)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 11:56:02 +02:00
dependabot[bot]
7b84008046 chore(deps): bump google-api-python-client from 2.131.0 to 2.132.0 (#4227)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 11:02:08 +02:00
dependabot[bot]
30a092e2aa chore(deps): bump slack-sdk from 3.27.2 to 3.28.0 (#4228)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:54:38 +02:00
dependabot[bot]
11a7ff2977 chore(deps): bump trufflesecurity/trufflehog from 3.77.0 to 3.78.0 (#4222)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:51:43 +02:00
dependabot[bot]
12ba978361 chore(deps-dev): bump safety from 3.2.0 to 3.2.3 (#4232)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:22:41 +02:00
dependabot[bot]
42182a2b70 chore(deps): bump botocore from 1.34.118 to 1.34.123 (#4224)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 08:37:14 +02:00
dependabot[bot]
26eaec3101 chore(deps-dev): bump authlib from 1.3.0 to 1.3.1 (#4213)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-10 16:47:40 -04:00
Pepe Fagoaga
daf6194dee chore(regions_update): Changes in regions for AWS services. (#4210)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-08 16:14:04 +02:00
William Leung
e28300a1db fix(config/html): handle encoding issues and improve error handling in config and HTML file loading functions (#4203)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-07 12:51:01 -04:00
Rubén De la Torre Vico
1a225c334f chore(acm): Improve near-expiration certificates check (#4207)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-07 12:22:05 -04:00
Sergio Garcia
1d64ca4372 fix(compliance): check if custom check has compliance metadata (#4208) 2024-06-07 10:54:34 -04:00
Seiji Ujihira
2a139e3dc7 fix(custom): execute custom checks (#4202) 2024-06-07 10:01:28 -04:00
Pedro Martín
89d1712ff1 fix(dashboard): fix styles in overview page (#4204) 2024-06-07 09:46:54 -04:00
Pedro Martín
45ea9e1e79 fix(html): fix status from HTML outputs (#4206) 2024-06-07 09:36:21 -04:00
Pepe Fagoaga
4b46fe9788 chore(regions_update): Changes in regions for AWS services. (#4205)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-07 09:31:55 -04:00
Sergio Garcia
28b9e269b7 chore(version): update Prowler version (#4201) 2024-06-07 08:40:03 +02:00
Pedro Martín
0a41ec4746 fix(html): resolve html changing finding status (#4199) 2024-06-06 11:30:49 -04:00
Pedro Martín
e6472f9bfc fix(html): handle muted status to html outputs (#4195)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-06 10:06:02 -04:00
Pedro Martín
c033af6194 docs(readme): Update checks number (#4197) 2024-06-06 09:39:24 -04:00
sansns-aws
4d662dc446 feat(rds): Add security group event subscription check (#4130)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-06 08:45:50 -04:00
Sergio Garcia
0de10c4742 fix(s3): check if account is signed up (#4194) 2024-06-06 08:43:49 -04:00
Sergio Garcia
f7b7ce3b95 fix(glue): check if get dev endpoints call is supported (#4193) 2024-06-06 08:43:39 -04:00
Sergio Garcia
7b43b3d31e fix(elasticache): handle empty cluster subnets (#4192) 2024-06-06 08:43:30 -04:00
Sergio Garcia
84b9c442fe fix(rds): handle not existing parameter values (#4191) 2024-06-06 08:43:19 -04:00
Kay Agahd
a890895e8b docs(index): fix docu about output modes (#4187) 2024-06-05 10:10:11 -04:00
Pedro Martín
f3c6720a1c chore(version): update prowler version (#4190) 2024-06-05 09:11:50 -04:00
Kay Agahd
8c29bbfe4e docs(reporting): fix mapping of json-ocsf field cloud.account.type (#4186) 2024-06-04 17:17:28 -04:00
563 changed files with 38016 additions and 9293 deletions

.github/CODEOWNERS
View File

@@ -1 +1,5 @@
-* @prowler-cloud/prowler-oss @prowler-cloud/prowler-dev
+* @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
# To protect a repository fully against unauthorized changes, you also need to define an owner for the CODEOWNERS file itself.
# https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners#codeowners-and-branch-protection
/.github/ @prowler-cloud/sdk

View File

@@ -1,6 +1,5 @@
name: 🐞 Bug Report
description: Create a report to help us improve
title: "[Bug]: "
labels: ["bug", "status/needs-triage"]
body:

View File

@@ -1,8 +1,7 @@
name: 💡 Feature Request
description: Suggest an idea for this project
labels: ["feature-request", "status/needs-triage"]
body:
- type: textarea
id: Problem

View File

@@ -8,7 +8,7 @@ updates:
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
labels:
@@ -17,14 +17,14 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: v3
labels:
@@ -34,7 +34,7 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: v3
labels:

.github/labeler.yml
View File

@@ -29,3 +29,53 @@ github_actions:
cli:
  - changed-files:
      - any-glob-to-any-file: "cli/**"
mutelist:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/mutelist/**"
      - any-glob-to-any-file: "prowler/providers/aws/lib/mutelist/**"
      - any-glob-to-any-file: "prowler/providers/azure/lib/mutelist/**"
      - any-glob-to-any-file: "prowler/providers/gcp/lib/mutelist/**"
      - any-glob-to-any-file: "prowler/providers/kubernetes/lib/mutelist/**"
      - any-glob-to-any-file: "tests/lib/mutelist/**"
      - any-glob-to-any-file: "tests/providers/aws/lib/mutelist/**"
      - any-glob-to-any-file: "tests/providers/azure/lib/mutelist/**"
      - any-glob-to-any-file: "tests/providers/gcp/lib/mutelist/**"
      - any-glob-to-any-file: "tests/providers/kubernetes/lib/mutelist/**"
integration/s3:
  - changed-files:
      - any-glob-to-any-file: "prowler/providers/aws/lib/s3/**"
      - any-glob-to-any-file: "tests/providers/aws/lib/s3/**"
integration/slack:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/outputs/slack/**"
      - any-glob-to-any-file: "tests/lib/outputs/slack/**"
integration/security-hub:
  - changed-files:
      - any-glob-to-any-file: "prowler/providers/aws/lib/security_hub/**"
      - any-glob-to-any-file: "tests/providers/aws/lib/security_hub/**"
      - any-glob-to-any-file: "prowler/lib/outputs/asff/**"
      - any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/html:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/outputs/html/**"
      - any-glob-to-any-file: "tests/lib/outputs/html/**"
output/asff:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/outputs/asff/**"
      - any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/ocsf:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/outputs/ocsf/**"
      - any-glob-to-any-file: "tests/lib/outputs/ocsf/**"
output/csv:
  - changed-files:
      - any-glob-to-any-file: "prowler/lib/outputs/csv/**"
      - any-glob-to-any-file: "tests/lib/outputs/csv/**"

View File

@@ -2,11 +2,18 @@
Please include relevant motivation and context for this PR.
If fixes an issue please add it with `Fix #XXXX`
### Description
Please include a summary of the change and which issue is fixed. List any dependencies that are required for this change.
### Checklist
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
- [ ] Review if the code is being covered by tests.
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
### License

View File

@@ -118,7 +118,7 @@ jobs:
- name: Build and push container image (latest)
if: github.event_name == 'push'
-uses: docker/build-push-action@v5
+uses: docker/build-push-action@v6
with:
push: true
tags: |
@@ -130,7 +130,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
-uses: docker/build-push-action@v5
+uses: docker/build-push-action@v6
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context

View File

@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
-uses: trufflesecurity/trufflehog@v3.77.0
+uses: trufflesecurity/trufflehog@3.80.4
with:
path: ./
base: ${{ github.event.repository.default_branch }}

View File

@@ -73,7 +73,7 @@ jobs:
- name: Safety
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
-poetry run safety check --ignore 67599 --ignore 70612
+poetry run safety check --ignore 70612
- name: Vulture
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |

View File

@@ -97,7 +97,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
-entry: bash -c 'safety check --ignore 67599 --ignore 70612'
+entry: bash -c 'safety check --ignore 70612'
language: system
- id: vulture

View File

@@ -37,6 +37,9 @@
<a href="https://twitter.com/prowlercloud"><img alt="Twitter" src="https://img.shields.io/twitter/follow/prowlercloud?style=social"></a>
</p>
<hr>
<p align="center">
<img align="center" src="/docs/img/prowler-cli-quick.gif" width="100%" height="100%">
</p>
# Description
@@ -60,9 +63,9 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
-| AWS | 359 | 66 -> `prowler aws --list-services` | 28 -> `prowler aws --list-compliance` | 7 -> `prowler aws --list-categories` |
+| AWS | 385 | 67 -> `prowler aws --list-services` | 28 -> `prowler aws --list-compliance` | 7 -> `prowler aws --list-categories` |
| GCP | 77 | 13 -> `prowler gcp --list-services` | 1 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
-| Azure | 127 | 16 -> `prowler azure --list-services` | 2 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
+| Azure | 135 | 16 -> `prowler azure --list-services` | 2 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
| Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 1 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |
# 💻 Installation
@@ -74,7 +77,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
pip install prowler
prowler -v
```
-More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)
+>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)
## Containers
@@ -91,7 +94,7 @@ The container images are available here:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
-## From Github
+## From GitHub
Python >= 3.9, < 3.13 is required with pip and poetry:
@@ -102,8 +105,7 @@ poetry shell
poetry install
python prowler.py -v
```
-???+ note
-    If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
+> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.

View File

@@ -1,6 +0,0 @@
# CLI
To show the banner, use:
`python cli/cli.py banner`
## Listing
List services by provider.
`python cli/cli.py <provider> list-services`

View File

@@ -1,63 +0,0 @@
import typer

from prowler.lib.banner import print_banner
from prowler.lib.check.check import (
    list_fixers,
    list_services,
    print_fixers,
    print_services,
)

app = typer.Typer()
aws = typer.Typer(name="aws")
azure = typer.Typer(name="azure")
gcp = typer.Typer(name="gcp")
kubernetes = typer.Typer(name="kubernetes")

app.add_typer(aws, name="aws")
app.add_typer(azure, name="azure")
app.add_typer(gcp, name="gcp")
app.add_typer(kubernetes, name="kubernetes")


def list_resources(provider: str, resource_type: str):
    if resource_type == "services":
        print_services(list_services(provider))
    elif resource_type == "fixers":
        print_fixers(list_fixers(provider))


def create_list_commands(provider_typer: typer.Typer):
    provider_name = provider_typer.info.name

    @provider_typer.command(
        "list-services",
        help=f"List the {provider_name} services that are supported by Prowler.",
    )
    def list_services_command():
        list_resources(provider_name, "services")

    @provider_typer.command(
        "list-fixers",
        help=f"List the {provider_name} fixers that are supported by Prowler.",
    )
    def list_fixers_command():
        list_resources(provider_name, "fixers")


create_list_commands(aws)
create_list_commands(azure)
create_list_commands(gcp)
create_list_commands(kubernetes)


@app.command("banner", help="Prints the banner of the tool.")
def banner(show: bool = True):
    if show:
        print_banner(show)
    else:
        print("Banner is not shown.")


if __name__ == "__main__":
    app()

View File

@@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/

View File

@@ -0,0 +1,24 @@
apiVersion: v2
name: prowler
description: Prowler Security Tool Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"

View File

@@ -0,0 +1,78 @@
# prowler
![Version: 0.1.1](https://img.shields.io/badge/Version-0.1.1-informational?style=flat-square) ![Type: application](https://img.shields.io/badge/Type-application-informational?style=flat-square) ![AppVersion: 1.16.0](https://img.shields.io/badge/AppVersion-1.16.0-informational?style=flat-square)
Prowler Security Tool Helm chart for Kubernetes
# Prowler Helm Chart Deployment
This guide provides step-by-step instructions for deploying the Prowler Helm chart.
## Prerequisites
Before you begin, ensure you have the following:
1. A running Kubernetes cluster.
2. Helm installed on your local machine. If you don't have Helm installed, you can follow the [Helm installation guide](https://helm.sh/docs/intro/install/).
3. Proper access to your Kubernetes cluster (e.g., `kubectl` is configured and working).
## Deployment Steps
### 1. Clone the Repository
Clone the repository containing the Helm chart to your local machine.
```sh
git clone git@github.com:prowler-cloud/prowler.git
cd prowler/contrib/k8s/helm
```
### 2. Deploy the Helm Chart
```sh
helm install prowler .
```
### 3. Verify the Deployment
```sh
helm status prowler
kubectl get all -n prowler-ns
```
### 4. Clean Up
To uninstall the Helm release and clean up the resources, run:
```sh
helm uninstall prowler
kubectl delete namespace prowler-ns
```
## Values
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| clusterRole.name | string | `"prowler-read-cluster"` | |
| clusterRoleBinding.name | string | `"prowler-read-cluster-binding"` | |
| configMap.name | string | `"prowler-hostpaths"` | |
| configMapData.etcCniNetd | string | `"/etc/cni/net.d"` | |
| configMapData.etcKubernetes | string | `"/etc/kubernetes"` | |
| configMapData.etcSystemd | string | `"/etc/systemd"` | |
| configMapData.libSystemd | string | `"/lib/systemd"` | |
| configMapData.optCniBin | string | `"/opt/cni/bin"` | |
| configMapData.usrBin | string | `"/usr/bin"` | |
| configMapData.varLibCni | string | `"/var/lib/cni"` | |
| configMapData.varLibEtcd | string | `"/var/lib/etcd"` | |
| configMapData.varLibKubeControllerManager | string | `"/var/lib/kube-controller-manager"` | |
| configMapData.varLibKubeScheduler | string | `"/var/lib/kube-scheduler"` | |
| configMapData.varLibKubelet | string | `"/var/lib/kubelet"` | |
| cronjob.hostPID | bool | `true` | |
| cronjob.name | string | `"prowler"` | |
| cronjob.schedule | string | `"0 0 * * *"` | |
| image.pullPolicy | string | `"Always"` | |
| image.repository | string | `"toniblyx/prowler"` | |
| image.tag | string | `"stable"` | |
| namespace.name | string | `"prowler"` | |
| serviceAccount.name | string | `"prowler"` | |
----------------------------------------------
Autogenerated from chart metadata using [helm-docs v1.11.3](https://github.com/norwoodj/helm-docs/releases/v1.11.3)
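Any of these values can be overridden at install time with Helm's standard `--set` flag; for instance (illustrative values):
```sh
helm install prowler . \
  --set cronjob.schedule="0 6 * * *" \
  --set image.tag=latest
```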

View File

@@ -0,0 +1,11 @@
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
name: {{ .Values.clusterRole.name }}
rules:
- apiGroups: [""]
resources: ["pods", "configmaps", "nodes", "namespaces"]
verbs: ["get", "list", "watch"]
- apiGroups: ["rbac.authorization.k8s.io"]
resources: ["clusterrolebindings", "rolebindings", "clusterroles", "roles"]
verbs: ["get", "list", "watch"]

View File

@@ -0,0 +1,18 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: {{ .Values.configMap.name }}
namespace: {{ .Values.namespace.name }}
data:
varLibCni: "{{ .Values.configMap.data.varLibCni }}"
varLibEtcd: "{{ .Values.configMap.data.varLibEtcd }}"
varLibKubelet: "{{ .Values.configMap.data.varLibKubelet }}"
varLibKubeScheduler: "{{ .Values.configMap.data.varLibKubeScheduler }}"
varLibKubeControllerManager: "{{ .Values.configMap.data.varLibKubeControllerManager }}"
etcSystemd: "{{ .Values.configMap.data.etcSystemd }}"
libSystemd: "{{ .Values.configMap.data.libSystemd }}"
etcKubernetes: "{{ .Values.configMap.data.etcKubernetes }}"
usrBin: "{{ .Values.configMap.data.usrBin }}"
etcCniNetd: "{{ .Values.configMap.data.etcCniNetd }}"
optCniBin: "{{ .Values.configMap.data.optCniBin }}"
srvKubernetes: "{{ .Values.configMap.data.srvKubernetes }}"

View File

@@ -0,0 +1,42 @@
apiVersion: batch/v1
kind: CronJob
metadata:
name: {{ .Values.cronjob.name }}
namespace: {{ .Values.namespace.name }}
spec:
schedule: "{{ .Values.cronjob.schedule }}"
jobTemplate:
spec:
template:
metadata:
labels:
app: prowler
spec:
serviceAccountName: {{ .Values.serviceAccount.name }}
containers:
- name: prowler
image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
command: ["prowler"]
args: ["kubernetes", "-z", "-b"]
imagePullPolicy: {{ .Values.image.pullPolicy }}
volumeMounts:
{{- range $key, $value := .Values.configMap.data }}
{{- if and (eq $.Values.clusterType "gke") (eq $key "srvKubernetes") }}
{{- else }}
- name: {{ $key | lower }}
mountPath: {{ $value }}
readOnly: true
{{- end }}
{{- end }}
hostPID: {{ .Values.cronjob.hostPID }}
restartPolicy: Never
volumes:
{{- range $key, $value := .Values.configMap.data }}
{{- if and (eq $.Values.clusterType "gke") (eq $key "srvKubernetes") }}
{{- else }}
- name: {{ $key | lower }}
hostPath:
path: {{ $value }}
{{- end }}
{{- end }}

View File

@@ -0,0 +1,4 @@
apiVersion: v1
kind: Namespace
metadata:
name: {{ .Values.namespace.name }}

View File

@@ -0,0 +1,12 @@
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: {{ .Values.clusterRoleBinding.name }}
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: {{ .Values.clusterRole.name }}
subjects:
- kind: ServiceAccount
name: {{ .Values.serviceAccount.name }}
namespace: {{ .Values.namespace.name }}

View File

@@ -0,0 +1,5 @@
apiVersion: v1
kind: ServiceAccount
metadata:
name: {{ .Values.serviceAccount.name }}
namespace: {{ .Values.namespace.name }}

View File

@@ -0,0 +1,40 @@
namespace:
name: prowler-ns
cronjob:
name: prowler
schedule: "0 0 * * *"
hostPID: true
serviceAccount:
name: prowler-sa
image:
repository: toniblyx/prowler
tag: stable
pullPolicy: Always
clusterType:
configMap:
name: prowler-config
data:
varLibCni: "/var/lib/cni"
varLibEtcd: "/var/lib/etcd"
varLibKubelet: "/var/lib/kubelet"
varLibKubeScheduler: "/var/lib/kube-scheduler"
varLibKubeControllerManager: "/var/lib/kube-controller-manager"
etcSystemd: "/etc/systemd"
libSystemd: "/lib/systemd"
etcKubernetes: "/etc/kubernetes"
usrBin: "/usr/bin"
etcCniNetd: "/etc/cni/net.d"
optCniBin: "/opt/cni/bin"
srvKubernetes: "/srv/kubernetes"
clusterRole:
name: prowler-read-cluster
clusterRoleBinding:
name: prowler-read-cluster-binding
roleName: prowler-read-cluster
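Note that `clusterType` is left empty by default. The CronJob template above skips the `/srv/kubernetes` host path when `clusterType` equals `gke`, so on a GKE cluster you would install the chart with something like (a sketch):
```sh
helm install prowler . --set clusterType=gke
```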

View File

@@ -21,7 +21,7 @@ print(
f"{Fore.GREEN}Loading all CSV files from the folder {folder_path_overview} ...\n{Style.RESET_ALL}"
)
cli.show_server_banner = lambda *x: click.echo(
f"{Fore.YELLOW}NOTE:{Style.RESET_ALL} If you are a {Fore.GREEN}{Style.BRIGHT}Prowler SaaS{Style.RESET_ALL} customer and you want to use your data from your S3 bucket,\nrun: `{orange_color}aws s3 cp s3://<your-bucket>/output/csv ./output --recursive{Style.RESET_ALL}`\nand then run `prowler dashboard` again to load the new files."
f"{Fore.YELLOW}NOTE:{Style.RESET_ALL} If you are using {Fore.GREEN}{Style.BRIGHT}Prowler SaaS{Style.RESET_ALL} with the S3 integration or that integration \nfrom {Fore.CYAN}{Style.BRIGHT}Prowler Open Source{Style.RESET_ALL} and you want to use your data from your S3 bucket,\nrun: `{orange_color}aws s3 cp s3://<your-bucket>/output/csv ./output --recursive{Style.RESET_ALL}`\nand then run `prowler dashboard` again to load the new files."
)
# Initialize the app - incorporate css

View File

@@ -21,7 +21,7 @@ muted_manual_color = "#b33696"
critical_color = "#951649"
high_color = "#e11d48"
medium_color = "#ee6f15"
low_color = "#f9f5e6"
low_color = "#fcf45d"
informational_color = "#3274d9"
# Folder output path

View File

@@ -945,7 +945,7 @@ def filter_data(
color_mapping_status = {
"FAIL": fail_color,
"PASS": pass_color,
"INFO": info_color,
"LOW": info_color,
"MANUAL": manual_color,
"WARNING": muted_fail_color,
"MUTED (FAIL)": muted_fail_color,
@@ -1564,7 +1564,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
data.get(
"FINDING_UID", ""
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1644,28 +1647,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"STATUS_EXTENDED",
"",
)
)
),
],
style={"display": "flex"},
),
html.Div(
[
html.P(
html.Strong(
"Risk: ",
style={
"margin-right": "5px"
},
)
),
html.P(
str(
data.get(
"RISK",
"",
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1689,7 +1674,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
)
),
html.P(
str(data.get("RISK", ""))
str(data.get("RISK", "")),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1744,7 +1732,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"REMEDIATION_RECOMMENDATION_TEXT",
"",
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1772,7 +1763,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"",
)
),
style={"color": "#3182ce"},
style={
"color": "#3182ce",
"margin-left": "5px",
},
),
],
style={"display": "flex"},

View File

@@ -319,7 +319,7 @@ Each Prowler check has metadata associated which is stored at the same level of
For the Remediation Code we use the following knowledge base to fill it:
- Official documentation for the provider
- https://docs.bridgecrew.io
- https://docs.prowler.com/checks/checks-index
- https://www.trendmicro.com/cloudoneconformity
- https://github.com/cloudmatos/matos/tree/master/remediations

View File

@@ -1,7 +1,11 @@
# Debugging
Debugging in Prowler makes things easier!
If you are developing Prowler, it's possible that you will encounter some situations where you have to inspect the code in depth to fix some unexpected issues during the execution. To do that, if you are using VSCode you can run the code using the integrated debugger. Please, refer to this [documentation](https://code.visualstudio.com/docs/editor/debugging) for guidance about the debugger in VSCode.
If you are developing Prowler, it's possible that you will encounter some situations where you have to inspect the code in depth to fix some unexpected issues during the execution.
## VSCode
In VSCode you can run the code using the integrated debugger. Please, refer to this [documentation](https://code.visualstudio.com/docs/editor/debugging) for guidance about the debugger in VSCode.
The following file is an example of the [debugging configuration](https://code.visualstudio.com/docs/editor/debugging#_launch-configurations) file that you can add to [Visual Studio Code](https://code.visualstudio.com/).
This file should be inside the *.vscode* folder and its name has to be *launch.json*:
@@ -11,31 +15,62 @@ This file should inside the *.vscode* folder and its name has to be *launch.json
"version": "0.2.0",
"configurations": [
{
"name": "Python: Current File",
"type": "python",
"name": "Debug AWS Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"aws",
"-f",
"eu-west-1",
"--service",
"cloudwatch",
"--log-level",
"ERROR",
"-p",
"dev",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Python: Debug Tests",
"type": "python",
"name": "Debug Azure Check",
"type": "debugpy",
"request": "launch",
"program": "${file}",
"purpose": [
"debug-test"
"program": "prowler.py",
"args": [
"azure",
"--sp-env-auth",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Debug GCP Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"gcp",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Debug K8s Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"kubernetes",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false

View File

@@ -4,10 +4,14 @@ You can extend Prowler Open Source in many different ways, in most cases you wil
## Get the code and install all dependencies
First of all, you need a version of Python 3.9 or higher and also pip installed to be able to install all dependencies required. Once that is satisfied go a head and clone the repo:
First of all, you need a version of Python 3.9 or higher and also `pip` installed to be able to install all dependencies required.
Then, to start working with the Prowler GitHub repository, you need to fork it to be able to propose changes for new features, bug fixes, etc. To fork the Prowler repo, please refer to [this guide](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo?tool=webui#forking-a-repository).
Once that is satisfied go ahead and clone your forked repo:
```
git clone https://github.com/prowler-cloud/prowler
git clone https://github.com/<your-github-user>/prowler
cd prowler
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`:
@@ -44,7 +48,10 @@ Before we merge any of your pull requests we pass checks to the code, we use the
You can see all dependencies in file `pyproject.toml`.
Moreover, you would need to install [`TruffleHog`](https://github.com/trufflesecurity/trufflehog) to check for secrets in the code. You can install it using the official installation guide [here](https://github.com/trufflesecurity/trufflehog?tab=readme-ov-file#floppy_disk-installation).
Moreover, you will need to install the latest version of [`TruffleHog`](https://github.com/trufflesecurity/trufflehog) to check for secrets in the code. You can install it using the official installation guide [here](https://github.com/trufflesecurity/trufflehog?tab=readme-ov-file#floppy_disk-installation).
???+ note
If you have any trouble when committing to the Prowler repository, add the `--no-verify` flag to the `git commit` command.
## Pull Request Checklist

View File

@@ -23,8 +23,8 @@ The Prowler's service structure is the following and the way to initialise it is
All the Prowler providers' services inherit from a base class that depends on the provider used.
- [AWS Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/service/service.py)
- [GCP Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/service/service.py)
- [Azure Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/service/service.py)
- [GCP Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/service/service.py)
- [Azure Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/service/service.py)
- [Kubernetes Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/kubernetes/lib/service/service.py)
Each class is used to initialize the credentials and the API clients to be used in the service. If any threading is used, it must be coded there.
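As a rough illustration of that pattern, a hypothetical AWS service skeleton could look like the following (the exact base-class constructor signature lives in the linked `service.py`, so treat this as a sketch rather than the definitive API):
```python
# Hypothetical skeleton of a Prowler AWS service (illustrative only).
from prowler.providers.aws.lib.service.service import AWSService


class Example(AWSService):
    def __init__(self, provider):
        # The base class initializes the credentials and the API client(s)
        super().__init__(__class__.__name__, provider)
        # Resources collected by the service for the checks to consume
        self.resources = []
```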

View File

@@ -40,10 +40,10 @@ If your IAM entity enforces MFA you can use `--mfa` and Prowler will ask you to
Prowler for Azure supports the following authentication types:
- Service principal authentication by environment variables (Enterprise Application)
- [Service principal application](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) by environment variables (recommended)
- Current az cli credentials stored
- Interactive browser authentication
- Managed identity authentication
- [Managed identity](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
### Service Principal authentication
@@ -56,6 +56,8 @@ export AZURE_CLIENT_SECRET="XXXXXXX"
```
If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution is going to fail.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md) section to create a service principal.
### AZ CLI / Browser / Managed Identity authentication
The other three cases do not need additional configuration; `--az-cli-auth` and `--managed-identity-auth` are automated options. To use `--browser-auth` the user needs to authenticate against Azure using the default browser to start the scan, and `tenant-id` is also required.
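For example (the tenant ID value is a placeholder):
```console
prowler azure --az-cli-auth
prowler azure --managed-identity-auth
prowler azure --browser-auth --tenant-id "XXXXXXXX"
```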
@@ -64,55 +66,22 @@ The other three cases does not need additional configuration, `--az-cli-auth` an
To use each one you need to pass the proper flag to the execution. Prowler for Azure handles two types of permission scopes, which are:
- **Microsoft Entra ID permissions**: Used to retrieve metadata from the identity assumed by Prowler and to run specific Entra checks (not mandatory to have access to execute the tool). The permissions required by the tool are the following:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
![EntraID Permissions](../img/AAD-permissions.png)
- **Subscription scope permissions**: Required to launch the checks against your resources, mandatory to launch the tool. It is required to add the following RBAC builtin roles per subscription to the entity that is going to be assumed by the tool:
- `Reader`
- `ProwlerRole` (custom role defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json))
To assign the permissions, follow the instructions in the [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions) section and the [Azure subscriptions permissions](../tutorials/azure/subscriptions.md#assigning-proper-permissions) section, respectively.
#### Checks that require ProwlerRole
The following checks require the `ProwlerRole` custom role in order to be executed. If you want to run them, make sure you have assigned the role to the identity that is going to be assumed by Prowler:
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`
## Google Cloud


View File

@@ -212,10 +212,10 @@ prowler <provider>
If you miss the former output you can use `--verbose`, but Prowler v4 is smoking fast, so you won't see much ;)
By default, Prowler will generate a CSV, JSON and HTML reports, however you can generate a JSON-ASFF (used by AWS Security Hub) report with `-M` or `--output-modes`:
By default, Prowler generates CSV, JSON-OCSF and HTML reports. However, you can generate a JSON-ASFF report (used by AWS Security Hub) with `-M` or `--output-modes`:
```console
prowler <provider> -M csv json json-asff html
prowler <provider> -M csv json-asff json-ocsf html
```
The HTML report will be located in the output directory along with the other files, and it will look like this:

View File

@@ -85,7 +85,7 @@ prowler --security-hub --region eu-west-1
```
???+ note
It is recommended to send only fails to Security Hub and that is possible adding `-q/--quiet` to the command. You can use, instead of the `-q/--quiet` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub.
It is recommended to send only fails to Security Hub and that is possible adding `--status FAIL` to the command. You can use, instead of the `--status FAIL` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub.
Since Prowler performs checks in all regions by default, you may need to filter by region when running the Security Hub integration, as shown in the example above. Remember to enable Security Hub in the region or regions you need by calling `aws securityhub enable-security-hub --region <region>` and run Prowler with the option `-f/--region <region>` (if no region is given, it will try to push findings to the hubs in all regions). Prowler will send findings to the Security Hub in the region where the scanned resource is located.
@@ -121,13 +121,13 @@ prowler --security-hub --role arn:aws:iam::123456789012:role/ProwlerExecutionRol
## Send only failed findings to Security Hub
When using the **AWS Security Hub** integration you can send only the `FAIL` findings generated by **Prowler**. Therefore, the **AWS Security Hub** usage costs eventually would be lower. To follow that recommendation you could add the `-q/--quiet` flag to the Prowler command:
When using the **AWS Security Hub** integration you can send only the `FAIL` findings generated by **Prowler**. Therefore, the **AWS Security Hub** usage costs eventually would be lower. To follow that recommendation you could add the `--status FAIL` flag to the Prowler command:
```sh
prowler --security-hub --quiet
prowler --security-hub --status FAIL
```
You can use, instead of the `-q/--quiet` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub:
You can use, instead of the `--status FAIL` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub:
```sh
prowler --security-hub --send-sh-only-fails

View File

@@ -0,0 +1,34 @@
# How to create Prowler Service Principal
To allow Prowler to assume an identity and start the scan with the required privileges, it is necessary to create a Service Principal. To create one, follow these steps:
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"
3. Once there, in the menu bar click on "+ New registration" to register a new application
4. Fill in the "Name" field, select the "Supported account types" and click on "Register". You will be redirected to the application page.
5. Once in the application page, in the left menu bar, select "Certificates & secrets"
6. In the "Certificates & secrets" view, click on "+ New client secret"
7. Fill the "Description" and "Expires" fields and click on "Add"
8. Copy the value of the secret; it is going to be used as the `AZURE_CLIENT_SECRET` environment variable.
![Register an Application page](../../img/create-sp.gif)
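With the application registered, you can export the Service Principal credentials as the environment variables used by `--sp-env-auth` (a sketch: the values are placeholders, and `AZURE_CLIENT_ID`/`AZURE_TENANT_ID` are taken from the application's overview page, following the standard Azure SDK variable names):
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXX"
```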
## Assigning the proper permissions
To allow Prowler to retrieve metadata from the assumed identity and run specific Entra checks, you need to assign the following permissions:
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"
3. Once there, select the application that you have created
4. In the left menu bar, select "API permissions"
5. Then click on "+ Add a permission" and select "Microsoft Graph"
6. Once in the "Microsoft Graph" view, select "Application permissions"
7. Search for "Directory", "Policy" and "UserAuthenticationMethod" and select the following permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
8. Click on "Add permissions" to apply the new permissions.
9. Finally, click on "Grant admin consent for [your tenant]" to apply the permissions.
![EntraID Permissions](../../img/AAD-permissions.png)

View File

@@ -1,6 +1,6 @@
# Azure subscriptions scope
By default, Prowler is multisubscription, which means that it is going to scan all the subscriptions it is able to list. If you only assign permissions to one subscription, it is going to scan a single one.
Prowler also has the ability to limit the scan to a set of subscriptions passed as an input argument:
```console
@@ -8,3 +8,36 @@ prowler azure --az-cli-auth --subscription-ids <subscription ID 1> <subscription
```
Where you can pass from 1 up to N subscriptions to be scanned.
## Assigning proper permissions
Regarding the subscription scope, Prowler by default scans all subscriptions that it is able to list, so it is necessary to add the `Reader` RBAC built-in role per subscription or management group (recommended for multiple subscriptions, see the [next section](#recommendation-for-multiple-subscriptions)) to the entity that will be adopted by the tool.
To assign this role, follow these instructions:
1. Access your subscription, then select your subscription.
2. Select "Access control (IAM)".
3. In the overview, select "Roles".
4. Click on "+ Add" and select "Add role assignment".
5. In the search bar, type `Reader`, select it and click on "Next".
6. In the Members tab, click on "+ Select members" and add the members you want to assign this role.
7. Click on "Review + assign" to apply the new role.
![Add reader role to subscription](../../img/add-reader-role.gif)
Moreover, some additional read-only permissions are needed for certain checks. For those checks that are not covered by built-in roles, we use a custom role, defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json). Once the custom role is created, repeat the steps mentioned above to assign the new `ProwlerRole` to an identity.
## Recommendation for multiple subscriptions
When scanning multiple subscriptions, it can be tedious to create and assign roles for each one. For this reason, we recommend the usage of *[management groups](https://learn.microsoft.com/en-us/azure/governance/management-groups/overview)* to group all subscriptions that are going to be audited by Prowler.
To do this properly, you have to [create a new management group](https://learn.microsoft.com/en-us/azure/governance/management-groups/create-management-group-portal) and add all roles in the same way as was done at the subscription scope.
![Create management group](../../img/create-management-group.gif)
Once the management group is properly set up, you can add all the subscriptions that you want to audit.
![Add subscription to management group](../../img/add-sub-to-management-group.gif)
???+ note
By default, `prowler` will scan all subscriptions in the Azure tenant, use the flag `--subscription-id` to specify the subscriptions to be scanned.

View File

@@ -29,19 +29,23 @@ The following list includes all the AWS checks with configurable variables that
| `organizations_delegated_administrators` | `organizations_trusted_delegated_administrators` | List of Strings |
| `ecr_repositories_scan_vulnerabilities_in_latest_image` | `ecr_repository_vulnerability_minimum_severity` | String |
| `trustedadvisor_premium_support_plan_subscribed` | `verify_premium_support_plans` | Boolean |
| `config_recorder_all_regions_enabled` | `mute_non_default_regions` | Boolean |
| `drs_job_exist` | `mute_non_default_regions` | Boolean |
| `guardduty_is_enabled` | `mute_non_default_regions` | Boolean |
| `securityhub_enabled` | `mute_non_default_regions` | Boolean |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_entropy` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_minutes` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_actions` | List of Strings |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_entropy` | Integer |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_minutes` | Integer |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_actions` | List of Strings |
| `rds_instance_backup_enabled` | `check_rds_instance_replicas` | Boolean |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_interface_types` | List of Strings |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_instance_owners` | List of Strings |
| `acm_certificates_expiration_check` | `days_to_expire_threshold` | Integer |
| `eks_control_plane_logging_all_types_enabled` | `eks_required_log_types` | List of Strings |
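To change any of these variables, you can point Prowler to a custom configuration file (assuming the standard `--config-file` flag):
```console
prowler aws --config-file ./custom_config.yaml
```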
## Azure
### Configurable Checks
@@ -80,10 +84,20 @@ The following list includes all the Azure checks with configurable variables tha
```yaml title="config.yaml"
# AWS Configuration
aws:
# AWS Global Configuration
# aws.mute_non_default_regions --> Mute Failed Findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
# aws.mute_non_default_regions --> Set to True to mute failed findings in non-default regions for AccessAnalyzer, GuardDuty, SecurityHub, DRS and Config
mute_non_default_regions: False
# If you want to mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w mutelist.yaml`:
# Mutelist:
# Accounts:
# "*":
# Checks:
# "*":
# Regions:
# - "ap-southeast-1"
# - "ap-southeast-2"
# Resources:
# - "*"
# AWS IAM Configuration
# aws.iam_user_accesskey_unused --> CIS recommends 45 days
@@ -93,6 +107,7 @@ aws:
# AWS EC2 Configuration
# aws.ec2_elastic_ip_shodan
# TODO: create common config
shodan_api_key: null
# aws.ec2_securitygroup_with_many_ingress_egress_rules --> by default is 50 rules
max_security_group_rules: 50
@@ -102,13 +117,13 @@ aws:
# allowed network interface types for security groups open to the Internet
ec2_allowed_interface_types:
[
"api_gateway_managed",
"vpc_endpoint",
]
# allowed network interface owners for security groups open to the Internet
ec2_allowed_instance_owners:
[
"amazon-elb"
]
# AWS VPC Configuration (vpc_endpoint_connections_trust_boundaries, vpc_endpoint_services_allowed_principals_trust_boundaries)
@@ -133,205 +148,234 @@ aws:
# aws.awslambda_function_using_supported_runtimes
obsolete_lambda_runtimes:
[
"java8",
"go1.x",
"provided",
"python3.6",
"python2.7",
"python3.7",
"nodejs4.3",
"nodejs4.3-edge",
"nodejs6.10",
"nodejs",
"nodejs8.10",
"nodejs10.x",
"nodejs12.x",
"nodejs14.x",
"dotnet5.0",
"dotnetcore1.0",
"dotnetcore2.0",
"dotnetcore2.1",
"dotnetcore3.1",
"ruby2.5",
"ruby2.7",
]
# AWS Organizations
# organizations_scp_check_deny_regions
# organizations_enabled_regions: [
# 'eu-central-1',
# 'eu-west-1',
# aws.organizations_scp_check_deny_regions
# aws.organizations_enabled_regions: [
# "eu-central-1",
# "eu-west-1",
# "us-east-1"
# ]
organizations_enabled_regions: []
organizations_trusted_delegated_administrators: []
# AWS ECR
# ecr_repositories_scan_vulnerabilities_in_latest_image
# aws.ecr_repositories_scan_vulnerabilities_in_latest_image
# CRITICAL
# HIGH
# MEDIUM
ecr_repository_vulnerability_minimum_severity: "MEDIUM"
# AWS Trusted Advisor
# trustedadvisor_premium_support_plan_subscribed
# aws.trustedadvisor_premium_support_plan_subscribed
verify_premium_support_plans: True
# AWS CloudTrail Configuration
# aws.cloudtrail_threat_detection_privilege_escalation
threat_detection_privilege_escalation_entropy: 0.7 # Percentage of actions found to decide if it is an privilege_escalation attack event, by default is 0.7 (70%)
threat_detection_privilege_escalation_threshold: 0.1 # Percentage of actions found to decide if it is a privilege_escalation attack event, by default is 0.1 (10%)
threat_detection_privilege_escalation_minutes: 1440 # Past minutes to search from now for privilege_escalation attacks, by default is 1440 minutes (24 hours)
threat_detection_privilege_escalation_actions:
[
"AddPermission",
"AddRoleToInstanceProfile",
"AddUserToGroup",
"AssociateAccessPolicy",
"AssumeRole",
"AttachGroupPolicy",
"AttachRolePolicy",
"AttachUserPolicy",
"ChangePassword",
"CreateAccessEntry",
"CreateAccessKey",
"CreateDevEndpoint",
"CreateEventSourceMapping",
"CreateFunction",
"CreateGroup",
"CreateJob",
"CreateKeyPair",
"CreateLoginProfile",
"CreatePipeline",
"CreatePolicyVersion",
"CreateRole",
"CreateStack",
"DeleteRolePermissionsBoundary",
"DeleteRolePolicy",
"DeleteUserPermissionsBoundary",
"DeleteUserPolicy",
"DetachRolePolicy",
"DetachUserPolicy",
"GetCredentialsForIdentity",
"GetId",
"GetPolicyVersion",
"GetUserPolicy",
"Invoke",
"ModifyInstanceAttribute",
"PassRole",
"PutGroupPolicy",
"PutPipelineDefinition",
"PutRolePermissionsBoundary",
"PutRolePolicy",
"PutUserPermissionsBoundary",
"PutUserPolicy",
"ReplaceIamInstanceProfileAssociation",
"RunInstances",
"SetDefaultPolicyVersion",
"UpdateAccessKey",
"UpdateAssumeRolePolicy",
"UpdateDevEndpoint",
"UpdateEventSourceMapping",
"UpdateFunctionCode",
"UpdateJob",
"UpdateLoginProfile",
]
# aws.cloudtrail_threat_detection_enumeration
threat_detection_enumeration_entropy: 0.7 # Percentage of actions found to decide if it is an enumeration attack event, by default is 0.7 (70%)
threat_detection_enumeration_threshold: 0.1 # Percentage of actions found to decide if it is an enumeration attack event, by default is 0.1 (10%)
threat_detection_enumeration_minutes: 1440 # Past minutes to search from now for enumeration attacks, by default is 1440 minutes (24 hours)
threat_detection_enumeration_actions:
[
"DescribeAccessEntry",
"DescribeAccountAttributes",
"DescribeAvailabilityZones",
"DescribeBundleTasks",
"DescribeCarrierGateways",
"DescribeClientVpnRoutes",
"DescribeCluster",
"DescribeDhcpOptions",
"DescribeFlowLogs",
"DescribeImages",
"DescribeInstanceAttribute",
"DescribeInstanceInformation",
"DescribeInstanceTypes",
"DescribeInstances",
"DescribeInstances",
"DescribeKeyPairs",
"DescribeLogGroups",
"DescribeLogStreams",
"DescribeOrganization",
"DescribeRegions",
"DescribeSecurityGroups",
"DescribeSnapshotAttribute",
"DescribeSnapshotTierStatus",
"DescribeSubscriptionFilters",
"DescribeTransitGatewayMulticastDomains",
"DescribeVolumes",
"DescribeVolumesModifications",
"DescribeVpcEndpointConnectionNotifications",
"DescribeVpcs",
"GetAccount",
"GetAccountAuthorizationDetails",
"GetAccountSendingEnabled",
"GetBucketAcl",
"GetBucketLogging",
"GetBucketPolicy",
"GetBucketReplication",
"GetBucketVersioning",
"GetCallerIdentity",
"GetCertificate",
"GetConsoleScreenshot",
"GetCostAndUsage",
"GetDetector",
"GetEbsDefaultKmsKeyId",
"GetEbsEncryptionByDefault",
"GetFindings",
"GetFlowLogsIntegrationTemplate",
"GetIdentityVerificationAttributes",
"GetInstances",
"GetIntrospectionSchema",
"GetLaunchTemplateData",
"GetLaunchTemplateData",
"GetLogRecord",
"GetParameters",
"GetPolicyVersion",
"GetPublicAccessBlock",
"GetQueryResults",
"GetRegions",
"GetSMSAttributes",
"GetSMSSandboxAccountStatus",
"GetSendQuota",
"GetTransitGatewayRouteTableAssociations",
"GetUserPolicy",
"HeadObject",
"ListAccessKeys",
"ListAccounts",
"ListAllMyBuckets",
"ListAssociatedAccessPolicies",
"ListAttachedUserPolicies",
"ListClusters",
"ListDetectors",
"ListDomains",
"ListFindings",
"ListHostedZones",
"ListIPSets",
"ListIdentities",
"ListInstanceProfiles",
"ListObjects",
"ListOrganizationalUnitsForParent",
"ListOriginationNumbers",
"ListPolicyVersions",
"ListRoles",
"ListRoles",
"ListRules",
"ListServiceQuotas",
"ListSubscriptions",
"ListTargetsByRule",
"ListTopics",
"ListUsers",
"LookupEvents",
"Search",
]
# AWS RDS Configuration
# aws.rds_instance_backup_enabled
# Whether to check RDS instance replicas or not
check_rds_instance_replicas: False
# AWS ACM Configuration
# aws.acm_certificates_expiration_check
days_to_expire_threshold: 7
# AWS EKS Configuration
# aws.eks_control_plane_logging_all_types_enabled
# EKS control plane logging types that must be enabled
eks_required_log_types:
[
"api",
"audit",
"authenticator",
"controllerManager",
"scheduler",
]
# Azure Configuration
azure:
# Azure Network Configuration
# azure.network_public_ip_shodan
# TODO: create common config
shodan_api_key: null
# Azure App Configuration
# Azure App Service
# azure.app_ensure_php_version_is_latest
php_latest_version: "8.2"
# azure.app_ensure_python_version_is_latest
@@ -345,4 +389,34 @@ gcp:
# gcp.compute_public_address_shodan
shodan_api_key: null
# Kubernetes Configuration
kubernetes:
# Kubernetes API Server
# kubernetes.apiserver_audit_log_maxbackup_set
audit_log_maxbackup: 10
# kubernetes.apiserver_audit_log_maxsize_set
audit_log_maxsize: 100
# kubernetes.apiserver_audit_log_maxage_set
audit_log_maxage: 30
# kubernetes.apiserver_strong_ciphers_only
apiserver_strong_ciphers:
[
"TLS_AES_128_GCM_SHA256",
"TLS_AES_256_GCM_SHA384",
"TLS_CHACHA20_POLY1305_SHA256",
]
# Kubelet
# kubernetes.kubelet_strong_ciphers_only
kubelet_strong_ciphers:
[
"TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
"TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305",
"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
"TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305",
"TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
"TLS_RSA_WITH_AES_256_GCM_SHA384",
"TLS_RSA_WITH_AES_128_GCM_SHA256",
]
```

View File

@@ -54,7 +54,7 @@ CustomChecksMetadata:
RelatedUrl: https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html
Remediation:
Code:
CLI: aws s3api put-bucket-versioning --bucket <bucket-name> --versioning-configuration Status=Enabled
CLI: aws s3api put-bucket-versioning --bucket <bucket-name> --versioning-configuration Status=Enabled,MFADelete=Enabled
NativeIaC: https://aws.amazon.com/es/s3/features/versioning/
Other: https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html
Terraform: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket_versioning

View File

@@ -81,7 +81,7 @@ def get_table(data):
## S3 Integration
If you are a Prowler Saas customer and you want to use your data from your S3 bucket, you can run:
If you are using Prowler SaaS with the S3 integration or that integration from Prowler Open Source and you want to use your data from your S3 bucket, you can run:
```sh
aws s3 cp s3://<your-bucket>/output/csv ./output --recursive

View File

@@ -25,7 +25,17 @@ Prowler will follow the same credentials search as [Google authentication librar
Those credentials must be associated with a user or service account that has the proper permissions to run all checks. To make sure of this, add the `Viewer` role to the member associated with the credentials.
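For example, one common way to provide Application Default Credentials locally is with the `gcloud` CLI (a sketch assuming the standard Google Cloud SDK tooling):
```console
gcloud auth application-default login
prowler gcp
```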
# GCP Service APIs
## Impersonate Service Account
If you want to impersonate a GCP service account, you can use the `--impersonate-service-account` argument:
```console
prowler gcp --impersonate-service-account <service-account-email>
```
This argument will use the default credentials to impersonate the service account provided.
## Service APIs
Prowler will use the Google Cloud APIs to get the information needed to perform the checks. Make sure that the following APIs are enabled in the project:


View File

@@ -0,0 +1,20 @@
# In-Cluster Execution
For in-cluster execution, you can use the supplied yaml files inside `/kubernetes`:
* [job.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/job.yaml)
* [prowler-role.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/prowler-role.yaml)
* [prowler-rolebinding.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/prowler-rolebinding.yaml)
They can be used to run Prowler as a job within a new Prowler namespace:
```console
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
kubectl get pods --namespace prowler-ns --> prowler-XXXXX
kubectl logs prowler-XXXXX --namespace prowler-ns
```
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.

View File

@@ -0,0 +1,23 @@
# Miscellaneous
## Context Filtering
Prowler will scan the active Kubernetes context by default.
To specify the Kubernetes context to be scanned, use the `--context` flag followed by the desired context name. For example:
```console
prowler kubernetes --context my-context
```
This will ensure that Prowler scans the specified context/cluster for vulnerabilities and misconfigurations.
## Namespace Filtering
By default, `prowler` will scan all namespaces in the context you specify.
To specify the namespace(s) to be scanned, use the `--namespace` flag followed by the desired namespace(s) separated by spaces. For example:
```console
prowler kubernetes --namespace namespace1 namespace2
```

View File

@@ -0,0 +1,15 @@
# Non in-cluster execution
For non in-cluster execution, you can provide the location of the [kubeconfig](https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/) file with the following argument:
```console
prowler kubernetes --kubeconfig-file /path/to/kubeconfig
```
???+ note
If no `--kubeconfig-file` is provided, Prowler will use the default KubeConfig file location (`~/.kube/config`).
???+ note
`prowler` will scan the active Kubernetes context by default. Use the [`--context`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/context/) flag to specify the context to be scanned.
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.

View File

@@ -10,7 +10,7 @@ Execute Prowler in verbose mode (like in Version 2):
prowler <provider> --verbose
```
## Filter findings by status
Prowler can filter the findings by their status:
Prowler can filter the findings by their status, so that both the CLI output and the reports include only the findings with a specific status:
```console
prowler <provider> --status [PASS, FAIL, MANUAL]
```
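For example, to show and report only failed findings:
```console
prowler aws --status FAIL
```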

View File

@@ -7,97 +7,121 @@ Mutelist option works along with other options and will modify the output in the
- CSV: `muted` is `True`. The field `status` will keep the original status, `MANUAL`, `PASS` or `FAIL`, of the finding.
You can use `-w`/`--mutelist-file` with the path of your Mutelist YAML file.
## How the Mutelist Works
The Mutelist uses an "ANDed" and "ORed" logic to determine which resources, checks, regions, and tags should be muted. For each check, the Mutelist checks if the account, region, and resource match the specified criteria, using an "ANDed" logic. If tags are specified, the Mutelist uses an "ORed" logic to see if at least one tag is present in the resource.
If any of the criteria do not match, the check is not muted.
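For example, a minimal entry (a sketch of the specification below) where the account, check, region, and resource are "ANDed" together and the two tag values are "ORed" via a regex alternation:
```yaml
Mutelist:
  Accounts:
    "123456789012":
      Checks:
        "s3_bucket_object_versioning":
          Regions:
            - "eu-west-1"
          Resources:
            - "my-bucket"
          Tags:
            - "env=dev|env=test"
```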
## Mutelist Specification
???+ note
- For Azure provider, the Account ID is the Subscription Name and the Region is the Location.
- For GCP provider, the Account ID is the Project ID and the Region is the Zone.
- For Kubernetes provider, the Account ID is the Cluster Name and the Region is the Namespace.
The Mutelist file uses the [YAML](https://en.wikipedia.org/wiki/YAML) format with the following syntax:
```yaml
### Account, Check and/or Region can be * to apply for all the cases.
### Resources and tags are lists that can have either Regex or Keywords.
### Tags is an optional list that matches on tuples of 'key=value' and are "ANDed" together.
### Use an alternation Regex to match one of multiple tags with "ORed" logic.
### For each check you can except Accounts, Regions, Resources and/or Tags.
########################### MUTELIST EXAMPLE ###########################
Mutelist:
Accounts:
"123456789012":
Checks:
"iam_user_hardware_mfa_enabled":
Regions:
- "us-east-1"
Resources:
- "user-1" # Will ignore user-1 in check iam_user_hardware_mfa_enabled
- "user-2" # Will ignore user-2 in check iam_user_hardware_mfa_enabled
"ec2_*":
Regions:
- "*"
Resources:
- "*" # Will ignore every EC2 check in every account and region
"*":
Regions:
- "*"
Resources:
- "test"
Tags:
- "test=test" # Will ignore every resource containing the string "test" and the tags 'test=test' and
- "project=test|project=stage" # either of ('project=test' OR project=stage) in account 123456789012 and every region
"*":
Checks:
"s3_bucket_object_versioning":
Regions:
- "eu-west-1"
- "us-east-1"
Resources:
- "ci-logs" # Will ignore bucket "ci-logs" AND ALSO bucket "ci-logs-replica" in specified check and regions
- "logs" # Will ignore EVERY BUCKET containing the string "logs" in specified check and regions
- ".+-logs" # Will ignore all buckets containing the terms ci-logs, qa-logs, etc. in specified check and regions
"ecs_task_definitions_no_environment_secrets":
Regions:
- "*"
Resources:
- "*"
Exceptions:
Accounts:
- "0123456789012"
Regions:
- "eu-west-1"
- "eu-south-2" # Will ignore every resource in check ecs_task_definitions_no_environment_secrets except the ones in account 0123456789012 located in eu-south-2 or eu-west-1
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "environment=dev" # Will ignore every resource containing the tag 'environment=dev' in every account and region
"123456789012":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Exceptions:
Resources:
- "test"
Tags:
- "environment=prod" # Will ignore every resource except in account 123456789012 except the ones containing the string "test" and tag environment=prod
```
### Account, Check, Region, Resource, and Tag
| Field | Description | Logic |
|----------|----------|----------|
| `<account_id>` | Use `*` to apply the mutelist to all accounts. | `ANDed` |
| `<check_name>` | The name of the Prowler check. Use `*` to apply the mutelist to all checks. | `ANDed` |
| `<region>` | The region identifier. Use `*` to apply the mutelist to all regions. | `ANDed` |
| `<resource>` | The resource identifier. Use `*` to apply the mutelist to all resources. | `ANDed` |
| `<tag>` | The tag value. | `ORed` |
## How to Use the Mutelist
To use the Mutelist, you need to specify the path to the Mutelist YAML file using the `-w` or `--mutelist-file` option when running Prowler:
```
prowler <provider> -w mutelist.yaml
```
Replace `<provider>` with the appropriate provider name.
## Considerations
- The Mutelist can be used in combination with other Prowler options, such as the `--service` or `--checks` option, to further customize the scanning process.
- Make sure to review and update the Mutelist regularly to ensure it reflects the desired exclusions and remains up to date with your infrastructure.
## AWS Mutelist
### Mute specific AWS regions

View File

@@ -125,7 +125,7 @@ The JSON-OCSF output format implements the [Detection Finding](https://schema.oc
"product": {
"name": "Prowler",
"vendor_name": "Prowler",
"version": "4.2.1"
"version": "4.2.4"
},
"version": "1.1.0"
},
@@ -333,7 +333,7 @@ The following is the mapping between the native JSON and the Detection Finding f
| --- |---|
| AssessmentStartTime | event_time |
| FindingUniqueId | finding_info.uid |
| Provider | cloud.account.type |
| Provider | cloud.provider |
| CheckID | metadata.event_code |
| CheckTitle | finding_info.title |
| CheckType | unmapped.check_type |

View File

@@ -11,6 +11,12 @@ prowler <provider> --scan-unused-services
## Services that are ignored
### AWS
#### ACM
You can have certificates in ACM that are not in use by any AWS resource.
Prowler checks whether each certificate is expired or about to expire; by default, certificates that are not in use are excluded from this check.
- `acm_certificates_expiration_check`
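For instance, to include those unused certificates, the check can be run with the flag shown above (a sketch, using the check ID as listed):
```
prowler aws --scan-unused-services --checks acm_certificates_expiration_check
```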
#### Athena
When you create an AWS account, Athena creates a default primary workgroup for you.
Prowler checks whether that workgroup is enabled and whether it is in use, based on whether it ran any queries in the last 45 days.

View File

@@ -83,9 +83,14 @@ nav:
- Authentication: tutorials/azure/authentication.md
- Non default clouds: tutorials/azure/use-non-default-cloud.md
- Subscriptions: tutorials/azure/subscriptions.md
- Create Prowler Service Principal: tutorials/azure/create-prowler-service-principal.md
- Google Cloud:
- Authentication: tutorials/gcp/authentication.md
- Projects: tutorials/gcp/projects.md
- Kubernetes:
- In-Cluster Execution: tutorials/kubernetes/in-cluster.md
- Non In-Cluster Execution: tutorials/kubernetes/outside-cluster.md
- Miscellaneous: tutorials/kubernetes/misc.md
- Developer Guide:
- Introduction: developer-guide/introduction.md
- Provider: developer-guide/provider.md

View File

@@ -58,20 +58,29 @@ Resources:
- 'account:Get*'
- 'appstream:Describe*'
- 'appstream:List*'
- 'backup:List*'
- 'cloudtrail:GetInsightSelectors'
- 'codeartifact:List*'
- 'codebuild:BatchGet*'
- 'cognito-idp:GetUserPoolMfaConfig'
- 'dlm:Get*'
- 'drs:Describe*'
- 'ds:Get*'
- 'ds:Describe*'
- 'ds:List*'
- 'dynamodb:GetResourcePolicy'
- 'ec2:GetEbsEncryptionByDefault'
- 'ec2:GetSnapshotBlockPublicAccessState'
- 'ec2:GetInstanceMetadataDefaults'
- 'ecr:Describe*'
- 'ecr:GetRegistryScanningConfiguration'
- 'elasticfilesystem:DescribeBackupPolicy'
- 'glue:GetConnections'
- 'glue:GetSecurityConfiguration*'
- 'glue:SearchTables'
- 'lambda:GetFunction*'
- 'logs:FilterLogEvents'
- 'lightsail:GetRelationalDatabases'
- 'macie2:GetMacieSession'
- 's3:GetAccountPublicAccessBlock'
- 'shield:DescribeProtection'
@@ -79,8 +88,10 @@ Resources:
- 'securityhub:BatchImportFindings'
- 'securityhub:GetFindings'
- 'ssm:GetDocument'
- 'ssm-incidents:List*'
- 'support:Describe*'
- 'tag:GetTagKeys'
- 'wellarchitected:List*'
Resource: '*'
- PolicyName: ProwlerScanRoleAdditionalViewPrivilegesApiGateway
PolicyDocument:

View File

@@ -16,7 +16,10 @@
"ds:Get*",
"ds:Describe*",
"ds:List*",
"dynamodb:GetResourcePolicy",
"ec2:GetEbsEncryptionByDefault",
"ec2:GetSnapshotBlockPublicAccessState",
"ec2:GetInstanceMetadataDefaults",
"ecr:Describe*",
"ecr:GetRegistryScanningConfiguration",
"elasticfilesystem:DescribeBackupPolicy",
@@ -25,6 +28,7 @@
"glue:SearchTables",
"lambda:GetFunction*",
"logs:FilterLogEvents",
"lightsail:GetRelationalDatabases",
"macie2:GetMacieSession",
"s3:GetAccountPublicAccessBlock",
"shield:DescribeProtection",

View File

@@ -0,0 +1,20 @@
{
"properties": {
"roleName": "ProwlerRole",
"description": "Role used for checks that require read-only access to Azure resources and are not covered by the Reader role.",
"assignableScopes": [
"/"
],
"permissions": [
{
"actions": [
"Microsoft.Web/sites/host/listkeys/action",
"Microsoft.Web/sites/config/list/Action"
],
"notActions": [],
"dataActions": [],
"notDataActions": []
}
]
}
}
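As a usage sketch (the file name is illustrative), the custom role definition above could be created with the Azure CLI:
```
az role definition create --role-definition @prowler-role.json
```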

poetry.lock (generated): file diff suppressed because it is too large

View File

@@ -6,7 +6,13 @@ from os import environ
from colorama import Fore, Style
from prowler.config.config import get_available_compliance_frameworks
from prowler.config.config import (
csv_file_suffix,
get_available_compliance_frameworks,
html_file_suffix,
json_asff_file_suffix,
json_ocsf_file_suffix,
)
from prowler.lib.banner import print_banner
from prowler.lib.check.check import (
bulk_load_checks_metadata,
@@ -36,19 +42,32 @@ from prowler.lib.check.custom_checks_metadata import (
)
from prowler.lib.cli.parser import ProwlerArgumentParser
from prowler.lib.logger import logger, set_logging_config
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.compliance import display_compliance_table
from prowler.lib.outputs.html.html import add_html_footer, fill_html_overview_statistics
from prowler.lib.outputs.json.json import close_json
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
from prowler.lib.outputs.outputs import extract_findings_statistics
from prowler.lib.outputs.slack.slack import Slack
from prowler.lib.outputs.summary_table import display_summary_table
from prowler.providers.aws.lib.s3.s3 import send_to_s3_bucket
from prowler.providers.aws.lib.security_hub.security_hub import (
batch_send_to_security_hub,
prepare_security_hub_findings,
resolve_security_hub_previous_findings,
verify_security_hub_integration_enabled_per_region,
)
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.common.provider import Provider
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
@@ -180,7 +199,17 @@ def prowler():
# Import custom checks from folder
if checks_folder:
parse_checks_from_folder(global_provider, checks_folder)
custom_checks = parse_checks_from_folder(global_provider, checks_folder)
# Workaround to be able to execute custom checks alongside all checks if nothing is explicitly set
if (
not checks_file
and not checks
and not services
and not severities
and not compliance_framework
and not categories
):
checks_to_execute.update(custom_checks)
# Exclude checks if -e/--excluded-checks
if excluded_checks:
@@ -270,97 +299,321 @@ def prowler():
)
sys.exit(1)
# Outputs
# TODO: this part is needed since the checks generates a Check_Report_XXX and the output uses Finding
# This will be refactored for the outputs generate directly the Finding
finding_outputs = [
Finding.generate_output(global_provider, finding) for finding in findings
]
generated_outputs = {"regular": [], "compliance": []}
if args.output_formats:
for mode in args.output_formats:
# Close json file if exists
if "json" in mode:
close_json(
global_provider.output_options.output_filename,
global_provider.output_options.output_directory,
mode,
filename = (
f"{global_provider.output_options.output_directory}/"
f"{global_provider.output_options.output_filename}"
)
if mode == "csv":
csv_output = CSV(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{csv_file_suffix}",
)
generated_outputs["regular"].append(csv_output)
# Write CSV Finding Object to file
csv_output.batch_write_data_to_file()
if mode == "json-asff":
asff_output = ASFF(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{json_asff_file_suffix}",
)
generated_outputs["regular"].append(asff_output)
# Write ASFF Finding Object to file
asff_output.batch_write_data_to_file()
if mode == "json-ocsf":
json_output = OCSF(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{json_ocsf_file_suffix}",
)
generated_outputs["regular"].append(json_output)
json_output.batch_write_data_to_file()
if mode == "html":
html_output = HTML(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{html_file_suffix}",
)
generated_outputs["regular"].append(html_output)
html_output.batch_write_data_to_file(
provider=global_provider, stats=stats
)
if "html" in mode:
add_html_footer(
global_provider.output_options.output_filename,
global_provider.output_options.output_directory,
# Compliance Frameworks
input_compliance_frameworks = set(
global_provider.output_options.output_modes
).intersection(get_available_compliance_frameworks(provider))
if provider == "aws":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
fill_html_overview_statistics(
stats,
global_provider.output_options.output_filename,
global_provider.output_options.output_directory,
cis = AWSCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_aws":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = AWSMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
elif compliance_name.startswith("ens_"):
# Generate ENS Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
ens = AWSENS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(ens)
ens.batch_write_data_to_file()
elif compliance_name.startswith("aws_well_architected_framework"):
# Generate AWS Well-Architected Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
aws_well_architected = AWSWellArchitected(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(aws_well_architected)
aws_well_architected.batch_write_data_to_file()
elif compliance_name.startswith("iso27001_"):
# Generate ISO27001 Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
iso27001 = AWSISO27001(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(iso27001)
iso27001.batch_write_data_to_file()
else:
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
# Send output to S3 if needed (-B / -D)
if provider == "aws" and (
args.output_bucket or args.output_bucket_no_assume
):
output_bucket = args.output_bucket
bucket_session = global_provider.session.current_session
# Check if -D was input
if args.output_bucket_no_assume:
output_bucket = args.output_bucket_no_assume
bucket_session = global_provider.session.original_session
send_to_s3_bucket(
global_provider.output_options.output_filename,
args.output_directory,
mode,
output_bucket,
bucket_session,
elif provider == "azure":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
cis = AzureCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_azure":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = AzureMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
else:
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "gcp":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
cis = GCPCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_gcp":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = GCPMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
else:
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "kubernetes":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
cis = KubernetesCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
else:
filename = (
f"{global_provider.output_options.output_directory}/compliance/"
f"{global_provider.output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
# AWS Security Hub Integration
if provider == "aws" and args.security_hub:
print(
f"{Style.BRIGHT}\nSending findings to AWS Security Hub, please wait...{Style.RESET_ALL}"
)
# Verify where AWS Security Hub is enabled
aws_security_enabled_regions = []
security_hub_regions = (
global_provider.get_available_aws_service_regions("securityhub")
if not global_provider.identity.audited_regions
else global_provider.identity.audited_regions
)
for region in security_hub_regions:
# Save the regions where AWS Security Hub is enabled
if verify_security_hub_integration_enabled_per_region(
global_provider.identity.partition,
region,
global_provider.session.current_session,
global_provider.identity.account,
):
aws_security_enabled_regions.append(region)
# Prepare the findings to be sent to Security Hub
security_hub_findings_per_region = prepare_security_hub_findings(
findings,
global_provider,
global_provider.output_options,
aws_security_enabled_regions,
)
# Send the findings to Security Hub
findings_sent_to_security_hub = batch_send_to_security_hub(
security_hub_findings_per_region, global_provider.session.current_session
)
print(
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_sent_to_security_hub} findings sent to AWS Security Hub!{Style.RESET_ALL}"
)
# Resolve previous fails of Security Hub
if not args.skip_sh_update:
if provider == "aws":
# Send output to S3 if needed (-B / -D) for all the output formats
if args.output_bucket or args.output_bucket_no_assume:
output_bucket = args.output_bucket
bucket_session = global_provider.session.current_session
# Check if -D was input
if args.output_bucket_no_assume:
output_bucket = args.output_bucket_no_assume
bucket_session = global_provider.session.original_session
s3 = S3(
session=bucket_session,
bucket_name=output_bucket,
output_directory=args.output_directory,
)
s3.send_to_bucket(generated_outputs)
if args.security_hub:
print(
f"{Style.BRIGHT}\nArchiving previous findings in AWS Security Hub, please wait...{Style.RESET_ALL}"
f"{Style.BRIGHT}\nSending findings to AWS Security Hub, please wait...{Style.RESET_ALL}"
)
findings_archived_in_security_hub = resolve_security_hub_previous_findings(
security_hub_findings_per_region,
global_provider,
security_hub_regions = (
global_provider.get_available_aws_service_regions("securityhub")
if not global_provider.identity.audited_regions
else global_provider.identity.audited_regions
)
security_hub = SecurityHub(
aws_account_id=global_provider.identity.account,
aws_partition=global_provider.identity.partition,
aws_session=global_provider.session.current_session,
findings=asff_output.data,
send_only_fails=global_provider.output_options.send_sh_only_fails,
aws_security_hub_available_regions=security_hub_regions,
)
# Send the findings to Security Hub
findings_sent_to_security_hub = security_hub.batch_send_to_security_hub()
print(
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_archived_in_security_hub} findings archived in AWS Security Hub!{Style.RESET_ALL}"
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_sent_to_security_hub} findings sent to AWS Security Hub!{Style.RESET_ALL}"
)
# Resolve previous fails of Security Hub
if not args.skip_sh_update:
print(
f"{Style.BRIGHT}\nArchiving previous findings in AWS Security Hub, please wait...{Style.RESET_ALL}"
)
findings_archived_in_security_hub = (
security_hub.archive_previous_findings()
)
print(
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_archived_in_security_hub} findings archived in AWS Security Hub!{Style.RESET_ALL}"
)
# Display summary table
if not args.only_logs:
display_summary_table(

View File

@@ -1288,7 +1288,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Configure TLS encryption for the etcd service.",
@@ -1310,7 +1310,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Enable client authentication on etcd service.",
@@ -1332,7 +1332,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Do not use self-signed certificates for TLS.",
@@ -1354,7 +1354,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "etcd should be configured to make use of TLS encryption for peer connections.",
@@ -1376,7 +1376,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "etcd should be configured for peer authentication.",
@@ -1398,7 +1398,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Do not use automatically generated self-signed certificates for TLS connections between peers.",
@@ -1420,7 +1420,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 2 - Master Node",
"AssessmentStatus": "Manual",
"Description": "Use a different certificate authority for etcd from the one used for Kubernetes.",
@@ -2634,7 +2634,7 @@
],
"Attributes": [
{
"Section": "5.4",
"Section": "5.4 Secrets Management",
"Profile": "Level 2 - Master Node",
"AssessmentStatus": "Manual",
"Description": "Kubernetes supports mounting secrets as data volumes or as environment variables. Minimize the use of environment variable secrets.",

View File

@@ -1,6 +1,5 @@
import os
import pathlib
import sys
from datetime import datetime, timezone
from os import getcwd
@@ -11,7 +10,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "4.2.1"
prowler_version = "4.3.2"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
square_logo_img = "https://prowler.com/wp-content/uploads/logo-html.png"
aws_logo = "https://user-images.githubusercontent.com/38561120/235953920-3e3fba08-0795-41dc-b480-9bea57db9f2e.png"
@@ -65,6 +64,8 @@ default_config_file_path = (
default_fixer_config_file_path = (
f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/fixer_config.yaml"
)
encoding_format_utf_8 = "utf-8"
available_output_formats = ["csv", "json-asff", "json-ocsf", "html"]
def get_default_mute_file_path(provider: str):
@@ -99,52 +100,84 @@ def check_current_version():
def load_and_validate_config_file(provider: str, config_file_path: str) -> dict:
"""
load_and_validate_config_file reads the Prowler config file in YAML format from the default location or the file passed with the --config-file flag
Reads the Prowler config file in YAML format from the default location or the file passed with the --config-file flag.
Args:
provider (str): The provider name (e.g., 'aws', 'gcp', 'azure', 'kubernetes').
config_file_path (str): The path to the configuration file.
Returns:
dict: The configuration dictionary for the specified provider.
"""
try:
with open(config_file_path) as f:
config = {}
with open(config_file_path, "r", encoding=encoding_format_utf_8) as f:
config_file = yaml.safe_load(f)
# Not to introduce a breaking change we have to allow the old format config file without any provider keys
# and a new format with a key for each provider to include their configuration values within
# Check if the new format is passed
if (
"aws" in config_file
or "gcp" in config_file
or "azure" in config_file
or "kubernetes" in config_file
):
# Not to introduce a breaking change, allow the old format config file without any provider keys
# and a new format with a key for each provider to include their configuration values within.
if any(key in config_file for key in ["aws", "gcp", "azure", "kubernetes"]):
config = config_file.get(provider, {})
else:
config = config_file if config_file else {}
# Not to break Azure, K8s and GCP does not support neither use the old config format
# Not to break Azure, K8s and GCP does not support or use the old config format
if provider in ["azure", "gcp", "kubernetes"]:
config = {}
return config
except Exception as error:
logger.critical(
except FileNotFoundError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
except yaml.YAMLError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except UnicodeDecodeError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return {}
def load_and_validate_fixer_config_file(
provider: str, fixer_config_file_path: str
) -> dict:
"""
load_and_validate_fixer_config_file reads the Prowler fixer config file in YAML format from the default location or the file passed with the --fixer-config flag
Reads the Prowler fixer config file in YAML format from the default location or the file passed with the --fixer-config flag.
Args:
provider (str): The provider name (e.g., 'aws', 'gcp', 'azure', 'kubernetes').
fixer_config_file_path (str): The path to the fixer configuration file.
Returns:
dict: The fixer configuration dictionary for the specified provider.
"""
try:
with open(fixer_config_file_path) as f:
with open(fixer_config_file_path, "r", encoding=encoding_format_utf_8) as f:
fixer_config_file = yaml.safe_load(f)
return fixer_config_file.get(provider, {})
except Exception as error:
logger.critical(
except FileNotFoundError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
except yaml.YAMLError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except UnicodeDecodeError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return {}
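As a usage sketch of the refactored loader (import path as shown in this diff; the config key comes from the default AWS configuration added in this release): errors are now logged and an empty dict is returned instead of exiting, so callers can fall back to defaults.
```python
from prowler.config.config import load_and_validate_config_file

# Returns {} on a missing or malformed file instead of exiting.
config = load_and_validate_config_file("aws", "config.yaml")
days_to_expire = config.get("days_to_expire_threshold", 7)
```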

View File

@@ -262,10 +262,28 @@ aws:
"LookupEvents",
"Search",
]
# AWS RDS Configuration
# aws.rds_instance_backup_enabled
# Whether to check RDS instance replicas or not
check_rds_instance_replicas: False
# AWS ACM Configuration
# aws.acm_certificates_expiration_check
days_to_expire_threshold: 7
# AWS EKS Configuration
# aws.eks_control_plane_logging_all_types_enabled
# EKS control plane logging types that must be enabled
eks_required_log_types:
[
"api",
"audit",
"authenticator",
"controllerManager",
"scheduler",
]
# Azure Configuration
azure:
# Azure Network Configuration
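A hedged sketch of overriding the new AWS keys above with a custom configuration file (the file name is illustrative; pass it to Prowler with `--config-file`):
```yaml
# custom_config.yaml
aws:
  days_to_expire_threshold: 30
  eks_required_log_types: ["api", "audit"]
```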

View File

@@ -19,7 +19,6 @@ from prowler.lib.check.compliance_models import load_compliance_framework
from prowler.lib.check.custom_checks_metadata import update_check_metadata
from prowler.lib.check.models import Check, load_check_metadata
from prowler.lib.logger import logger
from prowler.lib.mutelist.mutelist import mutelist_findings
from prowler.lib.outputs.outputs import report
from prowler.lib.utils.utils import open_file, parse_json_file, print_boxes
from prowler.providers.common.models import Audit_Metadata
@@ -126,9 +125,10 @@ def parse_checks_from_file(input_file: str, provider: str) -> set:
# Load checks from custom folder
def parse_checks_from_folder(provider, input_folder: str) -> int:
def parse_checks_from_folder(provider, input_folder: str) -> set:
# TODO: move the AWS-specific code into the provider
try:
imported_checks = 0
custom_checks = set()
# Check if input folder is a S3 URI
if provider.type == "aws" and re.search(
"^s3://([^/]+)/(.*?([^/]+))/$", input_folder
@@ -156,8 +156,8 @@ def parse_checks_from_folder(provider, input_folder: str) -> int:
if os.path.exists(prowler_module):
shutil.rmtree(prowler_module)
shutil.copytree(check_module, prowler_module)
imported_checks += 1
return imported_checks
custom_checks.add(check.name)
return custom_checks
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
@@ -611,9 +611,9 @@ def execute_checks(
else:
# Prepare your messages
messages = [f"Config File: {Fore.YELLOW}{config_file}{Style.RESET_ALL}"]
if global_provider.mutelist_file_path:
if global_provider.mutelist.mutelist_file_path:
messages.append(
f"Mutelist File: {Fore.YELLOW}{global_provider.mutelist_file_path}{Style.RESET_ALL}"
f"Mutelist File: {Fore.YELLOW}{global_provider.mutelist.mutelist_file_path}{Style.RESET_ALL}"
)
if global_provider.type == "aws":
messages.append(
@@ -706,6 +706,14 @@ def execute(
check_class, verbose, global_provider.output_options.only_logs
)
# Exclude findings per status
if global_provider.output_options.status:
check_findings = [
finding
for finding in check_findings
if finding.status in global_provider.output_options.status
]
# Update Audit Status
services_executed.add(service)
checks_executed.add(check_name)
@@ -714,11 +722,21 @@ def execute(
)
# Mutelist findings
if hasattr(global_provider, "mutelist") and global_provider.mutelist:
check_findings = mutelist_findings(
global_provider,
check_findings,
)
if hasattr(global_provider, "mutelist") and global_provider.mutelist.mutelist:
# TODO: make this prettier
is_finding_muted_args = {}
if global_provider.type == "aws":
is_finding_muted_args["aws_account_id"] = (
global_provider.identity.account
)
elif global_provider.type == "kubernetes":
is_finding_muted_args["cluster"] = global_provider.identity.cluster
for finding in check_findings:
is_finding_muted_args["finding"] = finding
finding.muted = global_provider.mutelist.is_finding_muted(
**is_finding_muted_args
)
# Refactor(Outputs)
# Report the check's findings

View File

@@ -1,16 +1,21 @@
import sys
from pydantic import parse_obj_as
from prowler.lib.check.compliance_models import Compliance_Base_Model
from prowler.lib.check.models import Check_Metadata_Model
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.logger import logger
def update_checks_metadata_with_compliance(
bulk_compliance_frameworks: dict, bulk_checks_metadata: dict
):
"""Update the check metadata model with the compliance framework"""
) -> dict:
"""
Update the check metadata model with the compliance framework
Args:
bulk_compliance_frameworks (dict): The compliance frameworks
bulk_checks_metadata (dict): The checks metadata
Returns:
dict: The checks metadata with the compliance frameworks
"""
try:
for check in bulk_checks_metadata:
check_compliance = []
@@ -22,7 +27,7 @@ def update_checks_metadata_with_compliance(
# Include the requirement into the check's framework requirements
compliance_requirements.append(requirement)
# Create the Compliance_Model
compliance = Compliance_Base_Model(
compliance = ComplianceBaseModel(
Framework=framework.Framework,
Provider=framework.Provider,
Version=framework.Version,
@@ -33,53 +38,6 @@ def update_checks_metadata_with_compliance(
check_compliance.append(compliance)
# Save it into the check's metadata
bulk_checks_metadata[check].Compliance = check_compliance
# Add requirements of Manual Controls
for framework in bulk_compliance_frameworks.values():
for requirement in framework.Requirements:
compliance_requirements = []
# Verify if requirement is Manual
if not requirement.Checks:
compliance_requirements.append(requirement)
# Create the Compliance_Model
compliance = Compliance_Base_Model(
Framework=framework.Framework,
Provider=framework.Provider,
Version=framework.Version,
Description=framework.Description,
Requirements=compliance_requirements,
)
# Include the compliance framework for the check
check_compliance.append(compliance)
# Create metadata for Manual Control
manual_check_metadata = {
"Provider": framework.Provider.lower(),
"CheckID": "manual_check",
"CheckTitle": "Manual Check",
"CheckType": [],
"ServiceName": "",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "low",
"ResourceType": "",
"Description": "",
"Risk": "",
"RelatedUrl": "",
"Remediation": {
"Code": {"CLI": "", "NativeIaC": "", "Other": "", "Terraform": ""},
"Recommendation": {"Text": "", "Url": ""},
},
"Categories": [],
"Tags": {},
"DependsOn": [],
"RelatedTo": [],
"Notes": "",
}
manual_check = parse_obj_as(Check_Metadata_Model, manual_check_metadata)
# Save it into the check's metadata
bulk_checks_metadata["manual_check"] = manual_check
bulk_checks_metadata["manual_check"].Compliance = check_compliance
return bulk_checks_metadata
except Exception as e:
logger.critical(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")

View File

@@ -91,7 +91,6 @@ class CIS_Requirement_Attribute(BaseModel):
AdditionalInformation: str
DefaultValue: Optional[str]
References: str
DefaultValue: Optional[str]
# Well Architected Requirement Attribute
@@ -189,8 +188,8 @@ class Compliance_Requirement(BaseModel):
Checks: list[str]
class Compliance_Base_Model(BaseModel):
"""Compliance_Base_Model holds the base model for every compliance framework"""
class ComplianceBaseModel(BaseModel):
"""ComplianceBaseModel holds the base model for every compliance framework"""
Framework: str
Provider: str
@@ -218,10 +217,10 @@ class Compliance_Base_Model(BaseModel):
# Testing Pending
def load_compliance_framework(
compliance_specification_file: str,
) -> Compliance_Base_Model:
) -> ComplianceBaseModel:
"""load_compliance_framework loads and parse a Compliance Framework Specification"""
try:
compliance_framework = Compliance_Base_Model.parse_file(
compliance_framework = ComplianceBaseModel.parse_file(
compliance_specification_file
)
except ValidationError as error:

View File

@@ -96,6 +96,8 @@ class Check(ABC, Check_Metadata_Model):
data = Check_Metadata_Model.parse_file(metadata_file).dict()
# Calls parents init function
super().__init__(**data)
# TODO: verify that the CheckID is the same as the filename and classname
# to mimic the test done at test_<provider>_checks_metadata_is_valid
def metadata(self) -> dict:
"""Return the JSON representation of the check's metadata"""

View File

@@ -5,6 +5,7 @@ from argparse import RawTextHelpFormatter
from dashboard.lib.arguments.arguments import init_dashboard_parser
from prowler.config.config import (
available_compliance_frameworks,
available_output_formats,
check_current_version,
default_config_file_path,
default_fixer_config_file_path,
@@ -147,7 +148,7 @@ Detailed documentation at https://docs.prowler.com
nargs="+",
help="Output modes, by default csv and json-oscf are saved. When using AWS Security Hub integration, json-asff output is also saved.",
default=["csv", "json-ocsf", "html"],
choices=["csv", "json-asff", "json-ocsf", "html"],
choices=available_output_formats,
)
common_outputs_parser.add_argument(
"--output-filename",
@@ -262,7 +263,7 @@ Detailed documentation at https://docs.prowler.com
group.add_argument(
"--compliance",
nargs="+",
help="Compliance Framework to check against for. The format should be the following: framework_version_provider (e.g.: ens_rd2022_aws)",
help="Compliance Framework to check against for. The format should be the following: framework_version_provider (e.g.: cis_3.0_aws)",
choices=available_compliance_frameworks,
)
group.add_argument(

View File

@@ -1,373 +1,331 @@
import re
from typing import Any
from abc import ABC, abstractmethod
import yaml
from prowler.lib.logger import logger
from prowler.lib.mutelist.models import mutelist_schema
from prowler.lib.outputs.utils import unroll_tags
def get_mutelist_file_from_local_file(mutelist_path: str):
try:
with open(mutelist_path) as f:
mutelist = yaml.safe_load(f)["Mutelist"]
return mutelist
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return {}
def validate_mutelist(mutelist: dict) -> dict:
try:
mutelist = mutelist_schema.validate(mutelist)
return mutelist
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- Mutelist YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
)
return {}
def mutelist_findings(
global_provider: Any,
check_findings: list[Any],
) -> list[Any]:
# Check if finding is muted
for finding in check_findings:
# TODO: Move this mapping to the execute_check function and pass that output to the mutelist and the report
if global_provider.type == "aws":
finding.muted = is_muted(
global_provider.mutelist,
global_provider.identity.account,
finding.check_metadata.CheckID,
finding.region,
finding.resource_id,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "azure":
finding.muted = is_muted(
global_provider.mutelist,
finding.subscription,
finding.check_metadata.CheckID,
# TODO: add region to the findings when we add Azure Locations
# finding.region,
"",
finding.resource_name,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "gcp":
finding.muted = is_muted(
global_provider.mutelist,
finding.project_id,
finding.check_metadata.CheckID,
finding.location,
finding.resource_name,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "kubernetes":
finding.muted = is_muted(
global_provider.mutelist,
global_provider.identity.cluster,
finding.check_metadata.CheckID,
finding.namespace,
finding.resource_name,
unroll_tags(finding.resource_tags),
)
return check_findings
def is_muted(
mutelist: dict,
audited_account: str,
check: str,
finding_region: str,
finding_resource: str,
finding_tags,
) -> bool:
class Mutelist(ABC):
"""
Check if the provided finding is muted for the audited account, check, region, resource and tags.
Abstract base class for managing a mutelist.
Args:
mutelist (dict): Dictionary containing information about muted checks for different accounts.
audited_account (str): The account being audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags: The tags associated with the finding.
Attributes:
_mutelist (dict): Dictionary containing information about muted checks for different accounts.
_mutelist_file_path (str): Path to the mutelist file.
MUTELIST_KEY (str): Key used to access the mutelist in the mutelist file.
Returns:
bool: True if the finding is muted for the audited account, check, region, resource and tags., otherwise False.
Methods:
__init__: Initializes a Mutelist object.
mutelist: Property that returns the mutelist dictionary.
mutelist_file_path: Property that returns the mutelist file path.
is_finding_muted: Abstract method to check if a finding is muted.
get_mutelist_file_from_local_file: Retrieves the mutelist file from a local file.
validate_mutelist: Validates the mutelist against a schema.
is_muted: Checks if a finding is muted for the audited account, check, region, resource, and tags.
is_muted_in_check: Checks if a check is muted.
is_excepted: Checks if the account, region, resource, and tags are excepted based on the exceptions.
"""
try:
# By default is not muted
is_finding_muted = False
# We always check all the accounts present in the mutelist
# if one mutes the finding we set the finding as muted
for account in mutelist["Accounts"]:
if account == audited_account or account == "*":
if is_muted_in_check(
mutelist["Accounts"][account]["Checks"],
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
_mutelist: dict = {}
_mutelist_file_path: str = None
MUTELIST_KEY = "Mutelist"
def __init__(
self, mutelist_path: str = "", mutelist_content: dict = {}
) -> "Mutelist":
if mutelist_path:
self._mutelist_file_path = mutelist_path
self.get_mutelist_file_from_local_file(mutelist_path)
else:
self._mutelist = mutelist_content
if self._mutelist:
self.validate_mutelist()
@property
def mutelist(self) -> dict:
return self._mutelist
@property
def mutelist_file_path(self) -> dict:
return self._mutelist_file_path
@abstractmethod
def is_finding_muted(self) -> bool:
raise NotImplementedError
def get_mutelist_file_from_local_file(self, mutelist_path: str):
try:
with open(mutelist_path) as f:
self._mutelist = yaml.safe_load(f)[self.MUTELIST_KEY]
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
def validate_mutelist(self) -> bool:
try:
self._mutelist = mutelist_schema.validate(self._mutelist)
return True
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- Mutelist YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
)
self._mutelist = {}
return False
def is_muted(
self,
audited_account: str,
check: str,
finding_region: str,
finding_resource: str,
finding_tags,
) -> bool:
"""
Check if the provided finding is muted for the audited account, check, region, resource and tags.
The Mutelist works in a way that each field is ANDed, so if a check is muted for an account, region, resource and tags, it will be muted.
The exceptions are ORed, so if a check is excepted for an account, region, resource or tags, it will not be muted.
The only particularity is the tags, which are ORed.
So, for the following Mutelist:
```
Mutelist:
Accounts:
'*':
Checks:
ec2_instance_detailed_monitoring_enabled:
Regions: ['*']
Resources:
- 'i-123456789'
Tags:
- 'Name=AdminInstance | Environment=Prod'
```
The check `ec2_instance_detailed_monitoring_enabled` will be muted for all accounts and regions and for the resource_id 'i-123456789' with at least one of the tags 'Name=AdminInstance' or 'Environment=Prod'.
Args:
mutelist (dict): Dictionary containing information about muted checks for different accounts.
audited_account (str): The account being audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags: The tags associated with the finding.
Returns:
bool: True if the finding is muted for the audited account, check, region, resource and tags., otherwise False.
"""
try:
# By default is not muted
is_finding_muted = False
# We always check all the accounts present in the mutelist
# if one mutes the finding we set the finding as muted
for account in self._mutelist.get("Accounts", []):
if account == audited_account or account == "*":
if self.is_muted_in_check(
self._mutelist["Accounts"][account]["Checks"],
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
):
is_finding_muted = True
break
return is_finding_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_check(
self,
muted_checks,
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided check is muted.
Args:
muted_checks (dict): Dictionary containing information about muted checks.
audited_account (str): The account to be audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the check is muted, otherwise False.
"""
try:
# Default value is not muted
is_check_muted = False
for muted_check, muted_check_info in muted_checks.items():
# map lambda to awslambda
muted_check = re.sub("^lambda", "awslambda", muted_check)
check_match = (
"*" == muted_check
or check == muted_check
or self.is_item_matched([muted_check], check)
)
# Check if the finding is excepted
exceptions = muted_check_info.get("Exceptions")
if (
self.is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
)
and check_match
):
is_finding_muted = True
# Break loop and return default value since is excepted
break
return is_finding_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
muted_regions = muted_check_info.get("Regions")
muted_resources = muted_check_info.get("Resources")
muted_tags = muted_check_info.get("Tags", "*")
# We need to set the muted_tags if None, "" or [], so the falsy helps
if not muted_tags:
muted_tags = "*"
# If there is a *, it affects to all checks
if check_match:
muted_in_check = True
muted_in_region = self.is_item_matched(
muted_regions, finding_region
)
muted_in_resource = self.is_item_matched(
muted_resources, finding_resource
)
muted_in_tags = self.is_item_matched(muted_tags, finding_tags)
# For a finding to be muted requires the following set to True:
# - muted_in_check -> True
# - muted_in_region -> True
# - muted_in_tags -> True
# - muted_in_resource -> True
# - excepted -> False
def is_muted_in_check(
muted_checks,
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided check is muted.
if (
muted_in_check
and muted_in_region
and muted_in_tags
and muted_in_resource
):
is_check_muted = True
Args:
muted_checks (dict): Dictionary containing information about muted checks.
audited_account (str): The account to be audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the check is muted, otherwise False.
"""
try:
# Default value is not muted
is_check_muted = False
for muted_check, muted_check_info in muted_checks.items():
# map lambda to awslambda
muted_check = re.sub("^lambda", "awslambda", muted_check)
check_match = (
"*" == muted_check
or check == muted_check
or re.search(muted_check, check)
return is_check_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
# Check if the finding is excepted
exceptions = muted_check_info.get("Exceptions")
if (
is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
)
and check_match
):
# Break loop and return default value since is excepted
break
return False
muted_regions = muted_check_info.get("Regions")
muted_resources = muted_check_info.get("Resources")
muted_tags = muted_check_info.get("Tags", "*")
# We need to set the muted_tags if None, "" or [], so the falsy helps
if not muted_tags:
muted_tags = "*"
# If there is a *, it affects to all checks
if check_match:
muted_in_check = True
muted_in_region = is_muted_in_region(muted_regions, finding_region)
muted_in_resource = is_muted_in_resource(
muted_resources, finding_resource
)
muted_in_tags = is_muted_in_tags(muted_tags, finding_tags)
def is_excepted(
self,
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided account, region, resource, and tags are excepted based on the exceptions dictionary.
# For a finding to be muted requires the following set to True:
# - muted_in_check -> True
# - muted_in_region -> True
# - muted_in_tags -> True
# - muted_in_resource -> True
# - excepted -> False
Args:
exceptions (dict): Dictionary containing exceptions for different attributes like Accounts, Regions, Resources, and Tags.
audited_account (str): The account to be audited.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the account, region, resource, and tags are excepted based on the exceptions, otherwise False.
"""
try:
excepted = False
is_account_excepted = False
is_region_excepted = False
is_resource_excepted = False
is_tag_excepted = False
if exceptions:
excepted_accounts = exceptions.get("Accounts", [])
is_account_excepted = self.is_item_matched(
excepted_accounts, audited_account
)
excepted_regions = exceptions.get("Regions", [])
is_region_excepted = self.is_item_matched(
excepted_regions, finding_region
)
excepted_resources = exceptions.get("Resources", [])
is_resource_excepted = self.is_item_matched(
excepted_resources, finding_resource
)
excepted_tags = exceptions.get("Tags", [])
is_tag_excepted = self.is_item_matched(excepted_tags, finding_tags)
if (
muted_in_check
and muted_in_region
and muted_in_tags
and muted_in_resource
not is_account_excepted
and not is_region_excepted
and not is_resource_excepted
and not is_tag_excepted
):
is_check_muted = True
return is_check_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_region(
mutelist_regions,
finding_region,
) -> bool:
"""
Check if the finding_region is present in the mutelist_regions.
Args:
mutelist_regions (list): List of regions in the mute list.
finding_region (str): Region to check if it is muted.
Returns:
bool: True if the finding_region is muted in any of the mutelist_regions, otherwise False.
"""
try:
return __is_item_matched__(mutelist_regions, finding_region)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_tags(muted_tags, finding_tags) -> bool:
"""
Check if any of the muted tags are present in the finding tags.
Args:
muted_tags (list): List of muted tags to be checked.
finding_tags (str): String containing tags to search for muted tags.
Returns:
bool: True if any of the muted tags are present in the finding tags, otherwise False.
"""
try:
return __is_item_matched__(muted_tags, finding_tags)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_resource(muted_resources, finding_resource) -> bool:
"""
Check if any of the muted_resources are present in the finding_resource.
Args:
muted_resources (list): List of muted resources to be checked.
finding_resource (str): Resource to search for muted resources.
Returns:
bool: True if any of the muted_resources are present in the finding_resource, otherwise False.
"""
try:
return __is_item_matched__(muted_resources, finding_resource)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided account, region, resource, and tags are excepted based on the exceptions dictionary.
Args:
exceptions (dict): Dictionary containing exceptions for different attributes like Accounts, Regions, Resources, and Tags.
audited_account (str): The account to be audited.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the account, region, resource, and tags are excepted based on the exceptions, otherwise False.
"""
try:
excepted = False
is_account_excepted = False
is_region_excepted = False
is_resource_excepted = False
is_tag_excepted = False
if exceptions:
excepted_accounts = exceptions.get("Accounts", [])
is_account_excepted = __is_item_matched__(
excepted_accounts, audited_account
excepted = False
elif (
(is_account_excepted or not excepted_accounts)
and (is_region_excepted or not excepted_regions)
and (is_resource_excepted or not excepted_resources)
and (is_tag_excepted or not excepted_tags)
):
excepted = True
return excepted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
excepted_regions = exceptions.get("Regions", [])
is_region_excepted = __is_item_matched__(excepted_regions, finding_region)
@staticmethod
def is_item_matched(matched_items, finding_items):
"""
Check if any of the items in matched_items are present in finding_items.
excepted_resources = exceptions.get("Resources", [])
is_resource_excepted = __is_item_matched__(
excepted_resources, finding_resource
Args:
matched_items (list): List of items to be matched.
finding_items (str): String to search for matched items.
Returns:
bool: True if any of the matched_items are present in finding_items, otherwise False.
"""
try:
is_item_matched = False
if matched_items and (finding_items or finding_items == ""):
for item in matched_items:
if item.startswith("*"):
item = ".*" + item[1:]
if re.search(item, finding_items):
is_item_matched = True
break
return is_item_matched
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
excepted_tags = exceptions.get("Tags", [])
is_tag_excepted = __is_item_matched__(excepted_tags, finding_tags)
if (
not is_account_excepted
and not is_region_excepted
and not is_resource_excepted
and not is_tag_excepted
):
excepted = False
elif (
(is_account_excepted or not excepted_accounts)
and (is_region_excepted or not excepted_regions)
and (is_resource_excepted or not excepted_resources)
and (is_tag_excepted or not excepted_tags)
):
excepted = True
return excepted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def __is_item_matched__(matched_items, finding_items):
"""
Check if any of the items in matched_items are present in finding_items.
Args:
matched_items (list): List of items to be matched.
finding_items (str): String to search for matched items.
Returns:
bool: True if any of the matched_items are present in finding_items, otherwise False.
"""
try:
is_item_matched = False
if matched_items and (finding_items or finding_items == ""):
for item in matched_items:
if item.startswith("*"):
item = ".*" + item[1:]
if re.search(item, finding_items):
is_item_matched = True
break
return is_item_matched
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
return False
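A hypothetical sketch of a provider-specific subclass of the new `Mutelist` base class above (the class name and finding fields are illustrative, not Prowler's actual classes):
```python
from prowler.lib.mutelist.mutelist import Mutelist
from prowler.lib.outputs.utils import unroll_tags

class AWSMutelist(Mutelist):
    def is_finding_muted(self, finding, aws_account_id: str) -> bool:
        # Delegate to the shared matcher with AWS-specific fields.
        return self.is_muted(
            aws_account_id,
            finding.check_metadata.CheckID,
            finding.region,
            finding.resource_id,
            unroll_tags(finding.resource_tags),
        )

mutelist = AWSMutelist(mutelist_path="mutelist.yaml")
```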

View File

@@ -0,0 +1,401 @@
from json import dump
from os import SEEK_SET
from typing import Optional
from pydantic import BaseModel, validator
from prowler.config.config import prowler_version, timestamp_utc
from prowler.lib.logger import logger
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.output import Output
from prowler.lib.utils.utils import hash_sha512
class ASFF(Output):
"""
ASFF class represents a transformation of findings into AWS Security Finding Format (ASFF).
This class provides methods to transform a list of findings into the ASFF format required by AWS Security Hub. It includes operations such as generating unique identifiers, formatting timestamps, handling compliance frameworks, and ensuring the status values match the allowed values in ASFF.
Attributes:
- _data: A list to store the transformed findings.
- _file_descriptor: A file descriptor to write to file.
Methods:
- transform(findings: list[Finding]) -> None: Transforms a list of findings into ASFF format.
- batch_write_data_to_file() -> None: Writes the findings data to a file in JSON ASFF format.
- generate_status(status: str, muted: bool = False) -> str: Generates the ASFF status based on the provided status and muted flag.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
- AWS Security Finding Format Syntax: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
"""
def transform(self, findings: list[Finding]) -> None:
"""
Transforms a list of findings into AWS Security Finding Format (ASFF).
This method iterates over the list of findings provided as input and transforms each finding into the ASFF format required by AWS Security Hub. It performs several operations for each finding, including generating unique identifiers, formatting timestamps, handling compliance frameworks, and ensuring the status values match the allowed values in ASFF.
Parameters:
- findings (list[Finding]): A list of Finding objects representing the findings to be transformed.
Returns:
- None
Notes:
- The method skips findings with a status of "MANUAL" as it is not valid in SecurityHub.
- It generates unique identifiers for each finding based on specific attributes.
- It formats timestamps in the required ASFF format.
- It handles compliance frameworks and associated standards for each finding.
- It ensures that the finding status matches the allowed values in ASFF.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
- AWS Security Finding Format Syntax: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
"""
try:
for finding in findings:
# MANUAL status is not valid in SecurityHub
# https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
if finding.status == "MANUAL":
continue
timestamp = timestamp_utc.strftime("%Y-%m-%dT%H:%M:%SZ")
associated_standards, compliance_summary = ASFF.format_compliance(
finding.compliance
)
# Ensures finding_status matches allowed values in ASFF
finding_status = ASFF.generate_status(finding.status, finding.muted)
self._data.append(
AWSSecurityFindingFormat(
# The following line cannot be changed because it is the format we use to generate unique findings for AWS Security Hub
# If changed, some findings could be lost because the unique identifier would be different
Id=f"prowler-{finding.check_id}-{finding.account_uid}-{finding.region}-{hash_sha512(finding.resource_uid)}",
ProductArn=f"arn:{finding.partition}:securityhub:{finding.region}::product/prowler/prowler",
ProductFields=ProductFields(
ProwlerResourceName=finding.resource_uid,
),
GeneratorId="prowler-" + finding.check_id,
AwsAccountId=finding.account_uid,
Types=(
finding.check_type.split(",")
if finding.check_type
else ["Software and Configuration Checks"]
),
FirstObservedAt=timestamp,
UpdatedAt=timestamp,
CreatedAt=timestamp,
Severity=Severity(Label=finding.severity.value),
Title=finding.check_title,
Description=finding.description,
Resources=[
Resource(
Id=finding.resource_uid,
Type=finding.resource_type,
Partition=finding.partition,
Region=finding.region,
Tags=finding.resource_tags,
)
],
Compliance=Compliance(
Status=finding_status,
AssociatedStandards=associated_standards,
RelatedRequirements=compliance_summary,
),
Remediation=Remediation(
Recommendation=Recommendation(
Text=finding.remediation_recommendation_text,
Url=finding.remediation_recommendation_url,
)
),
)
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def batch_write_data_to_file(self) -> None:
"""
Writes the findings data to a file in JSON ASFF format.
This method iterates over the findings data stored in the '_data' attribute and writes it to the file descriptor '_file_descriptor' in JSON format. It starts by writing the JSON opening/header '[', then iterates over each finding, dumping it to the file with an indent of 4 spaces. After writing all findings, it writes the closing ']' to complete the JSON array structure. Finally, it closes the file descriptor.
Returns:
None
"""
try:
if (
getattr(self, "_file_descriptor", None)
and not self._file_descriptor.closed
and self._data
):
# Write JSON opening/header [
self._file_descriptor.write("[")
# Write findings
for finding in self._data:
dump(
finding.dict(exclude_none=True),
self._file_descriptor,
indent=4,
)
self._file_descriptor.write(",")
# Write footer/closing ]
if self._file_descriptor.tell() > 0:
if self._file_descriptor.tell() != 1:
self._file_descriptor.seek(
self._file_descriptor.tell() - 1, SEEK_SET
)
self._file_descriptor.truncate()
self._file_descriptor.write("]")
# Close file descriptor
self._file_descriptor.close()
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
@staticmethod
def generate_status(status: str, muted: bool = False) -> str:
"""
Generates the ASFF status based on the provided status and muted flag.
Parameters:
- status (str): The status of the finding.
- muted (bool): Flag indicating if the finding is muted.
Returns:
- str: The ASFF status corresponding to the provided status and muted flag.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
"""
json_asff_status = ""
if muted:
# Per AWS Security Hub "MUTED" is not a valid status
# https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
json_asff_status = "WARNING"
else:
if status == "PASS":
json_asff_status = "PASSED"
elif status == "FAIL":
json_asff_status = "FAILED"
else:
# MANUAL is set to NOT_AVAILABLE
json_asff_status = "NOT_AVAILABLE"
return json_asff_status
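# For instance (illustrative): generate_status("PASS") returns "PASSED",
# generate_status("FAIL") returns "FAILED", and generate_status("FAIL", muted=True)
# returns "WARNING" regardless of the original status.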
@staticmethod
def format_compliance(compliance: dict) -> tuple[list[dict], list[str]]:
"""
Transforms a dictionary of compliance data into a tuple of associated standards and compliance summaries.
Parameters:
- compliance (dict): A dictionary containing compliance data where keys are standards and values are lists of compliance details.
Returns:
- tuple[list[dict], list[str]]: A tuple containing a list of associated standards (each as a dictionary with 'StandardsId') and a list of compliance summaries.
Notes:
- The method limits the number of associated standards to 20.
- Each compliance summary is a concatenation of the standard key and its associated compliance details.
- If the concatenated summary exceeds 64 characters, it is truncated to 63 characters.
Example:
format_compliance({"standard1": ["detail1", "detail2"], "standard2": ["detail3"]}) -> ([{"StandardsId": "standard1"}, {"StandardsId": "standard2"}], ["standard1 detail1 detail2", "standard2 detail3"])
"""
compliance_summary = []
associated_standards = []
for key, value in compliance.items():
if (
len(associated_standards) < 20
): # AssociatedStandards should NOT have more than 20 items
associated_standards.append({"StandardsId": key})
item = f"{key} {' '.join(value)}"
if len(item) > 64:
item = item[0:63]
compliance_summary.append(item)
return associated_standards, compliance_summary
class ProductFields(BaseModel):
"""
Class representing the Product Fields of a finding in the AWS Security Finding Format.
Attributes:
- ProviderName (str): The name of the provider, default value is "Prowler".
- ProviderVersion (str): The version of the provider, fetched from the prowler_version in config.py.
- ProwlerResourceName (str): The name of the Prowler resource.
"""
ProviderName: str = "Prowler"
ProviderVersion: str = prowler_version
ProwlerResourceName: str
class Severity(BaseModel):
"""
Class representing the severity of a finding in the AWS Security Finding Format.
Attributes:
- Label (str): A string representing the severity label of the finding.
This class is used to define the severity level of a finding in the AWS Security Finding Format.
"""
Label: str
@validator("Label", pre=True, always=True)
def severity_uppercase(severity):
return severity.upper()
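# For instance, Severity(Label="high") validates to Label="HIGH", matching the
# upper-case severity labels that ASFF expects.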
class Resource(BaseModel):
"""
Class representing a resource in the AWS Security Finding Format.
Attributes:
- Type (str): The type of the resource.
- Id (str): The unique identifier of the resource.
- Partition (str): The partition where the resource resides.
- Region (str): The region where the resource is located.
- Tags (Optional[dict]): Optional dictionary of tags associated with the resource.
This class defines the structure of a resource within the AWS Security Finding Format. It includes attributes to specify the type, unique identifier, partition, region, and optional tags of the resource.
"""
Type: str
Id: str
Partition: str
Region: str
Tags: Optional[dict]
@validator("Tags", pre=True, always=True)
def tags_cannot_be_empty_dict(tags):
if not tags:
return None
return tags
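# For instance, a Resource built with Tags={} stores Tags=None, which
# dict(exclude_none=True) in batch_write_data_to_file then omits from the output.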
class Compliance(BaseModel):
"""
Class representing the compliance details of a finding in the AWS Security Finding Format.
Attributes:
- Status (str): The compliance status of the finding.
- RelatedRequirements (list[str]): A list of related compliance requirements for the finding.
- AssociatedStandards (list[dict]): A list of associated standards with the finding, where each item is a dictionary containing the 'StandardsId'.
This class defines the structure of compliance information within the AWS Security Finding Format. It includes attributes to specify the compliance status, related requirements, and associated standards of a finding.
"""
Status: str
RelatedRequirements: list[str]
AssociatedStandards: list[dict]
@validator("Status", pre=True, always=True)
def status(status):
if status not in ["PASSED", "WARNING", "FAILED", "NOT_AVAILABLE"]:
raise ValueError("Status must be one of PASSED, WARNING, FAILED or NOT_AVAILABLE")
return status
class Recommendation(BaseModel):
"""
Class representing a recommendation for remediation in the AWS Security Finding Format.
Attributes:
- Text (str): The text description of the recommendation.
- Url (str): The URL link for additional information related to the recommendation.
This class defines the structure of a recommendation within the AWS Security Finding Format. It includes attributes to specify the text description and URL link for further details regarding the recommendation.
"""
Text: str = ""
Url: str = ""
@validator("Text", pre=True, always=True)
def text_must_not_exceed_512_chars(text):
text_validated = text
if len(text) > 512:
text_validated = text[:509] + "..."
return text_validated
@validator("Url", pre=True, always=True)
def set_default_url_if_empty(url):
default_url = "https://docs.aws.amazon.com/securityhub/latest/userguide/what-is-securityhub.html"
if url:
default_url = url
return default_url
class Remediation(BaseModel):
"""
Class representing a remediation action in the AWS Security Finding Format.
Attributes:
- Recommendation (Recommendation): An instance of the Recommendation class providing details for remediation.
This class defines the structure of a remediation action within the AWS Security Finding Format. It includes an attribute to specify the recommendation for remediation, which is an instance of the Recommendation class.
"""
Recommendation: Recommendation
class AWSSecurityFindingFormat(BaseModel):
"""
AWSSecurityFindingFormat generates a finding's output in JSON ASFF format: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
Attributes:
- SchemaVersion (str): The version of the ASFF schema being used, default value is "2018-10-08".
- Id (str): The unique identifier of the finding.
- ProductArn (str): The ARN of the product generating the finding.
- RecordState (str): The state of the finding record, default value is "ACTIVE".
- ProductFields (ProductFields): An instance of the ProductFields class representing the product fields of the finding.
- GeneratorId (str): The ID of the generator.
- AwsAccountId (str): The AWS account ID associated with the finding.
- Types (list[str]): A list of types associated with the finding, default value is None.
- FirstObservedAt (str): The timestamp when the finding was first observed.
- UpdatedAt (str): The timestamp when the finding was last updated.
- CreatedAt (str): The timestamp when the finding was created.
- Severity (Severity): An instance of the Severity class representing the severity of the finding.
- Title (str): The title of the finding.
- Description (str): The description of the finding, truncated to 1024 characters if longer.
- Resources (list[Resource]): A list of resources associated with the finding, default value is None.
- Compliance (Compliance): An instance of the Compliance class representing the compliance details of the finding.
- Remediation (Remediation): An instance of the Remediation class providing details for remediation.
This class defines the structure of a finding in the AWS Security Finding Format, including various attributes such as schema version, identifiers, timestamps, severity, title, description, resources, compliance details, and remediation information.
"""
SchemaVersion: str = "2018-10-08"
Id: str
ProductArn: str
RecordState: str = "ACTIVE"
ProductFields: ProductFields
GeneratorId: str
AwsAccountId: str
Types: list[str] = None
FirstObservedAt: str
UpdatedAt: str
CreatedAt: str
Severity: Severity
Title: str
Description: str
Resources: list[Resource] = None
Compliance: Compliance
Remediation: Remediation
@validator("Description", pre=True, always=True)
def description_must_not_exceed_1024_chars(description):
description_validated = description
if len(description) > 1024:
description_validated = description[:1021] + "..."
return description_validated
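To make the array-writing logic in batch_write_data_to_file concrete, here is a minimal standalone sketch of the same trailing-comma technique, with io.StringIO standing in for the real file descriptor (an illustration, not Prowler's importable API):

from io import StringIO
from json import dump
from os import SEEK_SET

buffer = StringIO()
buffer.write("[")
for finding in [{"Id": "finding-1"}, {"Id": "finding-2"}]:
    dump(finding, buffer, indent=4)
    buffer.write(",")
# Seek back over the trailing comma and truncate it before closing the array
if buffer.tell() > 1:
    buffer.seek(buffer.tell() - 1, SEEK_SET)
    buffer.truncate()
buffer.write("]")
print(buffer.getvalue())  # a valid JSON array with two findings

Along the same lines, format_compliance({"cis_1.5_aws": ["2.1.3"]}) returns ([{"StandardsId": "cis_1.5_aws"}], ["cis_1.5_aws 2.1.3"]), truncating each summary to 63 characters when it exceeds 64.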

View File

@@ -2,7 +2,6 @@ from operator import attrgetter
from prowler.config.config import timestamp
from prowler.lib.logger import logger
from prowler.lib.outputs.common_models import FindingOutput
from prowler.lib.outputs.utils import unroll_list, unroll_tags
from prowler.lib.utils.utils import outputs_unix_timestamp
@@ -22,87 +21,6 @@ def get_provider_data_mapping(provider) -> dict:
return data
def generate_provider_output(provider, finding, csv_data) -> FindingOutput:
"""
generate_provider_output returns the provider's Finding output model
"""
# TODO: we have to standardize this between the above mapping and the provider.get_output_mapping()
try:
if provider.type == "aws":
# TODO: the Organization UID probably does not include the account id
csv_data["auth_method"] = f"profile: {csv_data['auth_method']}"
csv_data["resource_name"] = finding.resource_id
csv_data["resource_uid"] = finding.resource_arn
csv_data["region"] = finding.region
elif provider.type == "azure":
# TODO: we should probably show the authentication method used
csv_data["auth_method"] = (
f"{provider.identity.identity_type}: {provider.identity.identity_id}"
)
# Get the first tenant domain ID, just in case
csv_data["account_organization_uid"] = csv_data["account_organization_uid"][
0
]
csv_data["account_uid"] = (
csv_data["account_organization_uid"]
if "Tenant:" in finding.subscription
else provider.identity.subscriptions[finding.subscription]
)
csv_data["account_name"] = finding.subscription
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["region"] = finding.location
elif provider.type == "gcp":
csv_data["auth_method"] = f"Principal: {csv_data['auth_method']}"
csv_data["account_uid"] = provider.projects[finding.project_id].id
csv_data["account_name"] = provider.projects[finding.project_id].name
csv_data["account_tags"] = provider.projects[finding.project_id].labels
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["region"] = finding.location
if (
provider.projects
and finding.project_id in provider.projects
and getattr(provider.projects[finding.project_id], "organization", None)
):
csv_data["account_organization_uid"] = provider.projects[
finding.project_id
].organization.id
# TODO: for now is None since we don't retrieve that data
csv_data["account_organization"] = provider.projects[
finding.project_id
].organization.display_name
elif provider.type == "kubernetes":
if provider.identity.context == "In-Cluster":
csv_data["auth_method"] = "in-cluster"
else:
csv_data["auth_method"] = "kubeconfig"
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["account_name"] = f"context: {provider.identity.context}"
csv_data["region"] = f"namespace: {finding.namespace}"
# Finding Unique ID
# TODO: move this to a function
# TODO: in Azure, GCP and K8s there are findings without resource_name
csv_data["finding_uid"] = (
f"prowler-{provider.type}-{finding.check_metadata.CheckID}-{csv_data['account_uid']}-{csv_data['region']}-{csv_data['resource_name']}"
)
finding_output = FindingOutput(**csv_data)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
return finding_output
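# For instance (illustrative values), an AWS finding yields a finding_uid such as:
# prowler-aws-iam_root_mfa_enabled-123456789012-us-east-1-<resource_name>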
# TODO: add test for outputs_unix_timestamp
def fill_common_finding_data(finding: dict, unix_timestamp: bool) -> dict:
finding_data = {

View File

@@ -1,77 +0,0 @@
from datetime import datetime
from enum import Enum
from typing import Optional, Union
from pydantic import BaseModel
from prowler.config.config import prowler_version
class Status(str, Enum):
PASS = "PASS"
FAIL = "FAIL"
MANUAL = "MANUAL"
class Severity(str, Enum):
critical = "critical"
high = "high"
medium = "medium"
low = "low"
informational = "informational"
class FindingOutput(BaseModel):
"""
FindingOutput generates a finding's output. It can be written to CSV or another format doing the mapping.
This is the base finding output model for every provider.
"""
auth_method: str
timestamp: Union[int, datetime]
account_uid: str
# Optional since depends on permissions
account_name: Optional[str]
# Optional since depends on permissions
account_email: Optional[str]
# Optional since depends on permissions
account_organization_uid: Optional[str]
# Optional since depends on permissions
account_organization_name: Optional[str]
# Optional since depends on permissions
account_tags: Optional[list[str]]
finding_uid: str
provider: str
check_id: str
check_title: str
check_type: str
status: Status
status_extended: str
muted: bool = False
service_name: str
subservice_name: str
severity: Severity
resource_type: str
resource_uid: str
resource_name: str
resource_details: str
resource_tags: str
# Only present for AWS and Azure
partition: Optional[str]
region: str
description: str
risk: str
related_url: str
remediation_recommendation_text: str
remediation_recommendation_url: str
remediation_code_nativeiac: str
remediation_code_terraform: str
remediation_code_cli: str
remediation_code_other: str
compliance: dict
categories: str
depends_on: str
related_to: str
notes: str
prowler_version: str = prowler_version

View File

@@ -0,0 +1,97 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.aws_well_architected.models import (
AWSWellArchitectedModel,
)
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class AWSWellArchitected(ComplianceOutput):
"""
This class represents the AWS Well-Architected compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into AWS Well-Architected compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into AWS Well-Architected compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AWSWellArchitectedModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AWSWellArchitectedModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
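A hypothetical driver for the class above (the variables findings and wa_compliance and the file path are assumptions; the constructor signature is the one defined on ComplianceOutput later in this diff):

wa_output = AWSWellArchitected(
    findings=findings,                 # list[Finding] collected from a scan
    compliance=wa_compliance,          # the AWS Well-Architected ComplianceBaseModel
    create_file_descriptor=True,
    file_path="output/wellarchitected.csv",
)
wa_output.batch_write_data_to_file()   # writes the ";"-delimited CSV rows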

View File

@@ -0,0 +1,32 @@
from typing import Optional
from pydantic import BaseModel
class AWSWellArchitectedModel(BaseModel):
"""
AWSWellArchitectedModel generates a finding's output in AWS Well-Architected Framework format.
"""
Provider: str
Description: str
AccountId: str
Region: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Name: str
Requirements_Attributes_WellArchitectedQuestionId: str
Requirements_Attributes_WellArchitectedPracticeId: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_LevelOfRisk: str
Requirements_Attributes_AssessmentMethod: str
Requirements_Attributes_Description: str
Requirements_Attributes_ImplementationGuidanceUrl: str
Status: str
StatusExtended: str
ResourceId: str
CheckId: str
Muted: bool
ResourceName: str

View File

@@ -1,60 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.models import Check_Output_CSV_AWS_Well_Architected
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_aws_well_architected_framework(
file_descriptors, finding, compliance, output_options, provider
):
try:
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_AWS_Well_Architected)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_AWS_Well_Architected(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=provider.identity.account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_writer.writerow(compliance_row.__dict__)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)

View File

@@ -2,75 +2,6 @@ from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import orange_color
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.cis_aws import generate_compliance_row_cis_aws
from prowler.lib.outputs.compliance.cis_azure import generate_compliance_row_cis_azure
from prowler.lib.outputs.compliance.cis_gcp import generate_compliance_row_cis_gcp
from prowler.lib.outputs.compliance.cis_kubernetes import (
generate_compliance_row_cis_kubernetes,
)
from prowler.lib.outputs.csv.csv import write_csv
def write_compliance_row_cis(
file_descriptors,
finding,
compliance,
output_options,
provider,
input_compliance_frameworks,
):
try:
compliance_output = (
"cis_" + compliance.Version + "_" + compliance.Provider.lower()
)
# Only write the CIS version that was selected
if compliance_output in str(input_compliance_frameworks):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
if compliance.Provider == "AWS":
(compliance_row, csv_header) = generate_compliance_row_cis_aws(
finding,
compliance,
requirement,
attribute,
output_options,
provider,
)
elif compliance.Provider == "Azure":
(compliance_row, csv_header) = (
generate_compliance_row_cis_azure(
finding,
compliance,
requirement,
attribute,
output_options,
)
)
elif compliance.Provider == "GCP":
(compliance_row, csv_header) = generate_compliance_row_cis_gcp(
finding, compliance, requirement, attribute, output_options
)
elif compliance.Provider == "Kubernetes":
(compliance_row, csv_header) = (
generate_compliance_row_cis_kubernetes(
finding,
compliance,
requirement,
attribute,
output_options,
provider,
)
)
write_csv(
file_descriptors[compliance_output], csv_header, compliance_row
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def get_cis_table(

View File

@@ -0,0 +1,97 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.cis.models import AWSCISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class AWSCIS(ComplianceOutput):
"""
This class represents the AWS CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into AWS CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into AWS CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AWSCISModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AWSCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
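The matching step above can be illustrated with plain data (values are made up): the compliance map on a finding lists, per framework, the requirement Ids that the check covers, and only those Ids produce rows.

finding_compliance = {"CIS-1.5": ["2.1.3"], "CIS-1.4": ["2.1.3"]}
finding_requirements = finding_compliance.get("CIS-1.5", [])  # ["2.1.3"]
# Only requirements whose Id is in finding_requirements yield an
# AWSCISModel row (one per requirement attribute) for this finding.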

View File

@@ -0,0 +1,99 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.cis.models import AzureCISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class AzureCIS(ComplianceOutput):
"""
This class represents the Azure CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into Azure CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into Azure CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AzureCISModel(
Provider=finding.provider,
Description=compliance.Description,
Subscription=finding.account_name,
Location=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AzureCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
Subscription="",
Location="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)

View File

@@ -0,0 +1,97 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.cis.models import GCPCISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class GCPCIS(ComplianceOutput):
"""
This class represents the GCP CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into GCP CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into GCP CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = GCPCISModel(
Provider=finding.provider,
Description=compliance.Description,
ProjectId=finding.account_uid,
Location=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = GCPCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
ProjectId="",
Location="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)

View File

@@ -0,0 +1,101 @@
from datetime import datetime
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.cis.models import KubernetesCISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class KubernetesCIS(ComplianceOutput):
"""
This class represents the Kubernetes CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into Kubernetes CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into Kubernetes CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = KubernetesCISModel(
Provider=finding.provider,
Description=compliance.Description,
Context=finding.account_name,
Namespace=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = KubernetesCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
Context="",
Namespace="",
AssessmentDate=str(datetime.now()),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)

View File

@@ -0,0 +1,162 @@
from pydantic import BaseModel
class AWSCISModel(BaseModel):
"""
AWSCISModel generates a finding's output in AWS CIS Compliance format.
"""
Provider: str
Description: str
AccountId: str
Region: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_References: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
Muted: bool
class AzureCISModel(BaseModel):
"""
AzureCISModel generates a finding's output in Azure CIS Compliance format.
"""
Provider: str
Description: str
Subscription: str
Location: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_DefaultValue: str
Requirements_Attributes_References: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
Muted: bool
class GCPCISModel(BaseModel):
"""
GCPCISModel generates a finding's output in GCP CIS Compliance format.
"""
Provider: str
Description: str
ProjectId: str
Location: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_References: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
Muted: bool
class KubernetesCISModel(BaseModel):
"""
KubernetesCISModel generates a finding's output in Kubernetes CIS Compliance format.
"""
Provider: str
Description: str
Context: str
Namespace: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_References: str
Requirements_Attributes_DefaultValue: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
Muted: bool
# TODO: Create a parent class for the common fields of CIS and have the provider-specific classes inherit from it.
# It has not been done yet because the current order of the fields in the output file must be preserved.
# class AWS(CIS):
# """
# AWS CIS Compliance format.
# """
# AccountId: str
# Region: str
# class Azure(CIS):
# """
# Azure CIS Compliance format.
# """
# Subscription: str
# Location: str
# class GCP(CIS):
# """
# GCP CIS Compliance format.
# """
# ProjectId: str
# Location: str
# class Kubernetes(CIS):
# """
# Kubernetes CIS Compliance format.
# """
# Context: str
# Namespace: str

View File

@@ -1,36 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.compliance.models import Check_Output_CSV_AWS_CIS
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_aws(
finding, compliance, requirement, attribute, output_options, provider
):
compliance_row = Check_Output_CSV_AWS_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=provider.identity.account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_header = generate_csv_fields(Check_Output_CSV_AWS_CIS)
return compliance_row, csv_header

View File

@@ -1,37 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.compliance.models import Check_Output_CSV_AZURE_CIS
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_azure(
finding, compliance, requirement, attribute, output_options
):
compliance_row = Check_Output_CSV_AZURE_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
Subscription=finding.subscription,
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
ResourceName=finding.resource_name,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_header = generate_csv_fields(Check_Output_CSV_AZURE_CIS)
return compliance_row, csv_header

View File

@@ -1,37 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.compliance.models import Check_Output_CSV_GCP_CIS
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_gcp(
finding, compliance, requirement, attribute, output_options
):
compliance_row = Check_Output_CSV_GCP_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
ProjectId=finding.project_id,
Location=finding.location.lower(),
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
ResourceName=finding.resource_name,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_header = generate_csv_fields(Check_Output_CSV_GCP_CIS)
return compliance_row, csv_header

View File

@@ -1,37 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.compliance.models import Check_Output_CSV_KUBERNETES_CIS
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_kubernetes(
finding, compliance, requirement, attribute, output_options, provider
):
compliance_row = Check_Output_CSV_KUBERNETES_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
Context=provider.identity.context,
Namespace=finding.namespace,
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_header = generate_csv_fields(Check_Output_CSV_KUBERNETES_CIS)
return compliance_row, csv_header

View File

@@ -2,144 +2,16 @@ import sys
from prowler.lib.check.models import Check_Report
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.aws_well_architected_framework import (
write_compliance_row_aws_well_architected_framework,
)
from prowler.lib.outputs.compliance.cis import get_cis_table, write_compliance_row_cis
from prowler.lib.outputs.compliance.ens_rd2022_aws import (
get_ens_rd2022_aws_table,
write_compliance_row_ens_rd2022_aws,
)
from prowler.lib.outputs.compliance.generic import (
from prowler.lib.outputs.compliance.cis.cis import get_cis_table
from prowler.lib.outputs.compliance.ens.ens import get_ens_table
from prowler.lib.outputs.compliance.generic.generic_table import (
get_generic_compliance_table,
write_compliance_row_generic,
)
from prowler.lib.outputs.compliance.iso27001_2013_aws import (
write_compliance_row_iso27001_2013_aws,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack import (
get_mitre_attack_table,
write_compliance_row_mitre_attack,
)
def add_manual_controls(
output_options, provider, file_descriptors, input_compliance_frameworks
):
try:
# Check if MANUAL control was already added to output
if "manual_check" in output_options.bulk_checks_metadata:
manual_finding = Check_Report(
output_options.bulk_checks_metadata["manual_check"].json()
)
manual_finding.status = "MANUAL"
manual_finding.status_extended = "Manual check"
manual_finding.resource_id = "manual_check"
manual_finding.resource_name = "Manual check"
manual_finding.region = ""
manual_finding.location = ""
manual_finding.project_id = ""
manual_finding.subscription = ""
manual_finding.namespace = ""
fill_compliance(
output_options,
manual_finding,
provider,
file_descriptors,
input_compliance_frameworks,
)
del output_options.bulk_checks_metadata["manual_check"]
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def get_check_compliance_frameworks_in_input(
check_id, bulk_checks_metadata, input_compliance_frameworks
):
"""get_check_compliance_frameworks_in_input returns a list of Compliance for the given check if the compliance framework is present in the input compliance to execute"""
check_compliances = []
if bulk_checks_metadata and bulk_checks_metadata[check_id]:
for compliance in bulk_checks_metadata[check_id].Compliance:
compliance_name = ""
if compliance.Version:
compliance_name = (
compliance.Framework.lower()
+ "_"
+ compliance.Version.lower()
+ "_"
+ compliance.Provider.lower()
)
else:
compliance_name = (
compliance.Framework.lower() + "_" + compliance.Provider.lower()
)
if compliance_name.replace("-", "_") in input_compliance_frameworks:
check_compliances.append(compliance)
return check_compliances
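# For instance, Framework="CIS", Version="1.5", Provider="AWS" builds the name
# "cis_1.5_aws", which (with "-" replaced by "_") is matched against the input
# compliance frameworks.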
def fill_compliance(
output_options, finding, provider, file_descriptors, input_compliance_frameworks
):
try:
# We have to retrieve all the check's compliance requirements and get the ones matching with the input ones
check_compliances = get_check_compliance_frameworks_in_input(
finding.check_metadata.CheckID,
output_options.bulk_checks_metadata,
input_compliance_frameworks,
)
for compliance in check_compliances:
if compliance.Framework == "ENS" and compliance.Version == "RD2022":
write_compliance_row_ens_rd2022_aws(
file_descriptors, finding, compliance, output_options, provider
)
elif compliance.Framework == "CIS":
write_compliance_row_cis(
file_descriptors,
finding,
compliance,
output_options,
provider,
input_compliance_frameworks,
)
elif (
"AWS-Well-Architected-Framework" in compliance.Framework
and compliance.Provider == "AWS"
):
write_compliance_row_aws_well_architected_framework(
file_descriptors, finding, compliance, output_options, provider
)
elif (
compliance.Framework == "ISO27001"
and compliance.Version == "2013"
and compliance.Provider == "AWS"
):
write_compliance_row_iso27001_2013_aws(
file_descriptors, finding, compliance, output_options, provider
)
elif compliance.Framework == "MITRE-ATTACK" and compliance.Version == "":
write_compliance_row_mitre_attack(
file_descriptors, finding, compliance, provider
)
else:
write_compliance_row_generic(
file_descriptors, finding, compliance, output_options, provider
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def display_compliance_table(
findings: list,
bulk_checks_metadata: dict,
@@ -147,10 +19,24 @@ def display_compliance_table(
output_filename: str,
output_directory: str,
compliance_overview: bool,
):
) -> None:
"""
display_compliance_table generates the compliance table for the given compliance framework.
Args:
findings (list): The list of findings
bulk_checks_metadata (dict): The bulk checks metadata
compliance_framework (str): The compliance framework to generate the table for
output_filename (str): The output filename
output_directory (str): The output directory
compliance_overview (bool): Whether to display only the compliance overview
Returns:
None
"""
try:
if "ens_rd2022_aws" == compliance_framework:
get_ens_rd2022_aws_table(
if "ens_" in compliance_framework:
get_ens_table(
findings,
bulk_checks_metadata,
compliance_framework,
@@ -192,7 +78,10 @@ def display_compliance_table(
sys.exit(1)
def get_check_compliance(finding, provider_type, output_options) -> dict:
# TODO: this should be in the Check class
def get_check_compliance(
finding: Check_Report, provider_type: str, bulk_checks_metadata: dict
) -> dict:
"""get_check_compliance returns a map with the compliance framework as key and the requirements where the finding's check is present.
Example:
@@ -201,12 +90,20 @@ def get_check_compliance(finding, provider_type, output_options) -> dict:
"CIS-1.4": ["2.1.3"],
"CIS-1.5": ["2.1.3"],
}
Args:
finding (Check_Report): The Check_Report finding
provider_type (str): The provider type
bulk_checks_metadata (dict): The bulk checks metadata
Returns:
dict: The compliance framework as key and the requirements where the finding's check is present.
"""
try:
check_compliance = {}
# We have to retrieve all the check's compliance requirements
-        if finding.check_metadata.CheckID in output_options.bulk_checks_metadata:
-            for compliance in output_options.bulk_checks_metadata[
+        if finding.check_metadata.CheckID in bulk_checks_metadata:
+            for compliance in bulk_checks_metadata[
finding.check_metadata.CheckID
].Compliance:
compliance_fw = compliance.Framework
@@ -0,0 +1,81 @@
from csv import DictWriter
from pathlib import Path
from typing import List
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.logger import logger
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.output import Output
class ComplianceOutput(Output):
"""
    This class represents an abstract base class for defining compliance-specific output formats for findings.
Attributes:
_data (list): A list to store transformed data from findings.
_file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
        __init__: Initializes the ComplianceOutput class with findings and a compliance model, optionally creating a file descriptor.
data: Property to access the transformed data.
file_descriptor: Property to access the file descriptor.
transform: Abstract method to transform findings into a specific format.
batch_write_data_to_file: Abstract method to write data to a file in batches.
create_file_descriptor: Method to create a file descriptor for writing data to a file.
"""
def __init__(
self,
findings: List[Finding],
compliance: ComplianceBaseModel,
create_file_descriptor: bool = False,
file_path: str = None,
file_extension: str = "",
) -> None:
self._data = []
if not file_extension and file_path:
self._file_extension = "".join(Path(file_path).suffixes)
if file_extension:
self._file_extension = file_extension
if findings:
# Get the compliance name of the model
compliance_name = (
compliance.Framework + "-" + compliance.Version
if compliance.Version
else compliance.Framework
)
self.transform(findings, compliance, compliance_name)
if create_file_descriptor:
self.create_file_descriptor(file_path)
def batch_write_data_to_file(self) -> None:
"""
Writes the findings data to a CSV file in the specific compliance format.
Returns:
- None
"""
try:
if (
getattr(self, "_file_descriptor", None)
and not self._file_descriptor.closed
and self._data
):
csv_writer = DictWriter(
self._file_descriptor,
fieldnames=[field.upper() for field in self._data[0].dict().keys()],
delimiter=";",
)
csv_writer.writeheader()
for finding in self._data:
csv_writer.writerow(
{k.upper(): v for k, v in finding.dict().items()}
)
self._file_descriptor.close()
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
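# --- Editor's note (illustrative, not part of the diff) ----------------------
# A minimal sketch of how a concrete subclass is expected to plug into this
# base class; _RowModel and the file path are assumptions for illustration.
# Note that an explicit file_extension argument overrides the extension
# derived from file_path's suffixes in __init__.
from pydantic import BaseModel


class _RowModel(BaseModel):
    Provider: str
    Status: str


class _SketchCompliance(ComplianceOutput):
    def transform(self, findings, compliance, compliance_name) -> None:
        # Flatten each finding into one CSV row of the sketch model.
        for finding in findings:
            self._data.append(
                _RowModel(Provider=finding.provider, Status=finding.status)
            )


# _out = _SketchCompliance(
#     findings, compliance_model, create_file_descriptor=True,
#     file_path="output/compliance/sketch.csv",
# )
# _out.batch_write_data_to_file()  # ';'-delimited CSV with upper-cased headers
# ------------------------------------------------------------------------------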
@@ -1,58 +1,10 @@
-from csv import DictWriter
from colorama import Fore, Style
from tabulate import tabulate
-from prowler.config.config import orange_color, timestamp
-from prowler.lib.outputs.compliance.models import Check_Output_CSV_ENS_RD2022
-from prowler.lib.outputs.csv.csv import generate_csv_fields
-from prowler.lib.utils.utils import outputs_unix_timestamp
+from prowler.config.config import orange_color
-def write_compliance_row_ens_rd2022_aws(
-    file_descriptors, finding, compliance, output_options, provider
-):
-    compliance_output = "ens_rd2022_aws"
-    csv_header = generate_csv_fields(Check_Output_CSV_ENS_RD2022)
-    csv_writer = DictWriter(
-        file_descriptors[compliance_output],
-        fieldnames=csv_header,
-        delimiter=";",
-    )
-    for requirement in compliance.Requirements:
-        requirement_description = requirement.Description
-        requirement_id = requirement.Id
-        for attribute in requirement.Attributes:
-            compliance_row = Check_Output_CSV_ENS_RD2022(
-                Provider=finding.check_metadata.Provider,
-                Description=compliance.Description,
-                AccountId=provider.identity.account,
-                Region=finding.region,
-                AssessmentDate=outputs_unix_timestamp(
-                    output_options.unix_timestamp, timestamp
-                ),
-                Requirements_Id=requirement_id,
-                Requirements_Description=requirement_description,
-                Requirements_Attributes_IdGrupoControl=attribute.IdGrupoControl,
-                Requirements_Attributes_Marco=attribute.Marco,
-                Requirements_Attributes_Categoria=attribute.Categoria,
-                Requirements_Attributes_DescripcionControl=attribute.DescripcionControl,
-                Requirements_Attributes_Nivel=attribute.Nivel,
-                Requirements_Attributes_Tipo=attribute.Tipo,
-                Requirements_Attributes_Dimensiones=",".join(attribute.Dimensiones),
-                Requirements_Attributes_ModoEjecucion=attribute.ModoEjecucion,
-                Requirements_Attributes_Dependencias=",".join(attribute.Dependencias),
-                Status=finding.status,
-                StatusExtended=finding.status_extended,
-                ResourceId=finding.resource_id,
-                CheckId=finding.check_metadata.CheckID,
-                Muted=finding.muted,
-            )
-            csv_writer.writerow(compliance_row.__dict__)
-def get_ens_rd2022_aws_table(
+def get_ens_table(
findings: list,
bulk_checks_metadata: dict,
compliance_framework: str,
@@ -78,11 +30,7 @@ def get_ens_rd2022_aws_table(
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
-            if (
-                compliance.Framework == "ENS"
-                and compliance.Provider == "AWS"
-                and compliance.Version == "RD2022"
-            ):
+            if compliance.Framework == "ENS" and compliance.Provider == "AWS":
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
marco_categoria = f"{attribute.Marco}/{attribute.Categoria}"
@@ -0,0 +1,103 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.compliance.ens.models import AWSENSModel
from prowler.lib.outputs.finding import Finding
class AWSENS(ComplianceOutput):
"""
This class represents the AWS ENS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into AWS ENS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into AWS ENS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AWSENSModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_IdGrupoControl=attribute.IdGrupoControl,
Requirements_Attributes_Marco=attribute.Marco,
Requirements_Attributes_Categoria=attribute.Categoria,
Requirements_Attributes_DescripcionControl=attribute.DescripcionControl,
Requirements_Attributes_Nivel=attribute.Nivel,
Requirements_Attributes_Tipo=attribute.Tipo,
Requirements_Attributes_Dimensiones=",".join(
attribute.Dimensiones
),
Requirements_Attributes_ModoEjecucion=attribute.ModoEjecucion,
Requirements_Attributes_Dependencias=",".join(
attribute.Dependencias
),
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
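        # Editor's note: `finding` below is the last item from the loop above,
        # so manual rows reuse its timestamp as the assessment date; transform
        # only runs with a non-empty findings list (guarded in __init__).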
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AWSENSModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_IdGrupoControl=attribute.IdGrupoControl,
Requirements_Attributes_Marco=attribute.Marco,
Requirements_Attributes_Categoria=attribute.Categoria,
Requirements_Attributes_DescripcionControl=attribute.DescripcionControl,
Requirements_Attributes_Nivel=attribute.Nivel,
Requirements_Attributes_Tipo=attribute.Tipo,
Requirements_Attributes_Dimensiones=",".join(
attribute.Dimensiones
),
Requirements_Attributes_ModoEjecucion=attribute.ModoEjecucion,
Requirements_Attributes_Dependencias=",".join(
attribute.Dependencias
),
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
@@ -0,0 +1,30 @@
from pydantic import BaseModel
class AWSENSModel(BaseModel):
"""
AWSENSModel generates a finding's output in CSV ENS format for AWS.
"""
Provider: str
Description: str
AccountId: str
Region: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_IdGrupoControl: str
Requirements_Attributes_Marco: str
Requirements_Attributes_Categoria: str
Requirements_Attributes_DescripcionControl: str
Requirements_Attributes_Nivel: str
Requirements_Attributes_Tipo: str
Requirements_Attributes_Dimensiones: str
Requirements_Attributes_ModoEjecucion: str
Requirements_Attributes_Dependencias: str
Status: str
StatusExtended: str
ResourceId: str
CheckId: str
Muted: bool
ResourceName: str
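# Editor's note (illustrative): batch_write_data_to_file upper-cases these
# field names for the CSV header (e.g. Requirements_Attributes_IdGrupoControl
# -> REQUIREMENTS_ATTRIBUTES_IDGRUPOCONTROL) and joins columns with ";";
# column order follows the declaration order above.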
@@ -1,105 +0,0 @@
from csv import DictWriter
from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import orange_color, timestamp
from prowler.lib.outputs.compliance.models import Check_Output_CSV_Generic_Compliance
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_generic(
file_descriptors, finding, compliance, output_options, provider
):
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_Generic_Compliance)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_Generic_Compliance(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=provider.identity.account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_SubGroup=attribute.SubGroup,
Requirements_Attributes_Service=attribute.Service,
Requirements_Attributes_Type=attribute.Type,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_writer.writerow(compliance_row.__dict__)
def get_generic_compliance_table(
findings: list,
bulk_checks_metadata: dict,
compliance_framework: str,
output_filename: str,
output_directory: str,
compliance_overview: bool,
):
pass_count = []
fail_count = []
muted_count = []
for index, finding in enumerate(findings):
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
compliance.Framework.upper()
in compliance_framework.upper().replace("_", "-")
and compliance.Version in compliance_framework.upper()
and compliance.Provider in compliance_framework.upper()
):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
if finding.muted:
if index not in muted_count:
muted_count.append(index)
else:
if finding.status == "FAIL" and index not in fail_count:
fail_count.append(index)
elif finding.status == "PASS" and index not in pass_count:
pass_count.append(index)
if (
len(fail_count) + len(pass_count) + len(muted_count) > 1
): # If there are no resources, don't print the compliance table
print(
f"\nCompliance Status of {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Framework:"
)
overview_table = [
[
f"{Fore.RED}{round(len(fail_count) / len(findings) * 100, 2)}% ({len(fail_count)}) FAIL{Style.RESET_ALL}",
f"{Fore.GREEN}{round(len(pass_count) / len(findings) * 100, 2)}% ({len(pass_count)}) PASS{Style.RESET_ALL}",
f"{orange_color}{round(len(muted_count) / len(findings) * 100, 2)}% ({len(muted_count)}) MUTED{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(f"\nDetailed results of {compliance_framework.upper()} are in:")
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
@@ -0,0 +1,87 @@
from prowler.lib.check.compliance_models import ComplianceBaseModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.compliance.generic.models import GenericComplianceModel
from prowler.lib.outputs.finding import Finding
class GenericCompliance(ComplianceOutput):
"""
This class represents the Generic compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into Generic compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: ComplianceBaseModel,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into Generic compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (ComplianceBaseModel): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = GenericComplianceModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_SubGroup=attribute.SubGroup,
Requirements_Attributes_Service=attribute.Service,
Requirements_Attributes_Type=attribute.Type,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = GenericComplianceModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_SubGroup=attribute.SubGroup,
Requirements_Attributes_Service=attribute.Service,
Requirements_Attributes_Type=attribute.Type,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
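# Editor's note (assumed wiring, not shown in this diff): the generic writer
# pairs with ComplianceOutput the same way the ENS one does, e.g.:
#
#     GenericCompliance(
#         findings, compliance_model, create_file_descriptor=True,
#         file_path=f"{output_directory}/compliance/{output_filename}.csv",
#     ).batch_write_data_to_file()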
@@ -0,0 +1,54 @@
from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import orange_color
def get_generic_compliance_table(
findings: list,
bulk_checks_metadata: dict,
compliance_framework: str,
output_filename: str,
output_directory: str,
compliance_overview: bool,
):
pass_count = []
fail_count = []
muted_count = []
for index, finding in enumerate(findings):
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
compliance.Framework.upper()
in compliance_framework.upper().replace("_", "-")
and compliance.Version in compliance_framework.upper()
and compliance.Provider in compliance_framework.upper()
):
if finding.muted:
if index not in muted_count:
muted_count.append(index)
else:
if finding.status == "FAIL" and index not in fail_count:
fail_count.append(index)
elif finding.status == "PASS" and index not in pass_count:
pass_count.append(index)
if (
len(fail_count) + len(pass_count) + len(muted_count) > 1
): # If there are no resources, don't print the compliance table
print(
f"\nCompliance Status of {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Framework:"
)
overview_table = [
[
f"{Fore.RED}{round(len(fail_count) / len(findings) * 100, 2)}% ({len(fail_count)}) FAIL{Style.RESET_ALL}",
f"{Fore.GREEN}{round(len(pass_count) / len(findings) * 100, 2)}% ({len(pass_count)}) PASS{Style.RESET_ALL}",
f"{orange_color}{round(len(muted_count) / len(findings) * 100, 2)}% ({len(muted_count)}) MUTED{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(f"\nDetailed results of {compliance_framework.upper()} are in:")
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
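# Editor's note: the `> 1` guard above also skips the table when exactly one
# finding was counted, not only when there are none. The framework match is
# substring-based: Framework is tested against the "_"->"-" normalized input
# (e.g. "CIS" in "CIS-1.5-AWS"), while Version and Provider are tested against
# the raw upper-cased input (e.g. "1.5" and "AWS" in "CIS_1.5_AWS").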
@@ -0,0 +1,28 @@
from typing import Optional
from pydantic import BaseModel
class GenericComplianceModel(BaseModel):
"""
GenericComplianceModel generates a finding's output in Generic Compliance format.
"""
Provider: str
Description: str
AccountId: str
Region: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: Optional[str]
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_SubGroup: Optional[str]
Requirements_Attributes_Service: Optional[str]
Requirements_Attributes_Type: Optional[str]
Status: str
StatusExtended: str
ResourceId: str
CheckId: str
Muted: bool
ResourceName: str
Some files were not shown because too many files have changed in this diff.