Compare commits

446 Commits
4.2.3...4.4.0

Author SHA1 Message Date
Sergio Garcia
f1f0609697 chore(release): point v4.4 to master (#5250)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Mario Rodriguez Lopez <101330800+MarioRgzLpz@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Prowler Bot <bot@prowler.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
Co-authored-by: LefterisXefteris <136985982+LefterisXefteris@users.noreply.github.com>
Co-authored-by: Lefteris Gilmaz <lefterisgilmaz@Lefteriss-MacBook-Pro.local>
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
Co-authored-by: Amogh Bantwal <100332169+abant07@users.noreply.github.com>
Co-authored-by: Harshit Raj Singh <harshitrajsingh.hrs@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Jude Bae(Bae cheongho) <jude@megazone.com>
Co-authored-by: MZC01-JUDE <mzc01-jude@MZC01-JUDE-2.local>
Co-authored-by: johannes-engler-mw <132657752+johannes-engler-mw@users.noreply.github.com>
2024-09-30 15:38:05 -04:00
Pedro Martín
1af7f658a8 refactor(azure): remove validate_arguments for CLI (#4985) 2024-09-11 13:13:06 +02:00
dependabot[bot]
1298620da8 chore(deps-dev): bump pytest from 8.3.2 to 8.3.3 (#4991)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-11 08:59:46 +02:00
Hugo Pereira Brito
75c48cfaa3 refactor(cloudfront): replace origins dictionary with custom Origin class (#4981)
Co-authored-by: Sergio <sergio@prowler.com>
2024-09-10 16:04:57 -04:00
Sergio Garcia
3406a07ae5 fix(audit): solve resources audit (#4983) 2024-09-10 15:41:59 -04:00
Sergio Garcia
cc9e1c5af8 chore(dependencies): update boto3 and botocore packages (#4976) 2024-09-10 15:36:23 -04:00
Sergio Garcia
0343f01cca chore(README): update summary table (#4984) 2024-09-10 21:17:33 +02:00
dependabot[bot]
cad7985c28 chore(deps-dev): bump moto from 5.0.13 to 5.0.14 (#4965)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-09-10 14:36:21 -04:00
Pedro Martín
71030f6f42 fix(main): logic for resource_tag and resource_arn usage (#4979)
Co-authored-by: Sergio <sergio@prowler.com>
2024-09-10 14:07:07 -04:00
Daniel Barranquero
6883467d2f feat(aws): Add new check to ensure RDS DB clusters are encrypted at rest (#4931) 2024-09-10 13:40:08 -04:00
Sergio Garcia
2c6944176f fix(rds): handle new rds arn template function syntax (#4980) 2024-09-10 13:24:19 -04:00
Daniel Barranquero
1ef15f0b24 feat(aws): Add new check to ensure RDS event notification subscriptions are configured for critical database parameter group events (#4907) 2024-09-10 11:10:57 -04:00
dependabot[bot]
f5b0583df5 chore(deps-dev): bump pytest-env from 1.1.3 to 1.1.4 (#4966)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-10 10:17:36 -04:00
Daniel Barranquero
db225e9d2a feat(aws): Add new RDS check to ensure db instances are protected by a backup plan (#4879)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-09-10 10:14:40 -04:00
Daniel Barranquero
c9ae9df87f feat(aws): Add new check to ensure RDS event notification subscriptions are configured for critical database instance events (#4891) 2024-09-10 09:26:15 -04:00
Daniel Barranquero
159a090c02 feat(aws): Add new check to ensure RDS event notification subscriptions are configured for critical cluster events (#4887) 2024-09-10 09:25:42 -04:00
Daniel Barranquero
605c6770e5 fix(rds): Modify RDS Event Notification Subscriptions for Security Groups Events check (#4969) 2024-09-10 09:13:46 -04:00
Pedro Martín
ae950484ed fix(aws): use set intersection to retrieve the checks to execute (#4970) 2024-09-10 13:24:35 +02:00
Prowler Bot
c54b815b90 chore(regions_update): Changes in regions for AWS services (#4971)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-09-10 12:55:06 +02:00
Pedro Martín
7a937c7708 refactor(provider): move audit and fixer config inside the provider (#4960) 2024-09-10 09:48:11 +02:00
dependabot[bot]
d62e74853e chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.2.7 to 1.2.8 (#4967)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-10 09:22:10 +02:00
Mario Rodriguez Lopez
bab59bc86e feat(EC2): Change service to adjust the data saved in template_data in LaunchTemplateVersion (#4848) 2024-09-09 12:32:39 -04:00
dependabot[bot]
39e8485fc1 chore(deps): bump slack-sdk from 3.31.0 to 3.32.0 (#4955)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-09 11:10:40 +02:00
Prowler Bot
b9f46cafff chore(regions_update): Changes in regions for AWS services (#4956)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-09-09 09:15:40 +02:00
Pedro Martín
48377ca865 feat(azure): add custom exception class (#4871) 2024-09-06 14:50:27 +02:00
Pedro Martín
4d902e02bb fix(security-groups): remove RFC1918 from ec2_securitygroup_allow_wide_open_public_ipv4 (#4951) 2024-09-06 13:42:28 +02:00
Pedro Martín
e146491d4b fix(aws): change check metadata ec2_securitygroup_allow_wide_open_public_ipv4 (#4946) 2024-09-06 12:31:19 +02:00
Pedro Martín
4eed5c7a99 refactor(check_metadata): move bulk_load_checks_metadata inside class (#4934) 2024-09-06 09:50:14 +02:00
dependabot[bot]
f169599a56 chore(deps): bump msgraph-sdk from 1.5.4 to 1.6.0 (#4940)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-09-06 09:49:20 +02:00
dependabot[bot]
95768baa9e chore(deps): bump google-api-python-client from 2.143.0 to 2.144.0 (#4943)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-06 08:24:31 +02:00
Pedro Martín
d8d348f609 feat(kubernetes): add custom exception class (#4912) 2024-09-05 16:52:34 +02:00
dependabot[bot]
bd336250ee chore(deps): bump dash from 2.17.1 to 2.18.0 (#4932)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-05 09:16:51 +02:00
Pedro Martín
a975e96a45 feat(compliance): add method list_compliance_requirements (#4890)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-09-04 20:35:26 +02:00
Pedro Martín
3933440a08 feat(secrets): improve detect secrets checks and add config (#4915) 2024-09-04 16:54:55 +02:00
Prowler Bot
36e7bf0912 chore(regions_update): Changes in regions for AWS services (#4929)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-09-04 11:45:59 +02:00
dependabot[bot]
897e25dd3c chore(deps): bump cryptography from 43.0.0 to 43.0.1 (#4928)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-04 09:46:58 +02:00
dependabot[bot]
f4a8059f9b chore(deps): bump cryptography from 43.0.0 to 43.0.1 (#4923)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-04 08:54:56 +02:00
dependabot[bot]
71d844c101 chore(deps): bump peter-evans/create-pull-request from 6 to 7 (#4926)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-04 08:53:26 +02:00
Pedro Martín
c2b2754926 feat(gcp): add custom exceptions class (#4908) 2024-09-03 15:56:49 +02:00
Pedro Martín
cfd4019281 fix(aws): raise ArgumentTypeError for parser (#4921) 2024-09-03 13:47:43 +02:00
dependabot[bot]
989fce300d chore(deps-dev): bump pylint from 3.2.6 to 3.2.7 (#4920)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-03 07:21:52 +02:00
Amogh Bantwal
70fdc2693e feat(html): Add number of muted findings in HTML report #4703 (#4895) 2024-09-02 10:13:06 +02:00
Rubén De la Torre Vico
9797c11152 chore(prowler): rename all service methods from double-underscore to single-underscore format (#4910) 2024-09-02 10:07:21 +02:00
Pedro Martín
007c1febf7 fix(metadata): change description from documentdb_cluster_deletion_protection (#4909) 2024-09-02 09:59:29 +02:00
Pepe Fagoaga
163027a49d chore(aws): Remove token from log line (#4903) 2024-08-30 11:50:18 +02:00
Pepe Fagoaga
80c4802b36 chore(aws_mutelist): Add more Control Tower resources and tests (#4900) 2024-08-30 10:13:00 +02:00
dependabot[bot]
285eb45673 chore(deps): bump trufflesecurity/trufflehog from 3.81.9 to 3.81.10 (#4898)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-30 09:44:12 +02:00
dependabot[bot]
5c2f2ee3b3 chore(deps-dev): bump safety from 3.2.6 to 3.2.7 (#4899)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-30 09:43:58 +02:00
Pedro Martín
1f83e4fe7b chore(pull-request): add check for backport (#4901) 2024-08-30 09:42:52 +02:00
Pedro Martín
b29f99441a feat(aws): add custom exceptions class (#4847) 2024-08-29 19:08:47 +02:00
Pedro Martín
82c065bff4 feat(compliance): rename Compliance class and add list_compliance (#4883) 2024-08-29 16:55:22 +02:00
Pedro Martín
168d44d14b docs(fixers): improve docs about fixers (#4889) 2024-08-29 14:15:31 +02:00
dependabot[bot]
910a72140b chore(deps): bump google-api-python-client from 2.142.0 to 2.143.0 (#4884)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-29 07:56:38 +02:00
Prowler Bot
d988877173 chore(regions_update): Changes in regions for AWS services (#4880)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-28 11:45:12 +02:00
Toni de la Fuente
4fd673fd7c chore(readme): Update Slack invite link (#4875) 2024-08-27 21:44:12 +02:00
Pepe Fagoaga
1bff2451e5 chore(release): Remove unused step (#4874) 2024-08-27 16:40:15 +02:00
Pepe Fagoaga
0921daf18b chore: remove not used variable (#4873) 2024-08-27 16:31:13 +02:00
Pedro Martín
7ff80dbb8f fix(rds): get the db_instances values (#4866) 2024-08-27 13:22:54 +02:00
dependabot[bot]
f487bda1fe chore(deps): bump numpy from 2.0.1 to 2.0.2 (#4869)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-27 08:05:57 +02:00
Pepe Fagoaga
d61e999b8f chore(check_metadata): Rename to CheckMetadata (#4864) 2024-08-26 15:25:19 +02:00
Rubén De la Torre Vico
bcb63d0b2d feat(elb): add new check elb_is_in_multiple_az (#4829)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-26 13:27:08 +02:00
Pepe Fagoaga
71f50422ad chore(aws-region): Use Prowler Bot (#4863) 2024-08-26 11:04:02 +02:00
Rubén De la Torre Vico
2b49aa8e89 chore(readme): Update the number of AWS checks (#4860) 2024-08-26 10:09:54 +02:00
Pedro Martín
921b6b1e85 fix(aws): enhance check cloudformation_stack_outputs_find_secrets (#4859) 2024-08-26 10:08:19 +02:00
dependabot[bot]
fc155e8368 chore(deps): bump azure-mgmt-compute from 32.0.0 to 33.0.0 (#4856)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-26 08:01:31 +02:00
Rubén De la Torre Vico
79f1cf89cf feat(elb): add new check elb_cross_zone_load_balancing_enabled (#4818)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-23 10:09:32 -04:00
Pedro Martín
496d4daf01 refactor(azure): refactor azure provider (#4653)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-23 10:01:35 -04:00
Daniel Barranquero
559c0d4e0b chore(aws): Change RDS instance type from list to dict (#4851) 2024-08-23 09:26:53 -04:00
Pedro Martín
2fda2388bb refactor(aws): Refactor provider (#4808)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-23 09:19:05 -04:00
Pepe Fagoaga
0f79312c33 chore(backport): Use Prowler-Bot PAT (#4855) 2024-08-23 09:18:24 -04:00
Daniel Barranquero
472aea6a91 feat(aws): Add new check to ensure RDS db clusters copy tags to snapshots (#4846) 2024-08-23 09:09:52 -04:00
Pedro Martín
0d18406f80 refactor(kubernetes): refactor provider (#4805)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-23 14:22:03 +02:00
Pedro Martín
05da5d1796 refactor(gcp): refactor GCP provider (#4790)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-23 07:37:02 -04:00
Sergio Garcia
fb449cede8 fix(aws): handle AWS key-only tags (#4845) 2024-08-23 13:02:59 +02:00
Pepe Fagoaga
61df2ce0c2 chore(regions_update): Changes in regions for AWS services. (#4849)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-23 11:45:45 +02:00
Pedro Martín
b7e20344a8 docs(is_item_matched): update docstrings for method (#4836)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-23 10:15:15 +02:00
Sergio Garcia
c2552ee508 fix: handle empty input regions (#4841) 2024-08-22 13:54:18 -04:00
Hugo Pereira Brito
57f1fa5bfa feat(s3): add s3_bucket_lifecycle_enabled check (#4801)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-22 12:24:59 -04:00
Rubén De la Torre Vico
0b238243b1 feat(elbv2): add new check elbv2_is_in_multiple_az (#4800)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-22 11:08:49 -04:00
Sergio Garcia
df405254c6 fix(aws): enhance resource arn filtering (#4821) 2024-08-22 16:48:25 +02:00
Daniel Barranquero
460acf2860 feat(aws): Add new RDS check to verify that db instances copy tags to snapshots (#4806) 2024-08-22 10:44:26 -04:00
Rubén De la Torre Vico
dec3e652c5 feat(IAM): add new check iam_group_administrator_access_policy (#4831)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-22 10:14:45 -04:00
Mario Rodriguez Lopez
fc03188bfb feat(ec2): Client VPN Endpoints Should Have Client Connection Logging Enabled (#4804)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-22 09:57:33 -04:00
Mario Rodriguez Lopez
ff244138d9 feat(ec2): Ensure automatic acceptance of VPC attachment requests is disabled (#4765) 2024-08-22 08:26:01 -04:00
Sergio Garcia
903f9c576f chore(test): improve iam_root_hardware_mfa_enabled tests (#4833) 2024-08-22 08:08:25 -04:00
Daniel Barranquero
0005f86a5f feat(aws): Add new RDS check to ensure db clusters are configured for multiple availability zones (#4781) 2024-08-22 07:49:59 -04:00
Daniel Barranquero
a2144ad353 chore(rds): Revert changes on inherited instance checks (#4827) 2024-08-22 07:33:25 -04:00
Pepe Fagoaga
5f075b296d chore(regions_update): Changes in regions for AWS services. (#4826)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-08-22 13:21:45 +02:00
dependabot[bot]
0c7b960e08 chore(deps-dev): bump safety from 3.2.5 to 3.2.6 (#4825)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-22 08:26:58 +02:00
dependabot[bot]
c65e91f834 chore(deps): bump tj-actions/changed-files from 44 to 45 (#4822)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-22 08:25:43 +02:00
Pedro Martín
5876fea163 fix(outputs): refactor unroll_tags to use str as tags (#4817) 2024-08-21 12:40:46 -04:00
Pepe Fagoaga
a557d62d84 chore(regions_update): Changes in regions for AWS services. (#4814)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-21 13:07:03 +02:00
dependabot[bot]
f25319f3f6 chore(deps): bump azure-mgmt-web from 7.3.0 to 7.3.1 (#4813)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-21 11:11:54 +02:00
dependabot[bot]
1e02b05d2d chore(deps): bump google-api-python-client from 2.141.0 to 2.142.0 (#4812)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-21 08:33:53 +02:00
Rubén De la Torre Vico
78042063cb feat(iam): add new check to ensure user does not have policies with admin access (#4802) 2024-08-20 11:08:51 -04:00
Mario Rodriguez Lopez
8129b174f1 feat(CodeBuild): Ensure source repository URLs do not contain sensitive credentials (#4731)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-20 09:44:55 -04:00
Daniel Barranquero
3f78fb4220 feat(aws): Add new RDS check for deletion protection enabled on clusters (#4738) 2024-08-20 09:07:11 -04:00
Pedro Martín
e11bb478d6 fix(mutelist): change logic for tags in aws mutelist (#4786) 2024-08-20 07:38:06 -04:00
dependabot[bot]
dec5fb6428 chore(deps-dev): bump mkdocs-git-revision-date-localized-plugin from 1.2.6 to 1.2.7 (#4796)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-20 09:34:40 +02:00
dependabot[bot]
256ccfea79 chore(deps-dev): bump moto from 5.0.12 to 5.0.13 (#4795)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-20 08:16:18 +02:00
Rubén De la Torre Vico
1a8bc14587 feat(awslambda): New check to ensure that a function is inside a VPC (#4783)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-19 14:22:21 -04:00
Rubén De la Torre Vico
8483486095 chore(elbv2): Add SecurityHub link to elbv2_ssl_listeners metadata (#4787) 2024-08-19 13:06:34 -04:00
Rubén De la Torre Vico
7aaecbabab chore(elbv2): add SecurityHub link to elbv2_desync_mitigation_mode metadata (#4791) 2024-08-19 13:04:48 -04:00
Rubén De la Torre Vico
5cc9554c23 chore(awslambda): Enhance function public access check when called from another resource (#4679)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-19 13:03:30 -04:00
Hugo Pereira Brito
5d42ae6e6f feat(s3): add s3_bucket_cross_region_replication check (#4761)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-19 12:42:42 -04:00
Sergio Garcia
38b73fb0c0 feat(kubernetes): add a test_connection method (#4684) 2024-08-19 12:12:00 -04:00
Sergio Garcia
84a76f4535 feat(gcp): add a test_connection method (#4616)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2024-08-19 12:11:20 -04:00
Rubén De la Torre Vico
a126fd82b3 fix(ec2): Manage UnicodeDecodeError when reading user data (#4785) 2024-08-19 11:34:39 -04:00
Rubén De la Torre Vico
bf139138e0 chore(azure): Fix CIS 2.1 mapping (#4760) 2024-08-19 11:44:34 +02:00
dependabot[bot]
0fcf4243f5 chore(deps): bump boto3 from 1.34.160 to 1.34.162 (#4778)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-19 09:14:39 +02:00
dependabot[bot]
bbb0248bc1 chore(deps): bump google-api-python-client from 2.140.0 to 2.141.0 (#4751)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-16 12:11:28 -04:00
Sergio Garcia
e6581255c2 fix(iam): update logic of Root Hardware MFA check (#4726) 2024-08-16 11:49:30 -04:00
Sergio Garcia
717932ae26 fix(aws): run Prowler as IAM Root or Federated User (#4712) 2024-08-16 11:49:14 -04:00
Sergio Garcia
3f56731e6d fix(version): update version flag logic (#4688) 2024-08-16 11:48:57 -04:00
Pepe Fagoaga
0f837f658e chore(regions_update): Changes in regions for AWS services. (#4753)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-16 11:45:12 -04:00
Sergio Garcia
b70977163e fix(ecr): change log level of non-scanned images (#4747) 2024-08-16 11:43:04 -04:00
Sergio Garcia
98fc624010 fix(ecr): handle non-existing findingSeverityCounts key (#4746) 2024-08-16 11:42:53 -04:00
dependabot[bot]
ccb755340f chore(deps): bump botocore from 1.34.160 to 1.34.162 (#4758)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-16 11:28:04 -04:00
Mario Rodriguez Lopez
49ff901195 feat(EC2): Add new check for security group port restrictions (#4594) 2024-08-16 09:43:00 -04:00
dependabot[bot]
e7d0d49809 chore(deps): bump trufflesecurity/trufflehog from 3.81.8 to 3.81.9 (#4756)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-16 09:35:08 -04:00
Hugo Pereira Brito
47bb97961c chore(cloudtrail): add remediation link to check cloudtrail_s3_dataevents_read_enabled (#4764) 2024-08-16 09:33:09 -04:00
Hugo Pereira Brito
1178317567 chore(cloudtrail): add remediation link to check cloudtrail_s3_dataevents_write_enabled (#4762) 2024-08-16 09:32:35 -04:00
dependabot[bot]
edd0dd1080 chore(deps): bump boto3 from 1.34.159 to 1.34.160 (#4750)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-16 09:18:48 -04:00
Hugo Pereira Brito
ae1b114a13 refactor(s3): Changed buckets variable type from list to dict (#4742) 2024-08-14 10:28:06 -04:00
dependabot[bot]
3c9c28f351 chore(deps): bump botocore from 1.34.159 to 1.34.160 (#4735)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-14 10:20:15 -04:00
dependabot[bot]
93e6751e35 chore(deps): bump boto3 from 1.34.158 to 1.34.159 (#4734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-14 09:30:11 -04:00
Daniel Barranquero
680781656b feat(aws): Add new RDS check to verify that cluster minor version upgrade is enabled (#4725) 2024-08-14 09:04:27 -04:00
Pepe Fagoaga
21382efd07 chore(regions_update): Changes in regions for AWS services. (#4739)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-14 08:31:50 -04:00
Hugo Pereira Brito
097e61ab9d feat(elasticache): Ensure Redis Cache Clusters Automatically Install Minor Updates (#4699) 2024-08-14 08:28:16 -04:00
Daniel Barranquero
52d83bd83b feat(aws): Split the checks that mix RDS Instances and Clusters (#4730) 2024-08-13 10:16:50 -04:00
dependabot[bot]
49cfe15abc chore(deps): bump botocore from 1.34.158 to 1.34.159 (#4728)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-13 09:03:15 -04:00
Mario Rodriguez Lopez
0ef30c655a fix(ACM): Change check logic to scan only in use certificates (#4732) 2024-08-13 08:39:27 -04:00
Daniel Barranquero
e2d211c188 feat(aws): Add new Neptune check for cluster snapshot visibility (#4709) 2024-08-13 08:27:35 -04:00
Daniel Barranquero
62a1d91869 feat(aws): Add new CodeBuild check to validate environment variables (#4632)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-13 08:15:45 -04:00
dependabot[bot]
8c1347323e chore(deps): bump boto3 from 1.34.157 to 1.34.158 (#4727)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-13 08:13:00 -04:00
Mario Rodriguez Lopez
cb807e4aed feat(DocumentDB): Add new DocumentDB check for cluster snapshot visibility (#4702) 2024-08-12 14:05:04 -04:00
dependabot[bot]
bcc8d5f1fe chore(deps-dev): bump safety from 3.2.4 to 3.2.5 (#4722)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-12 10:03:00 -04:00
dependabot[bot]
59acd303fb chore(deps): bump botocore from 1.34.157 to 1.34.158 (#4721)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-12 08:40:42 -04:00
dependabot[bot]
0675cc8fdb chore(deps): bump boto3 from 1.34.156 to 1.34.157 (#4719)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-12 08:02:17 -04:00
dependabot[bot]
ed27491118 chore(deps): bump trufflesecurity/trufflehog from 3.81.7 to 3.81.8 (#4720)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-12 07:59:29 -04:00
dependabot[bot]
abb28af68e chore(deps): bump aiohttp from 3.9.5 to 3.10.2 (#4713)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-09 14:52:42 -04:00
Rubén De la Torre Vico
18885d0cd7 chore(ec2): Change security groups to dict (#4700) 2024-08-09 14:40:34 -04:00
Pedro Martín
ca56ac4e77 feat(azure): add test_connection method (#4615)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-09 14:38:12 -04:00
Pedro Martín
8f2b39b3ce fix(iam): handle no arn serial numbers for MFA devices (#4697)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-09 12:57:34 -04:00
Pepe Fagoaga
761eebac1e feat(aws): Add a test_connection method (#4563)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-08-09 12:01:40 +02:00
Pepe Fagoaga
8bdff0d681 fix(backport): Workaround not to fail if no backport is needed (#4707) 2024-08-09 09:56:02 +02:00
dependabot[bot]
55e0656375 chore(deps): bump botocore from 1.34.156 to 1.34.157 (#4704)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-09 07:56:26 +02:00
dependabot[bot]
e666b66ec0 chore(deps): bump boto3 from 1.34.154 to 1.34.156 (#4698)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-08 11:54:40 +02:00
Pedro Martín
cdb4f73803 docs(developer-guide): add info about docstrings (#4701) 2024-08-08 11:41:32 +02:00
dependabot[bot]
b4c7345124 chore(deps): bump botocore from 1.34.155 to 1.34.156 (#4694)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-08 10:49:13 +02:00
dependabot[bot]
af8cc37eea chore(deps): bump trufflesecurity/trufflehog from 3.81.6 to 3.81.7 (#4693)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-08 10:48:41 +02:00
Sergio Garcia
28bed98ee4 chore(version): update version logic in Prowler (#4654) 2024-08-07 18:15:10 +02:00
Sergio Garcia
3d39eb7db6 chore(backport): update backport PR title (#4686)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-07 16:59:47 +02:00
Pepe Fagoaga
2c5f2e9f5c chore(labeler): Run also for v4.* (#4687) 2024-08-07 10:30:49 -04:00
Hugo Pereira Brito
5ce54e5605 feat(aws): Add new S3 check for public access block configuration in access points (#4608) 2024-08-07 10:23:12 -04:00
Daniel Barranquero
6c029a9d7d feat(aws): Add new KMS check to prevent unintentional key deletion (#4595)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-07 09:15:22 -04:00
Sergio Garcia
96f893c3ec chore(version): update master version (#4681) 2024-08-07 14:53:45 +02:00
Pepe Fagoaga
f0047cf5a7 chore(actions): Run for v4.* branch (#4682) 2024-08-07 14:11:38 +02:00
Mario Rodriguez Lopez
1b18aef0f0 feat(acm): Add new check for insecure algorithms in certificates (#4551)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-07 08:00:24 -04:00
dependabot[bot]
80e13bffa2 chore(deps): bump botocore from 1.34.154 to 1.34.155 (#4665)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 11:33:45 +02:00
dependabot[bot]
384d16749c chore(deps): bump azure-storage-blob from 12.21.0 to 12.22.0 (#4664)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 11:01:14 +02:00
Pepe Fagoaga
9c4ba1183b chore(regions): Update labels for backporting (#4678) 2024-08-07 11:00:41 +02:00
Pepe Fagoaga
40a88e07d1 chore(backport): Automate all the things! (#4669) 2024-08-07 10:40:14 +02:00
dependabot[bot]
692ed760e0 chore(deps): bump google-api-python-client from 2.139.0 to 2.140.0 (#4666)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 10:26:48 +02:00
dependabot[bot]
6c3e451f32 chore(deps): bump boto3 from 1.34.152 to 1.34.154 (#4663)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 09:01:28 +02:00
dependabot[bot]
24f511b567 chore(deps): bump trufflesecurity/trufflehog from 3.81.5 to 3.81.6 (#4662)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 09:00:56 +02:00
Sergio Garcia
89c6652bd6 fix(tags): handle AWS dictionary type tags (#4656) 2024-08-07 08:34:57 +02:00
dependabot[bot]
8aca456285 chore(deps-dev): bump moto from 5.0.11 to 5.0.12 (#4642)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-06 14:59:29 -04:00
Rubén De la Torre Vico
824a465667 test(awslambda): Cover possible checks with moto instead of MagicMock (#4609)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-06 13:40:51 -04:00
Amogh Bantwal
086c203e6b feat(aws): Add check to make sure EKS clusters have a supported version (#4604)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-06 13:40:05 -04:00
dependabot[bot]
f746a9e742 chore(deps-dev): bump flake8 from 7.1.0 to 7.1.1 (#4643)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 09:19:05 +02:00
Pepe Fagoaga
90810d9098 chore: change SaaS for Prowler (#4651) 2024-08-06 08:56:04 +02:00
Pepe Fagoaga
75b3f52309 docs(mutelist): Add service_* documentation (#4650) 2024-08-06 08:55:55 +02:00
dependabot[bot]
8ecb4696d4 chore(deps): bump botocore from 1.34.152 to 1.34.154 (#4641)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 08:44:43 +02:00
dependabot[bot]
7b22c9c97b chore(deps): bump trufflesecurity/trufflehog from 3.81.4 to 3.81.5 (#4645)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 08:24:27 +02:00
dependabot[bot]
84f0542b98 chore(deps-dev): bump coverage from 7.6.0 to 7.6.1 (#4640)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 08:07:24 +02:00
Rubén De la Torre Vico
8faa40dfb6 feat(opensearch): Add domain inside VPC case for public domain check (#4570) 2024-08-05 13:04:49 -04:00
Pepe Fagoaga
47f7555d05 refactor(mutelist): Remove re.match and improve docs (#4637)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-05 12:59:30 -04:00
Pedro Martín
96d9cbd8af fix(gcp): check cloudsql sslMode (#4635) 2024-08-05 12:12:00 -04:00
Pedro Martín
c8bc54aa48 fix(gcp): check next rotation time in KMS keys (#4633) 2024-08-05 11:31:38 -04:00
Rubén De la Torre Vico
fad0b8995a chore(aws): Convert ELB and ELBv2 attributes to dictionaries (#4575)
Co-authored-by: Sergio <sergio@prowler.com>
2024-08-05 11:14:19 -04:00
dependabot[bot]
d4b6fa27e2 chore(deps): bump msgraph-sdk from 1.5.3 to 1.5.4 (#4629)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-05 15:02:49 +02:00
dependabot[bot]
a37723fd32 chore(deps): bump boto3 from 1.34.151 to 1.34.152 (#4628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-05 08:14:55 -04:00
Pedro Martín
fc5eefe532 fix(scan_test): change resource_tags to a dict (#4631) 2024-08-05 10:02:41 +02:00
Pedro Martín
ffd9b2a2f6 chore(scan-class): add new scan class (#4564)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-05 08:21:13 +02:00
dependabot[bot]
112f48ac08 chore(deps-dev): bump black from 24.4.2 to 24.8.0 (#4627)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-05 08:19:54 +02:00
Sergio Garcia
95ec3d91b4 refactor(tags): convert tags to a dictionary (#4598)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-08-05 08:17:43 +02:00
Sergio Garcia
b0709d08cd fix(gcp): use KMS key id in checks (#4610) 2024-08-05 08:16:56 +02:00
dependabot[bot]
a0e3cb87a4 chore(deps): bump trufflesecurity/trufflehog from 3.80.5 to 3.81.4 (#4625)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-05 08:15:49 +02:00
Pepe Fagoaga
1b9cc9e3db chore(regions_update): Changes in regions for AWS services. (#4630)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-05 08:14:49 +02:00
Jon Young
d9fb67bc43 docs(Tutorials): include volume option when running dashboard in docker (#4620) 2024-08-05 08:06:24 +02:00
dependabot[bot]
a79022dce8 chore(deps): bump botocore from 1.34.151 to 1.34.152 (#4611)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-02 08:03:55 +02:00
dependabot[bot]
0a2ce690f4 chore(deps): bump trufflesecurity/trufflehog from 3.80.4 to 3.80.5 (#4612)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-02 07:57:22 +02:00
Pedro Martín
bbc51114b0 fix(sns): add condition to sns topics (#4498)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-08-01 11:54:36 -04:00
Pepe Fagoaga
32da86f393 fix(mutelist): Fix tags match (#4606) 2024-08-01 09:01:44 -04:00
Pepe Fagoaga
74d02e1da6 chore(version): Update Prowler version (#4605) 2024-08-01 08:01:45 -04:00
Pepe Fagoaga
8ec6e89e5c chore(regions_update): Changes in regions for AWS services. (#4607)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-08-01 11:35:08 +02:00
dependabot[bot]
17012ec1a4 chore(deps): bump trufflesecurity/trufflehog from 3.80.3 to 3.80.4 (#4601)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-01 08:06:11 +02:00
Pepe Fagoaga
8461257428 fix(status): Recover status filtering (#4572)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-31 10:10:07 -04:00
Kay Agahd
26a5ffaf82 fix(aws): only check artifacts that can be scanned for vulnerabilities by ecr_repositories_scan_vulnerabilities_in_latest_image (#4507)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-31 09:27:26 -04:00
Sergio Garcia
563ddb3707 chore(permissions): add missing ec2 permission (#4577) 2024-07-31 14:22:21 +02:00
Pedro Martín
2c11c3d6f9 fix(typo): fix typo on PR template (#4596) 2024-07-31 07:58:53 -04:00
cetteup
e050f44d63 fix(aws): Pass backup retention check if retention period is equal to minimum (#4593) 2024-07-31 13:25:53 +02:00
Pepe Fagoaga
4fd3405bbf chore(regions_update): Changes in regions for AWS services. (#4592)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-31 11:33:59 +02:00
dependabot[bot]
a1c2caa745 chore(deps): bump boto3 from 1.34.149 to 1.34.151 (#4587)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:47:41 +02:00
dependabot[bot]
f639dc8bf4 chore(deps): bump trufflesecurity/trufflehog from 3.80.2 to 3.80.3 (#4581)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:16:37 +02:00
dependabot[bot]
35325d9f40 chore(deps): bump google-api-python-client from 2.138.0 to 2.139.0 (#4579)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 09:16:08 +02:00
Pepe Fagoaga
71503b553a chore(pr-template): Add Checklist (#4586) 2024-07-31 08:31:55 +02:00
dependabot[bot]
d91a240ea8 chore(deps): bump botocore from 1.34.150 to 1.34.151 (#4578)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-31 08:29:51 +02:00
Sergio Garcia
b9b5f66073 fix(test): solve VPC import in tests (#4574) 2024-07-30 10:34:55 -04:00
Sergio Garcia
e3f66840aa chore(version): update Prowler version (#4565)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-30 10:17:56 +02:00
Rubén De la Torre Vico
0d6c529a46 fix(autoscaling): log unexpected exceptions at error severity (#4569) 2024-07-30 10:07:36 +02:00
dependabot[bot]
5237658047 chore(deps): bump botocore from 1.34.149 to 1.34.150 (#4567)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-30 09:25:07 +02:00
Daniel Barranquero
c00f61ac10 test(GCP): Add remaining GCP tests for KMS checks (#4550) 2024-07-29 13:22:41 -04:00
Rubén De la Torre Vico
2cd840a2b5 fix(autoscaling): Add exception management while decoding UserData (#4562)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-29 12:03:44 -04:00
dependabot[bot]
7e630ebe27 chore(deps): bump boto3 from 1.34.148 to 1.34.149 (#4556)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-29 08:24:06 +02:00
dependabot[bot]
2f1c0facfd chore(deps): bump trufflesecurity/trufflehog from 3.80.1 to 3.80.2 (#4557)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-29 08:23:45 +02:00
Pepe Fagoaga
603bb03f35 chore(regions_update): Changes in regions for AWS services. (#4560)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-29 08:22:19 +02:00
Rubén De la Torre Vico
b7af1a06e8 fix(organizations): Fix type errors related to policies and the json.loads function (#4554) 2024-07-26 10:51:46 -04:00
Kay Agahd
02fc034b1f feat(aws): make check eks_control_plane_logging_all_types_enabled configurable (#4553) 2024-07-26 10:24:01 -04:00
joshua_jebaraj
40522cdc62 fix(gcp): false positive for iam_sa_no_administrative_privilege check (#4500)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-26 10:15:34 -04:00
Rubén De la Torre Vico
dc11d85451 chore(cloudsql): Change default cases for CloudSQL checks and remaining tests (#4537) 2024-07-26 10:09:04 -04:00
Pepe Fagoaga
13c50086eb chore(regions_update): Changes in regions for AWS services. (#4552)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-26 09:38:43 -04:00
Sergio Garcia
f7729381e0 fix(s3): enhance threading in s3 service (#4530) 2024-07-26 09:16:47 -04:00
dependabot[bot]
d244475578 chore(deps): bump azure-mgmt-network from 25.4.0 to 26.0.0 (#4543)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 14:28:46 +02:00
dependabot[bot]
10dcbaea7b chore(deps): bump google-api-python-client from 2.137.0 to 2.138.0 (#4542)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 13:33:43 +02:00
dependabot[bot]
c91bbdcf2b chore(deps): bump azure-mgmt-compute from 31.0.0 to 32.0.0 (#4541)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 13:01:49 +02:00
dependabot[bot]
c7dbcb17d6 chore(deps): bump botocore from 1.34.148 to 1.34.149 (#4539)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 12:08:53 +02:00
dependabot[bot]
5a8a9286db chore(deps): bump boto3 from 1.34.147 to 1.34.148 (#4538)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 09:11:02 +02:00
dependabot[bot]
2476a1275a chore(deps-dev): bump pytest from 8.3.1 to 8.3.2 (#4540)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-26 08:28:40 +02:00
Hugo Pereira Brito
ac680c58cd docs(services): Fixed changed links (#4536) 2024-07-25 13:14:10 +02:00
Daniel Barranquero
68f0916ce4 test(iam): Add remaining GCP tests for IAM checks (#4519) 2024-07-25 11:21:36 +02:00
dependabot[bot]
dc896fc0af chore(deps): bump botocore from 1.34.147 to 1.34.148 (#4532)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-25 09:28:42 +02:00
dependabot[bot]
76af71d2df chore(deps): bump boto3 from 1.34.146 to 1.34.147 (#4531) 2024-07-25 08:43:22 +02:00
dependabot[bot]
96f761e4ef chore(deps): bump azure-mgmt-containerservice from 30.0.0 to 31.0.0 (#4513)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 11:53:31 +02:00
Pepe Fagoaga
9e16e477e9 chore(CODEOWNERS): update team (#4527) 2024-07-24 09:12:33 +02:00
Sergio Garcia
2038e30d3e fix(checks): ensure CheckID is correct in check's metadata (#4522)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-24 09:08:51 +02:00
dependabot[bot]
a4dc6975b0 chore(deps): bump botocore from 1.34.146 to 1.34.147 (#4526)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 08:41:38 +02:00
dependabot[bot]
a4a89fa581 chore(deps): bump boto3 from 1.34.145 to 1.34.146 (#4525)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 07:38:59 +02:00
Pepe Fagoaga
fc449bfd7b chore(s3): create class and refactor (#4457)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-23 10:03:28 -04:00
Rubén De la Torre Vico
2477948ae9 test(gcp): Test GCP provider new auth and print credentials (#4331) 2024-07-23 09:26:29 -04:00
Rubén De la Torre Vico
ca98584ded test(logging): Add remaining tests for Logging checks (#4481) 2024-07-23 09:24:32 -04:00
Rubén De la Torre Vico
489830f01a docs(azure): Review actual roles necessary to execute Prowler (#4501) 2024-07-23 09:15:23 -04:00
Rubén De la Torre Vico
bd56ca2979 chore(dms): Change checks IDs to match with metadata (#4520) 2024-07-23 06:41:07 -04:00
dependabot[bot]
04483a9a4f chore(deps): bump cryptography from 42.0.6 to 43.0.0 (#4512)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:42:59 +02:00
dependabot[bot]
684f63d398 chore(deps): bump numpy from 2.0.0 to 2.0.1 (#4510)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:02:05 +02:00
dependabot[bot]
b528dd44cd chore(deps): bump botocore from 1.34.145 to 1.34.146 (#4511)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 10:11:50 +02:00
dependabot[bot]
dfdeac0a46 chore(deps-dev): bump pylint from 3.2.5 to 3.2.6 (#4509)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 09:23:33 +02:00
dependabot[bot]
b52b67fd4b chore(deps-dev): bump pytest from 8.2.2 to 8.3.1 (#4508)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-22 18:56:30 -04:00
Sergio Garcia
5cf7d89aab fix(inspector2): add a more efficient way to check for active findings (#4505) 2024-07-22 16:25:23 -04:00
Pedro Martín
f5e6b1e438 docs(developer): improve developers docs with Trufflehog and --no-verify (#4502)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-22 13:12:52 +02:00
Pedro Martín
aa44bde940 chore(deps): update cryptography to 42.0.6 (#4499) 2024-07-22 12:09:55 +02:00
Sergio Garcia
ddc927a4ad chore(test): add missing acm imported certificate test (#4485) 2024-07-22 09:49:37 +02:00
dependabot[bot]
fbc99259e2 chore(deps): bump boto3 from 1.34.144 to 1.34.145 (#4497)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-22 09:11:54 +02:00
Daniel Barranquero
28f6f0abcc test(cloudstorage): Add remaining GCP tests for CloudStorage checks (#4464) 2024-07-19 08:37:22 -04:00
dependabot[bot]
0933a04239 chore(deps): bump azure-storage-blob from 12.20.0 to 12.21.0 (#4490)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 08:36:57 -04:00
Pedro Martín
5185f3a41e chore(output): review report function (#4465)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-07-19 08:36:39 -04:00
Pepe Fagoaga
6d20b11394 chore(CODEOWNERS): protect unauthorized changes (#4493) 2024-07-19 12:37:34 +02:00
dependabot[bot]
a01635e9ea chore(deps): bump botocore from 1.34.144 to 1.34.145 (#4491)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 11:32:35 +02:00
Pedro Martín
3bf9cd3db1 docs(readme): add Prowler animation gif to README (#4492) 2024-07-19 10:56:01 +02:00
dependabot[bot]
e15f0b2d0f chore(deps): bump trufflesecurity/trufflehog from 3.80.0 to 3.80.1 (#4486)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 09:59:44 +02:00
Sergio Garcia
f2de059ca1 fix(ssm): add missing ResourceArn to SSM check (#4482) 2024-07-18 18:10:06 +02:00
Ikko Eltociear Ashimine
8c8ac95d9c docs(readme): update README.md (#4483) 2024-07-18 17:31:52 +02:00
Pepe Fagoaga
89159c2111 chore(codeowners): update for sdk and checks (#4480) 2024-07-18 09:52:23 -04:00
Pedro Martín
70eb59185b docs(readme): update dashboard screenshot in README (#4479) 2024-07-18 12:53:03 +02:00
Pepe Fagoaga
f97af19860 chore(regions_update): Changes in regions for AWS services. (#4478)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-18 11:32:53 +02:00
dependabot[bot]
5ccd8af2a2 chore(deps): bump msgraph-sdk from 1.5.2 to 1.5.3 (#4475)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 10:40:07 +02:00
Pedro Martín
b53e8abc87 fix(main): change module name (#4477) 2024-07-18 10:29:47 +02:00
dependabot[bot]
db4c4fdaeb chore(deps): bump azure-mgmt-keyvault from 10.3.0 to 10.3.1 (#4474)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 08:39:20 +02:00
Sergio Garcia
44afe2db3e chore(compliance): simplify ComplianceOutput class (#4467) 2024-07-18 08:36:57 +02:00
Sergio Garcia
204d548cd0 chore(csv): remove old CSV functions (#4469) 2024-07-18 08:30:07 +02:00
dependabot[bot]
3faf80c0d7 chore(deps): bump trufflesecurity/trufflehog from 3.79.0 to 3.80.0 (#4471)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-18 08:28:01 +02:00
chaipot
5078e4a823 chore(docs): update remediation of custom checks metadata (#4470) 2024-07-17 17:14:35 -04:00
Pepe Fagoaga
d1b57ebd75 feat(output): Add a setter for the file descriptor and include extension (#4468) 2024-07-17 17:09:47 -04:00
Sergio Garcia
fdab3a737a chore(compliance): change compliance model names (#4466) 2024-07-17 11:47:28 -04:00
Rubén De la Torre Vico
b6f01b92dd test(gcp): Add bigquery and half of cloudsql check tests (#4462) 2024-07-17 12:03:22 +02:00
Pepe Fagoaga
c92537c791 chore(regions_update): Changes in regions for AWS services. (#4463)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-17 11:35:53 +02:00
Sergio Garcia
3e7cc2e0a2 chore(compliance): add manual requirements to compliance output (#4449)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-17 08:23:38 +02:00
Rubén De la Torre Vico
b8cfdb590b test(gcp): Add remaining CloudSQL tests (#4380) 2024-07-16 13:51:53 -04:00
Pepe Fagoaga
577afbd521 chore(mutelist): create new class to encapsulate the logic (#4413) 2024-07-16 13:44:43 -04:00
Rubén De la Torre Vico
d01cc51b6d test(compute): Add remaining tests for Compute service in GCP provider (#4458) 2024-07-16 11:43:30 -04:00
dependabot[bot]
ffa60b4ccd chore(deps): bump msgraph-sdk from 1.4.0 to 1.5.2 (#4426)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 06:57:42 -04:00
Rubén De la Torre Vico
d6dd0f7244 fix(entra): Change to correct service in entra_user_with_vm_access_has_mfa metadata (#4454) 2024-07-16 12:06:18 +02:00
Pepe Fagoaga
4df0dc4904 chore(regions_update): Changes in regions for AWS services. (#4455)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-16 11:48:52 +02:00
dependabot[bot]
386a1e1d1a chore(deps): bump boto3 from 1.34.143 to 1.34.144 (#4451)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 08:54:32 +02:00
dependabot[bot]
db9d7a4439 chore(deps): bump setuptools from 69.5.1 to 70.0.0 (#4450)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-15 16:12:56 -04:00
Pedro Martín
5725035e29 chore(GenericCompliance): add Generic Compliance class (#4447)
Co-authored-by: Sergio <sergio@prowler.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-15 12:56:22 -04:00
Pedro Martín
96a49e97d2 fix(iam_avoid_root_usage): change timestamp format (#4446) 2024-07-15 17:10:49 +02:00
Sergio Garcia
2a95750525 chore(iso27001): add ISO27001 output class (#4441)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-15 09:43:26 -04:00
Pedro Martín
b868d1a7fe fix(glue): add getters for connection attributes (#4445) 2024-07-15 14:51:01 +02:00
Pepe Fagoaga
37ade2a722 chore(revert): PR #4067 (#4440)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-07-15 10:25:00 +02:00
dependabot[bot]
c67032e07f chore(deps): bump botocore from 1.34.143 to 1.34.144 (#4442)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-15 09:25:25 +02:00
Pepe Fagoaga
0de8ef032a chore(regions_update): Changes in regions for AWS services. (#4444)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-15 09:24:37 +02:00
Sergio Garcia
027aa9796d chore(aws): add AWS Well-Architected output class (#4439) 2024-07-12 11:27:21 -04:00
Sergio Garcia
a505776227 chore(ens): add ENS output class (#4435) 2024-07-12 10:50:41 -04:00
Sergio Garcia
3be9de376a chore(mitre): add MITRE ATT&CK output class (#4425) 2024-07-12 10:08:32 -04:00
dependabot[bot]
bd26d74b28 chore(deps): bump boto3 from 1.34.142 to 1.34.143 (#4437)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-12 09:36:43 +02:00
dependabot[bot]
ca27854ff0 chore(deps-dev): bump coverage from 7.5.4 to 7.6.0 (#4438)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-12 08:55:33 +02:00
Pepe Fagoaga
abd18dc14d chore(regions_update): Changes in regions for AWS services. (#4433) 2024-07-11 09:27:52 -04:00
Pepe Fagoaga
297f506fd3 docs(gcp): Fix typo in title (#4434) 2024-07-11 09:27:04 -04:00
dependabot[bot]
78ca4b93a5 chore(deps): bump botocore from 1.34.142 to 1.34.143 (#4428)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 10:04:33 +02:00
dependabot[bot]
c80d51b585 chore(deps): bump boto3 from 1.34.141 to 1.34.142 (#4427)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 08:47:57 +02:00
Sergio Garcia
cf9b23c302 fix(cis): add missing fields and reorder (#4424) 2024-07-10 13:11:55 -04:00
Sergio Garcia
ef4b9e8d6a fix(templates): solve broken GitHub issues templates (#4423) 2024-07-10 16:55:51 +02:00
Sergio Garcia
a5a8c2a769 chore(cis): add CIS output class (#4400)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-10 09:26:08 -04:00
Pepe Fagoaga
64b21ae2b9 chore(labeler): add outputs and integrations (#4422) 2024-07-10 09:25:07 -04:00
Pepe Fagoaga
3da4824a1d chore(regions_update): Changes in regions for AWS services. (#4420)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-10 09:24:05 -04:00
Pepe Fagoaga
2247296cf9 chore(templates): update to remove titles (#4421) 2024-07-10 09:22:13 -04:00
dependabot[bot]
615127f790 chore(deps): bump botocore from 1.34.141 to 1.34.142 (#4416)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 11:54:21 +02:00
dependabot[bot]
42f21a52c9 chore(deps): bump google-api-python-client from 2.136.0 to 2.137.0 (#4415)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 08:36:13 +02:00
dependabot[bot]
e9442b2f89 chore(deps): bump zipp from 3.18.1 to 3.19.1 (#4414)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 08:32:34 +02:00
Pepe Fagoaga
6336b1c0d9 refactor(SecurityHub): create class to handle integration (#4397)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-09 11:47:47 -04:00
Pepe Fagoaga
a0603b972e chore(regions_update): Changes in regions for AWS services. (#4412)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-09 09:18:57 -04:00
dependabot[bot]
f319884532 chore(deps): bump boto3 from 1.34.139 to 1.34.141 (#4410)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 13:27:29 +02:00
dependabot[bot]
d49139c4f4 chore(deps-dev): bump moto from 5.0.10 to 5.0.11 (#4404)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 11:33:00 +02:00
dependabot[bot]
046c82232d chore(deps): bump botocore from 1.34.140 to 1.34.141 (#4403)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:39:19 +02:00
dependabot[bot]
027aafd9ea chore(deps): bump jsonschema from 4.22.0 to 4.23.0 (#4402)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:01:13 +02:00
Sergio Garcia
215d5dabd7 fix(docs): update deprecated command (#4401) 2024-07-09 08:40:25 +02:00
Pepe Fagoaga
f5e2ac7486 chore(regions_update): Changes in regions for AWS services. (#4396)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-08 09:56:03 -04:00
Pepe Fagoaga
6fc24b5435 chore: rename test function in the HTML test class (#4395) 2024-07-08 09:51:44 -04:00
dependabot[bot]
3d99e6ea28 chore(deps): bump botocore from 1.34.139 to 1.34.140 (#4391)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 09:27:57 +02:00
dependabot[bot]
b23aefadc1 chore(deps): bump certifi from 2024.2.2 to 2024.7.4 (#4392)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 09:21:56 +02:00
dependabot[bot]
b585a31a14 chore(deps): bump boto3 from 1.34.138 to 1.34.139 (#4383) 2024-07-05 19:03:20 -04:00
Pepe Fagoaga
9c817ae8a9 tests: add for empty findings and little renamings (#4388)
Co-authored-by: Sergio <sergio@prowler.com>
2024-07-05 15:09:23 -04:00
JackyCCChen
cd7f19c00e fix(gcp): Not all gcp projects have name (#4387) 2024-07-05 11:08:31 -04:00
dependabot[bot]
d1a7d19799 chore(deps-dev): bump safety from 3.2.3 to 3.2.4 (#4385)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-05 11:20:41 +02:00
Pedro Martín
d7dffbc44b chore(test): enhance OCSF tests (#4386) 2024-07-05 11:19:53 +02:00
dependabot[bot]
0402cc7e2d chore(deps): bump slack-sdk from 3.30.0 to 3.31.0 (#4384)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-05 08:47:41 +02:00
Sergio Garcia
bf83f38c89 chore(html): add HTML class (#4360) 2024-07-04 13:28:09 -04:00
Pepe Fagoaga
673619c8a1 refactor(ASFF): create class (#4368)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2024-07-04 12:04:36 -04:00
Pedro Martín
2345a7384b chore(ocsf): add OCSF class for outputs (#4355) 2024-07-04 17:08:01 +02:00
Oleksii
e387c591c3 chore(k8s): Add helm-chart (#4370)
Co-authored-by: Oleksii Tsyganov <otsyganov@magicleap.com>
2024-07-04 10:30:45 -04:00
Rubén De la Torre Vico
47a37c7d0d chore(iam): Improve status extended adding the resource type (#4378) 2024-07-04 09:32:35 -04:00
dependabot[bot]
7b359cf1eb chore(deps): bump botocore from 1.34.138 to 1.34.139 (#4373)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 14:32:16 +02:00
Pepe Fagoaga
35d525b903 chore(regions_update): Changes in regions for AWS services. (#4379)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-04 11:48:04 +02:00
Pedro Martín
b5b193427d docs(readme): update check number on readme (#4377) 2024-07-04 08:54:12 +02:00
Rubén De la Torre Vico
e6ae539323 feat(IAM): Add inline policies checks and improve custom policy checks (#4255) 2024-07-03 15:51:19 -04:00
Pepe Fagoaga
541b907038 chore(regions_update): Changes in regions for AWS services. (#4369)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-07-03 09:56:15 -04:00
dependabot[bot]
040e1eaa5e chore(deps): bump boto3 from 1.34.136 to 1.34.138 (#4367)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 08:43:03 +02:00
dependabot[bot]
e23a674277 chore(deps): bump google-api-python-client from 2.135.0 to 2.136.0 (#4362)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 08:09:39 +02:00
dependabot[bot]
e73cefdf1a chore(deps): bump botocore from 1.34.137 to 1.34.138 (#4361)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-03 07:37:08 +02:00
Rubén De la Torre Vico
9ed4e89c60 chore(iam): Remove unnecessary attached policy in an inline policy (#4359) 2024-07-02 12:38:00 -04:00
Pedro Martín
da547b2bbe fix(test-csv): fix test using tempfile (#4356) 2024-07-02 09:16:12 -04:00
Pedro Martín
ca033745c9 chore(csv): add CSVOutput class (#4315)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-07-02 13:12:43 +02:00
dependabot[bot]
fb49fb83ae chore(deps): bump botocore from 1.34.136 to 1.34.137 (#4351)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-02 09:30:49 +02:00
dependabot[bot]
76e0b23365 chore(deps): bump boto3 from 1.34.132 to 1.34.136 (#4352)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-02 08:52:10 +02:00
Sergio Garcia
82ccdc45d2 chore(elasticache): enhance service and checks (#4329) 2024-07-01 10:06:24 -04:00
dependabot[bot]
de777a6417 chore(deps): bump azure-mgmt-storage from 21.2.0 to 21.2.1 (#4339)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 09:55:17 -04:00
dependabot[bot]
87d8cda745 chore(deps-dev): bump moto from 5.0.9 to 5.0.10 (#4343)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 09:19:22 -04:00
dependabot[bot]
64abd0a6d0 chore(deps-dev): bump pylint from 3.2.3 to 3.2.5 (#4347)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 13:58:42 +02:00
dependabot[bot]
096d7c6304 chore(deps): bump botocore from 1.34.132 to 1.34.136 (#4337)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 12:58:50 +02:00
dependabot[bot]
4908e06544 chore(deps): bump google-api-python-client from 2.134.0 to 2.135.0 (#4345)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 12:28:08 +02:00
dependabot[bot]
d42cc66d9f chore(deps): bump trufflesecurity/trufflehog from 3.78.2 to 3.79.0 (#4335)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-01 11:48:30 +02:00
Pepe Fagoaga
7a5318b936 chore(dependabot): Run daily (#4334) 2024-07-01 11:43:50 +02:00
Pepe Fagoaga
ffb494f9a4 chore(regions_update): Changes in regions for AWS services. (#4332) 2024-07-01 08:57:03 +02:00
Sergio Garcia
f515b2b53b fix(aws): parallelize functions per resource (#4323) 2024-06-28 09:27:47 -04:00
Pepe Fagoaga
a3cf7665ac chore(regions_update): Changes in regions for AWS services. (#4330)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-28 11:43:29 +02:00
Rubén De la Torre Vico
dbaf72958e doc(requirements): Add management group for multiple subscriptions (#4282)
Co-authored-by: Pepe Fagoaga <pepe@verica.io>
2024-06-28 10:06:16 +02:00
Sergio Garcia
169d1686d2 fix(s3): handle empty Action in bucket policy (#4328) 2024-06-28 08:25:40 +02:00
sansns-aws
ba726b205d feat(Elasticache): Additional Elasticache checks (#4317)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-27 18:07:22 -04:00
sansns-aws
630d980861 feat(NetworkFirewall): Add Deletion Protection Check (#4318)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-27 10:08:31 -04:00
Pedro Martín
7d81040eae fix(docs): Rewrite dashboard docs (#4327) 2024-06-27 12:55:02 +02:00
Pepe Fagoaga
4009d96f8a chore(regions_update): Changes in regions for AWS services. (#4326)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-27 12:33:45 +02:00
Pepe Fagoaga
cee5064b11 chore(tests): Improve CloudTrail tests checking for multiregional trails (#4177)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-26 17:33:50 -04:00
Sergio Garcia
e5c911abef chore(python): update vulnerable anyio library (#4322) 2024-06-26 16:57:57 -04:00
Sergio Garcia
ff5c41f363 fix(codebuild): enhance service functions (#4319)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-06-26 11:27:50 -04:00
Sergio Garcia
cf84875355 feat(gcp): add service account impersonation (#4291) 2024-06-26 15:31:47 +02:00
Pepe Fagoaga
fc23eccc7b chore(regions_update): Changes in regions for AWS services. (#4320)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-26 11:47:28 +02:00
Pedro Martín
c5fb11e815 docs(kubernetes): add docs about kubernetes in tutorials page (#4288)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-25 11:41:13 -04:00
dependabot[bot]
fdab1edd3e chore(deps): bump boto3 from 1.34.123 to 1.34.132 (#4316)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 16:15:42 +02:00
dependabot[bot]
ea74d82c48 chore(deps): bump azure-mgmt-web from 7.2.0 to 7.3.0 (#4301)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 09:21:12 -04:00
Rubén De la Torre Vico
093738c65f chore(s3): reduce false positive in s3 public check (#4281) 2024-06-25 08:55:42 -04:00
Pedro Martín
bae224c891 fix(csv-outputs): compliance outputs not showing consistent values (#4287) 2024-06-25 14:50:17 +02:00
dependabot[bot]
32cded949d chore(deps): bump azure-mgmt-cosmosdb from 9.5.0 to 9.5.1 (#4298)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 14:46:25 +02:00
dependabot[bot]
6463dcdde0 chore(deps): bump azure-identity from 1.16.1 to 1.17.1 (#4300)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 14:07:39 +02:00
dependabot[bot]
0b16dab2ad chore(deps): bump azure-mgmt-storage from 21.1.0 to 21.2.0 (#4297)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 13:34:12 +02:00
dependabot[bot]
825c620e6f chore(deps): bump botocore from 1.34.128 to 1.34.132 (#4296)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 12:30:26 +02:00
dependabot[bot]
819a5597a3 chore(deps-dev): bump coverage from 7.5.3 to 7.5.4 (#4295)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 10:35:50 +02:00
dependabot[bot]
4bae3d2600 chore(deps): bump slack-sdk from 3.29.0 to 3.30.0 (#4294)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 09:51:31 +02:00
Sergio Garcia
131cb82751 chore(readme): update checks number (#4290) 2024-06-25 08:56:04 +02:00
dependabot[bot]
029caf3b10 chore(deps): bump google-api-python-client from 2.133.0 to 2.134.0 (#4293)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-25 08:38:08 +02:00
dependabot[bot]
9ee23a39b5 chore(deps): bump trufflesecurity/trufflehog from 3.78.1 to 3.78.2 (#4292) 2024-06-25 07:57:24 +02:00
Pedro Martín
4837df4352 chore(aws): handle new permissions (#4289) 2024-06-24 12:14:20 -04:00
sansns-aws
d173d58a93 feat(DMS): Add Database Migration Service (DMS) (#4249) 2024-06-24 11:41:33 -04:00
sansns-aws
af29570fe9 feat(DocumentDB): New DocumentDB checks (#4247) 2024-06-24 11:40:39 -04:00
sansns-aws
9253cd42dd feat(neptune): Additional Neptune checks (#4243) 2024-06-24 11:38:41 -04:00
Sergio Garcia
836b4ba2cc fix(rds): handle not existing endpoint (#4285) 2024-06-24 09:38:26 +02:00
Pepe Fagoaga
f28c0578aa chore(regions_update): Changes in regions for AWS services. (#4286)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-24 07:53:01 +02:00
Rubén De la Torre Vico
536f0df9d3 feat(app): Add new Azure functions checks (#4189)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-21 11:32:31 -04:00
Pepe Fagoaga
465261e1df chore(regions_update): Changes in regions for AWS services. (#4283)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-21 10:54:24 -04:00
Sergio Garcia
3667370604 chore(safety): update vulnerable library version (#4284) 2024-06-21 10:23:17 -04:00
sansns-aws
9ca64e7bdb feat(RDS): Additional RDS checks (#4233)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-20 13:41:08 -04:00
dependabot[bot]
95a9f1c458 chore(deps): bump kubernetes from 29.0.0 to 30.1.0 (#4226)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-20 11:34:35 -04:00
Pepe Fagoaga
9fbd627f9a chore(regions_update): Changes in regions for AWS services. (#4280)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-20 08:57:32 -04:00
Pepe Fagoaga
7203fcf4f1 chore(regions_update): Changes in regions for AWS services. (#4278)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-20 08:57:05 -04:00
Rubén De la Torre Vico
f10bb343a6 doc(debugging): Improve actual VSCode debugging file (#4279) 2024-06-20 09:11:01 +02:00
John Mastron
9147a45e2f fix(aws): aws check and metadata fixes (#4251)
Co-authored-by: John Mastron <jmastron@jpl.nasa.gov>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-06-19 10:21:50 +02:00
dependabot[bot]
5353d515b6 chore(deps): bump dash from 2.17.0 to 2.17.1 (#4272)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 09:37:44 -04:00
Pepe Fagoaga
e8a94733bf fix(aws): Assume role for Gov Cloud (#4254)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-18 09:37:23 -04:00
Pepe Fagoaga
625be45742 chore(regions_update): Changes in regions for AWS services. (#4277)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-18 09:09:43 -04:00
dependabot[bot]
ecb6cb897f chore(deps): bump numpy from 1.26.4 to 2.0.0 (#4275) 2024-06-18 14:53:38 +02:00
dependabot[bot]
f07bd79442 chore(deps-dev): bump flake8 from 7.0.0 to 7.1.0 (#4269)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 13:43:11 +02:00
dependabot[bot]
b7c1fabae1 chore(deps-dev): bump bandit from 1.7.8 to 1.7.9 (#4271)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 11:44:02 +02:00
dependabot[bot]
59d3b2f33e chore(deps): bump google-api-python-client from 2.132.0 to 2.133.0 (#4274)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 11:04:25 +02:00
dependabot[bot]
6c098e98e3 chore(deps): bump botocore from 1.34.123 to 1.34.128 (#4273)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 10:15:13 +02:00
dependabot[bot]
380011fd1e chore(deps): bump urllib3 from 1.26.18 to 1.26.19 (#4276)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 09:06:35 +02:00
dependabot[bot]
e97bf32a90 chore(deps): bump slack-sdk from 3.28.0 to 3.29.0 (#4270)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:50:52 +02:00
dependabot[bot]
ed18ea0ec4 chore(deps): bump docker/build-push-action from 5 to 6 (#4260)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:49:47 +02:00
dependabot[bot]
dc897986bc chore(deps): bump trufflesecurity/trufflehog from 3.78.0 to 3.78.1 (#4259)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-18 08:49:36 +02:00
Pepe Fagoaga
e296d6e5c1 fix: Some minor fixes in several parts (#4237)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-17 16:54:54 -04:00
Andoni Alonso
1252e6163b chore(docs): update checks reference link (#4258) 2024-06-17 15:30:39 -04:00
Pepe Fagoaga
8ad14c7833 fix(custom_checks): workaround to fix execution (#4256) 2024-06-17 14:13:18 -04:00
Pepe Fagoaga
61b9ecc214 chore(regions_update): Changes in regions for AWS services. (#4252)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-14 11:07:22 -04:00
Sergio Garcia
f8f2c19454 fix(readme): update note syntax (#4250) 2024-06-13 16:05:10 -04:00
Rubén De la Torre Vico
922438a7a0 chore(network): Reduce network watchers azure check findings (#4242) 2024-06-13 15:57:44 -04:00
Pepe Fagoaga
920f98c9ef chore(regions_update): Changes in regions for AWS services. (#4248)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-13 15:12:10 -04:00
Pepe Fagoaga
9b1ad5dd2e chore(regions_update): Changes in regions for AWS services. (#4246)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-12 07:56:53 -04:00
dependabot[bot]
d7a97b6e1d chore(deps): bump azure-identity from 1.16.0 to 1.16.1 (#4230)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 17:49:44 -04:00
dependabot[bot]
07db051d14 chore(deps): bump azure-identity from 1.16.0 to 1.16.1 (#4245)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 17:49:30 -04:00
dependabot[bot]
6fec85589d chore(deps-dev): bump pylint from 3.2.2 to 3.2.3 (#4229) 2024-06-11 12:59:21 -04:00
dependabot[bot]
f82aa1c3e1 chore(deps-dev): bump pytest from 8.2.1 to 8.2.2 (#4223) 2024-06-11 12:10:27 -04:00
Pepe Fagoaga
ee9faedbbe docs(developer-guide): How to fork the repo (#4238) 2024-06-11 12:08:54 -04:00
Pepe Fagoaga
e5dec1251d fix(s3): Send HTML also (#4240) 2024-06-11 12:08:13 -04:00
Pepe Fagoaga
692a39b08f chore(regions_update): Changes in regions for AWS services. (#4241) 2024-06-11 12:04:51 -04:00
Pepe Fagoaga
60b3523def chore(release): 4.2.4 (#4236) 2024-06-11 09:46:33 -04:00
Rubén De la Torre Vico
e1428bc1ff chore(iam): improve iam user console access check (#4211) 2024-06-11 12:45:29 +02:00
dependabot[bot]
0ff8b7e02a chore(deps): bump boto3 from 1.34.113 to 1.34.123 (#4235)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 11:56:02 +02:00
dependabot[bot]
7b84008046 chore(deps): bump google-api-python-client from 2.131.0 to 2.132.0 (#4227)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 11:02:08 +02:00
dependabot[bot]
30a092e2aa chore(deps): bump slack-sdk from 3.27.2 to 3.28.0 (#4228)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:54:38 +02:00
dependabot[bot]
11a7ff2977 chore(deps): bump trufflesecurity/trufflehog from 3.77.0 to 3.78.0 (#4222)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:51:43 +02:00
dependabot[bot]
12ba978361 chore(deps-dev): bump safety from 3.2.0 to 3.2.3 (#4232)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 09:22:41 +02:00
dependabot[bot]
42182a2b70 chore(deps): bump botocore from 1.34.118 to 1.34.123 (#4224)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-11 08:37:14 +02:00
dependabot[bot]
26eaec3101 chore(deps-dev): bump authlib from 1.3.0 to 1.3.1 (#4213)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-10 16:47:40 -04:00
Pepe Fagoaga
daf6194dee chore(regions_update): Changes in regions for AWS services. (#4210)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-08 16:14:04 +02:00
William Leung
e28300a1db fix(config/html): handle encoding issues and improve error handling in config and HTML file loading functions (#4203)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-07 12:51:01 -04:00
Rubén De la Torre Vico
1a225c334f chore(acm): Improve near-expiration certificates check (#4207)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-07 12:22:05 -04:00
Sergio Garcia
1d64ca4372 fix(compliance): check if custom check has compliance metadata (#4208) 2024-06-07 10:54:34 -04:00
Seiji Ujihira
2a139e3dc7 fix(custom): execute custom checks (#4202) 2024-06-07 10:01:28 -04:00
Pedro Martín
89d1712ff1 fix(dashboard): fix styles in overview page (#4204) 2024-06-07 09:46:54 -04:00
Pedro Martín
45ea9e1e79 fix(html): fix status from HTML outputs (#4206) 2024-06-07 09:36:21 -04:00
Pepe Fagoaga
4b46fe9788 chore(regions_update): Changes in regions for AWS services. (#4205)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-06-07 09:31:55 -04:00
Sergio Garcia
28b9e269b7 chore(version): update Prowler version (#4201) 2024-06-07 08:40:03 +02:00
Pedro Martín
0a41ec4746 fix(html): resolve html changing finding status (#4199) 2024-06-06 11:30:49 -04:00
Pedro Martín
e6472f9bfc fix(html): handle muted status to html outputs (#4195)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-06-06 10:06:02 -04:00
Pedro Martín
c033af6194 docs(readme): Update checks number (#4197) 2024-06-06 09:39:24 -04:00
sansns-aws
4d662dc446 feat(rds): Add security group event subscription check (#4130)
Co-authored-by: Sergio <sergio@prowler.com>
2024-06-06 08:45:50 -04:00
1436 changed files with 84597 additions and 17634 deletions

.backportrc.json

@@ -0,0 +1,14 @@
{
"repoOwner": "prowler-cloud",
"repoName": "prowler",
"targetPRLabels": [
"backport"
],
"sourcePRLabels": [
"was-backported"
],
"copySourcePRLabels": false,
"copySourcePRReviewers": true,
"prTitle": "{{sourcePullRequest.title}}",
"commitConflicts": true
}

.github/CODEOWNERS

@@ -1 +1,5 @@
* @prowler-cloud/prowler-oss @prowler-cloud/prowler-dev
* @prowler-cloud/sdk @prowler-cloud/detection-and-remediation
# To protect a repository fully against unauthorized changes, you also need to define an owner for the CODEOWNERS file itself.
# https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners#codeowners-and-branch-protection
/.github/ @prowler-cloud/sdk


@@ -1,6 +1,5 @@
name: 🐞 Bug Report
description: Create a report to help us improve
title: "[Bug]: "
labels: ["bug", "status/needs-triage"]
body:


@@ -1,8 +1,7 @@
name: 💡 Feature Request
description: Suggest an idea for this project
labels: ["feature-request", "status/needs-triage"]
body:
- type: textarea
id: Problem


@@ -8,7 +8,7 @@ updates:
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
labels:
@@ -17,14 +17,14 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: master
- package-ecosystem: "pip"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: v3
labels:
@@ -34,7 +34,7 @@ updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
interval: "daily"
open-pull-requests-limit: 10
target-branch: v3
labels:

.github/labeler.yml

@@ -29,3 +29,53 @@ github_actions:
cli:
- changed-files:
- any-glob-to-any-file: "cli/**"
mutelist:
- changed-files:
- any-glob-to-any-file: "prowler/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "tests/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/kubernetes/lib/mutelist/**"
integration/s3:
- changed-files:
- any-glob-to-any-file: "prowler/providers/aws/lib/s3/**"
- any-glob-to-any-file: "tests/providers/aws/lib/s3/**"
integration/slack:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/slack/**"
- any-glob-to-any-file: "tests/lib/outputs/slack/**"
integration/security-hub:
- changed-files:
- any-glob-to-any-file: "prowler/providers/aws/lib/security_hub/**"
- any-glob-to-any-file: "tests/providers/aws/lib/security_hub/**"
- any-glob-to-any-file: "prowler/lib/outputs/asff/**"
- any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/html:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/html/**"
- any-glob-to-any-file: "tests/lib/outputs/html/**"
output/asff:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/asff/**"
- any-glob-to-any-file: "tests/lib/outputs/asff/**"
output/ocsf:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/ocsf/**"
- any-glob-to-any-file: "tests/lib/outputs/ocsf/**"
output/csv:
- changed-files:
- any-glob-to-any-file: "prowler/lib/outputs/csv/**"
- any-glob-to-any-file: "tests/lib/outputs/csv/**"


@@ -2,11 +2,19 @@
Please include relevant motivation and context for this PR.
If fixes an issue please add it with `Fix #XXXX`
### Description
Please include a summary of the change and which issue is fixed. List any dependencies that are required for this change.
### Checklist
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
- [ ] Review if the code is being covered by tests.
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
### License

.github/workflows/backport.yml

@@ -0,0 +1,42 @@
name: Automatic Backport
on:
pull_request_target:
branches: ['master']
types: ['labeled', 'closed']
jobs:
backport:
name: Backport PR
if: github.event.pull_request.merged == true && !(contains(github.event.pull_request.labels.*.name, 'backport'))
runs-on: ubuntu-latest
permissions:
id-token: write
pull-requests: write
contents: write
steps:
# Workaround not to fail the workflow if the PR does not need a backport
# https://github.com/sorenlouv/backport-github-action/issues/127#issuecomment-2258561266
- name: Check for backport labels
id: check_labels
run: |-
labels='${{ toJSON(github.event.pull_request.labels.*.name) }}'
echo "$labels"
matched=$(echo "${labels}" | jq '. | map(select(startswith("backport-to-"))) | length')
echo "matched=$matched"
echo "matched=$matched" >> $GITHUB_OUTPUT
- name: Backport Action
if: fromJSON(steps.check_labels.outputs.matched) > 0
uses: sorenlouv/backport-github-action@v9.5.1
with:
github_token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
auto_backport_label_prefix: backport-to-
- name: Info log
if: ${{ success() && fromJSON(steps.check_labels.outputs.matched) > 0 }}
run: cat ~/.backport/backport.info.log
- name: Debug log
if: ${{ failure() && fromJSON(steps.check_labels.outputs.matched) > 0 }}
run: cat ~/.backport/backport.debug.log
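
For readers unfamiliar with the jq expression in the check_labels step above, this is a rough Python equivalent of the label filter (the label values are hypothetical):

```python
# Rough Python equivalent of the workflow's jq filter:
# jq '. | map(select(startswith("backport-to-"))) | length'
labels = ["backport-to-v3", "severity/low", "provider/aws"]  # hypothetical PR labels

matched = len([label for label in labels if label.startswith("backport-to-")])
print(matched)  # 1 -> the Backport Action step runs; 0 -> it is skipped
```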


@@ -16,9 +16,9 @@ jobs:
name: Documentation Link
runs-on: ubuntu-latest
steps:
- name: Leave PR comment with the SaaS Documentation URI
- name: Leave PR comment with the Prowler Documentation URI
uses: peter-evans/create-or-update-comment@v4
with:
issue-number: ${{ env.PR_NUMBER }}
body: |
You can check the documentation for this PR here -> [SaaS Documentation](https://prowler-prowler-docs--${{ env.PR_NUMBER }}.com.readthedocs.build/projects/prowler-open-source/en/${{ env.PR_NUMBER }}/)
You can check the documentation for this PR here -> [Prowler Documentation](https://prowler-prowler-docs--${{ env.PR_NUMBER }}.com.readthedocs.build/projects/prowler-open-source/en/${{ env.PR_NUMBER }}/)


@@ -43,7 +43,7 @@ jobs:
runs-on: ubuntu-latest
outputs:
prowler_version_major: ${{ steps.get-prowler-version.outputs.PROWLER_VERSION_MAJOR }}
prowler_version: ${{ steps.update-prowler-version.outputs.PROWLER_VERSION }}
prowler_version: ${{ steps.get-prowler-version.outputs.PROWLER_VERSION }}
env:
POETRY_VIRTUALENVS_CREATE: "false"
@@ -65,6 +65,8 @@ jobs:
id: get-prowler-version
run: |
PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_ENV}"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"
# Store prowler version major just for the release
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
@@ -89,15 +91,6 @@ jobs:
;;
esac
- name: Update Prowler version (release)
id: update-prowler-version
if: github.event_name == 'release'
run: |
PROWLER_VERSION="${{ github.event.release.tag_name }}"
poetry version "${PROWLER_VERSION}"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_ENV}"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_OUTPUT}"
- name: Login to DockerHub
uses: docker/login-action@v3
with:
@@ -118,7 +111,7 @@ jobs:
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
push: true
tags: |
@@ -130,7 +123,7 @@ jobs:
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
# Use local context to get changes
# https://github.com/docker/build-push-action#path-context
@@ -160,7 +153,7 @@ jobs:
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
-H "Authorization: Bearer ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"v3-latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
@@ -169,6 +162,6 @@ jobs:
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
-H "Authorization: Bearer ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ needs.container-build-push.outputs.prowler_version }}"}}'


@@ -13,10 +13,10 @@ name: "CodeQL"
on:
push:
branches: [ "master", "v3" ]
branches: [ "master", "v3", "v4.*" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "master", "v3" ]
branches: [ "master", "v3", "v4.*" ]
schedule:
- cron: '00 12 * * *'


@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@v3.77.0
uses: trufflesecurity/trufflehog@v3.82.6
with:
path: ./
base: ${{ github.event.repository.default_branch }}


@@ -5,6 +5,7 @@ on:
branches:
- "master"
- "v3"
- "v4.*"
jobs:
labeler:


@@ -5,10 +5,12 @@ on:
branches:
- "master"
- "v3"
- "v4.*"
pull_request:
branches:
- "master"
- "v3"
- "v4.*"
jobs:
build:
runs-on: ubuntu-latest
@@ -20,7 +22,7 @@ jobs:
- uses: actions/checkout@v4
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@v44
uses: tj-actions/changed-files@v45
with:
files: ./**
files_ignore: |
@@ -29,6 +31,7 @@ jobs:
docs/**
permissions/**
mkdocs.yml
.backportrc.json
- name: Install poetry
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
@@ -73,7 +76,7 @@ jobs:
- name: Safety
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |
poetry run safety check --ignore 67599 --ignore 70612
poetry run safety check --ignore 70612
- name: Vulture
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |


@@ -8,8 +8,6 @@ env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
CACHE: "poetry"
# TODO: create a bot user for this kind of tasks, like prowler-bot
GIT_COMMITTER_EMAIL: "sergio@prowler.com"
jobs:
release-prowler-job:
@@ -40,7 +38,6 @@ jobs:
- name: Install dependencies
run: |
pipx install poetry
pipx inject poetry poetry-bumpversion
- name: Setup Python
uses: actions/setup-python@v5
@@ -48,34 +45,6 @@ jobs:
python-version: ${{ env.PYTHON_VERSION }}
cache: ${{ env.CACHE }}
- name: Update Poetry and config version
run: |
poetry version ${{ env.RELEASE_TAG }}
- name: Import GPG key
uses: crazy-max/ghaction-import-gpg@v6
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
git_user_signingkey: true
git_commit_gpgsign: true
- name: Push updated version to the release tag
run: |
# Configure Git
git config user.name "github-actions"
git config user.email "${{ env.GIT_COMMITTER_EMAIL }}"
# Add the files with the version changed
git add prowler/config/config.py pyproject.toml
git commit -m "chore(release): ${{ env.RELEASE_TAG }}" --no-verify -S
# Replace the tag with the version updated
git tag -fa ${{ env.RELEASE_TAG }} -m "chore(release): ${{ env.RELEASE_TAG }}" --sign
# Push the tag
git push -f origin ${{ env.RELEASE_TAG }}
- name: Build Prowler package
run: |
poetry build


@@ -50,13 +50,13 @@ jobs:
# Create pull request
- name: Create Pull Request
uses: peter-evans/create-pull-request@v6
uses: peter-evans/create-pull-request@v7
with:
token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services."
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
commit-message: "feat(regions_update): Update regions for AWS services"
branch: "aws-services-regions-updated-${{ github.sha }}"
labels: "status/waiting-for-revision, severity/low, provider/aws, backport-v3"
title: "chore(regions_update): Changes in regions for AWS services."
labels: "status/waiting-for-revision, severity/low, provider/aws, backport-to-v3"
title: "chore(regions_update): Changes in regions for AWS services"
body: |
### Description


@@ -97,7 +97,7 @@ repos:
- id: safety
name: safety
description: "Safety is a tool that checks your installed dependencies for known security vulnerabilities"
entry: bash -c 'safety check --ignore 67599 --ignore 70612'
entry: bash -c 'safety check --ignore 70612'
language: system
- id: vulture


@@ -12,7 +12,7 @@
<p align="center">
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img width="30" height="30" alt="Prowler community on Slack" src="https://github.com/prowler-cloud/prowler/assets/38561120/3c8b4ec5-6849-41a5-b5e1-52bbb94af73a"></a>
<br>
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog">Join our Prowler community!</a>
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-2oinmgmw6-cl7gOrljSEqo_aoripVPFA">Join our Prowler community!</a>
</p>
<hr>
<p align="center">
@@ -37,6 +37,9 @@
<a href="https://twitter.com/prowlercloud"><img alt="Twitter" src="https://img.shields.io/twitter/follow/prowlercloud?style=social"></a>
</p>
<hr>
<p align="center">
<img align="center" src="/docs/img/prowler-cli-quick.gif" width="100%" height="100%">
</p>
# Description
@@ -60,9 +63,9 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 359 | 66 -> `prowler aws --list-services` | 28 -> `prowler aws --list-compliance` | 7 -> `prowler aws --list-categories` |
| GCP | 77 | 13 -> `prowler gcp --list-services` | 1 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
| Azure | 127 | 16 -> `prowler azure --list-services` | 2 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
| AWS | 457 | 67 -> `prowler aws --list-services` | 30 -> `prowler aws --list-compliance` | 9 -> `prowler aws --list-categories` |
| GCP | 77 | 13 -> `prowler gcp --list-services` | 2 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
| Azure | 136 | 17 -> `prowler azure --list-services` | 3 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
| Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 1 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |
# 💻 Installation
@@ -74,7 +77,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
pip install prowler
prowler -v
```
More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)
>More details at [https://docs.prowler.com](https://docs.prowler.com/projects/prowler-open-source/en/latest/)
## Containers
@@ -91,7 +94,7 @@ The container images are available here:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
## From Github
## From GitHub
Python >= 3.9, < 3.13 is required with pip and poetry:
@@ -102,8 +105,7 @@ poetry shell
poetry install
python prowler.py -v
```
???+ note
If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
> If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
# 📐✏️ High level architecture
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell and many more.


@@ -1,6 +0,0 @@
# CLI
To show the banner, use:
`python cli/cli.py banner`
## Listing
List services by provider.
`python cli/cli.py <provider> list-services`


@@ -1,63 +0,0 @@
import typer
from prowler.lib.banner import print_banner
from prowler.lib.check.check import (
list_fixers,
list_services,
print_fixers,
print_services,
)
app = typer.Typer()
aws = typer.Typer(name="aws")
azure = typer.Typer(name="azure")
gcp = typer.Typer(name="gcp")
kubernetes = typer.Typer(name="kubernetes")
app.add_typer(aws, name="aws")
app.add_typer(azure, name="azure")
app.add_typer(gcp, name="gcp")
app.add_typer(kubernetes, name="kubernetes")
def list_resources(provider: str, resource_type: str):
if resource_type == "services":
print_services(list_services(provider))
elif resource_type == "fixers":
print_fixers(list_fixers(provider))
def create_list_commands(provider_typer: typer.Typer):
provider_name = provider_typer.info.name
@provider_typer.command(
"list-services",
help=f"List the {provider_name} services that are supported by Prowler.",
)
def list_services_command():
list_resources(provider_name, "services")
@provider_typer.command(
"list-fixers",
help=f"List the {provider_name} fixers that are supported by Prowler.",
)
def list_fixers_command():
list_resources(provider_name, "fixers")
create_list_commands(aws)
create_list_commands(azure)
create_list_commands(gcp)
create_list_commands(kubernetes)
@app.command("banner", help="Prints the banner of the tool.")
def banner(show: bool = True):
if show:
print_banner(show)
else:
print("Banner is not shown.")
if __name__ == "__main__":
app()


@@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/


@@ -0,0 +1,24 @@
apiVersion: v2
name: prowler
description: Prowler Security Tool Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"


@@ -0,0 +1,78 @@
# prowler
![Version: 0.1.1](https://img.shields.io/badge/Version-0.1.1-informational?style=flat-square) ![Type: application](https://img.shields.io/badge/Type-application-informational?style=flat-square) ![AppVersion: 1.16.0](https://img.shields.io/badge/AppVersion-1.16.0-informational?style=flat-square)
Prowler Security Tool Helm chart for Kubernetes
# Prowler Helm Chart Deployment
This guide provides step-by-step instructions for deploying the Prowler Helm chart.
## Prerequisites
Before you begin, ensure you have the following:
1. A running Kubernetes cluster.
2. Helm installed on your local machine. If you don't have Helm installed, you can follow the [Helm installation guide](https://helm.sh/docs/intro/install/).
3. Proper access to your Kubernetes cluster (e.g., `kubectl` is configured and working).
## Deployment Steps
### 1. Clone the Repository
Clone the repository containing the Helm chart to your local machine.
```sh
git clone git@github.com:prowler-cloud/prowler.git
cd prowler/contrib/k8s/helm
```
### 2. Deploy the helm chart
```
helm install prowler .
```
### 3. Verify the deployment
```
helm status prowler
kubectl get all -n prowler-ns
```
### 4. Clean Up
To uninstall the Helm release and clean up the resources, run:
```
helm uninstall prowler
kubectl delete namespace prowler-ns
```
## Values
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| clusterRole.name | string | `"prowler-read-cluster"` | |
| clusterRoleBinding.name | string | `"prowler-read-cluster-binding"` | |
| configMap.name | string | `"prowler-hostpaths"` | |
| configMapData.etcCniNetd | string | `"/etc/cni/net.d"` | |
| configMapData.etcKubernetes | string | `"/etc/kubernetes"` | |
| configMapData.etcSystemd | string | `"/etc/systemd"` | |
| configMapData.libSystemd | string | `"/lib/systemd"` | |
| configMapData.optCniBin | string | `"/opt/cni/bin"` | |
| configMapData.usrBin | string | `"/usr/bin"` | |
| configMapData.varLibCni | string | `"/var/lib/cni"` | |
| configMapData.varLibEtcd | string | `"/var/lib/etcd"` | |
| configMapData.varLibKubeControllerManager | string | `"/var/lib/kube-controller-manager"` | |
| configMapData.varLibKubeScheduler | string | `"/var/lib/kube-scheduler"` | |
| configMapData.varLibKubelet | string | `"/var/lib/kubelet"` | |
| cronjob.hostPID | bool | `true` | |
| cronjob.name | string | `"prowler"` | |
| cronjob.schedule | string | `"0 0 * * *"` | |
| image.pullPolicy | string | `"Always"` | |
| image.repository | string | `"toniblyx/prowler"` | |
| image.tag | string | `"stable"` | |
| namespace.name | string | `"prowler"` | |
| serviceAccount.name | string | `"prowler"` | |
----------------------------------------------
Autogenerated from chart metadata using [helm-docs v1.11.3](https://github.com/norwoodj/helm-docs/releases/v1.11.3)


@@ -0,0 +1,11 @@
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
name: {{ .Values.clusterRole.name }}
rules:
- apiGroups: [""]
resources: ["pods", "configmaps", "nodes", "namespaces"]
verbs: ["get", "list", "watch"]
- apiGroups: ["rbac.authorization.k8s.io"]
resources: ["clusterrolebindings", "rolebindings", "clusterroles", "roles"]
verbs: ["get", "list", "watch"]


@@ -0,0 +1,18 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: {{ .Values.configMap.name }}
namespace: {{ .Values.namespace.name }}
data:
varLibCni: "{{ .Values.configMap.data.varLibCni }}"
varLibEtcd: "{{ .Values.configMap.data.varLibEtcd }}"
varLibKubelet: "{{ .Values.configMap.data.varLibKubelet }}"
varLibKubeScheduler: "{{ .Values.configMap.data.varLibKubeScheduler }}"
varLibKubeControllerManager: "{{ .Values.configMap.data.varLibKubeControllerManager }}"
etcSystemd: "{{ .Values.configMap.data.etcSystemd }}"
libSystemd: "{{ .Values.configMap.data.libSystemd }}"
etcKubernetes: "{{ .Values.configMap.data.etcKubernetes }}"
usrBin: "{{ .Values.configMap.data.usrBin }}"
etcCniNetd: "{{ .Values.configMap.data.etcCniNetd }}"
optCniBin: "{{ .Values.configMap.data.optCniBin }}"
srvKubernetes: "{{ .Values.configMap.data.srvKubernetes }}"


@@ -0,0 +1,42 @@
apiVersion: batch/v1
kind: CronJob
metadata:
name: {{ .Values.cronjob.name }}
namespace: {{ .Values.namespace.name }}
spec:
schedule: "{{ .Values.cronjob.schedule }}"
jobTemplate:
spec:
template:
metadata:
labels:
app: prowler
spec:
serviceAccountName: {{ .Values.serviceAccount.name }}
containers:
- name: prowler
image: {{ .Values.image.repository }}:{{ .Values.image.tag }}
command: ["prowler"]
args: ["kubernetes", "-z", "-b"]
imagePullPolicy: {{ .Values.image.pullPolicy }}
volumeMounts:
{{- range $key, $value := .Values.configMap.data }}
{{- if and (eq $.Values.clusterType "gke") (eq $key "srvKubernetes") }}
{{- else }}
- name: {{ $key | lower }}
mountPath: {{ $value }}
readOnly: true
{{- end }}
{{- end }}
hostPID: {{ .Values.cronjob.hostPID }}
restartPolicy: Never
volumes:
{{- range $key, $value := .Values.configMap.data }}
{{- if and (eq $.Values.clusterType "gke") (eq $key "srvKubernetes") }}
{{- else }}
- name: {{ $key | lower }}
hostPath:
path: {{ $value }}
{{- end }}
{{- end }}


@@ -0,0 +1,4 @@
apiVersion: v1
kind: Namespace
metadata:
name: {{ .Values.namespace.name }}


@@ -0,0 +1,12 @@
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: {{ .Values.clusterRoleBinding.name }}
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: {{ .Values.clusterRole.name }}
subjects:
- kind: ServiceAccount
name: {{ .Values.serviceAccount.name }}
namespace: {{ .Values.namespace.name }}


@@ -0,0 +1,5 @@
apiVersion: v1
kind: ServiceAccount
metadata:
name: {{ .Values.serviceAccount.name }}
namespace: {{ .Values.namespace.name }}


@@ -0,0 +1,40 @@
namespace:
name: prowler-ns
cronjob:
name: prowler
schedule: "0 0 * * *"
hostPID: true
serviceAccount:
name: prowler-sa
image:
repository: toniblyx/prowler
tag: stable
pullPolicy: Always
clusterType:
configMap:
name: prowler-config
data:
varLibCni: "/var/lib/cni"
varLibEtcd: "/var/lib/etcd"
varLibKubelet: "/var/lib/kubelet"
varLibKubeScheduler: "/var/lib/kube-scheduler"
varLibKubeControllerManager: "/var/lib/kube-controller-manager"
etcSystemd: "/etc/systemd"
libSystemd: "/lib/systemd"
etcKubernetes: "/etc/kubernetes"
usrBin: "/usr/bin"
etcCniNetd: "/etc/cni/net.d"
optCniBin: "/opt/cni/bin"
srvKubernetes: "/srv/kubernetes"
clusterRole:
name: prowler-read-cluster
clusterRoleBinding:
name: prowler-read-cluster-binding
roleName: prowler-read-cluster


@@ -21,7 +21,7 @@ print(
f"{Fore.GREEN}Loading all CSV files from the folder {folder_path_overview} ...\n{Style.RESET_ALL}"
)
cli.show_server_banner = lambda *x: click.echo(
f"{Fore.YELLOW}NOTE:{Style.RESET_ALL} If you are a {Fore.GREEN}{Style.BRIGHT}Prowler SaaS{Style.RESET_ALL} customer and you want to use your data from your S3 bucket,\nrun: `{orange_color}aws s3 cp s3://<your-bucket>/output/csv ./output --recursive{Style.RESET_ALL}`\nand then run `prowler dashboard` again to load the new files."
f"{Fore.YELLOW}NOTE:{Style.RESET_ALL} If you are using {Fore.GREEN}{Style.BRIGHT}Prowler SaaS{Style.RESET_ALL} with the S3 integration or that integration \nfrom {Fore.CYAN}{Style.BRIGHT}Prowler Open Source{Style.RESET_ALL} and you want to use your data from your S3 bucket,\nrun: `{orange_color}aws s3 cp s3://<your-bucket>/output/csv ./output --recursive{Style.RESET_ALL}`\nand then run `prowler dashboard` again to load the new files."
)
# Initialize the app - incorporate css
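
As an aside, the show_server_banner override in the hunk above is a standard Flask monkeypatch; a minimal standalone sketch, assuming only Flask and click are installed:

```python
# Minimal sketch: replace Flask's dev-server startup banner with a custom
# message, as the dashboard does before launching the Dash app.
import click
from flask import cli

cli.show_server_banner = lambda *args: click.echo("custom startup note")
```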


@@ -2223,3 +2223,232 @@ def get_section_containers_ens(data, section_1, section_2, section_3, section_4)
section_containers.append(section_container)
return html.Div(section_containers, className="compliance-data-layout")
# This function extracts and compares up to two numeric values, ensuring correct sorting for version-like strings.
def extract_numeric_values(value):
numbers = re.findall(r"\d+", str(value))
if len(numbers) >= 2:
return int(numbers[0]), int(numbers[1])
elif len(numbers) == 1:
return int(numbers[0]), 0
return 0, 0
def get_section_containers_kisa_ismsp(data, section_1, section_2):
data["STATUS"] = data["STATUS"].apply(map_status_to_icon)
data[section_1] = data[section_1].astype(str)
data[section_2] = data[section_2].astype(str)
data.sort_values(
by=section_1,
key=lambda x: x.map(extract_numeric_values),
ascending=True,
inplace=True,
)
findings_counts_section = (
data.groupby([section_2, "STATUS"]).size().unstack(fill_value=0)
)
findings_counts_name = (
data.groupby([section_1, "STATUS"]).size().unstack(fill_value=0)
)
section_containers = []
for name in data[section_1].unique():
success_name = (
findings_counts_name.loc[name, pass_emoji]
if pass_emoji in findings_counts_name.columns
else 0
)
failed_name = (
findings_counts_name.loc[name, fail_emoji]
if fail_emoji in findings_counts_name.columns
else 0
)
fig_name = go.Figure(
data=[
go.Bar(
name="Failed",
x=[failed_name],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
width=[0.8],
),
go.Bar(
name="Success",
x=[success_name],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
width=[0.8],
),
]
)
fig_name.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_name + failed_name,
y=0,
xref="x",
yref="y",
text=str(success_name),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_name),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
graph_name = dcc.Graph(
figure=fig_name, config={"staticPlot": True}, className="info-bar"
)
graph_div = html.Div(graph_name, className="graph-section")
direct_internal_items = []
for section in data[data[section_1] == name][section_2].unique():
specific_data = data[
(data[section_1] == name) & (data[section_2] == section)
]
success_section = (
findings_counts_section.loc[section, pass_emoji]
if pass_emoji in findings_counts_section.columns
else 0
)
failed_section = (
findings_counts_section.loc[section, fail_emoji]
if fail_emoji in findings_counts_section.columns
else 0
)
data_table = dash_table.DataTable(
data=specific_data.to_dict("records"),
columns=[
{"name": i, "id": i}
for i in ["CHECKID", "STATUS", "REGION", "ACCOUNTID", "RESOURCEID"]
],
style_table={"overflowX": "auto"},
style_as_list_view=True,
style_cell={"textAlign": "left", "padding": "5px"},
)
fig_section = go.Figure(
data=[
go.Bar(
name="Failed",
x=[failed_section],
y=[""],
orientation="h",
marker=dict(color="#e77676"),
),
go.Bar(
name="Success",
x=[success_section],
y=[""],
orientation="h",
marker=dict(color="#45cc6e"),
),
]
)
fig_section.update_layout(
barmode="stack",
margin=dict(l=10, r=10, t=10, b=10),
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
showlegend=False,
width=350,
height=30,
xaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
yaxis=dict(showticklabels=False, showgrid=False, zeroline=False),
annotations=[
dict(
x=success_section + failed_section,
y=0,
xref="x",
yref="y",
text=str(success_section),
showarrow=False,
font=dict(color="#45cc6e", size=14),
xanchor="left",
yanchor="middle",
),
dict(
x=0,
y=0,
xref="x",
yref="y",
text=str(failed_section),
showarrow=False,
font=dict(color="#e77676", size=14),
xanchor="right",
yanchor="middle",
),
],
)
graph_section = dcc.Graph(
figure=fig_section,
config={"staticPlot": True},
className="info-bar-child",
)
graph_div_section = html.Div(graph_section, className="graph-section-req")
internal_accordion_item = dbc.AccordionItem(
title=section,
children=[html.Div([data_table], className="inner-accordion-content")],
)
internal_section_container = html.Div(
[
graph_div_section,
dbc.Accordion(
[internal_accordion_item], start_collapsed=True, flush=True
),
],
className="accordion-inner--child",
)
direct_internal_items.append(internal_section_container)
accordion_item = dbc.AccordionItem(
title=f"{name}", children=direct_internal_items
)
section_container = html.Div(
[
graph_div,
dbc.Accordion([accordion_item], start_collapsed=True, flush=True),
],
className="accordion-inner",
)
section_containers.append(section_container)
return html.Div(section_containers, className="compliance-data-layout")
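
To make the sorting behavior of the new extract_numeric_values helper concrete, here is a small self-contained example (the sample requirement IDs are hypothetical):

```python
import re

def extract_numeric_values(value):
    # Same helper as above: pull up to two integers out of a version-like string.
    numbers = re.findall(r"\d+", str(value))
    if len(numbers) >= 2:
        return int(numbers[0]), int(numbers[1])
    elif len(numbers) == 1:
        return int(numbers[0]), 0
    return 0, 0

sections = ["2.10", "2.2", "10.1", "1"]  # hypothetical requirement IDs
print(sorted(sections, key=extract_numeric_values))
# -> ['1', '2.2', '2.10', '10.1']  (numeric order, not lexicographic)
```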


@@ -0,0 +1,25 @@
import warnings
from dashboard.common_methods import get_section_containers_kisa_ismsp
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -0,0 +1,25 @@
import warnings
from dashboard.common_methods import get_section_containers_kisa_ismsp
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_ATTRIBUTES_SUBDOMAIN",
"REQUIREMENTS_ATTRIBUTES_SECTION",
# "REQUIREMENTS_DESCRIPTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_kisa_ismsp(
aux, "REQUIREMENTS_ATTRIBUTES_SUBDOMAIN", "REQUIREMENTS_ATTRIBUTES_SECTION"
)


@@ -21,7 +21,7 @@ muted_manual_color = "#b33696"
critical_color = "#951649"
high_color = "#e11d48"
medium_color = "#ee6f15"
low_color = "#f9f5e6"
low_color = "#fcf45d"
informational_color = "#3274d9"
# Folder output path


@@ -945,7 +945,7 @@ def filter_data(
color_mapping_status = {
"FAIL": fail_color,
"PASS": pass_color,
"INFO": info_color,
"LOW": info_color,
"MANUAL": manual_color,
"WARNING": muted_fail_color,
"MUTED (FAIL)": muted_fail_color,
@@ -1564,7 +1564,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
data.get(
"FINDING_UID", ""
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1644,28 +1647,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"STATUS_EXTENDED",
"",
)
)
),
],
style={"display": "flex"},
),
html.Div(
[
html.P(
html.Strong(
"Risk: ",
style={
"margin-right": "5px"
},
)
),
html.P(
str(
data.get(
"RISK",
"",
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1689,7 +1674,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
)
),
html.P(
str(data.get("RISK", ""))
str(data.get("RISK", "")),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1744,7 +1732,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"REMEDIATION_RECOMMENDATION_TEXT",
"",
)
)
),
style={
"margin-left": "5px"
},
),
],
style={"display": "flex"},
@@ -1772,7 +1763,10 @@ def generate_table(data, index, color_mapping_severity, color_mapping_status):
"",
)
),
style={"color": "#3182ce"},
style={
"color": "#3182ce",
"margin-left": "5px",
},
),
],
style={"display": "flex"},

View File

@@ -222,7 +222,7 @@ class ec2_securitygroup_with_many_ingress_egress_rules(Check):
max_security_group_rules = ec2_client.audit_config.get(
"max_security_group_rules", 50
)
for security_group in ec2_client.security_groups:
for security_group_arn, security_group in ec2_client.security_groups.items():
```
```yaml title="config.yaml"
@@ -272,7 +272,7 @@ Each Prowler check has metadata associated which is stored at the same level of
# Severity holds the check's severity, always in lowercase (critical, high, medium, low or informational)
"Severity": "critical",
# ResourceType only for AWS, holds the type from here
# https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html
# https://docs.aws.amazon.com/securityhub/latest/userguide/asff-resources.html
"ResourceType": "Other",
# Description holds the title of the check, for now is the same as CheckTitle
"Description": "Ensure there are no EC2 AMIs set as Public.",
@@ -319,7 +319,7 @@ Each Prowler check has metadata associated which is stored at the same level of
For the Remediation Code we use the following knowledge base to fill it:
- Official documentation for the provider
- https://docs.bridgecrew.io
- https://docs.prowler.com/checks/checks-index
- https://www.trendmicro.com/cloudoneconformity
- https://github.com/cloudmatos/matos/tree/master/remediations

View File

@@ -1,7 +1,11 @@
# Debugging
Debugging in Prowler makes things easier!
If you are developing Prowler, it's possible that you will encounter some situations where you have to inspect the code in depth to fix some unexpected issues during the execution. To do that, if you are using VSCode you can run the code using the integrated debugger. Please, refer to this [documentation](https://code.visualstudio.com/docs/editor/debugging) for guidance about the debugger in VSCode.
If you are developing Prowler, it's possible that you will encounter some situations where you have to inspect the code in depth to fix some unexpected issues during the execution.
## VSCode
In VSCode you can run the code using the integrated debugger. Please, refer to this [documentation](https://code.visualstudio.com/docs/editor/debugging) for guidance about the debugger in VSCode.
The following file is an example of the [debugging configuration](https://code.visualstudio.com/docs/editor/debugging#_launch-configurations) file that you can add to [Virtual Studio Code](https://code.visualstudio.com/).
This file should be inside the *.vscode* folder and its name has to be *launch.json*:
@@ -11,31 +15,62 @@ This file should inside the *.vscode* folder and its name has to be *launch.json
"version": "0.2.0",
"configurations": [
{
"name": "Python: Current File",
"type": "python",
"name": "Debug AWS Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"aws",
"-f",
"eu-west-1",
"--service",
"cloudwatch",
"--log-level",
"ERROR",
"-p",
"dev",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Python: Debug Tests",
"type": "python",
"name": "Debug Azure Check",
"type": "debugpy",
"request": "launch",
"program": "${file}",
"purpose": [
"debug-test"
"program": "prowler.py",
"args": [
"azure",
"--sp-env-auth",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Debug GCP Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"gcp",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false
},
{
"name": "Debug K8s Check",
"type": "debugpy",
"request": "launch",
"program": "prowler.py",
"args": [
"kubernetes",
"--log-level",
"ERROR",
"-c",
"<check_name>"
],
"console": "integratedTerminal",
"justMyCode": false

View File

@@ -4,16 +4,18 @@ You can extend Prowler Open Source in many different ways, in most cases you wil
## Get the code and install all dependencies
First of all, you need a version of Python 3.9 or higher and also pip installed to be able to install all dependencies required. Once that is satisfied go ahead and clone the repo:
First of all, you need a version of Python 3.9 or higher and also `pip` installed to be able to install all dependencies required.
Then, to start working with the Prowler GitHub repository, you need to fork it to be able to propose changes for new features, bug fixes, etc. To fork the Prowler repo please refer to [this guide](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo?tool=webui#forking-a-repository).
Once that is satisfied go ahead and clone your forked repo:
```
git clone https://github.com/prowler-cloud/prowler
git clone https://github.com/<your-github-user>/prowler
cd prowler
```
For isolation and avoid conflicts with other environments, we recommend usage of `poetry`:
```
pip install poetry
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`, a Python dependency management tool. You can install it by following the instructions [here](https://python-poetry.org/docs/#installation).
Then install all dependencies including the ones for developers:
```
poetry install --with dev
@@ -44,7 +46,12 @@ Before we merge any of your pull requests we pass checks to the code, we use the
You can see all dependencies in file `pyproject.toml`.
Moreover, you would need to install [`TruffleHog`](https://github.com/trufflesecurity/trufflehog) to check for secrets in the code. You can install it using the official installation guide [here](https://github.com/trufflesecurity/trufflehog?tab=readme-ov-file#floppy_disk-installation).
Moreover, you would need to install the latest version of [`TruffleHog`](https://github.com/trufflesecurity/trufflehog) to check for secrets in the code. You can install it using the official installation guide [here](https://github.com/trufflesecurity/trufflehog?tab=readme-ov-file#floppy_disk-installation).
Additionally, please ensure to follow the code documentation practices outlined in this guide: [Google Python Style Guide - Comments and Docstrings](https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings).
???+ note
If you have any trouble when committing to the Prowler repository, add the `--no-verify` flag to the `git commit` command.
## Pull Request Checklist

View File

@@ -23,8 +23,8 @@ The Prowler's service structure is the following and the way to initialise it is
All the Prowler providers' services inherit from a base class depending on the provider used.
- [AWS Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/aws/lib/service/service.py)
- [GCP Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/service/service.py)
- [Azure Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/service/service.py)
- [GCP Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/service/service.py)
- [Azure Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/azure/lib/service/service.py)
- [Kubernetes Service Base Class](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/kubernetes/lib/service/service.py)
Each class is used to initialize the credentials and the API clients to be used in the service. If any threading is used, it must be coded there.
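As an illustration, the following is a minimal sketch of a service built on top of the AWS base class; the service name, the resource model and the API call are illustrative only, not taken from the real codebase:
```python
# Hedged sketch of a service on top of the AWS base class; names are illustrative.
from prowler.providers.aws.lib.service.service import AWSService


class Example(AWSService):
    def __init__(self, provider):
        # The base class initializes the session, the audited regions and one
        # API client per region for the service named after the class
        super().__init__(__class__.__name__, provider)
        self.vpcs = []
        # __threading_call__ runs the given function once per regional client
        self.__threading_call__(self._describe_vpcs)

    def _describe_vpcs(self, regional_client):
        # Paginated describe call; results are collected on the shared list
        for page in regional_client.get_paginator("describe_vpcs").paginate():
            self.vpcs.extend(page["Vpcs"])
```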

View File

@@ -592,7 +592,7 @@ is following the actual format, add one function where the client is passed to b
`mock_api_<endpoint>_calls` (*endpoint* refers to the first attribute referenced after *client*).
In the example of BigQuery the function is called `mock_api_dataset_calls`. Inside this function there is an assignment to
be used in the `__get_datasets__` method in BigQuery class:
be used in the `_get_datasets` method in BigQuery class:
```python
# Mocking datasets
@@ -765,7 +765,7 @@ from tests.providers.azure.azure_fixtures import (
set_mocked_azure_provider,
)
# Function to mock the service function __get_components__, this function task is to return a possible value that real function could returns
# Function to mock the service function _get_components; its task is to return a possible value that the real function could return
def mock_appinsights_get_components(_):
return {
AZURE_SUBSCRIPTION_ID: {
@@ -779,12 +779,12 @@ def mock_appinsights_get_components(_):
# Patch decorator to use the mocked function instead of the function with the real API call
@patch(
"prowler.providers.azure.services.appinsights.appinsights_service.AppInsights.__get_components__",
"prowler.providers.azure.services.appinsights.appinsights_service.AppInsights._get_components",
new=mock_appinsights_get_components,
)
class Test_AppInsights_Service:
# Mandatory test for every service; this method tests that the client instance is correct
def test__get_client__(self):
def test_get_client(self):
app_insights = AppInsights(set_mocked_azure_provider())
assert (
app_insights.clients[AZURE_SUBSCRIPTION_ID].__class__.__name__
@@ -794,8 +794,8 @@ class Test_AppInsights_Service:
def test__get_subscriptions__(self):
app_insights = AppInsights(set_mocked_azure_provider())
assert app_insights.subscriptions.__class__.__name__ == "dict"
# Test for the function __get_components__, inside this client is used the mocked function
def test__get_components__(self):
# Test for the function _get_components; inside it the mocked function is used
def test_get_components(self):
appinsights = AppInsights(set_mocked_azure_provider())
assert len(appinsights.components) == 1
assert (

View File

@@ -40,10 +40,10 @@ If your IAM entity enforces MFA you can use `--mfa` and Prowler will ask you to
Prowler for Azure supports the following authentication types:
- Service principal authentication by environment variables (Enterprise Application)
- [Service principal application](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) by environment variables (recommended)
- Current az cli credentials stored
- Interactive browser authentication
- Managed identity authentication
- [Managed identity](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
### Service Principal authentication
@@ -56,6 +56,8 @@ export AZURE_CLIENT_SECRET="XXXXXXX"
```
If you try to execute Prowler with the `--sp-env-auth` flag and those variables are empty or not exported, the execution is going to fail.
Follow the instructions in the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md) section to create a service principal.
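For reference, this is roughly how service-principal credentials are resolved from those environment variables; the snippet uses the `azure-identity` library for illustration and is not Prowler's actual wiring:
```python
# Illustrative sketch only: resolving service-principal credentials from the
# AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET environment variables.
from azure.identity import EnvironmentCredential

credential = EnvironmentCredential()  # fails fast if any variable is missing
token = credential.get_token("https://management.azure.com/.default")
```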
### AZ CLI / Browser / Managed Identity authentication
The other three options do not need additional configuration; `--az-cli-auth` and `--managed-identity-auth` are automated options. To use `--browser-auth` the user needs to authenticate against Azure using the default browser to start the scan, and `tenant-id` is also required.
@@ -64,55 +66,22 @@ The other three cases does not need additional configuration, `--az-cli-auth` an
To use each one you need to pass the proper flag to the execution. Prowler for Azure handles two types of permission scopes, which are:
- **Microsoft Entra ID permissions**: Used to retrieve metadata from the identity assumed by Prowler (not mandatory to have access to execute the tool).
- **Subscription scope permissions**: Required to launch the checks against your resources, mandatory to launch the tool.
#### Microsoft Entra ID scope
Microsoft Entra ID (AAD earlier) permissions required by the tool are the following:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
The best way to assign it is through the Azure web console:
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"
3. Once there, in the menu bar click on "+ New registration" to register a new application
4. Fill in the "Name", select the "Supported account types" and click on "Register". You will be redirected to the applications page.
![Register an Application page](../img/register-application.png)
4. Select the new application
5. In the left menu bar, select "API permissions"
6. Then click on "+ Add a permission" and select "Microsoft Graph"
7. Once in the "Microsoft Graph" view, select "Application permissions"
8. Finally, search for "Directory", "Policy" and "UserAuthenticationMethod" select the following permissions:
- **Microsoft Entra ID permissions**: Used to retrieve metadata from the identity assumed by Prowler and specific Entra checks (not mandatory to have access to execute the tool). The permissions required by the tool are the following:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
![EntraID Permissions](../img/AAD-permissions.png)
- **Subscription scope permissions**: Required to launch the checks against your resources, mandatory to launch the tool. It is required to add the following RBAC builtin roles per subscription to the entity that is going to be assumed by the tool:
- `Reader`
- `ProwlerRole` (custom role defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json))
To assign the permissions, follow the instructions in the [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions) section and the [Azure subscriptions permissions](../tutorials/azure/subscriptions.md#assigning-proper-permissions) section, respectively.
#### Subscriptions scope
#### Checks that require ProwlerRole
Regarding the subscription scope, Prowler by default scans all the subscriptions that is able to list, so it is required to add the following RBAC builtin roles per subscription to the entity that is going to be assumed by the tool:
The following checks require the `ProwlerRole` custom role to be executed. If you want to run them, make sure you have assigned the role to the identity that is going to be assumed by Prowler:
- `Security Reader`
- `Reader`
To assign this roles, follow the instructions:
1. Access your subscription, then select your subscription.
2. Select "Access control (IAM)".
3. In the overview, select "Roles"
![IAM Page](../img/page-IAM.png)
4. Click on "+ Add" and select "Add role assignment"
5. In the search bar, type `Security Reader`, select it and click on "Next"
6. In the Members tab, click on "+ Select members" and add the members you want to assign this role.
7. Click on "Review + assign" to apply the new role.
*Repeat these steps for `Reader` role*
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`
## Google Cloud

Binary file not shown. (image added: 1.4 MiB)

Binary file not shown. (image added: 357 KiB)

Binary file not shown. (image added: 688 KiB)

Binary file not shown. (image changed: 214 KiB before, 746 KiB after)

Binary file not shown. (image removed: 348 KiB)

Binary file not shown. (image added: 552 KiB)

View File

@@ -19,14 +19,40 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
## Quick Start
### Installation
Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/), thus can be installed using pip with `Python >= 3.9`:
Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/), thus it can be installed as a Python package with `Python >= 3.9`:
=== "Generic"
=== "pipx"
[pipx](https://pipx.pypa.io/stable/) is a tool to install Python applications in isolated environments. It is recommended to use `pipx` for a global installation.
_Requirements_:
* `Python >= 3.9`
* `Python pip >= 3.9`
* `pipx` installed: [pipx installation](https://pipx.pypa.io/stable/installation/).
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
pipx install prowler
prowler -v
```
To upgrade Prowler to the latest version, run:
``` bash
pipx upgrade prowler
```
=== "pip"
???+ warning
This method is not recommended because it will modify the environment in which you choose to install it. Consider using [pipx](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) for a global installation.
_Requirements_:
* `Python >= 3.9`
* `Python pip >= 21.0.0`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
@@ -36,13 +62,19 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
prowler -v
```
To upgrade Prowler to the latest version, run:
``` bash
pip install --upgrade prowler
```
=== "Docker"
_Requirements_:
* Have `docker` installed: https://docs.docker.com/get-docker/.
* AWS, GCP, Azure and/or Kubernetes credentials
* In the command below, change `-v` to your local directory path in order to access the reports.
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
@@ -54,41 +86,21 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
--env AWS_SESSION_TOKEN toniblyx/prowler:latest
```
=== "Ubuntu"
_Requirements for Ubuntu 20.04.3 LTS_:
* AWS, GCP, Azure and/or Kubernetes credentials
* Install python 3.9 with: `sudo apt-get install python3.9`
* Remove python 3.8 to avoid conflicts if you can: `sudo apt-get remove python3.8`
* Make sure you have the python3 distutils package installed: `sudo apt-get install python3-distutils`
* To make sure you use pip for 3.9 get the get-pip script with: `curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py`
* Execute it with the proper python version: `sudo python3.9 get-pip.py`
* Now you should have pip for 3.9 ready: `pip3.9 --version`
_Commands_:
```
pip3.9 install prowler
export PATH=$PATH:/home/$HOME/.local/bin/
prowler -v
```
=== "GitHub"
_Requirements for Developers_:
* `git`
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* AWS, GCP, Azure and/or Kubernetes credentials
* `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`)
_Commands_:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
poetry shell
poetry install
python prowler.py -v
poetry run python prowler.py -v
```
???+ note
If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
@@ -97,15 +109,33 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
_Requirements_:
* `Python >= 3.9`
* AWS, GCP, Azure and/or Kubernetes credentials
* Latest Amazon Linux 2 should come with Python 3.9 already installed however it may need pip. Install Python pip 3.9 with: `sudo yum install -y python3-pip`.
* Make sure setuptools for python is already installed with: `pip3 install setuptools`
_Commands_:
```
pip3.9 install prowler
export PATH=$PATH:/home/$HOME/.local/bin/
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
prowler -v
```
=== "Ubuntu"
_Requirements_:
* `Ubuntu 23.04` or above; if you are using an older version of Ubuntu, check [pipx installation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) and ensure you have `Python >= 3.9`.
* `Python >= 3.9`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
sudo apt update
sudo apt install pipx
pipx ensurepath
pipx install prowler
prowler -v
```
@@ -125,7 +155,7 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
=== "AWS CloudShell"
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [2](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it's already included in AL2023. Prowler can thus be easily installed following the Generic method of installation via pip. Follow the steps below to successfully execute Prowler v4 in AWS CloudShell:
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it's already included in AL2023. Prowler can thus be easily installed following the Generic method of installation via pip. Follow the steps below to successfully execute Prowler v4 in AWS CloudShell:
_Requirements_:
@@ -133,11 +163,13 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
_Commands_:
```
```bash
sudo bash
adduser prowler
su prowler
pip install prowler
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler aws
```
@@ -153,9 +185,12 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/),
_Commands_:
```
pip install prowler
prowler -v
```bash
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler azure --az-cli-auth
```
## Prowler container versions

View File

@@ -85,7 +85,7 @@ prowler --security-hub --region eu-west-1
```
???+ note
It is recommended to send only fails to Security Hub and that is possible adding `-q/--quiet` to the command. You can use, instead of the `-q/--quiet` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub.
It is recommended to send only fails to Security Hub, which is possible by adding `--status FAIL` to the command. Instead of the `--status FAIL` argument, you can use the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but send only FAIL findings to AWS Security Hub.
Since Prowler performs checks in all regions by default, you may need to filter by region when running the Security Hub integration, as shown in the example above. Remember to enable Security Hub in the region or regions you need by calling `aws securityhub enable-security-hub --region <region>` and run Prowler with the option `-f/--region <region>` (if no region is given, it will try to push findings to the hubs in all regions). Prowler will send findings to the Security Hub in the region where the scanned resource is located.
@@ -121,13 +121,13 @@ prowler --security-hub --role arn:aws:iam::123456789012:role/ProwlerExecutionRol
## Send only failed findings to Security Hub
When using the **AWS Security Hub** integration you can send only the `FAIL` findings generated by **Prowler**. Therefore, the **AWS Security Hub** usage costs eventually would be lower. To follow that recommendation you could add the `-q/--quiet` flag to the Prowler command:
When using the **AWS Security Hub** integration you can send only the `FAIL` findings generated by **Prowler**, so the **AWS Security Hub** usage costs would eventually be lower. To follow that recommendation you can add the `--status FAIL` flag to the Prowler command:
```sh
prowler --security-hub --quiet
prowler --security-hub --status FAIL
```
You can use, instead of the `-q/--quiet` argument, the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but just to send FAIL findings to AWS Security Hub:
Instead of the `--status FAIL` argument, you can use the `--send-sh-only-fails` argument to save all the findings in the Prowler outputs but send only FAIL findings to AWS Security Hub:
```sh
prowler --security-hub --send-sh-only-fails

View File

@@ -0,0 +1,34 @@
# How to create Prowler Service Principal
To allow Prowler to assume an identity and start the scan with the required privileges, it is necessary to create a Service Principal. To create one, follow these steps:
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"
3. Once there, in the menu bar click on "+ New registration" to register a new application
4. Fill in the "Name", select the "Supported account types" and click on "Register". You will be redirected to the applications page.
5. Once in the application page, in the left menu bar, select "Certificates & secrets"
6. In the "Certificates & secrets" view, click on "+ New client secret"
7. Fill the "Description" and "Expires" fields and click on "Add"
8. Copy the value of the secret, it is going to be used as `AZURE_CLIENT_SECRET` environment variable.
![Register an Application page](../../img/create-sp.gif)
## Assigning the proper permissions
To allow Prowler to retrieve metadata from the assumed identity and run specific Entra checks, the following permissions need to be assigned:
1. Access to Microsoft Entra ID
2. In the left menu bar, go to "App registrations"
3. Once there, select the application that you have created
4. In the left menu bar, select "API permissions"
5. Then click on "+ Add a permission" and select "Microsoft Graph"
6. Once in the "Microsoft Graph" view, select "Application permissions"
7. Search for "Directory", "Policy" and "UserAuthenticationMethod" and select the following permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
8. Click on "Add permissions" to apply the new permissions.
9. Finally, click on "Grant admin consent for [your tenant]" to apply the permissions.
![EntraID Permissions](../../img/AAD-permissions.png)

View File

@@ -1,6 +1,6 @@
# Azure subscriptions scope
By default, Prowler is multisubscription, which means that it is going to scan all the subscriptions it is able to list. If you only assign permissions to one subscription, it is going to scan a single one.
Prowler also has the ability to limit the scan to a set of subscriptions passed as an input argument. To do so:
```console
@@ -8,3 +8,36 @@ prowler azure --az-cli-auth --subscription-ids <subscription ID 1> <subscription
```
Where you can pass from 1 up to N subscriptions to be scanned.
## Assigning proper permissions
Regarding the subscription scope, Prowler by default scans all subscriptions that it is able to list, so it is necessary to add the `Reader` RBAC built-in role per subscription or management group (recommended for multiple subscriptions, see the [next section](#recommendation-for-multiple-subscriptions)) to the entity that will be adopted by the tool.
To assign this role, follow these instructions:
1. In the Azure portal, select your subscription.
2. Select "Access control (IAM)".
3. In the overview, select "Roles".
4. Click on "+ Add" and select "Add role assignment".
5. In the search bar, type `Reader`, select it and click on "Next".
6. In the Members tab, click on "+ Select members" and add the members you want to assign this role.
7. Click on "Review + assign" to apply the new role.
![Add reader role to subscription](../../img/add-reader-role.gif)
Moreover, some additional read-only permissions are needed for certain checks. For those checks that are not covered by built-in roles we use a custom role, defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json). Once the custom role is created, repeat the steps mentioned above to assign the new `ProwlerRole` to an identity.
## Recommendation for multiple subscriptions
Scanning multiple subscriptions can be tedious, since roles have to be created and assigned for each one. For this reason, in Prowler we recommend the usage of *[management groups](https://learn.microsoft.com/en-us/azure/governance/management-groups/overview)* to group all subscriptions that are going to be audited by Prowler.
To do this properly you have to [create a new management group](https://learn.microsoft.com/en-us/azure/governance/management-groups/create-management-group-portal) and add all the roles in the same way as was done at the subscription scope.
![Create management group](../../img/create-management-group.gif)
Once the management group is properly set you can add all the subscriptions that you want to audit.
![Add subscription to management group](../../img/add-sub-to-management-group.gif)
???+ note
By default, `prowler` will scan all subscriptions in the Azure tenant, use the flag `--subscription-id` to specify the subscriptions to be scanned.

View File

@@ -13,35 +13,55 @@ The following list includes all the AWS checks with configurable variables that
| Check Name | Value | Type |
|---------------------------------------------------------------|--------------------------------------------------|-----------------|
| `acm_certificates_expiration_check` | `days_to_expire_threshold` | Integer |
| `appstream_fleet_maximum_session_duration` | `max_session_duration_seconds` | Integer |
| `appstream_fleet_session_disconnect_timeout` | `max_disconnect_timeout_in_seconds` | Integer |
| `appstream_fleet_session_idle_disconnect_timeout` | `max_idle_disconnect_timeout_in_seconds` | Integer |
| `autoscaling_find_secrets_ec2_launch_configuration` | `secrets_ignore_patterns` | List of Strings |
| `awslambda_function_no_secrets_in_code` | `secrets_ignore_patterns` | List of Strings |
| `awslambda_function_no_secrets_in_variables` | `secrets_ignore_patterns` | List of Strings |
| `awslambda_function_using_supported_runtimes` | `obsolete_lambda_runtimes` | Integer |
| `awslambda_function_vpc_is_in_multi_azs` | `lambda_min_azs` | Integer |
| `cloudformation_stack_outputs_find_secrets` | `secrets_ignore_patterns` | List of Strings |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_actions` | List of Strings |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_entropy` | Integer |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_minutes` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_actions` | List of Strings |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_entropy` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_minutes` | Integer |
| `cloudwatch_log_group_no_secrets_in_logs` | `secrets_ignore_patterns` | List of Strings |
| `cloudwatch_log_group_retention_policy_specific_days_enabled` | `log_group_retention_days` | Integer |
| `codebuild_project_no_secrets_in_variables` | `excluded_sensitive_environment_variables` | List of Strings |
| `codebuild_project_no_secrets_in_variables` | `secrets_ignore_patterns` | List of Strings |
| `config_recorder_all_regions_enabled` | `mute_non_default_regions` | Boolean |
| `drs_job_exist` | `mute_non_default_regions` | Boolean |
| `ec2_elastic_ip_shodan` | `shodan_api_key` | String |
| `ec2_instance_older_than_specific_days` | `max_ec2_instance_age_in_days` | Integer |
| `ec2_instance_secrets_user_data` | `secrets_ignore_patterns` | List of Strings |
| `ec2_launch_template_no_secrets` | `secrets_ignore_patterns` | List of Strings |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_instance_owners` | List of Strings |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_interface_types` | List of Strings |
| `ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports` | `ec2_sg_high_risk_ports` | List of Integers |
| `ec2_securitygroup_with_many_ingress_egress_rules` | `max_security_group_rules` | Integer |
| `ecs_task_definitions_no_environment_secrets` | `secrets_ignore_patterns` | List of Strings |
| `ecr_repositories_scan_vulnerabilities_in_latest_image` | `ecr_repository_vulnerability_minimum_severity` | String |
| `eks_cluster_uses_a_supported_version` | `eks_cluster_oldest_version_supported` | String |
| `eks_control_plane_logging_all_types_enabled` | `eks_required_log_types` | List of Strings |
| `elb_is_in_multiple_az` | `elb_min_azs` | Integer |
| `elbv2_is_in_multiple_az` | `elbv2_min_azs` | Integer |
| `guardduty_is_enabled` | `mute_non_default_regions` | Boolean |
| `iam_user_accesskey_unused` | `max_unused_access_keys_days` | Integer |
| `iam_user_console_access_unused` | `max_console_access_days` | Integer |
| `ec2_elastic_ip_shodan` | `shodan_api_key` | String |
| `ec2_securitygroup_with_many_ingress_egress_rules` | `max_security_group_rules` | Integer |
| `ec2_instance_older_than_specific_days` | `max_ec2_instance_age_in_days` | Integer |
| `organizations_delegated_administrators` | `organizations_trusted_delegated_administrators` | List of Strings |
| `organizations_scp_check_deny_regions` | `organizations_enabled_regions` | List of Strings |
| `rds_instance_backup_enabled` | `check_rds_instance_replicas` | Boolean |
| `securityhub_enabled` | `mute_non_default_regions` | Boolean |
| `ssm_document_secrets` | `secrets_ignore_patterns` | List of Strings |
| `trustedadvisor_premium_support_plan_subscribed` | `verify_premium_support_plans` | Boolean |
| `vpc_endpoint_connections_trust_boundaries` | `trusted_account_ids` | List of Strings |
| `vpc_endpoint_services_allowed_principals_trust_boundaries` | `trusted_account_ids` | List of Strings |
| `cloudwatch_log_group_retention_policy_specific_days_enabled` | `log_group_retention_days` | Integer |
| `appstream_fleet_session_idle_disconnect_timeout` | `max_idle_disconnect_timeout_in_seconds` | Integer |
| `appstream_fleet_session_disconnect_timeout` | `max_disconnect_timeout_in_seconds` | Integer |
| `appstream_fleet_maximum_session_duration` | `max_session_duration_seconds` | Integer |
| `awslambda_function_using_supported_runtimes` | `obsolete_lambda_runtimes` | Integer |
| `organizations_scp_check_deny_regions` | `organizations_enabled_regions` | List of Strings |
| `organizations_delegated_administrators` | `organizations_trusted_delegated_administrators` | List of Strings |
| `ecr_repositories_scan_vulnerabilities_in_latest_image` | `ecr_repository_vulnerability_minimum_severity` | String |
| `trustedadvisor_premium_support_plan_subscribed` | `verify_premium_support_plans` | Boolean |
| `config_recorder_all_regions_enabled` | `mute_non_default_regions` | Boolean |
| `drs_job_exist` | `mute_non_default_regions` | Boolean |
| `guardduty_is_enabled` | `mute_non_default_regions` | Boolean |
| `securityhub_enabled` | `mute_non_default_regions` | Boolean |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_entropy` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_minutes` | Integer |
| `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_actions` | List of Strings |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_entropy` | Integer |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_minutes` | Integer |
| `cloudtrail_threat_detection_enumeration` | `threat_detection_enumeration_actions` | List of Strings |
| `rds_instance_backup_enabled` | `check_rds_instance_replicas` | Boolean |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_interface_types` | List of Strings |
| `ec2_securitygroup_allow_ingress_from_internet_to_any_port` | `ec2_allowed_instance_owners` | List of Strings |
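For reference, a check reads these variables at runtime from its service client's `audit_config`, with the built-in default as fallback; a minimal sketch of the pattern shown earlier in this changeset:
```python
# Sketch of the configurable-check pattern: read the variable from config.yaml
# via audit_config, falling back to the built-in default (50) when absent.
max_security_group_rules = ec2_client.audit_config.get(
    "max_security_group_rules", 50
)
```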
## Azure
### Configurable Checks
@@ -80,10 +100,20 @@ The following list includes all the Azure checks with configurable variables tha
```yaml title="config.yaml"
# AWS Configuration
aws:
# AWS Global Configuration
# aws.mute_non_default_regions --> Mute Failed Findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
# aws.mute_non_default_regions --> Set to True to mute failed findings in non-default regions for AccessAnalyzer, GuardDuty, SecurityHub, DRS and Config
mute_non_default_regions: False
# If you want to mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w mutelist.yaml`:
# Mutelist:
# Accounts:
# "*":
# Checks:
# "*":
# Regions:
# - "ap-southeast-1"
# - "ap-southeast-2"
# Resources:
# - "*"
# AWS IAM Configuration
# aws.iam_user_accesskey_unused --> CIS recommends 45 days
@@ -93,6 +123,7 @@ aws:
# AWS EC2 Configuration
# aws.ec2_elastic_ip_shodan
# TODO: create common config
shodan_api_key: null
# aws.ec2_securitygroup_with_many_ingress_egress_rules --> by default is 50 rules
max_security_group_rules: 50
@@ -102,16 +133,32 @@ aws:
# allowed network interface types for security groups open to the Internet
ec2_allowed_interface_types:
[
"api_gateway_managed",
"vpc_endpoint",
"api_gateway_managed",
"vpc_endpoint",
]
# allowed network interface owners for security groups open to the Internet
ec2_allowed_instance_owners:
[
"amazon-elb"
"amazon-elb"
]
# aws.ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports
ec2_sg_high_risk_ports:
[
25,
110,
135,
143,
445,
3000,
4333,
5000,
5500,
8080,
8088,
]
# AWS VPC Configuration (vpc_endpoint_connections_trust_boundaries, vpc_endpoint_services_allowed_principals_trust_boundaries)
# AWS SSM Configuration (aws.ssm_documents_set_as_public)
# Single account environment: No action required. The AWS account number will be automatically added by the checks.
# Multi account environment: Any additional trusted account number should be added to the list, e.g.
# trusted_account_ids : ["123456789012", "098765432109", "678901234567"]
@@ -133,205 +180,246 @@ aws:
# aws.awslambda_function_using_supported_runtimes
obsolete_lambda_runtimes:
[
"java8",
"go1.x",
"provided",
"python3.6",
"python2.7",
"python3.7",
"nodejs4.3",
"nodejs4.3-edge",
"nodejs6.10",
"nodejs",
"nodejs8.10",
"nodejs10.x",
"nodejs12.x",
"nodejs14.x",
"dotnet5.0",
"dotnetcore1.0",
"dotnetcore2.0",
"dotnetcore2.1",
"dotnetcore3.1",
"ruby2.5",
"ruby2.7",
]
# AWS Organizations
# organizations_scp_check_deny_regions
# organizations_enabled_regions: [
# 'eu-central-1',
# 'eu-west-1',
# aws.organizations_scp_check_deny_regions
# aws.organizations_enabled_regions: [
# "eu-central-1",
# "eu-west-1",
# "us-east-1"
# ]
organizations_enabled_regions: []
organizations_trusted_delegated_administrators: []
# AWS ECR
# ecr_repositories_scan_vulnerabilities_in_latest_image
# aws.ecr_repositories_scan_vulnerabilities_in_latest_image
# CRITICAL
# HIGH
# MEDIUM
ecr_repository_vulnerability_minimum_severity: "MEDIUM"
# AWS Trusted Advisor
# trustedadvisor_premium_support_plan_subscribed
# aws.trustedadvisor_premium_support_plan_subscribed
verify_premium_support_plans: True
# AWS CloudTrail Configuration
# aws.cloudtrail_threat_detection_privilege_escalation
threat_detection_privilege_escalation_entropy: 0.7 # Percentage of actions found to decide if it is an privilege_escalation attack event, by default is 0.7 (70%)
threat_detection_privilege_escalation_threshold: 0.1 # Percentage of actions found to decide if it is a privilege_escalation attack event, by default 0.1 (10%)
threat_detection_privilege_escalation_minutes: 1440 # Past minutes to search back from now for privilege_escalation attacks, by default 1440 minutes (24 hours)
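# i.e. a finding is raised when, within the minutes window above, the fraction
# of the actions listed below seen in CloudTrail reaches the threshold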
threat_detection_privilege_escalation_actions: [
"AddPermission",
"AddRoleToInstanceProfile",
"AddUserToGroup",
"AssociateAccessPolicy",
"AssumeRole",
"AttachGroupPolicy",
"AttachRolePolicy",
"AttachUserPolicy",
"ChangePassword",
"CreateAccessEntry",
"CreateAccessKey",
"CreateDevEndpoint",
"CreateEventSourceMapping",
"CreateFunction",
"CreateGroup",
"CreateJob",
"CreateKeyPair",
"CreateLoginProfile",
"CreatePipeline",
"CreatePolicyVersion",
"CreateRole",
"CreateStack",
"DeleteRolePermissionsBoundary",
"DeleteRolePolicy",
"DeleteUserPermissionsBoundary",
"DeleteUserPolicy",
"DetachRolePolicy",
"DetachUserPolicy",
"GetCredentialsForIdentity",
"GetId",
"GetPolicyVersion",
"GetUserPolicy",
"Invoke",
"ModifyInstanceAttribute",
"PassRole",
"PutGroupPolicy",
"PutPipelineDefinition",
"PutRolePermissionsBoundary",
"PutRolePolicy",
"PutUserPermissionsBoundary",
"PutUserPolicy",
"ReplaceIamInstanceProfileAssociation",
"RunInstances",
"SetDefaultPolicyVersion",
"UpdateAccessKey",
"UpdateAssumeRolePolicy",
"UpdateDevEndpoint",
"UpdateEventSourceMapping",
"UpdateFunctionCode",
"UpdateJob",
"UpdateLoginProfile",
]
threat_detection_privilege_escalation_actions:
[
"AddPermission",
"AddRoleToInstanceProfile",
"AddUserToGroup",
"AssociateAccessPolicy",
"AssumeRole",
"AttachGroupPolicy",
"AttachRolePolicy",
"AttachUserPolicy",
"ChangePassword",
"CreateAccessEntry",
"CreateAccessKey",
"CreateDevEndpoint",
"CreateEventSourceMapping",
"CreateFunction",
"CreateGroup",
"CreateJob",
"CreateKeyPair",
"CreateLoginProfile",
"CreatePipeline",
"CreatePolicyVersion",
"CreateRole",
"CreateStack",
"DeleteRolePermissionsBoundary",
"DeleteRolePolicy",
"DeleteUserPermissionsBoundary",
"DeleteUserPolicy",
"DetachRolePolicy",
"DetachUserPolicy",
"GetCredentialsForIdentity",
"GetId",
"GetPolicyVersion",
"GetUserPolicy",
"Invoke",
"ModifyInstanceAttribute",
"PassRole",
"PutGroupPolicy",
"PutPipelineDefinition",
"PutRolePermissionsBoundary",
"PutRolePolicy",
"PutUserPermissionsBoundary",
"PutUserPolicy",
"ReplaceIamInstanceProfileAssociation",
"RunInstances",
"SetDefaultPolicyVersion",
"UpdateAccessKey",
"UpdateAssumeRolePolicy",
"UpdateDevEndpoint",
"UpdateEventSourceMapping",
"UpdateFunctionCode",
"UpdateJob",
"UpdateLoginProfile",
]
# aws.cloudtrail_threat_detection_enumeration
threat_detection_enumeration_entropy: 0.7 # Percentage of actions found to decide if it is an enumeration attack event, by default is 0.7 (70%)
threat_detection_enumeration_threshold: 0.1 # Percentage of actions found to decide if it is an enumeration attack event, by default 0.1 (10%)
threat_detection_enumeration_minutes: 1440 # Past minutes to search back from now for enumeration attacks, by default 1440 minutes (24 hours)
threat_detection_enumeration_actions: [
"DescribeAccessEntry",
"DescribeAccountAttributes",
"DescribeAvailabilityZones",
"DescribeBundleTasks",
"DescribeCarrierGateways",
"DescribeClientVpnRoutes",
"DescribeCluster",
"DescribeDhcpOptions",
"DescribeFlowLogs",
"DescribeImages",
"DescribeInstanceAttribute",
"DescribeInstanceInformation",
"DescribeInstanceTypes",
"DescribeInstances",
"DescribeInstances",
"DescribeKeyPairs",
"DescribeLogGroups",
"DescribeLogStreams",
"DescribeOrganization",
"DescribeRegions",
"DescribeSecurityGroups",
"DescribeSnapshotAttribute",
"DescribeSnapshotTierStatus",
"DescribeSubscriptionFilters",
"DescribeTransitGatewayMulticastDomains",
"DescribeVolumes",
"DescribeVolumesModifications",
"DescribeVpcEndpointConnectionNotifications",
"DescribeVpcs",
"GetAccount",
"GetAccountAuthorizationDetails",
"GetAccountSendingEnabled",
"GetBucketAcl",
"GetBucketLogging",
"GetBucketPolicy",
"GetBucketReplication",
"GetBucketVersioning",
"GetCallerIdentity",
"GetCertificate",
"GetConsoleScreenshot",
"GetCostAndUsage",
"GetDetector",
"GetEbsDefaultKmsKeyId",
"GetEbsEncryptionByDefault",
"GetFindings",
"GetFlowLogsIntegrationTemplate",
"GetIdentityVerificationAttributes",
"GetInstances",
"GetIntrospectionSchema",
"GetLaunchTemplateData",
"GetLaunchTemplateData",
"GetLogRecord",
"GetParameters",
"GetPolicyVersion",
"GetPublicAccessBlock",
"GetQueryResults",
"GetRegions",
"GetSMSAttributes",
"GetSMSSandboxAccountStatus",
"GetSendQuota",
"GetTransitGatewayRouteTableAssociations",
"GetUserPolicy",
"HeadObject",
"ListAccessKeys",
"ListAccounts",
"ListAllMyBuckets",
"ListAssociatedAccessPolicies",
"ListAttachedUserPolicies",
"ListClusters",
"ListDetectors",
"ListDomains",
"ListFindings",
"ListHostedZones",
"ListIPSets",
"ListIdentities",
"ListInstanceProfiles",
"ListObjects",
"ListOrganizationalUnitsForParent",
"ListOriginationNumbers",
"ListPolicyVersions",
"ListRoles",
"ListRoles",
"ListRules",
"ListServiceQuotas",
"ListSubscriptions",
"ListTargetsByRule",
"ListTopics",
"ListUsers",
"LookupEvents",
"Search",
]
threat_detection_enumeration_actions:
[
"DescribeAccessEntry",
"DescribeAccountAttributes",
"DescribeAvailabilityZones",
"DescribeBundleTasks",
"DescribeCarrierGateways",
"DescribeClientVpnRoutes",
"DescribeCluster",
"DescribeDhcpOptions",
"DescribeFlowLogs",
"DescribeImages",
"DescribeInstanceAttribute",
"DescribeInstanceInformation",
"DescribeInstanceTypes",
"DescribeInstances",
"DescribeInstances",
"DescribeKeyPairs",
"DescribeLogGroups",
"DescribeLogStreams",
"DescribeOrganization",
"DescribeRegions",
"DescribeSecurityGroups",
"DescribeSnapshotAttribute",
"DescribeSnapshotTierStatus",
"DescribeSubscriptionFilters",
"DescribeTransitGatewayMulticastDomains",
"DescribeVolumes",
"DescribeVolumesModifications",
"DescribeVpcEndpointConnectionNotifications",
"DescribeVpcs",
"GetAccount",
"GetAccountAuthorizationDetails",
"GetAccountSendingEnabled",
"GetBucketAcl",
"GetBucketLogging",
"GetBucketPolicy",
"GetBucketReplication",
"GetBucketVersioning",
"GetCallerIdentity",
"GetCertificate",
"GetConsoleScreenshot",
"GetCostAndUsage",
"GetDetector",
"GetEbsDefaultKmsKeyId",
"GetEbsEncryptionByDefault",
"GetFindings",
"GetFlowLogsIntegrationTemplate",
"GetIdentityVerificationAttributes",
"GetInstances",
"GetIntrospectionSchema",
"GetLaunchTemplateData",
"GetLaunchTemplateData",
"GetLogRecord",
"GetParameters",
"GetPolicyVersion",
"GetPublicAccessBlock",
"GetQueryResults",
"GetRegions",
"GetSMSAttributes",
"GetSMSSandboxAccountStatus",
"GetSendQuota",
"GetTransitGatewayRouteTableAssociations",
"GetUserPolicy",
"HeadObject",
"ListAccessKeys",
"ListAccounts",
"ListAllMyBuckets",
"ListAssociatedAccessPolicies",
"ListAttachedUserPolicies",
"ListClusters",
"ListDetectors",
"ListDomains",
"ListFindings",
"ListHostedZones",
"ListIPSets",
"ListIdentities",
"ListInstanceProfiles",
"ListObjects",
"ListOrganizationalUnitsForParent",
"ListOriginationNumbers",
"ListPolicyVersions",
"ListRoles",
"ListRoles",
"ListRules",
"ListServiceQuotas",
"ListSubscriptions",
"ListTargetsByRule",
"ListTopics",
"ListUsers",
"LookupEvents",
"Search",
]
# AWS RDS Configuration
# aws.rds_instance_backup_enabled
# Whether to check RDS instance replicas or not
check_rds_instance_replicas: False
# AWS ACM Configuration
# aws.acm_certificates_expiration_check
days_to_expire_threshold: 7
# AWS EKS Configuration
# aws.eks_control_plane_logging_all_types_enabled
# EKS control plane logging types that must be enabled
eks_required_log_types:
[
"api",
"audit",
"authenticator",
"controllerManager",
"scheduler",
]
# aws.eks_cluster_uses_a_supported_version
# EKS clusters must be version 1.28 or higher
eks_cluster_oldest_version_supported: "1.28"
# AWS CodeBuild Configuration
# aws.codebuild_project_no_secrets_in_variables
# CodeBuild sensitive variables that are excluded from the check
excluded_sensitive_environment_variables:
[
]
# Azure Configuration
azure:
# Azure Network Configuration
# azure.network_public_ip_shodan
# TODO: create common config
shodan_api_key: null
# Azure App Configuration
# Azure App Service
# azure.app_ensure_php_version_is_latest
php_latest_version: "8.2"
# azure.app_ensure_python_version_is_latest
@@ -345,4 +433,34 @@ gcp:
# gcp.compute_public_address_shodan
shodan_api_key: null
# Kubernetes Configuration
kubernetes:
# Kubernetes API Server
# kubernetes.apiserver_audit_log_maxbackup_set
audit_log_maxbackup: 10
# kubernetes.apiserver_audit_log_maxsize_set
audit_log_maxsize: 100
# kubernetes.apiserver_audit_log_maxage_set
audit_log_maxage: 30
# kubernetes.apiserver_strong_ciphers_only
apiserver_strong_ciphers:
[
"TLS_AES_128_GCM_SHA256",
"TLS_AES_256_GCM_SHA384",
"TLS_CHACHA20_POLY1305_SHA256",
]
# Kubelet
# kubernetes.kubelet_strong_ciphers_only
kubelet_strong_ciphers:
[
"TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256",
"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256",
"TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305",
"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384",
"TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305",
"TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384",
"TLS_RSA_WITH_AES_256_GCM_SHA384",
"TLS_RSA_WITH_AES_128_GCM_SHA256",
]
```

View File

@@ -54,7 +54,7 @@ CustomChecksMetadata:
RelatedUrl: https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html
Remediation:
Code:
CLI: aws s3api put-bucket-versioning --bucket <bucket-name> --versioning-configuration Status=Enabled
CLI: aws s3api put-bucket-versioning --bucket <bucket-name> --versioning-configuration Status=Enabled,MFADelete=Enabled
NativeIaC: https://aws.amazon.com/es/s3/features/versioning/
Other: https://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html
Terraform: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket_versioning

View File

@@ -10,9 +10,11 @@ prowler dashboard
To run Prowler local dashboard with Docker, use:
```sh
docker run --env HOST=0.0.0.0 --publish 127.0.0.1:11666:11666 toniblyx/prowler:latest dashboard
docker run -v /your/local/dir/prowler-output:/home/prowler/output --env HOST=0.0.0.0 --publish 127.0.0.1:11666:11666 toniblyx/prowler:latest dashboard
```
Make sure you update the `/your/local/dir/prowler-output` to match the path that contains your prowler output.
???+ note
**Remember that the `dashboard` server is not authenticated; if you expose it to the internet, you are running it at your own risk.**
@@ -81,7 +83,7 @@ def get_table(data):
## S3 Integration
If you are a Prowler Saas customer and you want to use your data from your S3 bucket, you can run:
If you are using the S3 integration, either from Prowler SaaS or from Prowler Open Source, and you want to use the data from your S3 bucket, you can run:
```sh
aws s3 cp s3://<your-bucket>/output/csv ./output --recursive

View File

@@ -13,7 +13,7 @@ prowler <provider> -c <check_to_fix_1> <check_to_fix_2> ... --fixer
```sh
prowler <provider> --list-fixers
```
It's important to note that using the fixers for `Access Analyzer`, `GuardDuty`, and `SecurityHub` may incur additional costs. These AWS services might trigger actions or deploy resources that can lead to charges on your AWS account.
## Writing a Fixer
To write a fixer, you need to create a file called `<check_id>_fixer.py` inside the check folder, with a function called `fixer` that receives either the region or the resource to be fixed as a parameter, and returns a boolean value indicating if the fix was successful or not.
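For example, a fixer taking the region as its parameter could look like the following sketch; the service, the client attribute and the API call are illustrative, not a real Prowler fixer:
```python
# Hypothetical <check_id>_fixer.py: the names below are illustrative only.
from prowler.lib.logger import logger
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


def fixer(region: str) -> bool:
    """Enable EBS encryption by default in the given region.

    Returns True if the fix succeeded, False otherwise.
    """
    try:
        regional_client = ec2_client.regional_clients[region]
        return regional_client.enable_ebs_encryption_by_default()[
            "EbsEncryptionByDefault"
        ]
    except Exception as error:
        logger.error(f"{region} -- {error.__class__.__name__}: {error}")
        return False
```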

View File

@@ -25,7 +25,17 @@ Prowler will follow the same credentials search as [Google authentication librar
Those credentials must be associated with a user or service account with proper permissions to run all checks. To ensure this, add the `Viewer` role to the member associated with the credentials.
# GCP Service APIs
## Impersonate Service Account
If you want to impersonate a GCP service account, you can use the `--impersonate-service-account` argument:
```console
prowler gcp --impersonate-service-account <service-account-email>
```
This argument will use the default credentials to impersonate the service account provided.
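This is roughly standard Google credential impersonation under the hood; a minimal sketch with the `google-auth` library (the service account e-mail is a placeholder, and Prowler's internal wiring may differ):
```python
# Illustrative only: impersonating a service account from default credentials.
import google.auth
from google.auth import impersonated_credentials

source_credentials, project_id = google.auth.default()
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal="prowler-scan@my-project.iam.gserviceaccount.com",
    target_scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
```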
## Service APIs
Prowler will use the Google Cloud APIs to get the information needed to perform the checks. Make sure that the following APIs are enabled in the project:

Binary file not shown. (image added: 4.5 MiB)

Binary file not shown. (image changed: 24 KiB before, 26 KiB after)

View File

@@ -0,0 +1,20 @@
# In-Cluster Execution
For in-cluster execution, you can use the supplied yaml files inside `/kubernetes`:
* [job.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/job.yaml)
* [prowler-role.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/prowler-role.yaml)
* [prowler-rolebinding.yaml](https://github.com/prowler-cloud/prowler/blob/master/kubernetes/prowler-rolebinding.yaml)
They can be used to run Prowler as a job within a new Prowler namespace:
```console
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
kubectl get pods --namespace prowler-ns --> prowler-XXXXX
kubectl logs prowler-XXXXX --namespace prowler-ns
```
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.

View File

@@ -0,0 +1,23 @@
# Miscellaneous
## Context Filtering
Prowler will scan the active Kubernetes context by default.
To specify the Kubernetes context to be scanned, use the `--context` flag followed by the desired context name. For example:
```console
prowler --context my-context
```
This will ensure that Prowler scans the specified context/cluster for vulnerabilities and misconfigurations.
## Namespace Filtering
By default, `prowler` will scan all namespaces in the context you specify.
To specify the namespace(s) to be scanned, use the `--namespace` flag followed by the desired namespace(s) separated by spaces. For example:
```console
prowler --namespace namespace1 namespace2
```

View File

@@ -0,0 +1,15 @@
# Non in-cluster execution
For non in-cluster execution, you can provide the location of the [kubeconfig](https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/) file with the following argument:
```console
prowler kubernetes --kubeconfig-file /path/to/kubeconfig
```
???+ note
If no `--kubeconfig-file` is provided, Prowler will use the default KubeConfig file location (`~/.kube/config`).
???+ note
`prowler` will scan the active Kubernetes context by default. Use the [`--context`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/context/) flag to specify the context to be scanned.
???+ note
By default, `prowler` will scan all namespaces in your active Kubernetes context. Use the [`--namespace`](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/kubernetes/namespace/) flag to specify the namespace(s) to be scanned.

View File

@@ -10,7 +10,7 @@ Execute Prowler in verbose mode (like in Version 2):
prowler <provider> --verbose
```
## Filter findings by status
Prowler can filter the findings by their status:
Prowler can filter the findings by their status, so that the CLI and the reports show only the findings with a specific status:
```console
prowler <provider> --status [PASS, FAIL, MANUAL]
```

View File

@@ -7,97 +7,155 @@ Mutelist option works along with other options and will modify the output in the
- CSV: `muted` is `True`. The field `status` will keep the original status, `MANUAL`, `PASS` or `FAIL`, of the finding.
You can use `-w`/`--mutelist-file` with the path of your mutelist yaml file:
## How the Mutelist Works
The **Mutelist** uses both "AND" and "OR" logic to determine which resources, checks, regions, and tags should be muted. For each check, the Mutelist evaluates whether the account, region, and resource match the specified criteria using "AND" logic. If tags are specified, the Mutelist can apply either "AND" or "OR" logic.
If any of the criteria do not match, the finding is not muted.
???+ note
Remember that mutelist can be used with regular expressions.
## Mutelist Specification
???+ note
- For Azure provider, the Account ID is the Subscription Name and the Region is the Location.
- For GCP provider, the Account ID is the Project ID and the Region is the Zone.
- For Kubernetes provider, the Account ID is the Cluster Name and the Region is the Namespace.
The Mutelist file uses the [YAML](https://en.wikipedia.org/wiki/YAML) format with the following syntax:
```yaml
### Account, Check and/or Region can be * to apply for all the cases.
### Resources and tags are lists that can have either Regex or Keywords.
### Tags is an optional list that matches on tuples of 'key=value' and are "ANDed" together.
### Use an alternation Regex to match one of multiple tags with "ORed" logic.
### For each check you can except Accounts, Regions, Resources and/or Tags.
########################### MUTELIST EXAMPLE ###########################
Mutelist:
Accounts:
"123456789012":
Checks:
"iam_user_hardware_mfa_enabled":
Regions:
- "us-east-1"
Resources:
- "user-1" # Will mute user-1 in check iam_user_hardware_mfa_enabled
- "user-2" # Will mute user-2 in check iam_user_hardware_mfa_enabled
"ec2_*":
Regions:
- "*"
Resources:
- "*" # Will mute every EC2 check in every account and region
"*":
Regions:
- "*"
Resources:
- "test"
Tags:
- "test=test" # Will mute every resource containing the string "test" and the tags 'test=test' and
- "project=test|project=stage" # either of ('project=test' OR project=stage) in account 123456789012 and every region
"*":
Regions:
- "*"
Resources:
- "test"
Tags:
- "test=test"
- "project=test" # This will mute every resource containing the string "test" and BOTH tags at the same time.
"*":
Regions:
- "*"
Resources:
- "test"
Tags: # This will mute every resource containing the string "test" and the ones that contain EITHER the `test=test` OR `project=test` OR `project=dev`
- "test=test|project=(test|dev)"
"*":
Regions:
- "*"
Resources:
- "test"
Tags:
- "test=test" # This will mute every resource containing the string "test" and the tags `test=test` and either `project=test` OR `project=stage` in every account and region.
- "project=test|project=stage"
"*":
Checks:
"s3_bucket_object_versioning":
Regions:
- "eu-west-1"
- "us-east-1"
Resources:
- "ci-logs" # Will mute bucket "ci-logs" AND ALSO bucket "ci-logs-replica" in specified check and regions
- "logs" # Will mute EVERY BUCKET containing the string "logs" in specified check and regions
- ".+-logs" # Will mute all buckets containing the terms ci-logs, qa-logs, etc. in specified check and regions
"ecs_task_definitions_no_environment_secrets":
Regions:
- "*"
Resources:
- "*"
Exceptions:
Accounts:
- "0123456789012"
Regions:
- "eu-west-1"
- "eu-south-2" # Will mute every resource in check ecs_task_definitions_no_environment_secrets except the ones in account 0123456789012 located in eu-south-2 or eu-west-1
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "environment=dev" # Will mute every resource containing the tag 'environment=dev' in every account and region
"123456789012":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Exceptions:
Resources:
- "test"
Tags:
- "environment=prod" # Will mute every resource except in account 123456789012 except the ones containing the string "test" and tag environment=prod
"*":
Checks:
"ec2_*":
Regions:
- "*"
Resources:
- "test-resource" # Will mute the resource "test-resource" in all accounts and regions for whatever check from the EC2 service
```
### Account, Check, Region, Resource, and Tag
| Field | Description | Logic |
|----------|----------|----------|
| `account_id` | Use `*` to apply the mutelist to all accounts. | `ANDed` |
| `check_name` | The name of the Prowler check. Use `*` to apply the mutelist to all checks, or `service_*` to apply it to all of a service's checks. | `ANDed` |
| `region` | The region identifier. Use `*` to apply the mutelist to all regions. | `ANDed` |
| `resource` | The resource identifier. Use `*` to apply the mutelist to all resources. | `ANDed` |
| `tag` | The tag value. | `ORed` |
## How to Use the Mutelist
To use the Mutelist, you need to specify the path to the Mutelist YAML file using the `-w` or `--mutelist-file` option when running Prowler:
```console
prowler <provider> -w mutelist.yaml
```
Replace `<provider>` with the appropriate provider name.
## Considerations
- The Mutelist can be used in combination with other Prowler options, such as the `--service` or `--checks` option, to further customize the scanning process.
- Make sure to review and update the Mutelist regularly to ensure it reflects the desired exclusions and remains up to date with your infrastructure.
## AWS Mutelist
### Mute specific AWS regions
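A sketch of such an entry (the region names are illustrative), muting every check in selected regions across all accounts:
```yaml
Mutelist:
  Accounts:
    "*":
      Checks:
        "*":
          Regions:
            - "ap-southeast-1"
            - "ap-southeast-2"
          Resources:
            - "*"
```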


@@ -125,7 +125,7 @@ The JSON-OCSF output format implements the [Detection Finding](https://schema.oc
"product": {
"name": "Prowler",
"vendor_name": "Prowler",
"version": "4.2.2"
"version": "4.2.4"
},
"version": "1.1.0"
},


@@ -11,6 +11,12 @@ prowler <provider> --scan-unused-services
## Services that are ignored
### AWS
#### ACM
You can have certificates in ACM that are not in use by any AWS resource.
Prowler checks whether every certificate is about to expire but, by default, a certificate that is not in use is not evaluated to see whether it is expired, close to expiring, or still valid.
- `acm_certificates_expiration_check`
#### Athena
When you create an AWS Account, Athena will create a default primary workgroup for you.
Prowler checks whether that workgroup is enabled and whether it is being used, by looking for queries run in the last 45 days.
@@ -30,10 +36,11 @@ If EBS default encyption is not enabled, sensitive information at rest is not pr
- `ec2_ebs_default_encryption`
If your security groups are not properly configured, the attack surface increases; nonetheless, Prowler detects which security groups are in use (attached) so that it only reports on those. This logic applies to the 15 checks related to open ports in security groups, the check for the default security group, and the check for security groups that allow wide-open ingress and egress traffic.
- `ec2_securitygroup_allow_ingress_from_internet_to_port_X` (15 checks)
- `ec2_securitygroup_default_restrict_traffic`
- `ec2_securitygroup_allow_wide_open_public_ipv4`
Prowler also checks which Network ACLs are in use, so that it only alerts on those with open ports that are actually being used.
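To have Prowler scan these ignored resources anyway, the flag shown at the top of this page can be passed explicitly, for example:
```console
prowler aws --scan-unused-services
```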


@@ -83,9 +83,14 @@ nav:
- Authentication: tutorials/azure/authentication.md
- Non default clouds: tutorials/azure/use-non-default-cloud.md
- Subscriptions: tutorials/azure/subscriptions.md
- Create Prowler Service Principal: tutorials/azure/create-prowler-service-principal.md
- Google Cloud:
- Authentication: tutorials/gcp/authentication.md
- Projects: tutorials/gcp/projects.md
- Kubernetes:
- In-Cluster Execution: tutorials/kubernetes/in-cluster.md
- Non In-Cluster Execution: tutorials/kubernetes/outside-cluster.md
- Miscellaneous: tutorials/kubernetes/misc.md
- Developer Guide:
- Introduction: developer-guide/introduction.md
- Provider: developer-guide/provider.md


@@ -58,20 +58,29 @@ Resources:
- 'account:Get*'
- 'appstream:Describe*'
- 'appstream:List*'
- 'backup:List*'
- 'cloudtrail:GetInsightSelectors'
- 'codeartifact:List*'
- 'codebuild:BatchGet*'
- 'cognito-idp:GetUserPoolMfaConfig'
- 'dlm:Get*'
- 'drs:Describe*'
- 'ds:Get*'
- 'ds:Describe*'
- 'ds:List*'
- 'dynamodb:GetResourcePolicy'
- 'ec2:GetEbsEncryptionByDefault'
- 'ec2:GetSnapshotBlockPublicAccessState'
- 'ec2:GetInstanceMetadataDefaults'
- 'ecr:Describe*'
- 'ecr:GetRegistryScanningConfiguration'
- 'elasticfilesystem:DescribeBackupPolicy'
- 'glue:GetConnections'
- 'glue:GetSecurityConfiguration*'
- 'glue:SearchTables'
- 'lambda:GetFunction*'
- 'logs:FilterLogEvents'
- 'lightsail:GetRelationalDatabases'
- 'macie2:GetMacieSession'
- 's3:GetAccountPublicAccessBlock'
- 'shield:DescribeProtection'
@@ -79,8 +88,10 @@ Resources:
- 'securityhub:BatchImportFindings'
- 'securityhub:GetFindings'
- 'ssm:GetDocument'
- 'ssm-incidents:List*'
- 'support:Describe*'
- 'tag:GetTagKeys'
- 'wellarchitected:List*'
Resource: '*'
- PolicyName: ProwlerScanRoleAdditionalViewPrivilegesApiGateway
PolicyDocument:


@@ -16,7 +16,10 @@
"ds:Get*",
"ds:Describe*",
"ds:List*",
"dynamodb:GetResourcePolicy",
"ec2:GetEbsEncryptionByDefault",
"ec2:GetSnapshotBlockPublicAccessState",
"ec2:GetInstanceMetadataDefaults",
"ecr:Describe*",
"ecr:GetRegistryScanningConfiguration",
"elasticfilesystem:DescribeBackupPolicy",
@@ -25,6 +28,7 @@
"glue:SearchTables",
"lambda:GetFunction*",
"logs:FilterLogEvents",
"lightsail:GetRelationalDatabases",
"macie2:GetMacieSession",
"s3:GetAccountPublicAccessBlock",
"shield:DescribeProtection",


@@ -0,0 +1,20 @@
{
"properties": {
"roleName": "ProwlerRole",
"description": "Role used for checks that require read-only access to Azure resources and are not covered by the Reader role.",
"assignableScopes": [
"/"
],
"permissions": [
{
"actions": [
"Microsoft.Web/sites/host/listkeys/action",
"Microsoft.Web/sites/config/list/Action"
],
"notActions": [],
"dataActions": [],
"notDataActions": []
}
]
}
}
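Assuming this definition is saved locally as `prowler-role.json` (the filename is illustrative), the custom role could be created with the Azure CLI, for example:
```console
az role definition create --role-definition prowler-role.json
```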

poetry.lock (generated, 2727 lines): file diff suppressed because it is too large


@@ -6,11 +6,15 @@ from os import environ
from colorama import Fore, Style
from prowler.config.config import (
csv_file_suffix,
get_available_compliance_frameworks,
html_file_suffix,
json_asff_file_suffix,
json_ocsf_file_suffix,
)
from prowler.lib.banner import print_banner
from prowler.lib.check.check import (
bulk_load_checks_metadata,
bulk_load_compliance_frameworks,
exclude_checks_to_run,
exclude_services_to_run,
execute_checks,
@@ -30,27 +34,47 @@ from prowler.lib.check.check import (
)
from prowler.lib.check.checks_loader import load_checks_to_execute
from prowler.lib.check.compliance import update_checks_metadata_with_compliance
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.check.custom_checks_metadata import (
parse_custom_checks_metadata_file,
update_checks_metadata,
)
from prowler.lib.check.models import CheckMetadata
from prowler.lib.cli.parser import ProwlerArgumentParser
from prowler.lib.logger import logger, set_logging_config
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
from prowler.lib.outputs.compliance.cis.cis_aws import AWSCIS
from prowler.lib.outputs.compliance.cis.cis_azure import AzureCIS
from prowler.lib.outputs.compliance.cis.cis_gcp import GCPCIS
from prowler.lib.outputs.compliance.cis.cis_kubernetes import KubernetesCIS
from prowler.lib.outputs.compliance.compliance import display_compliance_table
from prowler.lib.outputs.compliance.ens.ens_aws import AWSENS
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.compliance.iso27001.iso27001_aws import AWSISO27001
from prowler.lib.outputs.compliance.kisa_ismsp.kisa_ismsp_aws import AWSKISAISMSP
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_aws import AWSMitreAttack
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_azure import (
AzureMitreAttack,
)
from prowler.lib.outputs.compliance.mitre_attack.mitre_attack_gcp import GCPMitreAttack
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
from prowler.lib.outputs.outputs import extract_findings_statistics
from prowler.lib.outputs.slack.slack import Slack
from prowler.lib.outputs.summary_table import display_summary_table
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.aws.models import AWSOutputOptions
from prowler.providers.azure.models import AzureOutputOptions
from prowler.providers.common.provider import Provider
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
from prowler.providers.gcp.models import GCPOutputOptions
from prowler.providers.kubernetes.models import KubernetesOutputOptions
def prowler():
@@ -112,7 +136,7 @@ def prowler():
# Load checks metadata
logger.debug("Loading checks metadata from .metadata.json files")
bulk_checks_metadata = CheckMetadata.get_bulk(provider)
if args.list_categories:
print_categories(list_categories(bulk_checks_metadata))
@@ -122,7 +146,7 @@ def prowler():
# Load compliance frameworks
logger.debug("Loading compliance frameworks from .json files")
bulk_compliance_frameworks = Compliance.get_bulk(provider)
# Complete checks metadata with the compliance framework specification
bulk_checks_metadata = update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
@@ -171,7 +195,7 @@ def prowler():
sys.exit()
# Provider to scan
Provider.init_global_provider(args)
global_provider = Provider.get_global_provider()
# Print Provider Credentials
@@ -180,7 +204,17 @@ def prowler():
# Import custom checks from folder
if checks_folder:
custom_checks = parse_checks_from_folder(global_provider, checks_folder)
# Workaround to be able to execute custom checks alongside all checks if nothing is explicitly set
if (
not checks_file
and not checks
and not services
and not severities
and not compliance_framework
and not categories
):
checks_to_execute.update(custom_checks)
# Exclude checks if -e/--excluded-checks
if excluded_checks:
@@ -195,7 +229,8 @@ def prowler():
# Once the provider is set and we have the eventual checks based on the resource identifier,
# it is time to check what Prowler's checks are going to be executed
checks_from_resources = global_provider.get_checks_to_execute_by_audit_resources()
# Intersect checks from resources with checks to execute so we only run the checks that apply to the resources with the specified ARNs or tags
if getattr(args, "resource_arn", None) or getattr(args, "resource_tag", None):
checks_to_execute = checks_to_execute.intersection(checks_from_resources)
# Sort final check list
@@ -205,7 +240,22 @@ def prowler():
global_provider.mutelist = args.mutelist_file
# Setup Output Options
if provider == "aws":
output_options = AWSOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "azure":
output_options = AzureOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "gcp":
output_options = GCPOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "kubernetes":
output_options = KubernetesOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
# Run the quick inventory for the provider if available
if hasattr(args, "quick_inventory") and args.quick_inventory:
@@ -221,6 +271,7 @@ def prowler():
global_provider,
custom_checks_metadata,
args.config_file,
output_options,
)
else:
logger.error(
@@ -228,7 +279,7 @@ def prowler():
)
# Prowler Fixer
if output_options.fixer:
print(f"{Style.BRIGHT}\nRunning Prowler Fixer, please wait...{Style.RESET_ALL}")
# Check if there are any FAIL findings
if any("FAIL" in finding.status for finding in findings):
@@ -270,103 +321,342 @@ def prowler():
)
sys.exit(1)
# Outputs
# TODO: this part is needed since the checks generate a Check_Report_XXX and the output uses Finding
# This will be refactored so that the outputs generate the Finding directly
finding_outputs = [
Finding.generate_output(global_provider, finding, output_options)
for finding in findings
]
generated_outputs = {"regular": [], "compliance": []}
if args.output_formats:
for mode in args.output_formats:
filename = (
f"{output_options.output_directory}/"
f"{output_options.output_filename}"
)
if mode == "csv":
csv_output = CSV(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{csv_file_suffix}",
)
generated_outputs["regular"].append(csv_output)
# Write CSV Finding Object to file
csv_output.batch_write_data_to_file()
if mode == "json-asff":
asff_output = ASFF(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{json_asff_file_suffix}",
)
generated_outputs["regular"].append(asff_output)
# Write ASFF Finding Object to file
asff_output.batch_write_data_to_file()
if mode == "json-ocsf":
json_output = OCSF(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{json_ocsf_file_suffix}",
)
generated_outputs["regular"].append(json_output)
json_output.batch_write_data_to_file()
if mode == "html":
html_output = HTML(
findings=finding_outputs,
create_file_descriptor=True,
file_path=f"{filename}{html_file_suffix}",
)
generated_outputs["regular"].append(html_output)
html_output.batch_write_data_to_file(
provider=global_provider, stats=stats
)
if "html" in mode:
add_html_footer(
global_provider.output_options.output_filename,
global_provider.output_options.output_directory,
# Compliance Frameworks
input_compliance_frameworks = set(output_options.output_modes).intersection(
get_available_compliance_frameworks(provider)
)
if provider == "aws":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
cis = AWSCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_aws":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = AWSMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
elif compliance_name.startswith("ens_"):
# Generate ENS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
ens = AWSENS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(ens)
ens.batch_write_data_to_file()
elif compliance_name.startswith("aws_well_architected_framework"):
# Generate AWS Well-Architected Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
aws_well_architected = AWSWellArchitected(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(aws_well_architected)
aws_well_architected.batch_write_data_to_file()
elif compliance_name.startswith("iso27001_"):
# Generate ISO27001 Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
iso27001 = AWSISO27001(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(iso27001)
iso27001.batch_write_data_to_file()
elif compliance_name.startswith("kisa"):
# Generate KISA-ISMS-P Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
kisa_ismsp = AWSKISAISMSP(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(kisa_ismsp)
kisa_ismsp.batch_write_data_to_file()
else:
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "azure":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
cis = AzureCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_azure":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = AzureMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
else:
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "gcp":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
cis = GCPCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
elif compliance_name == "mitre_attack_gcp":
# Generate MITRE ATT&CK Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
mitre_attack = GCPMitreAttack(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(mitre_attack)
mitre_attack.batch_write_data_to_file()
else:
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
elif provider == "kubernetes":
for compliance_name in input_compliance_frameworks:
if compliance_name.startswith("cis_"):
# Generate CIS Finding Object
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
cis = KubernetesCIS(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(cis)
cis.batch_write_data_to_file()
else:
filename = (
f"{output_options.output_directory}/compliance/"
f"{output_options.output_filename}_{compliance_name}.csv"
)
generic_compliance = GenericCompliance(
findings=finding_outputs,
compliance=bulk_compliance_frameworks[compliance_name],
create_file_descriptor=True,
file_path=filename,
)
generated_outputs["compliance"].append(generic_compliance)
generic_compliance.batch_write_data_to_file()
if provider == "aws":
# Send output to S3 if needed (-B / -D) for all the output formats
if args.output_bucket or args.output_bucket_no_assume:
output_bucket = args.output_bucket
bucket_session = global_provider.session.current_session
# Check if -D was input
if args.output_bucket_no_assume:
output_bucket = args.output_bucket_no_assume
bucket_session = global_provider.session.original_session
s3 = S3(
session=bucket_session,
bucket_name=output_bucket,
output_directory=args.output_directory,
)
s3.send_to_bucket(generated_outputs)
if args.security_hub:
print(
f"{Style.BRIGHT}\nArchiving previous findings in AWS Security Hub, please wait...{Style.RESET_ALL}"
f"{Style.BRIGHT}\nSending findings to AWS Security Hub, please wait...{Style.RESET_ALL}"
)
security_hub_regions = (
global_provider.get_available_aws_service_regions("securityhub")
if not global_provider.identity.audited_regions
else global_provider.identity.audited_regions
)
security_hub = SecurityHub(
aws_account_id=global_provider.identity.account,
aws_partition=global_provider.identity.partition,
aws_session=global_provider.session.current_session,
findings=asff_output.data,
send_only_fails=output_options.send_sh_only_fails,
aws_security_hub_available_regions=security_hub_regions,
)
# Send the findings to Security Hub
findings_sent_to_security_hub = security_hub.batch_send_to_security_hub()
print(
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_archived_in_security_hub} findings archived in AWS Security Hub!{Style.RESET_ALL}"
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_sent_to_security_hub} findings sent to AWS Security Hub!{Style.RESET_ALL}"
)
# Resolve previous fails of Security Hub
if not args.skip_sh_update:
print(
f"{Style.BRIGHT}\nArchiving previous findings in AWS Security Hub, please wait...{Style.RESET_ALL}"
)
findings_archived_in_security_hub = (
security_hub.archive_previous_findings()
)
print(
f"{Style.BRIGHT}{Fore.GREEN}\n{findings_archived_in_security_hub} findings archived in AWS Security Hub!{Style.RESET_ALL}"
)
# Display summary table
if not args.only_logs:
display_summary_table(
findings,
global_provider,
output_options,
)
# Only display compliance table if there are findings (not all MANUAL) and it is a default execution
if (
@@ -385,13 +675,13 @@ def prowler():
findings,
bulk_checks_metadata,
compliance,
output_options.output_filename,
output_options.output_directory,
compliance_overview,
)
if compliance_overview:
print(
f"\nDetailed compliance results are in {Fore.YELLOW}{global_provider.output_options.output_directory}/compliance/{Style.RESET_ALL}\n"
f"\nDetailed compliance results are in {Fore.YELLOW}{output_options.output_directory}/compliance/{Style.RESET_ALL}\n"
)
# If custom checks were passed, remove the modules

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -3044,7 +3044,7 @@
"Id": "9.4",
"Description": "Ensure that Register with Entra ID is enabled on App Service",
"Checks": [
"app_client_certificates_on"
""
],
"Attributes": [
{
@@ -3066,7 +3066,7 @@
"Id": "9.5",
"Description": "Ensure That 'PHP version' is the Latest, If Used to Run the Web App",
"Checks": [
"app_register_with_identity"
"app_ensure_php_version_is_latest"
],
"Attributes": [
{
@@ -3088,7 +3088,7 @@
"Id": "9.6",
"Description": "Ensure that 'Python version' is the Latest Stable Version, if Used to Run the Web App",
"Checks": [
"app_ensure_php_version_is_latest"
"app_ensure_python_version_is_latest"
],
"Attributes": [
{
@@ -3110,7 +3110,7 @@
"Id": "9.7",
"Description": "Ensure that 'Java version' is the latest, if used to run the Web App",
"Checks": [
"app_ensure_python_version_is_latest"
"app_ensure_java_version_is_latest"
],
"Attributes": [
{
@@ -3132,7 +3132,7 @@
"Id": "9.8",
"Description": "Ensure that 'HTTP Version' is the Latest, if Used to Run the Web App",
"Checks": [
"app_ensure_java_version_is_latest"
"app_ensure_using_http20"
],
"Attributes": [
{
@@ -3154,7 +3154,7 @@
"Id": "9.9",
"Description": "Ensure FTP deployments are Disabled",
"Checks": [
"app_ensure_using_http20"
"app_ftp_deployment_disabled"
],
"Attributes": [
{
@@ -3176,7 +3176,7 @@
"Id": "9.10",
"Description": "Ensure Azure Key Vaults are Used to Store Secrets",
"Checks": [
"app_ftp_deployment_disabled"
""
],
"Attributes": [
{
@@ -3213,66 +3213,6 @@
"References": "https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-lock-resources:https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-subscription-governance#azure-resource-locks:https://docs.microsoft.com/en-us/azure/governance/blueprints/concepts/resource-locking:https://learn.microsoft.com/en-us/security/benchmark/azure/mcsb-asset-management#am-4-limit-access-to-asset-management"
}
]
},
{
"Id": "9.10",
"Description": "Ensure FTP deployments are Disabled",
"Checks": [],
"Attributes": [
{
"Section": "9. AppService",
"Profile": "Level 1",
"AssessmentStatus": "Automated",
"Description": "By default, Azure Functions, Web, and API Services can be deployed over FTP. If FTP is required for an essential deployment workflow, FTPS should be required for FTP login for all App Service Apps and Functions.",
"RationaleStatement": "Azure FTP deployment endpoints are public. An attacker listening to traffic on a wifi network used by a remote employee or a corporate network could see login traffic in clear-text which would then grant them full control of the code base of the app or service. This finding is more severe if User Credentials for deployment are set at the subscription level rather than using the default Application Credentials which are unique per App.",
"ImpactStatement": "Any deployment workflows that rely on FTP or FTPs rather than the WebDeploy or HTTPs endpoints may be affected.",
"RemediationProcedure": "**From Azure Portal** 1. Go to the Azure Portal 2. Select `App Services` 3. Click on an app 4. Select `Settings` and then `Configuration` 5. Under `General Settings`, for the `Platform Settings`, the `FTP state` should be set to `Disabled` or `FTPS Only` **From Azure CLI** For each out of compliance application, run the following choosing either 'disabled' or 'FtpsOnly' as appropriate: ``` az webapp config set --resource-group <resource group name> --name <app name> --ftps-state [disabled|FtpsOnly] ``` **From PowerShell** For each out of compliance application, run the following: ``` Set-AzWebApp -ResourceGroupName <resource group name> -Name <app name> -FtpsState <Disabled or FtpsOnly> ```",
"AuditProcedure": "**From Azure Portal** 1. Go to the Azure Portal 2. Select `App Services` 3. Click on an app 4. Select `Settings` and then `Configuration` 5. Under `General Settings`, for the `Platform Settings`, the `FTP state` should not be set to `All allowed` **From Azure CLI** List webapps to obtain the ids. ``` az webapp list ``` List the publish profiles to obtain the username, password and ftp server url. ``` az webapp deployment list-publishing-profiles --ids <ids> { publishUrl: <URL_FOR_WEB_APP>, userName: <USER_NAME>, userPWD: <USER_PASSWORD>, } ``` **From PowerShell** List all Web Apps: ``` Get-AzWebApp ``` For each app: ``` Get-AzWebApp -ResourceGroupName <resource group name> -Name <app name> | Select-Object -ExpandProperty SiteConfig ``` In the output, look for the value of **FtpsState**. If its value is **AllAllowed** the setting is out of compliance. Any other value is considered in compliance with this check.",
"AdditionalInformation": "",
"DefaultValue": "[Azure Web Service Deploy via FTP](https://docs.microsoft.com/en-us/azure/app-service/deploy-ftp):[Azure Web Service Deployment](https://docs.microsoft.com/en-us/azure/app-service/overview-security):https://docs.microsoft.com/en-us/security/benchmark/azure/security-controls-v3-data-protection#dp-4-encrypt-sensitive-information-in-transit:https://docs.microsoft.com/en-us/security/benchmark/azure/security-controls-v3-posture-vulnerability-management#pv-7-rapidly-and-automatically-remediate-software-vulnerabilities",
"References": "TA0008, T1570, M1031"
}
]
},
{
"Id": "9.11",
"Description": "Ensure Azure Key Vaults are Used to Store Secrets",
"Checks": [],
"Attributes": [
{
"Section": "9. AppService",
"Profile": "Level 2",
"AssessmentStatus": "Manual",
"Description": "Azure Key Vault will store multiple types of sensitive information such as encryption keys, certificate thumbprints, and Managed Identity Credentials. Access to these 'Secrets' can be controlled through granular permissions.",
"RationaleStatement": "The credentials given to an application have permissions to create, delete, or modify data stored within the systems they access. If these credentials are stored within the application itself, anyone with access to the application or a copy of the code has access to them. Storing within Azure Key Vault as secrets increases security by controlling access. This also allows for updates of the credentials without redeploying the entire application.",
"ImpactStatement": "Integrating references to secrets within the key vault are required to be specifically integrated within the application code. This will require additional configuration to be made during the writing of an application, or refactoring of an already written one. There are also additional costs that are charged per 10000 requests to the Key Vault.",
"RemediationProcedure": "Remediation has 2 steps 1. Setup the Key Vault 2. Setup the App Service to use the Key Vault **Step 1: Set up the Key Vault** **From Azure CLI** ``` az keyvault create --name <name> --resource-group <myResourceGroup> --location myLocation ``` **From Powershell** ``` New-AzKeyvault -name <name> -ResourceGroupName <myResourceGroup> -Location <myLocation> ``` **Step 2: Set up the App Service to use the Key Vault** Sample JSON Template for App Service Configuration: ``` { //... resources: [ { type: Microsoft.Storage/storageAccounts, name: [variables('storageAccountName')], //... }, { type: Microsoft.Insights/components, name: [variables('appInsightsName')], //... }, { type: Microsoft.Web/sites, name: [variables('functionAppName')], identity: { type: SystemAssigned }, //... resources: [ { type: config, name: appsettings, //... dependsOn: [ [resourceId('Microsoft.Web/sites', variables('functionAppName'))], [resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))], [resourceId('Microsoft.KeyVault/vaults/secrets', variables('keyVaultName'), variables('storageConnectionStringName'))], [resourceId('Microsoft.KeyVault/vaults/secrets', variables('keyVaultName'), variables('appInsightsKeyName'))] ], properties: { AzureWebJobsStorage: [concat('@Microsoft.KeyVault(SecretUri=', reference(variables('storageConnectionStringResourceId')).secretUriWithVersion, ')')], WEBSITE_CONTENTAZUREFILECONNECTIONSTRING: [concat('@Microsoft.KeyVault(SecretUri=', reference(variables('storageConnectionStringResourceId')).secretUriWithVersion, ')')], APPINSIGHTS_INSTRUMENTATIONKEY: [concat('@Microsoft.KeyVault(SecretUri=', reference(variables('appInsightsKeyResourceId')).secretUriWithVersion, ')')], WEBSITE_ENABLE_SYNC_UPDATE_SITE: true //... } }, { type: sourcecontrols, name: web, //... dependsOn: [ [resourceId('Microsoft.Web/sites', variables('functionAppName'))], [resourceId('Microsoft.Web/sites/config', variables('functionAppName'), 'appsettings')] ], } ] }, { type: Microsoft.KeyVault/vaults, name: [variables('keyVaultName')], //... dependsOn: [ [resourceId('Microsoft.Web/sites', variables('functionAppName'))] ], properties: { //... accessPolicies: [ { tenantId: [reference(concat('Microsoft.Web/sites/', variables('functionAppName'), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId], objectId: [reference(concat('Microsoft.Web/sites/', variables('functionAppName'), '/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').principalId], permissions: { secrets: [ get ] } } ] }, resources: [ { type: secrets, name: [variables('storageConnectionStringName')], //... dependsOn: [ [resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))], [resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))] ], properties: { value: [concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountResourceId'),'2015-05-01-preview').key1)] } }, { type: secrets, name: [variables('appInsightsKeyName')], //... dependsOn: [ [resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))], [resourceId('Microsoft.Insights/components', variables('appInsightsName'))] ], properties: { value: [reference(resourceId('microsoft.insights/components/', variables('appInsightsName')), '2015-05-01').InstrumentationKey] } } ] } ] } ```",
"AuditProcedure": "**From Azure Portal** 1. Login to Azure Portal 2. In the expandable menu on the left go to `Key Vaults` 3. View the Key Vaults listed. **From Azure CLI** To list key vaults within a subscription run the following command: ``` Get-AzKeyVault ``` To list the secrets within these key vaults run the following command: ``` Get-AzKeyVaultSecret [-VaultName] <vault name> ``` **From Powershell** To list key vaults within a subscription run the following command: ``` Get-AzKeyVault ``` To list all secrets in a key vault run the following command: ``` Get-AzKeyVaultSecret -VaultName '<vaultName' ```",
"AdditionalInformation": "",
"DefaultValue": "https://docs.microsoft.com/en-us/azure/app-service/app-service-key-vault-references:https://docs.microsoft.com/en-us/security/benchmark/azure/security-controls-v3-identity-management#im-2-manage-application-identities-securely-and-automatically:https://docs.microsoft.com/en-us/cli/azure/keyvault?view=azure-cli-latest:https://docs.microsoft.com/en-us/cli/azure/keyvault?view=azure-cli-latest",
"References": "TA0006, T1552, M1041"
}
]
},
{
"Id": "10.1",
"Description": "Ensure that Resource Locks are set for Mission-Critical Azure Resources",
"Checks": [],
"Attributes": [
{
"Section": "10. Miscellaneous",
"Profile": "Level 2",
"AssessmentStatus": "Manual",
"Description": "Resource Manager Locks provide a way for administrators to lock down Azure resources to prevent deletion of, or modifications to, a resource. These locks sit outside of the Role Based Access Controls (RBAC) hierarchy and, when applied, will place restrictions on the resource for all users. These locks are very useful when there is an important resource in a subscription that users should not be able to delete or change. Locks can help prevent accidental and malicious changes or deletion.",
"RationaleStatement": "As an administrator, it may be necessary to lock a subscription, resource group, or resource to prevent other users in the organization from accidentally deleting or modifying critical resources. The lock level can be set to to `CanNotDelete` or `ReadOnly` to achieve this purpose. - `CanNotDelete` means authorized users can still read and modify a resource, but they cannot delete the resource. - `ReadOnly` means authorized users can read a resource, but they cannot delete or update the resource. Applying this lock is similar to restricting all authorized users to the permissions granted by the Reader role.",
"ImpactStatement": "There can be unintended outcomes of locking a resource. Applying a lock to a parent service will cause it to be inherited by all resources within. Conversely, applying a lock to a resource may not apply to connected storage, leaving it unlocked. Please see the documentation for further information.",
"RemediationProcedure": "**From Azure Portal** 1. Navigate to the specific Azure Resource or Resource Group 2. For each mission critical resource, click on `Locks` 3. Click `Add` 4. Give the lock a name and a description, then select the type, `Read-only` or `Delete` as appropriate 5. Click OK **From Azure CLI** To lock a resource, provide the name of the resource, its resource type, and its resource group name. ``` az lock create --name <LockName> --lock-type <CanNotDelete/Read-only> --resource-group <resourceGroupName> --resource-name <resourceName> --resource-type <resourceType> ``` **From Powershell** ``` Get-AzResourceLock -ResourceName <Resource Name> -ResourceType <Resource Type> -ResourceGroupName <Resource Group Name> -Locktype <CanNotDelete/Read-only> ```",
"AuditProcedure": "**From Azure Portal** 1. Navigate to the specific Azure Resource or Resource Group 2. Click on `Locks` 3. Ensure the lock is defined with name and description, with type `Read-only` or `Delete` as appropriate. **From Azure CLI** Review the list of all locks set currently: ``` az lock list --resource-group <resourcegroupname> --resource-name <resourcename> --namespace <Namespace> --resource-type <type> --parent ``` **From Powershell** Run the following command to list all resources. ``` Get-AzResource ``` For each resource, run the following command to check for Resource Locks. ``` Get-AzResourceLock -ResourceName <Resource Name> -ResourceType <Resource Type> -ResourceGroupName <Resource Group Name> ``` Review the output of the `Properties` setting. Compliant settings will have the `CanNotDelete` or `ReadOnly` value.",
"AdditionalInformation": "",
"DefaultValue": "https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-lock-resources:https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-subscription-governance#azure-resource-locks:https://docs.microsoft.com/en-us/azure/governance/blueprints/concepts/resource-locking:https://docs.microsoft.com/en-us/security/benchmark/azure/security-controls-v3-asset-management#am-4-limit-access-to-asset-management",
"References": ""
}
]
}
]
}


@@ -1288,7 +1288,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Configure TLS encryption for the etcd service.",
@@ -1310,7 +1310,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Enable client authentication on etcd service.",
@@ -1332,7 +1332,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Do not use self-signed certificates for TLS.",
@@ -1354,7 +1354,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "etcd should be configured to make use of TLS encryption for peer connections.",
@@ -1376,7 +1376,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "etcd should be configured for peer authentication.",
@@ -1398,7 +1398,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 1 - Master Node",
"AssessmentStatus": "Automated",
"Description": "Do not use automatically generated self-signed certificates for TLS connections between peers.",
@@ -1420,7 +1420,7 @@
],
"Attributes": [
{
"Section": "2 etcd",
"Section": "2 Etcd",
"Profile": "Level 2 - Master Node",
"AssessmentStatus": "Manual",
"Description": "Use a different certificate authority for etcd from the one used for Kubernetes.",
@@ -2634,7 +2634,7 @@
],
"Attributes": [
{
"Section": "5.4",
"Section": "5.4 Secrets Management",
"Profile": "Level 2 - Master Node",
"AssessmentStatus": "Manual",
"Description": "Kubernetes supports mounting secrets as data volumes or as environment variables. Minimize the use of environment variable secrets.",


@@ -19,8 +19,11 @@ Mutelist:
- "StackSet-AWSControlTowerSecurityResources-*"
- "StackSet-AWSControlTowerLoggingResources-*"
- "StackSet-AWSControlTowerExecutionRole-*"
- "AWSControlTowerBP-BASELINE-CLOUDTRAIL-MASTER"
- "AWSControlTowerBP-BASELINE-CONFIG-MASTER"
- "AWSControlTowerBP-BASELINE-CLOUDTRAIL-MASTER*"
- "AWSControlTowerBP-BASELINE-CONFIG-MASTER*"
- "StackSet-AWSControlTower*"
- "CLOUDTRAIL-ENABLED-ON-SHARED-ACCOUNTS-*"
- "AFT-Backend*"
"cloudtrail_*":
Regions:
- "*"


@@ -1,17 +1,17 @@
import os
import pathlib
import sys
from datetime import datetime, timezone
from os import getcwd
import requests
import yaml
from packaging import version
from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "4.2.2"
prowler_version = "4.4.0"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
square_logo_img = "https://prowler.com/wp-content/uploads/logo-html.png"
aws_logo = "https://user-images.githubusercontent.com/38561120/235953920-3e3fba08-0795-41dc-b480-9bea57db9f2e.png"
@@ -65,6 +65,8 @@ default_config_file_path = (
default_fixer_config_file_path = (
f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/fixer_config.yaml"
)
encoding_format_utf_8 = "utf-8"
available_output_formats = ["csv", "json-asff", "json-ocsf", "html"]
def get_default_mute_file_path(provider: str):
@@ -85,7 +87,7 @@ def check_current_version():
"https://api.github.com/repos/prowler-cloud/prowler/tags", timeout=1
)
latest_version = release_response.json()[0]["name"]
if version.parse(latest_version) > version.parse(prowler_version):
return f"{prowler_version_string} (latest is {latest_version}, upgrade for the latest features)"
else:
return (
@@ -99,52 +101,84 @@ def check_current_version():
def load_and_validate_config_file(provider: str, config_file_path: str) -> dict:
"""
Reads the Prowler config file in YAML format from the default location or the file passed with the --config-file flag.
Args:
provider (str): The provider name (e.g., 'aws', 'gcp', 'azure', 'kubernetes').
config_file_path (str): The path to the configuration file.
Returns:
dict: The configuration dictionary for the specified provider.
"""
try:
config = {}
with open(config_file_path, "r", encoding=encoding_format_utf_8) as f:
config_file = yaml.safe_load(f)
# Not to introduce a breaking change, allow the old format config file without any provider keys
# and a new format with a key for each provider to include their configuration values within.
if any(key in config_file for key in ["aws", "gcp", "azure", "kubernetes"]):
config = config_file.get(provider, {})
else:
config = config_file if config_file else {}
# Not to break Azure, K8s and GCP does not support or use the old config format
if provider in ["azure", "gcp", "kubernetes"]:
config = {}
return config
except FileNotFoundError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
except yaml.YAMLError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except UnicodeDecodeError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return {}
def load_and_validate_fixer_config_file(
provider: str, fixer_config_file_path: str
) -> dict:
"""
Reads the Prowler fixer config file in YAML format from the default location or the file passed with the --fixer-config flag.
Args:
provider (str): The provider name (e.g., 'aws', 'gcp', 'azure', 'kubernetes').
fixer_config_file_path (str): The path to the fixer configuration file.
Returns:
dict: The fixer configuration dictionary for the specified provider.
"""
try:
with open(fixer_config_file_path, "r", encoding=encoding_format_utf_8) as f:
fixer_config_file = yaml.safe_load(f)
return fixer_config_file.get(provider, {})
except FileNotFoundError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
except yaml.YAMLError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except UnicodeDecodeError as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return {}
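For reference, a minimal sketch (the keys are illustrative) of the two config layouts `load_and_validate_config_file` accepts: the legacy flat format without provider keys, and the newer format with one top-level section per provider:
```yaml
# Legacy flat format (no provider keys; treated as AWS-only configuration)
max_retries: 3

# New format: one section per provider
aws:
  max_retries: 3
azure:
  some_azure_option: true
```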


@@ -41,8 +41,24 @@ aws:
[
"amazon-elb"
]
# aws.ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports
ec2_sg_high_risk_ports:
[
25,
110,
135,
143,
445,
3000,
4333,
5000,
5500,
8080,
8088,
]
# AWS VPC Configuration (vpc_endpoint_connections_trust_boundaries, vpc_endpoint_services_allowed_principals_trust_boundaries)
# AWS SSM Configuration (aws.ssm_documents_set_as_public)
# Single account environment: No action required. The AWS account number will be automatically added by the checks.
# Multi account environment: Any additional trusted account number should be added as a space separated list, e.g.
# trusted_account_ids : ["123456789012", "098765432109", "678901234567"]
@@ -86,6 +102,8 @@ aws:
"ruby2.5",
"ruby2.7",
]
# aws.awslambda_function_vpc_is_in_multi_azs
lambda_min_azs: 2
# AWS Organizations
# aws.organizations_scp_check_deny_regions
@@ -262,10 +280,60 @@ aws:
"LookupEvents",
"Search",
]
# AWS RDS Configuration
# aws.rds_instance_backup_enabled
# Whether to check RDS instance replicas or not
check_rds_instance_replicas: False
# AWS ACM Configuration
# aws.acm_certificates_expiration_check
days_to_expire_threshold: 7
# aws.acm_certificates_rsa_key_length
insecure_key_algorithms:
[
"RSA-1024",
]
# AWS EKS Configuration
# aws.eks_control_plane_logging_all_types_enabled
# EKS control plane logging types that must be enabled
eks_required_log_types:
[
"api",
"audit",
"authenticator",
"controllerManager",
"scheduler",
]
# aws.eks_cluster_uses_a_supported_version
# EKS clusters must be version 1.28 or higher
eks_cluster_oldest_version_supported: "1.28"
# AWS CodeBuild Configuration
# aws.codebuild_project_no_secrets_in_variables
# CodeBuild sensitive variables that are excluded from the check
excluded_sensitive_environment_variables:
[
]
# AWS ELB Configuration
# aws.elb_is_in_multiple_az
# Minimum number of Availability Zones that a CLB must be in
elb_min_azs: 2
# AWS ELBv2 Configuration
# aws.elbv2_is_in_multiple_az
# Minimum number of Availability Zones that an ELBv2 must be in
elbv2_min_azs: 2
# Known secrets to ignore on detection
# This is a list of regex patterns that will be ignored during secret detection
secrets_ignore_patterns: []
# Azure Configuration
azure:
# Azure Network Configuration


@@ -0,0 +1,53 @@
class ProwlerException(Exception):
"""Base exception for all Prowler SDK errors."""
ERROR_CODES = {
(1901, "UnexpectedError"): {
"message": "Unexpected error occurred.",
"remediation": "Please review the error message and try again.",
}
}
def __init__(
self, code, provider=None, file=None, original_exception=None, error_info=None
):
"""
Initialize the ProwlerException class.
Args:
code (int): The error code.
provider (str): The provider name.
file (str): The file name.
original_exception (Exception): The original exception.
error_info (dict): The error information.
Example:
A ProwlerException raised with the following parameters is formatted as shown:
>>> original_exception = Exception("Error occurred.")
>>> str(ProwlerException(1901, "AWS", "file.txt", original_exception))
'ProwlerException[1901]: Unexpected error occurred. - Error occurred.'
"""
self.code = code
self.provider = provider
self.file = file
if error_info is None:
error_info = self.ERROR_CODES.get((code, self.__class__.__name__))
self.message = error_info.get("message")
self.remediation = error_info.get("remediation")
self.original_exception = original_exception
# Format -> [code] message - original_exception
if original_exception is None:
super().__init__(f"[{self.code}] {self.message}")
else:
super().__init__(
f"[{self.code}] {self.message} - {self.original_exception}"
)
def __str__(self):
"""Overriding the __str__ method"""
return f"{self.__class__.__name__}[{self.code}]: {self.message} - {self.original_exception}"
class UnexpectedError(ProwlerException):
def __init__(self, provider, file, original_exception=None):
super().__init__(1901, provider, file, original_exception)
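A short usage sketch of the exception hierarchy above; the printed line follows the `__str__` format defined in `ProwlerException`:
```
# Illustrative only: raising the subclass defined above.
try:
    raise UnexpectedError("aws", "file.txt", original_exception=ValueError("boom"))
except ProwlerException as error:
    # __str__ format: "<ClassName>[<code>]: <message> - <original_exception>"
    print(error)  # UnexpectedError[1901]: Unexpected error occurred. - boom
```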

View File

@@ -6,7 +6,6 @@ import re
import shutil
import sys
import traceback
from pkgutil import walk_packages
from types import ModuleType
from typing import Any
@@ -15,69 +14,15 @@ from colorama import Fore, Style
import prowler
from prowler.config.config import orange_color
from prowler.lib.check.compliance_models import load_compliance_framework
from prowler.lib.check.custom_checks_metadata import update_check_metadata
from prowler.lib.check.models import Check, load_check_metadata
from prowler.lib.check.models import Check
from prowler.lib.check.utils import recover_checks_from_provider
from prowler.lib.logger import logger
from prowler.lib.mutelist.mutelist import mutelist_findings
from prowler.lib.outputs.outputs import report
from prowler.lib.utils.utils import open_file, parse_json_file, print_boxes
from prowler.providers.common.models import Audit_Metadata
# Load all checks metadata
def bulk_load_checks_metadata(provider: str) -> dict:
bulk_check_metadata = {}
checks = recover_checks_from_provider(provider)
# Build list of check's metadata files
for check_info in checks:
# Build check path name
check_name = check_info[0]
check_path = check_info[1]
# Ignore fixer files
if check_name.endswith("_fixer"):
continue
# Append metadata file extension
metadata_file = f"{check_path}/{check_name}.metadata.json"
# Load metadata
check_metadata = load_check_metadata(metadata_file)
bulk_check_metadata[check_metadata.CheckID] = check_metadata
return bulk_check_metadata
# Bulk load all compliance frameworks specification
def bulk_load_compliance_frameworks(provider: str) -> dict:
"""Bulk load all compliance frameworks specification into a dict"""
try:
bulk_compliance_frameworks = {}
available_compliance_framework_modules = list_compliance_modules()
for compliance_framework in available_compliance_framework_modules:
if provider in compliance_framework.name:
compliance_specification_dir_path = (
f"{compliance_framework.module_finder.path}/{provider}"
)
# for compliance_framework in available_compliance_framework_modules:
for filename in os.listdir(compliance_specification_dir_path):
file_path = os.path.join(
compliance_specification_dir_path, filename
)
# Check if it is a file and its size is greater than 0
if os.path.isfile(file_path) and os.stat(file_path).st_size > 0:
# Open Compliance file in JSON
# cis_v1.4_aws.json --> cis_v1.4_aws
compliance_framework_name = filename.split(".json")[0]
# Store the compliance info
bulk_compliance_frameworks[compliance_framework_name] = (
load_compliance_framework(file_path)
)
except Exception as e:
logger.error(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")
return bulk_compliance_frameworks
# Exclude checks to run
def exclude_checks_to_run(checks_to_execute: set, excluded_checks: list) -> set:
for check in excluded_checks:
@@ -126,9 +71,10 @@ def parse_checks_from_file(input_file: str, provider: str) -> set:
# Load checks from custom folder
def parse_checks_from_folder(provider, input_folder: str) -> int:
def parse_checks_from_folder(provider, input_folder: str) -> set:
# TODO: move the AWS-specific code into the provider
try:
imported_checks = 0
custom_checks = set()
# Check if the input folder is an S3 URI
if provider.type == "aws" and re.search(
"^s3://([^/]+)/(.*?([^/]+))/$", input_folder
@@ -156,8 +102,8 @@ def parse_checks_from_folder(provider, input_folder: str) -> int:
if os.path.exists(prowler_module):
shutil.rmtree(prowler_module)
shutil.copytree(check_module, prowler_module)
imported_checks += 1
return imported_checks
custom_checks.add(check.name)
return custom_checks
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
@@ -373,100 +319,12 @@ def parse_checks_from_compliance_framework(
return checks_to_execute
def recover_checks_from_provider(
provider: str, service: str = None, include_fixers: bool = False
) -> list[tuple]:
"""
Recover all checks from the selected provider and service
Returns a list of tuples with the following format (check_name, check_path)
"""
try:
checks = []
modules = list_modules(provider, service)
for module_name in modules:
# Format: "prowler.providers.{provider}.services.{service}.{check_name}.{check_name}"
check_module_name = module_name.name
# We need to exclude common shared libraries in services
if (
check_module_name.count(".") == 6
and "lib" not in check_module_name
and (not check_module_name.endswith("_fixer") or include_fixers)
):
check_path = module_name.module_finder.path
# Check name is the last part of the check_module_name
check_name = check_module_name.split(".")[-1]
check_info = (check_name, check_path)
checks.append(check_info)
except ModuleNotFoundError:
logger.critical(f"Service {service} was not found for the {provider} provider.")
sys.exit(1)
except Exception as e:
logger.critical(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}]: {e}")
sys.exit(1)
else:
return checks
def list_compliance_modules():
"""
list_compliance_modules returns the available compliance frameworks and their paths
"""
# This module path requires the full path including "prowler."
module_path = "prowler.compliance"
return walk_packages(
importlib.import_module(module_path).__path__,
importlib.import_module(module_path).__name__ + ".",
)
# List all available modules in the selected provider and service
def list_modules(provider: str, service: str):
# This module path requires the full path including "prowler."
module_path = f"prowler.providers.{provider}.services"
if service:
module_path += f".{service}"
return walk_packages(
importlib.import_module(module_path).__path__,
importlib.import_module(module_path).__name__ + ".",
)
# Import an input check using its path
def import_check(check_path: str) -> ModuleType:
lib = importlib.import_module(f"{check_path}")
return lib
def run_check(check: Check, verbose: bool = False, only_logs: bool = False) -> list:
"""
Run the check and return the findings
Args:
check (Check): check class instance
verbose (bool): whether to print the check information
only_logs (bool): if True, do not print error messages to the console
Returns:
list: list of findings
"""
findings = []
if verbose:
print(
f"\nCheck ID: {check.CheckID} - {Fore.MAGENTA}{check.ServiceName}{Fore.YELLOW} [{check.Severity}]{Style.RESET_ALL}"
)
logger.debug(f"Executing check: {check.CheckID}")
try:
findings = check.execute()
except Exception as error:
if not only_logs:
print(
f"Something went wrong in {check.CheckID}, please use --log-level ERROR"
)
logger.error(
f"{check.CheckID} -- {error.__class__.__name__}[{traceback.extract_tb(error.__traceback__)[-1].lineno}]: {error}"
)
finally:
return findings
def run_fixer(check_findings: list) -> int:
"""
Run the fixer for the check if it exists and there are any FAIL findings
@@ -548,6 +406,7 @@ def execute_checks(
global_provider: Any,
custom_checks_metadata: Any,
config_file: str,
output_options: Any,
) -> list:
# List to store all the check's findings
all_findings = []
@@ -583,22 +442,51 @@ def execute_checks(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
# Set verbose flag
verbose = False
if hasattr(output_options, "verbose"):
verbose = output_options.verbose
elif hasattr(output_options, "fixer"):
verbose = output_options.fixer
# Execution with the --only-logs flag
if global_provider.output_options.only_logs:
if output_options.only_logs:
for check_name in checks_to_execute:
# Recover service from check name
service = check_name.split("_")[0]
try:
try:
# Import check module
check_module_path = f"prowler.providers.{global_provider.type}.services.{service}.{check_name}.{check_name}"
lib = import_check(check_module_path)
# Recover functions from check
check_to_execute = getattr(lib, check_name)
check = check_to_execute()
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {global_provider.type.upper()} provider"
)
continue
if verbose:
print(
f"\nCheck ID: {check.CheckID} - {Fore.MAGENTA}{check.ServiceName}{Fore.YELLOW} [{check.Severity}]{Style.RESET_ALL}"
)
check_findings = execute(
service,
check_name,
check,
global_provider,
services_executed,
checks_executed,
custom_checks_metadata,
output_options,
)
report(check_findings, global_provider, output_options)
all_findings.extend(check_findings)
# Update Audit Status
services_executed.add(service)
checks_executed.add(check_name)
global_provider.audit_metadata = update_audit_metadata(
global_provider.audit_metadata, services_executed, checks_executed
)
# If the check does not exist in the provider or belongs to another provider
except ModuleNotFoundError:
logger.error(
@@ -611,9 +499,9 @@ def execute_checks(
else:
# Prepare your messages
messages = [f"Config File: {Fore.YELLOW}{config_file}{Style.RESET_ALL}"]
if global_provider.mutelist_file_path:
if global_provider.mutelist.mutelist_file_path:
messages.append(
f"Mutelist File: {Fore.YELLOW}{global_provider.mutelist_file_path}{Style.RESET_ALL}"
f"Mutelist File: {Fore.YELLOW}{global_provider.mutelist.mutelist_file_path}{Style.RESET_ALL}"
)
if global_provider.type == "aws":
messages.append(
@@ -647,15 +535,39 @@ def execute_checks(
f"-> Scanning {orange_color}{service}{Style.RESET_ALL} service"
)
try:
try:
# Import check module
check_module_path = f"prowler.providers.{global_provider.type}.services.{service}.{check_name}.{check_name}"
lib = import_check(check_module_path)
# Recover functions from check
check_to_execute = getattr(lib, check_name)
check = check_to_execute()
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {global_provider.type.upper()} provider"
)
continue
if verbose:
print(
f"\nCheck ID: {check.CheckID} - {Fore.MAGENTA}{check.ServiceName}{Fore.YELLOW} [{check.Severity}]{Style.RESET_ALL}"
)
check_findings = execute(
service,
check_name,
check,
global_provider,
custom_checks_metadata,
output_options,
)
report(check_findings, global_provider, output_options)
all_findings.extend(check_findings)
services_executed.add(service)
checks_executed.add(check_name)
global_provider.audit_metadata = update_audit_metadata(
global_provider.audit_metadata,
services_executed,
checks_executed,
custom_checks_metadata,
)
all_findings.extend(check_findings)
# If the check does not exist in the provider or belongs to another provider
except ModuleNotFoundError:
@@ -670,77 +582,96 @@ def execute_checks(
)
bar()
bar.title = f"-> {Fore.GREEN}Scan completed!{Style.RESET_ALL}"
# Custom report interface
if os.environ.get("PROWLER_REPORT_LIB_PATH"):
try:
logger.info("Using custom report interface ...")
lib = os.environ["PROWLER_REPORT_LIB_PATH"]
outputs_module = importlib.import_module(lib)
custom_report_interface = getattr(outputs_module, "report")
# TODO: review this call and see if we can remove the global_provider.output_options since it is contained in the global_provider
custom_report_interface(check_findings, output_options, global_provider)
except Exception:
sys.exit(1)
return all_findings
def execute(
service: str,
check_name: str,
check: Check,
global_provider: Any,
services_executed: set,
checks_executed: set,
custom_checks_metadata: Any,
output_options: Any = None,
):
try:
# Import check module
check_module_path = f"prowler.providers.{global_provider.type}.services.{service}.{check_name}.{check_name}"
lib = import_check(check_module_path)
# Recover functions from check
check_to_execute = getattr(lib, check_name)
check_class = check_to_execute()
"""
Execute the check and report the findings
Args:
service (str): service name
check_name (str): check name
check (Check): check class instance
global_provider (Any): provider object
services_executed (set): set of services already executed
checks_executed (set): set of checks already executed
custom_checks_metadata (Any): custom checks metadata
output_options (Any): output options, depending on the provider
Returns:
list: list of findings
"""
try:
# Update check metadata to reflect that in the outputs
if custom_checks_metadata and custom_checks_metadata["Checks"].get(
check_class.CheckID
check.CheckID
):
check_class = update_check_metadata(
check_class, custom_checks_metadata["Checks"][check_class.CheckID]
check = update_check_metadata(
check, custom_checks_metadata["Checks"][check.CheckID]
)
# Run check
verbose = (
global_provider.output_options.verbose
or global_provider.output_options.fixer
)
check_findings = run_check(
check_class, verbose, global_provider.output_options.only_logs
)
only_logs = False
if hasattr(output_options, "only_logs"):
only_logs = output_options.only_logs
# Update Audit Status
services_executed.add(service)
checks_executed.add(check_name)
global_provider.audit_metadata = update_audit_metadata(
global_provider.audit_metadata, services_executed, checks_executed
)
# Mutelist findings
if hasattr(global_provider, "mutelist") and global_provider.mutelist:
check_findings = mutelist_findings(
global_provider,
check_findings,
)
# Refactor(Outputs)
# Report the check's findings
report(check_findings, global_provider)
# Refactor(Outputs)
if os.environ.get("PROWLER_REPORT_LIB_PATH"):
try:
logger.info("Using custom report interface ...")
lib = os.environ["PROWLER_REPORT_LIB_PATH"]
outputs_module = importlib.import_module(lib)
custom_report_interface = getattr(outputs_module, "report")
# TODO: review this call and see if we can remove the global_provider.output_options since it is contained in the global_provider
custom_report_interface(
check_findings, global_provider.output_options, global_provider
# Execute the check
check_findings = []
logger.debug(f"Executing check: {check.CheckID}")
try:
check_findings = check.execute()
except Exception as error:
if not only_logs:
print(
f"Something went wrong in {check.CheckID}, please use --log-level ERROR"
)
except Exception:
sys.exit(1)
logger.error(
f"{check.CheckID} -- {error.__class__.__name__}[{traceback.extract_tb(error.__traceback__)[-1].lineno}]: {error}"
)
# Exclude findings per status
if hasattr(output_options, "status") and output_options.status:
check_findings = [
finding
for finding in check_findings
if finding.status in output_options.status
]
# Before returning the findings, we need to apply the mute list logic
if hasattr(global_provider, "mutelist") and global_provider.mutelist.mutelist:
is_finding_muted_args = {}
if global_provider.type == "aws":
is_finding_muted_args["aws_account_id"] = (
global_provider.identity.account
)
elif global_provider.type == "kubernetes":
is_finding_muted_args["cluster"] = global_provider.identity.cluster
for finding in check_findings:
is_finding_muted_args["finding"] = finding
finding.muted = global_provider.mutelist.is_finding_muted(
**is_finding_muted_args
)
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {global_provider.type.upper()} provider"
f"Check '{check.CheckID}' was not found for the {global_provider.type.upper()} provider"
)
check_findings = []
except Exception as error:
@@ -770,34 +701,3 @@ def update_audit_metadata(
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def recover_checks_from_service(service_list: list, provider: str) -> set:
"""
Recover all checks from the selected provider and service
Returns a set of checks from the given services
"""
try:
checks = set()
service_list = [
"awslambda" if service == "lambda" else service for service in service_list
]
for service in service_list:
service_checks = recover_checks_from_provider(provider, service)
if not service_checks:
logger.error(f"Service '{service}' does not have checks.")
else:
for check in service_checks:
# Recover check name and module name from import path
# Format: "providers.{provider}.services.{service}.{check_name}.{check_name}"
check_name = check[0].split(".")[-1]
# If the service is present in the group list passed as parameters
# if service_name in group_list: checks_from_arn.add(check_name)
checks.add(check_name)
return checks
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)

View File

@@ -4,6 +4,8 @@ from prowler.config.config import valid_severities
from prowler.lib.check.check import (
parse_checks_from_compliance_framework,
parse_checks_from_file,
)
from prowler.lib.check.utils import (
recover_checks_from_provider,
recover_checks_from_service,
)

View File

@@ -1,16 +1,21 @@
import sys
from pydantic import parse_obj_as
from prowler.lib.check.compliance_models import Compliance_Base_Model
from prowler.lib.check.models import Check_Metadata_Model
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.logger import logger
def update_checks_metadata_with_compliance(
bulk_compliance_frameworks: dict, bulk_checks_metadata: dict
):
"""Update the check metadata model with the compliance framework"""
) -> dict:
"""
Update the check metadata model with the compliance framework
Args:
bulk_compliance_frameworks (dict): The compliance frameworks
bulk_checks_metadata (dict): The checks metadata
Returns:
dict: The checks metadata with the compliance frameworks
"""
try:
for check in bulk_checks_metadata:
check_compliance = []
@@ -21,8 +26,8 @@ def update_checks_metadata_with_compliance(
if check in requirement.Checks:
# Include the requirement into the check's framework requirements
compliance_requirements.append(requirement)
# Create the Compliance_Model
compliance = Compliance_Base_Model(
# Create the Compliance
compliance = Compliance(
Framework=framework.Framework,
Provider=framework.Provider,
Version=framework.Version,
@@ -33,53 +38,6 @@ def update_checks_metadata_with_compliance(
check_compliance.append(compliance)
# Save it into the check's metadata
bulk_checks_metadata[check].Compliance = check_compliance
# Add requirements of Manual Controls
for framework in bulk_compliance_frameworks.values():
for requirement in framework.Requirements:
compliance_requirements = []
# Verify if requirement is Manual
if not requirement.Checks:
compliance_requirements.append(requirement)
# Create the Compliance_Model
compliance = Compliance_Base_Model(
Framework=framework.Framework,
Provider=framework.Provider,
Version=framework.Version,
Description=framework.Description,
Requirements=compliance_requirements,
)
# Include the compliance framework for the check
check_compliance.append(compliance)
# Create metadata for Manual Control
manual_check_metadata = {
"Provider": framework.Provider.lower(),
"CheckID": "manual_check",
"CheckTitle": "Manual Check",
"CheckType": [],
"ServiceName": "",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "low",
"ResourceType": "",
"Description": "",
"Risk": "",
"RelatedUrl": "",
"Remediation": {
"Code": {"CLI": "", "NativeIaC": "", "Other": "", "Terraform": ""},
"Recommendation": {"Text": "", "Url": ""},
},
"Categories": [],
"Tags": {},
"DependsOn": [],
"RelatedTo": [],
"Notes": "",
}
manual_check = parse_obj_as(Check_Metadata_Model, manual_check_metadata)
# Save it into the check's metadata
bulk_checks_metadata["manual_check"] = manual_check
bulk_checks_metadata["manual_check"].Compliance = check_compliance
return bulk_checks_metadata
except Exception as e:
logger.critical(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")

View File

@@ -1,9 +1,11 @@
import os
import sys
from enum import Enum
from typing import Optional, Union
from pydantic import BaseModel, ValidationError, root_validator
from prowler.lib.check.utils import list_compliance_modules
from prowler.lib.logger import logger
@@ -91,7 +93,6 @@ class CIS_Requirement_Attribute(BaseModel):
AdditionalInformation: str
DefaultValue: Optional[str]
References: str
DefaultValue: Optional[str]
# Well Architected Requirement Attribute
@@ -168,6 +169,19 @@ class Mitre_Requirement(BaseModel):
Checks: list[str]
# KISA-ISMS-P Requirement Attribute
class KISA_ISMSP_Requirement_Attribute(BaseModel):
"""KISA ISMS-P Requirement Attribute"""
Domain: str
Subdomain: str
Section: str
AuditChecklist: Optional[list[str]]
RelatedRegulations: Optional[list[str]]
AuditEvidence: Optional[list[str]]
NonComplianceCases: Optional[list[str]]
# Base Compliance Model
# TODO: move this to compliance folder
class Compliance_Requirement(BaseModel):
@@ -182,6 +196,7 @@ class Compliance_Requirement(BaseModel):
ENS_Requirement_Attribute,
ISO27001_2013_Requirement_Attribute,
AWS_Well_Architected_Requirement_Attribute,
KISA_ISMSP_Requirement_Attribute,
# Generic_Compliance_Requirement_Attribute must be the last one since it is the fallback for generic compliance framework
Generic_Compliance_Requirement_Attribute,
]
@@ -189,8 +204,8 @@ class Compliance_Requirement(BaseModel):
Checks: list[str]
class Compliance_Base_Model(BaseModel):
"""Compliance_Base_Model holds the base model for every compliance framework"""
class Compliance(BaseModel):
"""Compliance holds the base model for every compliance framework"""
Framework: str
Provider: str
@@ -214,16 +229,137 @@ class Compliance_Base_Model(BaseModel):
raise ValueError("Framework or Provider must not be empty")
return values
@staticmethod
def list(bulk_compliance_frameworks: dict, provider: str = None) -> list[str]:
"""
Returns a list of compliance frameworks from bulk compliance frameworks
Args:
bulk_compliance_frameworks (dict): The bulk compliance frameworks
provider (str): The provider name
Returns:
list: The list of compliance frameworks
"""
if provider:
compliance_frameworks = [
compliance_framework
for compliance_framework in bulk_compliance_frameworks.keys()
if provider in compliance_framework
]
else:
compliance_frameworks = [
compliance_framework
for compliance_framework in bulk_compliance_frameworks.keys()
]
return compliance_frameworks
@staticmethod
def get(
bulk_compliance_frameworks: dict, compliance_framework_name: str
) -> "Compliance":
"""
Returns a compliance framework from bulk compliance frameworks
Args:
bulk_compliance_frameworks (dict): The bulk compliance frameworks
compliance_framework_name (str): The compliance framework name
Returns:
Compliance: The compliance framework
"""
return bulk_compliance_frameworks.get(compliance_framework_name, None)
@staticmethod
def list_requirements(
bulk_compliance_frameworks: dict, compliance_framework: str = None
) -> list:
"""
Returns a list of compliance requirements from a compliance framework
Args:
bulk_compliance_frameworks (dict): The bulk compliance frameworks
compliance_framework (str): The compliance framework name
Returns:
list: The list of compliance requirements for the provided compliance framework
"""
compliance_requirements = []
if bulk_compliance_frameworks and compliance_framework:
compliance_requirements = [
compliance_requirement.Id
for compliance_requirement in bulk_compliance_frameworks.get(
compliance_framework
).Requirements
]
return compliance_requirements
@staticmethod
def get_requirement(
bulk_compliance_frameworks: dict, compliance_framework: str, requirement_id: str
) -> Union[Mitre_Requirement, Compliance_Requirement]:
"""
Returns a compliance requirement from a compliance framework
Args:
bulk_compliance_frameworks (dict): The bulk compliance frameworks
compliance_framework (str): The compliance framework name
requirement_id (str): The compliance requirement ID
Returns:
Mitre_Requirement | Compliance_Requirement: The compliance requirement
"""
requirement = None
for compliance_requirement in bulk_compliance_frameworks.get(
compliance_framework
).Requirements:
if compliance_requirement.Id == requirement_id:
requirement = compliance_requirement
break
return requirement
@staticmethod
def get_bulk(provider: str) -> dict:
"""Bulk load all compliance frameworks specification into a dict"""
try:
bulk_compliance_frameworks = {}
available_compliance_framework_modules = list_compliance_modules()
for compliance_framework in available_compliance_framework_modules:
if provider in compliance_framework.name:
compliance_specification_dir_path = (
f"{compliance_framework.module_finder.path}/{provider}"
)
# for compliance_framework in available_compliance_framework_modules:
for filename in os.listdir(compliance_specification_dir_path):
file_path = os.path.join(
compliance_specification_dir_path, filename
)
# Check if it is a file and its size is greater than 0
if os.path.isfile(file_path) and os.stat(file_path).st_size > 0:
# Open Compliance file in JSON
# cis_v1.4_aws.json --> cis_v1.4_aws
compliance_framework_name = filename.split(".json")[0]
# Store the compliance info
bulk_compliance_frameworks[compliance_framework_name] = (
load_compliance_framework(file_path)
)
except Exception as e:
logger.error(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")
return bulk_compliance_frameworks
# Testing Pending
def load_compliance_framework(
compliance_specification_file: str,
) -> Compliance_Base_Model:
) -> Compliance:
"""load_compliance_framework loads and parse a Compliance Framework Specification"""
try:
compliance_framework = Compliance_Base_Model.parse_file(
compliance_specification_file
)
compliance_framework = Compliance.parse_file(compliance_specification_file)
except ValidationError as error:
logger.critical(
f"Compliance Framework Specification from {compliance_specification_file} is not valid: {error}"

View File

@@ -7,11 +7,20 @@ from dataclasses import dataclass
from pydantic import BaseModel, ValidationError, validator
from prowler.config.config import valid_severities
from prowler.lib.check.utils import recover_checks_from_provider
from prowler.lib.logger import logger
class Code(BaseModel):
"""Check's remediation information using IaC like CloudFormation, Terraform or the native CLI"""
"""
Represents the remediation code using IaC like CloudFormation, Terraform or the native CLI.
Attributes:
NativeIaC (str): The NativeIaC code.
Terraform (str): The Terraform code.
CLI (str): The CLI code.
Other (str): Other code.
"""
NativeIaC: str
Terraform: str
@@ -20,21 +29,61 @@ class Code(BaseModel):
class Recommendation(BaseModel):
"""Check's recommendation information"""
"""
Represents a recommendation.
Attributes:
Text (str): The text of the recommendation.
Url (str): The URL associated with the recommendation.
"""
Text: str
Url: str
class Remediation(BaseModel):
"""Check's remediation: Code and Recommendation"""
"""
Represents a remediation action for a specific check.
Attributes:
Code (Code): The code associated with the remediation action.
Recommendation (Recommendation): The recommendation for the remediation action.
"""
Code: Code
Recommendation: Recommendation
class Check_Metadata_Model(BaseModel):
"""Check Metadata Model"""
class CheckMetadata(BaseModel):
"""
Model representing the metadata of a check.
Attributes:
Provider (str): The provider of the check.
CheckID (str): The ID of the check.
CheckTitle (str): The title of the check.
CheckType (list[str]): The type of the check.
CheckAliases (list[str], optional): The aliases of the check. Defaults to an empty list.
ServiceName (str): The name of the service.
SubServiceName (str): The name of the sub-service.
ResourceIdTemplate (str): The template for the resource ID.
Severity (str): The severity of the check.
ResourceType (str): The type of the resource.
Description (str): The description of the check.
Risk (str): The risk associated with the check.
RelatedUrl (str): The URL related to the check.
Remediation (Remediation): The remediation steps for the check.
Categories (list[str]): The categories of the check.
DependsOn (list[str]): The dependencies of the check.
RelatedTo (list[str]): The related checks.
Notes (str): Additional notes for the check.
Compliance (list, optional): The compliance information for the check. Defaults to None.
Validators:
valid_category(value): Validator function to validate the categories of the check.
severity_to_lower(severity): Validator function to convert the severity to lowercase.
valid_severity(severity): Validator function to validate the severity of the check.
"""
Provider: str
CheckID: str
@@ -81,8 +130,36 @@ class Check_Metadata_Model(BaseModel):
)
return severity
@staticmethod
def get_bulk(provider: str) -> dict[str, "CheckMetadata"]:
"""
Load the metadata of all checks for a given provider by reading the checks' metadata files.
Args:
provider (str): The name of the provider.
Returns:
dict[str, CheckMetadata]: A dictionary containing the metadata of all checks, with the CheckID as the key.
"""
class Check(ABC, Check_Metadata_Model):
bulk_check_metadata = {}
checks = recover_checks_from_provider(provider)
# Build list of check's metadata files
for check_info in checks:
# Build check path name
check_name = check_info[0]
check_path = check_info[1]
# Ignore fixer files
if check_name.endswith("_fixer"):
continue
# Append metadata file extension
metadata_file = f"{check_path}/{check_name}.metadata.json"
# Load metadata
check_metadata = load_check_metadata(metadata_file)
bulk_check_metadata[check_metadata.CheckID] = check_metadata
return bulk_check_metadata
class Check(ABC, CheckMetadata):
"""Prowler Check"""
def __init__(self, **data):
@@ -93,9 +170,11 @@ class Check(ABC, Check_Metadata_Model):
+ ".metadata.json"
)
# Store it to validate them with Pydantic
data = Check_Metadata_Model.parse_file(metadata_file).dict()
data = CheckMetadata.parse_file(metadata_file).dict()
# Calls parents init function
super().__init__(**data)
# TODO: verify that the CheckID is the same as the filename and classname
# to mimic the test done at test_<provider>_checks_metadata_is_valid
def metadata(self) -> dict:
"""Return the JSON representation of the check's metadata"""
@@ -112,14 +191,14 @@ class Check_Report:
status: str
status_extended: str
check_metadata: Check_Metadata_Model
check_metadata: CheckMetadata
resource_details: str
resource_tags: list
muted: bool
def __init__(self, metadata):
self.status = ""
self.check_metadata = Check_Metadata_Model.parse_raw(metadata)
self.check_metadata = CheckMetadata.parse_raw(metadata)
self.status_extended = ""
self.resource_details = ""
self.resource_tags = []
@@ -192,12 +271,22 @@ class Check_Report_Kubernetes(Check_Report):
# Testing Pending
def load_check_metadata(metadata_file: str) -> Check_Metadata_Model:
"""load_check_metadata loads and parse a Check's metadata file"""
def load_check_metadata(metadata_file: str) -> CheckMetadata:
"""
Load check metadata from a file.
Args:
metadata_file (str): The path to the metadata file.
Returns:
CheckMetadata: The loaded check metadata.
Raises:
ValidationError: If the metadata file is not valid.
"""
try:
check_metadata = Check_Metadata_Model.parse_file(metadata_file)
check_metadata = CheckMetadata.parse_file(metadata_file)
except ValidationError as error:
logger.critical(f"Metadata from {metadata_file} is not valid: {error}")
# TODO: remove this exit and raise an exception
sys.exit(1)
else:
return check_metadata
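A brief sketch of the relocated metadata loader in use; grouping by severity is purely illustrative:
```
# Sketch: load all AWS check metadata and group check IDs by severity.
bulk_metadata = CheckMetadata.get_bulk("aws")
by_severity = {}
for check_id, metadata in bulk_metadata.items():
    by_severity.setdefault(metadata.Severity, []).append(check_id)
```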

View File

@@ -0,0 +1,95 @@
import importlib
import sys
from pkgutil import walk_packages
from prowler.lib.logger import logger
def recover_checks_from_provider(
provider: str, service: str = None, include_fixers: bool = False
) -> list[tuple]:
"""
Recover all checks from the selected provider and service
Returns a list of tuples with the following format (check_name, check_path)
"""
try:
checks = []
modules = list_modules(provider, service)
for module_name in modules:
# Format: "prowler.providers.{provider}.services.{service}.{check_name}.{check_name}"
check_module_name = module_name.name
# We need to exclude common shared libraries in services
if (
check_module_name.count(".") == 6
and "lib" not in check_module_name
and (not check_module_name.endswith("_fixer") or include_fixers)
):
check_path = module_name.module_finder.path
# Check name is the last part of the check_module_name
check_name = check_module_name.split(".")[-1]
check_info = (check_name, check_path)
checks.append(check_info)
except ModuleNotFoundError:
logger.critical(f"Service {service} was not found for the {provider} provider.")
sys.exit(1)
except Exception as e:
logger.critical(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}]: {e}")
sys.exit(1)
else:
return checks
# List all available modules in the selected provider and service
def list_modules(provider: str, service: str):
# This module path requires the full path including "prowler."
module_path = f"prowler.providers.{provider}.services"
if service:
module_path += f".{service}"
return walk_packages(
importlib.import_module(module_path).__path__,
importlib.import_module(module_path).__name__ + ".",
)
def recover_checks_from_service(service_list: list, provider: str) -> set:
"""
Recover all checks from the selected provider and service
Returns a set of checks from the given services
"""
try:
checks = set()
service_list = [
"awslambda" if service == "lambda" else service for service in service_list
]
for service in service_list:
service_checks = recover_checks_from_provider(provider, service)
if not service_checks:
logger.error(f"Service '{service}' does not have checks.")
else:
for check in service_checks:
# Recover check name and module name from import path
# Format: "providers.{provider}.services.{service}.{check_name}.{check_name}"
check_name = check[0].split(".")[-1]
# If the service is present in the group list passed as parameters
# if service_name in group_list: checks_from_arn.add(check_name)
checks.add(check_name)
return checks
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def list_compliance_modules():
"""
list_compliance_modules returns the available compliance frameworks and their paths
"""
# This module path requires the full path including "prowler."
module_path = "prowler.compliance"
return walk_packages(
importlib.import_module(module_path).__path__,
importlib.import_module(module_path).__name__ + ".",
)
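For orientation, a sketch of the moved helpers in use; the printed paths depend on the local installation:
```
# Sketch: discover RDS checks, then expand a service list into check names.
rds_checks = recover_checks_from_provider("aws", service="rds")
for check_name, check_path in rds_checks:
    print(f"{check_name} -> {check_path}")

check_names = recover_checks_from_service(["lambda", "s3"], "aws")
print(sorted(check_names))  # "lambda" is mapped to "awslambda" internally
```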

View File

@@ -5,6 +5,7 @@ from argparse import RawTextHelpFormatter
from dashboard.lib.arguments.arguments import init_dashboard_parser
from prowler.config.config import (
available_compliance_frameworks,
available_output_formats,
check_current_version,
default_config_file_path,
default_fixer_config_file_path,
@@ -147,7 +148,7 @@ Detailed documentation at https://docs.prowler.com
nargs="+",
help="Output modes, by default csv and json-oscf are saved. When using AWS Security Hub integration, json-asff output is also saved.",
default=["csv", "json-ocsf", "html"],
choices=["csv", "json-asff", "json-ocsf", "html"],
choices=available_output_formats,
)
common_outputs_parser.add_argument(
"--output-filename",
@@ -262,7 +263,7 @@ Detailed documentation at https://docs.prowler.com
group.add_argument(
"--compliance",
nargs="+",
help="Compliance Framework to check against for. The format should be the following: framework_version_provider (e.g.: ens_rd2022_aws)",
help="Compliance Framework to check against for. The format should be the following: framework_version_provider (e.g.: cis_3.0_aws)",
choices=available_compliance_frameworks,
)
group.add_argument(

View File

@@ -1,373 +1,345 @@
import re
from typing import Any
from abc import ABC, abstractmethod
import yaml
from prowler.lib.logger import logger
from prowler.lib.mutelist.models import mutelist_schema
from prowler.lib.outputs.utils import unroll_tags
def get_mutelist_file_from_local_file(mutelist_path: str):
try:
with open(mutelist_path) as f:
mutelist = yaml.safe_load(f)["Mutelist"]
return mutelist
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return {}
def validate_mutelist(mutelist: dict) -> dict:
try:
mutelist = mutelist_schema.validate(mutelist)
return mutelist
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- Mutelist YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
)
return {}
def mutelist_findings(
global_provider: Any,
check_findings: list[Any],
) -> list[Any]:
# Check if finding is muted
for finding in check_findings:
# TODO: Move this mapping to the execute_check function and pass that output to the mutelist and the report
if global_provider.type == "aws":
finding.muted = is_muted(
global_provider.mutelist,
global_provider.identity.account,
finding.check_metadata.CheckID,
finding.region,
finding.resource_id,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "azure":
finding.muted = is_muted(
global_provider.mutelist,
finding.subscription,
finding.check_metadata.CheckID,
# TODO: add region to the findings when we add Azure Locations
# finding.region,
"",
finding.resource_name,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "gcp":
finding.muted = is_muted(
global_provider.mutelist,
finding.project_id,
finding.check_metadata.CheckID,
finding.location,
finding.resource_name,
unroll_tags(finding.resource_tags),
)
elif global_provider.type == "kubernetes":
finding.muted = is_muted(
global_provider.mutelist,
global_provider.identity.cluster,
finding.check_metadata.CheckID,
finding.namespace,
finding.resource_name,
unroll_tags(finding.resource_tags),
)
return check_findings
def is_muted(
mutelist: dict,
audited_account: str,
check: str,
finding_region: str,
finding_resource: str,
finding_tags,
) -> bool:
class Mutelist(ABC):
"""
Check if the provided finding is muted for the audited account, check, region, resource and tags.
Abstract base class for managing a mutelist.
Args:
mutelist (dict): Dictionary containing information about muted checks for different accounts.
audited_account (str): The account being audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags: The tags associated with the finding.
Attributes:
_mutelist (dict): Dictionary containing information about muted checks for different accounts.
_mutelist_file_path (str): Path to the mutelist file.
MUTELIST_KEY (str): Key used to access the mutelist in the mutelist file.
Returns:
bool: True if the finding is muted for the audited account, check, region, resource and tags, otherwise False.
Methods:
__init__: Initializes a Mutelist object.
mutelist: Property that returns the mutelist dictionary.
mutelist_file_path: Property that returns the mutelist file path.
is_finding_muted: Abstract method to check if a finding is muted.
get_mutelist_file_from_local_file: Retrieves the mutelist file from a local file.
validate_mutelist: Validates the mutelist against a schema.
is_muted: Checks if a finding is muted for the audited account, check, region, resource, and tags.
is_muted_in_check: Checks if a check is muted.
is_excepted: Checks if the account, region, resource, and tags are excepted based on the exceptions.
"""
try:
# By default is not muted
is_finding_muted = False
# We always check all the accounts present in the mutelist
# if one mutes the finding we set the finding as muted
for account in mutelist["Accounts"]:
if account == audited_account or account == "*":
if is_muted_in_check(
mutelist["Accounts"][account]["Checks"],
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
_mutelist: dict = {}
_mutelist_file_path: str = None
MUTELIST_KEY = "Mutelist"
def __init__(
self, mutelist_path: str = "", mutelist_content: dict = {}
) -> "Mutelist":
if mutelist_path:
self._mutelist_file_path = mutelist_path
self.get_mutelist_file_from_local_file(mutelist_path)
else:
self._mutelist = mutelist_content
if self._mutelist:
self.validate_mutelist()
@property
def mutelist(self) -> dict:
return self._mutelist
@property
def mutelist_file_path(self) -> str:
return self._mutelist_file_path
@abstractmethod
def is_finding_muted(self) -> bool:
raise NotImplementedError
def get_mutelist_file_from_local_file(self, mutelist_path: str):
try:
with open(mutelist_path) as f:
self._mutelist = yaml.safe_load(f)[self.MUTELIST_KEY]
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
def validate_mutelist(self) -> bool:
try:
self._mutelist = mutelist_schema.validate(self._mutelist)
return True
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- Mutelist YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
)
self._mutelist = {}
return False
def is_muted(
self,
audited_account: str,
check: str,
finding_region: str,
finding_resource: str,
finding_tags,
) -> bool:
"""
Check if the provided finding is muted for the audited account, check, region, resource and tags.
The Mutelist ANDs each field: a check is muted only when the account, region, resource and tags all match.
The exceptions are ORed: a check excepted for an account, region, resource or tags will not be muted.
The only particularity is the tags, which are ORed within a single entry.
So, for the following Mutelist:
```
Mutelist:
Accounts:
'*':
Checks:
ec2_instance_detailed_monitoring_enabled:
Regions: ['*']
Resources:
- 'i-123456789'
Tags:
- 'Name=AdminInstance | Environment=Prod'
```
The check `ec2_instance_detailed_monitoring_enabled` will be muted for all accounts and regions and for the resource_id 'i-123456789' with at least one of the tags 'Name=AdminInstance' or 'Environment=Prod'.
Args:
mutelist (dict): Dictionary containing information about muted checks for different accounts.
audited_account (str): The account being audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags: The tags associated with the finding.
Returns:
bool: True if the finding is muted for the audited account, check, region, resource and tags, otherwise False.
"""
try:
# By default is not muted
is_finding_muted = False
# We always check all the accounts present in the mutelist
# if one mutes the finding we set the finding as muted
for account in self._mutelist.get("Accounts", []):
if account == audited_account or account == "*":
if self.is_muted_in_check(
self._mutelist["Accounts"][account]["Checks"],
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
):
is_finding_muted = True
break
return is_finding_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_check(
self,
muted_checks,
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided check is muted.
Args:
muted_checks (dict): Dictionary containing information about muted checks.
audited_account (str): The account to be audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the check is muted, otherwise False.
"""
try:
# Default value is not muted
is_check_muted = False
for muted_check, muted_check_info in muted_checks.items():
# map lambda to awslambda
muted_check = re.sub("^lambda", "awslambda", muted_check)
check_match = (
"*" == muted_check
or check == muted_check
or self.is_item_matched([muted_check], check)
)
# Check if the finding is excepted
exceptions = muted_check_info.get("Exceptions")
if (
self.is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
)
and check_match
):
is_finding_muted = True
# Break loop and return default value since is excepted
break
return is_finding_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
muted_regions = muted_check_info.get("Regions")
muted_resources = muted_check_info.get("Resources")
muted_tags = muted_check_info.get("Tags", "*")
# Default muted_tags to "*" when it is falsy (None, "" or [])
if not muted_tags:
muted_tags = "*"
# If there is a *, it applies to all checks
if check_match:
muted_in_check = True
muted_in_region = self.is_item_matched(
muted_regions, finding_region
)
muted_in_resource = self.is_item_matched(
muted_resources, finding_resource
)
muted_in_tags = self.is_item_matched(
muted_tags, finding_tags, tag=True
)
# For a finding to be muted requires the following set to True:
# - muted_in_check -> True
# - muted_in_region -> True
# - muted_in_tags -> True
# - muted_in_resource -> True
# - excepted -> False
def is_muted_in_check(
muted_checks,
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided check is muted.
if (
muted_in_check
and muted_in_region
and muted_in_tags
and muted_in_resource
):
is_check_muted = True
Args:
muted_checks (dict): Dictionary containing information about muted checks.
audited_account (str): The account to be audited.
check (str): The check to be evaluated for muting.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the check is muted, otherwise False.
"""
try:
# Default value is not muted
is_check_muted = False
for muted_check, muted_check_info in muted_checks.items():
# map lambda to awslambda
muted_check = re.sub("^lambda", "awslambda", muted_check)
check_match = (
"*" == muted_check
or check == muted_check
or re.search(muted_check, check)
return is_check_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
# Check if the finding is excepted
exceptions = muted_check_info.get("Exceptions")
if (
is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
)
and check_match
):
# Break loop and return default value since is excepted
break
return False
muted_regions = muted_check_info.get("Regions")
muted_resources = muted_check_info.get("Resources")
muted_tags = muted_check_info.get("Tags", "*")
# Default muted_tags to "*" when it is falsy (None, "" or [])
if not muted_tags:
muted_tags = "*"
# If there is a *, it applies to all checks
if check_match:
muted_in_check = True
muted_in_region = is_muted_in_region(muted_regions, finding_region)
muted_in_resource = is_muted_in_resource(
muted_resources, finding_resource
)
muted_in_tags = is_muted_in_tags(muted_tags, finding_tags)
def is_excepted(
self,
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided account, region, resource, and tags are excepted based on the exceptions dictionary.
# For a finding to be muted requires the following set to True:
# - muted_in_check -> True
# - muted_in_region -> True
# - muted_in_tags -> True
# - muted_in_resource -> True
# - excepted -> False
Args:
exceptions (dict): Dictionary containing exceptions for different attributes like Accounts, Regions, Resources, and Tags.
audited_account (str): The account to be audited.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the account, region, resource, and tags are excepted based on the exceptions, otherwise False.
"""
try:
excepted = False
is_account_excepted = False
is_region_excepted = False
is_resource_excepted = False
is_tag_excepted = False
if exceptions:
excepted_accounts = exceptions.get("Accounts", [])
is_account_excepted = self.is_item_matched(
excepted_accounts, audited_account
)
excepted_regions = exceptions.get("Regions", [])
is_region_excepted = self.is_item_matched(
excepted_regions, finding_region
)
excepted_resources = exceptions.get("Resources", [])
is_resource_excepted = self.is_item_matched(
excepted_resources, finding_resource
)
excepted_tags = exceptions.get("Tags", [])
is_tag_excepted = self.is_item_matched(
excepted_tags, finding_tags, tag=True
)
if (
muted_in_check
and muted_in_region
and muted_in_tags
and muted_in_resource
not is_account_excepted
and not is_region_excepted
and not is_resource_excepted
and not is_tag_excepted
):
is_check_muted = True
return is_check_muted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_region(
mutelist_regions,
finding_region,
) -> bool:
"""
Check if the finding_region is present in the mutelist_regions.
Args:
mutelist_regions (list): List of regions in the mute list.
finding_region (str): Region to check if it is muted.
Returns:
bool: True if the finding_region is muted in any of the mutelist_regions, otherwise False.
"""
try:
return __is_item_matched__(mutelist_regions, finding_region)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_tags(muted_tags, finding_tags) -> bool:
"""
Check if any of the muted tags are present in the finding tags.
Args:
muted_tags (list): List of muted tags to be checked.
finding_tags (str): String containing tags to search for muted tags.
Returns:
bool: True if any of the muted tags are present in the finding tags, otherwise False.
"""
try:
return __is_item_matched__(muted_tags, finding_tags)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_muted_in_resource(muted_resources, finding_resource) -> bool:
"""
Check if any of the muted_resources are present in the finding_resource.
Args:
muted_resources (list): List of muted resources to be checked.
finding_resource (str): Resource to search for muted resources.
Returns:
bool: True if any of the muted_resources are present in the finding_resource, otherwise False.
"""
try:
return __is_item_matched__(muted_resources, finding_resource)
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def is_excepted(
exceptions,
audited_account,
finding_region,
finding_resource,
finding_tags,
) -> bool:
"""
Check if the provided account, region, resource, and tags are excepted based on the exceptions dictionary.
Args:
exceptions (dict): Dictionary containing exceptions for different attributes like Accounts, Regions, Resources, and Tags.
audited_account (str): The account to be audited.
finding_region (str): The region where the finding occurred.
finding_resource (str): The resource related to the finding.
finding_tags (str): The tags associated with the finding.
Returns:
bool: True if the account, region, resource, and tags are excepted based on the exceptions, otherwise False.
"""
try:
excepted = False
is_account_excepted = False
is_region_excepted = False
is_resource_excepted = False
is_tag_excepted = False
if exceptions:
excepted_accounts = exceptions.get("Accounts", [])
is_account_excepted = __is_item_matched__(
excepted_accounts, audited_account
excepted = False
elif (
(is_account_excepted or not excepted_accounts)
and (is_region_excepted or not excepted_regions)
and (is_resource_excepted or not excepted_resources)
and (is_tag_excepted or not excepted_tags)
):
excepted = True
return excepted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
excepted_regions = exceptions.get("Regions", [])
is_region_excepted = __is_item_matched__(excepted_regions, finding_region)
@staticmethod
def is_item_matched(matched_items, finding_items, tag=False) -> bool:
"""
Check if any of the items in matched_items are present in finding_items.
excepted_resources = exceptions.get("Resources", [])
is_resource_excepted = __is_item_matched__(
excepted_resources, finding_resource
)
Args:
matched_items (list): List of items to be matched.
finding_items (str): String to search for matched items.
tag (bool): If True, the matching logic changes because tags can be ANDed or ORed:
- AND logic -> True only if all the tags are present in the finding.
- OR logic -> True if any of the tags is present in the finding.
excepted_tags = exceptions.get("Tags", [])
is_tag_excepted = __is_item_matched__(excepted_tags, finding_tags)
if (
not is_account_excepted
and not is_region_excepted
and not is_resource_excepted
and not is_tag_excepted
):
excepted = False
elif (
(is_account_excepted or not excepted_accounts)
and (is_region_excepted or not excepted_regions)
and (is_resource_excepted or not excepted_resources)
and (is_tag_excepted or not excepted_tags)
):
excepted = True
return excepted
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
def __is_item_matched__(matched_items, finding_items):
"""
Check if any of the items in matched_items are present in finding_items.
Args:
matched_items (list): List of items to be matched.
finding_items (str): String to search for matched items.
Returns:
bool: True if any of the matched_items are present in finding_items, otherwise False.
"""
try:
is_item_matched = False
if matched_items and (finding_items or finding_items == ""):
for item in matched_items:
if item.startswith("*"):
item = ".*" + item[1:]
if re.search(item, finding_items):
Returns:
bool: True if any of the matched_items are present in finding_items, otherwise False.
"""
try:
is_item_matched = False
if matched_items and (finding_items or finding_items == ""):
if tag:
is_item_matched = True
break
return is_item_matched
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
for item in matched_items:
if item.startswith("*"):
item = ".*" + item[1:]
if tag:
if not re.search(item, finding_items):
is_item_matched = False
break
else:
if re.search(item, finding_items):
is_item_matched = True
break
return is_item_matched
except Exception as error:
logger.error(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
return False
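To tie the refactor together, a hedged sketch of a provider-specific subclass; `Mutelist` is abstract, so each provider must implement `is_finding_muted`. The keyword arguments mirror the AWS branch of `execute` shown earlier, but the class itself is an assumption:
```
# Illustrative subclass sketch, not the actual provider implementation.
class SketchAWSMutelist(Mutelist):
    def is_finding_muted(self, finding, aws_account_id: str) -> bool:
        return self.is_muted(
            aws_account_id,
            finding.check_metadata.CheckID,
            finding.region,
            finding.resource_id,
            finding.resource_tags,  # may need unrolling into "key=value" form
        )
```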

View File

@@ -0,0 +1,405 @@
from json import dump
from os import SEEK_SET
from typing import Optional
from pydantic import BaseModel, validator
from prowler.config.config import prowler_version, timestamp_utc
from prowler.lib.logger import logger
from prowler.lib.outputs.finding import Finding
from prowler.lib.outputs.output import Output
from prowler.lib.utils.utils import hash_sha512
class ASFF(Output):
"""
ASFF class represents a transformation of findings into AWS Security Finding Format (ASFF).
This class provides methods to transform a list of findings into the ASFF format required by AWS Security Hub. It includes operations such as generating unique identifiers, formatting timestamps, handling compliance frameworks, and ensuring the status values match the allowed values in ASFF.
Attributes:
- _data: A list to store the transformed findings.
- _file_descriptor: A file descriptor to write to file.
Methods:
- transform(findings: list[Finding]) -> None: Transforms a list of findings into ASFF format.
- batch_write_data_to_file() -> None: Writes the findings data to a file in JSON ASFF format.
- generate_status(status: str, muted: bool = False) -> str: Generates the ASFF status based on the provided status and muted flag.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
- AWS Security Finding Format Syntax: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
"""
def transform(self, findings: list[Finding]) -> None:
"""
Transforms a list of findings into AWS Security Finding Format (ASFF).
This method iterates over the list of findings provided as input and transforms each finding into the ASFF format required by AWS Security Hub. It performs several operations for each finding, including generating unique identifiers, formatting timestamps, handling compliance frameworks, and ensuring the status values match the allowed values in ASFF.
Parameters:
- findings (list[Finding]): A list of Finding objects representing the findings to be transformed.
Returns:
- None
Notes:
- The method skips findings with a status of "MANUAL" as it is not valid in SecurityHub.
- It generates unique identifiers for each finding based on specific attributes.
- It formats timestamps in the required ASFF format.
- It handles compliance frameworks and associated standards for each finding.
- It ensures that the finding status matches the allowed values in ASFF.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
- AWS Security Finding Format Syntax: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
"""
try:
for finding in findings:
# MANUAL status is not valid in SecurityHub
# https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
if finding.status == "MANUAL":
continue
timestamp = timestamp_utc.strftime("%Y-%m-%dT%H:%M:%SZ")
associated_standards, compliance_summary = ASFF.format_compliance(
finding.compliance
)
# Ensures finding_status matches allowed values in ASFF
finding_status = ASFF.generate_status(finding.status, finding.muted)
self._data.append(
AWSSecurityFindingFormat(
# The following line cannot be changed because it is the format we use to generate unique findings for AWS Security Hub
# If changed some findings could be lost because the unique identifier will be different
Id=f"prowler-{finding.check_id}-{finding.account_uid}-{finding.region}-{hash_sha512(finding.resource_uid)}",
ProductArn=f"arn:{finding.partition}:securityhub:{finding.region}::product/prowler/prowler",
ProductFields=ProductFields(
ProwlerResourceName=finding.resource_uid,
),
GeneratorId="prowler-" + finding.check_id,
AwsAccountId=finding.account_uid,
Types=(
finding.check_type.split(",")
if finding.check_type
else ["Software and Configuration Checks"]
),
FirstObservedAt=timestamp,
UpdatedAt=timestamp,
CreatedAt=timestamp,
Severity=Severity(Label=finding.severity.value),
Title=finding.check_title,
Description=(
(finding.status_extended[:1000] + "...")
if len(finding.status_extended) > 1000
else finding.status_extended
),
Resources=[
Resource(
Id=finding.resource_uid,
Type=finding.resource_type,
Partition=finding.partition,
Region=finding.region,
Tags=finding.resource_tags,
)
],
Compliance=Compliance(
Status=finding_status,
AssociatedStandards=associated_standards,
RelatedRequirements=compliance_summary,
),
Remediation=Remediation(
Recommendation=Recommendation(
Text=finding.remediation_recommendation_text,
Url=finding.remediation_recommendation_url,
)
),
)
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
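    # Editor's note: with hypothetical values the deterministic Id above reads
    #   prowler-s3_bucket_public_access-123456789012-eu-west-1-<sha512(resource_uid)>
    # so re-running Prowler updates the existing Security Hub finding rather
    # than creating a duplicate.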
def batch_write_data_to_file(self) -> None:
"""
Writes the findings data to a file in JSON ASFF format.
This method iterates over the findings data stored in the '_data' attribute and writes it to the file descriptor '_file_descriptor' in JSON format. It starts by writing the JSON opening/header '[', then iterates over each finding, dumping it to the file with an indent of 4 spaces. After writing all findings, it writes the closing ']' to complete the JSON array structure. Finally, it closes the file descriptor.
Returns:
None
"""
try:
if (
getattr(self, "_file_descriptor", None)
and not self._file_descriptor.closed
and self._data
):
# Write JSON opening/header [
self._file_descriptor.write("[")
# Write findings
for finding in self._data:
dump(
finding.dict(exclude_none=True),
self._file_descriptor,
indent=4,
)
self._file_descriptor.write(",")
# Write footer/closing ]
if self._file_descriptor.tell() > 0:
if self._file_descriptor.tell() != 1:
self._file_descriptor.seek(
self._file_descriptor.tell() - 1, SEEK_SET
)
self._file_descriptor.truncate()
self._file_descriptor.write("]")
# Close file descriptor
self._file_descriptor.close()
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
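    # Editor's sketch of the trailing-comma trim above, in isolation: after
    #     fd.write("[")
    #     for item in data:
    #         dump(item, fd, indent=4)
    #         fd.write(",")
    # the file ends in a stray ",", so
    #     fd.seek(fd.tell() - 1, SEEK_SET)  # step back over the comma
    #     fd.truncate()                     # drop it
    #     fd.write("]")                     # close the JSON array
    # leaves a valid JSON list.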
@staticmethod
def generate_status(status: str, muted: bool = False) -> str:
"""
Generates the ASFF status based on the provided status and muted flag.
Parameters:
- status (str): The status of the finding.
- muted (bool): Flag indicating if the finding is muted.
Returns:
- str: The ASFF status corresponding to the provided status and muted flag.
References:
- AWS Security Hub API Reference: https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
"""
json_asff_status = ""
if muted:
# Per AWS Security Hub "MUTED" is not a valid status
# https://docs.aws.amazon.com/securityhub/1.0/APIReference/API_Compliance.html
json_asff_status = "WARNING"
else:
if status == "PASS":
json_asff_status = "PASSED"
elif status == "FAIL":
json_asff_status = "FAILED"
else:
# MANUAL is set to NOT_AVAILABLE
json_asff_status = "NOT_AVAILABLE"
return json_asff_status
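    # Editor's illustration, following directly from the branches above:
    #   ASFF.generate_status("PASS")              -> "PASSED"
    #   ASFF.generate_status("FAIL")              -> "FAILED"
    #   ASFF.generate_status("MANUAL")            -> "NOT_AVAILABLE"
    #   ASFF.generate_status("FAIL", muted=True)  -> "WARNING"  (muted wins)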
@staticmethod
def format_compliance(compliance: dict) -> tuple[list[dict], list[str]]:
"""
Transforms a dictionary of compliance data into a tuple of associated standards and compliance summaries.
Parameters:
- compliance (dict): A dictionary containing compliance data where keys are standards and values are lists of compliance details.
Returns:
- tuple[list[dict], list[str]]: A tuple containing a list of associated standards (each as a dictionary with 'StandardsId') and a list of compliance summaries.
Notes:
- The method limits the number of associated standards to 20.
- Each compliance summary is a concatenation of the standard key and its associated compliance details.
- If the concatenated summary exceeds 64 characters, it is truncated to 63 characters.
Example:
format_compliance({"standard1": ["detail1", "detail2"], "standard2": ["detail3"]}) -> ([{"StandardsId": "standard1"}, {"StandardsId": "standard2"}], ["standard1 detail1 detail2", "standard2 detail3"])
"""
compliance_summary = []
associated_standards = []
for key, value in compliance.items():
if (
len(associated_standards) < 20
): # AssociatedStandards should NOT have more than 20 items
associated_standards.append({"StandardsId": key})
item = f"{key} {' '.join(value)}"
if len(item) > 64:
item = item[0:63]
compliance_summary.append(item)
return associated_standards, compliance_summary
class ProductFields(BaseModel):
"""
Class representing the Product Fields of a finding in the AWS Security Finding Format.
Attributes:
- ProviderName (str): The name of the provider, default value is "Prowler".
- ProviderVersion (str): The version of the provider, fetched from the prowler_version in config.py.
- ProwlerResourceName (str): The name of the Prowler resource.
"""
ProviderName: str = "Prowler"
ProviderVersion: str = prowler_version
ProwlerResourceName: str
class Severity(BaseModel):
"""
Class representing the severity of a finding in the AWS Security Finding Format.
Attributes:
- Label (str): A string representing the severity label of the finding.
This class is used to define the severity level of a finding in the AWS Security Finding Format.
"""
Label: str
@validator("Label", pre=True, always=True)
def severity_uppercase(severity):
return severity.upper()
class Resource(BaseModel):
"""
Class representing a resource in the AWS Security Finding Format.
Attributes:
- Type (str): The type of the resource.
- Id (str): The unique identifier of the resource.
- Partition (str): The partition where the resource resides.
- Region (str): The region where the resource is located.
- Tags (Optional[dict]): Optional dictionary of tags associated with the resource.
This class defines the structure of a resource within the AWS Security Finding Format. It includes attributes to specify the type, unique identifier, partition, region, and optional tags of the resource.
"""
Type: str
Id: str
Partition: str
Region: str
Tags: Optional[dict]
@validator("Tags", pre=True, always=True)
def tags_cannot_be_empty_dict(tags):
if not tags:
return None
return tags
class Compliance(BaseModel):
"""
Class representing the compliance details of a finding in the AWS Security Finding Format.
Attributes:
- Status (str): The compliance status of the finding.
- RelatedRequirements (list[str]): A list of related compliance requirements for the finding.
- AssociatedStandards (list[dict]): A list of associated standards with the finding, where each item is a dictionary containing the 'StandardsId'.
This class defines the structure of compliance information within the AWS Security Finding Format. It includes attributes to specify the compliance status, related requirements, and associated standards of a finding.
"""
Status: str
RelatedRequirements: list[str]
AssociatedStandards: list[dict]
@validator("Status", pre=True, always=True)
def status(status):
if status not in ["PASSED", "WARNING", "FAILED", "NOT_AVAILABLE"]:
raise ValueError("must contain a space")
return status
class Recommendation(BaseModel):
"""
Class representing a recommendation for remediation in the AWS Security Finding Format.
Attributes:
- Text (str): The text description of the recommendation.
- Url (str): The URL link for additional information related to the recommendation.
This class defines the structure of a recommendation within the AWS Security Finding Format. It includes attributes to specify the text description and URL link for further details regarding the recommendation.
"""
Text: str = ""
Url: str = ""
@validator("Text", pre=True, always=True)
def text_must_not_exceed_512_chars(text):
text_validated = text
if len(text) > 512:
text_validated = text[:509] + "..."
return text_validated
@validator("Url", pre=True, always=True)
def set_default_url_if_empty(url):
default_url = "https://docs.aws.amazon.com/securityhub/latest/userguide/what-is-securityhub.html"
if url:
default_url = url
return default_url
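    # Editor's illustration of the two validators combined:
    #   Recommendation(Text="x" * 600).Text  -> 509 chars + "..." (512 total)
    #   Recommendation(Text="Fix it").Url    -> defaults to the Security Hub
    #                                           documentation URL above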
class Remediation(BaseModel):
"""
Class representing a remediation action in the AWS Security Finding Format.
Attributes:
- Recommendation (Recommendation): An instance of the Recommendation class providing details for remediation.
This class defines the structure of a remediation action within the AWS Security Finding Format. It includes an attribute to specify the recommendation for remediation, which is an instance of the Recommendation class.
"""
Recommendation: Recommendation
class AWSSecurityFindingFormat(BaseModel):
"""
AWSSecurityFindingFormat generates a finding's output in JSON ASFF format: https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format-syntax.html
Attributes:
- SchemaVersion (str): The version of the ASFF schema being used, default value is "2018-10-08".
- Id (str): The unique identifier of the finding.
- ProductArn (str): The ARN of the product generating the finding.
- RecordState (str): The state of the finding record, default value is "ACTIVE".
- ProductFields (ProductFields): An instance of the ProductFields class representing the product fields of the finding.
- GeneratorId (str): The ID of the generator.
- AwsAccountId (str): The AWS account ID associated with the finding.
- Types (list[str]): A list of types associated with the finding, default value is None.
- FirstObservedAt (str): The timestamp when the finding was first observed.
- UpdatedAt (str): The timestamp when the finding was last updated.
- CreatedAt (str): The timestamp when the finding was created.
- Severity (Severity): An instance of the Severity class representing the severity of the finding.
- Title (str): The title of the finding.
- Description (str): The description of the finding, truncated to 1024 characters if longer.
- Resources (list[Resource]): A list of resources associated with the finding, default value is None.
- Compliance (Compliance): An instance of the Compliance class representing the compliance details of the finding.
- Remediation (Remediation): An instance of the Remediation class providing details for remediation.
This class defines the structure of a finding in the AWS Security Finding Format, including various attributes such as schema version, identifiers, timestamps, severity, title, description, resources, compliance details, and remediation information.
"""
SchemaVersion: str = "2018-10-08"
Id: str
ProductArn: str
RecordState: str = "ACTIVE"
ProductFields: ProductFields
GeneratorId: str
AwsAccountId: str
Types: list[str] = None
FirstObservedAt: str
UpdatedAt: str
CreatedAt: str
Severity: Severity
Title: str
Description: str
Resources: list[Resource] = None
Compliance: Compliance
Remediation: Remediation
@validator("Description", pre=True, always=True)
def description_must_not_exceed_1024_chars(description):
description_validated = description
if len(description) > 1024:
description_validated = description[:1021] + "..."
return description_validated
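
To make the composition of the nested models concrete, a hedged construction sketch using the classes defined above (all field values hypothetical; the exclude_none serialization mirrors what batch_write_data_to_file emits):

finding = AWSSecurityFindingFormat(
    Id="prowler-example_check-123456789012-eu-west-1-abc123",
    ProductArn="arn:aws:securityhub:eu-west-1::product/prowler/prowler",
    ProductFields=ProductFields(ProwlerResourceName="arn:aws:s3:::my-bucket"),
    GeneratorId="prowler-example_check",
    AwsAccountId="123456789012",
    Types=["Software and Configuration Checks"],
    FirstObservedAt="2024-09-30T00:00:00Z",
    UpdatedAt="2024-09-30T00:00:00Z",
    CreatedAt="2024-09-30T00:00:00Z",
    Severity=Severity(Label="high"),  # validator uppercases to "HIGH"
    Title="Example check title",
    Description="Example status extended",
    Resources=[
        Resource(
            Id="arn:aws:s3:::my-bucket",
            Type="AwsS3Bucket",
            Partition="aws",
            Region="eu-west-1",
            Tags=None,  # validator turns empty tags into None
        )
    ],
    Compliance=Compliance(
        Status="FAILED", RelatedRequirements=[], AssociatedStandards=[]
    ),
    Remediation=Remediation(
        Recommendation=Recommendation(Text="Block public access")  # Url defaults
    ),
)
print(finding.json(exclude_none=True, indent=4))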

View File

@@ -2,7 +2,6 @@ from operator import attrgetter
from prowler.config.config import timestamp
from prowler.lib.logger import logger
from prowler.lib.outputs.common_models import FindingOutput
from prowler.lib.outputs.utils import unroll_list, unroll_tags
from prowler.lib.utils.utils import outputs_unix_timestamp
@@ -22,87 +21,6 @@ def get_provider_data_mapping(provider) -> dict:
return data
def generate_provider_output(provider, finding, csv_data) -> FindingOutput:
"""
generate_provider_output returns the provider's Finding output model
"""
# TODO: we have to standardize this between the above mapping and the provider.get_output_mapping()
try:
if provider.type == "aws":
# TODO: probably Organization UID is without the account id
csv_data["auth_method"] = f"profile: {csv_data['auth_method']}"
csv_data["resource_name"] = finding.resource_id
csv_data["resource_uid"] = finding.resource_arn
csv_data["region"] = finding.region
elif provider.type == "azure":
# TODO: we should show the authentication method used
csv_data["auth_method"] = (
f"{provider.identity.identity_type}: {provider.identity.identity_id}"
)
# Get the first tenant domain ID, just in case
csv_data["account_organization_uid"] = csv_data["account_organization_uid"][
0
]
csv_data["account_uid"] = (
csv_data["account_organization_uid"]
if "Tenant:" in finding.subscription
else provider.identity.subscriptions[finding.subscription]
)
csv_data["account_name"] = finding.subscription
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["region"] = finding.location
elif provider.type == "gcp":
csv_data["auth_method"] = f"Principal: {csv_data['auth_method']}"
csv_data["account_uid"] = provider.projects[finding.project_id].id
csv_data["account_name"] = provider.projects[finding.project_id].name
csv_data["account_tags"] = provider.projects[finding.project_id].labels
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["region"] = finding.location
if (
provider.projects
and finding.project_id in provider.projects
and getattr(provider.projects[finding.project_id], "organization")
):
csv_data["account_organization_uid"] = provider.projects[
finding.project_id
].organization.id
# TODO: for now this is None since we don't retrieve that data
csv_data["account_organization"] = provider.projects[
finding.project_id
].organization.display_name
elif provider.type == "kubernetes":
if provider.identity.context == "In-Cluster":
csv_data["auth_method"] = "in-cluster"
else:
csv_data["auth_method"] = "kubeconfig"
csv_data["resource_name"] = finding.resource_name
csv_data["resource_uid"] = finding.resource_id
csv_data["account_name"] = f"context: {provider.identity.context}"
csv_data["region"] = f"namespace: {finding.namespace}"
# Finding Unique ID
# TODO: move this to a function
# TODO: in Azure, GCP and K8s there are findings without resource_name
csv_data["finding_uid"] = (
f"prowler-{provider.type}-{finding.check_metadata.CheckID}-{csv_data['account_uid']}-{csv_data['region']}-{csv_data['resource_name']}"
)
finding_output = FindingOutput(**csv_data)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
return finding_output
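# Editor's illustration (hypothetical values) of the composed finding_uid:
#   "prowler-aws-s3_bucket_public_access-123456789012-eu-west-1-my-bucket"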
# TODO: add test for outputs_unix_timestamp
def fill_common_finding_data(finding: dict, unix_timestamp: bool) -> dict:
finding_data = {

View File

@@ -1,77 +0,0 @@
from datetime import datetime
from enum import Enum
from typing import Optional, Union
from pydantic import BaseModel
from prowler.config.config import prowler_version
class Status(str, Enum):
PASS = "PASS"
FAIL = "FAIL"
MANUAL = "MANUAL"
class Severity(str, Enum):
critical = "critical"
high = "high"
medium = "medium"
low = "low"
informational = "informational"
class FindingOutput(BaseModel):
"""
FindingOutput generates a finding's output. It can be written to CSV or another format doing the mapping.
This is the base finding output model for every provider.
"""
auth_method: str
timestamp: Union[int, datetime]
account_uid: str
# Optional since depends on permissions
account_name: Optional[str]
# Optional since depends on permissions
account_email: Optional[str]
# Optional since depends on permissions
account_organization_uid: Optional[str]
# Optional since depends on permissions
account_organization_name: Optional[str]
# Optional since depends on permissions
account_tags: Optional[list[str]]
finding_uid: str
provider: str
check_id: str
check_title: str
check_type: str
status: Status
status_extended: str
muted: bool = False
service_name: str
subservice_name: str
severity: Severity
resource_type: str
resource_uid: str
resource_name: str
resource_details: str
resource_tags: str
# Only present for AWS and Azure
partition: Optional[str]
region: str
description: str
risk: str
related_url: str
remediation_recommendation_text: str
remediation_recommendation_url: str
remediation_code_nativeiac: str
remediation_code_terraform: str
remediation_code_cli: str
remediation_code_other: str
compliance: dict
categories: str
depends_on: str
related_to: str
notes: str
prowler_version: str = prowler_version

View File

@@ -0,0 +1,97 @@
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.aws_well_architected.models import (
AWSWellArchitectedModel,
)
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class AWSWellArchitected(ComplianceOutput):
"""
This class represents the AWS Well-Architected compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into AWS Well-Architected compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: Compliance,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into AWS Well-Architected compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (Compliance): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AWSWellArchitectedModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AWSWellArchitectedModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
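
Read as data flow, transform is a (finding, requirement, attribute) join keyed on the requirement Id, followed by a second pass that emits one MANUAL row per attribute for requirements no check covers. A condensed editor's sketch, with row construction elided and names as in the method above:

rows = []
for finding in findings:
    matched = finding.compliance.get(compliance_name, [])
    for requirement in compliance.Requirements:
        if requirement.Id in matched:
            for attribute in requirement.Attributes:
                rows.append((finding, requirement, attribute))
# Second pass: requirements with no Checks become one MANUAL row per attribute.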

View File

@@ -0,0 +1,32 @@
from typing import Optional
from pydantic import BaseModel
class AWSWellArchitectedModel(BaseModel):
"""
AWSWellArchitectedModel generates a finding's output in AWS Well-Architected Framework format.
"""
Provider: str
Description: str
AccountId: str
Region: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Name: str
Requirements_Attributes_WellArchitectedQuestionId: str
Requirements_Attributes_WellArchitectedPracticeId: str
Requirements_Attributes_Section: str
Requirements_Attributes_SubSection: Optional[str]
Requirements_Attributes_LevelOfRisk: str
Requirements_Attributes_AssessmentMethod: str
Requirements_Attributes_Description: str
Requirements_Attributes_ImplementationGuidanceUrl: str
Status: str
StatusExtended: str
ResourceId: str
CheckId: str
Muted: bool
ResourceName: str

View File

@@ -1,60 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.models import Check_Output_CSV_AWS_Well_Architected
from prowler.lib.outputs.csv.csv import generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_aws_well_architected_framework(
file_descriptors, finding, compliance, output_options, provider
):
try:
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_AWS_Well_Architected)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_AWS_Well_Architected(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=provider.identity.account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
Muted=finding.muted,
)
csv_writer.writerow(compliance_row.__dict__)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)

View File

@@ -2,75 +2,6 @@ from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import orange_color
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.cis_aws import generate_compliance_row_cis_aws
from prowler.lib.outputs.compliance.cis_azure import generate_compliance_row_cis_azure
from prowler.lib.outputs.compliance.cis_gcp import generate_compliance_row_cis_gcp
from prowler.lib.outputs.compliance.cis_kubernetes import (
generate_compliance_row_cis_kubernetes,
)
from prowler.lib.outputs.csv.csv import write_csv
def write_compliance_row_cis(
file_descriptors,
finding,
compliance,
output_options,
provider,
input_compliance_frameworks,
):
try:
compliance_output = (
"cis_" + compliance.Version + "_" + compliance.Provider.lower()
)
# Only with the version of CIS that was selected
if compliance_output in str(input_compliance_frameworks):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
if compliance.Provider == "AWS":
(compliance_row, csv_header) = generate_compliance_row_cis_aws(
finding,
compliance,
requirement,
attribute,
output_options,
provider,
)
elif compliance.Provider == "Azure":
(compliance_row, csv_header) = (
generate_compliance_row_cis_azure(
finding,
compliance,
requirement,
attribute,
output_options,
)
)
elif compliance.Provider == "GCP":
(compliance_row, csv_header) = generate_compliance_row_cis_gcp(
finding, compliance, requirement, attribute, output_options
)
elif compliance.Provider == "Kubernetes":
(compliance_row, csv_header) = (
generate_compliance_row_cis_kubernetes(
finding,
compliance,
requirement,
attribute,
output_options,
provider,
)
)
write_csv(
file_descriptors[compliance_output], csv_header, compliance_row
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def get_cis_table(

View File

@@ -0,0 +1,97 @@
from prowler.lib.check.compliance_models import Compliance
from prowler.lib.outputs.compliance.cis.models import AWSCISModel
from prowler.lib.outputs.compliance.compliance_output import ComplianceOutput
from prowler.lib.outputs.finding import Finding
class AWSCIS(ComplianceOutput):
"""
This class represents the AWS CIS compliance output.
Attributes:
- _data (list): A list to store transformed data from findings.
- _file_descriptor (TextIOWrapper): A file descriptor to write data to a file.
Methods:
- transform: Transforms findings into AWS CIS compliance format.
"""
def transform(
self,
findings: list[Finding],
compliance: Compliance,
compliance_name: str,
) -> None:
"""
Transforms a list of findings into AWS CIS compliance format.
Parameters:
- findings (list): A list of findings.
- compliance (Compliance): A compliance model.
- compliance_name (str): The name of the compliance model.
Returns:
- None
"""
for finding in findings:
# Get the compliance requirements for the finding
finding_requirements = finding.compliance.get(compliance_name, [])
for requirement in compliance.Requirements:
if requirement.Id in finding_requirements:
for attribute in requirement.Attributes:
compliance_row = AWSCISModel(
Provider=finding.provider,
Description=compliance.Description,
AccountId=finding.account_uid,
Region=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_uid,
ResourceName=finding.resource_name,
CheckId=finding.check_id,
Muted=finding.muted,
)
self._data.append(compliance_row)
# Add manual requirements to the compliance output
for requirement in compliance.Requirements:
if not requirement.Checks:
for attribute in requirement.Attributes:
compliance_row = AWSCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
AccountId="",
Region="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status="MANUAL",
StatusExtended="Manual check",
ResourceId="manual_check",
ResourceName="Manual check",
CheckId="manual",
Muted=False,
)
self._data.append(compliance_row)
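
AWSCIS mirrors the two-pass shape of AWSWellArchitected above. A runnable stub (editor's illustration with simplified types and a hypothetical check name) of the selection logic deciding which requirements yield matched rows and which become MANUAL rows:

class Req:
    # Stand-in for a compliance requirement: an Id plus its mapped checks.
    def __init__(self, Id, Checks):
        self.Id, self.Checks = Id, Checks

requirements = [Req("1.1", ["iam_root_mfa_enabled"]), Req("1.2", [])]
finding_requirements = ["1.1"]  # what finding.compliance.get(compliance_name, []) returns

matched = [r.Id for r in requirements if r.Id in finding_requirements]
manual = [r.Id for r in requirements if not r.Checks]
assert matched == ["1.1"] and manual == ["1.2"]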

Some files were not shown because too many files have changed in this diff.