Compare commits


1 Commit

Author: github-actions
SHA1: 3aaf2b2c27
Message: chore(release): 3.11.1
Date: 2023-11-10 11:29:23 +00:00
718 changed files with 17086 additions and 17867 deletions

View File

@@ -13,10 +13,10 @@ name: "CodeQL"
on:
push:
branches: [ "master", "prowler-4.0-dev" ]
branches: [ "master", prowler-2, prowler-3.0-dev ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "master", "prowler-4.0-dev" ]
branches: [ "master" ]
schedule:
- cron: '00 12 * * *'

View File

@@ -4,11 +4,9 @@ on:
push:
branches:
- "master"
- "prowler-4.0-dev"
pull_request:
branches:
- "master"
- "prowler-4.0-dev"
jobs:
build:
runs-on: ubuntu-latest
@@ -28,7 +26,6 @@ jobs:
README.md
docs/**
permissions/**
mkdocs.yml
- name: Install poetry
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
run: |

View File

@@ -102,7 +102,7 @@ All the checks MUST fill the `report.status` and `report.status_extended` with t
- Status -- `report.status`
- `PASS` --> If the check is passing against the configured value.
- `FAIL` --> If the check is not passing against the configured value.
- `MANUAL` --> This value can only be used when a manual operation is required to determine whether the `report.status` is `PASS` or `FAIL`.
- `INFO` --> This value can only be used when a manual operation is required to determine whether the `report.status` is `PASS` or `FAIL`.
- Status Extended -- `report.status_extended`
- MUST end in a dot `.`
- MUST include the service audited with the resource and a brief explanation of the result generated, e.g.: `EC2 AMI ami-0123456789 is not public.`

View File

@@ -136,16 +136,26 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
=== "AWS CloudShell"
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it is already included in AL2023. Prowler can thus be easily installed following the Generic method of installation via pip. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
Prowler can be easily executed in AWS CloudShell but it has some prerequisites to do so. AWS CloudShell is a container running `Amazon Linux release 2 (Karoo)` that comes with Python 3.7; since Prowler requires Python >= 3.9, we first need to install a newer version of Python. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
_Requirements_:
* Open AWS CloudShell `bash`.
* First install all dependencies and then Python; in this case we need to compile it because no package is available at the time of writing:
```
sudo yum -y install gcc openssl-devel bzip2-devel libffi-devel
wget https://www.python.org/ftp/python/3.9.16/Python-3.9.16.tgz
tar zxf Python-3.9.16.tgz
cd Python-3.9.16/
./configure --enable-optimizations
sudo make altinstall
python3.9 --version
cd
```
_Commands_:
* Once Python 3.9 is available we can install Prowler from pip:
```
pip install prowler
pip3.9 install prowler
prowler -v
```

View File

@@ -1,19 +1,19 @@
# Mute Listing
# Allowlisting
Sometimes you may find resources that are intentionally configured in a way that would normally be a bad practice but is acceptable in your case, for example an AWS S3 bucket open to the internet hosting a website, or an AWS Security Group with an open port that your use case requires.
Mute List option works along with other options and adds a `MUTED` instead of `MANUAL`, `PASS` or `FAIL` to any output format.
Allowlist option works along with other options and adds a `WARNING` instead of `INFO`, `PASS` or `FAIL` to any output format.
You can use `-w`/`--mutelist-file` with the path of your mutelist yaml file, but first, let's review the syntax.
You can use `-w`/`--allowlist-file` with the path of your allowlist yaml file, but first, let's review the syntax.
## Mute List Yaml File Syntax
## Allowlist Yaml File Syntax
### Account, Check and/or Region can be * to apply for all the cases.
### Resources and tags are lists that can have either Regex or Keywords.
### Tags is an optional list that matches on tuples of 'key=value' and are "ANDed" together.
### Use an alternation Regex to match one of multiple tags with "ORed" logic.
### For each check you can except Accounts, Regions, Resources and/or Tags.
########################### MUTE LIST EXAMPLE ###########################
Mute List:
########################### ALLOWLIST EXAMPLE ###########################
Allowlist:
Accounts:
"123456789012":
Checks:
@@ -79,10 +79,10 @@ You can use `-w`/`--mutelist-file` with the path of your mutelist yaml file, but
Tags:
- "environment=prod" # Will ignore every resource in account 123456789012 except the ones containing the string "test" with the tag environment=prod
## Mute specific regions
If you want to mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w mutelist.yaml`:
## Allowlist specific regions
If you want to allowlist/mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w allowlist.yaml`:
Mute List:
Allowlist:
Accounts:
"*":
Checks:
@@ -93,50 +93,50 @@ If you want to mute failed findings only in specific regions, create a file with
Resources:
- "*"
## Default AWS Mute List
Prowler provides you with a Default AWS Mute List containing the AWS resources that should be muted, such as all resources created by AWS Control Tower when setting up a landing zone.
You can execute Prowler with this mutelist using the following command:
## Default AWS Allowlist
Prowler provides you with a Default AWS Allowlist containing the AWS resources that should be allowlisted, such as all resources created by AWS Control Tower when setting up a landing zone.
You can execute Prowler with this allowlist using the following command:
```sh
prowler aws --mutelist prowler/config/aws_mutelist.yaml
prowler aws --allowlist prowler/config/aws_allowlist.yaml
```
## Supported Mute List Locations
## Supported Allowlist Locations
The mutelisting flag supports the following locations:
The allowlisting flag supports the following locations:
### Local file
You will need to pass the local path where your Mute List YAML file is located:
You will need to pass the local path where your Allowlist YAML file is located:
```
prowler <provider> -w mutelist.yaml
prowler <provider> -w allowlist.yaml
```
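For instance, a minimal allowlist file can be written inline and then passed to Prowler. This is a sketch: the check name and the all-accounts/all-regions scope below are illustrative, not a recommendation.

```shell
# Write a minimal allowlist that mutes one check everywhere
# (check name and scope are illustrative).
cat > allowlist.yaml <<'EOF'
Allowlist:
  Accounts:
    "*":
      Checks:
        "s3_bucket_object_versioning":
          Regions:
            - "*"
          Resources:
            - "*"
EOF

# Then point Prowler at it (requires valid provider credentials):
# prowler aws -w allowlist.yaml
```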
### AWS S3 URI
You will need to pass the S3 URI where your Mute List YAML file was uploaded to your bucket:
You will need to pass the S3 URI where your Allowlist YAML file was uploaded to your bucket:
```
prowler aws -w s3://<bucket>/<prefix>/mutelist.yaml
prowler aws -w s3://<bucket>/<prefix>/allowlist.yaml
```
> Make sure that the used AWS credentials have `s3:GetObject` permissions on the S3 path where the mutelist file is located.
> Make sure that the used AWS credentials have `s3:GetObject` permissions on the S3 path where the allowlist file is located.
### AWS DynamoDB Table ARN
You will need to pass the DynamoDB Mute List Table ARN:
You will need to pass the DynamoDB Allowlist Table ARN:
```
prowler aws -w arn:aws:dynamodb:<region_name>:<account_id>:table/<table_name>
```
1. The DynamoDB Table must have the following String keys:
<img src="../img/mutelist-keys.png"/>
<img src="../img/allowlist-keys.png"/>
- The Mute List Table must have the following columns:
- Accounts (String): This field can contain either an Account ID or an `*` (which applies to all the accounts that use this table as a mutelist).
- The Allowlist Table must have the following columns:
- Accounts (String): This field can contain either an Account ID or an `*` (which applies to all the accounts that use this table as an allowlist).
- Checks (String): This field can contain either a Prowler Check Name or an `*` (which applies to all the scanned checks).
- Regions (List): This field contains a list of regions where this mutelist rule is applied (it can also contain an `*` to apply to all scanned regions).
- Resources (List): This field contains a list of regex expressions that apply to the resources you want to mute.
- Tags (List): -Optional- This field contains a list of tuples in the form of 'key=value' that apply to the tags of the resources you want to mute.
- Exceptions (Map): -Optional- This field contains a map of lists of accounts/regions/resources/tags to be excepted in the mutelist.
- Regions (List): This field contains a list of regions where this allowlist rule is applied (it can also contain an `*` to apply to all scanned regions).
- Resources (List): This field contains a list of regex expressions that apply to the resources you want to allowlist.
- Tags (List): -Optional- This field contains a list of tuples in the form of 'key=value' that apply to the tags of the resources you want to allowlist.
- Exceptions (Map): -Optional- This field contains a map of lists of accounts/regions/resources/tags to be excepted in the allowlist.
The following example will mute all resources in all accounts for the EC2 checks in the regions `eu-west-1` and `us-east-1` with the tags `environment=dev` and `environment=prod`, except the resources containing the string `test` in the account `012345678912` and region `eu-west-1` with the tag `environment=prod`:
The following example will allowlist all resources in all accounts for the EC2 checks in the regions `eu-west-1` and `us-east-1` with the tags `environment=dev` and `environment=prod`, except the resources containing the string `test` in the account `012345678912` and region `eu-west-1` with the tag `environment=prod`:
<img src="../img/mutelist-row.png"/>
<img src="../img/allowlist-row.png"/>
> Make sure that the used AWS credentials have `dynamodb:PartiQLSelect` permissions in the table.
@@ -151,7 +151,7 @@ prowler aws -w arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME
Make sure that the credentials that Prowler uses can invoke the Lambda Function:
```
- PolicyName: GetMuteList
- PolicyName: GetAllowList
PolicyDocument:
Version: '2012-10-17'
Statement:
@@ -160,14 +160,14 @@ Make sure that the credentials that Prowler uses can invoke the Lambda Function:
Resource: arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME
```
The Lambda Function can then generate a Mute List dynamically. Here is the code of an example Python Lambda Function that generates a Mute List:
The Lambda Function can then generate an Allowlist dynamically. Here is the code of an example Python Lambda Function that generates an Allowlist:
```
def handler(event, context):
checks = {}
checks["vpc_flow_logs_enabled"] = { "Regions": [ "*" ], "Resources": [ "" ], "Tags": [ "key:value" ] }  # "Tags" is optional
al = { "Mute List": { "Accounts": { "*": { "Checks": checks } } } }
al = { "Allowlist": { "Accounts": { "*": { "Checks": checks } } } }
return al
```
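The handler can be exercised locally before wiring it into Lambda. A sketch (the file name `lambda_allowlist.py` is illustrative, and the schema-style `Optional("Tags")` key from the example is written here as a plain `"Tags"` key):

```shell
# Save a runnable copy of the example handler (file name is illustrative).
cat > lambda_allowlist.py <<'EOF'
def handler(event, context):
    checks = {}
    # "Tags" is an optional key in the allowlist schema
    checks["vpc_flow_logs_enabled"] = {"Regions": ["*"], "Resources": [""], "Tags": ["key:value"]}
    return {"Allowlist": {"Accounts": {"*": {"Checks": checks}}}}
EOF

# Invoke it locally to inspect the allowlist it would return.
python3 -c 'import lambda_allowlist; print(lambda_allowlist.handler(None, None))'
```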

View File

@@ -37,3 +37,7 @@ If your IAM entity enforces MFA you can use `--mfa` and Prowler will ask you to
- ARN of your MFA device
- TOTP (Time-Based One-Time Password)
## STS Endpoint Region
If you are using Prowler in AWS regions that are not enabled by default you need to use the argument `--sts-endpoint-region` to point the AWS STS API calls `assume-role` and `get-caller-identity` to the non-default region, e.g.: `prowler aws --sts-endpoint-region eu-south-2`.

View File

@@ -32,14 +32,3 @@ Prowler's AWS Provider uses the Boto3 [Standard](https://boto3.amazonaws.com/v1/
- Retry attempts on nondescriptive, transient error codes. Specifically, these HTTP status codes: 500, 502, 503, 504.
- Any retry attempt will include an exponential backoff by a base factor of 2 for a maximum backoff time of 20 seconds.
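As a rough sketch of that schedule (the jitter that boto3 applies on top is not reproduced here), the capped exponential delays look like:

```shell
# Illustrative only: exponential backoff with a base factor of 2,
# capped at 20 seconds, for the first six retry attempts.
for attempt in 1 2 3 4 5 6; do
  delay=$(( 2 ** attempt ))
  if [ "$delay" -gt 20 ]; then delay=20; fi
  echo "attempt ${attempt}: backoff up to ${delay}s"
done
```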
## Notes for validating retry attempts
If you are making changes to Prowler and want to validate whether requests are being retried or given up on, you can take the following approach:
* Run prowler with `--log-level DEBUG` and `--log-file debuglogs.txt`
* Search for retry attempts using `grep -i 'Retry needed' debuglogs.txt`
This is based on the [AWS documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html#checking-retry-attempts-in-your-client-logs), which states that if a retry is performed, you will see a message starting with "Retry needed".
You can determine the total number of calls made using `grep -i 'Sending http request' debuglogs.txt | wc -l`
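On a synthetic log, both greps look like this (a sketch; real `--log-file` output is far noisier and the exact line format below is an assumption):

```shell
# Build a tiny synthetic debug log to demonstrate both greps.
cat > debuglogs.txt <<'EOF'
botocore.hooks [DEBUG] Sending http request ...
botocore.retries.standard [DEBUG] Retry needed, retrying request after delay of: 1.3
botocore.hooks [DEBUG] Sending http request ...
EOF

grep -ci 'Retry needed' debuglogs.txt          # number of retry attempts
grep -ci 'Sending http request' debuglogs.txt  # total HTTP calls made
```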

View File

@@ -1,26 +1,26 @@
# AWS CloudShell
## Installation
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it's already included in AL2023. Prowler can thus be easily installed following the Generic method of installation via pip. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
```shell
pip install prowler
```
Prowler can be easily executed in AWS CloudShell but it has some prerequisites to do so. AWS CloudShell is a container running `Amazon Linux release 2 (Karoo)` that comes with Python 3.7; since Prowler requires Python >= 3.9, we first need to install a newer version of Python. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
- First install all dependencies and then Python; in this case we need to compile it because no package is available at the time of writing:
```
sudo yum -y install gcc openssl-devel bzip2-devel libffi-devel
wget https://www.python.org/ftp/python/3.9.16/Python-3.9.16.tgz
tar zxf Python-3.9.16.tgz
cd Python-3.9.16/
./configure --enable-optimizations
sudo make altinstall
python3.9 --version
cd
```
- Once Python 3.9 is available we can install Prowler from pip:
```
pip3.9 install prowler
```
- Now enjoy Prowler:
```
prowler -v
prowler
```
## Download Files
To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/home/cloudshell-user/output/prowler-output-123456789012-20221220191331.csv`.
## Clone Prowler from Github
The limited storage that AWS CloudShell provides for the user's home directory causes issues when installing the poetry dependencies to run Prowler from GitHub. Here is a workaround:
```shell
git clone https://github.com/prowler-cloud/prowler.git
cd prowler
pip install poetry
mkdir /tmp/pypoetry
poetry config cache-dir /tmp/pypoetry
poetry shell
poetry install
python prowler.py -v
```
- To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/home/cloudshell-user/output/prowler-output-123456789012-20221220191331.csv`

View File

@@ -23,6 +23,14 @@ prowler aws -R arn:aws:iam::<account_id>:role/<role_name>
prowler aws -T/--session-duration <seconds> -I/--external-id <external_id> -R arn:aws:iam::<account_id>:role/<role_name>
```
## STS Endpoint Region
If you are using Prowler in AWS regions that are not enabled by default you need to use the argument `--sts-endpoint-region` to point the AWS STS API calls `assume-role` and `get-caller-identity` to the non-default region, e.g.: `prowler aws --sts-endpoint-region eu-south-2`.
> Since v3.11.0, Prowler uses a regional token in STS sessions so it can scan all AWS regions without needing the `--sts-endpoint-region` argument.
> Make sure that you have enabled the AWS Region you want to scan in BOTH AWS Accounts (assumed role account and account from which you assume the role).
## Role MFA
If your IAM Role has MFA configured you can use `--mfa` along with `-R`/`--role <role_arn>` and Prowler will ask you to input the following values to get a new temporary session for the IAM Role provided:

View File

@@ -1,16 +0,0 @@
# Use non default Azure regions
Microsoft provides clouds for compliance with regional laws, which are available for your use.
By default, Prowler uses the `AzureCloud` cloud, which is the commercial one (you can list all the available clouds with `az cloud list --output table`).
At the time of writing this documentation the available Azure Clouds from different regions are the following:
- AzureCloud
- AzureChinaCloud
- AzureUSGovernment
- AzureGermanCloud
If you want to change the default one, you must include the flag `--azure-region`, e.g.:
```console
prowler azure --az-cli-auth --azure-region AzureChinaCloud
```

View File

@@ -1,18 +1,5 @@
# Compliance
Prowler allows you to execute checks based on requirements defined in compliance frameworks. By default, it will execute and give you an overview of the status of each compliance framework:
<img src="../img/compliance.png"/>
> You can find CSVs containing detailed compliance results inside the compliance folder within Prowler's output folder.
## Execute Prowler based on Compliance Frameworks
Prowler can analyze your environment based on a specific compliance framework and give you more details; to do so, use the option `--compliance`:
```sh
prowler <provider> --compliance <compliance_framework>
```
Standard results will be shown, along with the framework information, as in the sample below for CIS AWS 1.5. For details, a CSV file is generated as well.
<img src="../img/compliance-cis-sample1.png"/>
Prowler allows you to execute checks based on requirements defined in compliance frameworks.
## List Available Compliance Frameworks
In order to see which compliance frameworks are covered by Prowler, you can use the option `--list-compliance`:
@@ -23,12 +10,9 @@ Currently, the available frameworks are:
- `cis_1.4_aws`
- `cis_1.5_aws`
- `cis_2.0_aws`
- `cisa_aws`
- `ens_rd2022_aws`
- `aws_audit_manager_control_tower_guardrails_aws`
- `aws_foundational_security_best_practices_aws`
- `aws_well_architected_framework_reliability_pillar_aws`
- `aws_well_architected_framework_security_pillar_aws`
- `cisa_aws`
- `fedramp_low_revision_4_aws`
@@ -38,9 +22,6 @@ Currently, the available frameworks are:
- `gxp_eu_annex_11_aws`
- `gxp_21_cfr_part_11_aws`
- `hipaa_aws`
- `iso27001_2013_aws`
- `iso27001_2013_aws`
- `mitre_attack_aws`
- `nist_800_53_revision_4_aws`
- `nist_800_53_revision_5_aws`
- `nist_800_171_revision_2_aws`
@@ -57,6 +38,7 @@ prowler <provider> --list-compliance-requirements <compliance_framework(s)>
```
Example for the first requirements of CIS 1.5 for AWS:
```
Listing CIS 1.5 AWS Compliance Requirements:
@@ -89,6 +71,15 @@ Requirement Id: 1.5
```
## Execute Prowler based on Compliance Frameworks
As mentioned, Prowler can be executed to analyze your environment based on a specific compliance framework; to do so, use the option `--compliance`:
```sh
prowler <provider> --compliance <compliance_framework>
```
Standard results will be shown, along with the framework information, as in the sample below for CIS AWS 1.5. For details, a CSV file is generated as well.
<img src="../img/compliance-cis-sample1.png"/>
## Create and contribute adding other Security Frameworks
This information is part of the Developer Guide and can be found here: https://docs.prowler.cloud/en/latest/tutorials/developer-guide/.

View File

@@ -29,10 +29,10 @@ The following list includes all the AWS checks with configurable variables that
| `organizations_delegated_administrators` | `organizations_trusted_delegated_administrators` | List of Strings |
| `ecr_repositories_scan_vulnerabilities_in_latest_image` | `ecr_repository_vulnerability_minimum_severity` | String |
| `trustedadvisor_premium_support_plan_subscribed` | `verify_premium_support_plans` | Boolean |
| `config_recorder_all_regions_enabled` | `mute_non_default_regions` | Boolean |
| `drs_job_exist` | `mute_non_default_regions` | Boolean |
| `guardduty_is_enabled` | `mute_non_default_regions` | Boolean |
| `securityhub_enabled` | `mute_non_default_regions` | Boolean |
| `config_recorder_all_regions_enabled` | `allowlist_non_default_regions` | Boolean |
| `drs_job_exist` | `allowlist_non_default_regions` | Boolean |
| `guardduty_is_enabled` | `allowlist_non_default_regions` | Boolean |
| `securityhub_enabled` | `allowlist_non_default_regions` | Boolean |
## Azure
@@ -50,8 +50,8 @@ The following list includes all the AWS checks with configurable variables that
aws:
# AWS Global Configuration
# aws.mute_non_default_regions --> Mute Failed Findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
mute_non_default_regions: False
# aws.allowlist_non_default_regions --> Allowlist Failed Findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
allowlist_non_default_regions: False
# AWS IAM Configuration
# aws.iam_user_accesskey_unused --> CIS recommends 45 days

View File

@@ -1,43 +0,0 @@
# Custom Checks Metadata
In certain organizations, the severity of specific checks might differ from the default values defined in the check's metadata. For instance, while `s3_bucket_level_public_access_block` could be deemed `critical` for some organizations, others might assign a different severity level.
The custom metadata option offers a means to override the default metadata set by Prowler.
You can utilize `--custom-checks-metadata-file` followed by the path to your custom checks metadata YAML file.
## Available Fields
The check metadata fields that can be overridden are the following:
- Severity
## File Syntax
This feature is available for all the providers supported in Prowler since the metadata format is common between all the providers. The following is the YAML format for the custom checks metadata file:
```yaml title="custom_checks_metadata.yaml"
CustomChecksMetadata:
aws:
Checks:
s3_bucket_level_public_access_block:
Severity: high
s3_bucket_no_mfa_delete:
Severity: high
azure:
Checks:
storage_infrastructure_encryption_is_enabled:
Severity: medium
gcp:
Checks:
compute_instance_public_ip:
Severity: critical
```
## Usage
Executing the following command will assess all checks and generate a report while overriding the metadata for those checks:
```sh
prowler <provider> --custom-checks-metadata-file <path/to/custom/metadata>
```
This customization feature enables organizations to tailor the severity of specific checks based on their unique requirements, providing greater flexibility in security assessment and reporting.
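Before running Prowler, a quick sanity check on the file can catch typos in severity values. This is a sketch, assuming the severity set is `critical`/`high`/`medium`/`low`/`informational`; the file contents mirror the example above:

```shell
# Write a small custom metadata file (mirrors the example above).
cat > custom_checks_metadata.yaml <<'EOF'
CustomChecksMetadata:
  aws:
    Checks:
      s3_bucket_level_public_access_block:
        Severity: high
EOF

# Flag any Severity value outside the assumed allowed set.
if grep 'Severity:' custom_checks_metadata.yaml | awk '{print $2}' \
    | grep -Evxq 'critical|high|medium|low|informational'; then
  echo "invalid severity found"
else
  echo "severities look valid"
fi
```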

View File

(Image files changed; binary previews not shown.)
View File

@@ -8,7 +8,7 @@ There are different log levels depending on the logging information that is desi
- **DEBUG**: It will show low-level logs from Python.
- **INFO**: It will show all the API calls that are being invoked by the provider.
- **WARNING**: It will show all resources that are being **muted**.
- **WARNING**: It will show all resources that are being **allowlisted**.
- **ERROR**: It will show any errors, e.g., not authorized actions.
- **CRITICAL**: The default log level. If a critical log appears, it will **exit** Prowler's execution.

View File

@@ -9,10 +9,10 @@ Execute Prowler in verbose mode (like in Version 2):
```console
prowler <provider> --verbose
```
## Filter findings by status
Prowler can filter the findings by their status:
## Show only Fails
Prowler can display only the failed findings:
```console
prowler <provider> --status [PASS, FAIL, MANUAL]
prowler <provider> -q/--quiet
```
## Disable Exit Code 3
Prowler does not trigger exit code 3 with failed checks:

View File

@@ -1,187 +0,0 @@
# Parallel Execution
The strategy used here will be to execute Prowler once per service. You can modify this approach as per your requirements.
This can help with really large accounts, but please be aware of AWS API rate limits:
1. **Service-Specific Limits**: Each AWS service has its own rate limits. For instance, Amazon EC2 might have different rate limits for launching instances versus making API calls to describe instances.
2. **API Rate Limits**: Most of the rate limits in AWS are applied at the API level. Each API call to an AWS service counts towards the rate limit for that service.
3. **Throttling Responses**: When you exceed the rate limit for a service, AWS responds with a throttling error. In AWS SDKs, these are typically represented as `ThrottlingException` or `RateLimitExceeded` errors.
For information on Prowler's retrier configuration please refer to this [page](https://docs.prowler.cloud/en/latest/tutorials/aws/boto3-configuration/).
> Note: You might need to increase the `--aws-retries-max-attempts` parameter from the default value of 3. The retrier follows an exponential backoff strategy.
## Linux
Generate a list of services that Prowler supports, and populate this info into a file:
```bash
prowler aws --list-services | awk -F"- " '{print $2}' | sed '/^$/d' > services
```
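The pipeline can be sanity-checked against a captured fragment of the listing. A sketch — the `- <service>` output format shown here is an assumption about what `--list-services` prints:

```shell
# Simulate a fragment of `prowler aws --list-services` output
# and run the same awk/sed pipeline over it.
printf -- '- accessanalyzer\n- acm\n\n- apigateway\n' > sample_listing
awk -F"- " '{print $2}' sample_listing | sed '/^$/d' > services
cat services
```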
Make any modifications for services you would like to skip scanning by modifying this file.
Then create a new Bash script file `parallel-prowler.sh` and add the following contents. Update the `profile` variable to the AWS CLI profile you want to run Prowler with.
```bash
#!/bin/bash
# Change these variables as needed
profile="your_profile"
account_id=$(aws sts get-caller-identity --profile "${profile}" --query 'Account' --output text)
echo "Executing in account: ${account_id}"
# Maximum number of concurrent processes
MAX_PROCESSES=5
# Loop through the services
while read service; do
echo "$(date '+%Y-%m-%d %H:%M:%S'): Starting job for service: ${service}"
# Run the command in the background
(prowler -p "$profile" -s "$service" -F "${account_id}-${service}" --ignore-unused-services --only-logs; echo "$(date '+%Y-%m-%d %H:%M:%S') - ${service} has completed") &
# Check if we have reached the maximum number of processes
while [ $(jobs -r | wc -l) -ge ${MAX_PROCESSES} ]; do
# Wait for a second before checking again
sleep 1
done
done < ./services
# Wait for all background processes to finish
wait
echo "All jobs completed"
```
Output will be stored in the `output/` folder that is in the same directory from which you executed the script.
## Windows
Generate a list of services that Prowler supports, and populate this info into a file:
```powershell
prowler aws --list-services | ForEach-Object {
# Capture lines that are likely service names
if ($_ -match '^\- \w+$') {
$_.Trim().Substring(2)
}
} | Where-Object {
# Filter out empty or null lines
$_ -ne $null -and $_ -ne ''
} | Set-Content -Path "services"
```
Make any modifications for services you would like to skip scanning by modifying this file.
Then create a new PowerShell script file `parallel-prowler.ps1` and add the following contents. Update the `$profile` variable to the AWS CLI profile you want to run Prowler with.
Change any parameters you would like when calling Prowler in the `Start-Job -ScriptBlock` section. Note that you need to keep the `--only-logs` parameter, otherwise an encoding issue occurs when trying to render the progress bar and Prowler won't execute successfully.
```powershell
$profile = "your_profile"  # NB: this shadows PowerShell's automatic $PROFILE variable
$account_id = Invoke-Expression -Command "aws sts get-caller-identity --profile $profile --query 'Account' --output text"
Write-Host "Executing Prowler in $account_id"
# Maximum number of concurrent jobs
$MAX_PROCESSES = 5
# Read services from a file
$services = Get-Content -Path "services"
# Array to keep track of started jobs
$jobs = @()
foreach ($service in $services) {
# Start the command as a job
$job = Start-Job -ScriptBlock {
prowler -p ${using:profile} -s ${using:service} -F "${using:account_id}-${using:service}" --ignore-unused-services --only-logs
$endTimestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
Write-Output "${endTimestamp} - $using:service has completed"
}
$jobs += $job
Write-Host "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - Starting job for service: $service"
# Check if we have reached the maximum number of jobs
while (($jobs | Where-Object { $_.State -eq 'Running' }).Count -ge $MAX_PROCESSES) {
Start-Sleep -Seconds 1
# Check for any completed jobs and receive their output
$completedJobs = $jobs | Where-Object { $_.State -eq 'Completed' }
foreach ($completedJob in $completedJobs) {
Receive-Job -Job $completedJob -Keep | ForEach-Object { Write-Host $_ }
$jobs = $jobs | Where-Object { $_.Id -ne $completedJob.Id }
Remove-Job -Job $completedJob
}
}
}
# Check for any remaining completed jobs
$remainingCompletedJobs = $jobs | Where-Object { $_.State -eq 'Completed' }
foreach ($remainingJob in $remainingCompletedJobs) {
Receive-Job -Job $remainingJob -Keep | ForEach-Object { Write-Host $_ }
Remove-Job -Job $remainingJob
}
Write-Host "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') - All jobs completed"
```
Output will be stored in `C:\Users\YOUR-USER\Documents\output\`
## Combining the output files
Guidance is provided for the CSV file format. From the output directory, execute either of the following Bash or PowerShell scripts. The script will collect the output from the CSV files, include the header only from the first file, and then write the result as CombinedCSV.csv in the current working directory.
There is no logic implemented in terms of which CSV files it will combine. If you have additional CSV files from other actions, such as running a quick inventory, you will need to move that out of the current (or any nested) directory, or move the output you want to combine into its own folder and run the script from there.
```bash
#!/bin/bash
# Initialize a variable to indicate the first file
firstFile=true
# Find all CSV files (excluding the combined output) and loop through them
find . -name "*.csv" ! -name "CombinedCSV.csv" -print0 | while IFS= read -r -d '' file; do
if [ "$firstFile" = true ]; then
# For the first file, keep the header
cat "$file" > CombinedCSV.csv
firstFile=false
else
# For subsequent files, skip the header
tail -n +2 "$file" >> CombinedCSV.csv
fi
done
```
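The header-skipping idea can be verified on synthetic files before running it over real output. A sketch (file names, the semicolon-delimited contents, and the `Combined.out` name are illustrative):

```shell
# Two synthetic semicolon-delimited CSVs: one header row each, 2 + 1 data rows.
printf 'A;B\n1;2\n3;4\n' > part1.csv
printf 'A;B\n5;6\n' > part2.csv

firstFile=true
for file in part1.csv part2.csv; do
  if [ "$firstFile" = true ]; then
    cat "$file" > Combined.out          # keep the header from the first file
    firstFile=false
  else
    tail -n +2 "$file" >> Combined.out  # skip the header of later files
  fi
done

wc -l < Combined.out   # 1 header + 3 data rows = 4 lines
```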
```powershell
# Get all CSV files from the current directory and its subdirectories,
# excluding the combined output from any previous run
$csvFiles = Get-ChildItem -Recurse -Filter "*.csv" | Where-Object { $_.Name -ne "CombinedCSV.csv" }
# Initialize a variable to track if it's the first file
$firstFile = $true
# Loop through each CSV file
foreach ($file in $csvFiles) {
if ($firstFile) {
# For the first file, keep the header and change the flag
$combinedCsv = Import-Csv -Path $file.FullName
$firstFile = $false
} else {
# For subsequent files, Import-Csv has already consumed the header row,
# so append every record as-is
$combinedCsv += Import-Csv -Path $file.FullName
}
}
# Export the combined data to a new CSV file
$combinedCsv | Export-Csv -Path "CombinedCSV.csv" -NoTypeInformation
```
## TODO: Additional Improvements
Some services need to instantiate another service to perform a check. For instance, `cloudwatch` will instantiate Prowler's `iam` service to perform the `cloudwatch_cross_account_sharing_disabled` check. When the `iam` service is instantiated, it runs its `__init__` function and pulls all the information required for that service. This provides an opportunity to improve the above script: grouping dependent services together would avoid repeatedly instantiating the `iam` service (or any other cross-referenced service). A complete mapping between these services still needs further investigation, but these cross-references have been noted:
* inspector2 needs lambda and ec2
* cloudwatch needs iam
* dlm needs ec2

View File

@@ -43,71 +43,46 @@ Hereunder is the structure for each of the supported report formats by Prowler:
![HTML Output](../img/output-html.png)
### CSV
CSV format has a set of common columns for all the providers, and then provider specific columns.
The common columns are the following:
The following are the columns present in the CSV format:
- ASSESSMENT_START_TIME
- FINDING_UNIQUE_ID
- PROVIDER
- CHECK_ID
- CHECK_TITLE
- CHECK_TYPE
- STATUS
- STATUS_EXTENDED
- SERVICE_NAME
- SUBSERVICE_NAME
- SEVERITY
- RESOURCE_TYPE
- RESOURCE_DETAILS
- RESOURCE_TAGS
- DESCRIPTION
- RISK
- RELATED_URL
- REMEDIATION_RECOMMENDATION_TEXT
- REMEDIATION_RECOMMENDATION_URL
- REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC
- REMEDIATION_RECOMMENDATION_CODE_TERRAFORM
- REMEDIATION_RECOMMENDATION_CODE_CLI
- REMEDIATION_RECOMMENDATION_CODE_OTHER
- COMPLIANCE
- CATEGORIES
- DEPENDS_ON
- RELATED_TO
- NOTES
And then by the provider specific columns:
#### AWS
- PROFILE
- ACCOUNT_ID
- ACCOUNT_NAME
- ACCOUNT_EMAIL
- ACCOUNT_ARN
- ACCOUNT_ORG
- ACCOUNT_TAGS
- REGION
- RESOURCE_ID
- RESOURCE_ARN
#### AZURE
- TENANT_DOMAIN
- SUBSCRIPTION
- RESOURCE_ID
- RESOURCE_NAME
#### GCP
- PROJECT_ID
- LOCATION
- RESOURCE_ID
- RESOURCE_NAME
- ACCOUNT_NAME
- ACCOUNT_EMAIL
- ACCOUNT_ARN
- ACCOUNT_ORG
- ACCOUNT_TAGS
- REGION
- CHECK_ID
- CHECK_TITLE
- CHECK_TYPE
- STATUS
- STATUS_EXTENDED
- SERVICE_NAME
- SUBSERVICE_NAME
- SEVERITY
- RESOURCE_ID
- RESOURCE_ARN
- RESOURCE_TYPE
- RESOURCE_DETAILS
- RESOURCE_TAGS
- DESCRIPTION
- COMPLIANCE
- RISK
- RELATED_URL
- REMEDIATION_RECOMMENDATION_TEXT
- REMEDIATION_RECOMMENDATION_URL
- REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC
- REMEDIATION_RECOMMENDATION_CODE_TERRAFORM
- REMEDIATION_RECOMMENDATION_CODE_CLI
- REMEDIATION_RECOMMENDATION_CODE_OTHER
- CATEGORIES
- DEPENDS_ON
- RELATED_TO
- NOTES
> Since Prowler v3 the CSV column delimiter is the semicolon (`;`)
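Because of that delimiter, downstream tooling has to be told about it explicitly or it will see one giant column. A minimal sketch using Python's standard library (the sample row is illustrative, not real Prowler output):

```python
import csv
import io

# Prowler v3 CSV output uses ';' as the column delimiter, so the reader
# must be configured explicitly; the default comma would not split columns.
sample = "CHECK_ID;STATUS;SEVERITY\niam_root_mfa_enabled;FAIL;critical\n"
reader = csv.DictReader(io.StringIO(sample), delimiter=";")
rows = list(reader)
print(rows[0]["CHECK_ID"], rows[0]["STATUS"])
```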
### JSON

View File

@@ -36,12 +36,10 @@ nav:
- Slack Integration: tutorials/integrations.md
- Configuration File: tutorials/configuration_file.md
- Logging: tutorials/logging.md
- Mute List: tutorials/mutelist.md
- Allowlist: tutorials/allowlist.md
- Check Aliases: tutorials/check-aliases.md
- Custom Metadata: tutorials/custom-checks-metadata.md
- Ignore Unused Services: tutorials/ignore-unused-services.md
- Pentesting: tutorials/pentesting.md
- Parallel Execution: tutorials/parallel-execution.md
- Developer Guide: developer-guide/introduction.md
- AWS:
- Authentication: tutorials/aws/authentication.md
@@ -58,7 +56,6 @@ nav:
- Boto3 Configuration: tutorials/aws/boto3-configuration.md
- Azure:
- Authentication: tutorials/azure/authentication.md
- Non default clouds: tutorials/azure/use-non-default-cloud.md
- Subscriptions: tutorials/azure/subscriptions.md
- Google Cloud:
- Authentication: tutorials/gcp/authentication.md

poetry.lock (generated, 760 changes)

File diff suppressed because it is too large

View File

@@ -6,12 +6,13 @@ import sys
from colorama import Fore, Style
from prowler.config.config import get_available_compliance_frameworks
from prowler.lib.banner import print_banner
from prowler.lib.check.check import (
bulk_load_checks_metadata,
bulk_load_compliance_frameworks,
exclude_checks_to_run,
exclude_services_to_run,
execute_checks,
list_categories,
list_checks_json,
list_services,
@@ -25,20 +26,15 @@ from prowler.lib.check.check import (
)
from prowler.lib.check.checks_loader import load_checks_to_execute
from prowler.lib.check.compliance import update_checks_metadata_with_compliance
from prowler.lib.check.custom_checks_metadata import (
parse_custom_checks_metadata_file,
update_checks_metadata,
)
from prowler.lib.check.managers import ExecutionManager
from prowler.lib.cli.parser import ProwlerArgumentParser
from prowler.lib.logger import logger, set_logging_config
from prowler.lib.outputs.compliance.compliance import display_compliance_table
from prowler.lib.outputs.compliance import display_compliance_table
from prowler.lib.outputs.html import add_html_footer, fill_html_overview_statistics
from prowler.lib.outputs.json import close_json
from prowler.lib.outputs.outputs import extract_findings_statistics
from prowler.lib.outputs.slack import send_slack_message
from prowler.lib.outputs.summary_table import display_summary_table
from prowler.lib.ui.live_display import live_display
from prowler.providers.aws.aws_provider import get_available_aws_service_regions
from prowler.providers.aws.lib.s3.s3 import send_to_s3_bucket
from prowler.providers.aws.lib.security_hub.security_hub import (
batch_send_to_security_hub,
@@ -46,16 +42,12 @@ from prowler.providers.aws.lib.security_hub.security_hub import (
resolve_security_hub_previous_findings,
verify_security_hub_integration_enabled_per_region,
)
from prowler.providers.common.allowlist import set_provider_allowlist
from prowler.providers.common.audit_info import (
set_provider_audit_info,
set_provider_execution_parameters,
)
from prowler.providers.common.clean import clean_provider_local_output_directories
from prowler.providers.common.common import (
get_global_provider,
set_global_provider_object,
)
from prowler.providers.common.mutelist import set_provider_mutelist
from prowler.providers.common.outputs import set_provider_output_options
from prowler.providers.common.quick_inventory import run_provider_quick_inventory
@@ -76,19 +68,13 @@ def prowler():
checks_folder = args.checks_folder
severities = args.severity
compliance_framework = args.compliance
custom_checks_metadata_file = args.custom_checks_metadata_file
live_display.initialize(args)
# if not args.no_banner:
# print_banner(args)
if not args.no_banner:
print_banner(args)
# We treat the compliance framework as another output format
if compliance_framework:
args.output_modes.extend(compliance_framework)
# If no input compliance framework, set all
else:
args.output_modes.extend(get_available_compliance_frameworks(provider))
# Set Logger configuration
set_logging_config(args.log_level, args.log_file, args.only_logs)
@@ -111,19 +97,9 @@ def prowler():
bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
# Complete checks metadata with the compliance framework specification
bulk_checks_metadata = update_checks_metadata_with_compliance(
update_checks_metadata_with_compliance(
bulk_compliance_frameworks, bulk_checks_metadata
)
# Update checks metadata if the --custom-checks-metadata-file is present
custom_checks_metadata = None
if custom_checks_metadata_file:
custom_checks_metadata = parse_custom_checks_metadata_file(
provider, custom_checks_metadata_file
)
bulk_checks_metadata = update_checks_metadata(
bulk_checks_metadata, custom_checks_metadata
)
if args.list_compliance:
print_compliance_frameworks(bulk_compliance_frameworks)
sys.exit()
@@ -158,7 +134,6 @@ def prowler():
# Set the audit info based on the selected provider
audit_info = set_provider_audit_info(provider, args.__dict__)
set_global_provider_object(args)
# Import custom checks from folder
if checks_folder:
@@ -183,12 +158,12 @@ def prowler():
# Sort final check list
checks_to_execute = sorted(checks_to_execute)
# Parse Mute List
mutelist_file = set_provider_mutelist(provider, audit_info, args)
# Parse Allowlist
allowlist_file = set_provider_allowlist(provider, audit_info, args)
# Set output options based on the selected provider
audit_output_options = set_provider_output_options(
provider, args, audit_info, mutelist_file, bulk_checks_metadata
provider, args, audit_info, allowlist_file, bulk_checks_metadata
)
# Run the quick inventory for the provider if available
@@ -198,16 +173,10 @@ def prowler():
# Execute checks
findings = []
if len(checks_to_execute):
execution_manager = ExecutionManager(
checks_to_execute,
provider,
audit_info,
audit_output_options,
custom_checks_metadata,
findings = execute_checks(
checks_to_execute, provider, audit_info, audit_output_options
)
findings = execution_manager.execute_checks()
else:
logger.error(
"There are no checks to execute. Please, check your input arguments"
@@ -269,20 +238,16 @@ def prowler():
f"{Style.BRIGHT}\nSending findings to AWS Security Hub, please wait...{Style.RESET_ALL}"
)
# Verify where AWS Security Hub is enabled
global_provider = get_global_provider()
aws_security_enabled_regions = []
security_hub_regions = (
global_provider.get_available_aws_service_regions("securityhub")
get_available_aws_service_regions("securityhub", audit_info)
if not audit_info.audited_regions
else audit_info.audited_regions
)
for region in security_hub_regions:
# Save the regions where AWS Security Hub is enabled
if verify_security_hub_integration_enabled_per_region(
audit_info.audited_partition,
region,
audit_info.audit_session,
audit_info.audited_account,
region, audit_info.audit_session
):
aws_security_enabled_regions.append(region)
@@ -322,12 +287,8 @@ def prowler():
provider,
)
if findings:
compliance_overview = False
if not compliance_framework:
compliance_overview = True
compliance_framework = get_available_compliance_frameworks(provider)
for compliance in sorted(compliance_framework):
if compliance_framework and findings:
for compliance in compliance_framework:
# Display compliance table
display_compliance_table(
findings,
@@ -335,11 +296,6 @@ def prowler():
compliance,
audit_output_options.output_filename,
audit_output_options.output_directory,
compliance_overview,
)
if compliance_overview:
print(
f"\nDetailed compliance results are in {Fore.YELLOW}{audit_output_options.output_directory}/compliance/{Style.RESET_ALL}\n"
)
# If custom checks were passed, remove the modules

View File

@@ -1,4 +1,4 @@
Mute List:
Allowlist:
Accounts:
"*":
########################### AWS CONTROL TOWER ###########################

View File

@@ -3,8 +3,8 @@
### Tags is an optional list that matches on tuples of 'key=value' and are "ANDed" together.
### Use an alternation Regex to match one of multiple tags with "ORed" logic.
### For each check you can except Accounts, Regions, Resources and/or Tags.
########################### MUTE LIST EXAMPLE ###########################
Mute List:
########################### ALLOWLIST EXAMPLE ###########################
Allowlist:
Accounts:
"123456789012":
Checks:

View File

@@ -11,7 +11,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "3.11.3"
prowler_version = "3.11.1"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
html_logo_img = "https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png"
square_logo_img = "https://user-images.githubusercontent.com/38561120/235905862-9ece5bd7-9aa3-4e48-807a-3a9035eb8bfb.png"
@@ -22,22 +22,13 @@ gcp_logo = "https://user-images.githubusercontent.com/38561120/235928332-eb4accd
orange_color = "\033[38;5;208m"
banner_color = "\033[1;92m"
# Severities
valid_severities = ["critical", "high", "medium", "low", "informational"]
# Statuses
finding_statuses = ["PASS", "FAIL", "MANUAL"]
# Compliance
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
def get_available_compliance_frameworks(provider=None):
def get_available_compliance_frameworks():
available_compliance_frameworks = []
providers = ["aws", "gcp", "azure"]
if provider:
providers = [provider]
for provider in providers:
for provider in ["aws", "gcp", "azure"]:
with os.scandir(f"{actual_directory}/../compliance/{provider}") as files:
for file in files:
if file.is_file() and file.name.endswith(".json"):
@@ -56,6 +47,7 @@ aws_services_json_file = "aws_regions_by_service.json"
# gcp_zones_json_file = "gcp_zones.json"
default_output_directory = getcwd() + "/output"
output_file_timestamp = timestamp.strftime("%Y%m%d%H%M%S")
timestamp_iso = timestamp.isoformat(sep=" ", timespec="seconds")
csv_file_suffix = ".csv"
@@ -78,9 +70,7 @@ def check_current_version():
if latest_version != prowler_version:
return f"{prowler_version_string} (latest is {latest_version}, upgrade for the latest features)"
else:
return (
f"{prowler_version_string} (You are running the latest version, yay!)"
)
return f"{prowler_version_string} (it is the latest version, yay!)"
except requests.RequestException:
return f"{prowler_version_string}"
except Exception:

View File

@@ -2,10 +2,10 @@
aws:
# AWS Global Configuration
# aws.mute_non_default_regions --> Set to True to mute failed findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
mute_non_default_regions: False
# If you want to mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w mutelist.yaml`:
# Mute List:
# aws.allowlist_non_default_regions --> Set to True to allowlist failed findings in non-default regions for GuardDuty, SecurityHub, DRS and Config
allowlist_non_default_regions: False
# If you want to allowlist/mute failed findings only in specific regions, create a file with the following syntax and run it with `prowler aws -w allowlist.yaml`:
# Allowlist:
# Accounts:
# "*":
# Checks:
@@ -92,6 +92,3 @@ azure:
# GCP Configuration
gcp:
# Kubernetes Configuration
kubernetes:

View File

@@ -1,19 +0,0 @@
CustomChecksMetadata:
aws:
Checks:
s3_bucket_level_public_access_block:
Severity: high
s3_bucket_no_mfa_delete:
Severity: high
azure:
Checks:
storage_infrastructure_encryption_is_enabled:
Severity: medium
gcp:
Checks:
compute_instance_public_ip:
Severity: critical
kubernetes:
Checks:
apiserver_anonymous_requests:
Severity: low

View File

@@ -15,13 +15,13 @@ def print_banner(args):
"""
print(banner)
if args.verbose:
if args.verbose or args.quiet:
print(
f"""
Color code for results:
- {Fore.YELLOW}MANUAL (Manual check){Style.RESET_ALL}
- {Fore.YELLOW}INFO (Information){Style.RESET_ALL}
- {Fore.GREEN}PASS (Recommended value){Style.RESET_ALL}
- {orange_color}MUTED (Muted by muted list){Style.RESET_ALL}
- {orange_color}WARNING (Ignored by allowlist){Style.RESET_ALL}
- {Fore.RED}FAIL (Fix required){Style.RESET_ALL}
"""
)

View File

@@ -10,19 +10,17 @@ from pkgutil import walk_packages
from types import ModuleType
from typing import Any
from alive_progress import alive_bar
from colorama import Fore, Style
import prowler
from prowler.config.config import orange_color
from prowler.lib.check.compliance_models import load_compliance_framework
from prowler.lib.check.custom_checks_metadata import update_check_metadata
from prowler.lib.check.managers import ExecutionManager
from prowler.lib.check.models import Check, load_check_metadata
from prowler.lib.logger import logger
from prowler.lib.outputs.outputs import report
from prowler.lib.ui.live_display import live_display
from prowler.lib.utils.utils import open_file, parse_json_file
from prowler.providers.aws.lib.mutelist.mutelist import mutelist_findings
from prowler.providers.common.common import get_global_provider
from prowler.providers.aws.lib.allowlist.allowlist import allowlist_findings
from prowler.providers.common.models import Audit_Metadata
from prowler.providers.common.outputs import Provider_Output_Options
@@ -108,20 +106,14 @@ def exclude_services_to_run(
# Load checks from checklist.json
def parse_checks_from_file(input_file: str, provider: str) -> set:
"""parse_checks_from_file returns a set of checks read from the given file"""
try:
checks_to_execute = set()
with open_file(input_file) as f:
json_file = parse_json_file(f)
checks_to_execute = set()
with open_file(input_file) as f:
json_file = parse_json_file(f)
for check_name in json_file[provider]:
checks_to_execute.add(check_name)
for check_name in json_file[provider]:
checks_to_execute.add(check_name)
return checks_to_execute
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
return checks_to_execute
# Load checks from custom folder
@@ -317,7 +309,7 @@ def print_checks(
def parse_checks_from_compliance_framework(
compliance_frameworks: list, bulk_compliance_frameworks: dict
) -> list:
"""parse_checks_from_compliance_framework returns a set of checks from the given compliance_frameworks"""
"""Parse checks from compliance frameworks specification"""
checks_to_execute = set()
try:
for framework in compliance_frameworks:
@@ -424,7 +416,6 @@ def execute_checks(
provider: str,
audit_info: Any,
audit_output_options: Provider_Output_Options,
custom_checks_metadata: Any,
) -> list:
# List to store all the check's findings
all_findings = []
@@ -432,10 +423,8 @@ def execute_checks(
services_executed = set()
checks_executed = set()
global_provider = get_global_provider()
# Initialize the Audit Metadata
global_provider.audit_metadata = Audit_Metadata(
audit_info.audit_metadata = Audit_Metadata(
services_scanned=0,
expected_checks=checks_to_execute,
completed_checks=0,
@@ -472,7 +461,6 @@ def execute_checks(
audit_info,
services_executed,
checks_executed,
custom_checks_metadata,
)
all_findings.extend(check_findings)
@@ -495,57 +483,46 @@ def execute_checks(
print(
f"{Style.BRIGHT}Executing {checks_num} {check_noun}, please wait...{Style.RESET_ALL}\n"
)
execution_manager = ExecutionManager(provider, checks_to_execute)
total_checks = execution_manager.total_checks_per_service()
completed_checks = {service: 0 for service in total_checks}
service_findings = []
for service, check_name in execution_manager.execute_checks():
try:
check_findings = execute(
service,
check_name,
provider,
audit_output_options,
audit_info,
services_executed,
checks_executed,
custom_checks_metadata,
with alive_bar(
total=len(checks_to_execute),
ctrl_c=False,
bar="blocks",
spinner="classic",
stats=False,
enrich_print=False,
) as bar:
for check_name in checks_to_execute:
# Recover service from check name
service = check_name.split("_")[0]
bar.title = (
f"-> Scanning {orange_color}{service}{Style.RESET_ALL} service"
)
all_findings.extend(check_findings)
service_findings.extend(check_findings)
# Update the completed checks count
completed_checks[service] += 1
try:
check_findings = execute(
service,
check_name,
provider,
audit_output_options,
audit_info,
services_executed,
checks_executed,
)
all_findings.extend(check_findings)
# Check if all checks for the service are completed
if completed_checks[service] == total_checks[service]:
# All checks for the service are completed
# Add a summary table or perform other actions
live_display.add_results_for_service(service, service_findings)
# Clear service_findings
service_findings = []
# If check does not exists in the provider or is from another provider
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {provider.upper()} provider"
)
except Exception as error:
logger.error(
f"{check_name} - {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
# If check does not exists in the provider or is from another provider
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {provider.upper()} provider"
)
except Exception as error:
logger.error(
f"{check_name} - {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
bar()
bar.title = f"-> {Fore.GREEN}Scan completed!{Style.RESET_ALL}"
return all_findings
def create_check_service_dict(checks_to_execute):
output = {}
for check_name in checks_to_execute:
service = check_name.split("_")[0]
if service not in output.keys():
output[service] = []
output[service].append(check_name)
return output
def execute(
service: str,
check_name: str,
@@ -554,9 +531,7 @@ def execute(
audit_info: Any,
services_executed: set,
checks_executed: set,
custom_checks_metadata: Any,
):
global_provider = get_global_provider()
# Import check module
check_module_path = (
f"prowler.providers.{provider}.services.{service}.{check_name}.{check_name}"
@@ -566,25 +541,21 @@ def execute(
check_to_execute = getattr(lib, check_name)
c = check_to_execute()
# Update check metadata to reflect that in the outputs
if custom_checks_metadata and custom_checks_metadata["Checks"].get(c.CheckID):
c = update_check_metadata(c, custom_checks_metadata["Checks"][c.CheckID])
# Run check
check_findings = run_check(c, audit_output_options)
# Update Audit Status
services_executed.add(service)
checks_executed.add(check_name)
global_provider.audit_metadata = update_audit_metadata(
global_provider.audit_metadata, services_executed, checks_executed
audit_info.audit_metadata = update_audit_metadata(
audit_info.audit_metadata, services_executed, checks_executed
)
# Mute List findings
if audit_output_options.mutelist_file:
check_findings = mutelist_findings(
audit_output_options.mutelist_file,
global_provider.audited_account,
# Allowlist findings
if audit_output_options.allowlist_file:
check_findings = allowlist_findings(
audit_output_options.allowlist_file,
audit_info.audited_account,
check_findings,
)
@@ -627,32 +598,22 @@ def update_audit_metadata(
)
def recover_checks_from_service(service_list: list, provider: str) -> set:
"""
Recover all checks from the selected provider and service
def recover_checks_from_service(service_list: list, provider: str) -> list:
checks = set()
service_list = [
"awslambda" if service == "lambda" else service for service in service_list
]
for service in service_list:
modules = recover_checks_from_provider(provider, service)
if not modules:
logger.error(f"Service '{service}' does not have checks.")
Returns a set of checks from the given services
"""
try:
checks = set()
service_list = [
"awslambda" if service == "lambda" else service for service in service_list
]
for service in service_list:
service_checks = recover_checks_from_provider(provider, service)
if not service_checks:
logger.error(f"Service '{service}' does not have checks.")
else:
for check in service_checks:
# Recover check name and module name from import path
# Format: "providers.{provider}.services.{service}.{check_name}.{check_name}"
check_name = check[0].split(".")[-1]
# If the service is present in the group list passed as parameters
# if service_name in group_list: checks_from_arn.add(check_name)
checks.add(check_name)
return checks
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
for check_module in modules:
# Recover check name and module name from import path
# Format: "providers.{provider}.services.{service}.{check_name}.{check_name}"
check_name = check_module[0].split(".")[-1]
# If the service is present in the group list passed as parameters
# if service_name in group_list: checks_from_arn.add(check_name)
checks.add(check_name)
return checks

View File

@@ -1,48 +0,0 @@
import ast
import os
import pathlib
from prowler.lib.logger import logger
class ImportFinder(ast.NodeVisitor):
def __init__(self, provider):
self.imports = set()
self.provider = provider
def visit_ImportFrom(self, node):
if node.module and f"prowler.providers.{self.provider}.services" in node.module:
for name in node.names:
if "_client" in name.name:
self.imports.add(name.name)
self.generic_visit(node)
def analyze_check_file(file_path, provider):
# Parse the check file
with open(file_path, "r") as file:
node = ast.parse(file.read(), filename=file_path)
finder = ImportFinder(provider)
finder.visit(node)
return list(finder.imports)
def get_dependencies_for_checks(provider, checks_dict):
current_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
prowler_dir = current_directory.parent.parent
check_dependencies = {}
for service_name, checks in checks_dict.items():
check_dependencies[service_name] = {}
for check_name in checks:
relative_path = f"providers/{provider}/services/{service_name}/{check_name}/{check_name}.py"
check_file_path = prowler_dir / relative_path
if not check_file_path.exists():
logger.error(
f"{check_name} does not exist at {relative_path}! Cannot determine service dependencies"
)
continue
clients = analyze_check_file(str(check_file_path), provider)
check_dependencies[service_name][check_name] = clients
return check_dependencies

View File

@@ -1,6 +1,5 @@
from colorama import Fore, Style
from prowler.config.config import valid_severities
from prowler.lib.check.check import (
parse_checks_from_compliance_framework,
parse_checks_from_file,
@@ -11,6 +10,7 @@ from prowler.lib.logger import logger
# Generate the list of checks to execute
# PENDING Test for this function
def load_checks_to_execute(
bulk_checks_metadata: dict,
bulk_compliance_frameworks: dict,
@@ -22,93 +22,69 @@ def load_checks_to_execute(
categories: set,
provider: str,
) -> set:
"""Generate the list of checks to execute based on the cloud provider and the input arguments given"""
try:
# Local subsets
checks_to_execute = set()
check_aliases = {}
check_severities = {key: [] for key in valid_severities}
check_categories = {}
"""Generate the list of checks to execute based on the cloud provider and input arguments specified"""
checks_to_execute = set()
# First, loop over the bulk_checks_metadata to extract the needed subsets
for check, metadata in bulk_checks_metadata.items():
# Aliases
for alias in metadata.CheckAliases:
check_aliases[alias] = check
# Handle if there are checks passed using -c/--checks
if check_list:
for check_name in check_list:
checks_to_execute.add(check_name)
# Severities
if metadata.Severity:
check_severities[metadata.Severity].append(check)
# Handle if there are some severities passed using --severity
elif severities:
for check in bulk_checks_metadata:
# Check check's severity
if bulk_checks_metadata[check].Severity in severities:
checks_to_execute.add(check)
# Categories
for category in metadata.Categories:
if category not in check_categories:
check_categories[category] = []
check_categories[category].append(check)
# Handle if there are checks passed using -c/--checks
if check_list:
for check_name in check_list:
checks_to_execute.add(check_name)
# Handle if there are some severities passed using --severity
elif severities:
for severity in severities:
checks_to_execute.update(check_severities[severity])
if service_list:
checks_to_execute = (
recover_checks_from_service(service_list, provider)
& checks_to_execute
)
# Handle if there are checks passed using -C/--checks-file
elif checks_file:
# Handle if there are checks passed using -C/--checks-file
elif checks_file:
try:
checks_to_execute = parse_checks_from_file(checks_file, provider)
except Exception as e:
logger.error(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")
# Handle if there are services passed using -s/--services
elif service_list:
checks_to_execute = recover_checks_from_service(service_list, provider)
# Handle if there are services passed using -s/--services
elif service_list:
checks_to_execute = recover_checks_from_service(service_list, provider)
# Handle if there are compliance frameworks passed using --compliance
elif compliance_frameworks:
# Handle if there are compliance frameworks passed using --compliance
elif compliance_frameworks:
try:
checks_to_execute = parse_checks_from_compliance_framework(
compliance_frameworks, bulk_compliance_frameworks
)
except Exception as e:
logger.error(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")
# Handle if there are categories passed using --categories
elif categories:
for category in categories:
checks_to_execute.update(check_categories[category])
# Handle if there are categories passed using --categories
elif categories:
for cat in categories:
for check in bulk_checks_metadata:
# Check check's categories
if cat in bulk_checks_metadata[check].Categories:
checks_to_execute.add(check)
# If there are no checks passed as argument
else:
# If there are no checks passed as argument
else:
try:
# Get all check modules to run with the specific provider
checks = recover_checks_from_provider(provider)
except Exception as e:
logger.error(f"{e.__class__.__name__}[{e.__traceback__.tb_lineno}] -- {e}")
else:
for check_info in checks:
# Recover check name from import path (last part)
# Format: "providers.{provider}.services.{service}.{check_name}.{check_name}"
check_name = check_info[0]
checks_to_execute.add(check_name)
# Check Aliases
checks_to_execute = update_checks_to_execute_with_aliases(
checks_to_execute, check_aliases
)
# Get Check Aliases mapping
check_aliases = {}
for check, metadata in bulk_checks_metadata.items():
for alias in metadata.CheckAliases:
check_aliases[alias] = check
return checks_to_execute
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
def update_checks_to_execute_with_aliases(
checks_to_execute: set, check_aliases: dict
) -> set:
"""update_checks_to_execute_with_aliases returns the checks_to_execute updated using the check aliases."""
# Verify if any input check is an alias of another check
for input_check in checks_to_execute:
if (
@@ -121,4 +97,5 @@ def update_checks_to_execute_with_aliases(
print(
f"\nUsing alias {Fore.YELLOW}{input_check}{Style.RESET_ALL} for check {Fore.YELLOW}{check_aliases[input_check]}{Style.RESET_ALL}...\n"
)
return checks_to_execute

View File

@@ -1,77 +0,0 @@
import sys
import yaml
from jsonschema import validate
from prowler.config.config import valid_severities
from prowler.lib.logger import logger
custom_checks_metadata_schema = {
"type": "object",
"properties": {
"Checks": {
"type": "object",
"patternProperties": {
".*": {
"type": "object",
"properties": {
"Severity": {
"type": "string",
"enum": valid_severities,
}
},
"required": ["Severity"],
"additionalProperties": False,
}
},
"additionalProperties": False,
}
},
"required": ["Checks"],
"additionalProperties": False,
}
def parse_custom_checks_metadata_file(provider: str, parse_custom_checks_metadata_file):
"""parse_custom_checks_metadata_file returns the custom_checks_metadata object if it is valid, otherwise aborts the execution returning the ValidationError."""
try:
with open(parse_custom_checks_metadata_file) as f:
custom_checks_metadata = yaml.safe_load(f)["CustomChecksMetadata"][provider]
validate(custom_checks_metadata, schema=custom_checks_metadata_schema)
return custom_checks_metadata
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
sys.exit(1)
def update_checks_metadata(bulk_checks_metadata, custom_checks_metadata):
"""update_checks_metadata returns the bulk_checks_metadata with the check's metadata updated based on the custom_checks_metadata provided."""
try:
# Update checks metadata from CustomChecksMetadata file
for check, custom_metadata in custom_checks_metadata["Checks"].items():
check_metadata = bulk_checks_metadata.get(check)
if check_metadata:
bulk_checks_metadata[check] = update_check_metadata(
check_metadata, custom_metadata
)
return bulk_checks_metadata
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
)
sys.exit(1)
def update_check_metadata(check_metadata, custom_metadata):
"""update_check_metadata updates the check_metadata fields present in the custom_metadata and returns the updated version of the check_metadata. If some field is not present or valid the check_metadata is returned with the original fields."""
try:
if custom_metadata:
for attribute in custom_metadata:
try:
setattr(check_metadata, attribute, custom_metadata[attribute])
except ValueError:
pass
finally:
return check_metadata

View File

@@ -1,369 +0,0 @@
import importlib
import os
import sys
import traceback
from types import ModuleType
from typing import Any, Set
from colorama import Fore, Style
from prowler.lib.check.check_to_client_mapper import get_dependencies_for_checks
from prowler.lib.check.custom_checks_metadata import update_check_metadata
from prowler.lib.check.models import Check
from prowler.lib.logger import logger
from prowler.lib.outputs.outputs import report
from prowler.lib.ui.live_display import live_display
from prowler.providers.aws.lib.mutelist.mutelist import mutelist_findings
from prowler.providers.common.common import get_global_provider
from prowler.providers.common.models import Audit_Metadata
from prowler.providers.common.outputs import Provider_Output_Options
class ExecutionManager:
def __init__(
self,
checks_to_execute: list,
provider: str,
audit_info: Any,
audit_output_options: Provider_Output_Options,
custom_checks_metadata: Any,
):
self.checks_to_execute = checks_to_execute
self.provider = provider
self.audit_info = audit_info
self.audit_output_options = audit_output_options
self.custom_checks_metadata = custom_checks_metadata
self.live_display = live_display
self.live_display.start()
self.loaded_clients = {} # defaultdict(lambda: False)
self.check_dict = self.create_check_service_dict(checks_to_execute)
self.check_dependencies = get_dependencies_for_checks(provider, self.check_dict)
self.remaining_checks = self.initialize_remaining_checks(
self.check_dependencies
)
self.services_queue = self.initialize_services_queue(self.check_dependencies)
# For tracking the executed services and checks
self.services_executed: Set[str] = set()
self.checks_executed: Set[str] = set()
# Initialize the Audit Metadata
self.audit_info.audit_metadata = Audit_Metadata(
services_scanned=0,
expected_checks=self.checks_to_execute,
completed_checks=0,
audit_progress=0,
)
def update_tracking(self, service: str, check: str):
self.services_executed.add(service)
self.checks_executed.add(check)
@staticmethod
def initialize_remaining_checks(check_dependencies):
remaining_checks = {}
for service, checks in check_dependencies.items():
for check_name, clients in checks.items():
remaining_checks[(service, check_name)] = clients
return remaining_checks
@staticmethod
def initialize_services_queue(check_dependencies):
return list(check_dependencies.keys())
@staticmethod
def create_check_service_dict(checks_to_execute):
output = {}
for check_name in checks_to_execute:
service = check_name.split("_")[0]
if service not in output.keys():
output[service] = []
output[service].append(check_name)
return output
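`create_check_service_dict` relies on the convention that a check ID begins with its service name; a minimal sketch with illustrative check IDs:

```python
def group_checks_by_service(checks):
    # Check IDs follow "<service>_<rest>", so the service is the
    # token before the first underscore.
    grouped = {}
    for check_name in checks:
        service = check_name.split("_")[0]
        grouped.setdefault(service, []).append(check_name)
    return grouped


groups = group_checks_by_service(
    ["ec2_ami_public", "ec2_ebs_encryption", "s3_bucket_public_access"]
)
```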
def total_checks_per_service(self):
"""Returns a dictionary with the total number of checks for each service."""
total_checks = {}
for service, checks in self.check_dict.items():
total_checks[service] = len(checks)
return total_checks
def find_next_service(self):
# Prioritize services that use already loaded clients
for service in self.services_queue:
checks = self.check_dependencies[service]
if any(
client in self.loaded_clients
for check in checks.values()
for client in check
):
return service
return None if not self.services_queue else self.services_queue[0]
@staticmethod
def import_check(check_path: str) -> ModuleType:
"""
Imports an input check using its path
When importing a module using importlib.import_module, it's loaded and added to the sys.modules cache.
This means that the module remains in memory and is not garbage collected immediately after use, as it's still referenced in sys.modules.
This behavior is intentional, as importing modules can be a costly operation, and keeping them in memory allows for faster re-use.
release_clients deletes this reference once it is no longer required by any of the remaining checks
"""
lib = importlib.import_module(f"{check_path}")
return lib
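The `sys.modules` caching behavior that the docstring describes can be demonstrated with any module (`json` is used here as a harmless stand-in):

```python
import importlib
import sys

# A second import_module call returns the cached object from sys.modules
# instead of re-executing the module.
mod_a = importlib.import_module("json")
mod_b = importlib.import_module("json")

# Deleting the sys.modules entry drops the cached reference, so the next
# import builds a fresh module object (and the old one becomes collectable
# once nothing else references it).
del sys.modules["json"]
mod_c = importlib.import_module("json")
```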
# Imports a service client, tracking whether it has already been loaded
def import_client(self, client_name):
if not self.loaded_clients.get(client_name):
# Dynamically import the client
module_name, _ = client_name.rsplit("_", 1)
client_module = importlib.import_module(
f"prowler.providers.{self.provider}.services.{module_name}.{client_name}"
)
self.loaded_clients[client_name] = client_module
def release_clients(self, completed_check_clients):
for client_name in completed_check_clients:
# Determine if any of the remaining checks still require the client
if not any(
client == client_name
for check in self.remaining_checks
for client in self.remaining_checks[check]
):
# Delete the reference to the client for this object
del self.loaded_clients[client_name]
module_name, _ = client_name.rsplit("_", 1)
# Delete the reference to the client in sys.modules
del sys.modules[
f"prowler.providers.aws.services.{module_name}.{client_name}"
]
def generate_checks(self):
"""
This is a generator function, which will:
* Determine the next service whose checks will be executed
* Load all the clients which are required by the checks into memory (init them)
* Yield the service and check name, 1-by-1, to be used within execute_checks
* Pass the completed checks to release_clients to determine whether the clients required by the check are still needed or can be garbage collected
It completes all the checks for one service before moving on to the next
It uses find_next_service to prioritize the next service based on whether any of that service's checks require a client that has already been loaded
"""
while self.remaining_checks:
current_service = self.find_next_service()
if not current_service:
# Execution has completed, return
break
# Remove the service from the services_queue
self.services_queue.remove(current_service)
checks = self.check_dependencies[current_service]
clients_for_service = list(
set(client for client_list in checks.values() for client in client_list)
)
for client in clients_for_service:
self.live_display.add_client_init_section(client)
self.import_client(client)
# Add the display component
total_checks = len(self.check_dict[current_service])
self.live_display.add_service_section(current_service, total_checks)
for check_name, clients_for_check in checks.items():
yield current_service, check_name
self.live_display.increment_check_progress()
self.live_display.increment_overall_check_progress()
del self.remaining_checks[(current_service, check_name)]
self.release_clients(clients_for_check)
self.live_display.increment_overall_service_progress()
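Stripped of client management and display updates, the generator's core iteration order (finish one service's checks before moving on) looks like:

```python
def generate_pairs(check_dependencies):
    # Yields (service, check) pairs one at a time, completing each
    # service's checks before starting the next service.
    for service, checks in check_dependencies.items():
        for check_name in checks:
            yield service, check_name


pairs = list(generate_pairs({"ec2": ["ec2_a", "ec2_b"], "s3": ["s3_a"]}))
```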
def execute_checks(self) -> list:
# List to store all the check's findings
all_findings = []
# Services and checks executed for the Audit Status
global_provider = get_global_provider()
# Initialize the Audit Metadata
global_provider.audit_metadata = Audit_Metadata(
services_scanned=0,
expected_checks=self.checks_to_execute,
completed_checks=0,
audit_progress=0,
)
if os.name != "nt":
try:
from resource import RLIMIT_NOFILE, getrlimit
# Check ulimit for the maximum system open files
soft, _ = getrlimit(RLIMIT_NOFILE)
if soft < 4096:
logger.warning(
f"Your session file descriptors limit ({soft} open files) is below 4096. We recommend to increase it to avoid errors. Solve it running this command `ulimit -n 4096`. For more info visit https://docs.prowler.cloud/en/latest/troubleshooting/"
)
except Exception as error:
logger.error("Unable to retrieve ulimit default settings")
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
# Execution with the --only-logs flag
if self.audit_output_options.only_logs:
for service, check_name in self.generate_checks():
try:
check_findings = self.execute(service, check_name)
all_findings.extend(check_findings)
# If the check does not exist in the provider or belongs to another provider
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {self.provider.upper()} provider"
)
except Exception as error:
logger.error(
f"{check_name} - {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
# Default execution
total_checks = self.total_checks_per_service()
self.live_display.add_overall_progress_section(
total_checks_dict=total_checks
)
# For tracking when a service is completed
completed_checks = {service: 0 for service in total_checks}
service_findings = []
for service, check_name in self.generate_checks():
try:
check_findings = self.execute(
service,
check_name,
)
all_findings.extend(check_findings)
service_findings.extend(check_findings)
# Update the completed checks count
completed_checks[service] += 1
# Check if all checks for the service are completed
if completed_checks[service] == total_checks[service]:
# All checks for the service are completed
# Add a summary table or perform other actions
live_display.add_results_for_service(service, service_findings)
# Clear service_findings
service_findings = []
# If the check does not exist in the provider or belongs to another provider
except ModuleNotFoundError:
logger.error(
f"Check '{check_name}' was not found for the {self.provider.upper()} provider"
)
except Exception as error:
logger.error(
f"{check_name} - {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
self.live_display.hide_service_section()
return all_findings
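The file-descriptor probe near the top of `execute_checks` can be reproduced on its own; `resource` is POSIX-only, which is why the code guards on `os.name != "nt"`:

```python
import os

if os.name != "nt":
    # POSIX-only: query the soft limit on open file descriptors.
    from resource import RLIMIT_NOFILE, getrlimit

    soft, hard = getrlimit(RLIMIT_NOFILE)
    # Prowler warns below 4096; raising the limit is a shell-level
    # operation (e.g. `ulimit -n 4096`), not something changed here.
    needs_increase = soft < 4096
```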
def execute(
self,
service: str,
check_name: str,
):
try:
# Import check module
check_module_path = f"prowler.providers.{self.provider}.services.{service}.{check_name}.{check_name}"
lib = self.import_check(check_module_path)
# Retrieve the check class from the module
check_to_execute = getattr(lib, check_name)
c = check_to_execute()
# Update check metadata to reflect that in the outputs
if self.custom_checks_metadata and self.custom_checks_metadata[
"Checks"
].get(c.CheckID):
c = update_check_metadata(
c, self.custom_checks_metadata["Checks"][c.CheckID]
)
# Run check
check_findings = self.run_check(c, self.audit_output_options)
# Update Audit Status
self.update_tracking(service, check_name)
self.update_audit_metadata()
# Mutelist findings
if self.audit_output_options.mutelist_file:
check_findings = mutelist_findings(
self.audit_output_options.mutelist_file,
self.audit_info.audited_account,
check_findings,
)
# Report the check's findings
report(check_findings, self.audit_output_options, self.audit_info)
if os.environ.get("PROWLER_REPORT_LIB_PATH"):
try:
logger.info("Using custom report interface ...")
lib = os.environ["PROWLER_REPORT_LIB_PATH"]
outputs_module = importlib.import_module(lib)
custom_report_interface = getattr(outputs_module, "report")
custom_report_interface(
check_findings, self.audit_output_options, self.audit_info
)
except Exception:
sys.exit(1)
except Exception as error:
logger.error(
f"{check_name} - {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return check_findings
@staticmethod
def run_check(check: Check, output_options: Provider_Output_Options) -> list:
findings = []
if output_options.verbose:
print(
f"\nCheck ID: {check.CheckID} - {Fore.MAGENTA}{check.ServiceName}{Fore.YELLOW} [{check.Severity}]{Style.RESET_ALL}"
)
logger.debug(f"Executing check: {check.CheckID}")
try:
findings = check.execute()
except Exception as error:
if not output_options.only_logs:
print(
f"Something went wrong in {check.CheckID}, please use --log-level ERROR"
)
logger.error(
f"{check.CheckID} -- {error.__class__.__name__}[{traceback.extract_tb(error.__traceback__)[-1].lineno}]: {error}"
)
finally:
return findings
def update_audit_metadata(self):
"""update_audit_metadata returns the audit_metadata updated with the new status
Updates the given audit_metadata using the length of the services_executed and checks_executed
"""
try:
self.audit_info.audit_metadata.services_scanned = len(
self.services_executed
)
self.audit_info.audit_metadata.completed_checks = len(self.checks_executed)
self.audit_info.audit_metadata.audit_progress = (
100
* len(self.checks_executed)
/ len(self.audit_info.audit_metadata.expected_checks)
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
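The audit-progress arithmetic in `update_audit_metadata` reduces to a percentage of expected checks completed:

```python
def audit_progress(checks_executed, expected_checks):
    # Mirrors update_audit_metadata: completed checks as a percentage
    # of the expected checks for this audit.
    return 100 * len(checks_executed) / len(expected_checks)


progress = audit_progress(
    {"check_a", "check_b"},
    ["check_a", "check_b", "check_c", "check_d"],
)
```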

View File

@@ -2,13 +2,10 @@ import os
import sys
from abc import ABC, abstractmethod
from dataclasses import dataclass
from functools import wraps
from pydantic import BaseModel, ValidationError
from pydantic.main import ModelMetaclass
from prowler.lib.logger import logger
from prowler.lib.ui.live_display import live_display
class Code(BaseModel):
@@ -60,29 +57,9 @@ class Check_Metadata_Model(BaseModel):
Compliance: list = None
class CheckMeta(ModelMetaclass):
"""
Dynamically decorates the execute function of all subclasses of the Check class
By making CheckMeta inherit from ModelMetaclass, it ensures that all features provided by Pydantic's BaseModel (such as data validation, serialization, and so forth) are preserved. CheckMeta just adds additional behavior (decorator application) on top of the existing features.
This also works because ModelMetaclass inherits from ABCMeta, as does the ABC class (it comes down to how metaclasses are resolved when a class inherits from bases that each have their own metaclass).
The primary role of CheckMeta is to automatically apply a decorator to the execute method of subclasses. This behavior does not conflict with the typical responsibilities of ModelMetaclass
"""
def __new__(cls, name, bases, dct):
if "execute" in dct and not getattr(
dct["execute"], "__isabstractmethod__", False
):
dct["execute"] = Check.update_title_with_findings_decorator(dct["execute"])
return super(CheckMeta, cls).__new__(cls, name, bases, dct)
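The decorate-on-subclass pattern that CheckMeta implements can be sketched with a plain `type` metaclass instead of Pydantic's ModelMetaclass:

```python
from functools import wraps


def record_result(func):
    # Decorator the metaclass applies automatically to execute().
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        result = func(self, *args, **kwargs)
        self.last_result = result
        return result

    return wrapper


class AutoDecorate(type):
    # Wrap any concrete `execute` defined on a subclass, mirroring how
    # CheckMeta wraps Check.execute (minimal sketch: plain `type` here,
    # no Pydantic features).
    def __new__(cls, name, bases, dct):
        if "execute" in dct:
            dct["execute"] = record_result(dct["execute"])
        return super().__new__(cls, name, bases, dct)


class MyCheck(metaclass=AutoDecorate):
    def execute(self):
        return [1, 2, 3]


check = MyCheck()
findings = check.execute()
```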
class Check(ABC, Check_Metadata_Model, metaclass=CheckMeta):
class Check(ABC, Check_Metadata_Model):
"""Prowler Check"""
title_bar_task: int = None
progress_task: int = None
def __init__(self, **data):
"""Check's init function. Calls the CheckMetadataModel init."""
# Parse the Check's metadata file
@@ -95,43 +72,6 @@ class Check(ABC, Check_Metadata_Model, metaclass=CheckMeta):
# Calls parents init function
super().__init__(**data)
self.live_display_enabled = False
service_section = live_display.get_service_section()
if service_section:
self.live_display_enabled = True
self.title_bar_task = service_section.title_bar.add_task(
f"{self.CheckTitle}...", start=False
)
def increment_task_progress(self):
if self.live_display_enabled:
current_section = live_display.get_service_section()
current_section.task_progress.update(self.progress_task, advance=1)
def start_task(self, message, count):
if self.live_display_enabled:
current_section = live_display.get_service_section()
self.progress_task = current_section.task_progress.add_task(
description=message, total=count, visible=True
)
def update_title_with_findings(self, findings):
if self.live_display_enabled:
current_section = live_display.get_service_section()
# current_section.task_progress.remove_task(self.progress_task)
total_failed = len(
[report for report in findings if report.status == "FAIL"]
)
total_checked = len(findings)
if total_failed == 0:
message = f"{self.CheckTitle} [pass]All resources passed ({total_checked})[/pass]"
else:
message = f"{self.CheckTitle} [fail]{total_failed}/{total_checked} failed![/fail]"
current_section.title_bar.update(
task_id=self.title_bar_task, description=message
)
def metadata(self) -> dict:
"""Return the JSON representation of the check's metadata"""
return self.json()
@@ -140,24 +80,6 @@ class Check(ABC, Check_Metadata_Model, metaclass=CheckMeta):
def execute(self):
"""Execute the check's logic"""
@staticmethod
def update_title_with_findings_decorator(func):
"""
Decorator to update the title bar in the live_display with findings after executing a check.
"""
@wraps(func)
def wrapper(check_instance, *args, **kwargs):
# Execute the original check's logic
findings = func(check_instance, *args, **kwargs)
# Update the title bar with the findings
check_instance.update_title_with_findings(findings)
return findings
return wrapper
@dataclass
class Check_Report:
@@ -224,22 +146,6 @@ class Check_Report_GCP(Check_Report):
self.location = ""
@dataclass
class Check_Report_Kubernetes(Check_Report):
# TODO change class name to CheckReportKubernetes
"""Contains the Kubernetes Check's finding information."""
resource_name: str
resource_id: str
namespace: str
def __init__(self, metadata):
super().__init__(metadata)
self.resource_name = ""
self.resource_id = ""
self.namespace = ""
# Testing Pending
def load_check_metadata(metadata_file: str) -> Check_Metadata_Model:
"""load_check_metadata loads and parse a Check's metadata file"""

View File

@@ -7,8 +7,6 @@ from prowler.config.config import (
check_current_version,
default_config_file_path,
default_output_directory,
valid_severities,
finding_statuses,
)
from prowler.providers.common.arguments import (
init_providers_parser,
@@ -51,7 +49,6 @@ Detailed documentation at https://docs.prowler.cloud
self.__init_exclude_checks_parser__()
self.__init_list_checks_parser__()
self.__init_config_parser__()
self.__init_custom_checks_metadata_parser__()
# Init Providers Arguments
init_providers_parser(self)
@@ -117,10 +114,10 @@ Detailed documentation at https://docs.prowler.cloud
"Outputs"
)
common_outputs_parser.add_argument(
"--status",
nargs="+",
help=f"Filter by the status of the findings {finding_statuses}",
choices=finding_statuses,
"-q",
"--quiet",
action="store_true",
help="Store or send only Prowler failed findings",
)
common_outputs_parser.add_argument(
"-M",
@@ -223,11 +220,11 @@ Detailed documentation at https://docs.prowler.cloud
group.add_argument(
"-s", "--services", nargs="+", help="List of services to be executed."
)
common_checks_parser.add_argument(
group.add_argument(
"--severity",
nargs="+",
help=f"List of severities to be executed {valid_severities}",
choices=valid_severities,
help="List of severities to be executed [informational, low, medium, high, critical]",
choices=["informational", "low", "medium", "high", "critical"],
)
group.add_argument(
"--compliance",
@@ -289,15 +286,3 @@ Detailed documentation at https://docs.prowler.cloud
default=default_config_file_path,
help="Set configuration file path",
)
def __init_custom_checks_metadata_parser__(self):
# CustomChecksMetadata
custom_checks_metadata_subparser = (
self.common_providers_parser.add_argument_group("Custom Checks Metadata")
)
custom_checks_metadata_subparser.add_argument(
"--custom-checks-metadata-file",
nargs="?",
default=None,
help="Path for the custom checks metadata YAML file. See example prowler/config/custom_checks_metadata_example.yaml for reference and format. See more in https://docs.prowler.cloud/en/latest/tutorials/custom-checks-metadata/",
)
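The `--custom-checks-metadata-file` argument relies on argparse's `nargs="?"` with a `None` default, so the flag's value is optional; a standalone sketch:

```python
import argparse

parser = argparse.ArgumentParser()
# nargs="?" makes the value optional; when the flag is absent the
# attribute falls back to the default (None here).
parser.add_argument("--custom-checks-metadata-file", nargs="?", default=None)

args_without = parser.parse_args([])
args_with = parser.parse_args(["--custom-checks-metadata-file", "meta.yaml"])
```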

View File

@@ -401,8 +401,7 @@ def display_compliance_table(
"Bajo": 0,
}
if finding.status == "FAIL":
if attribute.Tipo != "recomendacion":
fail_count += 1
fail_count += 1
marcos[marco_categoria][
"Estado"
] = f"{Fore.RED}NO CUMPLE{Style.RESET_ALL}"

View File

@@ -1,55 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.outputs.models import (
Check_Output_CSV_AWS_Well_Architected,
generate_csv_fields,
)
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_aws_well_architected_framework(
file_descriptors, finding, compliance, output_options, audit_info
):
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_AWS_Well_Architected)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_AWS_Well_Architected(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Name=attribute.Name,
Requirements_Attributes_WellArchitectedQuestionId=attribute.WellArchitectedQuestionId,
Requirements_Attributes_WellArchitectedPracticeId=attribute.WellArchitectedPracticeId,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_LevelOfRisk=attribute.LevelOfRisk,
Requirements_Attributes_AssessmentMethod=attribute.AssessmentMethod,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_ImplementationGuidanceUrl=attribute.ImplementationGuidanceUrl,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_writer.writerow(compliance_row.__dict__)
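The compliance writers above all follow the same DictWriter pattern: a generated header plus semicolon-delimited rows. A minimal standalone version (illustrative field names):

```python
import csv
import io

# Write one semicolon-delimited compliance row keyed by a fixed header,
# into an in-memory buffer instead of a real file descriptor.
buffer = io.StringIO()
header = ["Provider", "Status", "ResourceId"]
writer = csv.DictWriter(buffer, fieldnames=header, delimiter=";")
writer.writeheader()
writer.writerow({"Provider": "aws", "Status": "PASS", "ResourceId": "bucket-1"})
output = buffer.getvalue()
```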

View File

@@ -1,36 +0,0 @@
from prowler.lib.outputs.compliance.cis_aws import generate_compliance_row_cis_aws
from prowler.lib.outputs.compliance.cis_gcp import generate_compliance_row_cis_gcp
from prowler.lib.outputs.csv import write_csv
def write_compliance_row_cis(
file_descriptors,
finding,
compliance,
output_options,
audit_info,
input_compliance_frameworks,
):
compliance_output = "cis_" + compliance.Version + "_" + compliance.Provider.lower()
# Only write rows for the CIS version that was selected
if compliance_output in str(input_compliance_frameworks):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
if compliance.Provider == "AWS":
(compliance_row, csv_header) = generate_compliance_row_cis_aws(
finding,
compliance,
requirement,
attribute,
output_options,
audit_info,
)
elif compliance.Provider == "GCP":
(compliance_row, csv_header) = generate_compliance_row_cis_gcp(
finding, compliance, requirement, attribute, output_options
)
write_csv(
file_descriptors[compliance_output], csv_header, compliance_row
)

View File

@@ -1,34 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.models import Check_Output_CSV_AWS_CIS, generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_aws(
finding, compliance, requirement, attribute, output_options, audit_info
):
compliance_row = Check_Output_CSV_AWS_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_header = generate_csv_fields(Check_Output_CSV_AWS_CIS)
return compliance_row, csv_header

View File

@@ -1,35 +0,0 @@
from prowler.config.config import timestamp
from prowler.lib.outputs.models import Check_Output_CSV_GCP_CIS, generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def generate_compliance_row_cis_gcp(
finding, compliance, requirement, attribute, output_options
):
compliance_row = Check_Output_CSV_GCP_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
ProjectId=finding.project_id,
Location=finding.location.lower(),
AssessmentDate=outputs_unix_timestamp(output_options.unix_timestamp, timestamp),
Requirements_Id=requirement.Id,
Requirements_Description=requirement.Description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
ResourceName=finding.resource_name,
CheckId=finding.check_metadata.CheckID,
)
csv_header = generate_csv_fields(Check_Output_CSV_GCP_CIS)
return compliance_row, csv_header

View File

@@ -1,472 +0,0 @@
import sys
from colorama import Fore, Style
from tabulate import tabulate
from prowler.config.config import orange_color
from prowler.lib.check.models import Check_Report
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.aws_well_architected_framework import (
write_compliance_row_aws_well_architected_framework,
)
from prowler.lib.outputs.compliance.cis import write_compliance_row_cis
from prowler.lib.outputs.compliance.ens_rd2022_aws import (
write_compliance_row_ens_rd2022_aws,
)
from prowler.lib.outputs.compliance.generic import write_compliance_row_generic
from prowler.lib.outputs.compliance.iso27001_2013_aws import (
write_compliance_row_iso27001_2013_aws,
)
from prowler.lib.outputs.compliance.mitre_attack_aws import (
write_compliance_row_mitre_attack_aws,
)
def add_manual_controls(
output_options, audit_info, file_descriptors, input_compliance_frameworks
):
try:
# Check if MANUAL control was already added to output
if "manual_check" in output_options.bulk_checks_metadata:
manual_finding = Check_Report(
output_options.bulk_checks_metadata["manual_check"].json()
)
manual_finding.status = "MANUAL"
manual_finding.status_extended = "Manual check"
manual_finding.resource_id = "manual_check"
manual_finding.resource_name = "Manual check"
manual_finding.region = ""
manual_finding.location = ""
manual_finding.project_id = ""
fill_compliance(
output_options,
manual_finding,
audit_info,
file_descriptors,
input_compliance_frameworks,
)
del output_options.bulk_checks_metadata["manual_check"]
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def get_check_compliance_frameworks_in_input(
check_id, bulk_checks_metadata, input_compliance_frameworks
):
"""get_check_compliance_frameworks_in_input returns a list of Compliance for the given check if the compliance framework is present in the input compliance to execute"""
check_compliances = []
if bulk_checks_metadata and bulk_checks_metadata[check_id]:
for compliance in bulk_checks_metadata[check_id].Compliance:
compliance_name = ""
if compliance.Version:
compliance_name = (
compliance.Framework.lower()
+ "_"
+ compliance.Version.lower()
+ "_"
+ compliance.Provider.lower()
)
else:
compliance_name = (
compliance.Framework.lower() + "_" + compliance.Provider.lower()
)
if compliance_name.replace("-", "_") in input_compliance_frameworks:
check_compliances.append(compliance)
return check_compliances
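The framework-name matching above lowercases Framework/Version/Provider, joins them with underscores, and normalizes hyphens; extracted as a helper (hypothetical function name):

```python
def compliance_framework_name(framework, version, provider):
    # Build the lowercase identifier used to match input frameworks;
    # hyphens become underscores so names like "MITRE-ATTACK"
    # normalize consistently.
    parts = [framework.lower()]
    if version:
        parts.append(version.lower())
    parts.append(provider.lower())
    return "_".join(parts).replace("-", "_")


name = compliance_framework_name("CIS", "1.5", "AWS")
```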
def fill_compliance(
output_options, finding, audit_info, file_descriptors, input_compliance_frameworks
):
try:
# Retrieve all the check's compliance requirements and keep the ones matching the input frameworks
check_compliances = get_check_compliance_frameworks_in_input(
finding.check_metadata.CheckID,
output_options.bulk_checks_metadata,
input_compliance_frameworks,
)
for compliance in check_compliances:
if compliance.Framework == "ENS" and compliance.Version == "RD2022":
write_compliance_row_ens_rd2022_aws(
file_descriptors, finding, compliance, output_options, audit_info
)
elif compliance.Framework == "CIS":
write_compliance_row_cis(
file_descriptors,
finding,
compliance,
output_options,
audit_info,
input_compliance_frameworks,
)
elif (
"AWS-Well-Architected-Framework" in compliance.Framework
and compliance.Provider == "AWS"
):
write_compliance_row_aws_well_architected_framework(
file_descriptors, finding, compliance, output_options, audit_info
)
elif (
compliance.Framework == "ISO27001"
and compliance.Version == "2013"
and compliance.Provider == "AWS"
):
write_compliance_row_iso27001_2013_aws(
file_descriptors, finding, compliance, output_options, audit_info
)
elif (
compliance.Framework == "MITRE-ATTACK"
and compliance.Version == ""
and compliance.Provider == "AWS"
):
write_compliance_row_mitre_attack_aws(
file_descriptors, finding, compliance, output_options, audit_info
)
else:
write_compliance_row_generic(
file_descriptors, finding, compliance, output_options, audit_info
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def display_compliance_table(
findings: list,
bulk_checks_metadata: dict,
compliance_framework: str,
output_filename: str,
output_directory: str,
compliance_overview: bool,
):
try:
if "ens_rd2022_aws" == compliance_framework:
marcos = {}
ens_compliance_table = {
"Proveedor": [],
"Marco/Categoria": [],
"Estado": [],
"Alto": [],
"Medio": [],
"Bajo": [],
"Opcional": [],
}
pass_count = fail_count = 0
for finding in findings:
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
compliance.Framework == "ENS"
and compliance.Provider == "AWS"
and compliance.Version == "RD2022"
):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
marco_categoria = (
f"{attribute.Marco}/{attribute.Categoria}"
)
# Check if Marco/Categoria exists
if marco_categoria not in marcos:
marcos[marco_categoria] = {
"Estado": f"{Fore.GREEN}CUMPLE{Style.RESET_ALL}",
"Opcional": 0,
"Alto": 0,
"Medio": 0,
"Bajo": 0,
}
if finding.status == "FAIL":
fail_count += 1
marcos[marco_categoria][
"Estado"
] = f"{Fore.RED}NO CUMPLE{Style.RESET_ALL}"
elif finding.status == "PASS":
pass_count += 1
if attribute.Nivel == "opcional":
marcos[marco_categoria]["Opcional"] += 1
elif attribute.Nivel == "alto":
marcos[marco_categoria]["Alto"] += 1
elif attribute.Nivel == "medio":
marcos[marco_categoria]["Medio"] += 1
elif attribute.Nivel == "bajo":
marcos[marco_categoria]["Bajo"] += 1
# Add results to table
for marco in sorted(marcos):
ens_compliance_table["Proveedor"].append(compliance.Provider)
ens_compliance_table["Marco/Categoria"].append(marco)
ens_compliance_table["Estado"].append(marcos[marco]["Estado"])
ens_compliance_table["Opcional"].append(
f"{Fore.BLUE}{marcos[marco]['Opcional']}{Style.RESET_ALL}"
)
ens_compliance_table["Alto"].append(
f"{Fore.LIGHTRED_EX}{marcos[marco]['Alto']}{Style.RESET_ALL}"
)
ens_compliance_table["Medio"].append(
f"{orange_color}{marcos[marco]['Medio']}{Style.RESET_ALL}"
)
ens_compliance_table["Bajo"].append(
f"{Fore.YELLOW}{marcos[marco]['Bajo']}{Style.RESET_ALL}"
)
if fail_count + pass_count < 1:
print(
f"\nThere are no resources for {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}.\n"
)
else:
print(
f"\nEstado de Cumplimiento de {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}:"
)
overview_table = [
[
f"{Fore.RED}{round(fail_count / (fail_count + pass_count) * 100, 2)}% ({fail_count}) NO CUMPLE{Style.RESET_ALL}",
f"{Fore.GREEN}{round(pass_count / (fail_count + pass_count) * 100, 2)}% ({pass_count}) CUMPLE{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(
f"\nResultados de {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}:"
)
print(
tabulate(
ens_compliance_table,
headers="keys",
tablefmt="rounded_grid",
)
)
print(
f"{Style.BRIGHT}* Solo aparece el Marco/Categoria que contiene resultados.{Style.RESET_ALL}"
)
print(
f"\nResultados detallados de {compliance_framework.upper()} en:"
)
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
elif "cis_" in compliance_framework:
sections = {}
cis_compliance_table = {
"Provider": [],
"Section": [],
"Level 1": [],
"Level 2": [],
}
pass_count = fail_count = 0
for finding in findings:
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
compliance.Framework == "CIS"
and compliance.Version in compliance_framework
):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
section = attribute.Section
# Check if Section exists
if section not in sections:
sections[section] = {
"Status": f"{Fore.GREEN}PASS{Style.RESET_ALL}",
"Level 1": {"FAIL": 0, "PASS": 0},
"Level 2": {"FAIL": 0, "PASS": 0},
}
if finding.status == "FAIL":
fail_count += 1
elif finding.status == "PASS":
pass_count += 1
if attribute.Profile == "Level 1":
if finding.status == "FAIL":
sections[section]["Level 1"]["FAIL"] += 1
else:
sections[section]["Level 1"]["PASS"] += 1
elif attribute.Profile == "Level 2":
if finding.status == "FAIL":
sections[section]["Level 2"]["FAIL"] += 1
else:
sections[section]["Level 2"]["PASS"] += 1
# Add results to table
sections = dict(sorted(sections.items()))
for section in sections:
cis_compliance_table["Provider"].append(compliance.Provider)
cis_compliance_table["Section"].append(section)
if sections[section]["Level 1"]["FAIL"] > 0:
cis_compliance_table["Level 1"].append(
f"{Fore.RED}FAIL({sections[section]['Level 1']['FAIL']}){Style.RESET_ALL}"
)
else:
cis_compliance_table["Level 1"].append(
f"{Fore.GREEN}PASS({sections[section]['Level 1']['PASS']}){Style.RESET_ALL}"
)
if sections[section]["Level 2"]["FAIL"] > 0:
cis_compliance_table["Level 2"].append(
f"{Fore.RED}FAIL({sections[section]['Level 2']['FAIL']}){Style.RESET_ALL}"
)
else:
cis_compliance_table["Level 2"].append(
f"{Fore.GREEN}PASS({sections[section]['Level 2']['PASS']}){Style.RESET_ALL}"
)
if fail_count + pass_count < 1:
print(
f"\nThere are no resources for {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}.\n"
)
else:
print(
f"\nCompliance Status of {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Framework:"
)
overview_table = [
[
f"{Fore.RED}{round(fail_count / (fail_count + pass_count) * 100, 2)}% ({fail_count}) FAIL{Style.RESET_ALL}",
f"{Fore.GREEN}{round(pass_count / (fail_count + pass_count) * 100, 2)}% ({pass_count}) PASS{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(
f"\nFramework {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Results:"
)
print(
tabulate(
cis_compliance_table,
headers="keys",
tablefmt="rounded_grid",
)
)
print(
f"{Style.BRIGHT}* Only sections containing results appear.{Style.RESET_ALL}"
)
print(
f"\nDetailed results of {compliance_framework.upper()} are in:"
)
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
elif "mitre_attack" in compliance_framework:
tactics = {}
mitre_compliance_table = {
"Provider": [],
"Tactic": [],
"Status": [],
}
pass_count = fail_count = 0
for finding in findings:
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
"MITRE-ATTACK" in compliance.Framework
and compliance.Version in compliance_framework
):
for requirement in compliance.Requirements:
for tactic in requirement.Tactics:
if tactic not in tactics:
tactics[tactic] = {"FAIL": 0, "PASS": 0}
if finding.status == "FAIL":
fail_count += 1
tactics[tactic]["FAIL"] += 1
elif finding.status == "PASS":
pass_count += 1
tactics[tactic]["PASS"] += 1
# Add results to table
tactics = dict(sorted(tactics.items()))
for tactic in tactics:
mitre_compliance_table["Provider"].append(compliance.Provider)
mitre_compliance_table["Tactic"].append(tactic)
if tactics[tactic]["FAIL"] > 0:
mitre_compliance_table["Status"].append(
f"{Fore.RED}FAIL({tactics[tactic]['FAIL']}){Style.RESET_ALL}"
)
else:
mitre_compliance_table["Status"].append(
f"{Fore.GREEN}PASS({tactics[tactic]['PASS']}){Style.RESET_ALL}"
)
if fail_count + pass_count < 1:
print(
f"\nThere are no resources for {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}.\n"
)
else:
print(
f"\nCompliance Status of {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Framework:"
)
overview_table = [
[
f"{Fore.RED}{round(fail_count / (fail_count + pass_count) * 100, 2)}% ({fail_count}) FAIL{Style.RESET_ALL}",
f"{Fore.GREEN}{round(pass_count / (fail_count + pass_count) * 100, 2)}% ({pass_count}) PASS{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(
f"\nFramework {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Results:"
)
print(
tabulate(
mitre_compliance_table,
headers="keys",
tablefmt="rounded_grid",
)
)
print(
f"{Style.BRIGHT}* Only tactics containing results appear.{Style.RESET_ALL}"
)
print(
f"\nDetailed results of {compliance_framework.upper()} are in:"
)
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
else:
pass_count = fail_count = 0
for finding in findings:
check = bulk_checks_metadata[finding.check_metadata.CheckID]
check_compliances = check.Compliance
for compliance in check_compliances:
if (
compliance.Framework.upper()
in compliance_framework.upper().replace("_", "-")
and compliance.Version in compliance_framework.upper()
and compliance.Provider in compliance_framework.upper()
):
for requirement in compliance.Requirements:
for attribute in requirement.Attributes:
if finding.status == "FAIL":
fail_count += 1
elif finding.status == "PASS":
pass_count += 1
if fail_count + pass_count < 1:
print(
f"\nThere are no resources for {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL}.\n"
)
else:
print(
f"\nCompliance Status of {Fore.YELLOW}{compliance_framework.upper()}{Style.RESET_ALL} Framework:"
)
overview_table = [
[
f"{Fore.RED}{round(fail_count / (fail_count + pass_count) * 100, 2)}% ({fail_count}) FAIL{Style.RESET_ALL}",
f"{Fore.GREEN}{round(pass_count / (fail_count + pass_count) * 100, 2)}% ({pass_count}) PASS{Style.RESET_ALL}",
]
]
print(tabulate(overview_table, tablefmt="rounded_grid"))
if not compliance_overview:
print(f"\nDetailed results of {compliance_framework.upper()} are in:")
print(
f" - CSV: {output_directory}/compliance/{output_filename}_{compliance_framework}.csv\n"
)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}:{error.__traceback__.tb_lineno} -- {error}"
)
sys.exit(1)
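Every framework branch above derives its overview table from the same PASS/FAIL arithmetic; a minimal standalone sketch of that computation, using hypothetical counts and omitting the colorama formatting:

```python
# Hypothetical counts standing in for the tallies accumulated above
fail_count, pass_count = 3, 7
total = fail_count + pass_count

# Same rounding as the overview_table rows built in each branch
print(f"{round(fail_count / total * 100, 2)}% ({fail_count}) FAIL")
print(f"{round(pass_count / total * 100, 2)}% ({pass_count}) PASS")
```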

View File

@@ -1,45 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.outputs.models import Check_Output_CSV_ENS_RD2022, generate_csv_fields
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_ens_rd2022_aws(
file_descriptors, finding, compliance, output_options, audit_info
):
compliance_output = "ens_rd2022_aws"
csv_header = generate_csv_fields(Check_Output_CSV_ENS_RD2022)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_ENS_RD2022(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_IdGrupoControl=attribute.IdGrupoControl,
Requirements_Attributes_Marco=attribute.Marco,
Requirements_Attributes_Categoria=attribute.Categoria,
Requirements_Attributes_DescripcionControl=attribute.DescripcionControl,
Requirements_Attributes_Nivel=attribute.Nivel,
Requirements_Attributes_Tipo=attribute.Tipo,
Requirements_Attributes_Dimensiones=",".join(attribute.Dimensiones),
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_writer.writerow(compliance_row.__dict__)

View File

@@ -1,51 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.outputs.models import (
Check_Output_CSV_Generic_Compliance,
generate_csv_fields,
)
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_generic(
file_descriptors, finding, compliance, output_options, audit_info
):
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_Generic_Compliance)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_Generic_Compliance(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_SubSection=attribute.SubSection,
Requirements_Attributes_SubGroup=attribute.SubGroup,
Requirements_Attributes_Service=attribute.Service,
Requirements_Attributes_Soc_Type=attribute.Soc_Type,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_writer.writerow(compliance_row.__dict__)
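The `compliance_output` slug built at the top of this writer concatenates framework, version, and provider, then normalizes the result; a sketch with sample values (the real ones come from the compliance model):

```python
# Sample values, chosen for illustration only
framework, version, provider = "CIS", "1.5", "AWS"

compliance_output = framework
if version != "":
    compliance_output += "_" + version
if provider != "":
    compliance_output += "_" + provider

# Lowercase and swap dashes for underscores, as the writer above does
print(compliance_output.lower().replace("-", "_"))  # cis_1.5_aws
```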

View File

@@ -1,53 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.outputs.models import (
Check_Output_CSV_AWS_ISO27001_2013,
generate_csv_fields,
)
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_iso27001_2013_aws(
file_descriptors, finding, compliance, output_options, audit_info
):
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_CSV_AWS_ISO27001_2013)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
requirement_name = requirement.Name
for attribute in requirement.Attributes:
compliance_row = Check_Output_CSV_AWS_ISO27001_2013(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Name=requirement_name,
Requirements_Description=requirement_description,
Requirements_Attributes_Category=attribute.Category,
Requirements_Attributes_Objetive_ID=attribute.Objetive_ID,
Requirements_Attributes_Objetive_Name=attribute.Objetive_Name,
Requirements_Attributes_Check_Summary=attribute.Check_Summary,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_writer.writerow(compliance_row.__dict__)

View File

@@ -1,66 +0,0 @@
from csv import DictWriter
from prowler.config.config import timestamp
from prowler.lib.outputs.models import (
Check_Output_MITRE_ATTACK,
generate_csv_fields,
unroll_list,
)
from prowler.lib.utils.utils import outputs_unix_timestamp
def write_compliance_row_mitre_attack_aws(
file_descriptors, finding, compliance, output_options, audit_info
):
compliance_output = compliance.Framework
if compliance.Version != "":
compliance_output += "_" + compliance.Version
if compliance.Provider != "":
compliance_output += "_" + compliance.Provider
compliance_output = compliance_output.lower().replace("-", "_")
csv_header = generate_csv_fields(Check_Output_MITRE_ATTACK)
csv_writer = DictWriter(
file_descriptors[compliance_output],
fieldnames=csv_header,
delimiter=";",
)
for requirement in compliance.Requirements:
requirement_description = requirement.Description
requirement_id = requirement.Id
requirement_name = requirement.Name
attributes_aws_services = ""
attributes_categories = ""
attributes_values = ""
attributes_comments = ""
for attribute in requirement.Attributes:
attributes_aws_services += attribute.AWSService + "\n"
attributes_categories += attribute.Category + "\n"
attributes_values += attribute.Value + "\n"
attributes_comments += attribute.Comment + "\n"
compliance_row = Check_Output_MITRE_ATTACK(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
AccountId=audit_info.audited_account,
Region=finding.region,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Name=requirement_name,
Requirements_Tactics=unroll_list(requirement.Tactics),
Requirements_SubTechniques=unroll_list(requirement.SubTechniques),
Requirements_Platforms=unroll_list(requirement.Platforms),
Requirements_TechniqueURL=requirement.TechniqueURL,
Requirements_Attributes_AWSServices=attributes_aws_services,
Requirements_Attributes_Categories=attributes_categories,
Requirements_Attributes_Values=attributes_values,
Requirements_Attributes_Comments=attributes_comments,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
CheckId=finding.check_metadata.CheckID,
)
csv_writer.writerow(compliance_row.__dict__)

View File

@@ -1,10 +0,0 @@
from csv import DictWriter
def write_csv(file_descriptor, headers, row):
csv_writer = DictWriter(
file_descriptor,
fieldnames=headers,
delimiter=";",
)
csv_writer.writerow(row.__dict__)
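This helper relies on the row object's `__dict__` matching the header list; a runnable sketch using an in-memory buffer and a hypothetical row object:

```python
import csv
import io
from types import SimpleNamespace

def write_csv(file_descriptor, headers, row):
    # Same shape as the helper above: semicolon-delimited DictWriter
    csv_writer = csv.DictWriter(file_descriptor, fieldnames=headers, delimiter=";")
    csv_writer.writerow(row.__dict__)

buf = io.StringIO()
write_csv(buf, ["Provider", "Status"], SimpleNamespace(Provider="aws", Status="PASS"))
print(buf.getvalue().strip())  # aws;PASS
```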

View File

@@ -12,6 +12,8 @@ from prowler.config.config import (
from prowler.lib.logger import logger
from prowler.lib.outputs.html import add_html_header
from prowler.lib.outputs.models import (
Aws_Check_Output_CSV,
Azure_Check_Output_CSV,
Check_Output_CSV_AWS_CIS,
Check_Output_CSV_AWS_ISO27001_2013,
Check_Output_CSV_AWS_Well_Architected,
@@ -19,19 +21,19 @@ from prowler.lib.outputs.models import (
Check_Output_CSV_GCP_CIS,
Check_Output_CSV_Generic_Compliance,
Check_Output_MITRE_ATTACK,
Gcp_Check_Output_CSV,
generate_csv_fields,
)
from prowler.lib.utils.utils import file_exists, open_file
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.common.outputs import get_provider_output_model
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info
def initialize_file_descriptor(
filename: str,
output_mode: str,
audit_info: Any,
audit_info: AWS_Audit_Info,
format: Any = None,
) -> TextIOWrapper:
"""Open or create the output file, including headers or the required format if needed"""
@@ -73,15 +75,27 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
for output_mode in output_modes:
if output_mode == "csv":
filename = f"{output_directory}/{output_filename}{csv_file_suffix}"
output_model = get_provider_output_model(
audit_info.__class__.__name__
)
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
output_model,
)
if isinstance(audit_info, AWS_Audit_Info):
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Aws_Check_Output_CSV,
)
if isinstance(audit_info, Azure_Audit_Info):
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Azure_Check_Output_CSV,
)
if isinstance(audit_info, GCP_Audit_Info):
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Gcp_Check_Output_CSV,
)
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "json":
@@ -109,7 +123,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
elif isinstance(audit_info, GCP_Audit_Info):
if output_mode == "cis_2.0_gcp":
filename = f"{output_directory}/compliance/{output_filename}_cis_2.0_gcp{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_cis_2.0_gcp{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename, output_mode, audit_info, Check_Output_CSV_GCP_CIS
)
@@ -124,7 +138,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "ens_rd2022_aws":
filename = f"{output_directory}/compliance/{output_filename}_ens_rd2022_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_ens_rd2022_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
@@ -134,14 +148,14 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "cis_1.5_aws":
filename = f"{output_directory}/compliance/{output_filename}_cis_1.5_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_cis_1.5_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename, output_mode, audit_info, Check_Output_CSV_AWS_CIS
)
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "cis_1.4_aws":
filename = f"{output_directory}/compliance/{output_filename}_cis_1.4_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_cis_1.4_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename, output_mode, audit_info, Check_Output_CSV_AWS_CIS
)
@@ -151,7 +165,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
output_mode
== "aws_well_architected_framework_security_pillar_aws"
):
filename = f"{output_directory}/compliance/{output_filename}_aws_well_architected_framework_security_pillar_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_aws_well_architected_framework_security_pillar_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
@@ -164,7 +178,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
output_mode
== "aws_well_architected_framework_reliability_pillar_aws"
):
filename = f"{output_directory}/compliance/{output_filename}_aws_well_architected_framework_reliability_pillar_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_aws_well_architected_framework_reliability_pillar_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
@@ -174,7 +188,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "iso27001_2013_aws":
filename = f"{output_directory}/compliance/{output_filename}_iso27001_2013_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_iso27001_2013_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
@@ -184,7 +198,7 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
file_descriptors.update({output_mode: file_descriptor})
elif output_mode == "mitre_attack_aws":
filename = f"{output_directory}/compliance/{output_filename}_mitre_attack_aws{csv_file_suffix}"
filename = f"{output_directory}/{output_filename}_mitre_attack_aws{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
@@ -195,26 +209,14 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
else:
# Generic Compliance framework
if (
isinstance(audit_info, AWS_Audit_Info)
and "aws" in output_mode
or (
isinstance(audit_info, Azure_Audit_Info)
and "azure" in output_mode
)
or (
isinstance(audit_info, GCP_Audit_Info)
and "gcp" in output_mode
)
):
filename = f"{output_directory}/compliance/{output_filename}_{output_mode}{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Check_Output_CSV_Generic_Compliance,
)
file_descriptors.update({output_mode: file_descriptor})
filename = f"{output_directory}/{output_filename}_{output_mode}{csv_file_suffix}"
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Check_Output_CSV_Generic_Compliance,
)
file_descriptors.update({output_mode: file_descriptor})
except Exception as error:
logger.error(

View File

@@ -21,7 +21,6 @@ from prowler.lib.utils.utils import open_file
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info
from prowler.providers.kubernetes.lib.audit_info.models import Kubernetes_Audit_Info
def add_html_header(file_descriptor, audit_info):
@@ -170,11 +169,11 @@ def add_html_header(file_descriptor, audit_info):
def fill_html(file_descriptor, finding, output_options):
try:
row_class = "p-3 mb-2 bg-success-custom"
if finding.status == "MANUAL":
if finding.status == "INFO":
row_class = "table-info"
elif finding.status == "FAIL":
row_class = "table-danger"
elif finding.status == "MUTED":
elif finding.status == "WARNING":
row_class = "table-warning"
file_descriptor.write(
f"""
@@ -339,9 +338,8 @@ def add_html_footer(output_filename, output_directory):
def get_aws_html_assessment_summary(audit_info):
try:
if isinstance(audit_info, AWS_Audit_Info):
profile = (
audit_info.profile if audit_info.profile is not None else "default"
)
if not audit_info.profile:
audit_info.profile = "ENV"
if isinstance(audit_info.audited_regions, list):
audited_regions = " ".join(audit_info.audited_regions)
elif not audit_info.audited_regions:
@@ -363,7 +361,7 @@ def get_aws_html_assessment_summary(audit_info):
</li>
<li class="list-group-item">
<b>AWS-CLI Profile:</b> """
+ profile
+ audit_info.profile
+ """
</li>
<li class="list-group-item">
@@ -523,53 +521,6 @@ def get_gcp_html_assessment_summary(audit_info):
sys.exit(1)
def get_kubernetes_html_assessment_summary(audit_info):
try:
if isinstance(audit_info, Kubernetes_Audit_Info):
return (
"""
<div class="col-md-2">
<div class="card">
<div class="card-header">
Kubernetes Assessment Summary
</div>
<ul class="list-group list-group-flush">
<li class="list-group-item">
<b>Kubernetes Context:</b> """
+ audit_info.context["name"]
+ """
</li>
</ul>
</div>
</div>
<div class="col-md-4">
<div class="card">
<div class="card-header">
Kubernetes Credentials
</div>
<ul class="list-group list-group-flush">
<li class="list-group-item">
<b>Kubernetes Cluster:</b> """
+ audit_info.context["context"]["cluster"]
+ """
</li>
<li class="list-group-item">
<b>Kubernetes User:</b> """
+ audit_info.context["context"]["user"]
+ """
</li>
</ul>
</div>
</div>
"""
)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
def get_assessment_summary(audit_info):
"""
get_assessment_summary gets the HTML assessment summary for the provider
@@ -580,7 +531,6 @@ def get_assessment_summary(audit_info):
# AWS_Audit_Info --> aws
# GCP_Audit_Info --> gcp
# Azure_Audit_Info --> azure
# Kubernetes_Audit_Info --> kubernetes
provider = audit_info.__class__.__name__.split("_")[0].lower()
# Dynamically get the Provider quick inventory handler

View File

@@ -31,7 +31,6 @@ from prowler.lib.outputs.models import (
unroll_dict_to_list,
)
from prowler.lib.utils.utils import hash_sha512, open_file, outputs_unix_timestamp
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
def fill_json_asff(finding_output, audit_info, finding, output_options):
@@ -116,8 +115,8 @@ def generate_json_asff_status(status: str) -> str:
json_asff_status = "PASSED"
elif status == "FAIL":
json_asff_status = "FAILED"
elif status == "MUTED":
json_asff_status = "MUTED"
elif status == "WARNING":
json_asff_status = "WARNING"
else:
json_asff_status = "NOT_AVAILABLE"
@@ -156,11 +155,6 @@ def fill_json_ocsf(audit_info, finding, output_options) -> Check_Output_JSON_OCS
aws_org_uid = ""
account = None
org = None
profile = ""
if isinstance(audit_info, AWS_Audit_Info):
profile = (
audit_info.profile if audit_info.profile is not None else "default"
)
if (
hasattr(audit_info, "organizations_metadata")
and audit_info.organizations_metadata
@@ -255,7 +249,9 @@ def fill_json_ocsf(audit_info, finding, output_options) -> Check_Output_JSON_OCS
original_time=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
profiles=[profile],
profiles=[audit_info.profile]
if hasattr(audit_info, "organizations_metadata")
else [],
)
compliance = Compliance_OCSF(
status=generate_json_ocsf_status(finding.status),
@@ -293,7 +289,7 @@ def generate_json_ocsf_status(status: str):
json_ocsf_status = "Success"
elif status == "FAIL":
json_ocsf_status = "Failure"
elif status == "MUTED":
elif status == "WARNING":
json_ocsf_status = "Other"
else:
json_ocsf_status = "Unknown"
@@ -307,7 +303,7 @@ def generate_json_ocsf_status_id(status: str):
json_ocsf_status_id = 1
elif status == "FAIL":
json_ocsf_status_id = 2
elif status == "MUTED":
elif status == "WARNING":
json_ocsf_status_id = 99
else:
json_ocsf_status_id = 0
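The status-to-OCSF-id branching above can be condensed to a lookup; a sketch with the same values, dict-based rather than if/elif (an illustrative rewrite, not the project's code):

```python
# Dict-based equivalent of the if/elif chain above
def generate_json_ocsf_status_id(status: str) -> int:
    return {"PASS": 1, "FAIL": 2, "WARNING": 99}.get(status, 0)

print(generate_json_ocsf_status_id("FAIL"))  # 2
print(generate_json_ocsf_status_id("ERROR"))  # 0
```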

View File

@@ -10,19 +10,10 @@ from prowler.config.config import prowler_version, timestamp
from prowler.lib.check.models import Remediation
from prowler.lib.logger import logger
from prowler.lib.utils.utils import outputs_unix_timestamp
from prowler.providers.aws.lib.audit_info.models import AWSOrganizationsInfo
from prowler.providers.aws.lib.audit_info.models import AWS_Organizations_Info
def get_check_compliance(finding, provider, output_options) -> dict:
"""get_check_compliance returns a map with each compliance framework as key and, as value, the requirement IDs where the finding's check is present.
Example:
{
"CIS-1.4": ["2.1.3"],
"CIS-1.5": ["2.1.3"],
}
"""
def get_check_compliance(finding, provider, output_options):
try:
check_compliance = {}
# We have to retrieve all the check's compliance requirements
@@ -85,18 +76,6 @@ def generate_provider_output_csv(
)
finding_output = output_model(**data)
if provider == "kubernetes":
data["resource_id"] = finding.resource_id
data["resource_name"] = finding.resource_name
data["namespace"] = finding.namespace
data[
"finding_unique_id"
] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.namespace}-{finding.resource_id}"
data["compliance"] = unroll_dict(
get_check_compliance(finding, provider, output_options)
)
finding_output = output_model(**data)
if provider == "aws":
data["profile"] = audit_info.profile
data["account_id"] = audit_info.audited_account
@@ -369,16 +348,6 @@ class Gcp_Check_Output_CSV(Check_Output_CSV):
resource_name: str = ""
class Kubernetes_Check_Output_CSV(Check_Output_CSV):
"""
Kubernetes_Check_Output_CSV generates a finding's output in CSV format for the Kubernetes provider.
"""
namespace: str = ""
resource_id: str = ""
resource_name: str = ""
def generate_provider_output_json(
provider: str, finding, audit_info, mode: str, output_options
):
@@ -483,7 +452,7 @@ class Aws_Check_Output_JSON(Check_Output_JSON):
Profile: str = ""
AccountId: str = ""
OrganizationsInfo: Optional[AWSOrganizationsInfo]
OrganizationsInfo: Optional[AWS_Organizations_Info]
Region: str = ""
ResourceId: str = ""
ResourceArn: str = ""
@@ -509,7 +478,7 @@ class Azure_Check_Output_JSON(Check_Output_JSON):
class Gcp_Check_Output_JSON(Check_Output_JSON):
"""
Gcp_Check_Output_JSON generates a finding's output in JSON format for the GCP provider.
Gcp_Check_Output_JSON generates a finding's output in JSON format for the AWS provider.
"""
ProjectId: str = ""
@@ -521,19 +490,6 @@ class Gcp_Check_Output_JSON(Check_Output_JSON):
super().__init__(**metadata)
class Kubernetes_Check_Output_JSON(Check_Output_JSON):
"""
Kubernetes_Check_Output_JSON generates a finding's output in JSON format for the Kubernetes provider.
"""
ResourceId: str = ""
ResourceName: str = ""
Namespace: str = ""
def __init__(self, **metadata):
super().__init__(**metadata)
class Check_Output_MITRE_ATTACK(BaseModel):
"""
Check_Output_MITRE_ATTACK generates a finding's output in CSV MITRE ATTACK format.

View File

@@ -4,10 +4,7 @@ from colorama import Fore, Style
from prowler.config.config import available_compliance_frameworks, orange_color
from prowler.lib.logger import logger
from prowler.lib.outputs.compliance.compliance import (
add_manual_controls,
fill_compliance,
)
from prowler.lib.outputs.compliance import add_manual_controls, fill_compliance
from prowler.lib.outputs.file_descriptors import fill_file_descriptors
from prowler.lib.outputs.html import fill_html
from prowler.lib.outputs.json import fill_json_asff, fill_json_ocsf
@@ -20,17 +17,15 @@ from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
def stdout_report(finding, color, verbose, status):
def stdout_report(finding, color, verbose, is_quiet):
if finding.check_metadata.Provider == "aws":
details = finding.region
if finding.check_metadata.Provider == "azure":
details = finding.check_metadata.ServiceName
if finding.check_metadata.Provider == "gcp":
details = finding.location.lower()
if finding.check_metadata.Provider == "kubernetes":
details = finding.namespace.lower()
if verbose and (not status or finding.status in status):
if verbose and not (is_quiet and finding.status != "FAIL"):
print(
f"\t{color}{finding.status}{Style.RESET_ALL} {details}: {finding.status_extended}"
)
@@ -62,35 +57,28 @@ def report(check_findings, output_options, audit_info):
# Print findings by stdout
color = set_report_color(finding.status)
stdout_report(
finding, color, output_options.verbose, output_options.status
finding, color, output_options.verbose, output_options.is_quiet
)
if file_descriptors:
# Check if --status is enabled and if the filter applies
if (
not output_options.status
or finding.status in output_options.status
):
input_compliance_frameworks = list(
set(output_options.output_modes).intersection(
available_compliance_frameworks
# When --quiet is set, only add FAIL findings to the outputs
if not (finding.status != "FAIL" and output_options.is_quiet):
if any(
compliance in output_options.output_modes
for compliance in available_compliance_frameworks
):
fill_compliance(
output_options,
finding,
audit_info,
file_descriptors,
)
)
fill_compliance(
output_options,
finding,
audit_info,
file_descriptors,
input_compliance_frameworks,
)
add_manual_controls(
output_options,
audit_info,
file_descriptors,
input_compliance_frameworks,
)
add_manual_controls(
output_options,
audit_info,
file_descriptors,
)
# AWS specific outputs
if finding.check_metadata.Provider == "aws":
@@ -152,7 +140,7 @@ def report(check_findings, output_options, audit_info):
file_descriptors["json-ocsf"].write(",")
else: # No service resources in the whole account
color = set_report_color("MANUAL")
color = set_report_color("INFO")
if output_options.verbose:
print(f"\t{color}INFO{Style.RESET_ALL} There are no resources")
# Separator between findings and bar
@@ -177,12 +165,12 @@ def set_report_color(status: str) -> str:
color = Fore.RED
elif status == "ERROR":
color = Fore.BLACK
elif status == "MUTED":
elif status == "WARNING":
color = orange_color
elif status == "MANUAL":
elif status == "INFO":
color = Fore.YELLOW
else:
raise Exception("Invalid Report Status. Must be PASS, FAIL, ERROR or MUTED")
raise Exception("Invalid Report Status. Must be PASS, FAIL, ERROR or WARNING")
return color
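The same mapping as a table, following the WARNING/INFO variant shown above and using plain color names as stand-ins for the colorama constants (an illustrative sketch, not the project's code):

```python
# Plain-string stand-ins for the colorama constants used above
STATUS_COLORS = {
    "PASS": "green",
    "FAIL": "red",
    "ERROR": "black",
    "WARNING": "orange",
    "INFO": "yellow",
}

def set_report_color(status: str) -> str:
    try:
        return STATUS_COLORS[status]
    except KeyError:
        raise Exception("Invalid Report Status. Must be PASS, FAIL, ERROR or WARNING")

print(set_report_color("PASS"))  # green
```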

View File

@@ -39,9 +39,6 @@ def display_summary_table(
elif provider == "gcp":
entity_type = "Project ID/s"
audited_entities = ", ".join(audit_info.project_ids)
elif provider == "kubernetes":
entity_type = "Context"
audited_entities = audit_info.context["name"]
if findings:
current = {

View File

@@ -1,485 +0,0 @@
import os
import pathlib
from datetime import timedelta
from time import time
from rich.align import Align
from rich.console import Console, Group
from rich.layout import Layout
from rich.live import Live
from rich.padding import Padding
from rich.panel import Panel
from rich.progress import (
BarColumn,
MofNCompleteColumn,
Progress,
TextColumn,
TimeElapsedColumn,
TimeRemainingColumn,
)
from rich.rule import Rule
from rich.table import Table
from rich.text import Text
from rich.theme import Theme
from prowler.config.config import prowler_version, timestamp
from prowler.providers.aws.models import AWSIdentityInfo, AWSAssumeRole
# Defines a subclass of Live for creating and managing the live display in the CLI
class LiveDisplay(Live):
def __init__(self, *args, **kwargs):
# Load a theme for the console display from a file
theme = self.load_theme_from_file()
super().__init__(renderable=None, console=Console(theme=theme), *args, **kwargs)
self.sections = {} # Stores different sections of the layout
self.enabled = False # Flag to enable or disable the live display
# Sets up the layout of the live display
def make_layout(self):
"""
Defines the layout.
Making sections invisible so it doesn't show the default Layout metadata before content is added
Text(" ") is to stop the layout metadata from rendering before the layout is updated with real content
client_and_service handles client init (when importing clients) and service check execution
"""
self.layout = Layout(name="root")
# Split layout into intro, overall progress, and main sections
self.layout.split(
Layout(name="intro", ratio=3, minimum_size=9),
Layout(Text(" "), name="overall_progress", minimum_size=5),
Layout(name="main", ratio=10),
)
# Further split intro layout into body and creds sections
self.layout["intro"].split_row(
Layout(name="body", ratio=3),
Layout(name="creds", ratio=2, visible=False),
)
# Split main layout into client_and_service and results sections
self.layout["main"].split_row(
Layout(
Text(" "), name="client_and_service", ratio=3
), # For client_init and service
Layout(name="results", ratio=2, visible=False),
)
# Loads a theme from a YAML file located in the same directory as this file
def load_theme_from_file(self):
# Loads theme.yaml from the same folder as this file
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open(f"{actual_directory}/theme.yaml") as f:
theme = Theme.from_file(f)
return theme
# Initializes the layout and sections based on CLI arguments
def initialize(self, args):
# A way to get around passing args to LiveDisplay when it is initialized
# This is so that the live_display object can be initialized in this file and imported into other parts of prowler
self.cli_args = args
self.enabled = not args.only_logs
if self.enabled:
# Initialize layout
self.make_layout()
# Apply layout
self.update(self.layout)
# Add Intro section
intro_layout = self.layout["intro"]
intro_section = IntroSection(args, intro_layout)
self.sections["intro"] = intro_section
# Start live display
self.start()
# Adds AWS credentials to the display
def print_aws_credentials(self, aws_identity_info: AWSIdentityInfo, assumed_role_info: AWSAssumeRole):
# Adds the AWS credentials to the display - will need to extend to gcp and azure
# Create a new function for gcp and azure in this class, that will call a function in the intro_section class
intro_section = self.sections["intro"]
intro_section.add_aws_credentials(aws_identity_info, assumed_role_info)
# Adds and manages the overall progress section
def add_overall_progress_section(self, total_checks_dict):
overall_progress_section = OverallProgressSection(total_checks_dict)
overall_progress_layout = self.layout["overall_progress"]
overall_progress_layout.update(overall_progress_section)
overall_progress_layout.visible = True
self.sections["overall_progress"] = overall_progress_section
# Add results section
self.add_results_section()
# Wrapper function to increment the overall progress
def increment_overall_check_progress(self):
# Called by ExecutionManager
if self.enabled:
section = self.sections["overall_progress"]
section.increment_check_progress()
# Wrapper function to increment the progress for the current service
def increment_overall_service_progress(self):
# Called by ExecutionManager
if self.enabled:
section = self.sections["overall_progress"]
section.increment_service_progress()
# Adds and manages the results section
def add_results_section(self):
# Initializes the results section
results_layout = self.layout["results"]
results_section = ResultsSection()
results_layout.update(results_section)
results_layout.visible = True
self.sections["results"] = results_section
def add_results_for_service(self, service_name, service_findings):
# Adds rows to the Service Check Results table
if self.enabled:
results_section = self.sections["results"]
results_section.add_results_for_service(service_name, service_findings)
# Client Init Section
def add_client_init_section(self, service_name):
# Used to track progress of client init process
if self.enabled:
client_init_section = ClientInitSection(service_name)
self.sections["client_and_service"] = client_init_section
self.layout["client_and_service"].update(client_init_section)
self.layout["client_and_service"].visible = True
# Service Section
def add_service_section(self, service_name, total_checks):
# Used to create the ServiceSection when checks start to execute (after clients have been imported)
if self.enabled:
service_section = ServiceSection(service_name, total_checks)
self.sections["client_and_service"] = service_section
self.layout["client_and_service"].update(service_section)
def increment_check_progress(self):
if self.enabled:
service_section = self.sections["client_and_service"]
service_section.increment_check_progress()
# Misc
def get_service_section(self):
# Used by Check
if self.enabled:
return self.sections["client_and_service"]
def get_client_init_section(self):
# Used by AWSService
if self.enabled:
return self.sections["client_and_service"]
def hide_service_section(self):
# To hide the last service after execution has completed
self.layout["client_and_service"].visible = False
def print_message(self, message):
# Not used yet
self.console.print(message)
# The following classes (ServiceSection, ClientInitSection, IntroSection, OverallProgressSection, ResultsSection)
# are used to define different sections of the live display, each with its own layout, progress bars,
class ServiceSection:
def __init__(self, service_name, total_checks) -> None:
self.service_name = service_name
self.total_checks = total_checks
self.renderables = self.create_service_section()
self.start_check_progress()
def __rich__(self):
return Padding(self.renderables, (2, 2))
def create_service_section(self):
# Create the progress components
self.check_progress = Progress(
TextColumn("[bold]{task.description}"),
BarColumn(bar_width=None),
MofNCompleteColumn(),
transient=False, # Optional: set True if you want the progress bar to disappear after completion
)
# Used to add titles that don't need progress bars
self.title_bar = Progress(
TextColumn("[progress.description]{task.description}"), transient=True
)
# Progress Bar for Service Init and Checks
self.task_progress = Progress(
TextColumn("[progress.description]{task.description}"),
BarColumn(bar_width=None),
MofNCompleteColumn(),
TimeElapsedColumn(),
TimeRemainingColumn(),
transient=True,
)
return Group(
Panel(
Group(
self.check_progress,
Rule(style="bold blue"),
self.title_bar,
Rule(style="bold blue"),
self.task_progress,
),
title=f"Service: {self.service_name}",
),
)
def start_check_progress(self):
self.check_progress_task_id = self.check_progress.add_task(
"Checks executed", total=self.total_checks
)
def increment_check_progress(self):
self.check_progress.update(self.check_progress_task_id, advance=1)
class ClientInitSection:
def __init__(self, client_name) -> None:
self.client_name = client_name
self.renderables = self.create_client_init_section()
def __rich__(self):
return Padding(self.renderables, (2, 2))
def create_client_init_section(self):
# Progress Bar for Checks
self.task_progress_bar = Progress(
TextColumn("[progress.description]{task.description}"),
BarColumn(bar_width=None),
MofNCompleteColumn(),
TimeElapsedColumn(),
TimeRemainingColumn(),
transient=True,
)
return Group(
Panel(
Group(
self.task_progress_bar,
),
title=f"Initializing {self.client_name.replace('_', ' ')}",
),
)
class IntroSection:
def __init__(self, args, layout: Layout) -> None:
self.body_layout = layout["body"]
self.creds_layout = layout["creds"]
self.renderables = []
self.title = f"Prowler v{prowler_version}"
if not args.no_banner:
self.create_banner(args)
def __rich__(self):
return Group(*self.renderables)
def create_banner(self, args):
banner_text = f"""[banner_color] _
_ __ _ __ _____ _| | ___ _ __
| '_ \| '__/ _ \ \ /\ / / |/ _ \ '__|
| |_) | | | (_) \ V V /| | __/ |
| .__/|_| \___/ \_/\_/ |_|\___|_|v{prowler_version}
|_|[/banner_color][banner_blue]the handy cloud security tool[/banner_blue]
[info]Date: {timestamp.strftime('%Y-%m-%d %H:%M:%S')}[/info]
"""
if args.verbose:
banner_text += """
Color code for results:
- [info]INFO (Information)[/info]
- [pass]PASS (Recommended value)[/pass]
- [orange_color]WARNING (Ignored by mutelist)[/orange_color]
- [fail]FAIL (Fix required)[/fail]
"""
self.renderables.append(banner_text)
self.body_layout.update(Group(*self.renderables))
self.body_layout.visible = True
def add_aws_credentials(self, aws_identity_info: AWSIdentityInfo, assumed_role_info: AWSAssumeRole):
# Beautify audited regions, and set to "all" if there is no filter region
regions = (
", ".join(aws_identity_info.audited_regions)
if aws_identity_info.audited_regions is not None
else "all"
)
# Beautify audited profile, and set to "default" if there is no profile set
profile = aws_identity_info.profile if aws_identity_info.profile is not None else "default"
content = Text()
content.append(
"This report is being generated using credentials below:\n\n", style="bold"
)
content.append("AWS-CLI Profile: ", style="bold")
content.append(f"[{profile}]\n", style="info")
content.append("AWS Filter Region: ", style="bold")
content.append(f"[{regions}]\n", style="info")
content.append("AWS Account: ", style="bold")
content.append(f"[{aws_identity_info.account}]\n", style="info")
content.append("UserId: ", style="bold")
content.append(f"[{aws_identity_info.user_id}]\n", style="info")
content.append("Caller Identity ARN: ", style="bold")
content.append(f"[{aws_identity_info.identity_arn}]\n", style="info")
# If a role has been assumed, print the Assumed Role ARN
if assumed_role_info.role_arn is not None:
content.append("Assumed Role ARN: ", style="bold")
content.append(f"[{assumed_role_info.role_arn}]\n", style="info")
self.creds_layout.update(content)
self.creds_layout.visible = True
class OverallProgressSection:
def __init__(self, total_checks_dict: dict) -> None:
self.start_time = time() # Start the timer
self.renderables = self.create_renderable(total_checks_dict)
def __rich__(self):
elapsed_time = self.total_time_taken()
return Group(*self.renderables, f"Total time taken: {elapsed_time}")
def total_time_taken(self):
elapsed_seconds = int(time() - self.start_time)
elapsed_time = timedelta(seconds=elapsed_seconds)
return elapsed_time
def create_renderable(self, total_checks_dict):
services_num = len(total_checks_dict) # number of keys == number of services
checks_num = sum(total_checks_dict.values())
plural_string = "checks"
singular_string = "check"
check_noun = plural_string if checks_num > 1 else singular_string
# Create the progress bar
self.overall_progress_bar = Progress(
TextColumn("[bold]{task.description}"),
BarColumn(bar_width=None),
MofNCompleteColumn(),
transient=False, # Optional: set True if you want the progress bar to disappear after completion
)
# Create the Services Completed task, to track the number of services completed
self.service_progress_task_id = self.overall_progress_bar.add_task(
"Services completed", total=services_num
)
# Create the Checks Completed task, to track the number of checks completed across all services
self.check_progress_task_id = self.overall_progress_bar.add_task(
"Checks executed", total=checks_num
)
content = Text()
content.append(
f"Executing {checks_num} {check_noun} across {services_num} services, please wait...\n",
style="bold",
)
return [content, self.overall_progress_bar]
def increment_check_progress(self):
self.overall_progress_bar.update(self.check_progress_task_id, advance=1)
def increment_service_progress(self):
self.overall_progress_bar.update(self.service_progress_task_id, advance=1)
class ResultsSection:
def __init__(self, verbose=True):
self.verbose = verbose
self.table = Table(title="Service Check Results")
self.table.add_column("Service", justify="left")
if self.verbose:
self.severities = ["critical", "high", "medium", "low"]
# Add columns for each severity level when verbose, report on the count of fails per severity per service
for severity in self.severities:
styled_header = (
f"[{severity.lower()}]{severity.capitalize()}[/{severity.lower()}]"
)
self.table.add_column(styled_header, justify="center")
else:
# Dynamically track the statuses; report on the status counts for each service
self.status_columns = set(["PASS", "FAIL"])
self.service_findings = {} # Dictionary to store findings for each service
# Dictionary to map plain statuses to their stylized forms
self.status_headers = {
"FAIL": "[fail]Fail[/fail]",
"PASS": "[pass]Pass[/pass]",
}
# Add the initial columns with styling
for status, header in self.status_headers.items():
self.table.add_column(header, justify="center")
def add_results_for_service(self, service_name, service_findings):
if self.verbose:
# Count fails per severity
severity_counts = {severity: 0 for severity in self.severities}
for finding in service_findings:
if finding.status == "FAIL":
severity_counts[finding.check_metadata.Severity] += 1
# Add row with severity counts
row = [service_name] + [
str(severity_counts[severity]) for severity in self.severities
]
self.table.add_row(*row)
else:
# Update the dictionary with the new findings
status_counts = {report.status: 0 for report in service_findings}
for report in service_findings:
status_counts[report.status] += 1
self.service_findings[service_name] = status_counts
# Update status_columns and table columns
self.status_columns.update(status_counts.keys())
for status in self.status_columns:
if status not in self.status_headers:
# [{status.lower()}] is for the styling (defined in theme.yaml)
# If new status, add it to status_headers and table
styled_header = (
f"[{status.lower()}]{status.capitalize()}[/{status.lower()}]"
)
self.status_headers[status] = styled_header
self.table.add_column(styled_header, justify="center")
# Update the table with findings for all services
self._update_table()
def _update_table(self):
# Used for when verbose = false
# Clear existing rows
self.table.rows.clear()
# Add updated rows for all services
for service, counts in self.service_findings.items():
row = [service]
for status in self.status_columns:
count = counts.get(status, 0)
percentage = (
f"{(count / sum(counts.values()) * 100):.2f}%"
if sum(counts.values())
else "0%"
)
row.append(f"{count} ({percentage})")
self.table.add_row(*row)
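The cell-building arithmetic in `_update_table` is easy to get wrong: guarding on the truthiness of the counts dict rather than on its sum would divide by zero for a service whose counts all happen to be zero. A hedged sketch of the safe version (helper name assumed):

```python
def format_cell(count: int, total: int) -> str:
    # Guard on the total, not on the dict, to avoid ZeroDivisionError
    percentage = f"{(count / total * 100):.2f}%" if total else "0%"
    return f"{count} ({percentage})"

row = [format_cell(c, 10) for c in (7, 3, 0)]
print(row)  # → ['7 (70.00%)', '3 (30.00%)', '0 (0.00%)']
```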
def __rich__(self):
# This method allows the ResultsSection to be directly rendered by Rich
if not self.table.rows:
return Text("")
return Padding(Align.center(self.table), (0, 2))
# Create an instance of LiveDisplay to import elsewhere (ExecutionManager, the checks, the services)
live_display = LiveDisplay(vertical_overflow="visible")
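One pattern runs through the whole `LiveDisplay` class above: every public wrapper checks `self.enabled` before touching a named section, so callers elsewhere in prowler never need to know whether the display is on. Stripped of rich, the bookkeeping reduces to a sketch like this (class and method names assumed for illustration):

```python
class SectionRegistry:
    """Minimal stand-in for LiveDisplay's enabled/sections bookkeeping."""

    def __init__(self):
        self.sections = {}
        self.enabled = False

    def initialize(self, only_logs: bool):
        # Mirrors LiveDisplay.initialize: the display is off under --only-logs
        self.enabled = not only_logs

    def add_section(self, name: str, section: dict):
        if self.enabled:
            self.sections[name] = section

    def increment(self, name: str):
        # Every increment wrapper is a silent no-op when the display is disabled
        if self.enabled:
            self.sections[name]["progress"] += 1


registry = SectionRegistry()
registry.initialize(only_logs=True)
registry.add_section("overall_progress", {"progress": 0})
registry.increment("overall_progress")  # safe: display disabled, nothing happens
```

Centralizing the `enabled` check inside the wrappers keeps the check/service code free of display conditionals.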

View File

@@ -1,16 +0,0 @@
[styles]
info = yellow1
warning = dark_orange
fail = bold red
pass = bold green
banner_blue = dodger_blue3 bold
banner_color = bold green
orange_color = dark_orange
critical = bold bright_red
high = bold red
medium = bold dark_orange
low = bold yellow1
# style names must be lower case, start with a letter, and only contain letters or the characters ".", "-", "_".
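Despite the `.yaml` extension, the theme file above is INI syntax: rich's `Theme.from_file` reads the style definitions from a `[styles]` section with configparser. The same parse can be sketched with the standard library alone (the inline sample is an abbreviated copy of the file above):

```python
import configparser
from io import StringIO

THEME_FILE = """\
[styles]
info = yellow1
fail = bold red
pass = bold green
"""

parser = configparser.ConfigParser()
parser.read_file(StringIO(THEME_FILE))
styles = dict(parser["styles"])
print(styles["fail"])  # → bold red
```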

View File

@@ -11,7 +11,7 @@ from prowler.lib.check.check import list_modules, recover_checks_from_service
from prowler.lib.logger import logger
from prowler.lib.utils.utils import open_file, parse_json_file
from prowler.providers.aws.config import AWS_STS_GLOBAL_ENDPOINT_REGION
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info, AWSAssumeRole
from prowler.providers.aws.lib.audit_info.models import AWS_Assume_Role, AWS_Audit_Info
from prowler.providers.aws.lib.credentials.credentials import create_sts_session
@@ -109,7 +109,7 @@ class AWS_Provider:
def assume_role(
session: session.Session,
assumed_role_info: AWSAssumeRole,
assumed_role_info: AWS_Assume_Role,
sts_endpoint_region: str = None,
) -> dict:
try:
@@ -152,31 +152,23 @@ def input_role_mfa_token_and_code() -> tuple[str]:
def generate_regional_clients(
service: str,
audit_info: AWS_Audit_Info,
service: str, audit_info: AWS_Audit_Info, global_service: bool = False
) -> dict:
"""generate_regional_clients returns a dict with the following format for the given service:
Example:
{"eu-west-1": boto3_service_client}
"""
try:
regional_clients = {}
service_regions = get_available_aws_service_regions(service, audit_info)
# Get the regions enabled for the account and get the intersection with the service available regions
if audit_info.enabled_regions:
enabled_regions = service_regions.intersection(audit_info.enabled_regions)
else:
enabled_regions = service_regions
for region in enabled_regions:
# Check if it is global service to gather only one region
if global_service:
if service_regions:
if audit_info.profile_region in service_regions:
service_regions = [audit_info.profile_region]
service_regions = service_regions[:1]
for region in service_regions:
regional_client = audit_info.audit_session.client(
service, region_name=region, config=audit_info.session_config
)
regional_client.region = region
regional_clients[region] = regional_client
return regional_clients
except Exception as error:
logger.error(
@@ -184,22 +176,6 @@ def generate_regional_clients(
)
def get_aws_enabled_regions(audit_info: AWS_Audit_Info) -> set:
"""get_aws_enabled_regions returns a set of enabled AWS regions"""
# EC2 Client to check enabled regions
service = "ec2"
default_region = get_default_region(service, audit_info)
ec2_client = audit_info.audit_session.client(service, region_name=default_region)
enabled_regions = set()
# With AllRegions=False we only get the enabled regions for the account
for region in ec2_client.describe_regions(AllRegions=False).get("Regions", []):
enabled_regions.add(region.get("RegionName"))
return enabled_regions
def get_aws_available_regions():
try:
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
@@ -240,8 +216,6 @@ def get_checks_from_input_arn(audit_resources: list, provider: str) -> set:
service = "efs"
elif service == "logs":
service = "cloudwatch"
elif service == "cognito":
service = "cognito-idp"
# Check if Prowler has checks in service
try:
list_modules(provider, service)
@@ -293,18 +267,17 @@ def get_regions_from_audit_resources(audit_resources: list) -> set:
return audited_regions
def get_available_aws_service_regions(service: str, audit_info: AWS_Audit_Info) -> set:
def get_available_aws_service_regions(service: str, audit_info: AWS_Audit_Info) -> list:
# Get json locally
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
json_regions = set(
data["services"][service]["regions"][audit_info.audited_partition]
)
# Check for input aws audit_info.audited_regions
if audit_info.audited_regions:
# Get common regions between input and json
regions = json_regions.intersection(audit_info.audited_regions)
# Check if it is a subservice
json_regions = data["services"][service]["regions"][audit_info.audited_partition]
if audit_info.audited_regions: # Check for input aws audit_info.audited_regions
regions = list(
set(json_regions).intersection(audit_info.audited_regions)
) # Get common regions between input and json
else: # Get all regions from json of the service and partition
regions = json_regions
return regions
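Both sides of the hunk above compute the same thing: the regions where the service is available, optionally intersected with the user's region filter. The core step can be sketched as (function name assumed):

```python
def filter_service_regions(service_regions, audited_regions=None):
    """Keep only the regions both available to the service and requested
    by the user; with no filter, every service region is audited."""
    if audited_regions:
        return set(service_regions).intersection(audited_regions)
    return set(service_regions)


regions = filter_service_regions(
    ["eu-west-1", "us-east-1", "ap-south-1"],
    audited_regions=["eu-west-1", "eu-west-2"],
)
print(regions)  # → {'eu-west-1'}
```

Requesting a region the service does not support simply drops out of the intersection rather than raising.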

View File

@@ -1,539 +0,0 @@
import os
import pathlib
import sys
from argparse import Namespace
from typing import Any, Optional
from boto3 import client, session
from botocore.config import Config
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
from colorama import Fore, Style
from prowler.config.config import aws_services_json_file
from prowler.lib.check.check import list_modules, recover_checks_from_service
from prowler.lib.ui.live_display import live_display
from prowler.lib.logger import logger
from prowler.lib.utils.utils import open_file, parse_json_file
from prowler.providers.aws.config import (
AWS_STS_GLOBAL_ENDPOINT_REGION,
BOTO3_USER_AGENT_EXTRA,
)
from prowler.providers.aws.models import (
AWSOrganizationsInfo,
AWSCredentials,
AWSAssumeRole,
AWSAssumeRoleConfiguration,
AWSIdentityInfo,
AWSSession,
)
from prowler.providers.aws.lib.arn.arn import parse_iam_credentials_arn
from prowler.providers.aws.lib.credentials.credentials import (
create_sts_session,
validate_AWSCredentials,
)
from prowler.providers.aws.lib.organizations.organizations import (
get_organizations_metadata,
)
from prowler.providers.common.provider import Provider
class AwsProvider(Provider):
session: AWSSession = AWSSession(
session=None, session_config=None, original_session=None
)
identity: AWSIdentityInfo = AWSIdentityInfo(
account=None,
account_arn=None,
user_id=None,
partition=None,
identity_arn=None,
profile=None,
profile_region=None,
audited_regions=[],
)
assumed_role: AWSAssumeRoleConfiguration = AWSAssumeRoleConfiguration(
assumed_role_info=AWSAssumeRole(
role_arn=None,
session_duration=None,
external_id=None,
mfa_enabled=False,
),
assumed_role_credentials=AWSCredentials(
aws_access_key_id=None,
aws_session_token=None,
aws_secret_access_key=None,
expiration=None,
),
)
organizations_metadata: AWSOrganizationsInfo = AWSOrganizationsInfo(
account_details_email=None,
account_details_name=None,
account_details_arn=None,
account_details_org=None,
account_details_tags=None,
)
audit_resources: Optional[Any]
audit_metadata: Optional[Any]
audit_config: dict = {}
mfa_enabled: bool = False
ignore_unused_services: bool = False
def __init__(self, arguments: Namespace):
logger.info("Setting AWS provider ...")
# Parse input arguments
# Assume Role Options
input_role = getattr(arguments, "role", None)
input_session_duration = getattr(arguments, "session_duration", None)
input_external_id = getattr(arguments, "external_id", None)
# STS Endpoint Region
sts_endpoint_region = getattr(arguments, "sts_endpoint_region", None)
# MFA Configuration (false by default)
input_mfa = getattr(arguments, "mfa", None)
input_profile = getattr(arguments, "profile", None)
input_regions = getattr(arguments, "region", None)
organizations_role_arn = getattr(arguments, "organizations_role", None)
# Set the maximum retries for the standard retrier config
aws_retries_max_attempts = getattr(arguments, "aws_retries_max_attempts", None)
# Set if unused services must be ignored
ignore_unused_services = getattr(arguments, "ignore_unused_services", None)
# Build the session config, applying the retries configuration
self.session.session_config = self.__set_session_config__(
aws_retries_max_attempts
)
# Set ignore unused services
self.ignore_unused_services = ignore_unused_services
# Start populating AWS identity object
self.identity.profile = input_profile
self.identity.audited_regions = input_regions
# We need to create an original sessions using regular auth path (creds, profile, etc)
logger.info("Generating original session ...")
self.session.session = self.setup_session(input_mfa)
# After the session is created, validate it
logger.info("Validating credentials ...")
caller_identity = validate_AWSCredentials(
self.session.session, input_regions, sts_endpoint_region
)
logger.info("Credentials validated")
logger.info(f"Original caller identity UserId: {caller_identity['UserId']}")
logger.info(f"Original caller identity ARN: {caller_identity['Arn']}")
# Set values of AWS identity object
self.identity.account = caller_identity["Account"]
self.identity.identity_arn = caller_identity["Arn"]
self.identity.user_id = caller_identity["UserId"]
self.identity.partition = parse_iam_credentials_arn(
caller_identity["Arn"]
).partition
self.identity.account_arn = (
f"arn:{self.identity.partition}:iam::{self.identity.account}:root"
)
# save original session
self.session.original_session = self.session.session
# time for checking role assumption
if input_role:
# session will be the assumed one
self.session.session = self.setup_assumed_session(
input_role,
input_external_id,
input_mfa,
input_session_duration,
sts_endpoint_region,
)
logger.info("Audit session is the new session created assuming role")
# check if organizations info is going to be retrieved
if organizations_role_arn:
logger.info(
f"Getting organizations metadata for account {organizations_role_arn}"
)
# session will be the assumed one with organizations permissions
self.session.session = self.setup_assumed_session(
organizations_role_arn,
input_external_id,
input_mfa,
input_session_duration,
sts_endpoint_region,
)
self.organizations_metadata = get_organizations_metadata(
self.identity.account, self.assumed_role.assumed_role_credentials
)
logger.info("Organizations metadata retrieved")
if self.session.session.region_name:
self.identity.profile_region = self.session.session.region_name
else:
self.identity.profile_region = "us-east-1"
if not getattr(arguments, "only_logs", None):
self.print_credentials()
# Parse Scan Tags
if getattr(arguments, "resource_tags", None):
input_resource_tags = arguments.resource_tags
self.audit_resources = self.get_tagged_resources(input_resource_tags)
# Parse Input Resource ARNs
self.audit_resources = getattr(arguments, "resource_arn", None)
def setup_session(self, input_mfa: bool):
logger.info("Creating regular session ...")
# Input MFA only if a role is not going to be assumed
if input_mfa and not self.assumed_role.assumed_role_info.role_arn:
mfa_ARN, mfa_TOTP = self.__input_role_mfa_token_and_code__()
get_session_token_arguments = {
"SerialNumber": mfa_ARN,
"TokenCode": mfa_TOTP,
}
sts_client = client("sts")
session_credentials = sts_client.get_session_token(
**get_session_token_arguments
)
return session.Session(
aws_access_key_id=session_credentials["Credentials"]["AccessKeyId"],
aws_secret_access_key=session_credentials["Credentials"][
"SecretAccessKey"
],
aws_session_token=session_credentials["Credentials"]["SessionToken"],
profile_name=self.identity.profile,
)
else:
return session.Session(
profile_name=self.identity.profile,
)
def setup_assumed_session(
self,
input_role: str,
input_external_id: str,
input_mfa: str,
session_duration: int,
sts_endpoint_region: str,
):
logger.info("Creating assumed session ...")
# store information about the role that is going to be assumed
self.assumed_role.assumed_role_info.role_arn = input_role
self.assumed_role.assumed_role_info.session_duration = session_duration
self.assumed_role.assumed_role_info.external_id = input_external_id
self.assumed_role.assumed_role_info.mfa_enabled = input_mfa
# Check if role arn is valid
try:
# this returns the arn already parsed into a dict to be used when it is needed to access its fields
role_arn_parsed = parse_iam_credentials_arn(
self.assumed_role.assumed_role_info.role_arn
)
except Exception as error:
logger.critical(f"{error.__class__.__name__} -- {error}")
sys.exit(1)
else:
logger.info(f"Assuming role {self.assumed_role.assumed_role_info.role_arn}")
# Assume the role
assumed_role_response = self.__assume_role__(
self.session.session,
sts_endpoint_region,
)
logger.info("Role assumed")
# Set the info needed to create a session with an assumed role
self.assumed_role.assumed_role_credentials = AWSCredentials(
aws_access_key_id=assumed_role_response["Credentials"]["AccessKeyId"],
aws_session_token=assumed_role_response["Credentials"]["SessionToken"],
aws_secret_access_key=assumed_role_response["Credentials"][
"SecretAccessKey"
],
expiration=assumed_role_response["Credentials"]["Expiration"],
)
# Set identity parameters
self.identity.account = role_arn_parsed.account_id
self.identity.partition = role_arn_parsed.partition
self.identity.account_arn = (
f"arn:{self.identity.partition}:iam::{self.identity.account}:root"
)
# From botocore we can use RefreshableCredentials class, which has an attribute (refresh_using)
# that needs to be a method without arguments that retrieves a new set of fresh credentials
# assuming the role again. -> https://github.com/boto/botocore/blob/098cc255f81a25b852e1ecdeb7adebd94c7b1b73/botocore/credentials.py#L395
assumed_refreshable_credentials = RefreshableCredentials(
access_key=self.assumed_role.assumed_role_credentials.aws_access_key_id,
secret_key=self.assumed_role.assumed_role_credentials.aws_secret_access_key,
token=self.assumed_role.assumed_role_credentials.aws_session_token,
expiry_time=self.assumed_role.assumed_role_credentials.expiration,
refresh_using=self.refresh_credentials,
method="sts-assume-role",
)
# Here we need the botocore session since it needs to use refreshable credentials
assumed_botocore_session = get_session()
assumed_botocore_session._credentials = assumed_refreshable_credentials
assumed_botocore_session.set_config_variable(
"region", self.identity.profile_region
)
return session.Session(
profile_name=self.identity.profile,
botocore_session=assumed_botocore_session,
)
# Refresh credentials method using assume role
# This method is called "adding ()" to the name, so it cannot accept arguments
# https://github.com/boto/botocore/blob/098cc255f81a25b852e1ecdeb7adebd94c7b1b73/botocore/credentials.py#L570
def refresh_credentials(self):
logger.info("Refreshing assumed credentials...")
# Re-assume the role and return the mapping RefreshableCredentials expects
response = self.__assume_role__(self.session.session)
refreshed_credentials = dict(
access_key=response["Credentials"]["AccessKeyId"],
secret_key=response["Credentials"]["SecretAccessKey"],
token=response["Credentials"]["SessionToken"],
expiry_time=response["Credentials"]["Expiration"].isoformat(),
)
return refreshed_credentials
def print_credentials(self):
live_display.print_aws_credentials(self.identity, self.assumed_role.assumed_role_info)
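The `refresh_using` contract described in the comments above can be sketched without botocore: the callback takes no arguments and returns a mapping with `access_key`, `secret_key`, `token` and `expiry_time`, which the wrapper swaps in once the old set expires. The class and field names below are illustrative, not botocore's internals:

```python
from datetime import datetime, timedelta, timezone

class RefreshableSketch:
    """Toy stand-in for botocore's RefreshableCredentials."""

    def __init__(self, credentials: dict, refresh_using):
        self._credentials = credentials
        self._refresh_using = refresh_using  # zero-argument callable

    def get(self) -> dict:
        if datetime.now(timezone.utc) >= self._credentials["expiry_time"]:
            # Called with no arguments, exactly like refresh_credentials()
            self._credentials = self._refresh_using()
        return self._credentials


def fake_refresh() -> dict:
    return {
        "access_key": "AKIA-NEW",
        "secret_key": "secret",
        "token": "token",
        "expiry_time": datetime.now(timezone.utc) + timedelta(hours=1),
    }


expired = {
    "access_key": "AKIA-OLD",
    "secret_key": "s",
    "token": "t",
    "expiry_time": datetime.now(timezone.utc) - timedelta(minutes=1),
}
creds = RefreshableSketch(expired, fake_refresh).get()
print(creds["access_key"])  # → AKIA-NEW
```

This is why `refresh_credentials` cannot accept arguments: botocore invokes the bare callable whenever the stored expiry time has passed.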
def generate_regional_clients(
self, service: str, global_service: bool = False
) -> dict:
try:
regional_clients = {}
service_regions = self.get_available_aws_service_regions(service)
# Check if it is global service to gather only one region
if global_service:
if service_regions:
if self.identity.profile_region in service_regions:
service_regions = [self.identity.profile_region]
service_regions = service_regions[:1]
for region in service_regions:
regional_client = self.session.session.client(
service, region_name=region, config=self.session.session_config
)
regional_client.region = region
regional_clients[region] = regional_client
return regional_clients
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def get_available_aws_service_regions(self, service: str) -> list:
# Get json locally
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
# Check if it is a subservice
json_regions = data["services"][service]["regions"][self.identity.partition]
if (
self.identity.audited_regions
): # Check for input aws audit_info.audited_regions
regions = list(
set(json_regions).intersection(self.identity.audited_regions)
) # Get common regions between input and json
else: # Get all regions from json of the service and partition
regions = json_regions
return regions
def get_aws_available_regions():
try:
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
regions = set()
for service in data["services"].values():
for partition in service["regions"]:
for item in service["regions"][partition]:
regions.add(item)
return list(regions)
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")
return []
def get_checks_from_input_arn(audit_resources: list, provider: str) -> set:
"""get_checks_from_input_arn gets the list of checks from the input arns"""
checks_from_arn = set()
is_subservice_in_checks = False
# Handle if there are audit resources so only their services are executed
if audit_resources:
services_without_subservices = ["guardduty", "kms", "s3", "elb", "efs"]
service_list = set()
sub_service_list = set()
for resource in audit_resources:
service = resource.split(":")[2]
sub_service = resource.split(":")[5].split("/")[0].replace("-", "_")
# WAF Services does not have checks
if service != "wafv2" and service != "waf":
# Parse services when they are different in the ARNs
if service == "lambda":
service = "awslambda"
elif service == "elasticloadbalancing":
service = "elb"
elif service == "elasticfilesystem":
service = "efs"
elif service == "logs":
service = "cloudwatch"
# Check if Prowler has checks in service
try:
list_modules(provider, service)
except ModuleNotFoundError:
# Service is not supported
pass
else:
service_list.add(service)
# Get subservices to execute only applicable checks
if service not in services_without_subservices:
# Parse some specific subservices
if service == "ec2":
if sub_service == "security_group":
sub_service = "securitygroup"
if sub_service == "network_acl":
sub_service = "networkacl"
if sub_service == "image":
sub_service = "ami"
if service == "rds":
if sub_service == "cluster_snapshot":
sub_service = "snapshot"
sub_service_list.add(sub_service)
else:
sub_service_list.add(service)
checks = recover_checks_from_service(service_list, provider)
# Filter only checks with audited subservices
for check in checks:
if any(sub_service in check for sub_service in sub_service_list):
if not (sub_service == "policy" and "password_policy" in check):
checks_from_arn.add(check)
is_subservice_in_checks = True
if not is_subservice_in_checks:
checks_from_arn = checks
# Return final checks list
return sorted(checks_from_arn)
def get_regions_from_audit_resources(audit_resources: list) -> set:
    """get_regions_from_audit_resources gets the regions from the audit resources ARNs"""
audited_regions = set()
for resource in audit_resources:
region = resource.split(":")[3]
if region:
audited_regions.add(region)
return audited_regions
def get_tagged_resources(self, input_resource_tags: list):
"""
get_tagged_resources returns a list of the resources that are going to be scanned based on the given input tags
"""
try:
resource_tags = []
tagged_resources = []
for tag in input_resource_tags:
key = tag.split("=")[0]
value = tag.split("=")[1]
resource_tags.append({"Key": key, "Values": [value]})
# Get Resources with resource_tags for all regions
for regional_client in self.generate_regional_clients(
"resourcegroupstaggingapi"
).values():
try:
get_resources_paginator = regional_client.get_paginator(
"get_resources"
)
for page in get_resources_paginator.paginate(
TagFilters=resource_tags
):
for resource in page["ResourceTagMappingList"]:
tagged_resources.append(resource["ResourceARN"])
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
sys.exit(1)
else:
return tagged_resources
def get_default_region(self, service: str) -> str:
"""get_default_region gets the default region based on the profile and audited service regions"""
service_regions = self.get_available_aws_service_regions(service)
default_region = (
self.get_global_region()
) # global region of the partition when all regions are audited and there is no profile region
if self.identity.profile_region in service_regions:
# return profile region only if it is audited
default_region = self.identity.profile_region
# return first audited region if specific regions are audited
elif self.identity.audited_regions:
default_region = self.identity.audited_regions[0]
return default_region
def get_global_region(self) -> str:
"""get_global_region gets the global region based on the audited partition"""
global_region = "us-east-1"
if self.identity.partition == "aws-cn":
global_region = "cn-north-1"
elif self.identity.partition == "aws-us-gov":
global_region = "us-gov-east-1"
elif "aws-iso" in self.identity.partition:
global_region = "aws-iso-global"
return global_region
    def __input_role_mfa_token_and_code__() -> tuple[str, str]:
        """__input_role_mfa_token_and_code__ asks for the AWS MFA device ARN and TOTP code and returns them."""
mfa_ARN = input("Enter ARN of MFA: ")
mfa_TOTP = input("Enter MFA code: ")
return (mfa_ARN.strip(), mfa_TOTP.strip())
    def __set_session_config__(self, aws_retries_max_attempts: int):
session_config = Config(
retries={"max_attempts": 3, "mode": "standard"},
user_agent_extra=BOTO3_USER_AGENT_EXTRA,
)
if aws_retries_max_attempts:
# Create the new config
config = Config(
retries={
"max_attempts": aws_retries_max_attempts,
"mode": "standard",
},
)
# Merge the new configuration
session_config = self.session.session_config.merge(config)
return session_config
def __assume_role__(
self,
session,
sts_endpoint_region: str,
) -> dict:
try:
assume_role_arguments = {
"RoleArn": self.assumed_role.assumed_role_info.role_arn,
"RoleSessionName": "ProwlerAsessmentSession",
"DurationSeconds": self.assumed_role.assumed_role_info.session_duration,
}
# Set the info to assume the role from the partition, account and role name
if self.assumed_role.assumed_role_info.external_id:
assume_role_arguments[
"ExternalId"
] = self.assumed_role.assumed_role_info.external_id
if self.assumed_role.assumed_role_info.mfa_enabled:
mfa_ARN, mfa_TOTP = self.__input_role_mfa_token_and_code__()
assume_role_arguments["SerialNumber"] = mfa_ARN
assume_role_arguments["TokenCode"] = mfa_TOTP
# Set the STS Endpoint Region
if sts_endpoint_region is None:
sts_endpoint_region = AWS_STS_GLOBAL_ENDPOINT_REGION
sts_client = create_sts_session(session, sts_endpoint_region)
assumed_credentials = sts_client.assume_role(**assume_role_arguments)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
else:
return assumed_credentials
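
The ARN handling in `get_checks_from_input_arn` above can be exercised in isolation. The helper below is hypothetical (it is not part of the codebase) and only mirrors the service/sub-service extraction and the service renaming shown in the file:

```python
def parse_service_and_subservice(resource_arn: str) -> tuple:
    # The service is the third colon-separated ARN field; the sub-service is
    # the first path segment of the resource part, with dashes normalized
    service = resource_arn.split(":")[2]
    sub_service = resource_arn.split(":")[5].split("/")[0].replace("-", "_")
    # Map ARN service names to the Prowler service module names used above
    renames = {
        "lambda": "awslambda",
        "elasticloadbalancing": "elb",
        "elasticfilesystem": "efs",
        "logs": "cloudwatch",
    }
    return renames.get(service, service), sub_service


print(parse_service_and_subservice(
    "arn:aws:elasticloadbalancing:eu-west-1:123456789012:loadbalancer/app/my-alb/abc"
))  # → ('elb', 'loadbalancer')
```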

View File

@@ -498,6 +498,17 @@
]
}
},
"appfabric": {
"regions": {
"aws": [
"ap-northeast-1",
"eu-west-1",
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"appflow": {
"regions": {
"aws": [
@@ -797,10 +808,7 @@
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
"aws-us-gov": []
}
},
"artifact": {
@@ -1061,17 +1069,6 @@
]
}
},
"b2bi": {
"regions": {
"aws": [
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"backup": {
"regions": {
"aws": [
@@ -1492,7 +1489,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
@@ -1719,7 +1715,6 @@
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
@@ -2092,18 +2087,14 @@
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -2196,49 +2187,15 @@
"aws-us-gov": []
}
},
"cognito": {
"regions": {
"aws": [
"af-south-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"cognito-identity": {
"regions": {
"aws": [
"af-south-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -2265,14 +2222,12 @@
"cognito-idp": {
"regions": {
"aws": [
"af-south-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -2361,22 +2316,15 @@
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
@@ -2562,15 +2510,6 @@
]
}
},
"cost-optimization-hub": {
"regions": {
"aws": [
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"costexplorer": {
"regions": {
"aws": [
@@ -2934,7 +2873,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -3021,7 +2959,6 @@
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
@@ -3059,10 +2996,7 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
"aws-us-gov": []
}
},
"ds": {
@@ -3453,42 +3387,6 @@
]
}
},
"eks-auth": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"elastic-inference": {
"regions": {
"aws": [
@@ -3735,7 +3633,6 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -3743,7 +3640,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
@@ -3764,23 +3660,18 @@
"emr-serverless": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
@@ -4798,7 +4689,6 @@
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -4808,7 +4698,6 @@
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
@@ -4907,40 +4796,6 @@
]
}
},
"inspector-scan": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
},
"inspector2": {
"regions": {
"aws": [
@@ -5758,44 +5613,6 @@
]
}
},
"launch-wizard": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
},
"launchwizard": {
"regions": {
"aws": [
@@ -5909,7 +5726,6 @@
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -5993,7 +5809,6 @@
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -6405,17 +6220,14 @@
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -6846,7 +6658,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -6916,7 +6727,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -7293,11 +7103,8 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-west-1",
"eu-west-2",
@@ -7567,10 +7374,7 @@
"us-west-1",
"us-west-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
@@ -7995,20 +7799,6 @@
]
}
},
"redshift-serverless": {
"regions": {
"aws": [
"ap-south-1",
"ca-central-1",
"eu-west-3",
"us-west-1"
],
"aws-cn": [
"cn-north-1"
],
"aws-us-gov": []
}
},
"rekognition": {
"regions": {
"aws": [
@@ -8033,16 +7823,6 @@
]
}
},
"repostspace": {
"regions": {
"aws": [
"eu-central-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"resiliencehub": {
"regions": {
"aws": [
@@ -8347,10 +8127,7 @@
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
"aws-us-gov": []
}
},
"route53-recovery-readiness": {
@@ -9083,7 +8860,6 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"il-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
@@ -9914,21 +9690,6 @@
]
}
},
"thinclient": {
"regions": {
"aws": [
"ap-south-1",
"ca-central-1",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"us-east-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"timestream": {
"regions": {
"aws": [
@@ -9966,14 +9727,10 @@
"tnb": {
"regions": {
"aws": [
"ap-northeast-2",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-2",
"eu-west-3",
"sa-east-1",
"us-east-1",
"us-west-2"
],
@@ -10647,7 +10404,6 @@
"eu-central-1",
"eu-west-1",
"eu-west-2",
"il-central-1",
"sa-east-1",
"us-east-1",
"us-west-2"

View File

@@ -9,7 +9,7 @@ from schema import Optional, Schema
from prowler.lib.logger import logger
from prowler.lib.outputs.models import unroll_tags
mutelist_schema = Schema(
allowlist_schema = Schema(
{
"Accounts": {
str: {
@@ -32,38 +32,38 @@ mutelist_schema = Schema(
)
def parse_mutelist_file(audit_info, mutelist_file):
def parse_allowlist_file(audit_info, allowlist_file):
try:
# Check if file is a S3 URI
if re.search("^s3://([^/]+)/(.*?([^/]+))$", mutelist_file):
bucket = mutelist_file.split("/")[2]
key = ("/").join(mutelist_file.split("/")[3:])
if re.search("^s3://([^/]+)/(.*?([^/]+))$", allowlist_file):
bucket = allowlist_file.split("/")[2]
key = ("/").join(allowlist_file.split("/")[3:])
s3_client = audit_info.audit_session.client("s3")
mutelist = yaml.safe_load(
allowlist = yaml.safe_load(
s3_client.get_object(Bucket=bucket, Key=key)["Body"]
)["Mute List"]
)["Allowlist"]
# Check if file is a Lambda Function ARN
elif re.search(r"^arn:(\w+):lambda:", mutelist_file):
lambda_region = mutelist_file.split(":")[3]
elif re.search(r"^arn:(\w+):lambda:", allowlist_file):
lambda_region = allowlist_file.split(":")[3]
lambda_client = audit_info.audit_session.client(
"lambda", region_name=lambda_region
)
lambda_response = lambda_client.invoke(
FunctionName=mutelist_file, InvocationType="RequestResponse"
FunctionName=allowlist_file, InvocationType="RequestResponse"
)
lambda_payload = lambda_response["Payload"].read()
mutelist = yaml.safe_load(lambda_payload)["Mute List"]
allowlist = yaml.safe_load(lambda_payload)["Allowlist"]
# Check if file is a DynamoDB ARN
elif re.search(
r"^arn:aws(-cn|-us-gov)?:dynamodb:[a-z]{2}-[a-z-]+-[1-9]{1}:[0-9]{12}:table\/[a-zA-Z0-9._-]+$",
mutelist_file,
allowlist_file,
):
mutelist = {"Accounts": {}}
table_region = mutelist_file.split(":")[3]
allowlist = {"Accounts": {}}
table_region = allowlist_file.split(":")[3]
dynamodb_resource = audit_info.audit_session.resource(
"dynamodb", region_name=table_region
)
dynamo_table = dynamodb_resource.Table(mutelist_file.split("/")[1])
dynamo_table = dynamodb_resource.Table(allowlist_file.split("/")[1])
response = dynamo_table.scan(
FilterExpression=Attr("Accounts").is_in(
[audit_info.audited_account, "*"]
@@ -80,8 +80,8 @@ def parse_mutelist_file(audit_info, mutelist_file):
)
dynamodb_items.update(response["Items"])
for item in dynamodb_items:
# Create mutelist for every item
mutelist["Accounts"][item["Accounts"]] = {
# Create allowlist for every item
allowlist["Accounts"][item["Accounts"]] = {
"Checks": {
item["Checks"]: {
"Regions": item["Regions"],
@@ -90,24 +90,24 @@ def parse_mutelist_file(audit_info, mutelist_file):
}
}
if "Tags" in item:
mutelist["Accounts"][item["Accounts"]]["Checks"][item["Checks"]][
allowlist["Accounts"][item["Accounts"]]["Checks"][item["Checks"]][
"Tags"
] = item["Tags"]
if "Exceptions" in item:
mutelist["Accounts"][item["Accounts"]]["Checks"][item["Checks"]][
allowlist["Accounts"][item["Accounts"]]["Checks"][item["Checks"]][
"Exceptions"
] = item["Exceptions"]
else:
with open(mutelist_file) as f:
mutelist = yaml.safe_load(f)["Mute List"]
with open(allowlist_file) as f:
allowlist = yaml.safe_load(f)["Allowlist"]
try:
mutelist_schema.validate(mutelist)
allowlist_schema.validate(allowlist)
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- Mute List YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
f"{error.__class__.__name__} -- Allowlist YAML is malformed - {error}[{error.__traceback__.tb_lineno}]"
)
sys.exit(1)
return mutelist
return allowlist
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
@@ -115,27 +115,27 @@ def parse_mutelist_file(audit_info, mutelist_file):
sys.exit(1)
def mutelist_findings(
mutelist: dict,
def allowlist_findings(
allowlist: dict,
audited_account: str,
check_findings: [Any],
):
# Check if finding is muted
# Check if finding is allowlisted
for finding in check_findings:
if is_muted(
mutelist,
if is_allowlisted(
allowlist,
audited_account,
finding.check_metadata.CheckID,
finding.region,
finding.resource_id,
unroll_tags(finding.resource_tags),
):
finding.status = "MUTED"
finding.status = "WARNING"
return check_findings
def is_muted(
mutelist: dict,
def is_allowlisted(
allowlist: dict,
audited_account: str,
check: str,
finding_region: str,
@@ -143,30 +143,31 @@ def is_muted(
finding_tags,
):
try:
muted_checks = {}
# By default is not muted
is_finding_muted = False
# First set account key from mutelist dict
if audited_account in mutelist["Accounts"]:
muted_checks = mutelist["Accounts"][audited_account]["Checks"]
allowlisted_checks = {}
# By default is not allowlisted
is_finding_allowlisted = False
# First set account key from allowlist dict
if audited_account in allowlist["Accounts"]:
allowlisted_checks = allowlist["Accounts"][audited_account]["Checks"]
# If there is a *, it affects to all accounts
# This cannot be elif since in the case of * and single accounts we
# want to merge muted checks from * to the other accounts check list
if "*" in mutelist["Accounts"]:
checks_multi_account = mutelist["Accounts"]["*"]["Checks"]
muted_checks.update(checks_multi_account)
# Test if it is muted
if is_muted_in_check(
muted_checks,
# want to merge allowlisted checks from * to the other accounts check list
if "*" in allowlist["Accounts"]:
checks_multi_account = allowlist["Accounts"]["*"]["Checks"]
allowlisted_checks.update(checks_multi_account)
# Test if it is allowlisted
if is_allowlisted_in_check(
allowlisted_checks,
audited_account,
check,
finding_region,
finding_resource,
finding_tags,
):
is_finding_muted = True
is_finding_allowlisted = True
return is_finding_muted
return is_finding_allowlisted
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
@@ -174,8 +175,8 @@ def is_muted(
sys.exit(1)
def is_muted_in_check(
muted_checks,
def is_allowlisted_in_check(
allowlisted_checks,
audited_account,
check,
finding_region,
@@ -183,15 +184,15 @@ def is_muted_in_check(
finding_tags,
):
try:
# Default value is not muted
is_check_muted = False
# Default value is not allowlisted
is_check_allowlisted = False
for muted_check, muted_check_info in muted_checks.items():
for allowlisted_check, allowlisted_check_info in allowlisted_checks.items():
# map lambda to awslambda
muted_check = re.sub("^lambda", "awslambda", muted_check)
allowlisted_check = re.sub("^lambda", "awslambda", allowlisted_check)
# Check if the finding is excepted
exceptions = muted_check_info.get("Exceptions")
exceptions = allowlisted_check_info.get("Exceptions")
if is_excepted(
exceptions,
audited_account,
@@ -202,36 +203,40 @@ def is_muted_in_check(
# Break loop and return default value since is excepted
break
muted_regions = muted_check_info.get("Regions")
muted_resources = muted_check_info.get("Resources")
muted_tags = muted_check_info.get("Tags")
allowlisted_regions = allowlisted_check_info.get("Regions")
allowlisted_resources = allowlisted_check_info.get("Resources")
allowlisted_tags = allowlisted_check_info.get("Tags")
# If there is a *, it affects to all checks
if (
"*" == muted_check
or check == muted_check
or re.search(muted_check, check)
"*" == allowlisted_check
or check == allowlisted_check
or re.search(allowlisted_check, check)
):
muted_in_check = True
muted_in_region = is_muted_in_region(muted_regions, finding_region)
muted_in_resource = is_muted_in_resource(
muted_resources, finding_resource
allowlisted_in_check = True
allowlisted_in_region = is_allowlisted_in_region(
allowlisted_regions, finding_region
)
allowlisted_in_resource = is_allowlisted_in_resource(
allowlisted_resources, finding_resource
)
allowlisted_in_tags = is_allowlisted_in_tags(
allowlisted_tags, finding_tags
)
muted_in_tags = is_muted_in_tags(muted_tags, finding_tags)
# For a finding to be muted requires the following set to True:
# - muted_in_check -> True
# - muted_in_region -> True
# - muted_in_tags -> True or muted_in_resource -> True
# For a finding to be allowlisted requires the following set to True:
# - allowlisted_in_check -> True
# - allowlisted_in_region -> True
# - allowlisted_in_tags -> True or allowlisted_in_resource -> True
# - excepted -> False
if (
muted_in_check
and muted_in_region
and (muted_in_tags or muted_in_resource)
allowlisted_in_check
and allowlisted_in_region
and (allowlisted_in_tags or allowlisted_in_resource)
):
is_check_muted = True
is_check_allowlisted = True
return is_check_muted
return is_check_allowlisted
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
@@ -239,12 +244,12 @@ def is_muted_in_check(
sys.exit(1)
def is_muted_in_region(
mutelist_regions,
def is_allowlisted_in_region(
allowlisted_regions,
finding_region,
):
try:
return __is_item_matched__(mutelist_regions, finding_region)
return __is_item_matched__(allowlisted_regions, finding_region)
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
@@ -252,9 +257,9 @@ def is_muted_in_region(
sys.exit(1)
def is_muted_in_tags(muted_tags, finding_tags):
def is_allowlisted_in_tags(allowlisted_tags, finding_tags):
try:
return __is_item_matched__(muted_tags, finding_tags)
return __is_item_matched__(allowlisted_tags, finding_tags)
except Exception as error:
logger.critical(
f"{error.__class__.__name__} -- {error}[{error.__traceback__.tb_lineno}]"
@@ -262,9 +267,9 @@ def is_muted_in_tags(muted_tags, finding_tags):
sys.exit(1)
def is_muted_in_resource(muted_resources, finding_resource):
def is_allowlisted_in_resource(allowlisted_resources, finding_resource):
try:
return __is_item_matched__(muted_resources, finding_resource)
return __is_item_matched__(allowlisted_resources, finding_resource)
except Exception as error:
logger.critical(

View File

@@ -1,5 +1,4 @@
from argparse import ArgumentTypeError, Namespace
from re import search
from prowler.providers.aws.aws_provider import get_aws_available_regions
from prowler.providers.aws.lib.arn.arn import arn_type
@@ -27,6 +26,12 @@ def init_parser(self):
help="ARN of the role to be assumed",
# Pending ARN validation
)
aws_auth_subparser.add_argument(
"--sts-endpoint-region",
nargs="?",
default=None,
help="Specify the AWS STS endpoint region to use. Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp_enable-regions.html",
)
aws_auth_subparser.add_argument(
"--mfa",
action="store_true",
@@ -79,11 +84,6 @@ def init_parser(self):
action="store_true",
help="Skip updating previous findings of Prowler in Security Hub",
)
aws_security_hub_subparser.add_argument(
"--send-sh-only-fails",
action="store_true",
help="Send only Prowler failed findings to SecurityHub",
)
# AWS Quick Inventory
aws_quick_inventory_subparser = aws_parser.add_argument_group("Quick Inventory")
aws_quick_inventory_subparser.add_argument(
@@ -99,7 +99,6 @@ def init_parser(self):
"-B",
"--output-bucket",
nargs="?",
type=validate_bucket,
default=None,
help="Custom output bucket, requires -M <mode> and it can work also with -o flag.",
)
@@ -107,7 +106,6 @@ def init_parser(self):
"-D",
"--output-bucket-no-assume",
nargs="?",
type=validate_bucket,
default=None,
help="Same as -B but do not use the assumed role credentials to put objects to the bucket, instead uses the initial credentials.",
)
@@ -119,16 +117,15 @@ def init_parser(self):
default=None,
help="Shodan API key used by check ec2_elastic_ip_shodan.",
)
# Mute List
mutelist_subparser = aws_parser.add_argument_group("Mute List")
mutelist_subparser.add_argument(
# Allowlist
allowlist_subparser = aws_parser.add_argument_group("Allowlist")
allowlist_subparser.add_argument(
"-w",
"--mutelist-file",
"--allowlist-file",
nargs="?",
default=None,
help="Path for mutelist yaml file. See example prowler/config/aws_mutelist.yaml for reference and format. It also accepts AWS DynamoDB Table or Lambda ARNs or S3 URIs, see more in https://docs.prowler.cloud/en/latest/tutorials/mutelist/",
help="Path for allowlist yaml file. See example prowler/config/aws_allowlist.yaml for reference and format. It also accepts AWS DynamoDB Table or Lambda ARNs or S3 URIs, see more in https://docs.prowler.cloud/en/latest/tutorials/allowlist/",
)
# Based Scans
aws_based_scans_subparser = aws_parser.add_argument_group("AWS Based Scans")
aws_based_scans_parser = aws_based_scans_subparser.add_mutually_exclusive_group()
@@ -187,13 +184,3 @@ def validate_arguments(arguments: Namespace) -> tuple[bool, str]:
return (False, "To use -I/-T options -R option is needed")
return (True, "")
def validate_bucket(bucket_name):
"""validate_bucket validates that the input bucket_name is valid"""
if search("(?!(^xn--|.+-s3alias$))^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$", bucket_name):
return bucket_name
else:
raise ArgumentTypeError(
"Bucket name must be valid (https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html)"
)
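
The `validate_bucket` helper removed in this diff wrapped a single regex. A standalone sketch of that pattern (function name here is illustrative, not from the codebase), following the S3 naming rules linked in the original error message:

```python
import re

# Pattern from the removed validate_bucket: lowercase alphanumerics and
# hyphens, 3-63 chars, not starting with "xn--" or ending in "-s3alias"
BUCKET_RE = r"(?!(^xn--|.+-s3alias$))^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$"


def is_valid_bucket_name(name: str) -> bool:
    return re.search(BUCKET_RE, name) is not None


print(is_valid_bucket_name("my-prowler-output"))  # True
print(is_valid_bucket_name("Invalid_Bucket"))     # False
```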

View File

@@ -2,7 +2,7 @@ from boto3 import session
from botocore.config import Config
from prowler.providers.aws.config import BOTO3_USER_AGENT_EXTRA
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info, AWSAssumeRole
from prowler.providers.aws.lib.audit_info.models import AWS_Assume_Role, AWS_Audit_Info
# Default Current Audit Info
current_audit_info = AWS_Audit_Info(
@@ -25,7 +25,7 @@ current_audit_info = AWS_Audit_Info(
profile=None,
profile_region=None,
credentials=None,
assumed_role_info=AWSAssumeRole(
assumed_role_info=AWS_Assume_Role(
role_arn=None,
session_duration=None,
external_id=None,
@@ -38,5 +38,4 @@ current_audit_info = AWS_Audit_Info(
audit_metadata=None,
audit_config=None,
ignore_unused_services=False,
enabled_regions=set(),
)

View File

@@ -1,4 +1,4 @@
from dataclasses import dataclass, field
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Optional
@@ -7,7 +7,7 @@ from botocore.config import Config
@dataclass
class AWSCredentials:
class AWS_Credentials:
aws_access_key_id: str
aws_session_token: str
aws_secret_access_key: str
@@ -15,7 +15,7 @@ class AWSCredentials:
@dataclass
class AWSAssumeRole:
class AWS_Assume_Role:
role_arn: str
session_duration: int
external_id: str
@@ -23,7 +23,7 @@ class AWSAssumeRole:
@dataclass
class AWSOrganizationsInfo:
class AWS_Organizations_Info:
account_details_email: str
account_details_name: str
account_details_arn: str
@@ -44,13 +44,12 @@ class AWS_Audit_Info:
audited_partition: str
profile: str
profile_region: str
credentials: AWSCredentials
credentials: AWS_Credentials
mfa_enabled: bool
assumed_role_info: AWSAssumeRole
assumed_role_info: AWS_Assume_Role
audited_regions: list
audit_resources: list
organizations_metadata: AWSOrganizationsInfo
audit_metadata: Optional[Any]
organizations_metadata: AWS_Organizations_Info
audit_metadata: Optional[Any] = None
audit_config: Optional[dict] = None
ignore_unused_services: bool = False
enabled_regions: set = field(default_factory=set)

View File

@@ -8,12 +8,16 @@ from prowler.providers.aws.config import AWS_STS_GLOBAL_ENDPOINT_REGION
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
def validate_AWSCredentials(
def validate_aws_credentials(
session: session, input_regions: list, sts_endpoint_region: str = None
) -> dict:
try:
# For a valid STS GetCallerIdentity we have to use the right AWS Region
# Check if the --sts-endpoint-region is set
if sts_endpoint_region is not None:
aws_region = sts_endpoint_region
# If there is no region passed with -f/--region/--filter-region
if input_regions is None or len(input_regions) == 0:
elif input_regions is None or len(input_regions) == 0:
# If you have a region configured in your AWS config or credentials file
if session.region_name is not None:
aws_region = session.region_name
@@ -38,7 +42,7 @@ def validate_AWSCredentials(
return caller_identity
def print_AWSCredentials(audit_info: AWS_Audit_Info):
def print_aws_credentials(audit_info: AWS_Audit_Info):
# Beautify audited regions, set "all" if there is no filter region
regions = (
", ".join(audit_info.audited_regions)

View File

@@ -3,12 +3,12 @@ import sys
from boto3 import client
from prowler.lib.logger import logger
from prowler.providers.aws.lib.audit_info.models import AWSOrganizationsInfo
from prowler.providers.aws.lib.audit_info.models import AWS_Organizations_Info
def get_organizations_metadata(
metadata_account: str, assumed_credentials: dict
) -> AWSOrganizationsInfo:
) -> AWS_Organizations_Info:
try:
organizations_client = client(
"organizations",
@@ -30,7 +30,7 @@ def get_organizations_metadata(
account_details_tags = ""
for tag in list_tags_for_resource["Tags"]:
account_details_tags += tag["Key"] + ":" + tag["Value"] + ","
organizations_info = AWSOrganizationsInfo(
organizations_info = AWS_Organizations_Info(
account_details_email=organizations_metadata["Account"]["Email"],
account_details_name=organizations_metadata["Account"]["Name"],
account_details_arn=organizations_metadata["Account"]["Arn"],

View File

@@ -1,11 +1,8 @@
def is_condition_block_restrictive(
condition_statement: dict, source_account: str, is_cross_account_allowed=False
def is_account_only_allowed_in_condition(
condition_statement: dict, source_account: str
):
"""
is_condition_block_restrictive parses the IAM Condition policy block and, by default, returns True if the source_account passed as argument is within, False if not.
If argument is_cross_account_allowed is True it tests if the Condition block includes any of the operators mutelisted returning True if does, False if not.
is_account_only_allowed_in_condition parses the IAM Condition policy block and returns True if the source_account passed as argument is within, False if not.
@param condition_statement: dict with an IAM Condition block, e.g.:
{
@@ -57,19 +54,13 @@ def is_condition_block_restrictive(
condition_statement[condition_operator][value],
list,
):
# if there is an arn/account without the source account -> we do not consider it safe
# here by default we assume is true and look for false entries
is_condition_key_restrictive = True
# if cross account is not allowed check for each condition block looking for accounts
# different than default
if not is_cross_account_allowed:
# if there is an arn/account without the source account -> we do not consider it safe
# here by default we assume is true and look for false entries
for item in condition_statement[condition_operator][value]:
if source_account not in item:
is_condition_key_restrictive = False
break
if is_condition_key_restrictive:
is_condition_valid = True
for item in condition_statement[condition_operator][value]:
if source_account not in item:
is_condition_key_restrictive = False
break
if is_condition_key_restrictive:
is_condition_valid = True
@@ -79,13 +70,10 @@ def is_condition_block_restrictive(
condition_statement[condition_operator][value],
str,
):
if is_cross_account_allowed:
if (
source_account
in condition_statement[condition_operator][value]
):
is_condition_valid = True
else:
if (
source_account
in condition_statement[condition_operator][value]
):
is_condition_valid = True
return is_condition_valid
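The condition-parsing logic described in the docstring above can be sketched as a minimal, standalone reimplementation (hypothetical helper name, simplified to walk whatever operators and keys are present rather than handling every IAM operator):

```python
def condition_restricts_to_account(condition_block: dict, source_account: str) -> bool:
    """Return True only if every value in the Condition block references source_account.

    Simplified sketch: real IAM Condition blocks have many operators and keys;
    here we just walk whatever operators/keys are present.
    """
    if not condition_block:
        return False
    for operator_values in condition_block.values():
        for values in operator_values.values():
            # Normalize a single string value to a one-element list
            if isinstance(values, str):
                values = [values]
            for value in values:
                if source_account not in value:
                    # One foreign account/ARN makes the block non-restrictive
                    return False
    return True


block = {"StringEquals": {"aws:SourceAccount": ["123456789012"]}}
print(condition_restricts_to_account(block, "123456789012"))  # True
print(condition_restricts_to_account(block, "999999999999"))  # False
```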

View File

@@ -1,3 +1,5 @@
import sys
from prowler.config.config import (
csv_file_suffix,
html_file_suffix,
@@ -39,9 +41,10 @@ def send_to_s3_bucket(
s3_client.upload_file(file_name, output_bucket_name, object_name)
except Exception as error:
logger.error(
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}] -- {error}"
)
sys.exit(1)
def get_s3_object_path(output_directory: str) -> str:

View File

@@ -14,22 +14,20 @@ def prepare_security_hub_findings(
findings: [], audit_info: AWS_Audit_Info, output_options, enabled_regions: []
) -> dict:
security_hub_findings_per_region = {}
# Create a key per audited region
for region in enabled_regions:
# Create a key per region
for region in audit_info.audited_regions:
security_hub_findings_per_region[region] = []
for finding in findings:
# We don't send the MANUAL findings to AWS Security Hub
if finding.status == "MANUAL":
# We don't send the INFO findings to AWS Security Hub
if finding.status == "INFO":
continue
# We don't send findings to not enabled regions
if finding.region not in enabled_regions:
continue
# Handle status filters, if any
if not output_options.status or finding.status in output_options.status:
# Handle quiet mode
if output_options.is_quiet and finding.status != "FAIL":
continue
# Get the finding region
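The per-region grouping and status filtering that prepare_security_hub_findings performs can be sketched independently of the Prowler models (hypothetical function name and plain-dict findings, for illustration only):

```python
def bucket_findings_by_region(findings, enabled_regions, skipped_statuses=("MANUAL",)):
    """Group findings into a dict keyed by enabled region, dropping skipped
    statuses and findings from regions where the integration is not enabled."""
    per_region = {region: [] for region in enabled_regions}
    for finding in findings:
        if finding["status"] in skipped_statuses:
            continue
        if finding["region"] not in per_region:
            continue
        per_region[finding["region"]].append(finding)
    return per_region


findings = [
    {"region": "eu-west-1", "status": "FAIL"},
    {"region": "eu-west-1", "status": "MANUAL"},
    {"region": "us-east-1", "status": "PASS"},
]
print(bucket_findings_by_region(findings, ["eu-west-1"]))
# {'eu-west-1': [{'region': 'eu-west-1', 'status': 'FAIL'}]}
```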
@@ -49,10 +47,8 @@ def prepare_security_hub_findings(
def verify_security_hub_integration_enabled_per_region(
partition: str,
region: str,
session: session.Session,
aws_account_number: str,
) -> bool:
f"""verify_security_hub_integration_enabled returns True if the {SECURITY_HUB_INTEGRATION_NAME} is enabled for the given region. Otherwise returns false."""
prowler_integration_enabled = False
@@ -66,8 +62,7 @@ def verify_security_hub_integration_enabled_per_region(
security_hub_client.describe_hub()
# Check if Prowler integration is enabled in Security Hub
security_hub_prowler_integration_arn = f"arn:{partition}:securityhub:{region}:{aws_account_number}:product-subscription/{SECURITY_HUB_INTEGRATION_NAME}"
if security_hub_prowler_integration_arn not in str(
if "prowler/prowler" not in str(
security_hub_client.list_enabled_products_for_import()
):
logger.error(

View File

@@ -1,16 +1,10 @@
from concurrent.futures import ThreadPoolExecutor, as_completed
from functools import wraps
import threading
from prowler.lib.logger import logger
from prowler.lib.ui.live_display import live_display
from prowler.providers.aws.aws_provider import (
generate_regional_clients,
get_default_region,
)
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.aws.aws_provider_new import AwsProvider
MAX_WORKERS = 10
class AWSService:
@@ -18,22 +12,21 @@ class AWSService:
- AWS Regional Clients
- Shared information like the account ID and ARN, the AWS partition and the checks audited
- AWS Session
- Thread pool for the __threading_call__
- Also handles if the AWS Service is Global
"""
def __init__(self, service: str, provider: AwsProvider, global_service=False):
def __init__(self, service: str, audit_info: AWS_Audit_Info, global_service=False):
# Audit Information
self.provider = provider
self.audited_account = provider.identity.account
self.audited_account_arn = provider.identity.account_arn
self.audited_partition = provider.identity.partition
self.audit_resources = provider.audit_resources
self.audited_checks = provider.audit_metadata.expected_checks
self.audit_config = provider.audit_config
self.audit_info = audit_info
self.audited_account = audit_info.audited_account
self.audited_account_arn = audit_info.audited_account_arn
self.audited_partition = audit_info.audited_partition
self.audit_resources = audit_info.audit_resources
self.audited_checks = audit_info.audit_metadata.expected_checks
self.audit_config = audit_info.audit_config
# AWS Session
self.session = provider.session.session
self.session = audit_info.audit_session
# We receive the service using __class__.__name__ or the service name in lowercase
# e.g.: AccessAnalyzer --> we need a lowercase string, so service.lower()
@@ -41,105 +34,24 @@ class AWSService:
# Generate Regional Clients
if not global_service:
self.regional_clients = provider.generate_regional_clients(
self.service, global_service
self.regional_clients = generate_regional_clients(
self.service, audit_info, global_service
)
# Get a single region and client if the service needs it (e.g. AWS Global Service)
# We cannot include this within an else because some services need both the regional_clients
# and a single client like S3
self.region = provider.get_default_region(self.service)
self.region = get_default_region(self.service, audit_info)
self.client = self.session.client(self.service, self.region)
# Thread pool for __threading_call__
self.thread_pool = ThreadPoolExecutor(max_workers=MAX_WORKERS)
self.live_display_enabled = False
# Progress bar to add tasks to
service_init_section = live_display.get_client_init_section()
if service_init_section:
# Only Flags is not set to True
self.task_progress_bar = service_init_section.task_progress_bar
self.progress_tasks = []
# For use in other functions
self.live_display_enabled = True
def __get_session__(self):
return self.session
def __threading_call__(self, call, iterator=None, *args, **kwargs):
# Use the provided iterator, or default to self.regional_clients
items = iterator if iterator is not None else self.regional_clients.values()
# Determine the total count for logging
item_count = len(items)
# Trim leading and trailing underscores from the call's name
call_name = call.__name__.strip("_")
# Add Capitalization
call_name = " ".join([x.capitalize() for x in call_name.split("_")])
# Print a message based on the call's name, and if its regional or processing a list of items
if iterator is None:
logger.info(
f"{self.service.upper()} - Starting threads for '{call_name}' function across {item_count} regions..."
)
else:
logger.info(
f"{self.service.upper()} - Starting threads for '{call_name}' function to process {item_count} items..."
)
if self.live_display_enabled:
# Setup the progress bar
task_id = self.task_progress_bar.add_task(
f"- {call_name}...", total=item_count, task_type="Service"
)
self.progress_tasks.append(task_id)
# Submit tasks to the thread pool
futures = [
self.thread_pool.submit(call, item, *args, **kwargs) for item in items
]
# Wait for all tasks to complete
for future in as_completed(futures):
try:
future.result() # Raises exceptions from the thread, if any
if self.live_display_enabled:
# Update the progress bar
self.task_progress_bar.update(task_id, advance=1)
except Exception:
# Handle exceptions if necessary
pass # Replace 'pass' with any additional exception handling logic. Currently handled within the called function
# Make the task disappear once completed
# self.progress.remove_task(task_id)
@staticmethod
def progress_decorator(func):
"""
Decorator to update the progress bar before and after a function call.
To be used for methods within global services, which do not make use of the __threading_call__ function
"""
@wraps(func)
def wrapper(self, *args, **kwargs):
# Trim leading and trailing underscores from the call's name
func_name = func.__name__.strip("_")
# Add Capitalization
func_name = " ".join([x.capitalize() for x in func_name.split("_")])
if self.live_display_enabled:
task_id = self.task_progress_bar.add_task(
f"- {func_name}...", total=1, task_type="Service"
)
self.progress_tasks.append(task_id)
result = func(self, *args, **kwargs) # Execute the function
if self.live_display_enabled:
self.task_progress_bar.update(task_id, advance=1)
# self.task_progress_bar.remove_task(task_id) # Uncomment if you want to remove the task on completion
return result
return wrapper
def __threading_call__(self, call):
threads = []
for regional_client in self.regional_clients.values():
threads.append(threading.Thread(target=call, args=(regional_client,)))
for t in threads:
t.start()
for t in threads:
t.join()
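The refactor above replaces one threading.Thread per regional client with a shared, bounded ThreadPoolExecutor. The pool-based pattern can be sketched in isolation (hypothetical helper and worker, not the Prowler class itself):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

MAX_WORKERS = 10


def threading_call(call, items, max_workers=MAX_WORKERS):
    """Run call(item) for every item on a bounded thread pool and
    surface any worker exception to the caller."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(call, item) for item in items]
        for future in as_completed(futures):
            future.result()  # re-raises exceptions raised inside the worker


results = []
threading_call(lambda region: results.append(region.upper()),
               ["eu-west-1", "us-east-1"])
print(sorted(results))  # ['EU-WEST-1', 'US-EAST-1']
```

Unlike the thread-per-region version, the pool caps concurrency at max_workers and `future.result()` propagates worker exceptions instead of losing them.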

View File

@@ -1,54 +0,0 @@
from dataclasses import dataclass
from datetime import datetime
from boto3 import session
from botocore.config import Config
@dataclass
class AWSOrganizationsInfo:
account_details_email: str
account_details_name: str
account_details_arn: str
account_details_org: str
account_details_tags: str
@dataclass
class AWSCredentials:
aws_access_key_id: str
aws_session_token: str
aws_secret_access_key: str
expiration: datetime
@dataclass
class AWSAssumeRole:
role_arn: str
session_duration: int
external_id: str
mfa_enabled: bool
@dataclass
class AWSAssumeRoleConfiguration:
assumed_role_info: AWSAssumeRole
assumed_role_credentials: AWSCredentials
@dataclass
class AWSIdentityInfo:
account: str
account_arn: str
user_id: str
partition: str
identity_arn: str
profile: str
profile_region: str
audited_regions: list
@dataclass
class AWSSession:
session: session.Session
session_config: Config
original_session: None

View File

@@ -1,6 +1,6 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.accessanalyzer.accessanalyzer_service import (
AccessAnalyzer,
)
from prowler.providers.common.common import get_global_provider
accessanalyzer_client = AccessAnalyzer(get_global_provider())
accessanalyzer_client = AccessAnalyzer(current_audit_info)

View File

@@ -19,23 +19,17 @@ class accessanalyzer_enabled(Check):
f"IAM Access Analyzer {analyzer.name} is enabled."
)
else:
if analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
report.status_extended = f"IAM Access Analyzer in account {analyzer.name} is not enabled."
elif analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
report.status_extended = (
f"IAM Access Analyzer in account {analyzer.name} is not enabled."
)
else:
report.status = "FAIL"
report.status_extended = (
f"IAM Access Analyzer {analyzer.name} is not active."
)
if (
accessanalyzer_client.audit_config.get(
"mute_non_default_regions", False
)
and not analyzer.region == accessanalyzer_client.region
):
report.status = "MUTED"
else:
report.status = "FAIL"
report.status_extended = (
f"IAM Access Analyzer {analyzer.name} is not active."
)
findings.append(report)

View File

@@ -10,9 +10,9 @@ from prowler.providers.aws.lib.service.service import AWSService
################## AccessAnalyzer
class AccessAnalyzer(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.analyzers = []
self.__threading_call__(self.__list_analyzers__)
self.__list_findings__()
@@ -85,36 +85,21 @@ class AccessAnalyzer(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
# TODO: We need to include ListFindingsV2
# https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/accessanalyzer/client/list_findings_v2.html
def __list_findings__(self):
logger.info("AccessAnalyzer - Listing Findings per Analyzer...")
try:
for analyzer in self.analyzers:
try:
if analyzer.status == "ACTIVE":
regional_client = self.regional_clients[analyzer.region]
list_findings_paginator = regional_client.get_paginator(
"list_findings"
)
for page in list_findings_paginator.paginate(
analyzerArn=analyzer.arn
):
for finding in page["findings"]:
analyzer.findings.append(Finding(id=finding["id"]))
except ClientError as error:
if error.response["Error"]["Code"] == "ValidationException":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
if analyzer.status == "ACTIVE":
regional_client = self.regional_clients[analyzer.region]
list_findings_paginator = regional_client.get_paginator(
"list_findings"
)
for page in list_findings_paginator.paginate(
analyzerArn=analyzer.arn
):
for finding in page["findings"]:
analyzer.findings.append(Finding(id=finding["id"]))
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"

View File

@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.account.account_service import Account
from prowler.providers.common.common import get_global_provider
account_client = Account(get_global_provider())
account_client = Account(current_audit_info)

View File

@@ -10,6 +10,6 @@ class account_maintain_current_contact_details(Check):
report.region = account_client.region
report.resource_id = account_client.audited_account
report.resource_arn = account_client.audited_account_arn
report.status = "MANUAL"
report.status_extended = "Login to the AWS Console. Choose your account name on the top right of the window -> My Account -> Contact Information."
report.status = "INFO"
report.status_extended = "Manual check: Login to the AWS Console. Choose your account name on the top right of the window -> My Account -> Contact Information."
return [report]

View File

@@ -10,6 +10,6 @@ class account_security_contact_information_is_registered(Check):
report.region = account_client.region
report.resource_id = account_client.audited_account
report.resource_arn = account_client.audited_account_arn
report.status = "MANUAL"
report.status_extended = "Login to the AWS Console. Choose your account name on the top right of the window -> My Account -> Alternate Contacts -> Security Section."
report.status = "INFO"
report.status_extended = "Manual check: Login to the AWS Console. Choose your account name on the top right of the window -> My Account -> Alternate Contacts -> Security Section."
return [report]

View File

@@ -10,6 +10,6 @@ class account_security_questions_are_registered_in_the_aws_account(Check):
report.region = account_client.region
report.resource_id = account_client.audited_account
report.resource_arn = account_client.audited_account_arn
report.status = "MANUAL"
report.status_extended = "Login to the AWS Console as root. Choose your account name on the top right of the window -> My Account -> Configure Security Challenge Questions."
report.status = "INFO"
report.status_extended = "Manual check: Login to the AWS Console as root. Choose your account name on the top right of the window -> My Account -> Configure Security Challenge Questions."
return [report]

View File

@@ -9,9 +9,9 @@ from prowler.providers.aws.lib.service.service import AWSService
class Account(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.number_of_contacts = 4
self.contact_base = self.__get_contact_information__()
self.contacts_billing = self.__get_alternate_contact__("BILLING")

View File

@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.acm.acm_service import ACM
from prowler.providers.common.common import get_global_provider
acm_client = ACM(get_global_provider())
acm_client = ACM(current_audit_info)

View File

@@ -10,13 +10,13 @@ from prowler.providers.aws.lib.service.service import AWSService
################## ACM
class ACM(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.certificates = []
self.__threading_call__(self.__list_certificates__)
self.__threading_call__(self.__describe_certificates__, self.certificates)
self.__threading_call__(self.__list_tags_for_certificate__, self.certificates)
self.__describe_certificates__()
self.__list_tags_for_certificate__()
def __list_certificates__(self, regional_client):
logger.info("ACM - Listing Certificates...")
@@ -59,29 +59,33 @@ class ACM(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __describe_certificates__(self, certificate):
def __describe_certificates__(self):
logger.info("ACM - Describing Certificates...")
try:
regional_client = self.regional_clients[certificate.region]
response = regional_client.describe_certificate(
CertificateArn=certificate.arn
)["Certificate"]
if (
response["Options"]["CertificateTransparencyLoggingPreference"]
== "ENABLED"
):
certificate.transparency_logging = True
for certificate in self.certificates:
regional_client = self.regional_clients[certificate.region]
response = regional_client.describe_certificate(
CertificateArn=certificate.arn
)["Certificate"]
if (
response["Options"]["CertificateTransparencyLoggingPreference"]
== "ENABLED"
):
certificate.transparency_logging = True
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_tags_for_certificate__(self, certificate):
def __list_tags_for_certificate__(self):
logger.info("ACM - List Tags...")
try:
regional_client = self.regional_clients[certificate.region]
response = regional_client.list_tags_for_certificate(
CertificateArn=certificate.arn
)["Tags"]
certificate.tags = response
for certificate in self.certificates:
regional_client = self.regional_clients[certificate.region]
response = regional_client.list_tags_for_certificate(
CertificateArn=certificate.arn
)["Tags"]
certificate.tags = response
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"

View File

@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.apigateway.apigateway_service import APIGateway
from prowler.providers.common.common import get_global_provider
apigateway_client = APIGateway(get_global_provider())
apigateway_client = APIGateway(current_audit_info)

View File

@@ -1,7 +1,7 @@
{
"Provider": "aws",
"CheckID": "apigateway_restapi_authorizers_enabled",
"CheckTitle": "Check if API Gateway has configured authorizers at api or method level.",
"CheckTitle": "Check if API Gateway has configured authorizers.",
"CheckAliases": [
"apigateway_authorizers_enabled"
],
@@ -13,7 +13,7 @@
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "medium",
"ResourceType": "AwsApiGatewayRestApi",
"Description": "Check if API Gateway has configured authorizers at api or method level.",
"Description": "Check if API Gateway has configured authorizers.",
"Risk": "If no authorizer is enabled anyone can use the service.",
"RelatedUrl": "",
"Remediation": {

View File

@@ -13,41 +13,12 @@ class apigateway_restapi_authorizers_enabled(Check):
report.resource_id = rest_api.name
report.resource_arn = rest_api.arn
report.resource_tags = rest_api.tags
# if there are no authorizers at api level and the resources have no methods (default case) ->
report.status = "FAIL"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} does not have an authorizer configured at api level."
if rest_api.authorizer:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} has an authorizer configured at api level"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} has an authorizer configured."
else:
# the api has no authorizer at api level; check whether all its resources' methods are authorized
resources_have_methods = False
all_methods_authorized = True
resource_paths_with_unathorized_methods = []
for resource in rest_api.resources:
# if the resource has methods test if they have all configured authorizer
if resource.resource_methods:
resources_have_methods = True
for (
http_method,
authorization_method,
) in resource.resource_methods.items():
if authorization_method == "NONE":
all_methods_authorized = False
unauthorized_method = (
resource.path + " -> " + http_method
)
resource_paths_with_unathorized_methods.append(
unauthorized_method
)
# if there are methods in at least one resource and all of them are authorized
if all_methods_authorized and resources_have_methods:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} has all methods authorized"
# if there are methods in at least one resource but some of them are not authorized -> list them
elif not all_methods_authorized:
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} does not have authorizers at api level and the following paths and methods are unauthorized: {'; '.join(resource_paths_with_unathorized_methods)}."
report.status = "FAIL"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} does not have an authorizer configured."
findings.append(report)
return findings
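The method-level aggregation introduced above (collecting every "path -> METHOD" pair whose authorization type is NONE) can be sketched with plain dictionaries (hypothetical helper, decoupled from the check class):

```python
def unauthorized_methods(resources):
    """Return 'path -> METHOD' entries for every method whose authorization
    type is NONE. `resources` maps a path to a {method: auth_type} dict."""
    unauthorized = []
    for path, methods in resources.items():
        for http_method, auth_type in methods.items():
            if auth_type == "NONE":
                unauthorized.append(f"{path} -> {http_method}")
    return unauthorized


resources = {"/pets": {"GET": "NONE", "POST": "AWS_IAM"}, "/health": {}}
print(unauthorized_methods(resources))  # ['/pets -> GET']
```

A resource with no methods (like `/health` here) contributes nothing, matching the check's "resources without methods" default case.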

View File

@@ -9,15 +9,14 @@ from prowler.providers.aws.lib.service.service import AWSService
################## APIGateway
class APIGateway(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.rest_apis = []
self.__threading_call__(self.__get_rest_apis__, self.rest_apis)
self.__threading_call__(self.__get_authorizers__, self.rest_apis)
self.__threading_call__(self.__get_rest_api__, self.rest_apis)
self.__threading_call__(self.__get_stages__, self.rest_apis)
self.__threading_call__(self.__get_resources__, self.rest_apis)
self.__threading_call__(self.__get_rest_apis__)
self.__get_authorizers__()
self.__get_rest_api__()
self.__get_stages__()
def __get_rest_apis__(self, regional_client):
logger.info("APIGateway - Getting Rest APIs...")
@@ -43,93 +42,60 @@ class APIGateway(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __get_authorizers__(self, rest_api):
def __get_authorizers__(self):
logger.info("APIGateway - Getting Rest APIs authorizer...")
try:
regional_client = self.regional_clients[rest_api.region]
authorizers = regional_client.get_authorizers(restApiId=rest_api.id)[
"items"
]
if authorizers:
rest_api.authorizer = True
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
authorizers = regional_client.get_authorizers(restApiId=rest_api.id)[
"items"
]
if authorizers:
rest_api.authorizer = True
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
logger.error(f"{error.__class__.__name__}: {error}")
def __get_rest_api__(self, rest_api):
def __get_rest_api__(self):
logger.info("APIGateway - Describing Rest API...")
try:
regional_client = self.regional_clients[rest_api.region]
rest_api_info = regional_client.get_rest_api(restApiId=rest_api.id)
if rest_api_info["endpointConfiguration"]["types"] == ["PRIVATE"]:
rest_api.public_endpoint = False
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
rest_api_info = regional_client.get_rest_api(restApiId=rest_api.id)
if rest_api_info["endpointConfiguration"]["types"] == ["PRIVATE"]:
rest_api.public_endpoint = False
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
logger.error(f"{error.__class__.__name__}: {error}")
def __get_stages__(self, rest_api):
def __get_stages__(self):
logger.info("APIGateway - Getting stages for Rest APIs...")
try:
regional_client = self.regional_clients[rest_api.region]
stages = regional_client.get_stages(restApiId=rest_api.id)
for stage in stages["item"]:
waf = None
logging = False
client_certificate = False
if "webAclArn" in stage:
waf = stage["webAclArn"]
if "methodSettings" in stage:
if stage["methodSettings"]:
logging = True
if "clientCertificateId" in stage:
client_certificate = True
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/restapis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
)
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __get_resources__(self, rest_api):
try:
regional_client = self.regional_clients[rest_api.region]
get_resources_paginator = regional_client.get_paginator("get_resources")
for page in get_resources_paginator.paginate(restApiId=rest_api.id):
for resource in page["items"]:
id = resource["id"]
resource_methods = []
methods_auth = {}
for resource_method in resource.get("resourceMethods", {}).keys():
resource_methods.append(resource_method)
for resource_method in resource_methods:
if resource_method != "OPTIONS":
method_config = regional_client.get_method(
restApiId=rest_api.id,
resourceId=id,
httpMethod=resource_method,
)
auth_type = method_config["authorizationType"]
methods_auth.update({resource_method: auth_type})
rest_api.resources.append(
PathResourceMethods(
path=resource["path"], resource_methods=methods_auth
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
stages = regional_client.get_stages(restApiId=rest_api.id)
for stage in stages["item"]:
waf = None
logging = False
client_certificate = False
if "webAclArn" in stage:
waf = stage["webAclArn"]
if "methodSettings" in stage:
if stage["methodSettings"]:
logging = True
if "clientCertificateId" in stage:
client_certificate = True
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/restapis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
)
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
logger.error(f"{error.__class__.__name__}: {error}")
class Stage(BaseModel):
@@ -141,11 +107,6 @@ class Stage(BaseModel):
tags: Optional[list] = []
class PathResourceMethods(BaseModel):
path: str
resource_methods: dict
class RestAPI(BaseModel):
id: str
arn: str
@@ -155,4 +116,3 @@ class RestAPI(BaseModel):
public_endpoint: bool = True
stages: list[Stage] = []
tags: Optional[list] = []
resources: list[PathResourceMethods] = []

View File

@@ -1,6 +1,6 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.apigatewayv2.apigatewayv2_service import (
ApiGatewayV2,
)
from prowler.providers.common.common import get_global_provider
apigatewayv2_client = ApiGatewayV2(get_global_provider())
apigatewayv2_client = ApiGatewayV2(current_audit_info)

View File

@@ -9,13 +9,13 @@ from prowler.providers.aws.lib.service.service import AWSService
################## ApiGatewayV2
class ApiGatewayV2(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.apis = []
self.__threading_call__(self.__get_apis__, self.apis)
self.__threading_call__(self.__get_authorizers__, self.apis)
self.__threading_call__(self.__get_stages__, self.apis)
self.__threading_call__(self.__get_apis__)
self.__get_authorizers__()
self.__get_stages__()
def __get_apis__(self, regional_client):
logger.info("APIGatewayv2 - Getting APIs...")
@@ -41,32 +41,36 @@ class ApiGatewayV2(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __get_authorizers__(self, api):
def __get_authorizers__(self):
logger.info("APIGatewayv2 - Getting APIs authorizer...")
try:
regional_client = self.regional_clients[api.region]
authorizers = regional_client.get_authorizers(ApiId=api.id)["Items"]
if authorizers:
api.authorizer = True
for api in self.apis:
regional_client = self.regional_clients[api.region]
authorizers = regional_client.get_authorizers(ApiId=api.id)["Items"]
if authorizers:
api.authorizer = True
except Exception as error:
logger.error(
f"{error.__class__.__name__}:{error.__traceback__.tb_lineno} -- {error}"
)
def __get_stages__(self, api):
def __get_stages__(self):
logger.info("APIGatewayv2 - Getting stages for APIs...")
try:
regional_client = self.regional_clients[api.region]
stages = regional_client.get_stages(ApiId=api.id)
for stage in stages["Items"]:
logging = False
if "AccessLogSettings" in stage:
logging = True
api.stages.append(
Stage(
name=stage["StageName"],
logging=logging,
tags=[stage.get("Tags")],
for api in self.apis:
regional_client = self.regional_clients[api.region]
stages = regional_client.get_stages(ApiId=api.id)
for stage in stages["Items"]:
logging = False
if "AccessLogSettings" in stage:
logging = True
api.stages.append(
Stage(
name=stage["StageName"],
logging=logging,
tags=[stage.get("Tags")],
)
)
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}:{error.__traceback__.tb_lineno} -- {error}"

View File

@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.appstream.appstream_service import AppStream
from prowler.providers.common.common import get_global_provider
appstream_client = AppStream(get_global_provider())
appstream_client = AppStream(current_audit_info)

View File

@@ -9,12 +9,12 @@ from prowler.providers.aws.lib.service.service import AWSService
################## AppStream
class AppStream(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.fleets = []
self.__threading_call__(self.__describe_fleets__)
self.__threading_call__(self.__list_tags_for_resource__, self.fleets)
self.__list_tags_for_resource__()
def __describe_fleets__(self, regional_client):
logger.info("AppStream - Describing Fleets...")
@@ -50,13 +50,15 @@ class AppStream(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_tags_for_resource__(self, fleet):
def __list_tags_for_resource__(self):
logger.info("AppStream - List Tags...")
try:
regional_client = self.regional_clients[fleet.region]
response = regional_client.list_tags_for_resource(ResourceArn=fleet.arn)[
"Tags"
]
fleet.tags = [response]
for fleet in self.fleets:
regional_client = self.regional_clients[fleet.region]
response = regional_client.list_tags_for_resource(
ResourceArn=fleet.arn
)["Tags"]
fleet.tags = [response]
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"

View File

@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.athena.athena_service import Athena
from prowler.providers.common.common import get_global_provider
athena_client = Athena(get_global_provider())
athena_client = Athena(current_audit_info)

View File

@@ -9,18 +9,14 @@ from prowler.providers.aws.lib.service.service import AWSService
################## Athena
class Athena(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.workgroups = {}
self.__threading_call__(self.__list_workgroups__)
self.__threading_call__(self.__get_workgroups__, self.workgroups.values())
self.__threading_call__(
self.__list_query_executions__, self.workgroups.values()
)
self.__threading_call__(
self.__list_tags_for_resource__, self.workgroups.values()
)
self.__get_workgroups__()
self.__list_query_executions__()
self.__list_tags_for_resource__()
def __list_workgroups__(self, regional_client):
logger.info("Athena - Listing WorkGroups...")
@@ -48,65 +44,86 @@ class Athena(AWSService):
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __get_workgroups__(self, workgroup):
def __get_workgroups__(self):
logger.info("Athena - Getting WorkGroups...")
try:
wg = self.regional_clients[workgroup.region].get_work_group(
WorkGroup=workgroup.name
)
for workgroup in self.workgroups.values():
try:
wg = self.regional_clients[workgroup.region].get_work_group(
WorkGroup=workgroup.name
)
wg_configuration = wg.get("WorkGroup").get("Configuration")
self.workgroups[
workgroup.arn
].enforce_workgroup_configuration = wg_configuration.get(
"EnforceWorkGroupConfiguration", False
)
wg_configuration = wg.get("WorkGroup").get("Configuration")
self.workgroups[
workgroup.arn
].enforce_workgroup_configuration = wg_configuration.get(
"EnforceWorkGroupConfiguration", False
)
# We include an empty EncryptionConfiguration to handle if the workgroup does not have encryption configured
encryption = (
wg_configuration.get(
"ResultConfiguration",
{"EncryptionConfiguration": {}},
)
.get(
"EncryptionConfiguration",
{"EncryptionOption": ""},
)
.get("EncryptionOption")
)
# We include an empty EncryptionConfiguration to handle if the workgroup does not have encryption configured
encryption = (
wg_configuration.get(
"ResultConfiguration",
{"EncryptionConfiguration": {}},
)
.get(
"EncryptionConfiguration",
{"EncryptionOption": ""},
)
.get("EncryptionOption")
)
if encryption in ["SSE_S3", "SSE_KMS", "CSE_KMS"]:
encryption_configuration = EncryptionConfiguration(
encryption_option=encryption, encrypted=True
)
self.workgroups[
workgroup.arn
].encryption_configuration = encryption_configuration
if encryption in ["SSE_S3", "SSE_KMS", "CSE_KMS"]:
encryption_configuration = EncryptionConfiguration(
encryption_option=encryption, encrypted=True
)
self.workgroups[
workgroup.arn
].encryption_configuration = encryption_configuration
except Exception as error:
logger.error(
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{workgroup.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_query_executions__(self, workgroup):
def __list_query_executions__(self):
logger.info("Athena - Listing Queries...")
try:
queries = (
self.regional_clients[workgroup.region]
.list_query_executions(WorkGroup=workgroup.name)
.get("QueryExecutionIds", [])
)
if queries:
workgroup.queries = True
for workgroup in self.workgroups.values():
try:
queries = (
self.regional_clients[workgroup.region]
.list_query_executions(WorkGroup=workgroup.name)
.get("QueryExecutionIds", [])
)
if queries:
workgroup.queries = True
except Exception as error:
logger.error(
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{workgroup.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __list_tags_for_resource__(self, workgroup):
def __list_tags_for_resource__(self):
logger.info("Athena - Listing Tags...")
try:
regional_client = self.regional_clients[workgroup.region]
workgroup.tags = regional_client.list_tags_for_resource(
ResourceARN=workgroup.arn
)["Tags"]
for workgroup in self.workgroups.values():
try:
regional_client = self.regional_clients[workgroup.region]
workgroup.tags = regional_client.list_tags_for_resource(
ResourceARN=workgroup.arn
)["Tags"]
except Exception as error:
logger.error(
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
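The encryption lookup in `__get_workgroups__` above chains `dict.get` with dictionary defaults so a workgroup with no result configuration collapses to an empty encryption option instead of raising `KeyError`. The same pattern in isolation (field names taken from the hunk, which mirrors the Athena `GetWorkGroup` response shape):

```python
def encryption_option(wg_configuration: dict) -> str:
    # Missing "ResultConfiguration" or "EncryptionConfiguration" keys
    # collapse to "" rather than raising, matching the defensive chain
    # in the diff above.
    return (
        wg_configuration.get("ResultConfiguration", {"EncryptionConfiguration": {}})
        .get("EncryptionConfiguration", {"EncryptionOption": ""})
        .get("EncryptionOption", "")
    )
```

An option of `SSE_S3`, `SSE_KMS`, or `CSE_KMS` then marks the workgroup as encrypted, as the subsequent branch in the hunk shows.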


@@ -12,7 +12,7 @@ class athena_workgroup_encryption(Check):
# Only check for enabled and used workgroups (has recent queries)
if (
workgroup.state == "ENABLED" and workgroup.queries
) or not athena_client.provider.ignore_unused_services:
) or not athena_client.audit_info.ignore_unused_services:
report = Check_Report_AWS(self.metadata())
report.region = workgroup.region
report.resource_id = workgroup.name
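Both check hunks gate auditing on the workgroup being enabled and recently used, unless unused services are being audited as well. The predicate on its own (a sketch with `ignore_unused_services` as a plain boolean rather than the provider attribute):

```python
def should_audit(state: str, has_queries: bool, ignore_unused_services: bool) -> bool:
    # Always audit enabled workgroups with recent queries; audit
    # everything else only when unused services are not being ignored.
    return (state == "ENABLED" and has_queries) or not ignore_unused_services
```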


@@ -12,7 +12,7 @@ class athena_workgroup_enforce_configuration(Check):
# Only check for enabled and used workgroups (has recent queries)
if (
workgroup.state == "ENABLED" and workgroup.queries
) or not athena_client.provider.ignore_unused_services:
) or not athena_client.audit_info.ignore_unused_services:
report = Check_Report_AWS(self.metadata())
report.region = workgroup.region
report.resource_id = workgroup.name


@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.autoscaling.autoscaling_service import AutoScaling
from prowler.providers.common.common import get_global_provider
autoscaling_client = AutoScaling(get_global_provider())
autoscaling_client = AutoScaling(current_audit_info)


@@ -7,9 +7,9 @@ from prowler.providers.aws.lib.service.service import AWSService
################## AutoScaling
class AutoScaling(AWSService):
def __init__(self, provider):
def __init__(self, audit_info):
# Call AWSService's __init__
super().__init__(__class__.__name__, provider)
super().__init__(__class__.__name__, audit_info)
self.launch_configurations = []
self.__threading_call__(self.__describe_launch_configurations__)
self.groups = []


@@ -1,4 +1,4 @@
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.awslambda.awslambda_service import Lambda
from prowler.providers.common.common import get_global_provider
awslambda_client = Lambda(get_global_provider())
awslambda_client = Lambda(current_audit_info)


@@ -8,9 +8,7 @@ from prowler.providers.aws.services.cloudtrail.cloudtrail_client import (
class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check):
def execute(self):
findings = []
functions = awslambda_client.functions.values()
self.start_task("Processing functions...", len(functions))
for function in functions:
for function in awslambda_client.functions.values():
report = Check_Report_AWS(self.metadata())
report.region = function.region
report.resource_id = function.name
@@ -51,7 +49,5 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check)
report.status_extended = f"Lambda function {function.name} is recorded by CloudTrail trail {trail.name}."
break
findings.append(report)
self.increment_task_progress()
self.update_title_with_findings(findings)
return findings


@@ -11,92 +11,57 @@ from prowler.providers.aws.services.awslambda.awslambda_client import awslambda_
class awslambda_function_no_secrets_in_code(Check):
def execute(self):
findings = []
if awslambda_client.functions:
functions = awslambda_client.functions.values()
self.start_task("Processing functions...", len(functions))
for function, function_code in awslambda_client.__get_function_code__():
if function_code:
report = Check_Report_AWS(self.metadata())
report.region = function.region
report.resource_id = function.name
report.resource_arn = function.arn
report.resource_tags = function.tags
for function in awslambda_client.functions.values():
if function.code:
report = Check_Report_AWS(self.metadata())
report.region = function.region
report.resource_id = function.name
report.resource_arn = function.arn
report.resource_tags = function.tags
report.status = "PASS"
report.status_extended = (
f"No secrets found in Lambda function {function.name} code."
)
with tempfile.TemporaryDirectory() as tmp_dir_name:
function_code.code_zip.extractall(tmp_dir_name)
# List all files
files_in_zip = next(os.walk(tmp_dir_name))[2]
secrets_findings = []
for file in files_in_zip:
secrets = SecretsCollection()
with default_settings():
secrets.scan_file(f"{tmp_dir_name}/{file}")
detect_secrets_output = secrets.json()
if detect_secrets_output:
for (
file_name
) in (
detect_secrets_output.keys()
): # Appears that only 1 file is being scanned at a time, so could rework this
output_file_name = file_name.replace(
f"{tmp_dir_name}/", ""
)
secrets_string = ", ".join(
[
f"{secret['type']} on line {secret['line_number']}"
for secret in detect_secrets_output[
file_name
]
]
)
secrets_findings.append(
f"{output_file_name}: {secrets_string}"
)
report.status = "PASS"
report.status_extended = (
f"No secrets found in Lambda function {function.name} code."
)
with tempfile.TemporaryDirectory() as tmp_dir_name:
function_code.code_zip.extractall(tmp_dir_name)
# List all files
files_in_zip = next(os.walk(tmp_dir_name))[2]
secrets_findings = []
for file in files_in_zip:
secrets = SecretsCollection()
with default_settings():
secrets.scan_file(f"{tmp_dir_name}/{file}")
detect_secrets_output = secrets.json()
if detect_secrets_output:
for (
file_name
) in (
detect_secrets_output.keys()
): # Appears that only 1 file is being scanned at a time, so could rework this
output_file_name = file_name.replace(
f"{tmp_dir_name}/", ""
)
secrets_string = ", ".join(
[
f"{secret['type']} on line {secret['line_number']}"
for secret in detect_secrets_output[
file_name
]
]
)
secrets_findings.append(
f"{output_file_name}: {secrets_string}"
)
report.status = "PASS"
report.status_extended = (
f"No secrets found in Lambda function {function.name} code."
)
with tempfile.TemporaryDirectory() as tmp_dir_name:
function.code.code_zip.extractall(tmp_dir_name)
# List all files
files_in_zip = next(os.walk(tmp_dir_name))[2]
secrets_findings = []
for file in files_in_zip:
secrets = SecretsCollection()
with default_settings():
secrets.scan_file(f"{tmp_dir_name}/{file}")
detect_secrets_output = secrets.json()
if detect_secrets_output:
for (
file_name
) in (
detect_secrets_output.keys()
): # Appears that only 1 file is being scanned at a time, so could rework this
output_file_name = file_name.replace(
f"{tmp_dir_name}/", ""
)
secrets_string = ", ".join(
[
f"{secret['type']} on line {secret['line_number']}"
for secret in detect_secrets_output[file_name]
]
)
secrets_findings.append(
f"{output_file_name}: {secrets_string}"
)
if secrets_findings:
final_output_string = "; ".join(secrets_findings)
report.status = "FAIL"
report.status_extended = f"Potential {'secrets' if len(secrets_findings) > 1 else 'secret'} found in Lambda function {function.name} code -> {final_output_string}."
if secrets_findings:
final_output_string = "; ".join(secrets_findings)
report.status = "FAIL"
# report.status_extended = f"Potential {'secrets' if len(secrets_findings)>1 else 'secret'} found in Lambda function {function.name} code. {final_output_string}."
if len(secrets_findings) > 1:
report.status_extended = f"Potential secrets found in Lambda function {function.name} code -> {final_output_string}."
else:
report.status_extended = f"Potential secret found in Lambda function {function.name} code -> {final_output_string}."
# break // Don't break as there may be additional findings
findings.append(report)
findings.append(report)
self.increment_task_progress()
self.update_title_with_findings(findings)
return findings
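The secret-reporting branch in the check above joins per-file detect-secrets results into a single status string. That formatting in isolation, with the output shape assumed from the hunk (`secrets.json()` maps file paths to lists of `{"type": ..., "line_number": ...}` entries):

```python
def format_secret_findings(detect_secrets_output: dict, tmp_dir_name: str) -> list:
    # Strip the temporary extraction directory from each path and
    # summarize every secret as "<type> on line <n>", as the check does.
    secrets_findings = []
    for file_name, secrets in detect_secrets_output.items():
        output_file_name = file_name.replace(f"{tmp_dir_name}/", "")
        secrets_string = ", ".join(
            f"{secret['type']} on line {secret['line_number']}" for secret in secrets
        )
        secrets_findings.append(f"{output_file_name}: {secrets_string}")
    return secrets_findings
```

The check then joins these entries with `"; "` into the `FAIL` status-extended message shown in the diff.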


@@ -12,8 +12,6 @@ from prowler.providers.aws.services.awslambda.awslambda_client import awslambda_
class awslambda_function_no_secrets_in_variables(Check):
def execute(self):
findings = []
functions = awslambda_client.functions.values()
self.start_task("Processing functions...", len(functions))
for function in awslambda_client.functions.values():
report = Check_Report_AWS(self.metadata())
report.region = function.region
@@ -54,6 +52,5 @@ class awslambda_function_no_secrets_in_variables(Check):
os.remove(temp_env_data_file.name)
findings.append(report)
self.increment_task_progress()
self.update_title_with_findings(findings)
return findings

Some files were not shown because too many files have changed in this diff.