Compare commits


46 Commits
3.9.0 ... 3.3.4

- 7fb0e84686 chore(release): 3.3.4 (github-actions, 2023-04-05 12:02:43 +00:00)
- e07167a6fc chore(release): 3.3.3 (Pepe Fagoaga, 2023-04-05 13:12:38 +02:00)
- f8e69ee26c chore(regions_update): Changes in regions for AWS services. (#2173) (Sergio Garcia, 2023-04-05 12:39:51 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 43cba63389 build(deps-dev): bump moto from 4.1.5 to 4.1.6 (#2164) (dependabot[bot], 2023-04-05 12:39:45 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
  Co-authored-by: Pepe Fagoaga <pepe@verica.io>
- cc19c768fc fix(elbv2 desync check): Mixed elbv2 desync and smuggling (#2171) (Nacho Rivera, 2023-04-05 12:39:38 +02:00)
- 4e9a107478 chore(regions_update): Changes in regions for AWS services. (#2170) (Sergio Garcia, 2023-04-05 12:39:34 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 6635de3526 build(deps): bump botocore from 1.29.100 to 1.29.105 (#2163) (dependabot[bot], 2023-04-05 12:39:26 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- eb1597b708 build(deps): bump mkdocs-material from 9.1.4 to 9.1.5 (#2162) (dependabot[bot], 2023-04-05 12:39:20 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- f0d9800e96 fix(pipeline build): fixed wording when build and push (#2169) (Nacho Rivera, 2023-04-05 12:39:06 +02:00)
  # Conflicts:
  #	poetry.lock
- f660118da9 build(deps-dev): bump pylint from 2.17.0 to 2.17.2 (#2161) (dependabot[bot], 2023-04-05 12:37:54 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
  # Conflicts:
  #	poetry.lock
- c66d75fc79 fix(dax): Call list_tags using the cluster ARN (#2167) (Pepe Fagoaga, 2023-04-05 12:35:08 +02:00)
- 6968172f93 fix(iam): Handle LimitExceededException when calling generate_credential_report (#2168) (Pepe Fagoaga, 2023-04-05 12:35:02 +02:00)
- dc1c8bf91b fix(cloudformation): Handle ValidationError (#2166) (Pepe Fagoaga, 2023-04-05 12:34:57 +02:00)
- 529eb4b858 fix(rds): Handle DBSnapshotNotFound (#2165) (Pepe Fagoaga, 2023-04-05 12:34:51 +02:00)
- 0eeb919c91 fix(secretsmanager_automatic_rotation_enabled): Improve description for Secrets Manager secret rotation (#2156) (Michael Göhler, 2023-04-05 12:34:42 +02:00)
- 441fffb32d chore(regions_update): Changes in regions for AWS services. (#2159) (Sergio Garcia, 2023-04-05 12:34:36 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- dbc18ecea9 fix(docs): check extra_742 name adjusted in the V2 to V3 mapping (#2154) (Igor Ceron, 2023-04-05 12:34:31 +02:00)
- 79289c23c1 chore(regions_update): Changes in regions for AWS services. (#2155) (Sergio Garcia, 2023-04-05 12:34:24 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 2580d820ec fix(pypi): Build from release branch (#2151) (Pepe Fagoaga, 2023-03-30 10:16:03 +02:00)
- 3f3888f499 chore(poetry): lock (Pepe Fagoaga, 2023-03-30 08:22:25 +02:00)
- b20a532ab5 chore(regions_update): Changes in regions for AWS services. (#2149) (Sergio Garcia, 2023-03-30 08:20:49 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- b11c9991a3 fix(ssm): Handle ValidationException when retrieving documents (#2146) (Pepe Fagoaga, 2023-03-30 08:20:49 +02:00)
- fd9cbe6ba3 fix(audit_info): azure subscriptions parsing error (#2147) (Nacho Rivera, 2023-03-30 08:20:49 +02:00)
- 227b1db493 fix(delete check): delete check ec2_securitygroup_in_use_without_ingress_filtering (#2148) (Nacho Rivera, 2023-03-30 08:20:49 +02:00)
- 10d8c12913 chore(regions_update): Changes in regions for AWS services. (#2145) (Sergio Garcia, 2023-03-30 08:20:49 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- d9a1cc333e build(deps): bump botocore from 1.29.90 to 1.29.100 (#2142) (dependabot[bot], 2023-03-30 08:20:49 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- d1670c8347 build(deps): bump mkdocs-material from 9.1.3 to 9.1.4 (#2140) (dependabot[bot], 2023-03-30 08:20:49 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- b03bc8fc0b fix(azure output): change default values of audit identity metadata (#2144) (Nacho Rivera, 2023-03-30 08:20:49 +02:00)
- 79c975ab9c build(deps): bump pydantic from 1.10.6 to 1.10.7 (#2139) (dependabot[bot], 2023-03-30 08:20:49 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- 68cf285fb9 build(deps): bump alive-progress from 3.0.1 to 3.1.0 (#2138) (dependabot[bot], 2023-03-30 08:20:47 +02:00)
  Signed-off-by: dependabot[bot] <support@github.com>
  Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
- 36ba0ffb1f chore(docs): Developer Guide - how to create a new check (#2137) (Sergio Garcia, 2023-03-30 08:20:22 +02:00)
- d2b1818700 fix(resource_not_found): Handle error (#2136) (Pepe Fagoaga, 2023-03-30 08:20:22 +02:00)
- 2df02acbd8 feat(defender service): retrieving key dicts with get (#2129) (Nacho Rivera, 2023-03-30 08:20:20 +02:00)
- 7fd11b2f0e fix(s3): handle if ignore_public_acls is None (#2128) (Pepe Fagoaga, 2023-03-30 08:19:42 +02:00)
- 47bab87603 fix(brew): move brew formula action to the bottom (#2135) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
- 1835e1df98 fix(aws_provider): Fix assessment session name (#2132) (Pepe Fagoaga, 2023-03-30 08:18:52 +02:00)
- dc88b033a0 chore(regions_update): Changes in regions for AWS services. (#2126) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 0f784e34db chore(regions_update): Changes in regions for AWS services. (#2123) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 698f299edf fix(quickinventory): AttributError when creating inventory table (#2122) (Ben Nugent, 2023-03-30 08:18:52 +02:00)
- 172bb36dfb fix(output bucket): solve IsADirectoryError using compliance flag (#2121) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
- 4b1e6c7493 chore(regions_update): Changes in regions for AWS services. (#2120) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
  Co-authored-by: sergargar <sergargar@users.noreply.github.com>
- 6d5c277c73 docs: improve reporting documentation (#2119) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
- 2d4b7ce499 docs: improve quick inventory section (#2117) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
- 3ee3a3b2f1 docs(developer-guide): added phase 1 of the developer guide (#1904) (Toni de la Fuente, 2023-03-30 08:18:52 +02:00)
  Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
- e9e6ff0e24 docs: Remove list severities (#2116) (Pepe Fagoaga, 2023-03-30 08:18:52 +02:00)
- e0c6de76d4 chore(version): check latest version (#2106) (Sergio Garcia, 2023-03-30 08:18:52 +02:00)
66 changed files with 1301 additions and 1049 deletions


@@ -69,7 +69,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Build container image (latest)
- name: Build and push container image (latest)
if: github.event_name == 'push'
uses: docker/build-push-action@v2
with:
@@ -77,11 +77,11 @@ jobs:
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build container image (release)
- name: Build and push container image (release)
if: github.event_name == 'release'
uses: docker/build-push-action@v2
with:


@@ -6,7 +6,6 @@ on:
env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
GITHUB_BRANCH: master
jobs:
release-prowler-job:
@@ -17,8 +16,7 @@ jobs:
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
with:
ref: ${{ env.GITHUB_BRANCH }}
- name: Install dependencies
run: |
pipx install poetry
@@ -59,13 +57,6 @@ jobs:
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
# Create pull request to github.com/Homebrew/homebrew-core to update prowler formula
- name: Bump Homebrew formula
uses: mislav/bump-homebrew-formula-action@v2
with:
formula-name: prowler
env:
COMMITTER_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}
- name: Replicate PyPi Package
run: |
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
@@ -76,3 +67,11 @@ jobs:
run: |
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish
# Create pull request to github.com/Homebrew/homebrew-core to update prowler formula
- name: Bump Homebrew formula
uses: mislav/bump-homebrew-formula-action@v2
with:
formula-name: prowler
base-branch: release-${{ env.RELEASE_TAG }}
env:
COMMITTER_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}


@@ -19,6 +19,7 @@ repos:
hooks:
- id: pretty-format-toml
args: [--autofix]
files: pyproject.toml
## BASH
- repo: https://github.com/koalaman/shellcheck-precommit
@@ -56,7 +57,7 @@ repos:
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
rev: 1.4.0 # add version here
rev: 1.4.0 # add version here
hooks:
- id: poetry-check
- id: poetry-lock

BIN docs/img/output-html.png (new binary image, 631 KiB; not shown)

BIN (new binary image, 320 KiB; not shown)

BIN (deleted binary image, 220 KiB; not shown)


@@ -31,7 +31,7 @@ checks_v3_to_v2_mapping = {
"awslambda_function_url_cors_policy": "extra7180",
"awslambda_function_url_public": "extra7179",
"awslambda_function_using_supported_runtimes": "extra762",
"cloudformation_outputs_find_secrets": "extra742",
"cloudformation_stack_outputs_find_secrets": "extra742",
"cloudformation_stacks_termination_protection_enabled": "extra7154",
"cloudfront_distributions_field_level_encryption_enabled": "extra767",
"cloudfront_distributions_geo_restrictions_enabled": "extra732",
@@ -113,7 +113,6 @@ checks_v3_to_v2_mapping = {
"ec2_securitygroup_allow_wide_open_public_ipv4": "extra778",
"ec2_securitygroup_default_restrict_traffic": "check43",
"ec2_securitygroup_from_launch_wizard": "extra7173",
"ec2_securitygroup_in_use_without_ingress_filtering": "extra74",
"ec2_securitygroup_not_used": "extra75",
"ec2_securitygroup_with_many_ingress_egress_rules": "extra777",
"ecr_repositories_lifecycle_policy_enabled": "extra7194",
@@ -138,7 +137,6 @@ checks_v3_to_v2_mapping = {
"elbv2_internet_facing": "extra79",
"elbv2_listeners_underneath": "extra7158",
"elbv2_logging_enabled": "extra717",
"elbv2_request_smugling": "extra7142",
"elbv2_ssl_listeners": "extra793",
"elbv2_waf_acl_attached": "extra7129",
"emr_cluster_account_public_block_enabled": "extra7178",


@@ -81,36 +81,4 @@ Standard results will be shown and additionally the framework information as the
## Create and contribute adding other Security Frameworks
If you want to create or contribute your own security frameworks, or add public ones to Prowler, you first need to make sure the required checks are available; if not, you have to create them. Then create a compliance file per provider, as in `prowler/compliance/aws/`, name it `<framework>_<version>_<provider>.json`, and follow the format below to create yours.
At a high level, each version of a framework file has the following structure (one requirement can also be called one control, and one requirement can be linked to multiple Prowler checks):
- `Framework`: string. Distinctive name of the framework, like CIS.
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI, ...
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Includes all requirements or controls with the mapping to Prowler.
- `Requirements_Id`: string. Unique identifier for each requirement in the specific framework.
- `Requirements_Description`: string. Description as in the framework.
- `Requirements_Attributes`: array of objects. Includes all needed attributes per requirement, like levels, sections, etc.; anything that helps to create a dedicated report with the result of the findings. Attributes should be taken as closely as possible from the framework's own terminology.
- `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. If no automation is possible, this can be empty.
```
{
  "Framework": "<framework>-<provider>",
  "Version": "<version>",
  "Requirements": [
    {
      "Id": "<unique-id>",
      "Description": "Requirement full description",
      "Checks": [
        "Here is the prowler check or checks that is going to be executed"
      ],
      "Attributes": [
        {
          <Add here your custom attributes.>
        }
      ]
    }
  ]
}
```
Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.
This information is part of the Developer Guide and can be found here: https://docs.prowler.cloud/en/latest/tutorials/developer-guide/.


@@ -0,0 +1,281 @@
# Developer Guide
You can extend Prowler in many different ways. In most cases you will want to create your own checks or compliance security frameworks; this guide explains how to get started with both. It also covers how to create custom outputs, integrations and more.
## Get the code and install all dependencies
First of all, you need Python 3.9 or higher and pip installed so you can install all required dependencies. Once that is satisfied, go ahead and clone the repo:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`:
```
pip install poetry
```
Then install all dependencies including the ones for developers:
```
poetry install
poetry shell
```
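At this point you can sanity-check the environment by running Prowler from your local copy; for example, inside the `poetry shell`, printing the version of your checkout:
```
prowler -v
```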
## Contributing with your code or fixes to Prowler
This repo has git pre-commit hooks managed via the pre-commit tool. Install it however you like, then in the root of this repo run:
```
pre-commit install
```
You should get an output like the following:
```
pre-commit installed at .git/hooks/pre-commit
```
Before we merge any of your pull requests, we run checks against the code. We use the following tools and automation to make sure the code is secure and its dependencies are up to date (these should already be installed if you ran `poetry install` as described above):
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our container security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
You can see all dependencies in the `pyproject.toml` file.
## Create a new check for a Provider
### If the check you want to create belongs to an existing service
To create a new check, you will need to create a folder inside the specific service, i.e. `prowler/providers/<provider>/services/<service>/<check_name>/`, with the name of the check following the pattern `service_subservice_action`.
Inside that folder, create the following files:
- An empty `__init__.py`: to make Python treat this check folder as a package.
- A `check_name.py` containing the check's logic, for example:
```
# Import the Check_Report of the specific provider
from prowler.lib.check.models import Check, Check_Report_AWS

# Import the client of the specific service
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


# Create the class for the check
class ec2_ebs_volume_encryption(Check):
    def execute(self):
        findings = []
        # Iterate over the service's assets to be analyzed
        for volume in ec2_client.volumes:
            # Initialize a Check Report for each item and assign the region, resource_id, resource_arn and resource_tags
            report = Check_Report_AWS(self.metadata())
            report.region = volume.region
            report.resource_id = volume.id
            report.resource_arn = volume.arn
            report.resource_tags = volume.tags
            # Implement the check's conditions and set a PASS or FAIL status with a status_extended message
            if volume.encrypted:
                report.status = "PASS"
                report.status_extended = f"EBS volume {volume.id} is encrypted."
            else:
                report.status = "FAIL"
                report.status_extended = f"EBS volume {volume.id} is unencrypted."
            findings.append(report)  # Append a report for each item
        return findings
```
- A `check_name.metadata.json` containing the check's metadata, for example:
```
{
  "Provider": "aws",
  "CheckID": "ec2_ebs_volume_encryption",
  "CheckTitle": "Ensure there are no EBS Volumes unencrypted.",
  "CheckType": [
    "Data Protection"
  ],
  "ServiceName": "ec2",
  "SubServiceName": "volume",
  "ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
  "Severity": "medium",
  "ResourceType": "AwsEc2Volume",
  "Description": "Ensure there are no EBS Volumes unencrypted.",
  "Risk": "Data encryption at rest prevents data visibility in the event of its unauthorized access or theft.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Encrypt all EBS volumes and enable encryption by default. You can configure your AWS account to enforce the encryption of the new EBS volumes and snapshot copies that you create. For example, Amazon EBS encrypts the EBS volumes created when you launch an instance and the snapshots that you copy from an unencrypted snapshot.",
      "Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html"
    }
  },
  "Categories": [
    "encryption"
  ],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
```
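With both files in place, Prowler picks the check up automatically, so you can execute it on its own to test it; for example, using the standard `-c`/`--checks` filter:
```
prowler aws -c ec2_ebs_volume_encryption
```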
### If the check you want to create belongs to a service not yet supported by Prowler, you will need to create the service first
To create a new service, you will need to create a folder inside the specific provider, i.e. `prowler/providers/<provider>/services/<service>/`.
Inside that folder, create the following files:
- An empty `__init__.py`: to make Python treat this service folder as a package.
- A `<service>_service.py`, containing all the service's logic and API calls:
```
# You must import the following libraries
import threading
from typing import Optional

from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.aws_provider import generate_regional_clients


# Create a class for the Service
################## <Service>
class <Service>:
    def __init__(self, audit_info):
        self.service = "<service>"  # The name of the service boto3 client
        self.session = audit_info.audit_session
        self.audited_account = audit_info.audited_account
        self.audit_resources = audit_info.audit_resources
        self.regional_clients = generate_regional_clients(self.service, audit_info)
        self.<items> = []  # Create an empty list of the items to be gathered, e.g., instances
        self.__threading_call__(self.__describe_<items>__)
        self.__describe_<item>__()  # Optionally, create another function to retrieve more data about each item

    def __get_session__(self):
        return self.session

    def __threading_call__(self, call):
        threads = []
        for regional_client in self.regional_clients.values():
            threads.append(threading.Thread(target=call, args=(regional_client,)))
        for t in threads:
            t.start()
        for t in threads:
            t.join()

    def __describe_<items>__(self, regional_client):
        """Get ALL <Service> <Items>"""
        logger.info("<Service> - Describing <Items>...")
        try:
            describe_<items>_paginator = regional_client.get_paginator("describe_<items>")  # Paginator to get every item
            for page in describe_<items>_paginator.paginate():
                for <item> in page["<Items>"]:
                    if not self.audit_resources or (
                        is_resource_filtered(<item>["<item_arn>"], self.audit_resources)
                    ):
                        self.<items>.append(
                            <Item>(
                                arn=<item>["<item_arn>"],
                                name=<item>["<item_name>"],
                                tags=<item>.get("Tags", []),
                                region=regional_client.region,
                            )
                        )
        except Exception as error:
            logger.error(
                f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )

    def __describe_<item>__(self):
        """Get Details for a <Service> <Item>"""
        logger.info("<Service> - Describing <Item> to get specific details...")
        try:
            for <item> in self.<items>:
                <item>_details = self.regional_clients[<item>.region].describe_<item>(
                    <Attribute>=<item>.name
                )
                # For example, check if the item is Public
                <item>.public = <item>_details.get("Public", False)
        except Exception as error:
            logger.error(
                f"{<item>.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )


class <Item>(BaseModel):
    """<Item> holds a <Service> <Item>"""

    arn: str
    """<Items>[].Arn"""
    name: str
    """<Items>[].Name"""
    public: bool
    """<Items>[].Public"""
    tags: Optional[list] = []
    region: str
```
- A `<service>_client.py`, containing the initialization of the service's class we have just created, so the service's checks can use it:
```
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.<service>.<service>_service import <Service>

<service>_client = <Service>(current_audit_info)
```
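Putting it all together, a check for the new service then follows the same structure as the EC2 example above. Here is a placeholder sketch (the check name `<service>_<item>_not_public` and the `public` condition are illustrative, matching the `<Item>` model defined earlier):
```
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.<service>.<service>_client import <service>_client


class <service>_<item>_not_public(Check):
    def execute(self):
        findings = []
        # The service client already gathered every <item> at import time
        for <item> in <service>_client.<items>:
            report = Check_Report_AWS(self.metadata())
            report.region = <item>.region
            report.resource_id = <item>.name
            report.resource_arn = <item>.arn
            report.resource_tags = <item>.tags
            if <item>.public:
                report.status = "FAIL"
                report.status_extended = f"<Item> {<item>.name} is publicly accessible."
            else:
                report.status = "PASS"
                report.status_extended = f"<Item> {<item>.name} is not publicly accessible."
            findings.append(report)
        return findings
```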
## Create a new security compliance framework
If you want to create or contribute your own security frameworks, or add public ones to Prowler, you first need to make sure the required checks are available; if not, you have to create them. Then create a compliance file per provider, as in `prowler/compliance/aws/`, name it `<framework>_<version>_<provider>.json`, and follow the format below to create yours.
At a high level, each version of a framework file has the following structure (one requirement can also be called one control, and one requirement can be linked to multiple Prowler checks):
- `Framework`: string. Distinctive name of the framework, like CIS.
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI, ...
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Includes all requirements or controls with the mapping to Prowler.
- `Requirements_Id`: string. Unique identifier for each requirement in the specific framework.
- `Requirements_Description`: string. Description as in the framework.
- `Requirements_Attributes`: array of objects. Includes all needed attributes per requirement, like levels, sections, etc.; anything that helps to create a dedicated report with the result of the findings. Attributes should be taken as closely as possible from the framework's own terminology.
- `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. If no automation is possible, this can be empty.
```
{
  "Framework": "<framework>-<provider>",
  "Version": "<version>",
  "Requirements": [
    {
      "Id": "<unique-id>",
      "Description": "Requirement full description",
      "Checks": [
        "Here is the prowler check or checks that is going to be executed"
      ],
      "Attributes": [
        {
          <Add here your custom attributes.>
        }
      ]
    },
    ...
  ]
}
```
Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.
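As a rough sketch of that last step, the data model added to `prowler/lib/outputs/models.py` is a pydantic model with one field per column of the compliance report; the class and field names below are hypothetical and should mirror your framework's attributes:
```
from pydantic import BaseModel


# Hypothetical CSV-output model for <framework> <version>;
# one instance is written per requirement/check result.
class Check_Output_CSV_<Framework>_<Version>(BaseModel):
    Provider: str
    AccountId: str
    Region: str
    AssessmentDate: str
    Requirements_Id: str
    Requirements_Description: str
    Status: str
    ResourceId: str
    CheckId: str
```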
## Create a custom output format
## Create a new integration
## Contribute with documentation
We use `mkdocs` to build this Prowler documentation site, so you can easily contribute new docs or improve existing ones.
1. Install `mkdocs` with your favorite package manager (see the example after this list).
2. Inside the `prowler` repository folder, run `mkdocs serve` and point your browser to `http://localhost:8000`; you will see live changes to your local copy of this documentation site.
3. Make all needed changes to the docs or add new documents. To do so, just edit existing md files inside `prowler/docs`, and if you add a new section or file, please make sure you also add it to the `mkdocs.yaml` file in the root folder of the Prowler repo.
4. Once you are done with your changes, please send us a pull request for review and merge. Thank you in advance!
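For example, with pip (this site's theme is `mkdocs-material`, as you can see in the `poetry.lock` diff below):
```
pip install mkdocs-material  # installs mkdocs as a dependency
mkdocs serve                 # run from the repo root, then open http://localhost:8000
```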
## Want some swag as appreciation for your contribution?
If you are like us and you love swag, we are happy to thank you for your contribution with some laptop stickers or whatever other swag we may have at that time. Please, tell us more details and your pull request link in our [Slack workspace here](https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog). You can also reach out to Toni de la Fuente on Twitter [here](https://twitter.com/ToniBlyx), his DMs are open.


@@ -52,14 +52,15 @@ prowler <provider> -e/--excluded-checks ec2 rds
prowler <provider> -C/--checks-file <checks_list>.json
```
## Severities
Each check of Prowler has a severity, there are options related with it:
## Severities
Each of Prowler's checks has a severity, which can be:
- informational
- low
- medium
- high
- critical
- List the available checks in the provider:
```console
prowler <provider> --list-severities
```
- Execute specific severity(s):
To execute specific severity(s):
```console
prowler <provider> --severity critical high
```


@@ -33,9 +33,8 @@ Several checks analyse resources that are exposed to the Internet, these are:
- ec2_instance_internet_facing_with_instance_profile
- ec2_instance_public_ip
- ec2_networkacl_allow_ingress_any_port
- ec2_securitygroup_allow_ingress_from_internet_to_any_port
- ec2_securitygroup_allow_wide_open_public_ipv4
- ec2_securitygroup_in_use_without_ingress_filtering
- ec2_securitygroup_allow_ingress_from_internet_to_any_port
- ecr_repositories_not_publicly_accessible
- eks_control_plane_endpoint_access_restricted
- eks_endpoints_not_publicly_accessible


@@ -14,4 +14,6 @@ prowler <provider> -i
- Also, it creates by default a CSV and JSON to see detailed information about the resources extracted.
![Quick Inventory Example](../img/quick-inventory.png)
![Quick Inventory Example](../img/quick-inventory.jpg)
> The inventorying process is done with `resourcegroupstaggingapi` calls (except for the IAM resources, which are retrieved with Boto3 API calls).


@@ -46,9 +46,11 @@ Prowler supports natively the following output formats:
Hereunder is the structure for each of the supported report formats by Prowler:
### HTML
![HTML Output](../img/output-html.png)
### CSV
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | COMPLIANCE | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
### JSON
@@ -71,6 +73,10 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"Severity": "low",
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceTags": {
"test": "test",
"enironment": "dev"
},
"ResourceType": "AwsRdsDbInstance",
"ResourceDetails": "",
"Description": "Ensure RDS instances have minor version upgrade enabled.",
@@ -89,7 +95,15 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": ""
"Notes": "",
"Compliance": {
"CIS-1.4": [
"1.20"
],
"CIS-1.5": [
"1.20"
]
}
},{
"AssessmentStartTime": "2022-12-01T14:16:57.354413",
"FindingUniqueId": "",
@@ -109,7 +123,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceType": "AwsRdsDbInstance",
"ResourceDetails": "",
"ResourceTags": {},
"Description": "Ensure RDS instances have minor version upgrade enabled.",
"Risk": "Auto Minor Version Upgrade is a feature that you can enable to have your database automatically upgraded when a new minor database engine version is available. Minor version upgrades often patch security vulnerabilities and fix bugs and therefore should be applied.",
"RelatedUrl": "https://aws.amazon.com/blogs/database/best-practices-for-upgrading-amazon-rds-to-major-and-minor-versions-of-postgresql/",
@@ -126,7 +140,8 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": ""
"Notes": "",
"Compliance: {}
}]
```
@@ -166,7 +181,30 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": []
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
},
"Remediation": {
"Recommendation": {
@@ -205,7 +243,30 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": []
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
},
"Remediation": {
"Recommendation": {


@@ -37,6 +37,7 @@ nav:
- Logging: tutorials/logging.md
- Allowlist: tutorials/allowlist.md
- Pentesting: tutorials/pentesting.md
- Developer Guide: tutorials/developer-guide.md
- AWS:
- Assume Role: tutorials/aws/role-assumption.md
- AWS Security Hub: tutorials/aws/securityhub.md
@@ -50,6 +51,7 @@ nav:
- Azure:
- Authentication: tutorials/azure/authentication.md
- Subscriptions: tutorials/azure/subscriptions.md
- Developer Guide: tutorials/developer-guide.md
- Security: security.md
- Contact Us: contact.md
- Troubleshooting: troubleshooting.md

poetry.lock (generated, 388 changed lines)

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry and should not be changed by hand.
# This file is automatically @generated by Poetry 1.4.0 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -14,14 +14,14 @@ files = [
[[package]]
name = "alive-progress"
version = "3.0.1"
version = "3.1.0"
description = "A new kind of Progress Bar, with real-time throughput, ETA, and very cool animations!"
category = "main"
optional = false
python-versions = ">=3.7, <4"
files = [
{file = "alive-progress-3.0.1.tar.gz", hash = "sha256:3245114253b6adb4b38f2a2a1828edfcd9e8c012f7e30a5cef1932ca7344eb44"},
{file = "alive_progress-3.0.1-py3-none-any.whl", hash = "sha256:1e910ef0ceccfd2b09682f119330bd9a0bf2b4c4ddfa83de4d3d3741e104d98f"},
{file = "alive-progress-3.1.0.tar.gz", hash = "sha256:c076a076591ff926ac19941bd73065e298118b6d38900d2d6ff53d2e355be3c1"},
{file = "alive_progress-3.1.0-py3-none-any.whl", hash = "sha256:908ded254221e958fe2851bcbcd24451fb3820011d473db71f329c370a4018a2"},
]
[package.dependencies]
@@ -42,14 +42,14 @@ files = [
[[package]]
name = "astroid"
version = "2.15.0"
version = "2.15.2"
description = "An abstract syntax tree for Python with inference support."
category = "dev"
optional = false
python-versions = ">=3.7.2"
files = [
{file = "astroid-2.15.0-py3-none-any.whl", hash = "sha256:e3e4d0ffc2d15d954065579689c36aac57a339a4679a679579af6401db4d3fdb"},
{file = "astroid-2.15.0.tar.gz", hash = "sha256:525f126d5dc1b8b0b6ee398b33159105615d92dc4a17f2cd064125d57f6186fa"},
{file = "astroid-2.15.2-py3-none-any.whl", hash = "sha256:dea89d9f99f491c66ac9c04ebddf91e4acf8bd711722175fe6245c0725cc19bb"},
{file = "astroid-2.15.2.tar.gz", hash = "sha256:6e61b85c891ec53b07471aec5878f4ac6446a41e590ede0f2ce095f39f7d49dd"},
]
[package.dependencies]
@@ -316,14 +316,14 @@ crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
[[package]]
name = "botocore"
version = "1.29.90"
version = "1.29.105"
description = "Low-level, data-driven core of boto 3."
category = "main"
optional = false
python-versions = ">= 3.7"
files = [
{file = "botocore-1.29.90-py3-none-any.whl", hash = "sha256:1b8c1b8c366875e65d39237a296842b9c0ea33af2ba4a2771db2ba6aefa663ef"},
{file = "botocore-1.29.90.tar.gz", hash = "sha256:2dbbc2c7d93ddefcf9896268597212d446e5d416fbceb1b12c793660fa9f83f3"},
{file = "botocore-1.29.105-py3-none-any.whl", hash = "sha256:06a2838daad3f346cba5460d0d3deb198225b556ff9ca729798d787fadbdebde"},
{file = "botocore-1.29.105.tar.gz", hash = "sha256:17c82391dfd6aaa8f96fbbb08cad2c2431ef3cda0ece89e6e6ba444c5eed45c2"},
]
[package.dependencies]
@@ -631,35 +631,31 @@ toml = ["tomli"]
[[package]]
name = "cryptography"
version = "39.0.2"
version = "40.0.1"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
category = "main"
optional = false
python-versions = ">=3.6"
files = [
{file = "cryptography-39.0.2-cp36-abi3-macosx_10_12_universal2.whl", hash = "sha256:2725672bb53bb92dc7b4150d233cd4b8c59615cd8288d495eaa86db00d4e5c06"},
{file = "cryptography-39.0.2-cp36-abi3-macosx_10_12_x86_64.whl", hash = "sha256:23df8ca3f24699167daf3e23e51f7ba7334d504af63a94af468f468b975b7dd7"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:eb40fe69cfc6f5cdab9a5ebd022131ba21453cf7b8a7fd3631f45bbf52bed612"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc0521cce2c1d541634b19f3ac661d7a64f9555135e9d8af3980965be717fd4a"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ffd394c7896ed7821a6d13b24657c6a34b6e2650bd84ae063cf11ccffa4f1a97"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_24_x86_64.whl", hash = "sha256:e8a0772016feeb106efd28d4a328e77dc2edae84dfbac06061319fdb669ff828"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:8f35c17bd4faed2bc7797d2a66cbb4f986242ce2e30340ab832e5d99ae60e011"},
{file = "cryptography-39.0.2-cp36-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b49a88ff802e1993b7f749b1eeb31134f03c8d5c956e3c125c75558955cda536"},
{file = "cryptography-39.0.2-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:5f8c682e736513db7d04349b4f6693690170f95aac449c56f97415c6980edef5"},
{file = "cryptography-39.0.2-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:d7d84a512a59f4412ca8549b01f94be4161c94efc598bf09d027d67826beddc0"},
{file = "cryptography-39.0.2-cp36-abi3-win32.whl", hash = "sha256:c43ac224aabcbf83a947eeb8b17eaf1547bce3767ee2d70093b461f31729a480"},
{file = "cryptography-39.0.2-cp36-abi3-win_amd64.whl", hash = "sha256:788b3921d763ee35dfdb04248d0e3de11e3ca8eb22e2e48fef880c42e1f3c8f9"},
{file = "cryptography-39.0.2-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:d15809e0dbdad486f4ad0979753518f47980020b7a34e9fc56e8be4f60702fac"},
{file = "cryptography-39.0.2-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:50cadb9b2f961757e712a9737ef33d89b8190c3ea34d0fb6675e00edbe35d074"},
{file = "cryptography-39.0.2-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:103e8f7155f3ce2ffa0049fe60169878d47a4364b277906386f8de21c9234aa1"},
{file = "cryptography-39.0.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:6236a9610c912b129610eb1a274bdc1350b5df834d124fa84729ebeaf7da42c3"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:e944fe07b6f229f4c1a06a7ef906a19652bdd9fd54c761b0ff87e83ae7a30354"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:35d658536b0a4117c885728d1a7032bdc9a5974722ae298d6c533755a6ee3915"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:30b1d1bfd00f6fc80d11300a29f1d8ab2b8d9febb6ed4a38a76880ec564fae84"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:e029b844c21116564b8b61216befabca4b500e6816fa9f0ba49527653cae2108"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fa507318e427169ade4e9eccef39e9011cdc19534f55ca2f36ec3f388c1f70f3"},
{file = "cryptography-39.0.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:8bc0008ef798231fac03fe7d26e82d601d15bd16f3afaad1c6113771566570f3"},
{file = "cryptography-39.0.2.tar.gz", hash = "sha256:bc5b871e977c8ee5a1bbc42fa8d19bcc08baf0c51cbf1586b0e87a2694dde42f"},
{file = "cryptography-40.0.1-cp36-abi3-macosx_10_12_universal2.whl", hash = "sha256:918cb89086c7d98b1b86b9fdb70c712e5a9325ba6f7d7cfb509e784e0cfc6917"},
{file = "cryptography-40.0.1-cp36-abi3-macosx_10_12_x86_64.whl", hash = "sha256:9618a87212cb5200500e304e43691111570e1f10ec3f35569fdfcd17e28fd797"},
{file = "cryptography-40.0.1-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3a4805a4ca729d65570a1b7cac84eac1e431085d40387b7d3bbaa47e39890b88"},
{file = "cryptography-40.0.1-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63dac2d25c47f12a7b8aa60e528bfb3c51c5a6c5a9f7c86987909c6c79765554"},
{file = "cryptography-40.0.1-cp36-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:0a4e3406cfed6b1f6d6e87ed243363652b2586b2d917b0609ca4f97072994405"},
{file = "cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1e0af458515d5e4028aad75f3bb3fe7a31e46ad920648cd59b64d3da842e4356"},
{file = "cryptography-40.0.1-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:d8aa3609d337ad85e4eb9bb0f8bcf6e4409bfb86e706efa9a027912169e89122"},
{file = "cryptography-40.0.1-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:cf91e428c51ef692b82ce786583e214f58392399cf65c341bc7301d096fa3ba2"},
{file = "cryptography-40.0.1-cp36-abi3-win32.whl", hash = "sha256:650883cc064297ef3676b1db1b7b1df6081794c4ada96fa457253c4cc40f97db"},
{file = "cryptography-40.0.1-cp36-abi3-win_amd64.whl", hash = "sha256:a805a7bce4a77d51696410005b3e85ae2839bad9aa38894afc0aa99d8e0c3160"},
{file = "cryptography-40.0.1-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cd033d74067d8928ef00a6b1327c8ea0452523967ca4463666eeba65ca350d4c"},
{file = "cryptography-40.0.1-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d36bbeb99704aabefdca5aee4eba04455d7a27ceabd16f3b3ba9bdcc31da86c4"},
{file = "cryptography-40.0.1-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:32057d3d0ab7d4453778367ca43e99ddb711770477c4f072a51b3ca69602780a"},
{file = "cryptography-40.0.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:f5d7b79fa56bc29580faafc2ff736ce05ba31feaa9d4735048b0de7d9ceb2b94"},
{file = "cryptography-40.0.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7c872413353c70e0263a9368c4993710070e70ab3e5318d85510cc91cce77e7c"},
{file = "cryptography-40.0.1-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:28d63d75bf7ae4045b10de5413fb1d6338616e79015999ad9cf6fc538f772d41"},
{file = "cryptography-40.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:6f2bbd72f717ce33100e6467572abaedc61f1acb87b8d546001328d7f466b778"},
{file = "cryptography-40.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:cc3a621076d824d75ab1e1e530e66e7e8564e357dd723f2533225d40fe35c60c"},
{file = "cryptography-40.0.1.tar.gz", hash = "sha256:2803f2f8b1e95f614419926c7e6f55d828afc614ca5ed61543877ae668cc3472"},
]
[package.dependencies]
@@ -668,10 +664,10 @@ cffi = ">=1.12"
[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
docstest = ["pyenchant (>=1.6.11)", "sphinxcontrib-spelling (>=4.0.1)", "twine (>=1.12.0)"]
pep8test = ["black", "check-manifest", "mypy", "ruff", "types-pytz", "types-requests"]
pep8test = ["black", "check-manifest", "mypy", "ruff"]
sdist = ["setuptools-rust (>=0.11.4)"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["hypothesis (>=1.11.4,!=3.79.2)", "iso8601", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-shard (>=0.1.2)", "pytest-subtests", "pytest-xdist", "pytz"]
test = ["iso8601", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-shard (>=0.1.2)", "pytest-subtests", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
tox = ["tox"]
@@ -1237,14 +1233,14 @@ min-versions = ["babel (==2.9.0)", "click (==7.0)", "colorama (==0.4)", "ghp-imp
[[package]]
name = "mkdocs-material"
version = "9.1.3"
version = "9.1.5"
description = "Documentation that simply works"
category = "main"
optional = true
python-versions = ">=3.7"
files = [
{file = "mkdocs_material-9.1.3-py3-none-any.whl", hash = "sha256:a8d14d03569008afb0f5a5785c253249b5ff038e3a5509f96a393b8596bf5062"},
{file = "mkdocs_material-9.1.3.tar.gz", hash = "sha256:0be1b5d76c00efc9b2ecbd2d71014be950351e710f5947f276264878afc82ca0"},
{file = "mkdocs_material-9.1.5-py3-none-any.whl", hash = "sha256:981e1ef0250e2fbcc23610e9b20e5f242fe1f808079b9bcfbb6c49aa2999343c"},
{file = "mkdocs_material-9.1.5.tar.gz", hash = "sha256:744519bca52b1e8fe7c2e80e15ed59baf8948111ec763ae6ae629c409bd16d6e"},
]
[package.dependencies]
@@ -1289,14 +1285,14 @@ test = ["pytest", "pytest-cov"]
[[package]]
name = "moto"
version = "4.1.5"
version = "4.1.6"
description = ""
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "moto-4.1.5-py2.py3-none-any.whl", hash = "sha256:363577f7a0cdf639852420f6ba5caa9aa3c90a688feae6315f8ee4bf324b8c27"},
{file = "moto-4.1.5.tar.gz", hash = "sha256:63542b7b9f307b00fae460b42d15cf9346de3ad3b1287fba38fc68f3c05e4da4"},
{file = "moto-4.1.6-py2.py3-none-any.whl", hash = "sha256:cfe398a1f6e317d061c47c3d2dd8c6893f3eb49154984a7cbb8bcd4ba517d67d"},
{file = "moto-4.1.6.tar.gz", hash = "sha256:fdcc2731212ca050a28b2bc83e87628294bcbd55cb4f4c4692f972023fb1e7e6"},
]
[package.dependencies]
@@ -1311,13 +1307,13 @@ werkzeug = ">=0.5,<2.2.0 || >2.2.0,<2.2.1 || >2.2.1"
xmltodict = "*"
[package.extras]
all = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
all = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "py-partiql-parser (==0.1.0)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
apigateway = ["PyYAML (>=5.1)", "ecdsa (!=0.15)", "openapi-spec-validator (>=0.2.8)", "python-jose[cryptography] (>=3.1.0,<4.0.0)"]
apigatewayv2 = ["PyYAML (>=5.1)"]
appsync = ["graphql-core"]
awslambda = ["docker (>=3.0.0)"]
batch = ["docker (>=3.0.0)"]
cloudformation = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
cloudformation = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "py-partiql-parser (==0.1.0)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
cognitoidp = ["ecdsa (!=0.15)", "python-jose[cryptography] (>=3.1.0,<4.0.0)"]
ds = ["sshpubkeys (>=3.1.0)"]
dynamodb = ["docker (>=3.0.0)"]
@@ -1329,8 +1325,8 @@ eks = ["sshpubkeys (>=3.1.0)"]
glue = ["pyparsing (>=3.0.7)"]
iotdata = ["jsondiff (>=1.1.2)"]
route53resolver = ["sshpubkeys (>=3.1.0)"]
s3 = ["PyYAML (>=5.1)"]
server = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "flask (!=2.2.0,!=2.2.1)", "flask-cors", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
s3 = ["PyYAML (>=5.1)", "py-partiql-parser (==0.1.0)"]
server = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "ecdsa (!=0.15)", "flask (!=2.2.0,!=2.2.1)", "flask-cors", "graphql-core", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.2.8)", "py-partiql-parser (==0.1.0)", "pyparsing (>=3.0.7)", "python-jose[cryptography] (>=3.1.0,<4.0.0)", "setuptools", "sshpubkeys (>=3.1.0)"]
ssm = ["PyYAML (>=5.1)"]
xray = ["aws-xray-sdk (>=0.93,!=0.96)", "setuptools"]
@@ -1530,19 +1526,19 @@ files = [
[[package]]
name = "platformdirs"
version = "3.1.1"
version = "3.2.0"
description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "platformdirs-3.1.1-py3-none-any.whl", hash = "sha256:e5986afb596e4bb5bde29a79ac9061aa955b94fca2399b7aaac4090860920dd8"},
{file = "platformdirs-3.1.1.tar.gz", hash = "sha256:024996549ee88ec1a9aa99ff7f8fc819bb59e2c3477b410d90a16d32d6e707aa"},
{file = "platformdirs-3.2.0-py3-none-any.whl", hash = "sha256:ebe11c0d7a805086e99506aa331612429a72ca7cd52a1f0d277dc4adc20cb10e"},
{file = "platformdirs-3.2.0.tar.gz", hash = "sha256:d5b638ca397f25f979350ff789db335903d7ea010ab28903f57b27e1b16c2b08"},
]
[package.extras]
docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-autodoc-typehints (>=1.22,!=1.23.4)"]
test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.2.2)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
[[package]]
name = "pluggy"
@@ -1606,48 +1602,48 @@ files = [
[[package]]
name = "pydantic"
version = "1.10.6"
version = "1.10.7"
description = "Data validation and settings management using python type hints"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "pydantic-1.10.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f9289065611c48147c1dd1fd344e9d57ab45f1d99b0fb26c51f1cf72cd9bcd31"},
{file = "pydantic-1.10.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8c32b6bba301490d9bb2bf5f631907803135e8085b6aa3e5fe5a770d46dd0160"},
{file = "pydantic-1.10.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd9b9e98068fa1068edfc9eabde70a7132017bdd4f362f8b4fd0abed79c33083"},
{file = "pydantic-1.10.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c84583b9df62522829cbc46e2b22e0ec11445625b5acd70c5681ce09c9b11c4"},
{file = "pydantic-1.10.6-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:b41822064585fea56d0116aa431fbd5137ce69dfe837b599e310034171996084"},
{file = "pydantic-1.10.6-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:61f1f08adfaa9cc02e0cbc94f478140385cbd52d5b3c5a657c2fceb15de8d1fb"},
{file = "pydantic-1.10.6-cp310-cp310-win_amd64.whl", hash = "sha256:32937835e525d92c98a1512218db4eed9ddc8f4ee2a78382d77f54341972c0e7"},
{file = "pydantic-1.10.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:bbd5c531b22928e63d0cb1868dee76123456e1de2f1cb45879e9e7a3f3f1779b"},
{file = "pydantic-1.10.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e277bd18339177daa62a294256869bbe84df1fb592be2716ec62627bb8d7c81d"},
{file = "pydantic-1.10.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f15277d720aa57e173954d237628a8d304896364b9de745dcb722f584812c7"},
{file = "pydantic-1.10.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b243b564cea2576725e77aeeda54e3e0229a168bc587d536cd69941e6797543d"},
{file = "pydantic-1.10.6-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3ce13a558b484c9ae48a6a7c184b1ba0e5588c5525482681db418268e5f86186"},
{file = "pydantic-1.10.6-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3ac1cd4deed871dfe0c5f63721e29debf03e2deefa41b3ed5eb5f5df287c7b70"},
{file = "pydantic-1.10.6-cp311-cp311-win_amd64.whl", hash = "sha256:b1eb6610330a1dfba9ce142ada792f26bbef1255b75f538196a39e9e90388bf4"},
{file = "pydantic-1.10.6-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4ca83739c1263a044ec8b79df4eefc34bbac87191f0a513d00dd47d46e307a65"},
{file = "pydantic-1.10.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ea4e2a7cb409951988e79a469f609bba998a576e6d7b9791ae5d1e0619e1c0f2"},
{file = "pydantic-1.10.6-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:53de12b4608290992a943801d7756f18a37b7aee284b9ffa794ee8ea8153f8e2"},
{file = "pydantic-1.10.6-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:60184e80aac3b56933c71c48d6181e630b0fbc61ae455a63322a66a23c14731a"},
{file = "pydantic-1.10.6-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:415a3f719ce518e95a92effc7ee30118a25c3d032455d13e121e3840985f2efd"},
{file = "pydantic-1.10.6-cp37-cp37m-win_amd64.whl", hash = "sha256:72cb30894a34d3a7ab6d959b45a70abac8a2a93b6480fc5a7bfbd9c935bdc4fb"},
{file = "pydantic-1.10.6-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3091d2eaeda25391405e36c2fc2ed102b48bac4b384d42b2267310abae350ca6"},
{file = "pydantic-1.10.6-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:751f008cd2afe812a781fd6aa2fb66c620ca2e1a13b6a2152b1ad51553cb4b77"},
{file = "pydantic-1.10.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12e837fd320dd30bd625be1b101e3b62edc096a49835392dcf418f1a5ac2b832"},
{file = "pydantic-1.10.6-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:587d92831d0115874d766b1f5fddcdde0c5b6c60f8c6111a394078ec227fca6d"},
{file = "pydantic-1.10.6-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:476f6674303ae7965730a382a8e8d7fae18b8004b7b69a56c3d8fa93968aa21c"},
{file = "pydantic-1.10.6-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:3a2be0a0f32c83265fd71a45027201e1278beaa82ea88ea5b345eea6afa9ac7f"},
{file = "pydantic-1.10.6-cp38-cp38-win_amd64.whl", hash = "sha256:0abd9c60eee6201b853b6c4be104edfba4f8f6c5f3623f8e1dba90634d63eb35"},
{file = "pydantic-1.10.6-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6195ca908045054dd2d57eb9c39a5fe86409968b8040de8c2240186da0769da7"},
{file = "pydantic-1.10.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:43cdeca8d30de9a897440e3fb8866f827c4c31f6c73838e3a01a14b03b067b1d"},
{file = "pydantic-1.10.6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c19eb5163167489cb1e0161ae9220dadd4fc609a42649e7e84a8fa8fff7a80f"},
{file = "pydantic-1.10.6-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:012c99a9c0d18cfde7469aa1ebff922e24b0c706d03ead96940f5465f2c9cf62"},
{file = "pydantic-1.10.6-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:528dcf7ec49fb5a84bf6fe346c1cc3c55b0e7603c2123881996ca3ad79db5bfc"},
{file = "pydantic-1.10.6-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:163e79386c3547c49366e959d01e37fc30252285a70619ffc1b10ede4758250a"},
{file = "pydantic-1.10.6-cp39-cp39-win_amd64.whl", hash = "sha256:189318051c3d57821f7233ecc94708767dd67687a614a4e8f92b4a020d4ffd06"},
{file = "pydantic-1.10.6-py3-none-any.whl", hash = "sha256:acc6783751ac9c9bc4680379edd6d286468a1dc8d7d9906cd6f1186ed682b2b0"},
{file = "pydantic-1.10.6.tar.gz", hash = "sha256:cf95adb0d1671fc38d8c43dd921ad5814a735e7d9b4d9e437c088002863854fd"},
{file = "pydantic-1.10.7-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e79e999e539872e903767c417c897e729e015872040e56b96e67968c3b918b2d"},
{file = "pydantic-1.10.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:01aea3a42c13f2602b7ecbbea484a98169fb568ebd9e247593ea05f01b884b2e"},
{file = "pydantic-1.10.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:516f1ed9bc2406a0467dd777afc636c7091d71f214d5e413d64fef45174cfc7a"},
{file = "pydantic-1.10.7-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae150a63564929c675d7f2303008d88426a0add46efd76c3fc797cd71cb1b46f"},
{file = "pydantic-1.10.7-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ecbbc51391248116c0a055899e6c3e7ffbb11fb5e2a4cd6f2d0b93272118a209"},
{file = "pydantic-1.10.7-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f4a2b50e2b03d5776e7f21af73e2070e1b5c0d0df255a827e7c632962f8315af"},
{file = "pydantic-1.10.7-cp310-cp310-win_amd64.whl", hash = "sha256:a7cd2251439988b413cb0a985c4ed82b6c6aac382dbaff53ae03c4b23a70e80a"},
{file = "pydantic-1.10.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:68792151e174a4aa9e9fc1b4e653e65a354a2fa0fed169f7b3d09902ad2cb6f1"},
{file = "pydantic-1.10.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe2507b8ef209da71b6fb5f4e597b50c5a34b78d7e857c4f8f3115effaef5fe"},
{file = "pydantic-1.10.7-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10a86d8c8db68086f1e30a530f7d5f83eb0685e632e411dbbcf2d5c0150e8dcd"},
{file = "pydantic-1.10.7-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d75ae19d2a3dbb146b6f324031c24f8a3f52ff5d6a9f22f0683694b3afcb16fb"},
{file = "pydantic-1.10.7-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:464855a7ff7f2cc2cf537ecc421291b9132aa9c79aef44e917ad711b4a93163b"},
{file = "pydantic-1.10.7-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:193924c563fae6ddcb71d3f06fa153866423ac1b793a47936656e806b64e24ca"},
{file = "pydantic-1.10.7-cp311-cp311-win_amd64.whl", hash = "sha256:b4a849d10f211389502059c33332e91327bc154acc1845f375a99eca3afa802d"},
{file = "pydantic-1.10.7-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:cc1dde4e50a5fc1336ee0581c1612215bc64ed6d28d2c7c6f25d2fe3e7c3e918"},
{file = "pydantic-1.10.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e0cfe895a504c060e5d36b287ee696e2fdad02d89e0d895f83037245218a87fe"},
{file = "pydantic-1.10.7-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:670bb4683ad1e48b0ecb06f0cfe2178dcf74ff27921cdf1606e527d2617a81ee"},
{file = "pydantic-1.10.7-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:950ce33857841f9a337ce07ddf46bc84e1c4946d2a3bba18f8280297157a3fd1"},
{file = "pydantic-1.10.7-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:c15582f9055fbc1bfe50266a19771bbbef33dd28c45e78afbe1996fd70966c2a"},
{file = "pydantic-1.10.7-cp37-cp37m-win_amd64.whl", hash = "sha256:82dffb306dd20bd5268fd6379bc4bfe75242a9c2b79fec58e1041fbbdb1f7914"},
{file = "pydantic-1.10.7-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8c7f51861d73e8b9ddcb9916ae7ac39fb52761d9ea0df41128e81e2ba42886cd"},
{file = "pydantic-1.10.7-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6434b49c0b03a51021ade5c4daa7d70c98f7a79e95b551201fff682fc1661245"},
{file = "pydantic-1.10.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64d34ab766fa056df49013bb6e79921a0265204c071984e75a09cbceacbbdd5d"},
{file = "pydantic-1.10.7-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:701daea9ffe9d26f97b52f1d157e0d4121644f0fcf80b443248434958fd03dc3"},
{file = "pydantic-1.10.7-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:cf135c46099ff3f919d2150a948ce94b9ce545598ef2c6c7bf55dca98a304b52"},
{file = "pydantic-1.10.7-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:b0f85904f73161817b80781cc150f8b906d521fa11e3cdabae19a581c3606209"},
{file = "pydantic-1.10.7-cp38-cp38-win_amd64.whl", hash = "sha256:9f6f0fd68d73257ad6685419478c5aece46432f4bdd8d32c7345f1986496171e"},
{file = "pydantic-1.10.7-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c230c0d8a322276d6e7b88c3f7ce885f9ed16e0910354510e0bae84d54991143"},
{file = "pydantic-1.10.7-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:976cae77ba6a49d80f461fd8bba183ff7ba79f44aa5cfa82f1346b5626542f8e"},
{file = "pydantic-1.10.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d45fc99d64af9aaf7e308054a0067fdcd87ffe974f2442312372dfa66e1001d"},
{file = "pydantic-1.10.7-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d2a5ebb48958754d386195fe9e9c5106f11275867051bf017a8059410e9abf1f"},
{file = "pydantic-1.10.7-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:abfb7d4a7cd5cc4e1d1887c43503a7c5dd608eadf8bc615413fc498d3e4645cd"},
{file = "pydantic-1.10.7-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:80b1fab4deb08a8292d15e43a6edccdffa5377a36a4597bb545b93e79c5ff0a5"},
{file = "pydantic-1.10.7-cp39-cp39-win_amd64.whl", hash = "sha256:d71e69699498b020ea198468e2480a2f1e7433e32a3a99760058c6520e2bea7e"},
{file = "pydantic-1.10.7-py3-none-any.whl", hash = "sha256:0cd181f1d0b1d00e2b705f1bf1ac7799a2d938cce3376b8007df62b29be3c2c6"},
{file = "pydantic-1.10.7.tar.gz", hash = "sha256:cfc83c0678b6ba51b0532bea66860617c4cd4251ecf76e9846fa5a9f3454e97e"},
]
[package.dependencies]
@@ -1707,18 +1703,18 @@ tests = ["coverage[toml] (==5.0.4)", "pytest (>=6.0.0,<7.0.0)"]
[[package]]
name = "pylint"
version = "2.17.0"
version = "2.17.2"
description = "python code static checker"
category = "dev"
optional = false
python-versions = ">=3.7.2"
files = [
{file = "pylint-2.17.0-py3-none-any.whl", hash = "sha256:e097d8325f8c88e14ad12844e3fe2d963d3de871ea9a8f8ad25ab1c109889ddc"},
{file = "pylint-2.17.0.tar.gz", hash = "sha256:1460829b6397cb5eb0cdb0b4fc4b556348e515cdca32115f74a1eb7c20b896b4"},
{file = "pylint-2.17.2-py3-none-any.whl", hash = "sha256:001cc91366a7df2970941d7e6bbefcbf98694e00102c1f121c531a814ddc2ea8"},
{file = "pylint-2.17.2.tar.gz", hash = "sha256:1b647da5249e7c279118f657ca28b6aaebb299f86bf92affc632acf199f7adbb"},
]
[package.dependencies]
astroid = ">=2.15.0,<=2.17.0-dev0"
astroid = ">=2.15.2,<=2.17.0-dev0"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
dill = [
{version = ">=0.2", markers = "python_version < \"3.11\""},
@@ -1865,26 +1861,26 @@ six = ">=1.5"
[[package]]
name = "pywin32"
version = "305"
version = "306"
description = "Python for Window Extensions"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "pywin32-305-cp310-cp310-win32.whl", hash = "sha256:421f6cd86e84bbb696d54563c48014b12a23ef95a14e0bdba526be756d89f116"},
{file = "pywin32-305-cp310-cp310-win_amd64.whl", hash = "sha256:73e819c6bed89f44ff1d690498c0a811948f73777e5f97c494c152b850fad478"},
{file = "pywin32-305-cp310-cp310-win_arm64.whl", hash = "sha256:742eb905ce2187133a29365b428e6c3b9001d79accdc30aa8969afba1d8470f4"},
{file = "pywin32-305-cp311-cp311-win32.whl", hash = "sha256:19ca459cd2e66c0e2cc9a09d589f71d827f26d47fe4a9d09175f6aa0256b51c2"},
{file = "pywin32-305-cp311-cp311-win_amd64.whl", hash = "sha256:326f42ab4cfff56e77e3e595aeaf6c216712bbdd91e464d167c6434b28d65990"},
{file = "pywin32-305-cp311-cp311-win_arm64.whl", hash = "sha256:4ecd404b2c6eceaca52f8b2e3e91b2187850a1ad3f8b746d0796a98b4cea04db"},
{file = "pywin32-305-cp36-cp36m-win32.whl", hash = "sha256:48d8b1659284f3c17b68587af047d110d8c44837736b8932c034091683e05863"},
{file = "pywin32-305-cp36-cp36m-win_amd64.whl", hash = "sha256:13362cc5aa93c2beaf489c9c9017c793722aeb56d3e5166dadd5ef82da021fe1"},
{file = "pywin32-305-cp37-cp37m-win32.whl", hash = "sha256:a55db448124d1c1484df22fa8bbcbc45c64da5e6eae74ab095b9ea62e6d00496"},
{file = "pywin32-305-cp37-cp37m-win_amd64.whl", hash = "sha256:109f98980bfb27e78f4df8a51a8198e10b0f347257d1e265bb1a32993d0c973d"},
{file = "pywin32-305-cp38-cp38-win32.whl", hash = "sha256:9dd98384da775afa009bc04863426cb30596fd78c6f8e4e2e5bbf4edf8029504"},
{file = "pywin32-305-cp38-cp38-win_amd64.whl", hash = "sha256:56d7a9c6e1a6835f521788f53b5af7912090674bb84ef5611663ee1595860fc7"},
{file = "pywin32-305-cp39-cp39-win32.whl", hash = "sha256:9d968c677ac4d5cbdaa62fd3014ab241718e619d8e36ef8e11fb930515a1e918"},
{file = "pywin32-305-cp39-cp39-win_amd64.whl", hash = "sha256:50768c6b7c3f0b38b7fb14dd4104da93ebced5f1a50dc0e834594bff6fbe1271"},
{file = "pywin32-306-cp310-cp310-win32.whl", hash = "sha256:06d3420a5155ba65f0b72f2699b5bacf3109f36acbe8923765c22938a69dfc8d"},
{file = "pywin32-306-cp310-cp310-win_amd64.whl", hash = "sha256:84f4471dbca1887ea3803d8848a1616429ac94a4a8d05f4bc9c5dcfd42ca99c8"},
{file = "pywin32-306-cp311-cp311-win32.whl", hash = "sha256:e65028133d15b64d2ed8f06dd9fbc268352478d4f9289e69c190ecd6818b6407"},
{file = "pywin32-306-cp311-cp311-win_amd64.whl", hash = "sha256:a7639f51c184c0272e93f244eb24dafca9b1855707d94c192d4a0b4c01e1100e"},
{file = "pywin32-306-cp311-cp311-win_arm64.whl", hash = "sha256:70dba0c913d19f942a2db25217d9a1b726c278f483a919f1abfed79c9cf64d3a"},
{file = "pywin32-306-cp312-cp312-win32.whl", hash = "sha256:383229d515657f4e3ed1343da8be101000562bf514591ff383ae940cad65458b"},
{file = "pywin32-306-cp312-cp312-win_amd64.whl", hash = "sha256:37257794c1ad39ee9be652da0462dc2e394c8159dfd913a8a4e8eb6fd346da0e"},
{file = "pywin32-306-cp312-cp312-win_arm64.whl", hash = "sha256:5821ec52f6d321aa59e2db7e0a35b997de60c201943557d108af9d4ae1ec7040"},
{file = "pywin32-306-cp37-cp37m-win32.whl", hash = "sha256:1c73ea9a0d2283d889001998059f5eaaba3b6238f767c9cf2833b13e6a685f65"},
{file = "pywin32-306-cp37-cp37m-win_amd64.whl", hash = "sha256:72c5f621542d7bdd4fdb716227be0dd3f8565c11b280be6315b06ace35487d36"},
{file = "pywin32-306-cp38-cp38-win32.whl", hash = "sha256:e4c092e2589b5cf0d365849e73e02c391c1349958c5ac3e9d5ccb9a28e017b3a"},
{file = "pywin32-306-cp38-cp38-win_amd64.whl", hash = "sha256:e8ac1ae3601bee6ca9f7cb4b5363bf1c0badb935ef243c4733ff9a393b1690c0"},
{file = "pywin32-306-cp39-cp39-win32.whl", hash = "sha256:e25fd5b485b55ac9c057f67d94bc203f3f6595078d1fb3b458c9c28b7153a802"},
{file = "pywin32-306-cp39-cp39-win_amd64.whl", hash = "sha256:39b61c15272833b5c329a2989999dcae836b1eed650252ab1b7bfbe1d59f30f4"},
]
[[package]]
@@ -1954,100 +1950,72 @@ pyyaml = "*"
[[package]]
name = "regex"
version = "2022.10.31"
version = "2023.3.23"
description = "Alternative regular expression module, to replace re."
category = "main"
optional = true
python-versions = ">=3.6"
python-versions = ">=3.8"
files = [
{file = "regex-2022.10.31-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a8ff454ef0bb061e37df03557afda9d785c905dab15584860f982e88be73015f"},
{file = "regex-2022.10.31-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1eba476b1b242620c266edf6325b443a2e22b633217a9835a52d8da2b5c051f9"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0e5af9a9effb88535a472e19169e09ce750c3d442fb222254a276d77808620b"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d03fe67b2325cb3f09be029fd5da8df9e6974f0cde2c2ac6a79d2634e791dd57"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9d0b68ac1743964755ae2d89772c7e6fb0118acd4d0b7464eaf3921c6b49dd4"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a45b6514861916c429e6059a55cf7db74670eaed2052a648e3e4d04f070e001"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8b0886885f7323beea6f552c28bff62cbe0983b9fbb94126531693ea6c5ebb90"},
{file = "regex-2022.10.31-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:5aefb84a301327ad115e9d346c8e2760009131d9d4b4c6b213648d02e2abe144"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:702d8fc6f25bbf412ee706bd73019da5e44a8400861dfff7ff31eb5b4a1276dc"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:a3c1ebd4ed8e76e886507c9eddb1a891673686c813adf889b864a17fafcf6d66"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:50921c140561d3db2ab9f5b11c5184846cde686bb5a9dc64cae442926e86f3af"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:7db345956ecce0c99b97b042b4ca7326feeec6b75facd8390af73b18e2650ffc"},
{file = "regex-2022.10.31-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:763b64853b0a8f4f9cfb41a76a4a85a9bcda7fdda5cb057016e7706fde928e66"},
{file = "regex-2022.10.31-cp310-cp310-win32.whl", hash = "sha256:44136355e2f5e06bf6b23d337a75386371ba742ffa771440b85bed367c1318d1"},
{file = "regex-2022.10.31-cp310-cp310-win_amd64.whl", hash = "sha256:bfff48c7bd23c6e2aec6454aaf6edc44444b229e94743b34bdcdda2e35126cf5"},
{file = "regex-2022.10.31-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4b4b1fe58cd102d75ef0552cf17242705ce0759f9695334a56644ad2d83903fe"},
{file = "regex-2022.10.31-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:542e3e306d1669b25936b64917285cdffcd4f5c6f0247636fec037187bd93542"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c27cc1e4b197092e50ddbf0118c788d9977f3f8f35bfbbd3e76c1846a3443df7"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b8e38472739028e5f2c3a4aded0ab7eadc447f0d84f310c7a8bb697ec417229e"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:76c598ca73ec73a2f568e2a72ba46c3b6c8690ad9a07092b18e48ceb936e9f0c"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c28d3309ebd6d6b2cf82969b5179bed5fefe6142c70f354ece94324fa11bf6a1"},
{file = "regex-2022.10.31-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9af69f6746120998cd9c355e9c3c6aec7dff70d47247188feb4f829502be8ab4"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a5f9505efd574d1e5b4a76ac9dd92a12acb2b309551e9aa874c13c11caefbe4f"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:5ff525698de226c0ca743bfa71fc6b378cda2ddcf0d22d7c37b1cc925c9650a5"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:4fe7fda2fe7c8890d454f2cbc91d6c01baf206fbc96d89a80241a02985118c0c"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:2cdc55ca07b4e70dda898d2ab7150ecf17c990076d3acd7a5f3b25cb23a69f1c"},
{file = "regex-2022.10.31-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:44a6c2f6374e0033873e9ed577a54a3602b4f609867794c1a3ebba65e4c93ee7"},
{file = "regex-2022.10.31-cp311-cp311-win32.whl", hash = "sha256:d8716f82502997b3d0895d1c64c3b834181b1eaca28f3f6336a71777e437c2af"},
{file = "regex-2022.10.31-cp311-cp311-win_amd64.whl", hash = "sha256:61edbca89aa3f5ef7ecac8c23d975fe7261c12665f1d90a6b1af527bba86ce61"},
{file = "regex-2022.10.31-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:0a069c8483466806ab94ea9068c34b200b8bfc66b6762f45a831c4baaa9e8cdd"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d26166acf62f731f50bdd885b04b38828436d74e8e362bfcb8df221d868b5d9b"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac741bf78b9bb432e2d314439275235f41656e189856b11fb4e774d9f7246d81"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75f591b2055523fc02a4bbe598aa867df9e953255f0b7f7715d2a36a9c30065c"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b30bddd61d2a3261f025ad0f9ee2586988c6a00c780a2fb0a92cea2aa702c54"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef4163770525257876f10e8ece1cf25b71468316f61451ded1a6f44273eedeb5"},
{file = "regex-2022.10.31-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:7b280948d00bd3973c1998f92e22aa3ecb76682e3a4255f33e1020bd32adf443"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:d0213671691e341f6849bf33cd9fad21f7b1cb88b89e024f33370733fec58742"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:22e7ebc231d28393dfdc19b185d97e14a0f178bedd78e85aad660e93b646604e"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:8ad241da7fac963d7573cc67a064c57c58766b62a9a20c452ca1f21050868dfa"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:586b36ebda81e6c1a9c5a5d0bfdc236399ba6595e1397842fd4a45648c30f35e"},
{file = "regex-2022.10.31-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:0653d012b3bf45f194e5e6a41df9258811ac8fc395579fa82958a8b76286bea4"},
{file = "regex-2022.10.31-cp36-cp36m-win32.whl", hash = "sha256:144486e029793a733e43b2e37df16a16df4ceb62102636ff3db6033994711066"},
{file = "regex-2022.10.31-cp36-cp36m-win_amd64.whl", hash = "sha256:c14b63c9d7bab795d17392c7c1f9aaabbffd4cf4387725a0ac69109fb3b550c6"},
{file = "regex-2022.10.31-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:4cac3405d8dda8bc6ed499557625585544dd5cbf32072dcc72b5a176cb1271c8"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23cbb932cc53a86ebde0fb72e7e645f9a5eec1a5af7aa9ce333e46286caef783"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74bcab50a13960f2a610cdcd066e25f1fd59e23b69637c92ad470784a51b1347"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78d680ef3e4d405f36f0d6d1ea54e740366f061645930072d39bca16a10d8c93"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce6910b56b700bea7be82c54ddf2e0ed792a577dfaa4a76b9af07d550af435c6"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:659175b2144d199560d99a8d13b2228b85e6019b6e09e556209dfb8c37b78a11"},
{file = "regex-2022.10.31-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1ddf14031a3882f684b8642cb74eea3af93a2be68893901b2b387c5fd92a03ec"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b683e5fd7f74fb66e89a1ed16076dbab3f8e9f34c18b1979ded614fe10cdc4d9"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:2bde29cc44fa81c0a0c8686992c3080b37c488df167a371500b2a43ce9f026d1"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:4919899577ba37f505aaebdf6e7dc812d55e8f097331312db7f1aab18767cce8"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:9c94f7cc91ab16b36ba5ce476f1904c91d6c92441f01cd61a8e2729442d6fcf5"},
{file = "regex-2022.10.31-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:ae1e96785696b543394a4e3f15f3f225d44f3c55dafe3f206493031419fedf95"},
{file = "regex-2022.10.31-cp37-cp37m-win32.whl", hash = "sha256:c670f4773f2f6f1957ff8a3962c7dd12e4be54d05839b216cb7fd70b5a1df394"},
{file = "regex-2022.10.31-cp37-cp37m-win_amd64.whl", hash = "sha256:8e0caeff18b96ea90fc0eb6e3bdb2b10ab5b01a95128dfeccb64a7238decf5f0"},
{file = "regex-2022.10.31-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:131d4be09bea7ce2577f9623e415cab287a3c8e0624f778c1d955ec7c281bd4d"},
{file = "regex-2022.10.31-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e613a98ead2005c4ce037c7b061f2409a1a4e45099edb0ef3200ee26ed2a69a8"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:052b670fafbe30966bbe5d025e90b2a491f85dfe5b2583a163b5e60a85a321ad"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aa62a07ac93b7cb6b7d0389d8ef57ffc321d78f60c037b19dfa78d6b17c928ee"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5352bea8a8f84b89d45ccc503f390a6be77917932b1c98c4cdc3565137acc714"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20f61c9944f0be2dc2b75689ba409938c14876c19d02f7585af4460b6a21403e"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:29c04741b9ae13d1e94cf93fca257730b97ce6ea64cfe1eba11cf9ac4e85afb6"},
{file = "regex-2022.10.31-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:543883e3496c8b6d58bd036c99486c3c8387c2fc01f7a342b760c1ea3158a318"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b7a8b43ee64ca8f4befa2bea4083f7c52c92864d8518244bfa6e88c751fa8fff"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:6a9a19bea8495bb419dc5d38c4519567781cd8d571c72efc6aa959473d10221a"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:6ffd55b5aedc6f25fd8d9f905c9376ca44fcf768673ffb9d160dd6f409bfda73"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:4bdd56ee719a8f751cf5a593476a441c4e56c9b64dc1f0f30902858c4ef8771d"},
{file = "regex-2022.10.31-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8ca88da1bd78990b536c4a7765f719803eb4f8f9971cc22d6ca965c10a7f2c4c"},
{file = "regex-2022.10.31-cp38-cp38-win32.whl", hash = "sha256:5a260758454580f11dd8743fa98319bb046037dfab4f7828008909d0aa5292bc"},
{file = "regex-2022.10.31-cp38-cp38-win_amd64.whl", hash = "sha256:5e6a5567078b3eaed93558842346c9d678e116ab0135e22eb72db8325e90b453"},
{file = "regex-2022.10.31-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5217c25229b6a85049416a5c1e6451e9060a1edcf988641e309dbe3ab26d3e49"},
{file = "regex-2022.10.31-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4bf41b8b0a80708f7e0384519795e80dcb44d7199a35d52c15cc674d10b3081b"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cf0da36a212978be2c2e2e2d04bdff46f850108fccc1851332bcae51c8907cc"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d403d781b0e06d2922435ce3b8d2376579f0c217ae491e273bab8d092727d244"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a37d51fa9a00d265cf73f3de3930fa9c41548177ba4f0faf76e61d512c774690"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4f781ffedd17b0b834c8731b75cce2639d5a8afe961c1e58ee7f1f20b3af185"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d243b36fbf3d73c25e48014961e83c19c9cc92530516ce3c43050ea6276a2ab7"},
{file = "regex-2022.10.31-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:370f6e97d02bf2dd20d7468ce4f38e173a124e769762d00beadec3bc2f4b3bc4"},
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:597f899f4ed42a38df7b0e46714880fb4e19a25c2f66e5c908805466721760f5"},
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7dbdce0c534bbf52274b94768b3498abdf675a691fec5f751b6057b3030f34c1"},
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:22960019a842777a9fa5134c2364efaed5fbf9610ddc5c904bd3a400973b0eb8"},
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:7f5a3ffc731494f1a57bd91c47dc483a1e10048131ffb52d901bfe2beb6102e8"},
{file = "regex-2022.10.31-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:7ef6b5942e6bfc5706301a18a62300c60db9af7f6368042227ccb7eeb22d0892"},
{file = "regex-2022.10.31-cp39-cp39-win32.whl", hash = "sha256:395161bbdbd04a8333b9ff9763a05e9ceb4fe210e3c7690f5e68cedd3d65d8e1"},
{file = "regex-2022.10.31-cp39-cp39-win_amd64.whl", hash = "sha256:957403a978e10fb3ca42572a23e6f7badff39aa1ce2f4ade68ee452dc6807692"},
{file = "regex-2022.10.31.tar.gz", hash = "sha256:a3a98921da9a1bf8457aeee6a551948a83601689e5ecdd736894ea9bbec77e83"},
{file = "regex-2023.3.23-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:845a5e2d84389c4ddada1a9b95c055320070f18bb76512608374aca00d22eca8"},
{file = "regex-2023.3.23-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:87d9951f5a538dd1d016bdc0dcae59241d15fa94860964833a54d18197fcd134"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:37ae17d3be44c0b3f782c28ae9edd8b47c1f1776d4cabe87edc0b98e1f12b021"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0b8eb1e3bca6b48dc721818a60ae83b8264d4089a4a41d62be6d05316ec38e15"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:df45fac182ebc3c494460c644e853515cc24f5ad9da05f8ffb91da891bfee879"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b7006105b10b59971d3b248ad75acc3651c7e4cf54d81694df5a5130a3c3f7ea"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:93f3f1aa608380fe294aa4cb82e2afda07a7598e828d0341e124b8fd9327c715"},
{file = "regex-2023.3.23-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:787954f541ab95d8195d97b0b8cf1dc304424adb1e07365967e656b92b38a699"},
{file = "regex-2023.3.23-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:20abe0bdf03630fe92ccafc45a599bca8b3501f48d1de4f7d121153350a2f77d"},
{file = "regex-2023.3.23-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:11d00c31aeab9a6e0503bc77e73ed9f4527b3984279d997eb145d7c7be6268fd"},
{file = "regex-2023.3.23-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:d5bbe0e1511b844794a3be43d6c145001626ba9a6c1db8f84bdc724e91131d9d"},
{file = "regex-2023.3.23-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:ea3c0cb56eadbf4ab2277e7a095676370b3e46dbfc74d5c383bd87b0d6317910"},
{file = "regex-2023.3.23-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d895b4c863059a4934d3e874b90998df774644a41b349ebb330f85f11b4ef2c0"},
{file = "regex-2023.3.23-cp310-cp310-win32.whl", hash = "sha256:9d764514d19b4edcc75fd8cb1423448ef393e8b6cbd94f38cab983ab1b75855d"},
{file = "regex-2023.3.23-cp310-cp310-win_amd64.whl", hash = "sha256:11d1f2b7a0696dc0310de0efb51b1f4d813ad4401fe368e83c0c62f344429f98"},
{file = "regex-2023.3.23-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8a9c63cde0eaa345795c0fdeb19dc62d22e378c50b0bc67bf4667cd5b482d98b"},
{file = "regex-2023.3.23-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dd7200b4c27b68cf9c9646da01647141c6db09f48cc5b51bc588deaf8e98a797"},
{file = "regex-2023.3.23-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22720024b90a6ba673a725dcc62e10fb1111b889305d7c6b887ac7466b74bedb"},
{file = "regex-2023.3.23-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6b190a339090e6af25f4a5fd9e77591f6d911cc7b96ecbb2114890b061be0ac1"},
{file = "regex-2023.3.23-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e76b6fc0d8e9efa39100369a9b3379ce35e20f6c75365653cf58d282ad290f6f"},
{file = "regex-2023.3.23-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7868b8f218bf69a2a15402fde08b08712213a1f4b85a156d90473a6fb6b12b09"},
{file = "regex-2023.3.23-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2472428efc4127374f494e570e36b30bb5e6b37d9a754f7667f7073e43b0abdd"},
{file = "regex-2023.3.23-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:c37df2a060cb476d94c047b18572ee2b37c31f831df126c0da3cd9227b39253d"},
{file = "regex-2023.3.23-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:4479f9e2abc03362df4045b1332d4a2b7885b245a30d4f4b051c4083b97d95d8"},
{file = "regex-2023.3.23-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:e2396e0678167f2d0c197da942b0b3fb48fee2f0b5915a0feb84d11b6686afe6"},
{file = "regex-2023.3.23-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:75f288c60232a5339e0ff2fa05779a5e9c74e9fc085c81e931d4a264501e745b"},
{file = "regex-2023.3.23-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c869260aa62cee21c5eb171a466c0572b5e809213612ef8d495268cd2e34f20d"},
{file = "regex-2023.3.23-cp311-cp311-win32.whl", hash = "sha256:25f0532fd0c53e96bad84664171969de9673b4131f2297f1db850d3918d58858"},
{file = "regex-2023.3.23-cp311-cp311-win_amd64.whl", hash = "sha256:5ccfafd98473e007cebf7da10c1411035b7844f0f204015efd050601906dbb53"},
{file = "regex-2023.3.23-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6572ff287176c0fb96568adb292674b421fa762153ed074d94b1d939ed92c253"},
{file = "regex-2023.3.23-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a610e0adfcb0fc84ea25f6ea685e39e74cbcd9245a72a9a7aab85ff755a5ed27"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:086afe222d58b88b62847bdbd92079b4699350b4acab892f88a935db5707c790"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:79e29fd62fa2f597a6754b247356bda14b866131a22444d67f907d6d341e10f3"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c07ce8e9eee878a48ebeb32ee661b49504b85e164b05bebf25420705709fdd31"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:86b036f401895e854de9fefe061518e78d506d8a919cc250dc3416bca03f6f9a"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:78ac8dd8e18800bb1f97aad0d73f68916592dddf233b99d2b5cabc562088503a"},
{file = "regex-2023.3.23-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:539dd010dc35af935b32f248099e38447bbffc10b59c2b542bceead2bed5c325"},
{file = "regex-2023.3.23-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9bf4a5626f2a0ea006bf81e8963f498a57a47d58907eaa58f4b3e13be68759d8"},
{file = "regex-2023.3.23-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:cf86b4328c204c3f315074a61bc1c06f8a75a8e102359f18ce99fbcbbf1951f0"},
{file = "regex-2023.3.23-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:2848bf76673c83314068241c8d5b7fa9ad9bed866c979875a0e84039349e8fa7"},
{file = "regex-2023.3.23-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:c125a02d22c555e68f7433bac8449992fa1cead525399f14e47c2d98f2f0e467"},
{file = "regex-2023.3.23-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:cd1671e9d5ac05ce6aa86874dd8dfa048824d1dbe73060851b310c6c1a201a96"},
{file = "regex-2023.3.23-cp38-cp38-win32.whl", hash = "sha256:fffe57312a358be6ec6baeb43d253c36e5790e436b7bf5b7a38df360363e88e9"},
{file = "regex-2023.3.23-cp38-cp38-win_amd64.whl", hash = "sha256:dbb3f87e15d3dd76996d604af8678316ad2d7d20faa394e92d9394dfd621fd0c"},
{file = "regex-2023.3.23-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c88e8c226473b5549fe9616980ea7ca09289246cfbdf469241edf4741a620004"},
{file = "regex-2023.3.23-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6560776ec19c83f3645bbc5db64a7a5816c9d8fb7ed7201c5bcd269323d88072"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1b1fc2632c01f42e06173d8dd9bb2e74ab9b0afa1d698058c867288d2c7a31f3"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fdf7ad455f1916b8ea5cdbc482d379f6daf93f3867b4232d14699867a5a13af7"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5fc33b27b1d800fc5b78d7f7d0f287e35079ecabe68e83d46930cf45690e1c8c"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4c49552dc938e3588f63f8a78c86f3c9c75301e813bca0bef13bdb4b87ccf364"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e152461e9a0aedec7d37fc66ec0fa635eca984777d3d3c3e36f53bf3d3ceb16e"},
{file = "regex-2023.3.23-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:db034255e72d2995cf581b14bb3fc9c00bdbe6822b49fcd4eef79e1d5f232618"},
{file = "regex-2023.3.23-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:55ae114da21b7a790b90255ea52d2aa3a0d121a646deb2d3c6a3194e722fc762"},
{file = "regex-2023.3.23-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:ef3f528fe1cc3d139508fe1b22523745aa77b9d6cb5b0bf277f48788ee0b993f"},
{file = "regex-2023.3.23-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:a81c9ec59ca2303acd1ccd7b9ac409f1e478e40e96f8f79b943be476c5fdb8bb"},
{file = "regex-2023.3.23-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:cde09c4fdd070772aa2596d97e942eb775a478b32459e042e1be71b739d08b77"},
{file = "regex-2023.3.23-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3cd9f5dd7b821f141d3a6ca0d5d9359b9221e4f051ca3139320adea9f1679691"},
{file = "regex-2023.3.23-cp39-cp39-win32.whl", hash = "sha256:7304863f3a652dab5e68e6fb1725d05ebab36ec0390676d1736e0571ebb713ef"},
{file = "regex-2023.3.23-cp39-cp39-win_amd64.whl", hash = "sha256:54c3fa855a3f7438149de3211738dd9b5f0c733f48b54ae05aa7fce83d48d858"},
{file = "regex-2023.3.23.tar.gz", hash = "sha256:dc80df325b43ffea5cdea2e3eaa97a44f3dd298262b1c7fe9dbb2a9522b956a7"},
]
[[package]]
@@ -2129,14 +2097,14 @@ six = "*"
[[package]]
name = "rich"
version = "13.3.2"
version = "13.3.3"
description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
category = "dev"
optional = false
python-versions = ">=3.7.0"
files = [
{file = "rich-13.3.2-py3-none-any.whl", hash = "sha256:a104f37270bf677148d8acb07d33be1569eeee87e2d1beb286a4e9113caf6f2f"},
{file = "rich-13.3.2.tar.gz", hash = "sha256:91954fe80cfb7985727a467ca98a7618e5dd15178cc2da10f553b36a93859001"},
{file = "rich-13.3.3-py3-none-any.whl", hash = "sha256:540c7d6d26a1178e8e8b37e9ba44573a3cd1464ff6348b99ee7061b95d1c6333"},
{file = "rich-13.3.3.tar.gz", hash = "sha256:dc84400a9d842b3a9c5ff74addd8eb798d155f36c1c91303888e0a66850d2a15"},
]
[package.dependencies]
@@ -2270,14 +2238,14 @@ contextlib2 = ">=0.5.5"
[[package]]
name = "setuptools"
version = "67.6.0"
version = "67.6.1"
description = "Easily download, build, install, upgrade, and uninstall Python packages"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "setuptools-67.6.0-py3-none-any.whl", hash = "sha256:b78aaa36f6b90a074c1fa651168723acbf45d14cb1196b6f02c0fd07f17623b2"},
{file = "setuptools-67.6.0.tar.gz", hash = "sha256:2ee892cd5f29f3373097f5a814697e397cf3ce313616df0af11231e2ad118077"},
{file = "setuptools-67.6.1-py3-none-any.whl", hash = "sha256:e728ca814a823bf7bf60162daf9db95b93d532948c4c0bea762ce62f60189078"},
{file = "setuptools-67.6.1.tar.gz", hash = "sha256:257de92a9d50a60b8e22abfcbb771571fde0dbf3ec234463212027a4eeecbe9a"},
]
[package.extras]
@@ -2398,26 +2366,26 @@ files = [
[[package]]
name = "tomlkit"
version = "0.11.6"
version = "0.11.7"
description = "Style preserving TOML library"
category = "dev"
optional = false
python-versions = ">=3.6"
python-versions = ">=3.7"
files = [
{file = "tomlkit-0.11.6-py3-none-any.whl", hash = "sha256:07de26b0d8cfc18f871aec595fda24d95b08fef89d147caa861939f37230bf4b"},
{file = "tomlkit-0.11.6.tar.gz", hash = "sha256:71b952e5721688937fb02cf9d354dbcf0785066149d2855e44531ebdd2b65d73"},
{file = "tomlkit-0.11.7-py3-none-any.whl", hash = "sha256:5325463a7da2ef0c6bbfefb62a3dc883aebe679984709aee32a317907d0a8d3c"},
{file = "tomlkit-0.11.7.tar.gz", hash = "sha256:f392ef70ad87a672f02519f99967d28a4d3047133e2d1df936511465fbb3791d"},
]
[[package]]
name = "types-pyyaml"
version = "6.0.12.8"
version = "6.0.12.9"
description = "Typing stubs for PyYAML"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-PyYAML-6.0.12.8.tar.gz", hash = "sha256:19304869a89d49af00be681e7b267414df213f4eb89634c4495fa62e8f942b9f"},
{file = "types_PyYAML-6.0.12.8-py3-none-any.whl", hash = "sha256:5314a4b2580999b2ea06b2e5f9a7763d860d6e09cdf21c0e9561daa9cbd60178"},
{file = "types-PyYAML-6.0.12.9.tar.gz", hash = "sha256:c51b1bd6d99ddf0aa2884a7a328810ebf70a4262c292195d3f4f9a0005f9eeb6"},
{file = "types_PyYAML-6.0.12.9-py3-none-any.whl", hash = "sha256:5aed5aa66bd2d2e158f75dda22b059570ede988559f030cf294871d3b647e3e8"},
]
[[package]]
@@ -2670,4 +2638,4 @@ docs = ["mkdocs", "mkdocs-material"]
[metadata]
lock-version = "2.0"
python-versions = "^3.9"
content-hash = "ab9263db7c7c836f9192444bcf680a4183d16ae49cff7d60082e61a80973001d"
content-hash = "56f617ddd8a800d53179955550c4d73cc4fcb0e9a1652c789d4ec7eddb814fb1"


@@ -1626,7 +1626,7 @@
}
],
"Checks": [
"ec2_securitygroup_in_use_without_ingress_filtering"
"ec2_securitygroup_allow_ingress_from_internet_to_any_port"
]
},
{


@@ -1,15 +1,17 @@
import json
import os
import pathlib
from datetime import datetime, timezone
from os import getcwd
import requests
import yaml
from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "3.3.0"
prowler_version = "3.3.4"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
html_logo_img = "https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png"
@@ -43,6 +45,20 @@ html_file_suffix = ".html"
config_yaml = f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/config.yaml"
def check_current_version(prowler_version):
try:
release_response = requests.get(
"https://api.github.com/repos/prowler-cloud/prowler/tags"
)
latest_version = json.loads(release_response.text)[0]["name"]
if latest_version != prowler_version:
return f"(latest is {latest_version}, upgrade for the latest features)"
else:
return "(it is the latest version, yay!)"
except Exception:
return ""
def change_config_var(variable, value):
try:
with open(config_yaml) as f:
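The check_current_version helper added above is what the CLI uses to annotate its --version output. A minimal usage sketch, assuming the function is importable from prowler.config.config (as the CLI diff below does) and that api.github.com is reachable:

from prowler.config.config import check_current_version, prowler_version

# Network call to the GitHub tags API; the helper returns "" on any failure.
print(f"Prowler {prowler_version} {check_current_version(prowler_version)}")
# e.g. "Prowler 3.3.4 (it is the latest version, yay!)"
# or   "Prowler 3.3.4 (latest is 3.4.0, upgrade for the latest features)"
#      (the version numbers shown here are illustrative)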


@@ -4,6 +4,7 @@ from argparse import RawTextHelpFormatter
from prowler.config.config import (
available_compliance_frameworks,
check_current_version,
default_output_directory,
prowler_version,
)
@@ -36,7 +37,7 @@ Detailed documentation at https://docs.prowler.cloud
"-v",
"--version",
action="version",
version=f"Prowler {prowler_version}",
version=f"Prowler {prowler_version} {check_current_version(prowler_version)}",
help="show Prowler version",
)
# Common arguments parser


@@ -208,6 +208,8 @@ def send_to_s3_bucket(
filename = f"{output_filename}{json_asff_file_suffix}"
elif output_mode == "html":
filename = f"{output_filename}{html_file_suffix}"
else: # Compliance output mode
filename = f"{output_filename}_{output_mode}{csv_file_suffix}"
logger.info(f"Sending outputs to S3 bucket {output_bucket}")
bucket_remote_dir = output_directory
while "prowler/" in bucket_remote_dir: # Check if it is not a custom directory


@@ -85,7 +85,7 @@ def assume_role(session: session.Session, assumed_role_info: AWS_Assume_Role) ->
if assumed_role_info.external_id:
assumed_credentials = sts_client.assume_role(
RoleArn=assumed_role_info.role_arn,
RoleSessionName="ProwlerProAsessmentSession",
RoleSessionName="ProwlerAsessmentSession",
DurationSeconds=assumed_role_info.session_duration,
ExternalId=assumed_role_info.external_id,
)
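The session name set here appears in CloudTrail as part of the assumed-role identity, which is why dropping the "Pro" branding matters. A self-contained sketch of the same STS call; the role ARN, duration and external ID are hypothetical:

import boto3

sts_client = boto3.client("sts")
assumed_credentials = sts_client.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/ProwlerScanRole",  # hypothetical
    RoleSessionName="ProwlerAsessmentSession",  # as renamed in this diff
    DurationSeconds=3600,
    ExternalId="my-external-id",  # hypothetical
)
# Temporary credentials: AccessKeyId, SecretAccessKey, SessionToken, Expiration
print(assumed_credentials["Credentials"]["Expiration"])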


@@ -123,9 +123,9 @@
"acm-pca": {
"regions": {
"aws": [
"ap-east-1",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-south-1",
"eu-west-2",
@@ -133,15 +133,18 @@
"me-south-1",
"us-east-2",
"us-west-1",
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-south-2",
"eu-west-1",
"us-east-1",
"af-south-1",
"eu-central-2",
"eu-north-1",
"eu-west-3",
"sa-east-1",
@@ -149,14 +152,15 @@
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
"us-gov-east-1",
"us-gov-west-1"
]
}
},
"ahl": {
"regions": {
"aws": [
"ap-south-1",
"us-east-1",
"us-east-2",
"us-west-2"
@@ -457,23 +461,29 @@
"ap-east-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-west-1",
"eu-west-2",
"me-south-1",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-south-1",
"eu-central-2",
"me-central-1",
"sa-east-1",
"us-east-1",
"us-west-1",
"us-west-2",
"af-south-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"us-east-1",
"eu-west-3"
],
"aws-cn": [
@@ -481,8 +491,8 @@
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
"us-gov-east-1",
"us-gov-west-1"
]
}
},
@@ -661,7 +671,6 @@
"appstream": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
@@ -671,12 +680,14 @@
"eu-west-2",
"sa-east-1",
"us-west-2",
"ap-northeast-1",
"ap-southeast-2",
"us-east-1",
"us-east-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
@@ -735,16 +746,22 @@
"arc-zonal-shift": {
"regions": {
"aws": [
"ap-northeast-1",
"af-south-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-2"
"us-west-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-3",
"eu-west-2",
"eu-west-3",
"us-east-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -1705,26 +1722,29 @@
"codebuild": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-3",
"me-south-1",
"us-west-1",
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"ap-southeast-2",
"eu-north-1",
"eu-west-2",
"us-east-1",
"us-west-2",
"ap-south-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-2",
"eu-north-1",
"me-central-1",
"sa-east-1",
"us-east-2"
@@ -1873,21 +1893,24 @@
"eu-north-1",
"eu-south-1",
"eu-west-1",
"me-south-1",
"us-west-2",
"af-south-1",
"ap-northeast-1",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-west-2",
"eu-west-3",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"ap-southeast-1"
"ap-southeast-1",
"ca-central-1",
"us-east-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
"cn-northwest-1",
"cn-north-1"
],
"aws-us-gov": [
"us-gov-west-1"
@@ -4809,7 +4832,9 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"iotwireless": {
@@ -5176,6 +5201,38 @@
]
}
},
"launchwizard": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-south-1",
"sa-east-1",
"us-west-2",
"ap-northeast-2",
"ca-central-1",
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
]
}
},
"lex-models": {
"regions": {
"aws": [
@@ -5970,6 +6027,7 @@
"sa-east-1",
"us-east-1",
"eu-central-1",
"me-central-1",
"us-west-2"
],
"aws-cn": [],
@@ -7568,9 +7626,9 @@
"us-west-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
@@ -7578,6 +7636,7 @@
"me-south-1",
"us-east-2",
"ap-east-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ca-central-1",
@@ -7642,21 +7701,23 @@
"ap-northeast-3",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-south-2",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-south-1",
"eu-central-2",
"eu-west-1",
"me-central-1",
"me-south-1",
"us-east-1",
"ap-northeast-2",
"eu-north-1",
"eu-south-1",
"eu-west-2",
"me-central-1",
"us-east-2",
"us-west-2"
],
@@ -7683,16 +7744,19 @@
"us-east-2",
"af-south-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-3",
"ca-central-1",
"eu-south-2",
"eu-west-2",
"me-central-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-2",
"eu-central-2",
"eu-north-1",
"me-central-1",
"sa-east-1",
"us-west-1",
"us-west-2"
@@ -7839,26 +7903,29 @@
"aws": [
"ap-east-1",
"ap-northeast-2",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-northeast-3",
"ap-southeast-3",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"ap-northeast-1",
"ap-south-1",
"eu-central-1",
"eu-central-2",
"eu-south-2",
"me-south-1",
"us-west-1"
],
"aws-cn": [
@@ -7875,9 +7942,12 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-2"
@@ -7926,24 +7996,28 @@
"ap-south-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"us-east-2",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"me-south-1",
"sa-east-1",
"us-west-1",
"us-west-2",
"af-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-central-2",
"eu-south-1",
"eu-west-3",
"me-central-1",
"us-east-1"
"me-south-1",
"us-east-1",
"me-central-1"
],
"aws-cn": [
"cn-northwest-1",
@@ -7968,16 +8042,20 @@
"af-south-1",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-1",
"ap-south-2",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"us-west-1"
],
@@ -8944,8 +9022,7 @@
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-2"
"us-east-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -9379,7 +9456,9 @@
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"wisdom": {


@@ -166,7 +166,7 @@ def create_inventory_table(resources: list, resources_in_region: dict) -> dict:
elif "documentation" in split_parts and "parts" in split_parts:
resource_type = "restapis-documentation-parts"
else:
resource_type = resource.split(":")[5].split("/")[1]
resource_type = resource["arn"].split(":")[5].split("/")[1]
else:
resource_type = resource["arn"].split(":")[5].split("/")[0]
if service not in resources_type:
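The bug fixed here is that resource is a dict, so the ARN must be read from resource["arn"] before splitting. A sketch of why the two index positions differ; both ARNs are hypothetical:

# API Gateway ARNs start the resource part with "/", so the sixth
# colon-separated field splits into ['', 'restapis', ...] and the
# resource type sits at index 1:
apigw_arn = "arn:aws:apigateway:eu-west-1::/restapis/a1b2c3"
print(apigw_arn.split(":")[5].split("/")[1])  # restapis

# Most other services put the type first, hence index 0 in the else branch:
ec2_arn = "arn:aws:ec2:eu-west-1:123456789012:instance/i-0abcd1234"
print(ec2_arn.split(":")[5].split("/")[0])    # instance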


@@ -1,6 +1,7 @@
import threading
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -65,8 +66,8 @@ class CloudFormation:
def __describe_stack__(self):
"""Get Details for a CloudFormation Stack"""
logger.info("CloudFormation - Describing Stack to get specific details...")
try:
for stack in self.stacks:
for stack in self.stacks:
try:
stack_details = self.regional_clients[stack.region].describe_stacks(
StackName=stack.name
)
@@ -79,10 +80,16 @@ class CloudFormation:
stack.root_nested_stack = stack_details["Stacks"][0]["RootId"]
stack.is_nested_stack = True if stack.root_nested_stack != "" else False
except Exception as error:
logger.error(
f"{stack.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] != "ValidationError":
logger.warning(
f"{stack.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
continue
except Exception as error:
logger.error(
f"{stack.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
class Stack(BaseModel):
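This is the first of several hunks in the release that apply the same pattern: move the try inside the per-item loop, skip the expected ClientError code, and keep iterating. A generic sketch of the pattern; the client call and the error code are placeholders:

from botocore.exceptions import ClientError

def describe_items(client, items, logger):
    """Per-item error handling: one bad item no longer aborts the whole loop."""
    results = []
    for item in items:
        try:
            results.append(client.describe_thing(Name=item))  # placeholder call
        except ClientError as error:
            # The expected code is skipped quietly; other client errors are
            # downgraded to warnings, and iteration continues either way.
            if error.response["Error"]["Code"] != "ValidationError":
                logger.warning(f"{item}: {error}")
            continue
        except Exception as error:
            # Anything unexpected is still logged as an error.
            logger.error(f"{item}: {error}")
    return results

The CodeArtifact, DAX, EC2, IAM and RDS hunks below follow the same shape with their service-specific codes (ResourceNotFoundException, InvalidARNFault, InvalidSnapshot.NotFound, LimitExceededException, DBSnapshotNotFound).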


@@ -2,6 +2,7 @@ import threading
from enum import Enum
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -68,8 +69,8 @@ class CodeArtifact:
def __list_packages__(self, regional_client):
logger.info("CodeArtifact - Listing Packages and retrieving information...")
try:
for repository in self.repositories:
for repository in self.repositories:
try:
if self.repositories[repository].region == regional_client.region:
list_packages_paginator = regional_client.get_paginator(
"list_packages"
@@ -142,12 +143,21 @@ class CodeArtifact:
# Save all the packages information
self.repositories[repository].packages = packages
except Exception as error:
logger.error(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] == "ResourceNotFoundException":
logger.warning(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
continue
except Exception as error:
logger.error(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
def __list_tags_for_resource__(self):
logger.info("CodeArtifact - List Tags...")


@@ -1,6 +1,7 @@
import threading
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -168,15 +169,24 @@ class DAX:
def __list_tags_for_resource__(self):
logger.info("DAX - List Tags...")
try:
for cluster in self.clusters:
for cluster in self.clusters:
try:
regional_client = self.regional_clients[cluster.region]
response = regional_client.list_tags(ResourceName=cluster.name)["Tags"]
# In the DAX service, list_tags requires the cluster ARN (not the cluster name) as the ResourceName parameter
response = regional_client.list_tags(ResourceName=cluster.arn)["Tags"]
cluster.tags = response
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] != "InvalidARNFault":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
continue
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
class Table(BaseModel):
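As the new comment notes, DAX's list_tags takes the full cluster ARN in ResourceName rather than a bare name. A minimal boto3 sketch; the region and ARN are hypothetical:

import boto3

dax = boto3.client("dax", region_name="eu-west-1")
# Passing a value that is not a valid ARN fails with InvalidARNFault,
# which is exactly the code the new handler above tolerates quietly.
cluster_arn = "arn:aws:dax:eu-west-1:123456789012:cache/my-cluster"
tags = dax.list_tags(ResourceName=cluster_arn)["Tags"]
print(tags)  # e.g. [{'Key': 'env', 'Value': 'prod'}]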


@@ -1,34 +0,0 @@
{
"Provider": "aws",
"CheckID": "ec2_securitygroup_in_use_without_ingress_filtering",
"CheckTitle": "Ensure there are no Security Groups without ingress filtering being used.",
"CheckType": [
"Infrastructure Security"
],
"ServiceName": "ec2",
"SubServiceName": "securitygroup",
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "high",
"ResourceType": "AwsEc2SecurityGroup",
"Description": "Ensure there are no Security Groups without ingress filtering being used.",
"Risk": "If Security groups are not filtering traffic appropriately the attack surface is increased.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "You can grant access to a specific CIDR range or to another security group in your VPC or in a peer VPC.",
"Url": "https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html"
}
},
"Categories": [
"internet-exposed"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -1,28 +0,0 @@
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.ec2.ec2_client import ec2_client
from prowler.providers.aws.services.ec2.lib.security_groups import check_security_group
class ec2_securitygroup_in_use_without_ingress_filtering(Check):
def execute(self):
findings = []
for security_group in ec2_client.security_groups:
report = Check_Report_AWS(self.metadata())
report.region = security_group.region
report.resource_id = security_group.id
report.resource_arn = security_group.arn
report.resource_tags = security_group.tags
report.status = "PASS"
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has ingress filtering."
for ingress_rule in security_group.ingress_rules:
if check_security_group(ingress_rule, "-1"):
report.status = "FAIL"
if len(security_group.network_interfaces) > 0:
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has no ingress filtering and it is being used."
else:
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has no ingress filtering and it is not being used."
break
findings.append(report)
return findings


@@ -195,8 +195,8 @@ class EC2:
def __get_snapshot_public__(self):
logger.info("EC2 - Gettting snapshots encryption...")
try:
for snapshot in self.snapshots:
for snapshot in self.snapshots:
try:
regional_client = self.regional_clients[snapshot.region]
snapshot_public = regional_client.describe_snapshot_attribute(
Attribute="createVolumePermission", SnapshotId=snapshot.id
@@ -205,10 +205,20 @@ class EC2:
if "Group" in permission:
if permission["Group"] == "all":
snapshot.public = True
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] == "InvalidSnapshot.NotFound":
logger.warning(
f"{snapshot.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
continue
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __describe_network_interfaces__(self, regional_client):
logger.info("EC2 - Describing Network Interfaces...")
@@ -239,23 +249,25 @@ class EC2:
def __get_instance_user_data__(self):
logger.info("EC2 - Gettting instance user data...")
try:
for instance in self.instances:
for instance in self.instances:
try:
regional_client = self.regional_clients[instance.region]
user_data = regional_client.describe_instance_attribute(
Attribute="userData", InstanceId=instance.id
)["UserData"]
if "Value" in user_data:
instance.user_data = user_data["Value"]
except ClientError as error:
if error.response["Error"]["Code"] == "InvalidInstanceID.NotFound":
logger.warning(
except ClientError as error:
if error.response["Error"]["Code"] == "InvalidInstanceID.NotFound":
logger.warning(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
continue
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __describe_images__(self, regional_client):
logger.info("EC2 - Describing Images...")


@@ -1,7 +1,7 @@
{
"Provider": "aws",
"CheckID": "elbv2_desync_mitigation_mode",
"CheckTitle": "Check whether the Application Load Balancer is configured with defensive or strictest desync mitigation mode.",
"CheckTitle": "Check whether the Application Load Balancer is configured with defensive or strictest desync mitigation mode, if not check if at least is configured with the drop_invalid_header_fields attribute",
"CheckType": [
"Data Protection"
],
@@ -10,7 +10,7 @@
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "medium",
"ResourceType": "AwsElasticLoadBalancingV2LoadBalancer",
"Description": "Check whether the Application Load Balancer is configured with defensive or strictest desync mitigation mode.",
"Description": "Check whether the Application Load Balancer is configured with defensive or strictest desync mitigation mode, if not check if at least is configured with the drop_invalid_header_fields attribute",
"Risk": "HTTP Desync issues can lead to request smuggling and make your applications vulnerable to request queue or cache poisoning; which could lead to credential hijacking or execution of unauthorized commands.",
"RelatedUrl": "",
"Remediation": {
@@ -21,7 +21,7 @@
"Terraform": ""
},
"Recommendation": {
"Text": "Ensure Application Load Balancer is configured with defensive or strictest desync mitigation mode.",
"Text": "Ensure Application Load Balancer is configured with defensive or strictest desync mitigation mode or with the drop_invalid_header_fields attribute enabled",
"Url": "https://aws.amazon.com/about-aws/whats-new/2020/08/application-and-classic-load-balancers-adding-defense-in-depth-with-introduction-of-desync-mitigation-mode/"
}
},
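Both settings named by this check map to ELBv2 load balancer attributes, so remediation is a single API call. A hedged boto3 sketch; the load balancer ARN is hypothetical:

import boto3

elbv2 = boto3.client("elbv2", region_name="eu-west-1")
lb_arn = (  # hypothetical ALB ARN
    "arn:aws:elasticloadbalancing:eu-west-1:123456789012:"
    "loadbalancer/app/my-alb/50dc6c495c0c9188"
)
elbv2.modify_load_balancer_attributes(
    LoadBalancerArn=lb_arn,
    Attributes=[
        # "defensive" or "strictest" passes the check outright; "monitor"
        # only passes when invalid header fields are also being dropped.
        {"Key": "routing.http.desync_mitigation_mode", "Value": "defensive"},
        {"Key": "routing.http.drop_invalid_header_fields.enabled", "Value": "true"},
    ],
)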


@@ -15,9 +15,11 @@ class elbv2_desync_mitigation_mode(Check):
report.status = "PASS"
report.status_extended = f"ELBv2 ALB {lb.name} is configured with correct desync mitigation mode."
if lb.desync_mitigation_mode == "monitor":
report.status = "FAIL"
report.status_extended = f"ELBv2 ALB {lb.name} does not have desync mitigation mode set as defensive or strictest."
if lb.drop_invalid_header_fields == "false":
report.status = "FAIL"
report.status_extended = f"ELBv2 ALB {lb.name} does not have desync mitigation mode set as defensive or strictest and is not dropping invalid header fields"
elif lb.drop_invalid_header_fields == "true":
report.status_extended = f"ELBv2 ALB {lb.name} does not have desync mitigation mode set as defensive or strictest but is dropping invalid header fields"
findings.append(report)
return findings


@@ -1,32 +0,0 @@
{
"Provider": "aws",
"CheckID": "elbv2_request_smugling",
"CheckTitle": "Check if Application Load Balancer is dropping invalid packets to prevent header based HTTP request smuggling.",
"CheckType": [
"Data Protection"
],
"ServiceName": "elbv2",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "medium",
"ResourceType": "AwsElasticLoadBalancingV2LoadBalancer",
"Description": "Check if Application Load Balancer is dropping invalid packets to prevent header based HTTP request smuggling.",
"Risk": "ALB can be target of actors sending bad HTTP headers.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "aws elbv2 modify-load-balancer-attributes --load-balancer-arn <lb_arn> --attributes Key=routing.http.drop_invalid_header_fields.enabled,Value=true",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/ELBv2/drop-invalid-header-fields-enabled.html",
"Terraform": "https://docs.bridgecrew.io/docs/ensure-that-alb-drops-http-headers#terraform"
},
"Recommendation": {
"Text": "Ensure Application Load Balancer is configured for HTTP headers with header fields that are not valid are removed by the load balancer (true).",
"Url": "https://docs.aws.amazon.com/elasticloadbalancing/latest/application/application-load-balancers.html#deletion-protection"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}


@@ -1,27 +0,0 @@
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.elbv2.elbv2_client import elbv2_client
class elbv2_request_smugling(Check):
def execute(self):
findings = []
for lb in elbv2_client.loadbalancersv2:
if lb.type == "application":
report = Check_Report_AWS(self.metadata())
report.region = lb.region
report.resource_id = lb.name
report.resource_arn = lb.arn
report.resource_tags = lb.tags
report.status = "FAIL"
report.status_extended = (
f"ELBv2 ALB {lb.name} is not dropping invalid header fields."
)
if lb.drop_invalid_header_fields == "true":
report.status = "PASS"
report.status_extended = (
f"ELBv2 ALB {lb.name} is dropping invalid header fields."
)
findings.append(report)
return findings


@@ -2,6 +2,7 @@ import csv
from datetime import datetime
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -104,6 +105,13 @@ class IAM:
credential_lines = credential.split("\n")
csv_reader = csv.DictReader(credential_lines, delimiter=",")
credential_list = list(csv_reader)
except ClientError as error:
if error.response["Error"]["Code"] != "LimitExceededException":
logger.warning(
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{self.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"


@@ -1,6 +1,7 @@
import threading
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -136,8 +137,8 @@ class RDS:
def __describe_db_snapshot_attributes__(self, regional_client):
logger.info("RDS - Describe Snapshot Attributes...")
try:
for snapshot in self.db_snapshots:
for snapshot in self.db_snapshots:
try:
if snapshot.region == regional_client.region:
response = regional_client.describe_db_snapshot_attributes(
DBSnapshotIdentifier=snapshot.id
@@ -145,11 +146,16 @@ class RDS:
for att in response["DBSnapshotAttributes"]:
if "all" in att["AttributeValues"]:
snapshot.public = True
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] != "DBSnapshotNotFound":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
continue
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
def __describe_db_cluster_snapshots__(self, regional_client):
logger.info("RDS - Describe Cluster Snapshots...")


@@ -6,20 +6,19 @@ class s3_bucket_level_public_access_block(Check):
def execute(self):
findings = []
for bucket in s3_client.buckets:
report = Check_Report_AWS(self.metadata())
report.region = bucket.region
report.resource_id = bucket.name
report.resource_arn = bucket.arn
report.resource_tags = bucket.tags
report.status = "PASS"
report.status_extended = (
f"Block Public Access is configured for the S3 Bucket {bucket.name}."
)
if not (
bucket.public_access_block.ignore_public_acls
and bucket.public_access_block.restrict_public_buckets
):
report.status = "FAIL"
report.status_extended = f"Block Public Access is not configured for the S3 Bucket {bucket.name}."
findings.append(report)
if bucket.public_access_block:
report = Check_Report_AWS(self.metadata())
report.region = bucket.region
report.resource_id = bucket.name
report.resource_arn = bucket.arn
report.resource_tags = bucket.tags
report.status = "PASS"
report.status_extended = f"Block Public Access is configured for the S3 Bucket {bucket.name}."
if not (
bucket.public_access_block.ignore_public_acls
and bucket.public_access_block.restrict_public_buckets
):
report.status = "FAIL"
report.status_extended = f"Block Public Access is not configured for the S3 Bucket {bucket.name}."
findings.append(report)
return findings
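The new guard implies public_access_block may be unset when the underlying GetPublicAccessBlock call fails; a minimal sketch of that contract with hypothetical model names, using pydantic as elsewhere in this diff:

from typing import Optional
from pydantic import BaseModel

class PublicAccessBlock(BaseModel):
    ignore_public_acls: bool
    restrict_public_buckets: bool

class Bucket(BaseModel):
    name: str
    # None when the configuration could not be retrieved, so checks
    # must guard with `if bucket.public_access_block:` before use.
    public_access_block: Optional[PublicAccessBlock] = None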


@@ -21,50 +21,51 @@ class s3_bucket_public_access(Check):
else:
# 2. If public access is not blocked at account level, check it at each bucket level
for bucket in s3_client.buckets:
report = Check_Report_AWS(self.metadata())
report.region = bucket.region
report.resource_id = bucket.name
report.resource_arn = bucket.arn
report.resource_tags = bucket.tags
report.status = "PASS"
report.status_extended = f"S3 Bucket {bucket.name} is not public."
if not (
bucket.public_access_block.ignore_public_acls
and bucket.public_access_block.restrict_public_buckets
):
# 3. If bucket has no public block, check bucket ACL
for grantee in bucket.acl_grantees:
if grantee.type in "Group":
if (
"AllUsers" in grantee.URI
or "AuthenticatedUsers" in grantee.URI
):
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket ACL."
if bucket.public_access_block:
report = Check_Report_AWS(self.metadata())
report.region = bucket.region
report.resource_id = bucket.name
report.resource_arn = bucket.arn
report.resource_tags = bucket.tags
report.status = "PASS"
report.status_extended = f"S3 Bucket {bucket.name} is not public."
if not (
bucket.public_access_block.ignore_public_acls
and bucket.public_access_block.restrict_public_buckets
):
# 3. If bucket has no public block, check bucket ACL
for grantee in bucket.acl_grantees:
if grantee.type in "Group":
if (
"AllUsers" in grantee.URI
or "AuthenticatedUsers" in grantee.URI
):
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket ACL."
# 4. Check bucket policy
if bucket.policy:
for statement in bucket.policy["Statement"]:
if (
"Principal" in statement
and "*" == statement["Principal"]
and statement["Effect"] == "Allow"
):
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket policy."
else:
# 4. Check bucket policy
if bucket.policy:
for statement in bucket.policy["Statement"]:
if (
"Principal" in statement
and "AWS" in statement["Principal"]
and "*" == statement["Principal"]
and statement["Effect"] == "Allow"
):
if type(statement["Principal"]["AWS"]) == str:
principals = [statement["Principal"]["AWS"]]
else:
principals = statement["Principal"]["AWS"]
for principal_arn in principals:
if principal_arn == "*":
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket policy."
findings.append(report)
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket policy."
else:
if (
"Principal" in statement
and "AWS" in statement["Principal"]
and statement["Effect"] == "Allow"
):
if type(statement["Principal"]["AWS"]) == str:
principals = [statement["Principal"]["AWS"]]
else:
principals = statement["Principal"]["AWS"]
for principal_arn in principals:
if principal_arn == "*":
report.status = "FAIL"
report.status_extended = f"S3 Bucket {bucket.name} has public access due to bucket policy."
findings.append(report)
return findings
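The reworked policy branch distinguishes the two ways a statement can name a wildcard principal; illustrative statements and the normalization the check applies:

# Either Allow statement below is flagged as public by the check above.
public_star = {"Effect": "Allow", "Principal": "*", "Action": "s3:GetObject"}
public_aws_star = {"Effect": "Allow", "Principal": {"AWS": "*"}, "Action": "s3:GetObject"}

# The AWS key may hold a string or a list, hence the normalization:
principal = public_aws_star["Principal"]["AWS"]
principals = [principal] if isinstance(principal, str) else principal
assert "*" in principals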


@@ -1,15 +1,15 @@
{
"Provider": "aws",
"CheckID": "secretsmanager_automatic_rotation_enabled",
"CheckTitle": "Check if Secrets Manager key rotation is enabled.",
"CheckTitle": "Check if Secrets Manager secret rotation is enabled.",
"CheckType": [],
"ServiceName": "secretsmanager",
"SubServiceName": "",
"ResourceIdTemplate": "arn:aws:secretsmanager:region:account-id:secret:secret-name",
"Severity": "medium",
"ResourceType": "AwsSecretsManagerSecret",
"Description": "Check if Secrets Manager key rotation is enabled.",
"Risk": "Rotating secrets minimizes exposure to attacks using stolen keys.",
"Description": "Check if Secrets Manager secret rotation is enabled.",
"Risk": "Rotating secrets minimizes exposure to attacks using stolen secrets.",
"RelatedUrl": "https://docs.aws.amazon.com/secretsmanager/latest/userguide/rotating-secrets_strategies.html",
"Remediation": {
"Code": {


@@ -3,6 +3,7 @@ import threading
from enum import Enum
from typing import Optional
from botocore.client import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -78,20 +79,29 @@ class SSM:
def __get_document__(self, regional_client):
logger.info("SSM - Getting Document...")
try:
for document in self.documents.values():
for document in self.documents.values():
try:
if document.region == regional_client.region:
document_info = regional_client.get_document(Name=document.name)
self.documents[document.name].content = json.loads(
document_info["Content"]
)
except Exception as error:
logger.error(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
except ClientError as error:
if error.response["Error"]["Code"] == "ValidationException":
logger.warning(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
continue
except Exception as error:
logger.error(
f"{regional_client.region} --"
f" {error.__class__.__name__}[{error.__traceback__.tb_lineno}]:"
f" {error}"
)
def __describe_document_permission__(self, regional_client):
logger.info("SSM - Describing Document Permission...")


@@ -6,10 +6,10 @@ from pydantic import BaseModel
class Azure_Identity_Info(BaseModel):
identity_id: str = None
identity_type: str = None
identity_id: str = ""
identity_type: str = ""
tenant_ids: list[str] = []
domain: str = None
domain: str = ""
subscriptions: dict = {}
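The switch from None to empty-string defaults keeps the declared str types honest; a quick sketch of the failure mode the old defaults allowed, using a hypothetical minimal model:

from pydantic import BaseModel

class Identity(BaseModel):
    # With `domain: str = None`, consumers could receive None where a str
    # is expected; an empty-string default keeps string operations safe.
    domain: str = ""

assert Identity().domain.upper() == ""  # no AttributeError on None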


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_app_services_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan App Services"
report.resource_id = pricings["AppServices"].resource_id
report.status_extended = f"Defender plan Defender for App Services from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["AppServices"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for App Services from subscription {subscription} is set to OFF (pricing tier not standard)"
if "AppServices" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan App Services"
report.resource_id = pricings["AppServices"].resource_id
report.status_extended = f"Defender plan Defender for App Services from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["AppServices"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for App Services from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings
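Each Defender check below adopts the same guard: only emit a finding when the subscription actually returns that pricing plan. The pattern reduced to a hypothetical helper:

from typing import Optional

def defender_plan_status(pricings: dict, plan: str) -> Optional[str]:
    # None means "no finding"; otherwise PASS/FAIL on the pricing tier.
    if plan not in pricings:
        return None
    return "PASS" if pricings[plan].pricing_tier == "Standard" else "FAIL"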


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_arm_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["Arm"].resource_id
report.resource_name = "Defender planARM"
report.status_extended = f"Defender plan Defender for ARM from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Arm"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for ARM from subscription {subscription} is set to OFF (pricing tier not standard)"
if "Arm" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["Arm"].resource_id
report.resource_name = "Defender planARM"
report.status_extended = f"Defender plan Defender for ARM from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Arm"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for ARM from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_azure_sql_databases_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["SqlServers"].resource_id
report.resource_name = "Defender plan Azure sql db servers"
report.status_extended = f"Defender plan Defender for Azure sql db servers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["SqlServers"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Azure sql db servers from subscription {subscription} is set to OFF (pricing tier not standard)"
if "SqlServers" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["SqlServers"].resource_id
report.resource_name = "Defender plan Azure sql db servers"
report.status_extended = f"Defender plan Defender for Azure sql db servers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["SqlServers"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Azure sql db servers from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_containers_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["Containers"].resource_id
report.resource_name = "Defender plan Container Registries"
report.status_extended = f"Defender plan Defender for Containers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Containers"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Containers from subscription {subscription} is set to OFF (pricing tier not standard)"
if "Containers" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["Containers"].resource_id
report.resource_name = "Defender plan Container Registries"
report.status_extended = f"Defender plan Defender for Containers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Containers"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Containers from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_cosmosdb_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["CosmosDbs"].resource_id
report.resource_name = "Defender plan Cosmos DB"
report.status_extended = f"Defender plan Defender for Cosmos DB from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["CosmosDbs"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Cosmos DB from subscription {subscription} is set to OFF (pricing tier not standard)"
if "CosmosDbs" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_id = pricings["CosmosDbs"].resource_id
report.resource_name = "Defender plan Cosmos DB"
report.status_extended = f"Defender plan Defender for Cosmos DB from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["CosmosDbs"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Cosmos DB from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,19 +6,26 @@ class defender_ensure_defender_for_databases_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.resource_name = "Defender plan Databases"
report.subscription = subscription
report.resource_id = pricings["SqlServers"].resource_id
report.status_extended = f"Defender plan Defender for Databases from subscription {subscription} is set to ON (pricing tier standard)"
if (
pricings["SqlServers"].pricing_tier != "Standard"
or pricings["SqlServerVirtualMachines"].pricing_tier != "Standard"
or pricings["OpenSourceRelationalDatabases"].pricing_tier != "Standard"
or pricings["CosmosDbs"].pricing_tier != "Standard"
"SqlServers" in pricings
and "SqlServerVirtualMachines" in pricings
and "OpenSourceRelationalDatabases" in pricings
and "CosmosDbs" in pricings
):
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Databases from subscription {subscription} is set to OFF (pricing tier not standard)"
report = Check_Report_Azure(self.metadata())
report.resource_name = "Defender plan Databases"
report.subscription = subscription
report.resource_id = pricings["SqlServers"].resource_id
report.status_extended = f"Defender plan Defender for Databases from subscription {subscription} is set to ON (pricing tier standard)"
if (
pricings["SqlServers"].pricing_tier != "Standard"
or pricings["SqlServerVirtualMachines"].pricing_tier != "Standard"
or pricings["OpenSourceRelationalDatabases"].pricing_tier
!= "Standard"
or pricings["CosmosDbs"].pricing_tier != "Standard"
):
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Databases from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_dns_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan DNS"
report.resource_id = pricings["Dns"].resource_id
report.status_extended = f"Defender plan Defender for DNS from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Dns"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for DNS from subscription {subscription} is set to OFF (pricing tier not standard)"
if "Dns" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan DNS"
report.resource_id = pricings["Dns"].resource_id
report.status_extended = f"Defender plan Defender for DNS from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["Dns"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for DNS from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_keyvault_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan KeyVaults"
report.resource_id = pricings["KeyVaults"].resource_id
report.status_extended = f"Defender plan Defender for KeyVaults from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["KeyVaults"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for KeyVaults subscription from {subscription} is set to OFF (pricing tier not standard)"
if "KeyVaults" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan KeyVaults"
report.resource_id = pricings["KeyVaults"].resource_id
report.status_extended = f"Defender plan Defender for KeyVaults from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["KeyVaults"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for KeyVaults subscription from {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,18 @@ class defender_ensure_defender_for_os_relational_databases_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Open-Source Relational Databases"
report.resource_id = pricings["OpenSourceRelationalDatabases"].resource_id
report.status_extended = f"Defender plan Defender for Open-Source Relational Databases from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["OpenSourceRelationalDatabases"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Open-Source Relational Databases from subscription {subscription} is set to OFF (pricing tier not standard)"
if "OpenSourceRelationalDatabases" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Open-Source Relational Databases"
report.resource_id = pricings[
"OpenSourceRelationalDatabases"
].resource_id
report.status_extended = f"Defender plan Defender for Open-Source Relational Databases from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["OpenSourceRelationalDatabases"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Open-Source Relational Databases from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_server_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Servers"
report.resource_id = pricings["VirtualMachines"].resource_id
report.status_extended = f"Defender plan Defender for Servers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["VirtualMachines"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Servers from subscription {subscription} is set to OFF (pricing tier not standard)"
if "VirtualMachines" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Servers"
report.resource_id = pricings["VirtualMachines"].resource_id
report.status_extended = f"Defender plan Defender for Servers from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["VirtualMachines"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Servers from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_sql_servers_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan SQL Server VMs"
report.resource_id = pricings["SqlServerVirtualMachines"].resource_id
report.status_extended = f"Defender plan Defender for SQL Server VMs from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["SqlServerVirtualMachines"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for SQL Server VMs from subscription {subscription} is set to OFF (pricing tier not standard)"
if "SqlServerVirtualMachines" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan SQL Server VMs"
report.resource_id = pricings["SqlServerVirtualMachines"].resource_id
report.status_extended = f"Defender plan Defender for SQL Server VMs from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["SqlServerVirtualMachines"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for SQL Server VMs from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -6,15 +6,16 @@ class defender_ensure_defender_for_storage_is_on(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, pricings in defender_client.pricings.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Storage Accounts"
report.resource_id = pricings["StorageAccounts"].resource_id
report.status_extended = f"Defender plan Defender for Storage Accounts from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["StorageAccounts"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Storage Accounts from subscription {subscription} is set to OFF (pricing tier not standard)"
if "StorageAccounts" in pricings:
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = subscription
report.resource_name = "Defender plan Storage Accounts"
report.resource_id = pricings["StorageAccounts"].resource_id
report.status_extended = f"Defender plan Defender for Storage Accounts from subscription {subscription} is set to ON (pricing tier standard)"
if pricings["StorageAccounts"].pricing_tier != "Standard":
report.status = "FAIL"
report.status_extended = f"Defender plan Defender for Storage Accounts from subscription {subscription} is set to OFF (pricing tier not standard)"
findings.append(report)
findings.append(report)
return findings


@@ -291,7 +291,7 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE
set_azure_audit_info returns the Azure_Audit_Info
"""
logger.info("Setting Azure session ...")
subscription_ids = arguments.get("subscriptions")
subscription_ids = arguments.get("subscription_ids")
logger.info("Checking if any credentials mode is set ...")
az_cli_auth = arguments.get("az_cli_auth")


@@ -22,10 +22,10 @@ packages = [
{include = "prowler"}
]
readme = "README.md"
version = "3.3.0"
version = "3.3.4"
[tool.poetry.dependencies]
alive-progress = "3.0.1"
alive-progress = "3.1.0"
arnparse = "0.0.2"
azure-identity = "1.12.0"
azure-mgmt-authorization = "3.0.0"
@@ -34,13 +34,13 @@ azure-mgmt-storage = "21.0.0"
azure-mgmt-subscription = "3.1.1"
azure-storage-blob = "12.15.0"
boto3 = "1.26.90"
botocore = "1.29.90"
botocore = "1.29.105"
colorama = "0.4.6"
detect-secrets = "1.4.0"
mkdocs = {version = "1.4.2", optional = true}
mkdocs-material = {version = "9.1.3", optional = true}
mkdocs-material = {version = "9.1.5", optional = true}
msgraph-core = "0.2.2"
pydantic = "1.10.6"
pydantic = "1.10.7"
python = "^3.9"
schema = "0.7.5"
shodan = "1.28.0"
@@ -56,9 +56,9 @@ coverage = "7.2.2"
docker = "6.0.1"
flake8 = "6.0.0"
freezegun = "1.2.2"
moto = "4.1.5"
moto = "4.1.6"
openapi-spec-validator = "0.5.6"
pylint = "2.17.0"
pylint = "2.17.2"
pytest = "7.2.2"
pytest-xdist = "3.2.1"
safety = "2.3.5"


@@ -1,6 +1,29 @@
from unittest import mock
from prowler.config.config import check_current_version, prowler_version
from prowler.providers.aws.aws_provider import get_aws_available_regions
MOCK_PROWLER_VERSION = "3.3.0"
def mock_prowler_get_latest_release(_):
"""Mock requests.get() to get the Prowler latest release"""
return b'[{"name": "3.3.0"}]'
class Test_Config:
def test_get_aws_available_regions(self):
assert len(get_aws_available_regions()) == 31
@mock.patch(
"prowler.config.config.requests.get", new=mock_prowler_get_latest_release
)
@mock.patch("prowler.config.config.prowler_version", new=MOCK_PROWLER_VERSION)
def test_check_current_version_with_latest(self):
assert (
check_current_version(prowler_version) == "(it is the latest version, yay!)"
)
assert (
check_current_version("0.0.0")
== f"(latest is {prowler_version}, upgrade for the latest features)"
)


@@ -512,6 +512,62 @@ class Test_Outputs:
== "binary/octet-stream"
)
@mock_s3
def test_send_to_s3_bucket_compliance(self):
# Create mock session
session = boto3.session.Session(
region_name="us-east-1",
)
# Create mock audit_info
input_audit_info = AWS_Audit_Info(
session_config=None,
original_session=None,
audit_session=session,
audited_account=AWS_ACCOUNT_ID,
audited_identity_arn="test-arn",
audited_user_id="test",
audited_partition="aws",
profile="default",
profile_region="eu-west-1",
credentials=None,
assumed_role_info=None,
audited_regions=["eu-west-2", "eu-west-1"],
organizations_metadata=None,
audit_resources=None,
)
        # Create mock bucket
bucket_name = "test_bucket"
client = boto3.client("s3")
client.create_bucket(Bucket=bucket_name)
# Create mock csv output file
fixtures_dir = "tests/lib/outputs/fixtures"
output_directory = getcwd() + "/" + fixtures_dir
output_mode = "cis_1.4_aws"
filename = f"prowler-output-{input_audit_info.audited_account}"
# Send mock csv file to mock S3 Bucket
send_to_s3_bucket(
filename,
output_directory,
output_mode,
bucket_name,
input_audit_info.audit_session,
)
# Check if the file has been sent by checking its content type
assert (
client.get_object(
Bucket=bucket_name,
Key=fixtures_dir
+ "/"
+ output_mode
+ "/"
+ filename
+ "_"
+ output_mode
+ csv_file_suffix,
)["ContentType"]
== "binary/octet-stream"
)
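Assuming csv_file_suffix resolves to ".csv" (it is defined elsewhere in the outputs module), the key assembled above lands under a mode-specific prefix:

output_mode = "cis_1.4_aws"
filename = "prowler-output-123456789012"
# -> tests/lib/outputs/fixtures/cis_1.4_aws/prowler-output-123456789012_cis_1.4_aws.csv
key = f"tests/lib/outputs/fixtures/{output_mode}/{filename}_{output_mode}.csv"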
@mock_s3
def test_send_to_s3_bucket_custom_directory(self):
# Create mock session


@@ -136,7 +136,7 @@ class Test_DynamoDB_Service:
{"Key": "test", "Value": "test"},
]
# Test DynamoDB Describe Table
# Test DynamoDB Describe Continuous Backups
@mock_dynamodb
def test__describe_continuous_backups__(self):
# Generate DynamoDB Client
@@ -167,7 +167,7 @@ class Test_DynamoDB_Service:
assert dynamo.tables[0].pitr
assert dynamo.tables[0].region == AWS_REGION
# Test DAX List Tables
# Test DAX Describe Clusters
@mock_dax
def test__describe_clusters__(self):
# Generate DAX Client
@@ -198,13 +198,17 @@ class Test_DynamoDB_Service:
audit_info = self.set_mocked_audit_info()
dax = DAX(audit_info)
assert len(dax.clusters) == 2
assert dax.clusters[0].name == "daxcluster1"
assert dax.clusters[1].name == "daxcluster2"
assert dax.clusters[0].region == AWS_REGION
assert dax.clusters[1].region == AWS_REGION
assert dax.clusters[0].encryption
assert dax.clusters[0].tags == [
{"Key": "test", "Value": "test"},
]
assert dax.clusters[1].name == "daxcluster2"
assert dax.clusters[1].region == AWS_REGION
assert dax.clusters[1].encryption
assert dax.clusters[1].tags == [
{"Key": "test", "Value": "test"},
]


@@ -1,11 +1,22 @@
from unittest import mock
from boto3 import client, resource
from mock import patch
from moto import mock_ec2
AWS_REGION = "us-east-1"
def mock_generate_regional_clients(service, audit_info):
regional_client = audit_info.audit_session.client(service, region_name=AWS_REGION)
regional_client.region = AWS_REGION
return {AWS_REGION: regional_client}
@patch(
"prowler.providers.aws.services.ec2.ec2_service.generate_regional_clients",
new=mock_generate_regional_clients,
)
class Test_ec2_ebs_public_snapshot:
@mock_ec2
def test_ec2_default_snapshots(self):
@@ -28,7 +39,7 @@ class Test_ec2_ebs_public_snapshot:
result = check.execute()
# Default snapshots
assert len(result) == 1124
assert len(result) == 564
@mock_ec2
def test_ec2_public_snapshot(self):
@@ -62,7 +73,7 @@ class Test_ec2_ebs_public_snapshot:
results = check.execute()
# Default snapshots + 1 created
assert len(results) == 1125
assert len(results) == 565
for snap in results:
if snap.resource_id == snapshot.id:
@@ -103,7 +114,7 @@ class Test_ec2_ebs_public_snapshot:
results = check.execute()
# Default snapshots + 1 created
assert len(results) == 1125
assert len(results) == 565
for snap in results:
if snap.resource_id == snapshot.id:
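A usage sketch of the patched factory above; audit_info stands in for the mocked AWS_Audit_Info these tests build:

clients = mock_generate_regional_clients("ec2", audit_info)
# A single deterministic region under moto, so snapshot counts are stable.
assert list(clients) == [AWS_REGION]
assert clients[AWS_REGION].region == AWS_REGION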


@@ -1,11 +1,22 @@
from unittest import mock
from boto3 import resource
from mock import patch
from moto import mock_ec2
AWS_REGION = "us-east-1"
def mock_generate_regional_clients(service, audit_info):
regional_client = audit_info.audit_session.client(service, region_name=AWS_REGION)
regional_client.region = AWS_REGION
return {AWS_REGION: regional_client}
@patch(
"prowler.providers.aws.services.ec2.ec2_service.generate_regional_clients",
new=mock_generate_regional_clients,
)
class Test_ec2_ebs_snapshots_encrypted:
@mock_ec2
def test_ec2_default_snapshots(self):
@@ -28,7 +39,7 @@ class Test_ec2_ebs_snapshots_encrypted:
result = check.execute()
# Default snapshots
assert len(result) == 1124
assert len(result) == 564
@mock_ec2
def test_ec2_unencrypted_snapshot(self):
@@ -56,7 +67,7 @@ class Test_ec2_ebs_snapshots_encrypted:
results = check.execute()
# Default snapshots + 1 created
assert len(results) == 1125
assert len(results) == 565
for snap in results:
if snap.resource_id == snapshot.id:
@@ -97,7 +108,7 @@ class Test_ec2_ebs_snapshots_encrypted:
results = check.execute()
# Default snapshots + 1 created
assert len(results) == 1125
assert len(results) == 565
for snap in results:
if snap.resource_id == snapshot.id:


@@ -1,191 +0,0 @@
from re import search
from unittest import mock
from boto3 import client, resource
from moto import mock_ec2
AWS_REGION = "us-east-1"
EXAMPLE_AMI_ID = "ami-12c6146b"
class Test_ec2_securitygroup_in_use_without_ingress_filtering:
@mock_ec2
def test_ec2_default_sgs(self):
# Create EC2 Mocked Resources
ec2_client = client("ec2", region_name=AWS_REGION)
ec2_client.create_vpc(CidrBlock="10.0.0.0/16")
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.ec2.ec2_service import EC2
current_audit_info.audited_partition = "aws"
current_audit_info.audited_regions = ["eu-west-1", "us-east-1"]
with mock.patch(
"prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering.ec2_client",
new=EC2(current_audit_info),
):
# Test Check
from prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering import (
ec2_securitygroup_in_use_without_ingress_filtering,
)
check = ec2_securitygroup_in_use_without_ingress_filtering()
result = check.execute()
# One default sg per region
assert len(result) == 3
# All are compliant by default
assert result[0].status == "PASS"
@mock_ec2
def test_ec2_unused_public_default_sg(self):
# Create EC2 Mocked Resources
ec2_client = client("ec2", region_name=AWS_REGION)
ec2_client.create_vpc(CidrBlock="10.0.0.0/16")
default_sg_id = ec2_client.describe_security_groups(GroupNames=["default"])[
"SecurityGroups"
][0]["GroupId"]
ec2_client.authorize_security_group_ingress(
GroupId=default_sg_id,
IpPermissions=[
{
"IpProtocol": "-1",
"IpRanges": [{"CidrIp": "0.0.0.0/0"}],
}
],
)
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.ec2.ec2_service import EC2
current_audit_info.audited_partition = "aws"
current_audit_info.audited_regions = ["eu-west-1", "us-east-1"]
with mock.patch(
"prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering.ec2_client",
new=EC2(current_audit_info),
):
# Test Check
from prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering import (
ec2_securitygroup_in_use_without_ingress_filtering,
)
check = ec2_securitygroup_in_use_without_ingress_filtering()
result = check.execute()
# One default sg per region
assert len(result) == 3
# Search changed sg
for sg in result:
if sg.resource_id == default_sg_id:
assert sg.status == "FAIL"
assert search(
"has no ingress filtering and it is not being used",
sg.status_extended,
)
assert (
sg.resource_arn
== f"arn:{current_audit_info.audited_partition}:ec2:{AWS_REGION}:{current_audit_info.audited_account}:security-group/{default_sg_id}"
)
@mock_ec2
def test_ec2_used_public_default_sg(self):
# Create EC2 Mocked Resources
ec2_client = client("ec2", region_name=AWS_REGION)
ec2_client.create_vpc(CidrBlock="10.0.0.0/16")
default_sg_id = ec2_client.describe_security_groups(GroupNames=["default"])[
"SecurityGroups"
][0]["GroupId"]
ec2_client.authorize_security_group_ingress(
GroupId=default_sg_id,
IpPermissions=[
{
"IpProtocol": "-1",
"IpRanges": [{"CidrIp": "0.0.0.0/0"}],
}
],
)
ec2 = resource("ec2", region_name=AWS_REGION)
ec2.create_instances(
ImageId=EXAMPLE_AMI_ID,
MinCount=1,
MaxCount=1,
SecurityGroupIds=[
default_sg_id,
],
)
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.ec2.ec2_service import EC2
current_audit_info.audited_partition = "aws"
current_audit_info.audited_regions = ["eu-west-1", "us-east-1"]
with mock.patch(
"prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering.ec2_client",
new=EC2(current_audit_info),
):
# Test Check
from prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering import (
ec2_securitygroup_in_use_without_ingress_filtering,
)
check = ec2_securitygroup_in_use_without_ingress_filtering()
result = check.execute()
# One default sg per region
assert len(result) == 3
# Search changed sg
for sg in result:
if sg.resource_id == default_sg_id:
assert sg.status == "FAIL"
assert search(
"has no ingress filtering and it is being used",
sg.status_extended,
)
assert (
sg.resource_arn
== f"arn:{current_audit_info.audited_partition}:ec2:{AWS_REGION}:{current_audit_info.audited_account}:security-group/{default_sg_id}"
)
@mock_ec2
def test_ec2_private_default_sg(self):
# Create EC2 Mocked Resources
ec2_client = client("ec2", region_name=AWS_REGION)
ec2_client.create_vpc(CidrBlock="10.0.0.0/16")
default_sg_id = ec2_client.describe_security_groups(GroupNames=["default"])[
"SecurityGroups"
][0]["GroupId"]
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.ec2.ec2_service import EC2
current_audit_info.audited_partition = "aws"
current_audit_info.audited_regions = ["eu-west-1", "us-east-1"]
with mock.patch(
"prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering.ec2_client",
new=EC2(current_audit_info),
):
# Test Check
from prowler.providers.aws.services.ec2.ec2_securitygroup_in_use_without_ingress_filtering.ec2_securitygroup_in_use_without_ingress_filtering import (
ec2_securitygroup_in_use_without_ingress_filtering,
)
check = ec2_securitygroup_in_use_without_ingress_filtering()
result = check.execute()
# One default sg per region
assert len(result) == 3
# Search changed sg
for sg in result:
if sg.resource_id == default_sg_id:
assert sg.status == "PASS"
assert search(
"has ingress filtering",
sg.status_extended,
)
assert (
sg.resource_arn
== f"arn:{current_audit_info.audited_partition}:ec2:{AWS_REGION}:{current_audit_info.audited_account}:security-group/{default_sg_id}"
)


@@ -11,7 +11,6 @@ AWS_ACCOUNT_NUMBER = "123456789012"
class Test_elbv2_desync_mitigation_mode:
@mock_elbv2
def test_elb_no_balancers(self):
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2
@@ -33,7 +32,7 @@ class Test_elbv2_desync_mitigation_mode:
@mock_ec2
@mock_elbv2
def test_elbv2_without_desync_mitigation_mode(self):
def test_elbv2_without_desync_mitigation_mode_and_not_dropping_headers(self):
conn = client("elbv2", region_name=AWS_REGION)
ec2 = resource("ec2", region_name=AWS_REGION)
@@ -60,6 +59,10 @@ class Test_elbv2_desync_mitigation_mode:
LoadBalancerArn=lb["LoadBalancerArn"],
Attributes=[
{"Key": "routing.http.desync_mitigation_mode", "Value": "monitor"},
{
"Key": "routing.http.drop_invalid_header_fields.enabled",
"Value": "false",
},
],
)
@@ -82,7 +85,68 @@ class Test_elbv2_desync_mitigation_mode:
assert len(result) == 1
assert result[0].status == "FAIL"
assert search(
"does not have desync mitigation mode set as defensive or strictest",
"does not have desync mitigation mode set as defensive or strictest and is not dropping invalid header fields",
result[0].status_extended,
)
assert result[0].resource_id == "my-lb"
assert result[0].resource_arn == lb["LoadBalancerArn"]
@mock_ec2
@mock_elbv2
def test_elbv2_without_desync_mitigation_mode_but_dropping_headers(self):
conn = client("elbv2", region_name=AWS_REGION)
ec2 = resource("ec2", region_name=AWS_REGION)
security_group = ec2.create_security_group(
GroupName="a-security-group", Description="First One"
)
vpc = ec2.create_vpc(CidrBlock="172.28.7.0/24", InstanceTenancy="default")
subnet1 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.192/26", AvailabilityZone=f"{AWS_REGION}a"
)
subnet2 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.0/26", AvailabilityZone=f"{AWS_REGION}b"
)
lb = conn.create_load_balancer(
Name="my-lb",
Subnets=[subnet1.id, subnet2.id],
SecurityGroups=[security_group.id],
Scheme="internal",
Type="application",
)["LoadBalancers"][0]
conn.modify_load_balancer_attributes(
LoadBalancerArn=lb["LoadBalancerArn"],
Attributes=[
{"Key": "routing.http.desync_mitigation_mode", "Value": "monitor"},
{
"Key": "routing.http.drop_invalid_header_fields.enabled",
"Value": "true",
},
],
)
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2
current_audit_info.audited_partition = "aws"
with mock.patch(
"prowler.providers.aws.services.elbv2.elbv2_desync_mitigation_mode.elbv2_desync_mitigation_mode.elbv2_client",
new=ELBv2(current_audit_info),
):
from prowler.providers.aws.services.elbv2.elbv2_desync_mitigation_mode.elbv2_desync_mitigation_mode import (
elbv2_desync_mitigation_mode,
)
check = elbv2_desync_mitigation_mode()
result = check.execute()
assert len(result) == 1
assert result[0].status == "PASS"
assert search(
"does not have desync mitigation mode set as defensive or strictest but is dropping invalid header fields",
result[0].status_extended,
)
assert result[0].resource_id == "my-lb"


@@ -8,10 +8,9 @@ AWS_REGION = "eu-west-1"
AWS_ACCOUNT_NUMBER = "123456789012"
class Test_elbv2_request_smugling:
class Test_elbv2_internet_facing:
@mock_elbv2
def test_elb_no_balancers(self):
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2


@@ -1,151 +0,0 @@
from re import search
from unittest import mock
from boto3 import client, resource
from moto import mock_ec2, mock_elbv2
AWS_REGION = "eu-west-1"
AWS_ACCOUNT_NUMBER = "123456789012"
class Test_elbv2_request_smugling:
@mock_elbv2
def test_elb_no_balancers(self):
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2
current_audit_info.audited_partition = "aws"
with mock.patch(
"prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling.elbv2_client",
new=ELBv2(current_audit_info),
):
# Test Check
from prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling import (
elbv2_request_smugling,
)
check = elbv2_request_smugling()
result = check.execute()
assert len(result) == 0
@mock_ec2
@mock_elbv2
def test_elbv2_without_dropping(self):
conn = client("elbv2", region_name=AWS_REGION)
ec2 = resource("ec2", region_name=AWS_REGION)
security_group = ec2.create_security_group(
GroupName="a-security-group", Description="First One"
)
vpc = ec2.create_vpc(CidrBlock="172.28.7.0/24", InstanceTenancy="default")
subnet1 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.192/26", AvailabilityZone=f"{AWS_REGION}a"
)
subnet2 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.0/26", AvailabilityZone=f"{AWS_REGION}b"
)
lb = conn.create_load_balancer(
Name="my-lb",
Subnets=[subnet1.id, subnet2.id],
SecurityGroups=[security_group.id],
Scheme="internal",
Type="application",
)["LoadBalancers"][0]
conn.modify_load_balancer_attributes(
LoadBalancerArn=lb["LoadBalancerArn"],
Attributes=[
{
"Key": "routing.http.drop_invalid_header_fields.enabled",
"Value": "false",
},
],
)
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2
current_audit_info.audited_partition = "aws"
with mock.patch(
"prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling.elbv2_client",
new=ELBv2(current_audit_info),
):
from prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling import (
elbv2_request_smugling,
)
check = elbv2_request_smugling()
result = check.execute()
assert len(result) == 1
assert result[0].status == "FAIL"
assert search(
"is not dropping invalid header fields",
result[0].status_extended,
)
assert result[0].resource_id == "my-lb"
assert result[0].resource_arn == lb["LoadBalancerArn"]
@mock_ec2
@mock_elbv2
def test_elbv2_with_dropping(self):
conn = client("elbv2", region_name=AWS_REGION)
ec2 = resource("ec2", region_name=AWS_REGION)
security_group = ec2.create_security_group(
GroupName="a-security-group", Description="First One"
)
vpc = ec2.create_vpc(CidrBlock="172.28.7.0/24", InstanceTenancy="default")
subnet1 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.192/26", AvailabilityZone=f"{AWS_REGION}a"
)
subnet2 = ec2.create_subnet(
VpcId=vpc.id, CidrBlock="172.28.7.0/26", AvailabilityZone=f"{AWS_REGION}b"
)
lb = conn.create_load_balancer(
Name="my-lb",
Subnets=[subnet1.id, subnet2.id],
SecurityGroups=[security_group.id],
Scheme="internal",
)["LoadBalancers"][0]
conn.modify_load_balancer_attributes(
LoadBalancerArn=lb["LoadBalancerArn"],
Attributes=[
{
"Key": "routing.http.drop_invalid_header_fields.enabled",
"Value": "true",
},
],
)
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.elbv2.elbv2_service import ELBv2
current_audit_info.audited_partition = "aws"
with mock.patch(
"prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling.elbv2_client",
new=ELBv2(current_audit_info),
):
from prowler.providers.aws.services.elbv2.elbv2_request_smugling.elbv2_request_smugling import (
elbv2_request_smugling,
)
check = elbv2_request_smugling()
result = check.execute()
assert len(result) == 1
assert result[0].status == "PASS"
assert search(
"is dropping invalid header fields",
result[0].status_extended,
)
assert result[0].resource_id == "my-lb"
assert result[0].resource_arn == lb["LoadBalancerArn"]


@@ -152,3 +152,43 @@ class Test_s3_bucket_level_public_access_block:
== f"arn:{audit_info.audited_partition}:s3:::{bucket_name_us}"
)
assert result[0].region == AWS_REGION
@mock_s3
def test_bucket_can_not_retrieve_public_access_block(self):
s3_client = client("s3", region_name=AWS_REGION)
bucket_name_us = "bucket_test_us"
s3_client.create_bucket(Bucket=bucket_name_us)
s3_client.put_public_access_block(
Bucket=bucket_name_us,
PublicAccessBlockConfiguration={
"BlockPublicAcls": True,
"IgnorePublicAcls": True,
"BlockPublicPolicy": True,
"RestrictPublicBuckets": True,
},
)
from prowler.providers.aws.services.s3.s3_service import S3
audit_info = self.set_mocked_audit_info()
with mock.patch(
"prowler.providers.aws.lib.audit_info.audit_info.current_audit_info",
new=audit_info,
):
# To test this behaviour we need to set public_access_block to None
s3 = S3(audit_info)
s3.buckets[0].public_access_block = None
with mock.patch(
"prowler.providers.aws.services.s3.s3_bucket_level_public_access_block.s3_bucket_level_public_access_block.s3_client",
new=s3,
):
# Test Check
from prowler.providers.aws.services.s3.s3_bucket_level_public_access_block.s3_bucket_level_public_access_block import (
s3_bucket_level_public_access_block,
)
check = s3_bucket_level_public_access_block()
result = check.execute()
assert len(result) == 0


@@ -422,3 +422,47 @@ class Test_s3_bucket_public_access:
== f"arn:{audit_info.audited_partition}:s3:::{bucket_name_us}"
)
assert result[0].region == AWS_REGION
@mock_s3
@mock_s3control
def test_bucket_can_not_retrieve_public_access_block(self):
s3_client = client("s3", region_name=AWS_REGION)
bucket_name_us = "bucket_test_us"
s3_client.create_bucket(Bucket=bucket_name_us)
s3_client.put_public_access_block(
Bucket=bucket_name_us,
PublicAccessBlockConfiguration={
"BlockPublicAcls": True,
"IgnorePublicAcls": True,
"BlockPublicPolicy": True,
"RestrictPublicBuckets": True,
},
)
from prowler.providers.aws.services.s3.s3_service import S3, S3Control
audit_info = self.set_mocked_audit_info()
with mock.patch(
"prowler.providers.aws.lib.audit_info.audit_info.current_audit_info",
new=audit_info,
):
# To test this behaviour we need to set public_access_block to None
s3 = S3(audit_info)
s3.buckets[0].public_access_block = None
with mock.patch(
"prowler.providers.aws.services.s3.s3_bucket_public_access.s3_bucket_public_access.s3_client",
new=s3,
):
with mock.patch(
"prowler.providers.aws.services.s3.s3_bucket_public_access.s3_bucket_public_access.s3control_client",
new=S3Control(audit_info),
):
# Test Check
from prowler.providers.aws.services.s3.s3_bucket_public_access.s3_bucket_public_access import (
s3_bucket_public_access,
)
check = s3_bucket_public_access()
result = check.execute()
assert len(result) == 0