Compare commits

...

24 Commits

Author SHA1 Message Date
Prowler Bot
359d30439b fix(compliance): CIS details for new EFS Controls (#5865)
Co-authored-by: Gary Mclean <gary.mclean@optiim.io>
2024-11-22 10:32:52 -04:00
Prowler Bot
50f430a261 fix(severity): add enum for severity values (#5857)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-11-21 14:20:10 -04:00
Prowler Bot
c9009b02bf fix(kubernetes): filter apiGroup in permission checks (#5845)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2024-11-20 11:07:01 -04:00
Prowler Bot
0a97ee544a fix(kubernetes): validate seccomp profile at pod and container levels (#5828)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2024-11-19 10:58:27 -04:00
Prowler Bot
94bc429bda chore(iam): add missing service catalog permissions (#5822)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2024-11-19 09:46:34 -04:00
Prowler Bot
22a3869ac1 fix(wafv2): only list resources for regional Web ACLs (#5812)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2024-11-18 13:08:53 -04:00
Prowler Bot
69a2d317a7 fix(iam): use get to get the key (#5788)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-11-15 10:17:45 -05:00
Prowler Bot
fc27e1c439 fix(compliance): use subscriptionid instead of name for azure cis (#5789)
Co-authored-by: Pedro Martín <pedromarting3@gmail.com>
2024-11-15 10:17:27 -05:00
Sergio Garcia
c5cbd9f9e5 chore(version): update Prowler version (#5776)
2024-11-15 15:53:13 +01:00
Prowler Bot
f604ab77f0 chore(ec2): add name from image information to status_extended (#5758)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2024-11-13 15:17:20 -05:00
Prowler Bot
17a04a5760 fix(gcp): scan only ACTIVE projects (#5752)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2024-11-13 10:10:15 -05:00
Prowler Bot
786e1f890c fix(ec2): add default value to Name key for image information (#5754)
Co-authored-by: Rubén De la Torre Vico <rubendltv22@gmail.com>
2024-11-13 10:10:06 -05:00
Sergio Garcia
f330d713f6 chore(version): update Prowler version (#5737)
2024-11-12 14:54:52 -05:00
Sergio Garcia
dba914dede fix(aws): remove cloudwatch_log_group_no_critical_pii_in_logs check (#5735)
2024-11-12 12:40:37 -05:00
Prowler Bot
f8e0db43b3 fix(ec2): unique finding per Security Group in high risk ports check (#5698)
Co-authored-by: Mario Rodriguez Lopez <101330800+MarioRgzLpz@users.noreply.github.com>
2024-11-08 15:17:46 -05:00
Sergio Garcia
237a34aef5 chore(version): update Prowler version (#5680)
2024-11-08 08:14:52 +01:00
Sergio Garcia
f2aa659bc7 chore(version): update Prowler version (#5679)
2024-11-07 14:53:57 -05:00
Prowler Bot
d619c6de73 chore(aws): deprecate glue_etl_jobs_logging_enabled check (#5677)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-11-07 11:54:23 -05:00
Prowler Bot
85d39b48a3 fix(aws): update EKS check in compliance frameworks (#5675)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-11-07 11:50:19 -05:00
Prowler Bot
bc70f5cacf fix(guardduty): fix guardduty_is_enabled_fixer test (#5678)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
2024-11-07 11:49:46 -05:00
Prowler Bot
9b7ed481c0 fix(mutelist): set arguments while loading providers (#5673)
Co-authored-by: thomscode <thomscode@gmail.com>
2024-11-07 10:20:52 -05:00
Prowler Bot
fae8ad640e fix(docker): add g++ to Dockerfile for presidio-analyzer compatibility (#5648)
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-11-06 12:19:08 -05:00
Sergio Garcia
5959d81972 chore(version): update Prowler version (#5643)
2024-11-06 08:11:25 +01:00
Sergio Garcia
d84d0e7693 chore(backport): master changes to v4.5 branch (#5641)
Co-authored-by: sansns-aws <107269923+sansns@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2024-11-05 14:13:11 -05:00
70 changed files with 911 additions and 1397 deletions

View File

@@ -6,7 +6,7 @@ LABEL maintainer="https://github.com/prowler-cloud/prowler"
 #hadolint ignore=DL3018
 RUN apk --no-cache upgrade && apk --no-cache add curl git
-# Create nonroot user
+# Create non-root user
 RUN mkdir -p /home/prowler && \
 echo 'prowler:x:1000:1000:prowler:/home/prowler:' > /etc/passwd && \
 echo 'prowler:x:1000:' > /etc/group && \

View File

@@ -63,9 +63,9 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
 | Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
 |---|---|---|---|---|
-| AWS | 457 | 67 -> `prowler aws --list-services` | 30 -> `prowler aws --list-compliance` | 9 -> `prowler aws --list-categories` |
+| AWS | 553 | 77 -> `prowler aws --list-services` | 30 -> `prowler aws --list-compliance` | 9 -> `prowler aws --list-categories` |
 | GCP | 77 | 13 -> `prowler gcp --list-services` | 2 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
-| Azure | 136 | 17 -> `prowler azure --list-services` | 3 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
+| Azure | 138 | 17 -> `prowler azure --list-services` | 3 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
 | Kubernetes | 83 | 7 -> `prowler kubernetes --list-services` | 1 -> `prowler kubernetes --list-compliance` | 7 -> `prowler kubernetes --list-categories` |
 # 💻 Installation

View File

@@ -30,8 +30,6 @@ The following list includes all the AWS checks with configurable variables that
 | `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_entropy` | Integer |
 | `cloudtrail_threat_detection_privilege_escalation` | `threat_detection_privilege_escalation_minutes` | Integer |
 | `cloudwatch_log_group_no_secrets_in_logs` | `secrets_ignore_patterns` | List of Strings |
-| `cloudwatch_log_group_no_critical_pii_in_logs` | `critical_pii_entities` | List of Strings |
-| `cloudwatch_log_group_no_critical_pii_in_logs` | `pii_language` | String |
 | `cloudwatch_log_group_retention_policy_specific_days_enabled` | `log_group_retention_days` | Integer |
 | `codebuild_project_no_secrets_in_variables` | `excluded_sensitive_environment_variables` | List of Strings |
 | `codebuild_project_no_secrets_in_variables` | `secrets_ignore_patterns` | List of Strings |

View File

@@ -91,6 +91,8 @@ Resources:
 - 'shield:GetSubscriptionState'
 - 'securityhub:BatchImportFindings'
 - 'securityhub:GetFindings'
+- 'servicecatalog:Describe*'
+- 'servicecatalog:List*'
 - 'ssm:GetDocument'
 - 'ssm-incidents:List*'
 - 'support:Describe*'

View File

@@ -39,6 +39,8 @@
 "shield:GetSubscriptionState",
 "securityhub:BatchImportFindings",
 "securityhub:GetFindings",
+"servicecatalog:Describe*",
+"servicecatalog:List*",
 "ssm:GetDocument",
 "ssm-incidents:List*",
 "support:Describe*",

677
poetry.lock generated
View File

@@ -756,47 +756,6 @@ files = [
{file = "blinker-1.8.2.tar.gz", hash = "sha256:8f77b09d3bf7c795e969e9486f39c2c5e9c39d4ee07424be2bc594ece9642d83"},
]
[[package]]
name = "blis"
version = "1.0.1"
description = "The Blis BLAS-like linear algebra library, as a self-contained C-extension."
optional = false
python-versions = "*"
files = [
{file = "blis-1.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:232f68f78fd9ab2f8bf01424cd6a265edd97e14e26186ef48eca1b6f212e928f"},
{file = "blis-1.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:90a796d8311b6439d63c91ed09163be995173a1ca5bedc510c1b45cbdbb7886c"},
{file = "blis-1.0.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:388b0b0c55df884eb5853292d53c7c3daaa009b3d991e7a95413e46e82e88058"},
{file = "blis-1.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3c89149eed90c3b76f50c88d8d68661ca07d8b4bfaa517eedf5b77ddf640bb1"},
{file = "blis-1.0.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:f23cff163c0a256b580c33364f1598a4f49f532074c673a734d9b02a101adce1"},
{file = "blis-1.0.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8d4d403d4b4979d5b1e6f12637ed25fb3bd4e260440997d9ba9e752f9e9c91cb"},
{file = "blis-1.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:a6440bed28f405ab81d1d57b3fb96b7ea8b5b1f3d3a0d29860b0a814fe4ece27"},
{file = "blis-1.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5076b33c04b864709d0d22c40e9f57dc06ee9c2bd2f7ab72b10ffa74a470d9e6"},
{file = "blis-1.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b6b1f791fa3eeb267b97f5877a6bdcc09c868603b84422acf7fd138ec9b96c3c"},
{file = "blis-1.0.1-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c249649c0f7c06b2368a9d5c6b12209255981e96304c6140a3beffaff6cae62"},
{file = "blis-1.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:632580f1d3ff44fb36e8a21a9457f23aeaff5d35d108bd2ef0393b9f6d85de93"},
{file = "blis-1.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:64393297304712818020734fa75735f4543243eefc44858ef3c99375d5bb029a"},
{file = "blis-1.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d3b3293c795007dbf4ba8ceaf3184a6bf510ca3252d0229607792f52e8702bb2"},
{file = "blis-1.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:60e1b03663bee7a37b4b3131b4179d283868ccb10a3260fed01dd980866bc49f"},
{file = "blis-1.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:e031b6cf0dcc026d697a860bf53bc02b5f26ddb5a3ecd23933c51cf22803825b"},
{file = "blis-1.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f860c1723928c40d4920a05130d21a600dcb5fbf07aa1fe8f4bdff2c4a5238a5"},
{file = "blis-1.0.1-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:399196506829c278836028511f351f32f2c992263e91ef876c6bc41dc2483d3d"},
{file = "blis-1.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce20590af2c6ff02d66ffed4148ea8ea1e3f772febb8471e3e58614e71d428c1"},
{file = "blis-1.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:e6ad60d9cd81523429f46109b31e3c4bbdd0dc28a328d6dbdcdff8447a53a61e"},
{file = "blis-1.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:eb553b233fc815957c5bbb5d2fc2f6d2b199c123ec15c5000db935662849e543"},
{file = "blis-1.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:376188493f590c4310ca534b687ef96c21c8224eb1ef4a0420703eebe175d6fa"},
{file = "blis-1.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:136eae35dd9fd6c1923b95d5623aa174abd43d5b764bed79fd37bf6ad40282e7"},
{file = "blis-1.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:16b5a418c55825d0d7912f63befee865da7522ce8f388b8ef76224050c5f8386"},
{file = "blis-1.0.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:806ab0838efa5e434e6fcdce86542601d0263256d483623bc86e91728a645de4"},
{file = "blis-1.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:33ad60cde863a7cf8e69fcf0c3b965ab0f804f0332adb35552788bb76660c97f"},
{file = "blis-1.0.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:40459f37f430de0c6059631cc9f7588c3bccb6ade074571015ea09856b91cb69"},
{file = "blis-1.0.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f0977eaba2d3f64df649201aa281cf3216e7bec18929a7374c3f3f36100db029"},
{file = "blis-1.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:5b4c8298c49b25058462731139b711e35e73623280cee03c3b0804cbdee50c0d"},
{file = "blis-1.0.1.tar.gz", hash = "sha256:91739cd850ca8100dcddbd8ad66942cab20c9473cdea9a35b165b11d7b8d91e4"},
]
[package.dependencies]
numpy = ">=2.0.0,<3.0.0"
[[package]]
name = "boto3"
version = "1.35.29"
@@ -849,17 +808,6 @@ files = [
{file = "cachetools-5.5.0.tar.gz", hash = "sha256:2cc24fb4cbe39633fb7badd9db9ca6295d766d9c2995f245725a46715d050f2a"},
]
[[package]]
name = "catalogue"
version = "2.0.10"
description = "Super lightweight function registries for your library"
optional = false
python-versions = ">=3.6"
files = [
{file = "catalogue-2.0.10-py3-none-any.whl", hash = "sha256:58c2de0020aa90f4a2da7dfad161bf7b3b054c86a5f09fcedc0b2b740c109a9f"},
{file = "catalogue-2.0.10.tar.gz", hash = "sha256:4f56daa940913d3f09d589c191c74e5a6d51762b3a9e37dd53b7437afd6cda15"},
]
[[package]]
name = "certifi"
version = "2024.8.30"
@@ -1106,26 +1054,6 @@ click = ">=4.0"
[package.extras]
dev = ["coveralls", "pytest (>=3.6)", "pytest-cov", "wheel"]
[[package]]
name = "cloudpathlib"
version = "0.19.0"
description = "pathlib-style classes for cloud storage services."
optional = false
python-versions = ">=3.8"
files = [
{file = "cloudpathlib-0.19.0-py3-none-any.whl", hash = "sha256:eb7758648812d5906af44f14cf9a6a64f687342a6f547a1c20deb7241d769dcb"},
{file = "cloudpathlib-0.19.0.tar.gz", hash = "sha256:919edbfd9a4e935d2423da210b143df89cb0ec6d378366a0dffa2e9fd0664fe8"},
]
[package.dependencies]
typing_extensions = {version = ">4", markers = "python_version < \"3.11\""}
[package.extras]
all = ["cloudpathlib[azure]", "cloudpathlib[gs]", "cloudpathlib[s3]"]
azure = ["azure-storage-blob (>=12)", "azure-storage-file-datalake (>=12)"]
gs = ["google-cloud-storage"]
s3 = ["boto3 (>=1.34.0)"]
[[package]]
name = "colorama"
version = "0.4.6"
@@ -1137,21 +1065,6 @@ files = [
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
[[package]]
name = "confection"
version = "0.1.5"
description = "The sweetest config system for Python"
optional = false
python-versions = ">=3.6"
files = [
{file = "confection-0.1.5-py3-none-any.whl", hash = "sha256:e29d3c3f8eac06b3f77eb9dfb4bf2fc6bcc9622a98ca00a698e3d019c6430b14"},
{file = "confection-0.1.5.tar.gz", hash = "sha256:8e72dd3ca6bd4f48913cd220f10b8275978e740411654b6e8ca6d7008c590f0e"},
]
[package.dependencies]
pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<3.0.0"
srsly = ">=2.4.0,<3.0.0"
[[package]]
name = "coverage"
version = "7.6.1"
@@ -1288,48 +1201,6 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi", "cryptography-vectors (==43.0.1)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
[[package]]
name = "cymem"
version = "2.0.8"
description = "Manage calls to calloc/free through Cython"
optional = false
python-versions = "*"
files = [
{file = "cymem-2.0.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:77b5d3a73c41a394efd5913ab7e48512054cd2dabb9582d489535456641c7666"},
{file = "cymem-2.0.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bd33da892fb560ba85ea14b1528c381ff474048e861accc3366c8b491035a378"},
{file = "cymem-2.0.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29a551eda23eebd6d076b855f77a5ed14a1d1cae5946f7b3cb5de502e21b39b0"},
{file = "cymem-2.0.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e8260445652ae5ab19fff6851f32969a7b774f309162e83367dd0f69aac5dbf7"},
{file = "cymem-2.0.8-cp310-cp310-win_amd64.whl", hash = "sha256:a63a2bef4c7e0aec7c9908bca0a503bf91ac7ec18d41dd50dc7dff5d994e4387"},
{file = "cymem-2.0.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6b84b780d52cb2db53d4494fe0083c4c5ee1f7b5380ceaea5b824569009ee5bd"},
{file = "cymem-2.0.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0d5f83dc3cb5a39f0e32653cceb7c8ce0183d82f1162ca418356f4a8ed9e203e"},
{file = "cymem-2.0.8-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ac218cf8a43a761dc6b2f14ae8d183aca2bbb85b60fe316fd6613693b2a7914"},
{file = "cymem-2.0.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42c993589d1811ec665d37437d5677b8757f53afadd927bf8516ac8ce2d3a50c"},
{file = "cymem-2.0.8-cp311-cp311-win_amd64.whl", hash = "sha256:ab3cf20e0eabee9b6025ceb0245dadd534a96710d43fb7a91a35e0b9e672ee44"},
{file = "cymem-2.0.8-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:cb51fddf1b920abb1f2742d1d385469bc7b4b8083e1cfa60255e19bc0900ccb5"},
{file = "cymem-2.0.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9235957f8c6bc2574a6a506a1687164ad629d0b4451ded89d49ebfc61b52660c"},
{file = "cymem-2.0.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a2cc38930ff5409f8d61f69a01e39ecb185c175785a1c9bec13bcd3ac8a614ba"},
{file = "cymem-2.0.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7bf49e3ea2c441f7b7848d5c61b50803e8cbd49541a70bb41ad22fce76d87603"},
{file = "cymem-2.0.8-cp312-cp312-win_amd64.whl", hash = "sha256:ecd12e3bacf3eed5486e4cd8ede3c12da66ee0e0a9d0ae046962bc2bb503acef"},
{file = "cymem-2.0.8-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:167d8019db3b40308aabf8183fd3fbbc256323b645e0cbf2035301058c439cd0"},
{file = "cymem-2.0.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17cd2c2791c8f6b52f269a756ba7463f75bf7265785388a2592623b84bb02bf8"},
{file = "cymem-2.0.8-cp36-cp36m-win_amd64.whl", hash = "sha256:6204f0a3307bf45d109bf698ba37997ce765f21e359284328e4306c7500fcde8"},
{file = "cymem-2.0.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b9c05db55ea338648f8e5f51dd596568c7f62c5ae32bf3fa5b1460117910ebae"},
{file = "cymem-2.0.8-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ce641f7ba0489bd1b42a4335a36f38c8507daffc29a512681afaba94a0257d2"},
{file = "cymem-2.0.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e6b83a5972a64f62796118da79dfeed71f4e1e770b2b7455e889c909504c2358"},
{file = "cymem-2.0.8-cp37-cp37m-win_amd64.whl", hash = "sha256:ada6eb022e4a0f4f11e6356a5d804ceaa917174e6cf33c0b3e371dbea4dd2601"},
{file = "cymem-2.0.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1e593cd57e2e19eb50c7ddaf7e230b73c890227834425b9dadcd4a86834ef2ab"},
{file = "cymem-2.0.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d513f0d5c6d76facdc605e42aa42c8d50bb7dedca3144ec2b47526381764deb0"},
{file = "cymem-2.0.8-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e370dd54359101b125bfb191aca0542718077b4edb90ccccba1a28116640fed"},
{file = "cymem-2.0.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:84f8c58cde71b8fc7024883031a4eec66c0a9a4d36b7850c3065493652695156"},
{file = "cymem-2.0.8-cp38-cp38-win_amd64.whl", hash = "sha256:6a6edddb30dd000a27987fcbc6f3c23b7fe1d74f539656952cb086288c0e4e29"},
{file = "cymem-2.0.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b896c83c08dadafe8102a521f83b7369a9c5cc3e7768eca35875764f56703f4c"},
{file = "cymem-2.0.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a4f8f2bfee34f6f38b206997727d29976666c89843c071a968add7d61a1e8024"},
{file = "cymem-2.0.8-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7372e2820fa66fd47d3b135f3eb574ab015f90780c3a21cfd4809b54f23a4723"},
{file = "cymem-2.0.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4e57bee56d35b90fc2cba93e75b2ce76feaca05251936e28a96cf812a1f5dda"},
{file = "cymem-2.0.8-cp39-cp39-win_amd64.whl", hash = "sha256:ceeab3ce2a92c7f3b2d90854efb32cb203e78cb24c836a5a9a2cac221930303b"},
{file = "cymem-2.0.8.tar.gz", hash = "sha256:8fb09d222e21dcf1c7e907dc85cf74501d4cea6c4ed4ac6c9e016f98fb59cbbf"},
]
[[package]]
name = "dash"
version = "2.18.1"
@@ -2280,42 +2151,6 @@ websocket-client = ">=0.32.0,<0.40.0 || >0.40.0,<0.41.dev0 || >=0.43.dev0"
[package.extras]
adal = ["adal (>=1.0.2)"]
[[package]]
name = "langcodes"
version = "3.4.1"
description = "Tools for labeling human languages with IETF language tags"
optional = false
python-versions = ">=3.8"
files = [
{file = "langcodes-3.4.1-py3-none-any.whl", hash = "sha256:68f686fc3d358f222674ecf697ddcee3ace3c2fe325083ecad2543fd28a20e77"},
{file = "langcodes-3.4.1.tar.gz", hash = "sha256:a24879fed238013ac3af2424b9d1124e38b4a38b2044fd297c8ff38e5912e718"},
]
[package.dependencies]
language-data = ">=1.2"
[package.extras]
build = ["build", "twine"]
test = ["pytest", "pytest-cov"]
[[package]]
name = "language-data"
version = "1.2.0"
description = "Supplementary data about languages used by the langcodes module"
optional = false
python-versions = "*"
files = [
{file = "language_data-1.2.0-py3-none-any.whl", hash = "sha256:77d5cab917f91ee0b2f1aa7018443e911cf8985ef734ca2ba3940770f6a3816b"},
{file = "language_data-1.2.0.tar.gz", hash = "sha256:82a86050bbd677bfde87d97885b17566cfe75dad3ac4f5ce44b52c28f752e773"},
]
[package.dependencies]
marisa-trie = ">=0.7.7"
[package.extras]
build = ["build", "twine"]
test = ["pytest", "pytest-cov"]
[[package]]
name = "lazy-object-proxy"
version = "1.10.0"
@@ -2362,97 +2197,6 @@ files = [
{file = "lazy_object_proxy-1.10.0-pp310.pp311.pp312.pp38.pp39-none-any.whl", hash = "sha256:80fa48bd89c8f2f456fc0765c11c23bf5af827febacd2f523ca5bc1893fcc09d"},
]
[[package]]
name = "marisa-trie"
version = "1.2.1"
description = "Static memory-efficient and fast Trie-like structures for Python."
optional = false
python-versions = ">=3.7"
files = [
{file = "marisa_trie-1.2.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a2eb41d2f9114d8b7bd66772c237111e00d2bae2260824560eaa0a1e291ce9e8"},
{file = "marisa_trie-1.2.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9e956e6a46f604b17d570901e66f5214fb6f658c21e5e7665deace236793cef6"},
{file = "marisa_trie-1.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bd45142501300e7538b2e544905580918b67b1c82abed1275fe4c682c95635fa"},
{file = "marisa_trie-1.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8443d116c612cfd1961fbf76769faf0561a46d8e317315dd13f9d9639ad500c"},
{file = "marisa_trie-1.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:875a6248e60fbb48d947b574ffa4170f34981f9e579bde960d0f9a49ea393ecc"},
{file = "marisa_trie-1.2.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:746a7c60a17fccd3cfcfd4326926f02ea4fcdfc25d513411a0c4fc8e4a1ca51f"},
{file = "marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:e70869737cc0e5bd903f620667da6c330d6737048d1f44db792a6af68a1d35be"},
{file = "marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:06b099dd743676dbcd8abd8465ceac8f6d97d8bfaabe2c83b965495523b4cef2"},
{file = "marisa_trie-1.2.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:d2a82eb21afdaf22b50d9b996472305c05ca67fc4ff5a026a220320c9c961db6"},
{file = "marisa_trie-1.2.1-cp310-cp310-win32.whl", hash = "sha256:8951e7ce5d3167fbd085703b4cbb3f47948ed66826bef9a2173c379508776cf5"},
{file = "marisa_trie-1.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:5685a14b3099b1422c4f59fa38b0bf4b5342ee6cc38ae57df9666a0b28eeaad3"},
{file = "marisa_trie-1.2.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ed3fb4ed7f2084597e862bcd56c56c5529e773729a426c083238682dba540e98"},
{file = "marisa_trie-1.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0fe69fb9ffb2767746181f7b3b29bbd3454d1d24717b5958e030494f3d3cddf3"},
{file = "marisa_trie-1.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4728ed3ae372d1ea2cdbd5eaa27b8f20a10e415d1f9d153314831e67d963f281"},
{file = "marisa_trie-1.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8cf4f25cf895692b232f49aa5397af6aba78bb679fb917a05fce8d3cb1ee446d"},
{file = "marisa_trie-1.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7cca7f96236ffdbf49be4b2e42c132e3df05968ac424544034767650913524de"},
{file = "marisa_trie-1.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d7eb20bf0e8b55a58d2a9b518aabc4c18278787bdba476c551dd1c1ed109e509"},
{file = "marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b1ec93f0d1ee6d7ab680a6d8ea1a08bf264636358e92692072170032dda652ba"},
{file = "marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e2699255d7ac610dee26d4ae7bda5951d05c7d9123a22e1f7c6a6f1964e0a4e4"},
{file = "marisa_trie-1.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c484410911182457a8a1a0249d0c09c01e2071b78a0a8538cd5f7fa45589b13a"},
{file = "marisa_trie-1.2.1-cp311-cp311-win32.whl", hash = "sha256:ad548117744b2bcf0e3d97374608be0a92d18c2af13d98b728d37cd06248e571"},
{file = "marisa_trie-1.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:436f62d27714970b9cdd3b3c41bdad046f260e62ebb0daa38125ef70536fc73b"},
{file = "marisa_trie-1.2.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:638506eacf20ca503fff72221a7e66a6eadbf28d6a4a6f949fcf5b1701bb05ec"},
{file = "marisa_trie-1.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:de1665eaafefa48a308e4753786519888021740501a15461c77bdfd57638e6b4"},
{file = "marisa_trie-1.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f713af9b8aa66a34cd3a78c7d150a560a75734713abe818a69021fd269e927fa"},
{file = "marisa_trie-1.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2a7d00f53f4945320b551bccb826b3fb26948bde1a10d50bb9802fabb611b10"},
{file = "marisa_trie-1.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98042040d1d6085792e8d0f74004fc0f5f9ca6091c298f593dd81a22a4643854"},
{file = "marisa_trie-1.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6532615111eec2c79e711965ece0bc95adac1ff547a7fff5ffca525463116deb"},
{file = "marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:20948e40ab2038e62b7000ca6b4a913bc16c91a2c2e6da501bd1f917eeb28d51"},
{file = "marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:66b23e5b35dd547f85bf98db7c749bc0ffc57916ade2534a6bbc32db9a4abc44"},
{file = "marisa_trie-1.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6704adf0247d2dda42e876b793be40775dff46624309ad99bc7537098bee106d"},
{file = "marisa_trie-1.2.1-cp312-cp312-win32.whl", hash = "sha256:3ad356442c2fea4c2a6f514738ddf213d23930f942299a2b2c05df464a00848a"},
{file = "marisa_trie-1.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:f2806f75817392cedcacb24ac5d80b0350dde8d3861d67d045c1d9b109764114"},
{file = "marisa_trie-1.2.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:b5ea16e69bfda0ac028c921b58de1a4aaf83d43934892977368579cd3c0a2554"},
{file = "marisa_trie-1.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9f627f4e41be710b6cb6ed54b0128b229ac9d50e2054d9cde3af0fef277c23cf"},
{file = "marisa_trie-1.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5e649f3dc8ab5476732094f2828cc90cac3be7c79bc0c8318b6fda0c1d248db4"},
{file = "marisa_trie-1.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:46e528ee71808c961baf8c3ce1c46a8337ec7a96cc55389d11baafe5b632f8e9"},
{file = "marisa_trie-1.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36aa4401a1180615f74d575571a6550081d84fc6461e9aefc0bb7b2427af098e"},
{file = "marisa_trie-1.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ce59bcd2cda9bb52b0e90cc7f36413cd86c3d0ce7224143447424aafb9f4aa48"},
{file = "marisa_trie-1.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f4cd800704a5fc57e53c39c3a6b0c9b1519ebdbcb644ede3ee67a06eb542697d"},
{file = "marisa_trie-1.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2428b495003c189695fb91ceeb499f9fcced3a2dce853e17fa475519433c67ff"},
{file = "marisa_trie-1.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:735c363d9aaac82eaf516a28f7c6b95084c2e176d8231c87328dc80e112a9afa"},
{file = "marisa_trie-1.2.1-cp313-cp313-win32.whl", hash = "sha256:eba6ca45500ca1a042466a0684aacc9838e7f20fe2605521ee19f2853062798f"},
{file = "marisa_trie-1.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:aa7cd17e1c690ce96c538b2f4aae003d9a498e65067dd433c52dd069009951d4"},
{file = "marisa_trie-1.2.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5e43891a37b0d7f618819fea14bd951289a0a8e3dd0da50c596139ca83ebb9b1"},
{file = "marisa_trie-1.2.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6946100a43f933fad6bc458c502a59926d80b321d5ac1ed2ff9c56605360496f"},
{file = "marisa_trie-1.2.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a4177dc0bd1374e82be9b2ba4d0c2733b0a85b9d154ceeea83a5bee8c1e62fbf"},
{file = "marisa_trie-1.2.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f35c2603a6be168088ed1db6ad1704b078aa8f39974c60888fbbced95dcadad4"},
{file = "marisa_trie-1.2.1-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:d659fda873d8dcb2c14c2c331de1dee21f5a902d7f2de7978b62c6431a8850ef"},
{file = "marisa_trie-1.2.1-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:b0ef26733d3c836be79e812071e1a431ce1f807955a27a981ebb7993d95f842b"},
{file = "marisa_trie-1.2.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:536ea19ce6a2ce61c57fed4123ecd10d18d77a0db45cd2741afff2b8b68f15b3"},
{file = "marisa_trie-1.2.1-cp37-cp37m-win32.whl", hash = "sha256:0ee6cf6a16d9c3d1c94e21c8e63c93d8b34bede170ca4e937e16e1c0700d399f"},
{file = "marisa_trie-1.2.1-cp37-cp37m-win_amd64.whl", hash = "sha256:7e7b1786e852e014d03e5f32dbd991f9a9eb223dd3fa9a2564108b807e4b7e1c"},
{file = "marisa_trie-1.2.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:952af3a5859c3b20b15a00748c36e9eb8316eb2c70bd353ae1646da216322908"},
{file = "marisa_trie-1.2.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24a81aa7566e4ec96fc4d934581fe26d62eac47fc02b35fa443a0bb718b471e8"},
{file = "marisa_trie-1.2.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9c9b32b14651a6dcf9e8857d2df5d29d322a1ea8c0be5c8ffb88f9841c4ec62b"},
{file = "marisa_trie-1.2.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7ac170d20b97beb75059ba65d1ccad6b434d777c8992ab41ffabdade3b06dd74"},
{file = "marisa_trie-1.2.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da4e4facb79614cc4653cfd859f398e4db4ca9ab26270ff12610e50ed7f1f6c6"},
{file = "marisa_trie-1.2.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25688f34cac3bec01b4f655ffdd6c599a01f0bd596b4a79cf56c6f01a7df3560"},
{file = "marisa_trie-1.2.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:1db3213b451bf058d558f6e619bceff09d1d130214448a207c55e1526e2773a1"},
{file = "marisa_trie-1.2.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:d5648c6dcc5dc9200297fb779b1663b8a4467bda034a3c69bd9c32d8afb33b1d"},
{file = "marisa_trie-1.2.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5bd39a4e1cc839a88acca2889d17ebc3f202a5039cd6059a13148ce75c8a6244"},
{file = "marisa_trie-1.2.1-cp38-cp38-win32.whl", hash = "sha256:594f98491a96c7f1ffe13ce292cef1b4e63c028f0707effdea0f113364c1ae6c"},
{file = "marisa_trie-1.2.1-cp38-cp38-win_amd64.whl", hash = "sha256:5fe5a286f997848a410eebe1c28657506adaeb405220ee1e16cfcfd10deb37f2"},
{file = "marisa_trie-1.2.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c0fe2ace0cb1806badbd1c551a8ec2f8d4cf97bf044313c082ef1acfe631ddca"},
{file = "marisa_trie-1.2.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:67f0c2ec82c20a02c16fc9ba81dee2586ef20270127c470cb1054767aa8ba310"},
{file = "marisa_trie-1.2.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a3c98613180cf1730e221933ff74b454008161b1a82597e41054127719964188"},
{file = "marisa_trie-1.2.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:429858a0452a7bedcf67bc7bb34383d00f666c980cb75a31bcd31285fbdd4403"},
{file = "marisa_trie-1.2.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b2eacb84446543082ec50f2fb563f1a94c96804d4057b7da8ed815958d0cdfbe"},
{file = "marisa_trie-1.2.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:852d7bcf14b0c63404de26e7c4c8d5d65ecaeca935e93794331bc4e2f213660b"},
{file = "marisa_trie-1.2.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e58788004adda24c401d1751331618ed20c507ffc23bfd28d7c0661a1cf0ad16"},
{file = "marisa_trie-1.2.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:aefe0973cc4698e0907289dc0517ab0c7cdb13d588201932ff567d08a50b0e2e"},
{file = "marisa_trie-1.2.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:6c50c861faad0a5c091bd763e0729f958c316e678dfa065d3984fbb9e4eacbcd"},
{file = "marisa_trie-1.2.1-cp39-cp39-win32.whl", hash = "sha256:b1ce340da608530500ab4f963f12d6bfc8d8680900919a60dbdc9b78c02060a4"},
{file = "marisa_trie-1.2.1-cp39-cp39-win_amd64.whl", hash = "sha256:ce37d8ca462bb64cc13f529b9ed92f7b21fe8d1f1679b62e29f9cb7d0e888b49"},
{file = "marisa_trie-1.2.1.tar.gz", hash = "sha256:3a27c408e2aefc03e0f1d25b2ff2afb85aac3568f6fa2ae2a53b57a2e87ce29d"},
]
[package.dependencies]
setuptools = "*"
[package.extras]
test = ["hypothesis", "pytest", "readme-renderer"]
[[package]]
name = "markdown"
version = "3.7"
@@ -3146,48 +2890,6 @@ files = [
[package.extras]
dev = ["build", "pytest", "pytest-cov", "twine"]
[[package]]
name = "murmurhash"
version = "1.0.10"
description = "Cython bindings for MurmurHash"
optional = false
python-versions = ">=3.6"
files = [
{file = "murmurhash-1.0.10-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3e90eef568adca5e17a91f96975e9a782ace3a617bbb3f8c8c2d917096e9bfeb"},
{file = "murmurhash-1.0.10-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f8ecb00cc1ab57e4b065f9fb3ea923b55160c402d959c69a0b6dbbe8bc73efc3"},
{file = "murmurhash-1.0.10-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3310101004d9e2e0530c2fed30174448d998ffd1b50dcbfb7677e95db101aa4b"},
{file = "murmurhash-1.0.10-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65401a6f1778676253cbf89c1f45a8a7feb7d73038e483925df7d5943c08ed9"},
{file = "murmurhash-1.0.10-cp310-cp310-win_amd64.whl", hash = "sha256:f23f2dfc7174de2cdc5007c0771ab8376a2a3f48247f32cac4a5563e40c6adcc"},
{file = "murmurhash-1.0.10-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:90ed37ee2cace9381b83d56068334f77e3e30bc521169a1f886a2a2800e965d6"},
{file = "murmurhash-1.0.10-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:22e9926fdbec9d24ced9b0a42f0fee68c730438be3cfb00c2499fd495caec226"},
{file = "murmurhash-1.0.10-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:54bfbfd68baa99717239b8844600db627f336a08b1caf4df89762999f681cdd1"},
{file = "murmurhash-1.0.10-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18b9d200a09d48ef67f6840b77c14f151f2b6c48fd69661eb75c7276ebdb146c"},
{file = "murmurhash-1.0.10-cp311-cp311-win_amd64.whl", hash = "sha256:e5d7cfe392c0a28129226271008e61e77bf307afc24abf34f386771daa7b28b0"},
{file = "murmurhash-1.0.10-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:96f0a070344d4802ea76a160e0d4c88b7dc10454d2426f48814482ba60b38b9e"},
{file = "murmurhash-1.0.10-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9f61862060d677c84556610ac0300a0776cb13cb3155f5075ed97e80f86e55d9"},
{file = "murmurhash-1.0.10-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b3b6d2d877d8881a08be66d906856d05944be0faf22b9a0390338bcf45299989"},
{file = "murmurhash-1.0.10-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d8f54b0031d8696fed17ed6e9628f339cdea0ba2367ca051e18ff59193f52687"},
{file = "murmurhash-1.0.10-cp312-cp312-win_amd64.whl", hash = "sha256:97e09d675de2359e586f09de1d0de1ab39f9911edffc65c9255fb5e04f7c1f85"},
{file = "murmurhash-1.0.10-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1b64e5332932993fef598e78d633b1ba664789ab73032ed511f3dc615a631a1a"},
{file = "murmurhash-1.0.10-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e2a38437a8497e082408aa015c6d90554b9e00c2c221fdfa79728a2d99a739e"},
{file = "murmurhash-1.0.10-cp36-cp36m-win_amd64.whl", hash = "sha256:55f4e4f9291a53c36070330950b472d72ba7d331e4ce3ce1ab349a4f458f7bc4"},
{file = "murmurhash-1.0.10-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:16ef9f0855952493fe08929d23865425906a8c0c40607ac8a949a378652ba6a9"},
{file = "murmurhash-1.0.10-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cc3351ae92b89c2fcdc6e41ac6f17176dbd9b3554c96109fd0713695d8663e7"},
{file = "murmurhash-1.0.10-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6559fef7c2e7349a42a63549067709b656d6d1580752bd76be1541d8b2d65718"},
{file = "murmurhash-1.0.10-cp37-cp37m-win_amd64.whl", hash = "sha256:8bf49e3bb33febb7057ae3a5d284ef81243a1e55eaa62bdcd79007cddbdc0461"},
{file = "murmurhash-1.0.10-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f1605fde07030516eb63d77a598dd164fb9bf217fd937dbac588fe7e47a28c40"},
{file = "murmurhash-1.0.10-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4904f7e68674a64eb2b08823c72015a5e14653e0b4b109ea00c652a005a59bad"},
{file = "murmurhash-1.0.10-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0438f0cb44cf1cd26251f72c1428213c4197d40a4e3f48b1efc3aea12ce18517"},
{file = "murmurhash-1.0.10-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db1171a3f9a10571931764cdbfaa5371f4cf5c23c680639762125cb075b833a5"},
{file = "murmurhash-1.0.10-cp38-cp38-win_amd64.whl", hash = "sha256:1c9fbcd7646ad8ba67b895f71d361d232c6765754370ecea473dd97d77afe99f"},
{file = "murmurhash-1.0.10-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:7024ab3498434f22f8e642ae31448322ad8228c65c8d9e5dc2d563d57c14c9b8"},
{file = "murmurhash-1.0.10-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a99dedfb7f0cc5a4cd76eb409ee98d3d50eba024f934e705914f6f4d765aef2c"},
{file = "murmurhash-1.0.10-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b580b8503647de5dd7972746b7613ea586270f17ac92a44872a9b1b52c36d68"},
{file = "murmurhash-1.0.10-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d75840212bf75eb1352c946c3cf1622dacddd6d6bdda34368237d1eb3568f23a"},
{file = "murmurhash-1.0.10-cp39-cp39-win_amd64.whl", hash = "sha256:a4209962b9f85de397c3203ea4b3a554da01ae9fd220fdab38757d4e9eba8d1a"},
{file = "murmurhash-1.0.10.tar.gz", hash = "sha256:5282aab1317804c6ebd6dd7f69f15ba9075aee671c44a34be2bde0f1b11ef88a"},
]
[[package]]
name = "mypy-extensions"
version = "1.0.0"
@@ -3621,17 +3323,6 @@ tzdata = ">=2020.1"
[package.extras]
test = ["time-machine (>=2.6.0)"]
[[package]]
name = "phonenumbers"
version = "8.13.47"
description = "Python version of Google's common library for parsing, formatting, storing and validating international phone numbers."
optional = false
python-versions = "*"
files = [
{file = "phonenumbers-8.13.47-py2.py3-none-any.whl", hash = "sha256:5d3c0142ef7055ca5551884352e3b6b93bfe002a0bc95b8eaba39b0e2184541b"},
{file = "phonenumbers-8.13.47.tar.gz", hash = "sha256:53c5e7c6d431cafe4efdd44956078404ae9bc8b0eacc47be3105d3ccc88aaffa"},
]
[[package]]
name = "platformdirs"
version = "4.3.6"
@@ -3708,75 +3399,6 @@ docs = ["sphinx (>=1.7.1)"]
redis = ["redis"]
tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "pytest-timeout (>=2.1.0)", "redis", "sphinx (>=6.0.0)", "types-redis"]
[[package]]
name = "preshed"
version = "3.0.9"
description = "Cython hash table that trusts the keys are pre-hashed"
optional = false
python-versions = ">=3.6"
files = [
{file = "preshed-3.0.9-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4f96ef4caf9847b2bb9868574dcbe2496f974e41c2b83d6621c24fb4c3fc57e3"},
{file = "preshed-3.0.9-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a61302cf8bd30568631adcdaf9e6b21d40491bd89ba8ebf67324f98b6c2a2c05"},
{file = "preshed-3.0.9-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99499e8a58f58949d3f591295a97bca4e197066049c96f5d34944dd21a497193"},
{file = "preshed-3.0.9-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ea6b6566997dc3acd8c6ee11a89539ac85c77275b4dcefb2dc746d11053a5af8"},
{file = "preshed-3.0.9-cp310-cp310-win_amd64.whl", hash = "sha256:bfd523085a84b1338ff18f61538e1cfcdedc4b9e76002589a301c364d19a2e36"},
{file = "preshed-3.0.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e7c2364da27f2875524ce1ca754dc071515a9ad26eb5def4c7e69129a13c9a59"},
{file = "preshed-3.0.9-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:182138033c0730c683a6d97e567ceb8a3e83f3bff5704f300d582238dbd384b3"},
{file = "preshed-3.0.9-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:345a10be3b86bcc6c0591d343a6dc2bfd86aa6838c30ced4256dfcfa836c3a64"},
{file = "preshed-3.0.9-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51d0192274aa061699b284f9fd08416065348edbafd64840c3889617ee1609de"},
{file = "preshed-3.0.9-cp311-cp311-win_amd64.whl", hash = "sha256:96b857d7a62cbccc3845ac8c41fd23addf052821be4eb987f2eb0da3d8745aa1"},
{file = "preshed-3.0.9-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:b4fe6720012c62e6d550d6a5c1c7ad88cacef8388d186dad4bafea4140d9d198"},
{file = "preshed-3.0.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:e04f05758875be9751e483bd3c519c22b00d3b07f5a64441ec328bb9e3c03700"},
{file = "preshed-3.0.9-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a55091d0e395f1fdb62ab43401bb9f8b46c7d7794d5b071813c29dc1ab22fd0"},
{file = "preshed-3.0.9-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7de8f5138bcac7870424e09684dc3dd33c8e30e81b269f6c9ede3d8c7bb8e257"},
{file = "preshed-3.0.9-cp312-cp312-win_amd64.whl", hash = "sha256:24229c77364628743bc29c5620c5d6607ed104f0e02ae31f8a030f99a78a5ceb"},
{file = "preshed-3.0.9-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b73b0f7ecc58095ebbc6ca26ec806008ef780190fe685ce471b550e7eef58dc2"},
{file = "preshed-3.0.9-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5cb90ecd5bec71c21d95962db1a7922364d6db2abe284a8c4b196df8bbcc871e"},
{file = "preshed-3.0.9-cp36-cp36m-win_amd64.whl", hash = "sha256:e304a0a8c9d625b70ba850c59d4e67082a6be9c16c4517b97850a17a282ebee6"},
{file = "preshed-3.0.9-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:1fa6d3d5529b08296ff9b7b4da1485c080311fd8744bbf3a86019ff88007b382"},
{file = "preshed-3.0.9-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ef1e5173809d85edd420fc79563b286b88b4049746b797845ba672cf9435c0e7"},
{file = "preshed-3.0.9-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fe81eb21c7d99e8b9a802cc313b998c5f791bda592903c732b607f78a6b7dc4"},
{file = "preshed-3.0.9-cp37-cp37m-win_amd64.whl", hash = "sha256:78590a4a952747c3766e605ce8b747741005bdb1a5aa691a18aae67b09ece0e6"},
{file = "preshed-3.0.9-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3452b64d97ce630e200c415073040aa494ceec6b7038f7a2a3400cbd7858e952"},
{file = "preshed-3.0.9-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ac970d97b905e9e817ec13d31befd5b07c9cfec046de73b551d11a6375834b79"},
{file = "preshed-3.0.9-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eebaa96ece6641cd981491cba995b68c249e0b6877c84af74971eacf8990aa19"},
{file = "preshed-3.0.9-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d473c5f6856e07a88d41fe00bb6c206ecf7b34c381d30de0b818ba2ebaf9406"},
{file = "preshed-3.0.9-cp38-cp38-win_amd64.whl", hash = "sha256:0de63a560f10107a3f0a9e252cc3183b8fdedcb5f81a86938fd9f1dcf8a64adf"},
{file = "preshed-3.0.9-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3a9ad9f738084e048a7c94c90f40f727217387115b2c9a95c77f0ce943879fcd"},
{file = "preshed-3.0.9-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a671dfa30b67baa09391faf90408b69c8a9a7f81cb9d83d16c39a182355fbfce"},
{file = "preshed-3.0.9-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23906d114fc97c17c5f8433342495d7562e96ecfd871289c2bb2ed9a9df57c3f"},
{file = "preshed-3.0.9-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:778cf71f82cedd2719b256f3980d556d6fb56ec552334ba79b49d16e26e854a0"},
{file = "preshed-3.0.9-cp39-cp39-win_amd64.whl", hash = "sha256:a6e579439b329eb93f32219ff27cb358b55fbb52a4862c31a915a098c8a22ac2"},
{file = "preshed-3.0.9.tar.gz", hash = "sha256:721863c5244ffcd2651ad0928951a2c7c77b102f4e11a251ad85d37ee7621660"},
]
[package.dependencies]
cymem = ">=2.0.2,<2.1.0"
murmurhash = ">=0.28.0,<1.1.0"
[[package]]
name = "presidio-analyzer"
version = "2.2.355"
description = "Presidio Analyzer package"
optional = false
python-versions = "<4.0,>=3.8"
files = [
{file = "presidio_analyzer-2.2.355-py3-none-any.whl", hash = "sha256:c4c5bc6d82e4f94059fd554c31365fc5d29afe51763391b6ecc33b628bdb5858"},
]
[package.dependencies]
phonenumbers = ">=8.12,<9.0.0"
pyyaml = "*"
regex = "*"
spacy = ">=3.4.4,<4.0.0"
tldextract = "*"
[package.extras]
azure-ai-language = ["azure-ai-textanalytics", "azure-core"]
server = ["flask (>=1.1)"]
stanza = ["spacy_stanza", "stanza"]
transformers = ["spacy_huggingface_pipelines"]
[[package]]
name = "proto-plus"
version = "1.24.0"
@@ -4897,31 +4519,6 @@ files = [
[package.extras]
optional = ["SQLAlchemy (>=1.4,<3)", "aiodns (>1.0)", "aiohttp (>=3.7.3,<4)", "boto3 (<=2)", "websocket-client (>=1,<2)", "websockets (>=9.1,<14)"]
[[package]]
name = "smart-open"
version = "7.0.5"
description = "Utils for streaming large files (S3, HDFS, GCS, Azure Blob Storage, gzip, bz2...)"
optional = false
python-versions = "<4.0,>=3.7"
files = [
{file = "smart_open-7.0.5-py3-none-any.whl", hash = "sha256:8523ed805c12dff3eaa50e9c903a6cb0ae78800626631c5fe7ea073439847b89"},
{file = "smart_open-7.0.5.tar.gz", hash = "sha256:d3672003b1dbc85e2013e4983b88eb9a5ccfd389b0d4e5015f39a9ee5620ec18"},
]
[package.dependencies]
wrapt = "*"
[package.extras]
all = ["azure-common", "azure-core", "azure-storage-blob", "boto3", "google-cloud-storage (>=2.6.0)", "paramiko", "requests", "zstandard"]
azure = ["azure-common", "azure-core", "azure-storage-blob"]
gcs = ["google-cloud-storage (>=2.6.0)"]
http = ["requests"]
s3 = ["boto3"]
ssh = ["paramiko"]
test = ["awscli", "azure-common", "azure-core", "azure-storage-blob", "boto3", "google-cloud-storage (>=2.6.0)", "moto[server]", "numpy", "paramiko", "pyopenssl", "pytest", "pytest-benchmark", "pytest-rerunfailures", "requests", "responses", "zstandard"]
webhdfs = ["requests"]
zst = ["zstandard"]
[[package]]
name = "smmap"
version = "5.0.1"
@@ -4944,152 +4541,6 @@ files = [
{file = "sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc"},
]
[[package]]
name = "spacy"
version = "3.8.2"
description = "Industrial-strength Natural Language Processing (NLP) in Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "spacy-3.8.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:795e74493036dda0a576093b797a4fdc1aaa83d66f6c9af0e5b6b1c640dc2222"},
{file = "spacy-3.8.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d432e78fe2a7424aba9a741db07ce58487d3b74fae4e20a779142828e61bc402"},
{file = "spacy-3.8.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:536a8ba17359540de502285934bf357805d978128d7bd5e84ba53d28b32a0ffb"},
{file = "spacy-3.8.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:6a0d0d39baa1cb9f5bb82874cbe1067bf494f76277a383f1f7b29f7a855d41a9"},
{file = "spacy-3.8.2-cp310-cp310-win_amd64.whl", hash = "sha256:9dcfcfda558b3e47946b2041c7a4203b78e542d0de20997a7c0a6d11b58b2522"},
{file = "spacy-3.8.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7e5d48918028cff6d69d9915dad64f0e32ebd5f1e4f1fa81a2e17e56a6f61e05"},
{file = "spacy-3.8.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:455f845b88ed795d7e595070ee84b65b3ea382357811e09fc744789a20b7b5f7"},
{file = "spacy-3.8.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05d8a4cbfdb90049053564790a0d12fa790c9471580cb6a1f8bdc2b6e74703dd"},
{file = "spacy-3.8.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e3c3e67f786f1410d08420ffcaba0f80dc58387ab6172dcdac1a73353d3a85c7"},
{file = "spacy-3.8.2-cp311-cp311-win_amd64.whl", hash = "sha256:cfe0c4558f635c67677e36d9a315f51d51f824870589c4846c95e880042a2ceb"},
{file = "spacy-3.8.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:0ce56f3c46dd4cebb5aaa3a40966e813b7fc6a540d547788a7d00cca10cd60a9"},
{file = "spacy-3.8.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:09faa873cf23d5136b1c1ce6378054a885f650fda96e1655a3ab49e2e7fdd15b"},
{file = "spacy-3.8.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:33e992a11de9b727c61288541945c0ffc37ed998aca76bfd557937c2c195d7d4"},
{file = "spacy-3.8.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:be962a8188fb20d6c2065e1e865d1799ebbf544c1af67ab8c75cb279bf5448c7"},
{file = "spacy-3.8.2-cp312-cp312-win_amd64.whl", hash = "sha256:04546c5f5ed607387d4e9ecf57614e90c5784866a10a3c6dbe5b06e9b18a2f29"},
{file = "spacy-3.8.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:7c5fb8b804ebf1c2791b384d61391e9d0227bcfdecd6c861291690813b8a6eb1"},
{file = "spacy-3.8.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3647233b2454e8e7bae94232563c9bff849db9e26bf61ef51122ef723de009fe"},
{file = "spacy-3.8.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ca20e2d9b4aeaedd7068d6419762d66cfad82bc8b1e63e36714601686d67f163"},
{file = "spacy-3.8.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:be3aa3e7d456627acbcb7f585156ee463c01d006a07aeb20b43a8543a02cd047"},
{file = "spacy-3.8.2-cp39-cp39-win_amd64.whl", hash = "sha256:54c63d31ef410ebb5b0fd72729afaf50f876bf2bc29f73c6c5fc3676ae4158a1"},
{file = "spacy-3.8.2.tar.gz", hash = "sha256:4b37ebd25ada4059b0dc9e0893e70dde5df83485329a068ef04580e70892a65d"},
]
[package.dependencies]
catalogue = ">=2.0.6,<2.1.0"
cymem = ">=2.0.2,<2.1.0"
jinja2 = "*"
langcodes = ">=3.2.0,<4.0.0"
murmurhash = ">=0.28.0,<1.1.0"
numpy = {version = ">=1.19.0", markers = "python_version >= \"3.9\""}
packaging = ">=20.0"
preshed = ">=3.0.2,<3.1.0"
pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<3.0.0"
requests = ">=2.13.0,<3.0.0"
setuptools = "*"
spacy-legacy = ">=3.0.11,<3.1.0"
spacy-loggers = ">=1.0.0,<2.0.0"
srsly = ">=2.4.3,<3.0.0"
thinc = ">=8.3.0,<8.4.0"
tqdm = ">=4.38.0,<5.0.0"
typer = ">=0.3.0,<1.0.0"
wasabi = ">=0.9.1,<1.2.0"
weasel = ">=0.1.0,<0.5.0"
[package.extras]
apple = ["thinc-apple-ops (>=1.0.0,<2.0.0)"]
cuda = ["cupy (>=5.0.0b4,<13.0.0)"]
cuda-autodetect = ["cupy-wheel (>=11.0.0,<13.0.0)"]
cuda100 = ["cupy-cuda100 (>=5.0.0b4,<13.0.0)"]
cuda101 = ["cupy-cuda101 (>=5.0.0b4,<13.0.0)"]
cuda102 = ["cupy-cuda102 (>=5.0.0b4,<13.0.0)"]
cuda110 = ["cupy-cuda110 (>=5.0.0b4,<13.0.0)"]
cuda111 = ["cupy-cuda111 (>=5.0.0b4,<13.0.0)"]
cuda112 = ["cupy-cuda112 (>=5.0.0b4,<13.0.0)"]
cuda113 = ["cupy-cuda113 (>=5.0.0b4,<13.0.0)"]
cuda114 = ["cupy-cuda114 (>=5.0.0b4,<13.0.0)"]
cuda115 = ["cupy-cuda115 (>=5.0.0b4,<13.0.0)"]
cuda116 = ["cupy-cuda116 (>=5.0.0b4,<13.0.0)"]
cuda117 = ["cupy-cuda117 (>=5.0.0b4,<13.0.0)"]
cuda11x = ["cupy-cuda11x (>=11.0.0,<13.0.0)"]
cuda12x = ["cupy-cuda12x (>=11.5.0,<13.0.0)"]
cuda80 = ["cupy-cuda80 (>=5.0.0b4,<13.0.0)"]
cuda90 = ["cupy-cuda90 (>=5.0.0b4,<13.0.0)"]
cuda91 = ["cupy-cuda91 (>=5.0.0b4,<13.0.0)"]
cuda92 = ["cupy-cuda92 (>=5.0.0b4,<13.0.0)"]
ja = ["sudachidict-core (>=20211220)", "sudachipy (>=0.5.2,!=0.6.1)"]
ko = ["natto-py (>=0.9.0)"]
lookups = ["spacy-lookups-data (>=1.0.3,<1.1.0)"]
th = ["pythainlp (>=2.0)"]
transformers = ["spacy-transformers (>=1.1.2,<1.4.0)"]
[[package]]
name = "spacy-legacy"
version = "3.0.12"
description = "Legacy registered functions for spaCy backwards compatibility"
optional = false
python-versions = ">=3.6"
files = [
{file = "spacy-legacy-3.0.12.tar.gz", hash = "sha256:b37d6e0c9b6e1d7ca1cf5bc7152ab64a4c4671f59c85adaf7a3fcb870357a774"},
{file = "spacy_legacy-3.0.12-py2.py3-none-any.whl", hash = "sha256:476e3bd0d05f8c339ed60f40986c07387c0a71479245d6d0f4298dbd52cda55f"},
]
[[package]]
name = "spacy-loggers"
version = "1.0.5"
description = "Logging utilities for SpaCy"
optional = false
python-versions = ">=3.6"
files = [
{file = "spacy-loggers-1.0.5.tar.gz", hash = "sha256:d60b0bdbf915a60e516cc2e653baeff946f0cfc461b452d11a4d5458c6fe5f24"},
{file = "spacy_loggers-1.0.5-py3-none-any.whl", hash = "sha256:196284c9c446cc0cdb944005384270d775fdeaf4f494d8e269466cfa497ef645"},
]
[[package]]
name = "srsly"
version = "2.4.8"
description = "Modern high-performance serialization utilities for Python"
optional = false
python-versions = ">=3.6"
files = [
{file = "srsly-2.4.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:17f3bcb418bb4cf443ed3d4dcb210e491bd9c1b7b0185e6ab10b6af3271e63b2"},
{file = "srsly-2.4.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0b070a58e21ab0e878fd949f932385abb4c53dd0acb6d3a7ee75d95d447bc609"},
{file = "srsly-2.4.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:98286d20014ed2067ad02b0be1e17c7e522255b188346e79ff266af51a54eb33"},
{file = "srsly-2.4.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18685084e2e0cc47c25158cbbf3e44690e494ef77d6418c2aae0598c893f35b0"},
{file = "srsly-2.4.8-cp310-cp310-win_amd64.whl", hash = "sha256:980a179cbf4eb5bc56f7507e53f76720d031bcf0cef52cd53c815720eb2fc30c"},
{file = "srsly-2.4.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5472ed9f581e10c32e79424c996cf54c46c42237759f4224806a0cd4bb770993"},
{file = "srsly-2.4.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:50f10afe9230072c5aad9f6636115ea99b32c102f4c61e8236d8642c73ec7a13"},
{file = "srsly-2.4.8-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c994a89ba247a4d4f63ef9fdefb93aa3e1f98740e4800d5351ebd56992ac75e3"},
{file = "srsly-2.4.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ace7ed4a0c20fa54d90032be32f9c656b6d75445168da78d14fe9080a0c208ad"},
{file = "srsly-2.4.8-cp311-cp311-win_amd64.whl", hash = "sha256:7a919236a090fb93081fbd1cec030f675910f3863825b34a9afbcae71f643127"},
{file = "srsly-2.4.8-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7583c03d114b4478b7a357a1915305163e9eac2dfe080da900555c975cca2a11"},
{file = "srsly-2.4.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:94ccdd2f6db824c31266aaf93e0f31c1c43b8bc531cd2b3a1d924e3c26a4f294"},
{file = "srsly-2.4.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db72d2974f91aee652d606c7def98744ca6b899bd7dd3009fd75ebe0b5a51034"},
{file = "srsly-2.4.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a60c905fd2c15e848ce1fc315fd34d8a9cc72c1dee022a0d8f4c62991131307"},
{file = "srsly-2.4.8-cp312-cp312-win_amd64.whl", hash = "sha256:e0b8d5722057000694edf105b8f492e7eb2f3aa6247a5f0c9170d1e0d074151c"},
{file = "srsly-2.4.8-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:196b4261f9d6372d1d3d16d1216b90c7e370b4141471322777b7b3c39afd1210"},
{file = "srsly-2.4.8-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4750017e6d78590b02b12653e97edd25aefa4734281386cc27501d59b7481e4e"},
{file = "srsly-2.4.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa034cd582ba9e4a120c8f19efa263fcad0f10fc481e73fb8c0d603085f941c4"},
{file = "srsly-2.4.8-cp36-cp36m-win_amd64.whl", hash = "sha256:5a78ab9e9d177ee8731e950feb48c57380036d462b49e3fb61a67ce529ff5f60"},
{file = "srsly-2.4.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:087e36439af517e259843df93eb34bb9e2d2881c34fa0f541589bcfbc757be97"},
{file = "srsly-2.4.8-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ad141d8a130cb085a0ed3a6638b643e2b591cb98a4591996780597a632acfe20"},
{file = "srsly-2.4.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:24d05367b2571c0d08d00459636b951e3ca2a1e9216318c157331f09c33489d3"},
{file = "srsly-2.4.8-cp37-cp37m-win_amd64.whl", hash = "sha256:3fd661a1c4848deea2849b78f432a70c75d10968e902ca83c07c89c9b7050ab8"},
{file = "srsly-2.4.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ec37233fe39af97b00bf20dc2ceda04d39b9ea19ce0ee605e16ece9785e11f65"},
{file = "srsly-2.4.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d2fd4bc081f1d6a6063396b6d97b00d98e86d9d3a3ac2949dba574a84e148080"},
{file = "srsly-2.4.8-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7347cff1eb4ef3fc335d9d4acc89588051b2df43799e5d944696ef43da79c873"},
{file = "srsly-2.4.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a9dc1da5cc94d77056b91ba38365c72ae08556b6345bef06257c7e9eccabafe"},
{file = "srsly-2.4.8-cp38-cp38-win_amd64.whl", hash = "sha256:dc0bf7b6f23c9ecb49ec0924dc645620276b41e160e9b283ed44ca004c060d79"},
{file = "srsly-2.4.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ff8df21d00d73c371bead542cefef365ee87ca3a5660de292444021ff84e3b8c"},
{file = "srsly-2.4.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0ac3e340e65a9fe265105705586aa56054dc3902789fcb9a8f860a218d6c0a00"},
{file = "srsly-2.4.8-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:06d1733f4275eff4448e96521cc7dcd8fdabd68ba9b54ca012dcfa2690db2644"},
{file = "srsly-2.4.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:be5b751ad88fdb58fb73871d456248c88204f213aaa3c9aab49b6a1802b3fa8d"},
{file = "srsly-2.4.8-cp39-cp39-win_amd64.whl", hash = "sha256:822a38b8cf112348f3accbc73274a94b7bf82515cb14a85ba586d126a5a72851"},
{file = "srsly-2.4.8.tar.gz", hash = "sha256:b24d95a65009c2447e0b49cda043ac53fecf4f09e358d87a57446458f91b8a91"},
]
[package.dependencies]
catalogue = ">=2.0.3,<2.1.0"
[[package]]
name = "std-uritemplate"
version = "2.0.0"
@@ -5161,76 +4612,6 @@ files = [
doc = ["reno", "sphinx"]
test = ["pytest", "tornado (>=4.5)", "typeguard"]
[[package]]
name = "thinc"
version = "8.3.2"
description = "A refreshing functional take on deep learning, compatible with your favorite libraries"
optional = false
python-versions = ">=3.6"
files = [
{file = "thinc-8.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6af5b1b57fb1874079f7e84cd99c983c3dcb234a55845d8585d7e066b09755fb"},
{file = "thinc-8.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f8b753b63714d38f36e951241466c650afe3177b0c8b220e180ebf4888f09f5e"},
{file = "thinc-8.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d471e1261e5e650f93cfae9880928c2ee68ad0426656f02da4490dd24716a93b"},
{file = "thinc-8.3.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8b46786063787a60a0d732a5d43d0196f632d3d35780c8fe1232d1378b1b5980"},
{file = "thinc-8.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:f96c274c4119c92fb8fd8a708381080d47ad92994ef3041c791ed6d4b5c27761"},
{file = "thinc-8.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:12e998780f40d36d4d5f3b760ef60ac60637643f2965ebe1948801ba44261a03"},
{file = "thinc-8.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:54a5411daaca1718a73982767b714c1d0a5e142de73c916367baf1c13d79e8f0"},
{file = "thinc-8.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ed88275031dcaadd85d3deeb8eb12d1ec0ee6b4679e24cc893c81a30409ac4ee"},
{file = "thinc-8.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ef0868e55108f05300f4508e6896ae4e9492f3415220e3da65579f693225816e"},
{file = "thinc-8.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:813942d59881c4e4165ce95fef37ba30ce3366dac43289697d13a952a8208854"},
{file = "thinc-8.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:bce8ca6a62ab82f4595210ba7f18bbdb6e33561277c59060f2f04bdb93ac4fbc"},
{file = "thinc-8.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b014a282e9ea6330a678b472d74f479c7a38168cbf570bdc740e50d960dd78a1"},
{file = "thinc-8.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a80384c228ac6bbaf4ab7f7c9ca4a53c6053f2fb37b2b50c4730b9057f07e9fd"},
{file = "thinc-8.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ae309b0788478984eafeac4e3c33a2de84a6ea251fd1e3528d8018d4b4347247"},
{file = "thinc-8.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:fe8dac2749db23f8ebf09d7a7f29e1b99d67e7d7b183e106aa2b6c9b570f3015"},
{file = "thinc-8.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:e4b1e4149a3bfdeb308cee4b53b07d234e5b35495a7f35241b80acf7cb4a33d3"},
{file = "thinc-8.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a6b507b1fecd1771fc448aa27dc42d024e5799d10f1ddad6abc6353ae72ef540"},
{file = "thinc-8.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4edb20939c3a157beb386aee221a5e1bbfb7ffb90d63d22c80047ca0fa4d026d"},
{file = "thinc-8.3.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:d701de93d6d6bb029d24088d7f8cb8200f486658fd08dd859767b5eda6eba268"},
{file = "thinc-8.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:2977c4811b7612984ded795dce182419b9f3058a1a55c191f75024ec2f4cb218"},
{file = "thinc-8.3.2.tar.gz", hash = "sha256:3e8ef69eac89a601e11d47fc9e43d26ffe7ef682dcf667c94ff35ff690549aeb"},
]
[package.dependencies]
blis = ">=1.0.0,<1.1.0"
catalogue = ">=2.0.4,<2.1.0"
confection = ">=0.0.1,<1.0.0"
cymem = ">=2.0.2,<2.1.0"
murmurhash = ">=1.0.2,<1.1.0"
numpy = {version = ">=2.0.0,<2.1.0", markers = "python_version >= \"3.9\""}
packaging = ">=20.0"
preshed = ">=3.0.2,<3.1.0"
pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<3.0.0"
setuptools = "*"
srsly = ">=2.4.0,<3.0.0"
wasabi = ">=0.8.1,<1.2.0"
[package.extras]
apple = ["thinc-apple-ops (>=1.0.0,<2.0.0)"]
cuda = ["cupy (>=5.0.0b4)"]
cuda-autodetect = ["cupy-wheel (>=11.0.0)"]
cuda100 = ["cupy-cuda100 (>=5.0.0b4)"]
cuda101 = ["cupy-cuda101 (>=5.0.0b4)"]
cuda102 = ["cupy-cuda102 (>=5.0.0b4)"]
cuda110 = ["cupy-cuda110 (>=5.0.0b4)"]
cuda111 = ["cupy-cuda111 (>=5.0.0b4)"]
cuda112 = ["cupy-cuda112 (>=5.0.0b4)"]
cuda113 = ["cupy-cuda113 (>=5.0.0b4)"]
cuda114 = ["cupy-cuda114 (>=5.0.0b4)"]
cuda115 = ["cupy-cuda115 (>=5.0.0b4)"]
cuda116 = ["cupy-cuda116 (>=5.0.0b4)"]
cuda117 = ["cupy-cuda117 (>=5.0.0b4)"]
cuda11x = ["cupy-cuda11x (>=11.0.0)"]
cuda12x = ["cupy-cuda12x (>=11.5.0)"]
cuda80 = ["cupy-cuda80 (>=5.0.0b4)"]
cuda90 = ["cupy-cuda90 (>=5.0.0b4)"]
cuda91 = ["cupy-cuda91 (>=5.0.0b4)"]
cuda92 = ["cupy-cuda92 (>=5.0.0b4)"]
datasets = ["ml-datasets (>=0.2.0,<0.3.0)"]
mxnet = ["mxnet (>=1.5.1,<1.6.0)"]
tensorflow = ["tensorflow (>=2.0.0,<2.6.0)"]
torch = ["torch (>=1.6.0)"]
[[package]]
name = "tldextract"
version = "5.1.2"
@@ -5274,26 +4655,6 @@ files = [
{file = "tomlkit-0.13.2.tar.gz", hash = "sha256:fff5fe59a87295b278abd31bec92c15d9bc4a06885ab12bcea52c71119392e79"},
]
[[package]]
name = "tqdm"
version = "4.66.5"
description = "Fast, Extensible Progress Meter"
optional = false
python-versions = ">=3.7"
files = [
{file = "tqdm-4.66.5-py3-none-any.whl", hash = "sha256:90279a3770753eafc9194a0364852159802111925aa30eb3f9d85b0e805ac7cd"},
{file = "tqdm-4.66.5.tar.gz", hash = "sha256:e1020aef2e5096702d8a025ac7d16b1577279c9d63f8375b63083e9a5f0fcbad"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[package.extras]
dev = ["pytest (>=6)", "pytest-cov", "pytest-timeout", "pytest-xdist"]
notebook = ["ipywidgets (>=6)"]
slack = ["slack-sdk"]
telegram = ["requests"]
[[package]]
name = "typer"
version = "0.12.5"
@@ -5408,20 +4769,6 @@ files = [
[package.dependencies]
tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""}
[[package]]
name = "wasabi"
version = "1.1.3"
description = "A lightweight console printing and formatting toolkit"
optional = false
python-versions = ">=3.6"
files = [
{file = "wasabi-1.1.3-py3-none-any.whl", hash = "sha256:f76e16e8f7e79f8c4c8be49b4024ac725713ab10cd7f19350ad18a8e3f71728c"},
{file = "wasabi-1.1.3.tar.gz", hash = "sha256:4bb3008f003809db0c3e28b4daf20906ea871a2bb43f9914197d540f4f2e0878"},
]
[package.dependencies]
colorama = {version = ">=0.4.6", markers = "sys_platform == \"win32\" and python_version >= \"3.7\""}
[[package]]
name = "watchdog"
version = "5.0.3"
@@ -5464,28 +4811,6 @@ files = [
[package.extras]
watchmedo = ["PyYAML (>=3.10)"]
[[package]]
name = "weasel"
version = "0.4.1"
description = "Weasel: A small and easy workflow system"
optional = false
python-versions = ">=3.7"
files = [
{file = "weasel-0.4.1-py3-none-any.whl", hash = "sha256:24140a090ea1ac512a2b2f479cc64192fd1d527a7f3627671268d08ed5ac418c"},
{file = "weasel-0.4.1.tar.gz", hash = "sha256:aabc210f072e13f6744e5c3a28037f93702433405cd35673f7c6279147085aa9"},
]
[package.dependencies]
cloudpathlib = ">=0.7.0,<1.0.0"
confection = ">=0.0.4,<0.2.0"
packaging = ">=20.0"
pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<3.0.0"
requests = ">=2.13.0,<3.0.0"
smart-open = ">=5.2.1,<8.0.0"
srsly = ">=2.4.3,<3.0.0"
typer = ">=0.3.0,<1.0.0"
wasabi = ">=0.9.1,<1.2.0"
[[package]]
name = "websocket-client"
version = "1.8.0"
@@ -5747,4 +5072,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.9,<3.13"
content-hash = "3ec22e10b783a77283ec4ce26cf8761341f8684353e20628d509b2f86bd17ae0"
content-hash = "6550ac072ff73af5ec22e44d8252771859288578b89756206fe1a852d0c4cd97"

@@ -485,7 +485,7 @@
"codeartifact_packages_external_public_publishing_disabled",
"ecr_repositories_not_publicly_accessible",
"efs_not_publicly_accessible",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"elb_internet_facing",
"elbv2_internet_facing",
"s3_account_level_public_access_blocks",
@@ -664,7 +664,7 @@
"awslambda_function_not_publicly_accessible",
"apigateway_restapi_waf_acl_attached",
"cloudfront_distributions_using_waf",
"eks_control_plane_endpoint_access_restricted",
"eks_cluster_not_publicly_accessible",
"sagemaker_models_network_isolation_enabled",
"sagemaker_models_vpc_settings_configured",
"sagemaker_notebook_instance_vpc_settings_configured",

@@ -645,7 +645,7 @@
],
"Attributes": [
{
"Section": "2.4 Relational Database Service (RDS)",
"Section": "2.4 Elastic File System (EFS)",
"Profile": "Level 1",
"AssessmentStatus": "Manual",
"Description": "EFS data should be encrypted at rest using AWS KMS (Key Management Service).",

@@ -643,7 +643,7 @@
],
"Attributes": [
{
"Section": "2.4 Relational Database Service (RDS)",
"Section": "2.4 Elastic File System (EFS)",
"Profile": "Level 1",
"AssessmentStatus": "Manual",
"Description": "EFS data should be encrypted at rest using AWS KMS (Key Management Service).",

@@ -643,7 +643,7 @@
],
"Attributes": [
{
"Section": "2.4 Relational Database Service (RDS)",
"Section": "2.4 Elastic File System (EFS)",
"Profile": "Level 1",
"AssessmentStatus": "Automated",
"Description": "EFS data should be encrypted at rest using AWS KMS (Key Management Service).",

@@ -1509,9 +1509,9 @@
"iam_user_mfa_enabled_console_access",
"networkfirewall_in_all_vpc",
"eks_cluster_network_policy_enabled",
"eks_control_plane_endpoint_access_restricted",
"eks_cluster_not_publicly_accessible",
"eks_cluster_private_nodes_enabled",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"kafka_cluster_is_public",
"kafka_cluster_unrestricted_access_disabled",
"vpc_peering_routing_tables_with_least_privilege",

@@ -1509,9 +1509,9 @@
"iam_user_mfa_enabled_console_access",
"networkfirewall_in_all_vpc",
"eks_cluster_network_policy_enabled",
"eks_control_plane_endpoint_access_restricted",
"eks_cluster_not_publicly_accessible",
"eks_cluster_private_nodes_enabled",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"kafka_cluster_is_public",
"kafka_cluster_unrestricted_access_disabled",
"vpc_peering_routing_tables_with_least_privilege",

@@ -19,7 +19,7 @@
"ec2_ebs_public_snapshot",
"ec2_instance_profile_attached",
"ec2_instance_public_ip",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"emr_cluster_master_nodes_no_public_ip",
"iam_aws_attached_policy_no_administrative_privileges",
"iam_customer_attached_policy_no_administrative_privileges",
@@ -61,7 +61,7 @@
"ec2_ebs_public_snapshot",
"ec2_instance_profile_attached",
"ec2_instance_public_ip",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"emr_cluster_master_nodes_no_public_ip",
"iam_aws_attached_policy_no_administrative_privileges",
"iam_customer_attached_policy_no_administrative_privileges",
@@ -102,7 +102,7 @@
"Checks": [
"ec2_ebs_public_snapshot",
"ec2_instance_public_ip",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"emr_cluster_master_nodes_no_public_ip",
"awslambda_function_not_publicly_accessible",
"awslambda_function_url_public",

@@ -971,7 +971,7 @@
"Checks": [
"ec2_ebs_public_snapshot",
"ec2_instance_public_ip",
"eks_endpoints_not_publicly_accessible",
"eks_cluster_not_publicly_accessible",
"emr_cluster_master_nodes_no_public_ip",
"awslambda_function_url_public",
"rds_instance_no_public_access",

@@ -3043,9 +3043,7 @@
{
"Id": "9.4",
"Description": "Ensure that Register with Entra ID is enabled on App Service",
"Checks": [
""
],
"Checks": [],
"Attributes": [
{
"Section": "9. AppService",
@@ -3175,9 +3173,7 @@
{
"Id": "9.10",
"Description": "Ensure Azure Key Vaults are Used to Store Secrets",
"Checks": [
""
],
"Checks": [],
"Attributes": [
{
"Section": "9. AppService",

@@ -12,7 +12,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "4.5.0"
prowler_version = "4.5.4"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
square_logo_img = "https://prowler.com/wp-content/uploads/logo-html.png"
aws_logo = "https://user-images.githubusercontent.com/38561120/235953920-3e3fba08-0795-41dc-b480-9bea57db9f2e.png"

@@ -72,31 +72,6 @@ aws:
# AWS Cloudwatch Configuration
# aws.cloudwatch_log_group_retention_policy_specific_days_enabled --> by default is 365 days
log_group_retention_days: 365
# aws.cloudwatch_log_group_no_critical_pii_in_logs --> see all available entities in https://microsoft.github.io/presidio/supported_entities/
critical_pii_entities : [
"CREDIT_CARD", # Credit card numbers are highly sensitive financial information.
"CRYPTO", # Crypto wallet numbers (e.g., Bitcoin addresses) can give access to cryptocurrency.
"IBAN_CODE", # International Bank Account Numbers are critical financial information.
"US_BANK_NUMBER", # US bank account numbers are sensitive and should be protected.
"US_SSN", # US Social Security Numbers are critical PII used for identity verification.
"US_PASSPORT", # US passport numbers can be used for identity theft.
"US_ITIN", # US Individual Taxpayer Identification Numbers are sensitive personal identifiers.
#"UK_NHS", # UK NHS numbers can be used to access medical records and other private information.
#"ES_NIF", # Spanish NIF (Personal tax ID) is critical for identification and tax purposes.
#"ES_NIE", # Spanish NIE (Foreigners ID card) is a critical identifier for foreign residents.
#"IT_FISCAL_CODE", # Italian personal identification code is sensitive PII for tax and legal purposes.
#"IT_PASSPORT", # Italian passport numbers are critical PII.
#"IT_IDENTITY_CARD", # Italian identity card numbers are critical for personal identification.
#"PL_PESEL", # Polish PESEL numbers are sensitive personal identifiers.
#"SG_NRIC_FIN", # Singapore National Registration Identification Card is critical PII.
#"AU_ABN", # Australian Business Numbers are critical for business identification.
#"AU_TFN", # Australian Tax File Numbers are sensitive and used for taxation purposes.
#"AU_MEDICARE", # Australian Medicare numbers are sensitive medical identifiers.
#"IN_PAN", # Indian Permanent Account Numbers are critical for tax purposes and identity.
#"IN_AADHAAR", # Indian Aadhaar numbers are highly sensitive and serve as a universal identity number.
#"FI_PERSONAL_IDENTITY_CODE" # Finnish Personal Identity Code is sensitive PII for personal identification.
]
pii_language: "en" # Language for recognizing PII entities
# AWS AppStream Session Configuration
# aws.appstream_fleet_session_idle_disconnect_timeout

@@ -42,7 +42,7 @@ class AzureCIS(ComplianceOutput):
compliance_row = AzureCISModel(
Provider=finding.provider,
Description=compliance.Description,
Subscription=finding.account_name,
SubscriptionId=finding.account_uid,
Location=finding.region,
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,
@@ -73,7 +73,7 @@ class AzureCIS(ComplianceOutput):
compliance_row = AzureCISModel(
Provider=compliance.Provider.lower(),
Description=compliance.Description,
Subscription="",
SubscriptionId="",
Location="",
AssessmentDate=str(finding.timestamp),
Requirements_Id=requirement.Id,

@@ -38,7 +38,7 @@ class AzureCISModel(BaseModel):
Provider: str
Description: str
Subscription: str
SubscriptionId: str
Location: str
AssessmentDate: str
Requirements_Id: str

@@ -1262,7 +1262,9 @@ class AwsProvider(Provider):
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
raise error
if raise_on_exception:
raise error
return Connection(error=error)
@staticmethod
def create_sts_session(

@@ -23,7 +23,9 @@
"Url": "https://docs.aws.amazon.com/autoscaling/ec2/userguide/as-add-availability-zone.html"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -1,32 +0,0 @@
{
"Provider": "aws",
"CheckID": "cloudwatch_log_group_no_critical_pii_in_logs",
"CheckTitle": "Check if secrets exists in CloudWatch logs.",
"CheckType": [],
"ServiceName": "cloudwatch",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:cloudwatch:region:account-id:log-group/resource-id",
"Severity": "medium",
"ResourceType": "Other",
"Description": "Check if secrets exists in CloudWatch logs",
"Risk": "Storing sensitive data in CloudWatch logs could allow an attacker with read-only access to escalate their privileges or gain unauthorised access to systems.",
"RelatedUrl": "https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudwatch-alarms-for-cloudtrail.html",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "It is recommended that sensitive information is not logged to CloudWatch logs. Alternatively, sensitive data may be masked using a protection policy",
"Url": "https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/mask-sensitive-log-data.html"
}
},
"Categories": [
"secrets"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}

@@ -1,147 +0,0 @@
from json import dumps
from typing import Set
from presidio_analyzer import AnalyzerEngine
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.cloudwatch.cloudwatch_service import (
convert_to_cloudwatch_timestamp_format,
)
from prowler.providers.aws.services.cloudwatch.logs_client import logs_client
class cloudwatch_log_group_no_critical_pii_in_logs(Check):
def execute(self):
findings = []
# Initialize the PII Analyzer engine
analyzer = AnalyzerEngine()
if logs_client.log_groups:
critical_pii_entities = logs_client.audit_config.get(
"critical_pii_entities",
[
"CREDIT_CARD",
"EMAIL_ADDRESS",
"PHONE_NUMBER",
"US_SSN",
"US_BANK_NUMBER",
"IBAN_CODE",
"US_PASSPORT",
],
)
pii_language = logs_client.audit_config.get("pii_language", "en")
for log_group in logs_client.log_groups.values():
report = Check_Report_AWS(self.metadata())
report.status = "PASS"
report.status_extended = (
f"No critical PII found in {log_group.name} log group."
)
report.region = log_group.region
report.resource_id = log_group.name
report.resource_arn = log_group.arn
report.resource_tags = log_group.tags
log_group_pii = []
if log_group.log_streams:
for log_stream_name in log_group.log_streams:
log_stream_pii = {}
log_stream_events = [
dumps(event["message"])
for event in log_group.log_streams[log_stream_name]
]
# Process log data in manageable chunks since the limit of Presidio Analyzer is 100,000 characters
MAX_CHUNK_SIZE = 100000
for i in range(0, len(log_stream_events)):
chunk = log_stream_events[i]
# Split if chunk exceeds max allowed size for analyzer
if len(chunk) > MAX_CHUNK_SIZE:
split_chunks = [
chunk[j : j + MAX_CHUNK_SIZE]
for j in range(0, len(chunk), MAX_CHUNK_SIZE)
]
else:
split_chunks = [chunk]
for split_chunk in split_chunks:
# PII detection for each split chunk
pii_detection_result = analyzer.analyze(
text=split_chunk,
entities=critical_pii_entities,
score_threshold=1,
language=pii_language,
)
# Track cumulative character count to map PII to log event
cumulative_char_count = 0
for j, log_event in enumerate(
log_stream_events[i : i + len(split_chunks)]
):
log_event_length = len(log_event)
for pii in pii_detection_result:
# Check if PII start position falls within this log event
if (
cumulative_char_count
<= pii.start
< cumulative_char_count + log_event_length
):
flagged_event = log_group.log_streams[
log_stream_name
][j]
cloudwatch_timestamp = (
convert_to_cloudwatch_timestamp_format(
flagged_event["timestamp"]
)
)
if (
cloudwatch_timestamp
not in log_stream_pii
):
log_stream_pii[cloudwatch_timestamp] = (
SecretsDict()
)
# Add the detected PII entity to log_stream_pii
log_stream_pii[
cloudwatch_timestamp
].add_secret(
pii.start - cumulative_char_count,
pii.entity_type,
)
cumulative_char_count += (
log_event_length + 1
) # +1 to account for '\n'
if log_stream_pii:
pii_string = "; ".join(
[
f"at {timestamp} - {str(log_stream_pii[timestamp])}"
for timestamp in log_stream_pii
]
)
log_group_pii.append(
f"in log stream {log_stream_name} {pii_string}"
)
if log_group_pii:
pii_string = "; ".join(log_group_pii)
report.status = "FAIL"
report.status_extended = f"Potential critical PII found in log group {log_group.name} {pii_string}."
findings.append(report)
return findings
class SecretsDict(dict[int, Set[str]]):
"""Dictionary to track unique PII types on each line."""
def add_secret(self, line_number: int, pii_type: str) -> None:
"""Add a PII type to a specific line number, ensuring no duplicates."""
self.setdefault(line_number, set()).add(pii_type)
def __str__(self) -> str:
"""Generate a formatted string representation of the dictionary."""
return ", ".join(
f"{', '.join(sorted(pii_types))} on line {line_number}"
for line_number, pii_types in sorted(self.items())
)
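The removed check above had to split log text into chunks because Presidio's analyzer caps input at 100,000 characters. The splitting step on its own can be sketched as a generic helper (not the deleted Prowler code):

```python
MAX_CHUNK_SIZE = 100_000  # input limit of the Presidio AnalyzerEngine, per the comment above

def split_for_analyzer(text: str, max_size: int = MAX_CHUNK_SIZE) -> list[str]:
    """Split `text` into pieces no longer than `max_size` characters."""
    if len(text) <= max_size:
        return [text]
    # Slice in fixed-size strides; the final piece may be shorter.
    return [text[i : i + max_size] for i in range(0, len(text), max_size)]
```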

@@ -93,8 +93,6 @@ class Logs(AWSService):
if (
"cloudwatch_log_group_no_secrets_in_logs"
in provider.audit_metadata.expected_checks
or "cloudwatch_log_group_no_critical_pii_in_logs"
in provider.audit_metadata.expected_checks
):
self.events_per_log_group_threshold = (
1000 # The threshold for number of events to return per log group.

@@ -23,7 +23,9 @@
"Url": "https://www.trendmicro.com/cloudoneconformity-staging/knowledge-base/aws/DMS/multi-az.html#"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -12,11 +12,12 @@ class ec2_ami_public(Check):
report.resource_arn = image.arn
report.resource_tags = image.tags
report.status = "PASS"
report.status_extended = f"EC2 AMI {image.id} is not public."
report.status_extended = (
f"EC2 AMI {image.name if image.name else image.id} is not public."
)
if image.public:
report.status = "FAIL"
report.status_extended = f"EC2 AMI {image.id} is currently public."
report.resource_id = image.id
report.status_extended = f"EC2 AMI {image.name if image.name else image.id} is currently public."
findings.append(report)

@@ -17,34 +17,38 @@ class ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports(Check
and vpc_client.vpcs[security_group.vpc_id].in_use
and len(security_group.network_interfaces) > 0
):
check_ports = ec2_client.audit_config.get(
"ec2_high_risk_ports",
[25, 110, 135, 143, 445, 3000, 4333, 5000, 5500, 8080, 8088],
)
for port in check_ports:
report = Check_Report_AWS(self.metadata())
report.region = security_group.region
report.resource_details = security_group.name
report.resource_id = security_group.id
report.resource_arn = security_group_arn
report.resource_tags = security_group.tags
report.status = "PASS"
report.status_extended = f"Security group {security_group.name} ({security_group.id}) does not have port {port} open to the Internet."
# only proceed if check "..._to_all_ports" did not run or did not FAIL to avoid to report open ports twice
if not ec2_client.is_failed_check(
ec2_securitygroup_allow_ingress_from_internet_to_all_ports.__name__,
security_group_arn,
):
# Loop through every security group's ingress rule and check it
for ingress_rule in security_group.ingress_rules:
report = Check_Report_AWS(self.metadata())
report.region = security_group.region
report.resource_details = security_group.name
report.resource_id = security_group.id
report.resource_arn = security_group_arn
report.resource_tags = security_group.tags
report.status = "PASS"
report.status_extended = f"Security group {security_group.name} ({security_group.id}) does not have any high-risk port open to the Internet."
# only proceed if check "..._to_all_ports" did not run or did not FAIL to avoid to report open ports twice
if not ec2_client.is_failed_check(
ec2_securitygroup_allow_ingress_from_internet_to_all_ports.__name__,
security_group_arn,
):
check_ports = ec2_client.audit_config.get(
"ec2_high_risk_ports",
[25, 110, 135, 143, 445, 3000, 4333, 5000, 5500, 8080, 8088],
)
# Loop through every security group's ingress rule and check it
open_ports = []
for ingress_rule in security_group.ingress_rules:
for port in check_ports:
if check_security_group(
ingress_rule, "tcp", [port], any_address=True
):
report.status = "FAIL"
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has port {port} (high risk port) open to the Internet."
break
else:
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has all ports open to the Internet and therefore was not checked against port {port}."
findings.append(report)
open_ports.append(port)
if open_ports:
report.status = "FAIL"
open_ports_str = ", ".join(map(str, open_ports))
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has the following high-risk ports open to the Internet: {open_ports_str}."
else:
report.status_extended = f"Security group {security_group.name} ({security_group.id}) has all ports open to the Internet and therefore was not checked against high-risk ports."
findings.append(report)
return findings

@@ -357,7 +357,7 @@ class EC2(AWSService):
Image(
id=image["ImageId"],
arn=arn,
name=image["Name"],
name=image.get("Name", ""),
public=image.get("Public", False),
region=regional_client.region,
tags=image.get("Tags"),
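The fix above guards against AMIs returned without a `Name` key by using `dict.get` with a default, and the `ec2_ami_public` hunk earlier falls back to the image ID when the name is empty. Both behaviors together amount to (a hypothetical helper, not Prowler code):

```python
def ami_display_name(image: dict) -> str:
    """Prefer the AMI's Name when present, else fall back to its ImageId."""
    # describe_images may omit "Name" entirely, so .get with a default avoids KeyError.
    name = image.get("Name", "")
    return name if name else image["ImageId"]
```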

@@ -1,3 +1,4 @@
from prowler.lib.check.models import Severity
from prowler.providers.aws.services.ec2.ec2_service import Instance
from prowler.providers.aws.services.vpc.vpc_service import VpcSubnet
@@ -15,13 +16,13 @@ def get_instance_public_status(
tuple: The status and severity of the instance status.
"""
status = f"Instance {instance.id} has {service} exposed to 0.0.0.0/0 but with no public IP address."
severity = "medium"
severity = Severity.medium
if instance.public_ip:
status = f"Instance {instance.id} has {service} exposed to 0.0.0.0/0 on public IP address {instance.public_ip} but it is placed in a private subnet {instance.subnet_id}."
severity = "high"
severity = Severity.high
if vpc_subnets[instance.subnet_id].public:
status = f"Instance {instance.id} has {service} exposed to 0.0.0.0/0 on public IP address {instance.public_ip} in public subnet {instance.subnet_id}."
severity = "critical"
severity = Severity.critical
return status, severity
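The change above replaces bare severity strings with a `Severity` enum, so a typo fails loudly instead of silently producing an unknown severity. A minimal sketch of the pattern (values assumed lowercase; this is a stand-in, not Prowler's actual class):

```python
from enum import Enum

# Minimal stand-in for Prowler's Severity enum.
class Severity(str, Enum):
    informational = "informational"
    low = "low"
    medium = "medium"
    high = "high"
    critical = "critical"

# Because Severity subclasses str, existing string comparisons keep working,
# while an invalid value like Severity("hgih") raises a ValueError immediately.
```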

@@ -23,7 +23,9 @@
"Url": "https://redis.io/blog/highly-available-in-memory-cloud-datastores/"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -23,7 +23,9 @@
"Url": "https://www.trendmicro.com/cloudoneconformity-staging/knowledge-base/aws/ElastiCache/elasticache-multi-az.html#"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -23,7 +23,9 @@
"Url": "https://docs.aws.amazon.com/elasticloadbalancing/latest/classic/enable-disable-crosszone-lb.html"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -23,7 +23,9 @@
"Url": "https://docs.aws.amazon.com/elasticloadbalancing/latest/application/load-balancer-subnets.html"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -1,7 +1,7 @@
{
"Provider": "aws",
"CheckID": "glue_etl_jobs_logging_enabled",
"CheckTitle": "Check if Glue ETL Jobs have logging enabled.",
"CheckTitle": "[DEPRECATED] Check if Glue ETL Jobs have logging enabled.",
"CheckType": [
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices"
],
@@ -10,7 +10,7 @@
"ResourceIdTemplate": "arn:partition:glue:region:account-id:job/job-name",
"Severity": "medium",
"ResourceType": "AwsGlueJob",
"Description": "Ensure that Glue ETL Jobs have CloudWatch logs enabled.",
"Description": "[DEPRECATED] Ensure that Glue ETL Jobs have CloudWatch logs enabled.",
"Risk": "Without logging enabled, AWS Glue jobs lack visibility into job activities and failures, making it difficult to detect unauthorized access, troubleshoot issues, and ensure compliance. This may result in untracked security incidents or operational issues that affect data processing.",
"RelatedUrl": "https://docs.aws.amazon.com/glue/latest/dg/monitor-continuous-logging.html",
"Remediation": {
@@ -28,5 +28,5 @@
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
"Notes": "This check is being removed since logs for all AWS Glue jobs are now always sent to Amazon CloudWatch."
}

@@ -25,7 +25,9 @@
"Url": "https://docs.aws.amazon.com/amazon-mq/latest/developer-guide/rabbitmq-broker-architecture.html#rabbitmq-broker-architecture-cluster"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -23,7 +23,9 @@
"Url": "https://docs.aws.amazon.com/securityhub/latest/userguide/neptune-controls.html#neptune-9"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -25,7 +25,9 @@
"Url": "https://aws.amazon.com/es/blogs/networking-and-content-delivery/deployment-models-for-aws-network-firewall/"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -23,7 +23,9 @@
"Url": "https://aws.amazon.com/rds/features/multi-az/"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -0,0 +1,32 @@
{
"Provider": "aws",
"CheckID": "rds_cluster_protected_by_backup_plan",
"CheckTitle": "Check if RDS clusters are protected by a backup plan.",
"CheckType": [
"Software and Configuration Checks, AWS Security Best Practices"
],
"ServiceName": "rds",
"SubServiceName": "",
"ResourceIdTemplate": "arn:aws:rds:region:account-id:db-cluster",
"Severity": "medium",
"ResourceType": "AwsRdsDbInstance",
"Description": "Check if RDS clusters are protected by a backup plan.",
"Risk": "Without a backup plan, RDS clusters are vulnerable to data loss, accidental deletion, or corruption. This could lead to significant operational disruptions or loss of critical data.",
"RelatedUrl": "https://docs.aws.amazon.com/aws-backup/latest/devguide/assigning-resources.html",
"Remediation": {
"Code": {
"CLI": "aws backup create-backup-plan --backup-plan , aws backup tag-resource --resource-arn <rds-cluster-arn> --tags Key=backup,Value=true",
"NativeIaC": "",
"Other": "https://docs.aws.amazon.com/securityhub/latest/userguide/rds-controls.html#rds-26",
"Terraform": ""
},
"Recommendation": {
"Text": "Create a backup plan for the RDS cluster to protect it from data loss, accidental deletion, or corruption.",
"Url": "https://docs.aws.amazon.com/aws-backup/latest/devguide/assigning-resources.html"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
}

@@ -0,0 +1,33 @@
from prowler.lib.check.models import Check, Check_Report_AWS
from prowler.providers.aws.services.backup.backup_client import backup_client
from prowler.providers.aws.services.rds.rds_client import rds_client
class rds_cluster_protected_by_backup_plan(Check):
def execute(self):
findings = []
for db_cluster_arn, db_cluster in rds_client.db_clusters.items():
report = Check_Report_AWS(self.metadata())
report.region = db_cluster.region
report.resource_id = db_cluster.id
report.resource_arn = db_cluster_arn
report.resource_tags = db_cluster.tags
report.status = "FAIL"
report.status_extended = (
f"RDS Cluster {db_cluster.id} is not protected by a backup plan."
)
if (
db_cluster_arn in backup_client.protected_resources
or f"arn:{rds_client.audited_partition}:rds:*:*:cluster:*"
in backup_client.protected_resources
or "*" in backup_client.protected_resources
):
report.status = "PASS"
report.status_extended = (
f"RDS Cluster {db_cluster.id} is protected by a backup plan."
)
findings.append(report)
return findings
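The new check above treats a cluster as protected if AWS Backup covers its exact ARN, a partition-wide RDS-cluster wildcard, or the global `*` selector. That matching step can be sketched as:

```python
def cluster_is_protected(cluster_arn: str, protected: set[str], partition: str = "aws") -> bool:
    """Mirror the check's logic: exact ARN, RDS-cluster wildcard, or global '*'."""
    return (
        cluster_arn in protected
        or f"arn:{partition}:rds:*:*:cluster:*" in protected
        or "*" in protected
    )
```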

@@ -23,7 +23,9 @@
"Url": "https://aws.amazon.com/rds/features/multi-az/"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -25,7 +25,9 @@
"Url": "https://docs.aws.amazon.com/vpc/latest/userguide/configure-subnets.html"
}
},
"Categories": [],
"Categories": [
"redundancy"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -99,7 +99,7 @@ class WAFv2(AWSService):
def _list_resources_for_web_acl(self, acl):
logger.info("WAFv2 - Describing resources...")
try:
if acl.scope == Scope.REGIONAL or acl.region in self.regional_clients:
if acl.scope == Scope.REGIONAL:
for resource in self.regional_clients[
acl.region
].list_resources_for_web_acl(

@@ -161,54 +161,54 @@ class Provider(ABC):
if not isinstance(Provider._global, provider_class):
if "aws" in provider_class_name.lower():
provider_class(
arguments.aws_retries_max_attempts,
arguments.role,
arguments.session_duration,
arguments.external_id,
arguments.role_session_name,
arguments.mfa,
arguments.profile,
set(arguments.region) if arguments.region else None,
arguments.organizations_role,
arguments.scan_unused_services,
arguments.resource_tag,
arguments.resource_arn,
arguments.config_file,
arguments.mutelist_file,
retries_max_attempts=arguments.aws_retries_max_attempts,
role_arn=arguments.role,
session_duration=arguments.session_duration,
external_id=arguments.external_id,
role_session_name=arguments.role_session_name,
mfa=arguments.mfa,
profile=arguments.profile,
regions=set(arguments.region) if arguments.region else None,
organizations_role_arn=arguments.organizations_role,
scan_unused_services=arguments.scan_unused_services,
resource_tags=arguments.resource_tag,
resource_arn=arguments.resource_arn,
config_path=arguments.config_file,
mutelist_path=arguments.mutelist_file,
fixer_config=fixer_config,
)
elif "azure" in provider_class_name.lower():
provider_class(
arguments.az_cli_auth,
arguments.sp_env_auth,
arguments.browser_auth,
arguments.managed_identity_auth,
arguments.tenant_id,
arguments.azure_region,
arguments.subscription_id,
arguments.config_file,
arguments.mutelist_file,
az_cli_auth=arguments.az_cli_auth,
sp_env_auth=arguments.sp_env_auth,
browser_auth=arguments.browser_auth,
managed_identity_auth=arguments.managed_identity_auth,
tenant_id=arguments.tenant_id,
region=arguments.azure_region,
subscription_ids=arguments.subscription_id,
config_path=arguments.config_file,
mutelist_path=arguments.mutelist_file,
fixer_config=fixer_config,
)
elif "gcp" in provider_class_name.lower():
provider_class(
arguments.organization_id,
arguments.project_id,
arguments.excluded_project_id,
arguments.credentials_file,
arguments.impersonate_service_account,
arguments.list_project_id,
arguments.config_file,
arguments.mutelist_file,
organization_id=arguments.organization_id,
project_ids=arguments.project_id,
excluded_project_ids=arguments.excluded_project_id,
credentials_file=arguments.credentials_file,
impersonate_service_account=arguments.impersonate_service_account,
list_project_ids=arguments.list_project_id,
config_path=arguments.config_file,
mutelist_path=arguments.mutelist_file,
fixer_config=fixer_config,
)
elif "kubernetes" in provider_class_name.lower():
provider_class(
arguments.kubeconfig_file,
arguments.context,
arguments.namespace,
arguments.config_file,
arguments.mutelist_file,
kubeconfig_file=arguments.kubeconfig_file,
context=arguments.context,
namespace=arguments.namespace,
config_path=arguments.config_file,
mutelist_path=arguments.mutelist_file,
fixer_config=fixer_config,
)
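The refactor above switches provider construction from positional to keyword arguments, so call sites stay correct even if the constructor's parameter order changes. A toy illustration of the same idea (names hypothetical):

```python
def make_provider(*, role_arn=None, session_duration=3600, mfa=False):
    """Keyword-only parameters make every call site self-documenting."""
    return {"role_arn": role_arn, "session_duration": session_duration, "mfa": mfa}

# A positional call like make_provider("arn:...", 900) now fails immediately,
# instead of silently binding values to the wrong parameters.
provider = make_provider(role_arn="arn:aws:iam::123456789012:role/audit", mfa=True)
```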

@@ -15,8 +15,8 @@ class GCPBaseException(ProwlerException):
"remediation": "Check the HTTP error and ensure the request is properly formatted.",
},
(3002, "GCPNoAccesibleProjectsError"): {
"message": "No Project IDs can be accessed via Google Credentials",
"remediation": "Ensure the project is accessible and properly set up.",
"message": "No Project IDs are active or can be accessed via Google Credentials",
"remediation": "Ensure the project is active and accessible.",
},
(3003, "GCPSetUpSessionError"): {
"message": "Error setting up session",

@@ -105,22 +105,28 @@ class GcpProvider(Provider):
file=__file__,
message="No Project IDs can be accessed via Google Credentials.",
)
if project_ids:
if self._default_project_id not in project_ids:
self._default_project_id = project_ids[0]
for input_project in project_ids:
for accessible_project in accessible_projects:
if self.is_project_matching(input_project, accessible_project):
self._projects[accessible_project] = accessible_projects[
accessible_project
]
self._project_ids.append(
accessible_projects[accessible_project].id
)
for (
accessible_project_id,
accessible_project,
) in accessible_projects.items():
# Only scan active projects
if accessible_project.lifecycle_state == "ACTIVE":
if self.is_project_matching(
input_project, accessible_project_id
):
self._projects[accessible_project_id] = accessible_project
self._project_ids.append(accessible_project_id)
else:
# If not projects were input, all accessible projects are scanned by default
for project_id, project in accessible_projects.items():
self._projects[project_id] = project
self._project_ids.append(project_id)
# Only scan active projects
if project.lifecycle_state == "ACTIVE":
self._projects[project_id] = project
self._project_ids.append(project_id)
# Remove excluded projects if any input
if excluded_project_ids:
@@ -134,11 +140,11 @@ class GcpProvider(Provider):
if not self._projects:
logger.critical(
"No Input Project IDs can be accessed via Google Credentials."
"No Input Project IDs are active or can be accessed via Google Credentials."
)
raise GCPNoAccesibleProjectsError(
file=__file__,
message="No Input Project IDs can be accessed via Google Credentials.",
message="No Input Project IDs are active or can be accessed via Google Credentials.",
)
if list_project_ids:
@@ -411,7 +417,7 @@ class GcpProvider(Provider):
@staticmethod
def get_projects(
credentials: Credentials, organization_id: str
credentials: Credentials, organization_id: str = None
) -> dict[str, GCPProject]:
"""
Get the projects accessible by the provided credentials. If an organization ID is provided, only the projects under that organization are returned.
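The fix above restricts scanning to projects whose `lifecycle_state` is `ACTIVE`, both for explicitly listed projects and for the default all-accessible case. The filtering step can be sketched as (the `GCPProject` dataclass is a simplified stand-in for Prowler's model):

```python
from dataclasses import dataclass

@dataclass
class GCPProject:  # simplified stand-in for Prowler's project model
    id: str
    lifecycle_state: str

def active_projects(accessible: dict[str, GCPProject]) -> dict[str, GCPProject]:
    """Keep only projects whose lifecycle_state is ACTIVE, as the fix above does."""
    return {
        project_id: project
        for project_id, project in accessible.items()
        if project.lifecycle_state == "ACTIVE"
    }
```

Projects pending deletion (`DELETE_REQUESTED`) are thus skipped instead of producing scan errors.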

@@ -34,7 +34,7 @@ class IAM(GCPService):
ServiceAccount(
name=account["name"],
email=account["email"],
display_name=account["displayName"],
display_name=account.get("displayName", ""),
project_id=project_id,
)
)

@@ -10,18 +10,38 @@ class core_seccomp_profile_docker_default(Check):
report.namespace = pod.namespace
report.resource_name = pod.name
report.resource_id = pod.uid
-            if (
+            pod_seccomp_correct = (
                pod.security_context
                and pod.security_context.seccomp_profile
                and pod.security_context.seccomp_profile.type == "RuntimeDefault"
-            ):
+            )
+            containers_seccomp_correct = True
+            # Check container-level seccomp profile
+            for container in pod.containers.values():
+                if not (
+                    container.security_context
+                    and container.security_context.seccomp_profile
+                    and container.security_context.seccomp_profile.type
+                    == "RuntimeDefault"
+                ):
+                    containers_seccomp_correct = False
+                    break
+            # Determine the report status
+            if pod_seccomp_correct or containers_seccomp_correct:
                report.status = "PASS"
-                report.status_extended = (
-                    f"Pod {pod.name} has docker/default seccomp profile enabled."
-                )
+                report.status_extended = f"Pod {pod.name} and its containers have docker/default seccomp profile enabled."
            else:
                report.status = "FAIL"
-                report.status_extended = f"Pod {pod.name} does not have docker/default seccomp profile enabled."
+                if not pod_seccomp_correct and not containers_seccomp_correct:
+                    report.status_extended = f"Pod {pod.name} does not have docker/default seccomp profile enabled at both pod and container levels."
+                elif not pod_seccomp_correct:
+                    report.status_extended = f"Pod {pod.name} does not have docker/default seccomp profile enabled at pod level."
+                else:
+                    report.status_extended = f"Pod {pod.name} does not have docker/default seccomp profile enabled at container level."
findings.append(report)
return findings
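The new logic passes a pod when either the pod-level profile or every container-level profile is `RuntimeDefault`. A standalone sketch of that decision, with simplified profile values instead of Prowler's Kubernetes models:

```python
def seccomp_ok(profile_type):
    # docker/default maps to RuntimeDefault in current Kubernetes
    return profile_type == "RuntimeDefault"

def pod_passes(pod_profile, container_profiles):
    # PASS if the pod sets RuntimeDefault, or every container does
    pod_correct = seccomp_ok(pod_profile)
    containers_correct = all(seccomp_ok(p) for p in container_profiles)
    return pod_correct or containers_correct

print(pod_passes("RuntimeDefault", [None]))        # True: pod-level profile covers all containers
print(pod_passes(None, ["RuntimeDefault"]))        # True: every container sets its own profile
print(pod_passes(None, ["RuntimeDefault", None]))  # False: one container is uncovered
```

Note that `all()` over an empty sequence is `True`, which mirrors the check's `containers_seccomp_correct = True` initial value: a pod with no failing containers keeps the flag set.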


@@ -1,4 +1,4 @@
-def is_rule_allowing_permisions(rules, resources, verbs):
+def is_rule_allowing_permissions(rules, resources, verbs):
"""
Check Kubernetes role permissions.
@@ -17,6 +17,9 @@ def is_rule_allowing_permisions(rules, resources, verbs):
if rules:
# Iterate through each rule in the list of rules
for rule in rules:
# Ensure apiGroups are relevant ("" or "v1" for secrets)
if rule.apiGroups and all(api not in ["", "v1"] for api in rule.apiGroups):
continue # Skip rules with unrelated apiGroups
# Check if the rule has resources, verbs, and matches any of the specified resources and verbs
if (
rule.resources
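The added guard skips rules whose `apiGroups` list contains neither `""` nor `"v1"`, since core resources such as secrets live in the core (empty-string) API group. The condition in isolation, with `api_groups` as a plain list standing in for `rule.apiGroups`:

```python
def rule_is_relevant(api_groups):
    # A rule with no apiGroups, or one listing "" or "v1",
    # can still target core resources; anything else is skipped.
    if api_groups and all(api not in ["", "v1"] for api in api_groups):
        return False  # every listed group is unrelated to core resources
    return True

print(rule_is_relevant(["apps"]))        # False: only non-core groups
print(rule_is_relevant([""]))            # True: core API group
print(rule_is_relevant(None))            # True: no apiGroups restriction
print(rule_is_relevant(["v1", "apps"]))  # True: at least one relevant group
```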


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -22,7 +22,7 @@ class rbac_minimize_csr_approval_access(Check):
report.status_extended = f"User or group '{subject.name}' does not have access to update the CSR approval sub-resource."
for cr in rbac_client.cluster_roles.values():
if cr.metadata.name == crb.roleRef.name:
-                    if is_rule_allowing_permisions(
+                    if is_rule_allowing_permissions(
cr.rules,
resources,
verbs,


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -22,7 +22,7 @@ class rbac_minimize_node_proxy_subresource_access(Check):
report.status_extended = f"User or group '{subject.name}' does not have access to the node proxy sub-resource."
for cr in rbac_client.cluster_roles.values():
if cr.metadata.name == crb.roleRef.name:
-                    if is_rule_allowing_permisions(cr.rules, resources, verbs):
+                    if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = f"User or group '{subject.name}' has access to the node proxy sub-resource."
break


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -21,7 +21,7 @@ class rbac_minimize_pod_creation_access(Check):
report.status_extended = (
f"ClusterRole {cr.metadata.name} does not have pod create access."
)
-            if is_rule_allowing_permisions(cr.rules, resources, verbs):
+            if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = (
f"ClusterRole {cr.metadata.name} has pod create access."
@@ -39,7 +39,7 @@ class rbac_minimize_pod_creation_access(Check):
f"Role {role.metadata.name} does not have pod create access."
)
-            if is_rule_allowing_permisions(role.rules, resources, verbs):
+            if is_rule_allowing_permissions(role.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = (
f"Role {role.metadata.name} has pod create access."


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -23,7 +23,7 @@ class rbac_minimize_pv_creation_access(Check):
report.status_extended = f"User or group '{subject.name}' does not have access to create PersistentVolumes."
for cr in rbac_client.cluster_roles.values():
if cr.metadata.name == crb.roleRef.name:
-                    if is_rule_allowing_permisions(cr.rules, resources, verbs):
+                    if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = f"User or group '{subject.name}' has access to create PersistentVolumes."
break


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -21,7 +21,7 @@ class rbac_minimize_secret_access(Check):
report.status_extended = (
f"ClusterRole {cr.metadata.name} does not have secret access."
)
-            if is_rule_allowing_permisions(cr.rules, resources, verbs):
+            if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = (
f"ClusterRole {cr.metadata.name} has secret access."
@@ -39,7 +39,7 @@ class rbac_minimize_secret_access(Check):
f"Role {role.metadata.name} does not have secret access."
)
-            if is_rule_allowing_permisions(cr.rules, resources, verbs):
+            if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = f"Role {role.metadata.name} has secret access."
findings.append(report)


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -22,7 +22,7 @@ class rbac_minimize_service_account_token_creation(Check):
report.status_extended = f"User or group '{subject.name}' does not have access to create service account tokens."
for cr in rbac_client.cluster_roles.values():
if cr.metadata.name == crb.roleRef.name:
-                    if is_rule_allowing_permisions(cr.rules, resources, verbs):
+                    if is_rule_allowing_permissions(cr.rules, resources, verbs):
report.status = "FAIL"
report.status_extended = f"User or group '{subject.name}' has access to create service account tokens."
break


@@ -1,6 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Kubernetes
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
-    is_rule_allowing_permisions,
+    is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_client import rbac_client
@@ -25,7 +25,7 @@ class rbac_minimize_webhook_config_access(Check):
report.status_extended = f"User or group '{subject.name}' does not have access to create, update, or delete webhook configurations."
for cr in rbac_client.cluster_roles.values():
if cr.metadata.name == crb.roleRef.name:
-                    if is_rule_allowing_permisions(
+                    if is_rule_allowing_permissions(
cr.rules,
resources,
verbs,


@@ -23,7 +23,7 @@ packages = [
{include = "dashboard"}
]
readme = "README.md"
-version = "4.5.0"
+version = "4.5.4"
[tool.poetry.dependencies]
alive-progress = "3.1.5"
@@ -62,7 +62,6 @@ microsoft-kiota-abstractions = "1.3.3"
msgraph-sdk = "1.8.0"
numpy = "2.0.2"
pandas = "2.2.3"
presidio-analyzer = "2.2.355"
py-ocsf-models = "0.2.0"
pydantic = "1.10.18"
python = ">=3.9,<3.13"


@@ -33,16 +33,6 @@ old_config_aws = {
"ec2_allowed_instance_owners": ["amazon-elb"],
"trusted_account_ids": [],
"log_group_retention_days": 365,
"critical_pii_entities": [
"CREDIT_CARD", # Credit card numbers are highly sensitive financial information.
"CRYPTO", # Crypto wallet numbers (e.g., Bitcoin addresses) can give access to cryptocurrency.
"IBAN_CODE", # International Bank Account Numbers are critical financial information.
"US_BANK_NUMBER", # US bank account numbers are sensitive and should be protected.
"US_SSN", # US Social Security Numbers are critical PII used for identity verification.
"US_PASSPORT", # US passport numbers can be used for identity theft.
"US_ITIN", # US Individual Taxpayer Identification Numbers are sensitive personal identifiers.
],
"pii_language": "en", # Language for recognizing PII entities
"max_idle_disconnect_timeout_in_seconds": 600,
"max_disconnect_timeout_in_seconds": 300,
"max_session_duration_seconds": 36000,
@@ -107,16 +97,6 @@ config_aws = {
"fargate_windows_latest_version": "1.0.0",
"trusted_account_ids": [],
"log_group_retention_days": 365,
"critical_pii_entities": [
"CREDIT_CARD", # Credit card numbers are highly sensitive financial information.
"CRYPTO", # Crypto wallet numbers (e.g., Bitcoin addresses) can give access to cryptocurrency.
"IBAN_CODE", # International Bank Account Numbers are critical financial information.
"US_BANK_NUMBER", # US bank account numbers are sensitive and should be protected.
"US_SSN", # US Social Security Numbers are critical PII used for identity verification.
"US_PASSPORT", # US passport numbers can be used for identity theft.
"US_ITIN", # US Individual Taxpayer Identification Numbers are sensitive personal identifiers.
],
"pii_language": "en", # Language for recognizing PII entities
"max_idle_disconnect_timeout_in_seconds": 600,
"max_disconnect_timeout_in_seconds": 300,
"max_session_duration_seconds": 36000,


@@ -72,31 +72,6 @@ aws:
# AWS Cloudwatch Configuration
# aws.cloudwatch_log_group_retention_policy_specific_days_enabled --> by default is 365 days
log_group_retention_days: 365
# aws.cloudwatch_log_group_no_critical_pii_in_logs --> see all available entities in https://microsoft.github.io/presidio/supported_entities/
critical_pii_entities : [
"CREDIT_CARD", # Credit card numbers are highly sensitive financial information.
"CRYPTO", # Crypto wallet numbers (e.g., Bitcoin addresses) can give access to cryptocurrency.
"IBAN_CODE", # International Bank Account Numbers are critical financial information.
"US_BANK_NUMBER", # US bank account numbers are sensitive and should be protected.
"US_SSN", # US Social Security Numbers are critical PII used for identity verification.
"US_PASSPORT", # US passport numbers can be used for identity theft.
"US_ITIN", # US Individual Taxpayer Identification Numbers are sensitive personal identifiers.
#"UK_NHS", # UK NHS numbers can be used to access medical records and other private information.
#"ES_NIF", # Spanish NIF (Personal tax ID) is critical for identification and tax purposes.
#"ES_NIE", # Spanish NIE (Foreigners ID card) is a critical identifier for foreign residents.
#"IT_FISCAL_CODE", # Italian personal identification code is sensitive PII for tax and legal purposes.
#"IT_PASSPORT", # Italian passport numbers are critical PII.
#"IT_IDENTITY_CARD", # Italian identity card numbers are critical for personal identification.
#"PL_PESEL", # Polish PESEL numbers are sensitive personal identifiers.
#"SG_NRIC_FIN", # Singapore National Registration Identification Card is critical PII.
#"AU_ABN", # Australian Business Numbers are critical for business identification.
#"AU_TFN", # Australian Tax File Numbers are sensitive and used for taxation purposes.
#"AU_MEDICARE", # Australian Medicare numbers are sensitive medical identifiers.
#"IN_PAN", # Indian Permanent Account Numbers are critical for tax purposes and identity.
#"IN_AADHAAR", # Indian Aadhaar numbers are highly sensitive and serve as a universal identity number.
#"FI_PERSONAL_IDENTITY_CODE" # Finnish Personal Identity Code is sensitive PII for personal identification.
]
pii_language: "en" # Language for recognizing PII entities
# AWS AppStream Session Configuration
# aws.appstream_fleet_session_idle_disconnect_timeout


@@ -28,31 +28,6 @@ trusted_account_ids: []
# AWS Cloudwatch Configuration
# aws.cloudwatch_log_group_retention_policy_specific_days_enabled --> by default is 365 days
log_group_retention_days: 365
# aws.cloudwatch_log_group_no_critical_pii_in_logs --> see all available entities in https://microsoft.github.io/presidio/supported_entities/
critical_pii_entities : [
"CREDIT_CARD", # Credit card numbers are highly sensitive financial information.
"CRYPTO", # Crypto wallet numbers (e.g., Bitcoin addresses) can give access to cryptocurrency.
"IBAN_CODE", # International Bank Account Numbers are critical financial information.
"US_BANK_NUMBER", # US bank account numbers are sensitive and should be protected.
"US_SSN", # US Social Security Numbers are critical PII used for identity verification.
"US_PASSPORT", # US passport numbers can be used for identity theft.
"US_ITIN", # US Individual Taxpayer Identification Numbers are sensitive personal identifiers.
#"UK_NHS", # UK NHS numbers can be used to access medical records and other private information.
#"ES_NIF", # Spanish NIF (Personal tax ID) is critical for identification and tax purposes.
#"ES_NIE", # Spanish NIE (Foreigners ID card) is a critical identifier for foreign residents.
#"IT_FISCAL_CODE", # Italian personal identification code is sensitive PII for tax and legal purposes.
#"IT_PASSPORT", # Italian passport numbers are critical PII.
#"IT_IDENTITY_CARD", # Italian identity card numbers are critical for personal identification.
#"PL_PESEL", # Polish PESEL numbers are sensitive personal identifiers.
#"SG_NRIC_FIN", # Singapore National Registration Identification Card is critical PII.
#"AU_ABN", # Australian Business Numbers are critical for business identification.
#"AU_TFN", # Australian Tax File Numbers are sensitive and used for taxation purposes.
#"AU_MEDICARE", # Australian Medicare numbers are sensitive medical identifiers.
#"IN_PAN", # Indian Permanent Account Numbers are critical for tax purposes and identity.
#"IN_AADHAAR", # Indian Aadhaar numbers are highly sensitive and serve as a universal identity number.
#"FI_PERSONAL_IDENTITY_CODE" # Finnish Personal Identity Code is sensitive PII for personal identification.
]
pii_language: "en" # Language for recognizing PII entities
# AWS AppStream Session Configuration
# aws.appstream_fleet_session_idle_disconnect_timeout


@@ -1443,6 +1443,18 @@ aws:
)
assert connection.error.code == 1015
@mock_aws
def test_test_connection_generic_exception(self):
with patch(
"prowler.providers.aws.aws_provider.AwsProvider.setup_session",
side_effect=Exception(),
):
connection = AwsProvider.test_connection(raise_on_exception=False)
assert isinstance(connection, Connection)
assert not connection.is_connected
assert isinstance(connection.error, Exception)
@mock_aws
def test_create_sts_session(self):
current_session = session.Session()


@@ -1,206 +0,0 @@
from re import search
from unittest import mock
from boto3 import client
from moto import mock_aws
from moto.core.utils import unix_time_millis
from tests.providers.aws.utils import (
AWS_REGION_EU_WEST_1,
AWS_REGION_US_EAST_1,
set_mocked_aws_provider,
)
class Test_cloudwatch_log_group_no_critical_pii_in_logs:
def test_cloudwatch_no_log_groups(self):
from prowler.providers.aws.services.cloudwatch.cloudwatch_service import Logs
aws_provider = set_mocked_aws_provider(
[AWS_REGION_EU_WEST_1, AWS_REGION_US_EAST_1]
)
from prowler.providers.common.models import Audit_Metadata
aws_provider.audit_metadata = Audit_Metadata(
services_scanned=0,
# We need to set this check to call _describe_log_groups
expected_checks=["cloudwatch_log_group_no_critical_pii_in_logs"],
completed_checks=0,
audit_progress=0,
)
with mock.patch(
"prowler.providers.common.provider.Provider.get_global_provider",
return_value=aws_provider,
), mock.patch(
"prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs.logs_client",
new=Logs(aws_provider),
):
# Test Check
from prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs import (
cloudwatch_log_group_no_critical_pii_in_logs,
)
check = cloudwatch_log_group_no_critical_pii_in_logs()
result = check.execute()
assert len(result) == 0
@mock_aws
def test_cloudwatch_log_group_without_pii(self):
# Generate Logs Client
logs_client = client("logs", region_name=AWS_REGION_US_EAST_1)
# Request Logs group
logs_client.create_log_group(logGroupName="test", tags={"test": "test"})
logs_client.create_log_stream(logGroupName="test", logStreamName="test stream")
logs_client.put_log_events(
logGroupName="test",
logStreamName="test stream",
logEvents=[
{
"timestamp": int(unix_time_millis()),
"message": "non sensitive message",
}
],
)
from prowler.providers.aws.services.cloudwatch.cloudwatch_service import Logs
aws_provider = set_mocked_aws_provider(
[AWS_REGION_EU_WEST_1, AWS_REGION_US_EAST_1]
)
from prowler.providers.common.models import Audit_Metadata
aws_provider.audit_metadata = Audit_Metadata(
services_scanned=0,
# We need to set this check to call _describe_log_groups
expected_checks=["cloudwatch_log_group_no_critical_pii_in_logs"],
completed_checks=0,
audit_progress=0,
)
with mock.patch(
"prowler.providers.common.provider.Provider.get_global_provider",
return_value=aws_provider,
), mock.patch(
"prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs.logs_client",
new=Logs(aws_provider),
):
# Test Check
from prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs import (
cloudwatch_log_group_no_critical_pii_in_logs,
)
check = cloudwatch_log_group_no_critical_pii_in_logs()
result = check.execute()
assert len(result) == 1
assert result[0].status == "PASS"
assert (
result[0].status_extended == "No critical PII found in test log group."
)
assert result[0].resource_id == "test"
assert (
result[0].resource_arn
== f"arn:aws:logs:{AWS_REGION_US_EAST_1}:123456789012:log-group:test"
)
assert result[0].region == AWS_REGION_US_EAST_1
assert result[0].resource_tags == [{"test": "test"}]
@mock_aws
def test_cloudwatch_log_group_with_pii(self):
# Generate Logs Client
logs_client = client("logs", region_name=AWS_REGION_US_EAST_1)
# Request Logs group
logs_client.create_log_group(logGroupName="test", tags={"test": "test"})
logs_client.create_log_stream(logGroupName="test", logStreamName="test stream")
logs_client.put_log_events(
logGroupName="test",
logStreamName="test stream",
logEvents=[
{
"timestamp": int(unix_time_millis()),
"message": "Credit Card Number = 374752619478856",
}
],
)
from prowler.providers.aws.services.cloudwatch.cloudwatch_service import Logs
aws_provider = set_mocked_aws_provider(
[AWS_REGION_EU_WEST_1, AWS_REGION_US_EAST_1]
)
from prowler.providers.common.models import Audit_Metadata
aws_provider.audit_metadata = Audit_Metadata(
services_scanned=0,
# We need to set this check to call _describe_log_groups
expected_checks=["cloudwatch_log_group_no_critical_pii_in_logs"],
completed_checks=0,
audit_progress=0,
)
with mock.patch(
"prowler.providers.common.provider.Provider.get_global_provider",
return_value=aws_provider,
), mock.patch(
"prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs.logs_client",
new=Logs(aws_provider),
):
# Test Check
from prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs import (
cloudwatch_log_group_no_critical_pii_in_logs,
)
check = cloudwatch_log_group_no_critical_pii_in_logs()
result = check.execute()
assert len(result) == 1
assert result[0].status == "FAIL"
assert search(
"Potential critical PII found in log group", result[0].status_extended
)
assert result[0].resource_id == "test"
assert (
result[0].resource_arn
== f"arn:aws:logs:{AWS_REGION_US_EAST_1}:123456789012:log-group:test"
)
assert result[0].region == AWS_REGION_US_EAST_1
assert result[0].resource_tags == [{"test": "test"}]
@mock_aws
def test_access_denied(self):
from prowler.providers.aws.services.cloudwatch.cloudwatch_service import Logs
aws_provider = set_mocked_aws_provider(
[AWS_REGION_EU_WEST_1, AWS_REGION_US_EAST_1]
)
from prowler.providers.common.models import Audit_Metadata
aws_provider.audit_metadata = Audit_Metadata(
services_scanned=0,
# We need to set this check to call _describe_log_groups
expected_checks=["cloudwatch_log_group_no_critical_pii_in_logs"],
completed_checks=0,
audit_progress=0,
)
with mock.patch(
"prowler.providers.common.provider.Provider.get_global_provider",
return_value=aws_provider,
), mock.patch(
"prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs.logs_client",
new=Logs(aws_provider),
) as logs_client:
# Test Check
from prowler.providers.aws.services.cloudwatch.cloudwatch_log_group_no_critical_pii_in_logs.cloudwatch_log_group_no_critical_pii_in_logs import (
cloudwatch_log_group_no_critical_pii_in_logs,
)
logs_client.log_groups = None
check = cloudwatch_log_group_no_critical_pii_in_logs()
result = check.execute()
assert len(result) == 0


@@ -72,7 +72,7 @@ class Test_ec2_ami_public:
assert len(result) == 1
assert result[0].status == "PASS"
-        assert result[0].status_extended == f"EC2 AMI {image_id} is not public."
+        assert result[0].status_extended == "EC2 AMI test-ami is not public."
assert result[0].resource_id == image_id
assert (
result[0].resource_arn
@@ -124,9 +124,7 @@ class Test_ec2_ami_public:
assert len(result) == 1
assert result[0].status == "FAIL"
-        assert (
-            result[0].status_extended == f"EC2 AMI {image_id} is currently public."
-        )
+        assert result[0].status_extended == "EC2 AMI test-ami is currently public."
assert result[0].resource_id == image_id
assert (
result[0].resource_arn


@@ -59,11 +59,9 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
result = check.execute()
# One default sg per region, each port should be checked
-        expected_findings_count = (
-            len(ec2_client.audit_config["ec2_high_risk_ports"]) * 3
-        )  # 3 security groups
-        assert len(result) == expected_findings_count
+        assert (
+            len(result) == 3
+        )  # 3 security groups one default sg per region and one added
# Search changed sg
for sg in result:
# All are compliant by default
@@ -132,21 +130,17 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
result = check.execute()
# One default sg per region, each port should be checked
-        expected_findings_count = (
-            len(ec2_client.audit_config["ec2_high_risk_ports"]) * 3
-        )  # 3 security groups
-        assert len(result) == expected_findings_count
+        assert (
+            len(result) == 3
+        )  # 3 security groups one default sg per region and one added
# Search changed sg
-        iterator = 0
        for sg in result:
-            port = ec2_client.audit_config["ec2_high_risk_ports"][iterator]
if sg.resource_id == default_sg_id:
assert sg.status == "FAIL"
assert sg.region == AWS_REGION_US_EAST_1
assert (
sg.status_extended
-                    == f"Security group {default_sg_name} ({default_sg_id}) has port {port} (high risk port) open to the Internet."
+                    == f"Security group {default_sg_name} ({default_sg_id}) has the following high-risk ports open to the Internet: 25, 110, 135, 143, 445, 3000, 4333, 5000, 5500, 8080, 8088."
)
assert (
sg.resource_arn
@@ -154,9 +148,6 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
assert sg.resource_details == default_sg_name
assert sg.resource_tags == []
-            iterator = (iterator + 1) % len(
-                ec2_client.audit_config["ec2_high_risk_ports"]
-            )
@mock_aws
def test_ec2_compliant_default_sg(self):
@@ -222,21 +213,17 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
result = check.execute()
# One default sg per region, each port should be checked
-        expected_findings_count = (
-            len(ec2_client.audit_config["ec2_high_risk_ports"]) * 3
-        )  # 3 security groups
-        assert len(result) == expected_findings_count
+        assert (
+            len(result) == 3
+        )  # 3 security groups one default sg per region and one added
# Search changed sg
-        iterator = 0
        for sg in result:
-            port = ec2_client.audit_config["ec2_high_risk_ports"][iterator]
if sg.resource_id == default_sg_id:
assert sg.status == "PASS"
assert sg.region == AWS_REGION_US_EAST_1
assert (
sg.status_extended
-                    == f"Security group {default_sg_name} ({default_sg_id}) does not have port {port} open to the Internet."
+                    == f"Security group {default_sg_name} ({default_sg_id}) does not have any high-risk port open to the Internet."
)
assert (
sg.resource_arn
@@ -244,9 +231,6 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
assert sg.resource_details == default_sg_name
assert sg.resource_tags == []
-            iterator = (iterator + 1) % len(
-                ec2_client.audit_config["ec2_high_risk_ports"]
-            )
@mock_aws
def test_ec2_default_sgs_ignoring(self):
@@ -350,10 +334,7 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
result = check.execute()
-        expected_findings_count = len(
-            ec2_client.audit_config["ec2_high_risk_ports"]
-        )  # 1 security group in use
-        assert len(result) == expected_findings_count
+        assert len(result) == 1  # 1 security group added
for sg in result:
assert sg.status == "PASS"
assert sg.region == AWS_REGION_US_EAST_1
@@ -452,24 +433,21 @@ class Test_ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports:
)
result_specific_port = check_specific_port.execute()
# One default sg per region, each port should be checked
-        expected_findings_count = (
-            len(ec2_client.audit_config["ec2_high_risk_ports"]) * 3
-        )  # 2 default security groups + 1 added
-        assert len(result_specific_port) == expected_findings_count
+        assert (
+            len(result_specific_port) == 3
+        )  # 3 security groups one default sg per region and one added
# Search changed sg
-        iterator = 0
        for sg in result_specific_port:
-            port = ec2_client.audit_config["ec2_high_risk_ports"][iterator]
if sg.resource_id == default_sg_id:
assert sg.status == "PASS"
assert sg.region == AWS_REGION_US_EAST_1
assert (
sg.status_extended
-                    == f"Security group {sg.resource_details} ({sg.resource_id}) has all ports open to the Internet and therefore was not checked against port {port}."
+                    == f"Security group {default_sg_name} ({default_sg_id}) has all ports open to the Internet and therefore was not checked against high-risk ports."
)
assert (
sg.resource_arn
== f"arn:{aws_provider.identity.partition}:ec2:{AWS_REGION_US_EAST_1}:{aws_provider.identity.account}:security-group/{default_sg_id}"
)
assert sg.resource_tags == []
assert sg.resource_details == default_sg_name
-            iterator = (iterator + 1) % len(
-                ec2_client.audit_config["ec2_high_risk_ports"]
-            )
-            assert sg.resource_tags == []
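The updated tests expect one finding per security group, with every open high-risk port listed in a single status message, instead of one finding per configured port. The aggregation behind the new expected message can be sketched as follows (a simplified stand-in, not the check's actual code):

```python
# Default high-risk port list from the AWS audit config shown above
high_risk_ports = [25, 110, 135, 143, 445, 3000, 4333, 5000, 5500, 8080, 8088]

def status_for_group(name, sg_id, open_ports):
    # Collapse all open high-risk ports into one message per security group
    exposed = [p for p in high_risk_ports if p in open_ports]
    if exposed:
        ports = ", ".join(str(p) for p in exposed)
        return f"Security group {name} ({sg_id}) has the following high-risk ports open to the Internet: {ports}."
    return f"Security group {name} ({sg_id}) does not have any high-risk port open to the Internet."

print(status_for_group("default", "sg-1", {25, 8080}))
```

This is why the iterator bookkeeping over `ec2_high_risk_ports` disappears from the tests: the result count now depends only on the number of security groups.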


@@ -1,35 +1,93 @@
from unittest import mock
from uuid import uuid4
import botocore
import botocore.client
from moto import mock_aws
from tests.providers.aws.utils import (
AWS_ACCOUNT_ARN,
AWS_ACCOUNT_NUMBER,
AWS_REGION_EU_WEST_1,
set_mocked_aws_provider,
)
DETECTOR_ID = str(uuid4())
DETECTOR_ARN = f"arn:aws:guardduty:{AWS_REGION_EU_WEST_1}:{AWS_ACCOUNT_NUMBER}:detector/{DETECTOR_ID}"
mock_make_api_call = botocore.client.BaseClient._make_api_call
def mock_make_api_call_create_detector_success(self, operation_name, kwarg):
if operation_name == "CreateDetector":
return {"DetectorId": DETECTOR_ID}
elif operation_name == "GetDetector":
return {"Status": "ENABLED"}
return mock_make_api_call(self, operation_name, kwarg)
def mock_make_api_call_create_detector_failure(self, operation_name, kwarg):
if operation_name == "CreateDetector":
raise botocore.exceptions.ClientError(
{
"Error": {
"Code": "AccessDeniedException",
"Message": "User: arn:aws:iam::012345678901:user/test is not authorized to perform: guardduty:CreateDetector",
}
},
"CreateDetector",
)
return mock_make_api_call(self, operation_name, kwarg)
class Test_guardduty_is_enabled_fixer:
@mock_aws
def test_guardduty_is_enabled_fixer(self):
-        regional_client = mock.MagicMock()
-        guardduty_client = mock.MagicMock
-        guardduty_client.region = AWS_REGION_EU_WEST_1
-        guardduty_client.detectors = []
-        guardduty_client.audited_account_arn = AWS_ACCOUNT_ARN
-        regional_client.create_detector.return_value = None
-        guardduty_client.regional_clients = {AWS_REGION_EU_WEST_1: regional_client}
        with mock.patch(
-            "prowler.providers.aws.services.guardduty.guardduty_service.GuardDuty",
-            guardduty_client,
+            "botocore.client.BaseClient._make_api_call",
+            new=mock_make_api_call_create_detector_success,
        ):
-            from prowler.providers.aws.services.guardduty.guardduty_is_enabled.guardduty_is_enabled_fixer import (
-                fixer,
+            from prowler.providers.aws.services.guardduty.guardduty_service import (
+                GuardDuty,
            )
-            assert fixer(AWS_REGION_EU_WEST_1)
+            aws_provider = set_mocked_aws_provider([AWS_REGION_EU_WEST_1])
+            with mock.patch(
+                "prowler.providers.common.provider.Provider.get_global_provider",
+                return_value=aws_provider,
+            ), mock.patch(
+                "prowler.providers.aws.services.guardduty.guardduty_is_enabled.guardduty_is_enabled_fixer.guardduty_client",
+                new=GuardDuty(aws_provider),
+            ):
+                from prowler.providers.aws.services.guardduty.guardduty_is_enabled.guardduty_is_enabled_fixer import (
+                    fixer,
+                )
+                assert fixer(AWS_REGION_EU_WEST_1)
@mock_aws
def test_guardduty_is_enabled_fixer_failure(self):
with mock.patch(
"botocore.client.BaseClient._make_api_call",
new=mock_make_api_call_create_detector_failure,
):
from prowler.providers.aws.services.guardduty.guardduty_service import (
GuardDuty,
)
aws_provider = set_mocked_aws_provider([AWS_REGION_EU_WEST_1])
with mock.patch(
"prowler.providers.common.provider.Provider.get_global_provider",
return_value=aws_provider,
), mock.patch(
"prowler.providers.aws.services.guardduty.guardduty_is_enabled.guardduty_is_enabled_fixer.guardduty_client",
new=GuardDuty(aws_provider),
):
from prowler.providers.aws.services.guardduty.guardduty_is_enabled.guardduty_is_enabled_fixer import (
fixer,
)
assert not fixer(AWS_REGION_EU_WEST_1)


@@ -0,0 +1,416 @@
from unittest import mock

from moto import mock_aws

from tests.providers.aws.utils import (
    AWS_ACCOUNT_NUMBER,
    AWS_REGION_US_EAST_1,
    set_mocked_aws_provider,
)


class Test_rds_cluster_protected_by_backup_plan:
    @mock_aws
    def test_rds_no_clusters(self):
        from prowler.providers.aws.services.backup.backup_service import Backup
        from prowler.providers.aws.services.rds.rds_service import RDS

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=RDS(aws_provider),
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=Backup(aws_provider),
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 0
    @mock_aws
    def test_rds_cluster_no_existing_backup_plans(self):
        cluster = mock.MagicMock()
        backup = mock.MagicMock()

        from prowler.providers.aws.services.rds.rds_service import DBCluster

        arn = f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
        cluster.db_clusters = {
            arn: DBCluster(
                id="db-cluster-1",
                arn=f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1",
                endpoint="db-cluster-1.c9akciq32.rds.amazonaws.com",
                backtrack=1,
                parameter_group="test",
                engine_version="13.3",
                status="available",
                public=False,
                encrypted=True,
                deletion_protection=False,
                auto_minor_version_upgrade=True,
                multi_az=False,
                username="admin",
                iam_auth=False,
                name="db-cluster-1",
                region="us-east-1",
                cluster_class="db.m1.small",
                engine="aurora-postgres",
                allocated_storage=10,
                tags=[],
            )
        }

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_client.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=backup,
            ), mock.patch(
                "prowler.providers.aws.services.backup.backup_client.backup_client",
                new=backup,
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 1
                assert result[0].status == "FAIL"
                assert (
                    result[0].status_extended
                    == "RDS Cluster db-cluster-1 is not protected by a backup plan."
                )
                assert result[0].resource_id == "db-cluster-1"
                assert result[0].region == AWS_REGION_US_EAST_1
                assert (
                    result[0].resource_arn
                    == f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
                )
                assert result[0].resource_tags == []

    def test_rds_cluster_without_backup_plan(self):
        cluster = mock.MagicMock()
        backup = mock.MagicMock()

        from prowler.providers.aws.services.rds.rds_service import DBCluster

        arn = f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
        cluster.db_clusters = {
            arn: DBCluster(
                id="db-cluster-1",
                arn=f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1",
                endpoint="db-cluster-1.c9akciq32.rds.amazonaws.com",
                backtrack=1,
                parameter_group="test",
                engine_version="13.3",
                status="available",
                public=False,
                encrypted=True,
                deletion_protection=False,
                auto_minor_version_upgrade=True,
                multi_az=False,
                username="admin",
                iam_auth=False,
                name="db-cluster-1",
                region="us-east-1",
                cluster_class="db.m1.small",
                engine="aurora-postgres",
                allocated_storage=10,
                tags=[],
            )
        }
        backup.protected_resources = [
            f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-master-2"
        ]

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_client.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=backup,
            ), mock.patch(
                "prowler.providers.aws.services.backup.backup_client.backup_client",
                new=backup,
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 1
                assert result[0].status == "FAIL"
                assert (
                    result[0].status_extended
                    == "RDS Cluster db-cluster-1 is not protected by a backup plan."
                )
                assert result[0].resource_id == "db-cluster-1"
                assert result[0].region == AWS_REGION_US_EAST_1
                assert (
                    result[0].resource_arn
                    == f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
                )
                assert result[0].resource_tags == []
    def test_rds_cluster_with_backup_plan(self):
        cluster = mock.MagicMock()

        from prowler.providers.aws.services.rds.rds_service import DBCluster

        arn = f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
        cluster.db_clusters = {
            arn: DBCluster(
                id="db-cluster-1",
                arn=f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1",
                endpoint="db-cluster-1.c9akciq32.rds.amazonaws.com",
                backtrack=1,
                parameter_group="test",
                engine_version="13.3",
                status="available",
                public=False,
                encrypted=True,
                deletion_protection=False,
                auto_minor_version_upgrade=True,
                multi_az=False,
                username="admin",
                iam_auth=False,
                name="db-cluster-1",
                region="us-east-1",
                cluster_class="db.m1.small",
                engine="aurora-postgres",
                allocated_storage=10,
                tags=[],
            )
        }
        backup = mock.MagicMock()
        backup.protected_resources = [arn]

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_client.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=backup,
            ), mock.patch(
                "prowler.providers.aws.services.backup.backup_client.backup_client",
                new=backup,
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 1
                assert result[0].status == "PASS"
                assert (
                    result[0].status_extended
                    == "RDS Cluster db-cluster-1 is protected by a backup plan."
                )
                assert result[0].resource_id == "db-cluster-1"
                assert result[0].region == AWS_REGION_US_EAST_1
                assert (
                    result[0].resource_arn
                    == f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
                )
                assert result[0].resource_tags == []

    def test_rds_cluster_with_backup_plan_via_cluster_wildcard(self):
        cluster = mock.MagicMock()
        cluster.audited_partition = "aws"

        from prowler.providers.aws.services.rds.rds_service import DBCluster

        arn = "arn:aws:rds:*:*:cluster:*"
        cluster.db_clusters = {
            f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1": DBCluster(
                id="db-cluster-1",
                arn=f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1",
                endpoint="db-cluster-1.c9akciq32.rds.amazonaws.com",
                backtrack=1,
                parameter_group="test",
                engine_version="13.3",
                status="available",
                public=False,
                encrypted=True,
                deletion_protection=False,
                auto_minor_version_upgrade=True,
                multi_az=False,
                username="admin",
                iam_auth=False,
                name="db-cluster-1",
                region="us-east-1",
                cluster_class="db.m1.small",
                engine="aurora-postgres",
                allocated_storage=10,
                tags=[],
            )
        }
        backup = mock.MagicMock()
        backup.protected_resources = [arn]

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_client.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=backup,
            ), mock.patch(
                "prowler.providers.aws.services.backup.backup_client.backup_client",
                new=backup,
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 1
                assert result[0].status == "PASS"
                assert (
                    result[0].status_extended
                    == "RDS Cluster db-cluster-1 is protected by a backup plan."
                )
                assert result[0].resource_id == "db-cluster-1"
                assert result[0].region == AWS_REGION_US_EAST_1
                assert (
                    result[0].resource_arn
                    == f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
                )
                assert result[0].resource_tags == []

    def test_rds_cluster_with_backup_plan_via_all_wildcard(self):
        cluster = mock.MagicMock()

        from prowler.providers.aws.services.rds.rds_service import DBCluster

        arn = "*"
        cluster.db_clusters = {
            f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1": DBCluster(
                id="db-cluster-1",
                arn=f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1",
                endpoint="db-cluster-1.c9akciq32.rds.amazonaws.com",
                backtrack=1,
                parameter_group="test",
                engine_version="13.3",
                status="available",
                public=False,
                encrypted=True,
                deletion_protection=False,
                auto_minor_version_upgrade=True,
                multi_az=False,
                username="admin",
                iam_auth=False,
                name="db-cluster-1",
                region="us-east-1",
                cluster_class="db.m1.small",
                engine="aurora-postgres",
                allocated_storage=10,
                tags=[],
            )
        }
        backup = mock.MagicMock()
        backup.protected_resources = [arn]

        aws_provider = set_mocked_aws_provider([AWS_REGION_US_EAST_1])

        with mock.patch(
            "prowler.providers.common.provider.Provider.get_global_provider",
            return_value=aws_provider,
        ):
            with mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_client.rds_client",
                new=cluster,
            ), mock.patch(
                "prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan.backup_client",
                new=backup,
            ), mock.patch(
                "prowler.providers.aws.services.backup.backup_client.backup_client",
                new=backup,
            ):
                # Test Check
                from prowler.providers.aws.services.rds.rds_cluster_protected_by_backup_plan.rds_cluster_protected_by_backup_plan import (
                    rds_cluster_protected_by_backup_plan,
                )

                check = rds_cluster_protected_by_backup_plan()
                result = check.execute()

                assert len(result) == 1
                assert result[0].status == "PASS"
                assert (
                    result[0].status_extended
                    == "RDS Cluster db-cluster-1 is protected by a backup plan."
                )
                assert result[0].resource_id == "db-cluster-1"
                assert result[0].region == AWS_REGION_US_EAST_1
                assert (
                    result[0].resource_arn
                    == f"arn:aws:rds:{AWS_REGION_US_EAST_1}:{AWS_ACCOUNT_NUMBER}:db:db-cluster-1"
                )
                assert result[0].resource_tags == []
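The three PASS cases above show what the check treats as "protected": the cluster's exact ARN in AWS Backup's protected resources, the partition-wide RDS cluster wildcard (`arn:aws:rds:*:*:cluster:*`), or the bare `*`. A minimal standalone sketch of that matching rule, derived only from the behavior these tests assert (the helper name is hypothetical, not Prowler's actual implementation):

```python
def is_cluster_protected(
    cluster_arn: str, partition: str, protected_resources: list
) -> bool:
    # A cluster counts as protected when AWS Backup reports its exact ARN,
    # the all-resources wildcard "*", or the partition-wide RDS cluster
    # wildcard (e.g. "arn:aws:rds:*:*:cluster:*").
    cluster_wildcard = f"arn:{partition}:rds:*:*:cluster:*"
    return (
        cluster_arn in protected_resources
        or "*" in protected_resources
        or cluster_wildcard in protected_resources
    )


arn = "arn:aws:rds:us-east-1:123456789012:db:db-cluster-1"
assert is_cluster_protected(arn, "aws", [arn])
assert is_cluster_protected(arn, "aws", ["arn:aws:rds:*:*:cluster:*"])
assert is_cluster_protected(arn, "aws", ["*"])
assert not is_cluster_protected(arn, "aws", [])
```

Note the wildcard entries are matched as literal strings in Backup's protected-resources list, not glob-expanded against each ARN.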


@@ -14,6 +14,7 @@ from prowler.config.config import (
from prowler.providers.common.models import Connection
from prowler.providers.gcp.exceptions.exceptions import (
GCPInvalidProviderIdError,
GCPNoAccesibleProjectsError,
GCPTestConnectionError,
)
from prowler.providers.gcp.gcp_provider import GcpProvider
@@ -40,7 +41,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -109,7 +110,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -183,7 +184,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -247,7 +248,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -319,7 +320,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
organization=GCPOrganization(
id="test-organization-id",
name="test-organization",
@@ -369,6 +370,69 @@ class TestGCPProvider:
== "test-organization-id"
)
    def test_setup_session_with_inactive_project(self):
        mocked_credentials = MagicMock()
        mocked_credentials.refresh.return_value = None
        mocked_credentials._service_account_email = "test-service-account-email"

        arguments = Namespace()
        arguments.project_id = ["project/55555555"]
        arguments.excluded_project_id = []
        arguments.organization_id = None
        arguments.list_project_id = False
        arguments.credentials_file = "test_credentials_file"
        arguments.impersonate_service_account = ""
        arguments.config_file = default_config_file_path
        arguments.fixer_config = default_fixer_config_file_path

        projects = {
            "test-project": GCPProject(
                number="55555555",
                id="project/55555555",
                name="test-project",
                labels={"test": "value"},
                lifecycle_state="DELETE_REQUESTED",
            )
        }

        mocked_service = MagicMock()
        mocked_service.projects.list.return_value = MagicMock(
            execute=MagicMock(return_value={"projects": projects})
        )

        with patch(
            "prowler.providers.gcp.gcp_provider.GcpProvider.get_projects",
            return_value=projects,
        ), patch(
            "prowler.providers.gcp.gcp_provider.GcpProvider.update_projects_with_organizations",
            return_value=None,
        ), patch(
            "os.path.abspath",
            return_value="test_credentials_file",
        ), patch(
            "prowler.providers.gcp.gcp_provider.default",
            return_value=(mocked_credentials, MagicMock()),
        ), patch(
            "prowler.providers.gcp.gcp_provider.discovery.build",
            return_value=mocked_service,
        ):
            with pytest.raises(Exception) as e:
                GcpProvider(
                    arguments.organization_id,
                    arguments.project_id,
                    arguments.excluded_project_id,
                    arguments.credentials_file,
                    arguments.impersonate_service_account,
                    arguments.list_project_id,
                    arguments.config_file,
                    arguments.fixer_config,
                    client_id=None,
                    client_secret=None,
                    refresh_token=None,
                )
            assert e.type == GCPNoAccesibleProjectsError
def test_print_credentials_default_options(self, capsys):
mocked_credentials = MagicMock()
@@ -391,7 +455,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -462,7 +526,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
)
}
@@ -533,14 +597,14 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
),
"test-excluded-project": GCPProject(
number="12345678",
id="project/12345678",
name="test-excluded-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
),
}
@@ -658,7 +722,7 @@ class TestGCPProvider:
id="project/55555555",
name="test-project",
labels={"test": "value"},
lifecycle_state="",
lifecycle_state="ACTIVE",
),
}
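The GCP changes above (fix #5752, "scan only ACTIVE projects") update every fixture to `lifecycle_state="ACTIVE"` and add a test expecting `GCPNoAccesibleProjectsError` when the only project is in `DELETE_REQUESTED`. A self-contained sketch of the filtering behavior those tests describe (the filter function and its error message are illustrative; the real exception lives in `prowler.providers.gcp.exceptions`):

```python
from types import SimpleNamespace


class GCPNoAccesibleProjectsError(Exception):
    """Raised when no ACTIVE project is left to scan (stand-in for Prowler's exception)."""


def filter_active_projects(projects: dict) -> dict:
    # Keep only projects whose lifecycle state is ACTIVE; projects pending
    # deletion (e.g. DELETE_REQUESTED) cannot be scanned. Abort if none remain.
    active = {
        project_id: project
        for project_id, project in projects.items()
        if project.lifecycle_state == "ACTIVE"
    }
    if not active:
        raise GCPNoAccesibleProjectsError("No accessible ACTIVE projects found.")
    return active


projects = {
    "test-project": SimpleNamespace(lifecycle_state="ACTIVE"),
    "doomed-project": SimpleNamespace(lifecycle_state="DELETE_REQUESTED"),
}
assert list(filter_active_projects(projects)) == ["test-project"]
```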


@@ -1,11 +1,11 @@
from prowler.providers.kubernetes.services.rbac.lib.role_permissions import (
is_rule_allowing_permisions,
is_rule_allowing_permissions,
)
from prowler.providers.kubernetes.services.rbac.rbac_service import Rule
class TestCheckRolePermissions:
def test_is_rule_allowing_permisions(self):
def test_is_rule_allowing_permissions(self):
# Define some sample rules, resources, and verbs for testing
rules = [
# Rule 1: Allows 'get' and 'list' on 'pods' and 'services'
@@ -16,7 +16,7 @@ class TestCheckRolePermissions:
resources = ["pods", "deployments"]
verbs = ["get", "create"]
assert is_rule_allowing_permisions(rules, resources, verbs)
assert is_rule_allowing_permissions(rules, resources, verbs)
def test_no_permissions(self):
# Test when there are no rules
@@ -24,7 +24,7 @@ class TestCheckRolePermissions:
resources = ["pods", "deployments"]
verbs = ["get", "create"]
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
def test_no_matching_rules(self):
# Test when there are rules, but none match the specified resources and verbs
@@ -35,7 +35,7 @@ class TestCheckRolePermissions:
resources = ["deployments", "configmaps"]
verbs = ["get", "create"]
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
def test_empty_rules(self):
# Test when the rules list is empty
@@ -43,7 +43,7 @@ class TestCheckRolePermissions:
resources = ["pods", "deployments"]
verbs = ["get", "create"]
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
def test_empty_resources_and_verbs(self):
# Test when resources and verbs are empty lists
@@ -54,7 +54,7 @@ class TestCheckRolePermissions:
resources = []
verbs = []
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
def test_matching_rule_with_empty_resources_or_verbs(self):
# Test when a rule matches, but either resources or verbs are empty
@@ -65,9 +65,31 @@ class TestCheckRolePermissions:
resources = []
verbs = ["get"]
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
resources = ["pods"]
verbs = []
assert not is_rule_allowing_permisions(rules, resources, verbs)
assert not is_rule_allowing_permissions(rules, resources, verbs)
    def test_rule_with_ignored_api_groups(self):
        # Test when a rule has apiGroups that are not relevant
        rules = [
            Rule(resources=["pods"], verbs=["get"], apiGroups=["test"]),
            Rule(resources=["services"], verbs=["list"], apiGroups=["test2"]),
        ]
        resources = ["pods"]
        verbs = ["get"]
        assert not is_rule_allowing_permissions(rules, resources, verbs)

    def test_rule_with_relevant_api_groups(self):
        # Test when a rule has apiGroups that are relevant
        rules = [
            Rule(resources=["pods"], verbs=["get"], apiGroups=["", "v1"]),
            Rule(resources=["services"], verbs=["list"], apiGroups=["test2"]),
        ]
        resources = ["pods"]
        verbs = ["get"]
        assert is_rule_allowing_permissions(rules, resources, verbs)
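The two new apiGroups tests pin down the filtering from fix #5845: a rule whose `apiGroups` contains only custom groups (`["test"]`) no longer grants the permission, while one including the core group (`""`) does. A simplified standalone sketch consistent with all the assertions in this file (the `Rule` dataclass and core-group-only rule are assumptions for illustration; the real helper in `prowler.providers.kubernetes.services.rbac.lib.role_permissions` may also honor `*` groups):

```python
from dataclasses import dataclass, field


@dataclass
class Rule:
    resources: list
    verbs: list
    # Default to the core API group, matching rules written without apiGroups.
    apiGroups: list = field(default_factory=lambda: [""])


def is_rule_allowing_permissions(rules, resources, verbs):
    # Empty requested resources or verbs can never be granted.
    if not resources or not verbs:
        return False
    for rule in rules:
        # Skip rules scoped to irrelevant (non-core) API groups.
        if "" not in rule.apiGroups:
            continue
        # A rule matches if it covers any requested resource and any requested verb.
        if any(r in rule.resources for r in resources) and any(
            v in rule.verbs for v in verbs
        ):
            return True
    return False
```

Usage mirrors the tests: `is_rule_allowing_permissions([Rule(resources=["pods"], verbs=["get"], apiGroups=["test"])], ["pods"], ["get"])` is `False`, but with `apiGroups=["", "v1"]` it is `True`.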