Compare commits


10 Commits

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| César Arroba | 09b04f6c9f | fix(celery): resolve changelog merge conflict with master | 2026-04-07 12:14:02 +02:00 |
| César Arroba | 049a7703a5 | fix(celery): add changelog entry for Redis broker keepalive fix | 2026-04-07 12:05:49 +02:00 |
| Andoni Alonso | ca03d9c0a9 | docs: add Google Workspace SAML SSO configuration guide (#10564) (co-authored by Alan Buscaglia and Adrián Jesús Peña Rodríguez) | 2026-04-07 12:03:21 +02:00 |
| César Arroba | 2413626242 | fix(celery): add Redis broker keepalive to prevent workers stalling after long tasks | 2026-04-07 12:01:46 +02:00 |
| Kay Agahd | 8985280621 | fix(azure): create distinct report per key/secret in keyvault checks (#10332) (co-authored by Hugo Pereira Brito) | 2026-04-07 09:36:48 +01:00 |
| Pepe Fagoaga | b7ee2b9690 | chore: rename UI tab regarding the environment (#10588) | 2026-04-07 10:30:01 +02:00 |
| Alejandro Bailo | 6b2d9b5580 | feat(ui): add Vercel provider (#10191) (co-authored by Daniel Barranquero) | 2026-04-07 10:13:18 +02:00 |
| kaiisfree | c99ed991b7 | fix: show all checks including threat-detection in --list-checks (#10578) (co-authored by Claude, kaiisfree, and Hugo Pereira Brito) | 2026-04-06 16:55:15 +01:00 |
| Hugo Pereira Brito | 7c0034524a | fix(sdk): add missing __init__.py for codebuild GitHub orgs check (#10584) | 2026-04-06 16:40:04 +01:00 |
| Josema Camacho | 749110de75 | chore(sdk): bump cryptography to 46.0.6, oci to 2.169.0, and alibabacloud-tea-openapi to 0.4.4 (#10535) | 2026-04-06 15:09:33 +02:00 |
230 changed files with 1559 additions and 484 deletions


@@ -1,33 +0,0 @@
-name: 'Tools: Check Test Init Files'
-on:
-  pull_request:
-    branches:
-      - 'master'
-      - 'v5.*'
-concurrency:
-  group: ${{ github.workflow }}-${{ github.ref }}
-  cancel-in-progress: true
-jobs:
-  check-test-init-files:
-    if: github.repository == 'prowler-cloud/prowler'
-    runs-on: ubuntu-latest
-    timeout-minutes: 10
-    permissions:
-      contents: read
-    steps:
-      - name: Harden the runner (Audit all outbound calls)
-        uses: step-security/harden-runner@fa2e9d605c4eeb9fcad4c99c224cee0c6c7f3594 # v2.16.0
-        with:
-          egress-policy: audit
-      - name: Checkout repository
-        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
-        with:
-          persist-credentials: false
-      - name: Check for __init__.py files in test directories
-        run: python3 scripts/check_test_init_files.py .
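The `scripts/check_test_init_files.py` helper invoked by this workflow is not part of this diff. As a hedged sketch of what such a check typically does (the function name and matching rules below are assumptions, not the actual script):

```python
from pathlib import Path


def find_test_dirs_missing_init(root: str) -> list[Path]:
    """Return directories named 'tests' under root that contain test
    modules but lack an __init__.py (which breaks test discovery in
    some layouts)."""
    missing = []
    for d in Path(root).rglob("tests"):
        if not d.is_dir():
            continue
        # Consider a directory a test package if it holds test modules.
        has_tests = any(d.glob("test_*.py")) or any(d.glob("*_test.py"))
        if has_tests and not (d / "__init__.py").exists():
            missing.append(d)
    return missing

# The real script would print each offending directory and exit
# non-zero so the CI job fails.
```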


@@ -3,7 +3,7 @@
   <img align="center" src="https://github.com/prowler-cloud/prowler/blob/master/docs/img/prowler-logo-white.png#gh-dark-mode-only" width="50%" height="50%">
 </p>
 <p align="center">
-<b><i>Prowler</b> is the Open Cloud Security platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
+<b><i>Prowler</b> is the Open Cloud Security Platform trusted by thousands to automate security and compliance in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
 </p>
 <p align="center">
 <b>Secure ANY cloud at AI Speed at <a href="https://prowler.com">prowler.com</i></b>
@@ -41,7 +41,7 @@
 # Description
-**Prowler** is the worlds most widely used _open-source cloud security platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
+**Prowler** is the worlds most widely used _Open-Source Cloud Security Platform_ that automates security and compliance across **any cloud environment**. With hundreds of ready-to-use security checks, remediation guidance, and compliance frameworks, Prowler is built to _“Secure ANY Cloud at AI Speed”_. Prowler delivers **AI-driven**, **customizable**, and **easy-to-use** assessments, dashboards, reports, and integrations, making cloud security **simple**, **scalable**, and **cost-effective** for organizations of any size.
 Prowler includes hundreds of built-in controls to ensure compliance with standards and frameworks, including:


@@ -28,6 +28,7 @@ All notable changes to the **Prowler API** are documented in this file.
 - Membership `post_delete` signal using raw FK ids to avoid `DoesNotExist` during cascade deletions [(#10497)](https://github.com/prowler-cloud/prowler/pull/10497)
 - Finding group resources endpoints returning false 404 when filters match no results, and `sort` parameter being ignored [(#10510)](https://github.com/prowler-cloud/prowler/pull/10510)
 - Jira integration failing with `JiraInvalidIssueTypeError` on non-English Jira instances due to hardcoded `"Task"` issue type; now dynamically fetches available issue types per project [(#10534)](https://github.com/prowler-cloud/prowler/pull/10534)
+- Celery workers becoming unresponsive after completing long-running tasks due to stale Redis broker connections, by adding `health_check_interval` and socket keepalive to `broker_transport_options` [(#10592)](https://github.com/prowler-cloud/prowler/pull/10592)
 ### 🔐 Security


@@ -1,7 +1,6 @@
import warnings
from celery import Celery, Task
from config.env import env
# Suppress specific warnings from django-rest-auth: https://github.com/iMerica/dj-rest-auth/issues/684
@@ -17,7 +16,11 @@ celery_app.config_from_object("django.conf:settings", namespace="CELERY")
 celery_app.conf.update(result_extended=True, result_expires=None)
 celery_app.conf.broker_transport_options = {
-    "visibility_timeout": BROKER_VISIBILITY_TIMEOUT
+    "visibility_timeout": BROKER_VISIBILITY_TIMEOUT,
+    "health_check_interval": 30,
+    "socket_keepalive": True,
+    "socket_connect_timeout": 10,
+    "retry_on_timeout": True,
 }
 celery_app.conf.result_backend_transport_options = {
     "visibility_timeout": BROKER_VISIBILITY_TIMEOUT
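Extracted from the hunk above, the new broker options can be read as a small helper; a minimal sketch, assuming these keys are passed straight through to the Redis transport (the function name is illustrative, not from the Prowler codebase):

```python
def broker_transport_options(visibility_timeout: int) -> dict:
    """Transport options mirroring the keepalive fix: probe the Redis
    broker connection periodically and keep the socket alive so idle
    workers do not stall on a dead connection after long tasks."""
    return {
        "visibility_timeout": visibility_timeout,
        "health_check_interval": 30,   # check broker connection health every 30s
        "socket_keepalive": True,      # enable TCP keepalive on broker sockets
        "socket_connect_timeout": 10,  # fail fast when (re)connecting
        "retry_on_timeout": True,      # retry commands that hit a timeout
    }
```

In the config above these land on `celery_app.conf.broker_transport_options`, while the result backend keeps only the shared visibility timeout.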


api/tests/__init__.py (new empty file, 0 lines)


@@ -137,6 +137,7 @@
         "group": "Tutorials",
         "pages": [
           "user-guide/tutorials/prowler-app-sso-entra",
+          "user-guide/tutorials/prowler-app-sso-google-workspace",
           "user-guide/tutorials/bulk-provider-provisioning",
           "user-guide/tutorials/aws-organizations-bulk-provisioning"
         ]

(17 binary image files added, not shown: the screenshot assets referenced by the new Google Workspace SAML guide below, ranging from 18 KiB to 3.1 MiB.)


@@ -0,0 +1,287 @@
---
title: 'SAML SSO: Google Workspace'
---
This page explains how to configure SAML-based Single Sign-On (SSO) in Prowler App using **Google Workspace** as the Identity Provider (IdP). The setup is divided into two parts: create a custom SAML app in Google Admin Console, then complete the configuration in Prowler App.
<Info>
**Parallel Setup Required**
Google Admin Console requires the ACS URL and Entity ID from Prowler App, while Prowler App displays these values only after opening the SAML configuration dialog. To work around this, open Prowler App in a separate browser tab, navigate to the profile page, open the "Configure SAML SSO" dialog, and copy the ACS URL and Entity ID before proceeding with the Google configuration.
</Info>
## Prerequisites
- **Google Workspace**: Super Admin access (or delegated admin with app management permissions).
- **Prowler App**: Administrator access to the organization (role with "Manage Account" permission).
- Prowler App version **5.9.0** or later.
---
## Part A - Google Admin Console
### Step 1: Navigate to Web & Mobile Apps
1. Go to [admin.google.com](https://admin.google.com).
2. In the left sidebar, navigate to **Apps > Web and mobile apps**.
3. Click "Add app", then select "Add custom SAML app".
![Google Admin Console - Web & mobile apps](/images/prowler-app/saml/saml-sso-gw-1.png)
### Step 2: Enter App Details
1. In the **App name** field, enter a name (e.g., `Prowler`).
2. Optionally, add a description (e.g., `Prowler SAML APP`) and upload a logo.
3. Click "Continue".
![Add custom SAML app - App details](/images/prowler-app/saml/saml-sso-gw-2.png)
### Step 3: Download the IdP Metadata
On the **Google Identity Provider details** screen:
1. Google displays two options:
- **Option 1**: Click "Download Metadata" to save the XML file directly. This is the recommended approach.
- **Option 2**: Manually copy the **SSO URL**, **Entity ID**, and **Certificate**.
2. Download the metadata. This file is required to complete the Prowler App configuration in Part B.
3. Click "Continue".
![Google Identity Provider details - Download metadata](/images/prowler-app/saml/saml-sso-gw-3.png)
<Warning>
**Save the Metadata File**
Download and save the IdP metadata XML file before proceeding. This file cannot be easily retrieved later and is required to complete the SAML configuration in Prowler App.
</Warning>
### Step 4: Configure the Service Provider Details
Enter the following values obtained from the SAML SSO configuration dialog in Prowler App (see [Part B, Step 1](#step-1-open-the-saml-configuration-dialog) for details on where to find them):
| Google Workspace Field | Value |
|------------------------|-------|
| **ACS URL** | The Assertion Consumer Service (ACS) URL displayed in Prowler App (e.g., `https://api.prowler.com/api/v1/accounts/saml/your-domain.com/acs/`). Self-hosted deployments use a different base URL. |
| **Entity ID** | The Audience URI displayed in Prowler App (e.g., `urn:prowler.com:sp`). |
| **Name ID format** | Select `EMAIL` from the dropdown. |
| **Name ID** | Select `Basic Information > Primary email` from the dropdown. |
Click "Continue".
![Service provider details - ACS URL, Entity ID, and Name ID configuration](/images/prowler-app/saml/saml-sso-gw-4.png)
### Step 5: Configure Attribute Mapping
To correctly provision users, configure the IdP to send the following attributes in the SAML assertion. The **App Attribute (SAML)** column lists the attribute names that Prowler expects. The **Google Directory Attribute** column shows a recommended source field, but any Google directory attribute can be used as long as it is mapped to the correct Prowler attribute name.
Click "Add mapping" for each entry:
| Google Directory Attribute | App Attribute (SAML) | Required | Notes |
|----------------------------|----------------------|----------|-------|
| `Basic Information > First name` | `firstName` | Yes | |
| `Basic Information > Last name` | `lastName` | Yes | |
| `Employee Details > Department` | `userType` | No | Determines the Prowler role. **Case-sensitive.** |
| `Employee Details > Organization` | `organization` | No | Company name displayed in Prowler App profile. |
<Info>
**Remember the Mapped Fields**
Take note of which Google directory attributes are mapped to each Prowler attribute. To update a user's role or organization in Prowler, modify the corresponding field in the user's Google Workspace profile (e.g., **Department** if mapped to `userType`). Changes propagate to Prowler on the next SAML login.
</Info>
Click "Finish" to create the SAML app.
![Attribute mapping - Google Directory attributes to Prowler SAML attributes](/images/prowler-app/saml/saml-sso-gw-5.png)
<Info>
**Dynamic Updates**
Prowler App updates user attributes each time a user logs in. Any changes made in Google Workspace are reflected on the next login.
</Info>
<Warning>
**Role Assignment via `userType`**
The `userType` attribute controls which Prowler role is assigned to the user:
- If `userType` matches an existing Prowler role name, the user receives that role automatically.
- If `userType` does not match any existing role, Prowler App creates a new role with that name **without permissions**.
- If `userType` is not set, the user receives the `no_permissions` role.
In all cases where the resulting role has no permissions, a Prowler administrator must configure the appropriate permissions through the [RBAC Management](/user-guide/tutorials/prowler-app-rbac) tab. The `userType` value is **case-sensitive** - for example, `Backend` and `backend` are treated as different roles.
</Warning>
### Step 6: Enable the App for Users
By default, newly created SAML apps have user access set to **OFF**. To enable access:
1. Return to **Apps > Web and mobile apps** and select the Prowler SAML app.
2. Click "User access" (or "View details" under the "User access" section).
3. Set the service status to **ON for everyone**, or enable it for specific organizational units or groups.
4. Click "Save".
![Service Status - Set to ON for everyone](/images/prowler-app/saml/saml-sso-gw-17.png)
5. Verify in the apps list that the "User access" column displays **"ON for everyone"**.
![Web & mobile apps list - User access confirmed as "ON for everyone"](/images/prowler-app/saml/saml-sso-gw-19.png)
<Info>
**Propagation Delay**
Changes to the app status can take up to 24 hours to propagate across Google Workspace, although they typically take effect within a few minutes.
</Info>
<Info>
**"Can't Test SAML Login" Error**
If attempting to use the "Test SAML login" option in Google Admin Console and receiving a "Can't test SAML login" message, click "Allow Access" to enable the app for the organizational unit that includes the admin account. This is the same as setting the service status to **ON** as described above.
![Test SAML login - Allow access prompt](/images/prowler-app/saml/saml-sso-gw-15.png)
</Info>
---
## Part B - Prowler App Configuration
### Step 1: Open the SAML Configuration Dialog
1. Navigate to the profile settings page:
- **Prowler Cloud**: `https://cloud.prowler.com/profile`
- **Self-hosted**: `http://{your-domain}/profile`
2. Find the "SAML SSO Integration" card and click "Enable" (or "Update" if already configured).
3. The "Configure SAML SSO" dialog opens, displaying:
- **ACS URL**: The Assertion Consumer Service URL (copy this value for Part A, Step 4). This URL updates dynamically when the email domain is entered.
- **Audience**: The Entity ID (copy this value for Part A, Step 4).
- **Name ID Format**: The expected format (`urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress`).
- **Supported Assertion Attributes**: The list of accepted attributes (`firstName`, `lastName`, `userType`, `organization`).
![Prowler App - Configure SAML SSO dialog (initial state)](/images/prowler-app/saml/saml-sso-gw-prowler-1.png)
### Step 2: Enter the Email Domain and Upload Metadata
1. Enter the **email domain** for the organization (e.g., `prowler.cloud`). Prowler App uses this domain to identify users who should authenticate via SAML. The ACS URL updates automatically to reflect the configured domain.
2. Upload the **metadata XML file** downloaded in Part A, Step 3.
3. Click "Save".
![Prowler App - Configure SAML SSO dialog (domain entered and ready to save)](/images/prowler-app/saml/saml-sso-gw-prowler-2.png)
### Step 3: Verify the Enabled Status
The "SAML SSO Integration" card should now display a **"Status: Enabled"** indicator with a checkmark, confirming that the configuration is complete.
![Prowler App - SAML SSO Integration status showing "Enabled"](/images/prowler-app/saml/saml-sso-gw-prowler-3.png)
---
## Testing the Integration
### Optional: Create a Test User in Google Workspace
To verify the integration without affecting existing users, create a dedicated test user in Google Admin Console:
1. Navigate to **Directory > Users** in Google Admin Console.
2. Click "Add new user".
![Google Admin Console - Users directory](/images/prowler-app/saml/saml-sso-gw-7.png)
3. Fill in the user details (first name, last name, and primary email address in the configured domain).
![Add new user form](/images/prowler-app/saml/saml-sso-gw-8.png)
4. Complete the user creation. Google Workspace generates temporary credentials for the new account.
![User created successfully - Username and temporary password](/images/prowler-app/saml/saml-sso-gw-10.png)
### Optional: Configure User Attributes for Role Mapping
To test the `userType` → role mapping, set the **Department** attribute in the test user's profile. This value is sent as the `userType` SAML attribute based on the mapping configured in Part A, Step 5.
1. In **Directory > Users**, click the test user's name to open the profile.
2. Click "User details", scroll to **Employee information**, and enter a value in the **Department** field (e.g., `Backend`). This value determines the Prowler role assigned to the user.
3. Click "Save".
![User information - Setting Department to "Backend" for userType mapping](/images/prowler-app/saml/saml-sso-gw-13.png)
### SP-Initiated SSO (from Prowler)
1. Navigate to the Prowler login page.
2. Click "Continue with SAML SSO".
3. Enter an email from the configured domain (e.g., `adrian@prowler.cloud`).
4. Click "Log in". The browser redirects to Google for authentication and returns to Prowler App upon success.
![Prowler App - Sign in with SAML SSO](/images/prowler-app/saml/saml-sso-gw-prowler-4.png)
### Verify User Profile and Role Mapping
After a successful SSO login, the user profile in Prowler App reflects the attributes sent by Google Workspace:
- **Name**: Populated from the `firstName` and `lastName` attributes.
- **Role**: Created automatically from the `userType` attribute (e.g., `Backend`). If the role did not exist previously, it is created with no permissions by default.
- **Permissions**: In the screenshot below, the user has no permissions because the `Backend` role did not exist prior to login and was created automatically without any permissions. To resolve this, a Prowler administrator can either:
- Assign the appropriate permissions to the new role via the [RBAC Management](/user-guide/tutorials/prowler-app-rbac) tab.
- Set the `userType` attribute in the IdP to match an existing Prowler role that already has the desired permissions. The updated role is applied on the next SAML login.
For more details on role assignment behavior and attribute mapping, refer to the [SAML SSO Configuration](/user-guide/tutorials/prowler-app-sso#configure-attribute-mapping-in-the-idp) page.
![Prowler App - User profile showing role "Backend" created from userType mapping](/images/prowler-app/saml/saml-sso-gw-prowler-5.png)
### IdP-Initiated SSO (from Google)
1. Sign in to Google Workspace with an account that has access to the Prowler SAML app.
2. Open the Google Workspace app launcher (the grid icon in the top-right corner of any Google page).
3. Click the Prowler app tile.
4. The browser redirects directly to Prowler App, authenticated.
For more information on the SSO login flows, refer to the [SAML SSO Configuration](/user-guide/tutorials/prowler-app-sso#idp-initiated-sso) page.
---
## Troubleshooting
<Warning>
**User Lockout After Misconfiguration**
If SAML is configured with incorrect metadata or an incorrect domain, users who authenticated via SAML cannot fall back to password login. A Prowler administrator must remove the SAML configuration via the API:
```bash
curl -X DELETE 'https://api.prowler.com/api/v1/saml-config' \
  -H 'Authorization: Bearer <ADMIN_TOKEN>' \
  -H 'Accept: application/vnd.api+json'
```
After removal, affected users must reset their password to regain access using standard email and password login. This also applies when SAML is intentionally removed - all SAML-authenticated users need to reset their password. For more details, refer to the [SAML API Reference](/user-guide/tutorials/prowler-app-sso#saml-api-reference). For additional support, contact [Prowler Support](https://docs.prowler.com/user-guide/contact-support).
</Warning>
<Info>
**Email Domain Uniqueness**
Prowler does not allow two tenants to share the same email domain. If the domain is already associated with another tenant, the configuration will fail. This is by design to prevent authentication ambiguity.
</Info>
<Info>
**Just-in-Time Provisioning**
Users who authenticate via SAML for the first time are automatically created in Prowler App. No prior invitation is needed. User attributes (`firstName`, `lastName`, `userType`) are updated on every login from the Google directory.
</Info>
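The just-in-time behavior can be pictured as an upsert keyed on email, with SAML-sourced attributes refreshed on every login; a hedged sketch (the function and the in-memory store are assumptions, not Prowler's actual code):

```python
SAML_ATTRS = ("firstName", "lastName", "userType")


def jit_provision(users: dict, email: str, attrs: dict) -> dict:
    """Create the user record on first SAML login, then overwrite the
    SAML-sourced attributes on every subsequent login so the IdP stays
    the source of truth. Attributes the IdP did not send are left alone."""
    user = users.setdefault(email, {"email": email})  # first login creates the user
    for key in SAML_ATTRS:
        if key in attrs:
            user[key] = attrs[key]                    # refresh on every login
    return user
```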
---
## Quick Summary
1. In **Google Admin Console**, create a custom SAML app using the ACS URL and Entity ID from Prowler App.
2. Configure **attribute mapping**: `firstName`, `lastName`, and optionally `userType` and `organization`.
3. **Download the metadata XML** from Google.
4. **Enable the app** in Google Workspace for the relevant users or groups.
5. In **Prowler App**, enter the email domain, upload the metadata XML, and save.
6. Verify the SAML SSO Integration shows **"Status: Enabled"**.
7. Test login via "Continue with SAML SSO" on the Prowler login page.


@@ -75,7 +75,7 @@ Choose a Method:
 <Info>
 **IdP Configuration**
-The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application. For SSO integration with Azure AD / Entra ID, see our [Entra ID configuration instructions](/user-guide/tutorials/prowler-app-sso-entra).
+The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, Google Workspace, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application. For SSO integration with Azure AD / Entra ID, see our [Entra ID configuration instructions](/user-guide/tutorials/prowler-app-sso-entra). For Google Workspace, see our [Google Workspace configuration instructions](/user-guide/tutorials/prowler-app-sso-google-workspace).
 </Info>
@@ -88,7 +88,7 @@ Choose a Method:
 | `firstName` | The user's first name. | Yes |
 | `lastName` | The user's last name. | Yes |
 | `userType` | Determines which Prowler role the user receives (e.g., `admin`, `auditor`). If a role with that name already exists, the user receives it automatically; if it does not exist, Prowler App creates a new role with that name without permissions. If `userType` is not defined, the user is assigned the `no_permissions` role. Role permissions can be edited in the [RBAC Management tab](/user-guide/tutorials/prowler-app-rbac). | No |
-| `companyName` | The user's company name. This is automatically populated if the IdP sends an `organization` attribute. | No |
+| `organization` | The user's company name. | No |
 <Info>
**IdP Attribute Mapping**

poetry.lock (generated; diff stat: 146)

@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 2.3.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 2.1.4 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -605,21 +605,21 @@ requests = ">=2.21.0,<3.0.0"
[[package]]
name = "alibabacloud-tea-openapi"
-version = "0.4.1"
+version = "0.4.4"
description = "Alibaba Cloud openapi SDK Library for Python"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
-    {file = "alibabacloud_tea_openapi-0.4.1-py3-none-any.whl", hash = "sha256:e46bfa3ca34086d2c357d217a0b7284ecbd4b3bab5c88e075e73aec637b0e4a0"},
-    {file = "alibabacloud_tea_openapi-0.4.1.tar.gz", hash = "sha256:2384b090870fdb089c3c40f3fb8cf0145b8c7d6c14abbac521f86a01abb5edaf"},
+    {file = "alibabacloud_tea_openapi-0.4.4-py3-none-any.whl", hash = "sha256:cea6bc1fe35b0319a8752cb99eb0ecb0dab7ca1a71b99c12970ba0867410995f"},
+    {file = "alibabacloud_tea_openapi-0.4.4.tar.gz", hash = "sha256:1b0917bc03cd49417da64945e92731716d53e2eb8707b235f54e45b7473221ce"},
]
[package.dependencies]
alibabacloud-credentials = ">=1.0.2,<2.0.0"
alibabacloud-gateway-spi = ">=0.0.2,<1.0.0"
alibabacloud-tea-util = ">=0.3.13,<1.0.0"
-cryptography = ">=3.0.0,<45.0.0"
+cryptography = {version = ">=3.0.0,<47.0.0", markers = "python_version >= \"3.8\""}
darabonba-core = ">=1.0.3,<2.0.0"
[[package]]
@@ -1888,7 +1888,6 @@ files = [
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
-markers = {dev = "platform_system == \"Windows\" or sys_platform == \"win32\""}
[[package]]
name = "contextlib2"
@@ -1983,62 +1982,75 @@ toml = ["tomli ; python_full_version <= \"3.11.0a6\""]
[[package]]
name = "cryptography"
-version = "44.0.3"
+version = "46.0.6"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
-python-versions = "!=3.9.0,!=3.9.1,>=3.7"
+python-versions = "!=3.9.0,!=3.9.1,>=3.8"
groups = ["main", "dev"]
files = [
{file = "cryptography-44.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:962bc30480a08d133e631e8dfd4783ab71cc9e33d5d7c1e192f0b7c06397bb88"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffc61e8f3bf5b60346d89cd3d37231019c17a081208dfbbd6e1605ba03fa137"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58968d331425a6f9eedcee087f77fd3c927c88f55368f43ff7e0a19891f2642c"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:e28d62e59a4dbd1d22e747f57d4f00c459af22181f0b2f787ea83f5a876d7c76"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:af653022a0c25ef2e3ffb2c673a50e5a0d02fecc41608f4954176f1933b12359"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:157f1f3b8d941c2bd8f3ffee0af9b049c9665c39d3da9db2dc338feca5e98a43"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:c6cd67722619e4d55fdb42ead64ed8843d64638e9c07f4011163e46bc512cf01"},
{file = "cryptography-44.0.3-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b424563394c369a804ecbee9b06dfb34997f19d00b3518e39f83a5642618397d"},
{file = "cryptography-44.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c91fc8e8fd78af553f98bc7f2a1d8db977334e4eea302a4bfd75b9461c2d8904"},
{file = "cryptography-44.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:25cd194c39fa5a0aa4169125ee27d1172097857b27109a45fadc59653ec06f44"},
{file = "cryptography-44.0.3-cp37-abi3-win32.whl", hash = "sha256:3be3f649d91cb182c3a6bd336de8b61a0a71965bd13d1a04a0e15b39c3d5809d"},
{file = "cryptography-44.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:3883076d5c4cc56dbef0b898a74eb6992fdac29a7b9013870b34efe4ddb39a0d"},
{file = "cryptography-44.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:5639c2b16764c6f76eedf722dbad9a0914960d3489c0cc38694ddf9464f1bb2f"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3ffef566ac88f75967d7abd852ed5f182da252d23fac11b4766da3957766759"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:192ed30fac1728f7587c6f4613c29c584abdc565d7417c13904708db10206645"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:7d5fe7195c27c32a64955740b949070f21cba664604291c298518d2e255931d2"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3f07943aa4d7dad689e3bb1638ddc4944cc5e0921e3c227486daae0e31a05e54"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:cb90f60e03d563ca2445099edf605c16ed1d5b15182d21831f58460c48bffb93"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:ab0b005721cc0039e885ac3503825661bd9810b15d4f374e473f8c89b7d5460c"},
{file = "cryptography-44.0.3-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:3bb0847e6363c037df8f6ede57d88eaf3410ca2267fb12275370a76f85786a6f"},
{file = "cryptography-44.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b0cc66c74c797e1db750aaa842ad5b8b78e14805a9b5d1348dc603612d3e3ff5"},
{file = "cryptography-44.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6866df152b581f9429020320e5eb9794c8780e90f7ccb021940d7f50ee00ae0b"},
{file = "cryptography-44.0.3-cp39-abi3-win32.whl", hash = "sha256:c138abae3a12a94c75c10499f1cbae81294a6f983b3af066390adee73f433028"},
{file = "cryptography-44.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:5d186f32e52e66994dce4f766884bcb9c68b8da62d61d9d215bfe5fb56d21334"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:cad399780053fb383dc067475135e41c9fe7d901a97dd5d9c5dfb5611afc0d7d"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:21a83f6f35b9cc656d71b5de8d519f566df01e660ac2578805ab245ffd8523f8"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fc3c9babc1e1faefd62704bb46a69f359a9819eb0292e40df3fb6e3574715cd4"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:e909df4053064a97f1e6565153ff8bb389af12c5c8d29c343308760890560aff"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:dad80b45c22e05b259e33ddd458e9e2ba099c86ccf4e88db7bbab4b747b18d06"},
{file = "cryptography-44.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:479d92908277bed6e1a1c69b277734a7771c2b78633c224445b5c60a9f4bc1d9"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:896530bc9107b226f265effa7ef3f21270f18a2026bc09fed1ebd7b66ddf6375"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:9b4d4a5dbee05a2c390bf212e78b99434efec37b17a4bff42f50285c5c8c9647"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02f55fb4f8b79c1221b0961488eaae21015b69b210e18c386b69de182ebb1259"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:dd3db61b8fe5be220eee484a17233287d0be6932d056cf5738225b9c05ef4fff"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:978631ec51a6bbc0b7e58f23b68a8ce9e5f09721940933e9c217068388789fe5"},
{file = "cryptography-44.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:5d20cc348cca3a8aa7312f42ab953a56e15323800ca3ab0706b8cd452a3a056c"},
{file = "cryptography-44.0.3.tar.gz", hash = "sha256:fe19d8bc5536a91a24a8133328880a41831b6c5df54599a8417b62fe015d3053"},
{file = "cryptography-46.0.6-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:64235194bad039a10bb6d2d930ab3323baaec67e2ce36215fd0952fad0930ca8"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:26031f1e5ca62fcb9d1fcb34b2b60b390d1aacaa15dc8b895a9ed00968b97b30"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9a693028b9cbe51b5a1136232ee8f2bc242e4e19d456ded3fa7c86e43c713b4a"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:67177e8a9f421aa2d3a170c3e56eca4e0128883cf52a071a7cbf53297f18b175"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:d9528b535a6c4f8ff37847144b8986a9a143585f0540fbcb1a98115b543aa463"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:22259338084d6ae497a19bae5d4c66b7ca1387d3264d1c2c0e72d9e9b6a77b97"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:760997a4b950ff00d418398ad73fbc91aa2894b5c1db7ccb45b4f68b42a63b3c"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:3dfa6567f2e9e4c5dceb8ccb5a708158a2a871052fa75c8b78cb0977063f1507"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:cdcd3edcbc5d55757e5f5f3d330dd00007ae463a7e7aa5bf132d1f22a4b62b19"},
{file = "cryptography-46.0.6-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:d4e4aadb7fc1f88687f47ca20bb7227981b03afaae69287029da08096853b738"},
{file = "cryptography-46.0.6-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:2b417edbe8877cda9022dde3a008e2deb50be9c407eef034aeeb3a8b11d9db3c"},
{file = "cryptography-46.0.6-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:380343e0653b1c9d7e1f55b52aaa2dbb2fdf2730088d48c43ca1c7c0abb7cc2f"},
{file = "cryptography-46.0.6-cp311-abi3-win32.whl", hash = "sha256:bcb87663e1f7b075e48c3be3ecb5f0b46c8fc50b50a97cf264e7f60242dca3f2"},
{file = "cryptography-46.0.6-cp311-abi3-win_amd64.whl", hash = "sha256:6739d56300662c468fddb0e5e291f9b4d084bead381667b9e654c7dd81705124"},
{file = "cryptography-46.0.6-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:2ef9e69886cbb137c2aef9772c2e7138dc581fad4fcbcf13cc181eb5a3ab6275"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7f417f034f91dcec1cb6c5c35b07cdbb2ef262557f701b4ecd803ee8cefed4f4"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d24c13369e856b94892a89ddf70b332e0b70ad4a5c43cf3e9cb71d6d7ffa1f7b"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:aad75154a7ac9039936d50cf431719a2f8d4ed3d3c277ac03f3339ded1a5e707"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:3c21d92ed15e9cfc6eb64c1f5a0326db22ca9c2566ca46d845119b45b4400361"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:4668298aef7cddeaf5c6ecc244c2302a2b8e40f384255505c22875eebb47888b"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:8ce35b77aaf02f3b59c90b2c8a05c73bac12cea5b4e8f3fbece1f5fddea5f0ca"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:c89eb37fae9216985d8734c1afd172ba4927f5a05cfd9bf0e4863c6d5465b013"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:ed418c37d095aeddf5336898a132fba01091f0ac5844e3e8018506f014b6d2c4"},
{file = "cryptography-46.0.6-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:69cf0056d6947edc6e6760e5f17afe4bea06b56a9ac8a06de9d2bd6b532d4f3a"},
{file = "cryptography-46.0.6-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e7304c4f4e9490e11efe56af6713983460ee0780f16c63f219984dab3af9d2d"},
{file = "cryptography-46.0.6-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b928a3ca837c77a10e81a814a693f2295200adb3352395fad024559b7be7a736"},
{file = "cryptography-46.0.6-cp314-cp314t-win32.whl", hash = "sha256:97c8115b27e19e592a05c45d0dd89c57f81f841cc9880e353e0d3bf25b2139ed"},
{file = "cryptography-46.0.6-cp314-cp314t-win_amd64.whl", hash = "sha256:c797e2517cb7880f8297e2c0f43bb910e91381339336f75d2c1c2cbf811b70b4"},
{file = "cryptography-46.0.6-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:12cae594e9473bca1a7aceb90536060643128bb274fcea0fc459ab90f7d1ae7a"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:639301950939d844a9e1c4464d7e07f902fe9a7f6b215bb0d4f28584729935d8"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ed3775295fb91f70b4027aeba878d79b3e55c0b3e97eaa4de71f8f23a9f2eb77"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:8927ccfbe967c7df312ade694f987e7e9e22b2425976ddbf28271d7e58845290"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:b12c6b1e1651e42ab5de8b1e00dc3b6354fdfd778e7fa60541ddacc27cd21410"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:063b67749f338ca9c5a0b7fe438a52c25f9526b851e24e6c9310e7195aad3b4d"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:02fad249cb0e090b574e30b276a3da6a149e04ee2f049725b1f69e7b8351ec70"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:7e6142674f2a9291463e5e150090b95a8519b2fb6e6aaec8917dd8d094ce750d"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:456b3215172aeefb9284550b162801d62f5f264a081049a3e94307fe20792cfa"},
{file = "cryptography-46.0.6-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:341359d6c9e68834e204ceaf25936dffeafea3829ab80e9503860dcc4f4dac58"},
{file = "cryptography-46.0.6-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9a9c42a2723999a710445bc0d974e345c32adfd8d2fac6d8a251fa829ad31cfb"},
{file = "cryptography-46.0.6-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6617f67b1606dfd9fe4dbfa354a9508d4a6d37afe30306fe6c101b7ce3274b72"},
{file = "cryptography-46.0.6-cp38-abi3-win32.whl", hash = "sha256:7f6690b6c55e9c5332c0b59b9c8a3fb232ebf059094c17f9019a51e9827df91c"},
{file = "cryptography-46.0.6-cp38-abi3-win_amd64.whl", hash = "sha256:79e865c642cfc5c0b3eb12af83c35c5aeff4fa5c672dc28c43721c2c9fdd2f0f"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:2ea0f37e9a9cf0df2952893ad145fd9627d326a59daec9b0802480fa3bcd2ead"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a3e84d5ec9ba01f8fd03802b2147ba77f0c8f2617b2aff254cedd551844209c8"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:12f0fa16cc247b13c43d56d7b35287ff1569b5b1f4c5e87e92cc4fcc00cd10c0"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:50575a76e2951fe7dbd1f56d181f8c5ceeeb075e9ff88e7ad997d2f42af06e7b"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:90e5f0a7b3be5f40c3a0a0eafb32c681d8d2c181fc2a1bdabe9b3f611d9f6b1a"},
{file = "cryptography-46.0.6-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6728c49e3b2c180ef26f8e9f0a883a2c585638db64cf265b49c9ba10652d430e"},
{file = "cryptography-46.0.6.tar.gz", hash = "sha256:27550628a518c5c6c903d84f637fbecf287f6cb9ced3804838a1295dc1fd0759"},
]
[package.dependencies]
cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""}
cffi = {version = ">=2.0.0", markers = "python_full_version >= \"3.9.0\" and platform_python_implementation != \"PyPy\""}
typing-extensions = {version = ">=4.13.2", markers = "python_full_version < \"3.11.0\""}
[package.extras]
docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=3.0.0) ; python_version >= \"3.8\""]
docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs", "sphinx-rtd-theme (>=3.0.0)"]
docstest = ["pyenchant (>=3)", "readme-renderer (>=30.0)", "sphinxcontrib-spelling (>=7.3.1)"]
nox = ["nox (>=2024.4.15)", "nox[uv] (>=2024.3.2) ; python_version >= \"3.8\""]
pep8test = ["check-sdist ; python_version >= \"3.8\"", "click (>=8.0.1)", "mypy (>=1.4)", "ruff (>=0.3.6)"]
nox = ["nox[uv] (>=2024.4.15)"]
pep8test = ["check-sdist", "click (>=8.0.1)", "mypy (>=1.14)", "ruff (>=0.11.11)"]
sdist = ["build (>=1.0.0)"]
ssh = ["bcrypt (>=3.1.5)"]
test = ["certifi (>=2024)", "cryptography-vectors (==44.0.3)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test = ["certifi (>=2024)", "cryptography-vectors (==46.0.6)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test-randomorder = ["pytest-randomly"]
[[package]]
@@ -3071,7 +3083,7 @@ files = [
[package.dependencies]
attrs = ">=22.2.0"
jsonschema-specifications = ">=2023.3.6"
jsonschema-specifications = ">=2023.03.6"
referencing = ">=0.28.4"
rpds-py = ">=0.7.1"
@@ -3151,7 +3163,7 @@ files = [
]
[package.dependencies]
certifi = ">=14.5.14"
certifi = ">=14.05.14"
durationpy = ">=0.7"
google-auth = ">=1.0.1"
oauthlib = ">=3.2.2"
@@ -4074,23 +4086,24 @@ signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"]
[[package]]
name = "oci"
version = "2.160.3"
version = "2.169.0"
description = "Oracle Cloud Infrastructure Python SDK"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "oci-2.160.3-py3-none-any.whl", hash = "sha256:858bff3e697098bdda44833d2476bfb4632126f0182178e7dbde4dbd156d71f0"},
{file = "oci-2.160.3.tar.gz", hash = "sha256:57514889be3b713a8385d86e3ba8a33cf46e3563c2a7e29a93027fb30b8a2537"},
{file = "oci-2.169.0-py3-none-any.whl", hash = "sha256:c71bb5143f307791082b3e33cc1545c2490a518cfed85ab1948ef5107c36d30b"},
{file = "oci-2.169.0.tar.gz", hash = "sha256:f3c5fff00b01783b5325ea7b13bf140053ec1e9f41da20bfb9c8a349ee7662fa"},
]
[package.dependencies]
certifi = "*"
circuitbreaker = {version = ">=1.3.1,<3.0.0", markers = "python_version >= \"3.7\""}
cryptography = ">=3.2.1,<46.0.0"
pyOpenSSL = ">=17.5.0,<25.0.0"
cryptography = ">=3.2.1,<47.0.0"
pyOpenSSL = ">=17.5.0,<27.0.0"
python-dateutil = ">=2.5.3,<3.0.0"
pytz = ">=2016.10"
urllib3 = {version = ">=2.6.3", markers = "python_version >= \"3.10.0\""}
[package.extras]
adk = ["docstring-parser (>=0.16) ; python_version >= \"3.10\" and python_version < \"4\"", "mcp (>=1.6.0) ; python_version >= \"3.10\" and python_version < \"4\"", "pydantic (>=2.10.6) ; python_version >= \"3.10\" and python_version < \"4\"", "rich (>=13.9.4) ; python_version >= \"3.10\" and python_version < \"4\""]
@@ -4963,7 +4976,7 @@ files = [
]
[package.dependencies]
astroid = ">=3.3.8,<=3.4.0.dev0"
astroid = ">=3.3.8,<=3.4.0-dev0"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
dill = [
{version = ">=0.2", markers = "python_version < \"3.11\""},
@@ -5024,18 +5037,19 @@ tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "
[[package]]
name = "pyopenssl"
version = "24.3.0"
version = "26.0.0"
description = "Python wrapper module around the OpenSSL library"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyOpenSSL-24.3.0-py3-none-any.whl", hash = "sha256:e474f5a473cd7f92221cc04976e48f4d11502804657a08a989fb3be5514c904a"},
{file = "pyopenssl-24.3.0.tar.gz", hash = "sha256:49f7a019577d834746bc55c5fce6ecbcec0f2b4ec5ce1cf43a9a173b8138bb36"},
{file = "pyopenssl-26.0.0-py3-none-any.whl", hash = "sha256:df94d28498848b98cc1c0ffb8ef1e71e40210d3b0a8064c9d29571ed2904bf81"},
{file = "pyopenssl-26.0.0.tar.gz", hash = "sha256:f293934e52936f2e3413b89c6ce36df66a0b34ae1ea3a053b8c5020ff2f513fc"},
]
[package.dependencies]
cryptography = ">=41.0.5,<45"
cryptography = ">=46.0.0,<47"
typing-extensions = {version = ">=4.9", markers = "python_version < \"3.13\" and python_version >= \"3.8\""}
[package.extras]
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx_rtd_theme"]
@@ -5808,10 +5822,10 @@ files = [
]
[package.dependencies]
botocore = ">=1.37.4,<2.0a0"
botocore = ">=1.37.4,<2.0a.0"
[package.extras]
crt = ["botocore[crt] (>=1.37.4,<2.0a0)"]
crt = ["botocore[crt] (>=1.37.4,<2.0a.0)"]
[[package]]
name = "safety"
@@ -6729,4 +6743,4 @@ files = [
[metadata]
lock-version = "2.1"
python-versions = ">=3.10,<3.13"
content-hash = "65f1f9833d61f90f1f89ed70b3677f76c0693bae275dd39699df01c05050bbe6"
content-hash = "91739ee5e383337160f9f08b76944ab4e8629c94084c8a9d115246862557f7c5"


@@ -29,10 +29,14 @@ All notable changes to the **Prowler SDK** are documented in this file.
- `return` statements in `finally` blocks replaced across IAM, Organizations, GCP provider, and custom checks metadata to stop silently swallowing exceptions [(#10102)](https://github.com/prowler-cloud/prowler/pull/10102)
- `JiraConnection` now includes issue types per project fetched during `test_connection`, fixing `JiraInvalidIssueTypeError` on non-English Jira instances [(#10534)](https://github.com/prowler-cloud/prowler/pull/10534)
- `--list-checks` and `--list-checks-json` now include `threat-detection` category checks in their output [(#10578)](https://github.com/prowler-cloud/prowler/pull/10578)
- Missing `__init__.py` in `codebuild_project_uses_allowed_github_organizations` check preventing discovery by `--list-checks` [(#10584)](https://github.com/prowler-cloud/prowler/pull/10584)
- Azure Key Vault checks emitting incorrect findings for keys, secrets, and vault logging [(#10332)](https://github.com/prowler-cloud/prowler/pull/10332)
### 🔐 Security
- Sensitive CLI flag values (tokens, keys, passwords) in HTML output "Parameters used" field now redacted to prevent credential leaks [(#10518)](https://github.com/prowler-cloud/prowler/pull/10518)
- `cryptography` bumped from 44.0.3 to 46.0.6 ([CVE-2026-26007](https://github.com/pyca/cryptography/security/advisories/GHSA-r6ph-v2qm-q3c2), [CVE-2026-34073](https://github.com/pyca/cryptography/security/advisories/GHSA-m959-cc7f-wv43)), `oci` to 2.169.0, and `alibabacloud-tea-openapi` to 0.4.4 [(#10535)](https://github.com/prowler-cloud/prowler/pull/10535)
---


@@ -271,6 +271,8 @@ def prowler():
categories=categories,
resource_groups=resource_groups,
provider=provider,
list_checks=getattr(args, "list_checks", False)
or getattr(args, "list_checks_json", False),
)
# if --list-checks-json, dump a json file and exit


@@ -20,6 +20,7 @@ def load_checks_to_execute(
compliance_frameworks: list = None,
categories: set = None,
resource_groups: set = None,
list_checks: bool = False,
) -> set:
"""Generate the list of checks to execute based on the cloud provider and the input arguments given"""
try:
@@ -209,7 +210,12 @@ def load_checks_to_execute(
):
checks_to_execute.add(check_name)
# Only execute threat detection checks if threat-detection category is set
if (not categories or "threat-detection" not in categories) and not check_list:
# Skip this exclusion when listing checks (--list-checks or --list-checks-json)
if (
(not categories or "threat-detection" not in categories)
and not check_list
and not list_checks
):
for threat_detection_check in check_categories.get("threat-detection", []):
checks_to_execute.discard(threat_detection_check)


@@ -1,15 +1,15 @@
{
"Provider": "azure",
"CheckID": "keyvault_key_expiration_set_in_non_rbac",
"CheckTitle": "Key Vault without RBAC authorization has expiration date set for all enabled keys",
"CheckTitle": "Key in non-RBAC Key Vault has expiration date set",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "microsoft.keyvault/vaults",
"ResourceType": "microsoft.keyvault/vaults/keys",
"ResourceGroup": "security",
"Description": "**Azure Key Vaults** using access **policies (non-RBAC)** are assessed to confirm all **enabled keys** have an `expiration` (`exp`) defined. The finding highlights keys in these vaults that lack a set lifetime.",
"Description": "Each **enabled key** in an **Azure Key Vault** using **access policies (non-RBAC)** is assessed to confirm it has an `expiration` (`exp`) attribute defined.",
"Risk": "Non-expiring keys enable indefinite use, degrading **confidentiality** and **integrity**. Stale or compromised keys can decrypt data, forge signatures, and maintain persistence. Absent lifetimes weaken rotation discipline and impede timely revocation, increasing exposure to cryptographic and operational drift.",
"RelatedUrl": "",
"AdditionalURLs": [


@@ -7,20 +7,19 @@ class keyvault_key_expiration_set_in_non_rbac(Check):
findings = []
for subscription, key_vaults in keyvault_client.key_vaults.items():
for keyvault in key_vaults:
if not keyvault.properties.enable_rbac_authorization and keyvault.keys:
report = Check_Report_Azure(
metadata=self.metadata(), resource=keyvault
)
report.subscription = subscription
report.status = "PASS"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has all the keys with expiration date set."
has_key_without_expiration = False
for key in keyvault.keys:
if not key.attributes.expires and key.enabled:
if not keyvault.properties.enable_rbac_authorization:
for key in keyvault.keys or []:
if not key.enabled:
continue
report = Check_Report_Azure(
metadata=self.metadata(), resource=key
)
report.subscription = subscription
if not key.attributes.expires:
report.status = "FAIL"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has the key {key.name} without expiration date set."
has_key_without_expiration = True
findings.append(report)
if not has_key_without_expiration:
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} does not have an expiration date set."
else:
report.status = "PASS"
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} has an expiration date set."
findings.append(report)
return findings
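The reporting change above (one finding per key instead of one aggregated finding per vault) can be sketched in isolation. The helper and dict shape below are hypothetical stand-ins, not the real Prowler models: disabled keys are skipped, and each enabled key yields its own PASS/FAIL based on its expiration attribute.

```python
def key_expiration_findings(keys: list[dict]) -> list[tuple]:
    """Return one (key_name, status) finding per enabled key."""
    findings = []
    for key in keys or []:
        if not key.get("enabled"):
            continue  # disabled keys produce no finding at all
        status = "PASS" if key.get("expires") else "FAIL"
        findings.append((key["name"], status))
    return findings


keys = [
    {"name": "k1", "enabled": True, "expires": "2027-01-01"},
    {"name": "k2", "enabled": True, "expires": None},
    {"name": "k3", "enabled": False, "expires": None},
]
print(key_expiration_findings(keys))
# [('k1', 'PASS'), ('k2', 'FAIL')]
```

A distinct report per key also means each finding carries the key as its resource, matching the metadata's new `ResourceType` of `microsoft.keyvault/vaults/keys`.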


@@ -7,9 +7,9 @@
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "microsoft.keyvault/vaults",
"ResourceType": "microsoft.keyvault/vaults/keys",
"ResourceGroup": "security",
"Description": "**Azure Key Vault** keys configured with a **rotation policy** that includes a `Rotate` lifetime action.\n\nThe evaluation looks for lifetime actions that schedule automatic key version creation; keys without this policy are not configured for auto-rotation.",
"Description": "Each **Azure Key Vault** key is assessed for a **rotation policy** that includes a `Rotate` lifetime action scheduling automatic key version creation.",
"Risk": "Without **auto-rotation**, keys may outlive policy, increasing exposure if material is leaked and weakening **confidentiality**.\n\nExpired keys without planned rollover can break decrypt/unwrap operations, impacting **availability**. Long-lived keys hinder incident response and enable prolonged misuse of stale versions.",
"RelatedUrl": "",
"AdditionalURLs": [


@@ -7,23 +7,21 @@ class keyvault_key_rotation_enabled(Check):
findings = []
for subscription, key_vaults in keyvault_client.key_vaults.items():
for keyvault in key_vaults:
if keyvault.keys:
report = Check_Report_Azure(
metadata=self.metadata(), resource=keyvault
)
for key in keyvault.keys or []:
report = Check_Report_Azure(metadata=self.metadata(), resource=key)
report.subscription = subscription
for key in keyvault.keys:
if (
key.rotation_policy
and key.rotation_policy.lifetime_actions
and key.rotation_policy.lifetime_actions[0].action
== "Rotate"
):
report.status = "PASS"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has the key {key.name} with rotation policy set."
else:
report.status = "FAIL"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has the key {key.name} without rotation policy set."
findings.append(report)
if (
key.rotation_policy
and key.rotation_policy.lifetime_actions
and any(
action.action == "Rotate"
for action in key.rotation_policy.lifetime_actions
)
):
report.status = "PASS"
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} has a rotation policy set."
else:
report.status = "FAIL"
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} does not have a rotation policy set."
findings.append(report)
return findings
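The rotation-policy evaluation above replaces an index into `lifetime_actions[0]` with `any(...)` over all actions. A minimal sketch with hypothetical stand-in classes shows why: a `Rotate` action that is not first in the list would previously go undetected.

```python
from dataclasses import dataclass, field


@dataclass
class LifetimeAction:
    action: str


@dataclass
class RotationPolicy:
    lifetime_actions: list = field(default_factory=list)


def has_rotation(policy) -> bool:
    """True when any lifetime action of the policy is a Rotate action."""
    return bool(
        policy
        and policy.lifetime_actions
        and any(a.action == "Rotate" for a in policy.lifetime_actions)
    )


# "Rotate" as the second action is now detected.
policy = RotationPolicy([LifetimeAction("Notify"), LifetimeAction("Rotate")])
print(has_rotation(policy))           # → True
print(has_rotation(RotationPolicy())) # → False
print(has_rotation(None))             # → False
```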


@@ -1,7 +1,7 @@
{
"Provider": "azure",
"CheckID": "keyvault_logging_enabled",
"CheckTitle": "Key Vault has a diagnostic setting capturing audit logs",
"CheckTitle": "Key Vault has at least one diagnostic setting with audit logging enabled",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",


@@ -10,29 +10,20 @@ class keyvault_logging_enabled(Check):
for keyvault in key_vaults:
report = Check_Report_Azure(metadata=self.metadata(), resource=keyvault)
report.subscription = subscription_name
if not keyvault.monitor_diagnostic_settings:
report.status = "FAIL"
report.status_extended = f"There are no diagnostic settings capturing audit logs for Key Vault {keyvault.name} in subscription {subscription_name}."
findings.append(report)
else:
for diagnostic_setting in keyvault.monitor_diagnostic_settings:
report.resource_name = diagnostic_setting.name
report.resource_id = diagnostic_setting.id
report.location = keyvault.location
report.status = "FAIL"
report.status_extended = f"Diagnostic setting {diagnostic_setting.name} for Key Vault {keyvault.name} in subscription {subscription_name} does not have audit logging."
audit = False
allLogs = False
for log in diagnostic_setting.logs:
if log.category_group == "audit" and log.enabled:
audit = True
if log.category_group == "allLogs" and log.enabled:
allLogs = True
if audit and allLogs:
report.status = "PASS"
report.status_extended = f"Diagnostic setting {diagnostic_setting.name} for Key Vault {keyvault.name} in subscription {subscription_name} has audit logging."
break
findings.append(report)
report.status = "FAIL"
report.status_extended = f"Key Vault {keyvault.name} in subscription {subscription_name} does not have a diagnostic setting with audit logging."
for diagnostic_setting in keyvault.monitor_diagnostic_settings or []:
has_audit = False
has_all_logs = False
for log in diagnostic_setting.logs:
if log.category_group == "audit" and log.enabled:
has_audit = True
if log.category_group == "allLogs" and log.enabled:
has_all_logs = True
if has_audit and has_all_logs:
report.status = "PASS"
report.status_extended = f"Key Vault {keyvault.name} in subscription {subscription_name} has a diagnostic setting with audit logging."
break
findings.append(report)
return findings
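The diagnostic-setting evaluation above passes only when a single setting enables both the `audit` and `allLogs` category groups. A sketch with hypothetical stand-ins isolates that predicate:

```python
from dataclasses import dataclass


@dataclass
class Log:
    category_group: str
    enabled: bool


def setting_has_audit_logging(logs: list[Log]) -> bool:
    """True when the setting enables both 'audit' and 'allLogs' groups."""
    has_audit = any(log.category_group == "audit" and log.enabled for log in logs)
    has_all = any(log.category_group == "allLogs" and log.enabled for log in logs)
    return has_audit and has_all


settings = [
    [Log("audit", True)],                        # audit only -> not enough
    [Log("audit", True), Log("allLogs", True)],  # both enabled -> PASS
]
# The vault passes if at least one diagnostic setting qualifies.
print(any(setting_has_audit_logging(s) for s in settings))  # → True
```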


@@ -1,15 +1,15 @@
{
"Provider": "azure",
"CheckID": "keyvault_non_rbac_secret_expiration_set",
"CheckTitle": "Non-RBAC Key Vault has expiration date set for all secrets",
"CheckTitle": "Secret in non-RBAC Key Vault has expiration date set",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "microsoft.keyvault/vaults",
"ResourceType": "microsoft.keyvault/vaults/secrets",
"ResourceGroup": "security",
"Description": "**Azure Key Vault (non-RBAC)** secrets are expected to have an **explicit expiration date**.\n\nThis examines each **enabled secret** to confirm the `expires` attribute is defined.",
"Description": "Each **enabled secret** in an **Azure Key Vault (non-RBAC)** is assessed to confirm it has an **explicit expiration date** (`expires` attribute) defined.",
"Risk": "Secrets without expiration persist indefinitely, widening the window for misuse.\n\nIf leaked or forgotten, they allow long-term, covert access to services and data, undermining **confidentiality** and **integrity**, and complicating incident response and revocation.",
"RelatedUrl": "",
"AdditionalURLs": [

View File

@@ -7,23 +7,19 @@ class keyvault_non_rbac_secret_expiration_set(Check):
findings = []
for subscription, key_vaults in keyvault_client.key_vaults.items():
for keyvault in key_vaults:
if (
not keyvault.properties.enable_rbac_authorization
and keyvault.secrets
):
report = Check_Report_Azure(
metadata=self.metadata(), resource=keyvault
)
report.subscription = subscription
report.status = "PASS"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has all the secrets with expiration date set."
has_secret_without_expiration = False
for secret in keyvault.secrets:
if not secret.attributes.expires and secret.enabled:
if not keyvault.properties.enable_rbac_authorization:
for secret in keyvault.secrets or []:
if not secret.enabled:
continue
report = Check_Report_Azure(
metadata=self.metadata(), resource=secret
)
report.subscription = subscription
if not secret.attributes.expires:
report.status = "FAIL"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has the secret {secret.name} without expiration date set."
has_secret_without_expiration = True
findings.append(report)
if not has_secret_without_expiration:
report.status_extended = f"Secret {secret.name} in Key Vault {keyvault.name} from subscription {subscription} does not have an expiration date set."
else:
report.status = "PASS"
report.status_extended = f"Secret {secret.name} in Key Vault {keyvault.name} from subscription {subscription} has an expiration date set."
findings.append(report)
return findings


@@ -1,15 +1,15 @@
{
"Provider": "azure",
"CheckID": "keyvault_rbac_key_expiration_set",
"CheckTitle": "RBAC-enabled Key Vault has expiration date set for all keys",
"CheckTitle": "Key in RBAC-enabled Key Vault has expiration date set",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "microsoft.keyvault/vaults",
"ResourceType": "microsoft.keyvault/vaults/keys",
"ResourceGroup": "security",
"Description": "**Azure Key Vaults** with **RBAC-enabled access control** are evaluated to confirm every **enabled key** defines an **expiration** (`exp`). Any key lacking this attribute is identified.",
"Description": "Each **enabled key** in an **Azure Key Vault** with **RBAC-enabled access control** is assessed to confirm it has an **expiration** (`exp`) attribute defined.",
"Risk": "**Keys without expiration** can remain active indefinitely.\nIf exposed, attackers can decrypt data, forge signatures (code/tokens), and maintain persistence, undermining **confidentiality** and **integrity**. Absent end-of-life also weakens rotation discipline and crypto agility.",
"RelatedUrl": "",
"AdditionalURLs": [


@@ -7,20 +7,19 @@ class keyvault_rbac_key_expiration_set(Check):
findings = []
for subscription, key_vaults in keyvault_client.key_vaults.items():
for keyvault in key_vaults:
if keyvault.properties.enable_rbac_authorization and keyvault.keys:
report = Check_Report_Azure(
metadata=self.metadata(), resource=keyvault
)
report.subscription = subscription
report.status = "PASS"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has all the keys with expiration date set."
has_key_without_expiration = False
for key in keyvault.keys:
if not key.attributes.expires and key.enabled:
if keyvault.properties.enable_rbac_authorization:
for key in keyvault.keys or []:
if not key.enabled:
continue
report = Check_Report_Azure(
metadata=self.metadata(), resource=key
)
report.subscription = subscription
if not key.attributes.expires:
report.status = "FAIL"
report.status_extended = f"Keyvault {keyvault.name} from subscription {subscription} has the key {key.name} without expiration date set."
has_key_without_expiration = True
findings.append(report)
if not has_key_without_expiration:
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} does not have an expiration date set."
else:
report.status = "PASS"
report.status_extended = f"Key {key.name} in Key Vault {keyvault.name} from subscription {subscription} has an expiration date set."
findings.append(report)
return findings


@@ -1,15 +1,15 @@
{
"Provider": "azure",
"CheckID": "keyvault_rbac_secret_expiration_set",
"CheckTitle": "RBAC-enabled Key Vault has expiration date set for all enabled secrets",
"CheckTitle": "Secret in RBAC-enabled Key Vault has expiration date set",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "microsoft.keyvault/vaults",
"ResourceType": "microsoft.keyvault/vaults/secrets",
"ResourceGroup": "security",
"Description": "**Azure Key Vault (RBAC)** secrets are assessed to confirm every **enabled secret** has an `exp` (expiration) date configured",
"Description": "Each **enabled secret** in an **Azure Key Vault (RBAC)** is assessed to confirm it has an `exp` (expiration) date configured.",
"Risk": "Without an **expiration**, secrets become perpetual credentials. Leaked or abandoned values can grant persistent access, undermining **confidentiality** and **integrity**. Attackers can reuse old secrets to maintain footholds, perform unauthorized API calls, and exfiltrate data.",
"RelatedUrl": "",
"AdditionalURLs": [


@@ -5,21 +5,21 @@ from prowler.providers.azure.services.keyvault.keyvault_client import keyvault_c
class keyvault_rbac_secret_expiration_set(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for subscription, key_vaults in keyvault_client.key_vaults.items():
for keyvault in key_vaults:
if keyvault.properties.enable_rbac_authorization and keyvault.secrets:
for secret in keyvault.secrets:
if keyvault.properties.enable_rbac_authorization:
for secret in keyvault.secrets or []:
if not secret.enabled:
continue
report = Check_Report_Azure(
metadata=self.metadata(), resource=secret
)
report.subscription = subscription
if not secret.attributes.expires and secret.enabled:
if not secret.attributes.expires:
report.status = "FAIL"
report.status_extended = f"Secret '{secret.name}' in KeyVault '{keyvault.name}' does not have expiration date set."
report.status_extended = f"Secret {secret.name} in Key Vault {keyvault.name} from subscription {subscription} does not have an expiration date set."
else:
report.status = "PASS"
report.status_extended = f"Secret '{secret.name}' in KeyVault '{keyvault.name}' has expiration date set."
report.status_extended = f"Secret {secret.name} in Key Vault {keyvault.name} from subscription {subscription} has an expiration date set."
findings.append(report)
return findings


@@ -46,7 +46,7 @@ dependencies = [
"boto3==1.40.61",
"botocore==1.40.61",
"colorama==0.4.6",
"cryptography==44.0.3",
"cryptography==46.0.6",
"dash==3.1.1",
"dash-bootstrap-components==2.0.3",
"defusedxml>=0.7.1",
@@ -75,10 +75,10 @@ dependencies = [
"uuid6==2024.7.10",
"py-iam-expand==0.1.0",
"h2==4.3.0",
"oci==2.160.3",
"oci==2.169.0",
"alibabacloud_credentials==1.0.3",
"alibabacloud_ram20150501==1.2.0",
"alibabacloud_tea_openapi==0.4.1",
"alibabacloud_tea_openapi==0.4.4",
"alibabacloud_sts20150401==1.1.6",
"alibabacloud_vpc20160428==6.13.0",
"alibabacloud_ecs20140526==7.2.5",


@@ -1,62 +0,0 @@
#!/usr/bin/env python3
"""Fail when __init__.py files are present inside test directories."""
from __future__ import annotations

import sys
from argparse import ArgumentParser
from pathlib import Path

EXCLUDED_TEST_INIT_ROOTS = {
    Path("tests/lib/check/fixtures/checks_folder"),
}


def is_test_init_file(path: Path) -> bool:
    """Return True when the file is a test __init__.py."""
    return path.name == "__init__.py" and "tests" in path.parts


def is_excluded_test_init_file(path: Path, root: Path) -> bool:
    """Return True when the file belongs to an allowed fixture directory."""
    relative_path = path.relative_to(root)
    return any(
        relative_path.is_relative_to(excluded) for excluded in EXCLUDED_TEST_INIT_ROOTS
    )


def find_test_init_files(root: Path) -> list[Path]:
    """Return sorted __init__.py files found under test directories."""
    return sorted(
        path
        for path in root.rglob("__init__.py")
        if is_test_init_file(path) and not is_excluded_test_init_file(path, root)
    )


def main(argv: list[str] | None = None) -> int:
    parser = ArgumentParser(description=__doc__)
    parser.add_argument(
        "root",
        nargs="?",
        default=".",
        help="Repository root to scan. Defaults to the current directory.",
    )
    args = parser.parse_args(argv)
    root = Path(args.root).resolve()
    matches = find_test_init_files(root)
    if not matches:
        print("No __init__.py files found in test directories.")
        return 0
    print("Remove __init__.py files from test directories:")
    for path in matches:
        print(path.relative_to(root))
    return 1


if __name__ == "__main__":
    sys.exit(main())
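The deleted guard above whitelists fixture directories with `Path.is_relative_to`, available since Python 3.9. A quick self-contained demo of how that call distinguishes paths under a prefix from paths that merely contain it:

```python
from pathlib import Path

# is_relative_to returns True only when the path sits under the given prefix.
fixture_init = Path("tests/lib/check/fixtures/checks_folder/check11/__init__.py")
assert fixture_init.is_relative_to(Path("tests/lib/check/fixtures/checks_folder"))

# A path that contains "tests" somewhere inside is not relative to a top-level "tests".
nested = Path("api/tests/performance/__init__.py")
assert not nested.is_relative_to(Path("tests"))
```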

tests/__init__.py (new file, 0 lines)

tests/config/__init__.py (new file, 0 lines)


@@ -629,3 +629,49 @@ class TestCheckLoader:
            provider=self.provider,
        )
        assert exc_info.value.code == 1

    def test_list_checks_includes_threat_detection(self):
        """Test that list_checks=True includes threat-detection checks (fixes #10576)"""
        bulk_checks_metadata = {
            S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME: self.get_custom_check_s3_metadata(),
            CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME: self.get_threat_detection_check_metadata(),
        }
        result = load_checks_to_execute(
            bulk_checks_metadata=bulk_checks_metadata,
            provider=self.provider,
            list_checks=True,
        )
        assert CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME in result
        assert S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME in result

    def test_list_checks_with_service_includes_threat_detection(self):
        """Test that list_checks=True with service filter includes threat-detection checks (fixes #10576)"""
        bulk_checks_metadata = {
            S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME: self.get_custom_check_s3_metadata(),
            CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME: self.get_threat_detection_check_metadata(),
        }
        service_list = ["cloudtrail"]
        result = load_checks_to_execute(
            bulk_checks_metadata=bulk_checks_metadata,
            service_list=service_list,
            provider=self.provider,
            list_checks=True,
        )
        assert CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME in result
        assert S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME not in result

    def test_scan_still_excludes_threat_detection_by_default(self):
        """Test that without list_checks, threat-detection checks are still excluded"""
        bulk_checks_metadata = {
            S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME: self.get_custom_check_s3_metadata(),
            CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME: self.get_threat_detection_check_metadata(),
        }
        result = load_checks_to_execute(
            bulk_checks_metadata=bulk_checks_metadata,
            provider=self.provider,
        )
        assert CLOUDTRAIL_THREAT_DETECTION_ENUMERATION_NAME not in result
        assert S3_BUCKET_LEVEL_PUBLIC_ACCESS_BLOCK_NAME in result
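The three tests above pin down one rule in `load_checks_to_execute`: threat-detection checks are skipped during scans but must still appear when only listing. A toy re-implementation of just that filtering rule (the real function takes more arguments and richer metadata objects; the dict shape here is a simplifying assumption):

```python
def load_checks_to_execute(bulk_checks_metadata, service_list=None, list_checks=False):
    """Toy model: each metadata value is a dict with 'service' and 'categories'."""
    selected = set()
    for check_name, meta in bulk_checks_metadata.items():
        if service_list and meta["service"] not in service_list:
            continue
        # Threat-detection checks are excluded from scans but kept in listings.
        if "threat-detection" in meta["categories"] and not list_checks:
            continue
        selected.add(check_name)
    return selected


metadata = {
    "s3_bucket_public_block": {"service": "s3", "categories": []},
    "cloudtrail_threat_detection_enumeration": {
        "service": "cloudtrail",
        "categories": ["threat-detection"],
    },
}
assert "cloudtrail_threat_detection_enumeration" not in load_checks_to_execute(metadata)
assert "cloudtrail_threat_detection_enumeration" in load_checks_to_execute(metadata, list_checks=True)
```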


@@ -1,68 +0,0 @@
from importlib.util import module_from_spec, spec_from_file_location
from pathlib import Path

SCRIPT_PATH = (
    Path(__file__).resolve().parents[2] / "scripts" / "check_test_init_files.py"
)


def load_guard_module():
    spec = spec_from_file_location("check_test_init_files", SCRIPT_PATH)
    assert spec is not None
    assert spec.loader is not None
    module = module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


def test_find_test_init_files_detects_only_test_directories(tmp_path):
    guard = load_guard_module()
    (tmp_path / "tests" / "providers" / "aws").mkdir(parents=True)
    (tmp_path / "tests" / "providers" / "aws" / "__init__.py").write_text("")
    (tmp_path / "api" / "tests" / "performance").mkdir(parents=True)
    (tmp_path / "api" / "tests" / "performance" / "__init__.py").write_text("")
    (tmp_path / "prowler" / "providers" / "aws").mkdir(parents=True)
    (tmp_path / "prowler" / "providers" / "aws" / "__init__.py").write_text("")
    (
        tmp_path / "tests" / "lib" / "check" / "fixtures" / "checks_folder" / "check11"
    ).mkdir(parents=True)
    (
        tmp_path
        / "tests"
        / "lib"
        / "check"
        / "fixtures"
        / "checks_folder"
        / "check11"
        / "__init__.py"
    ).write_text("")
    matches = guard.find_test_init_files(tmp_path)
    assert [path.relative_to(tmp_path) for path in matches] == [
        Path("api/tests/performance/__init__.py"),
        Path("tests/providers/aws/__init__.py"),
    ]


def test_main_returns_error_when_test_init_files_exist(tmp_path, capsys):
    guard = load_guard_module()
    (tmp_path / "tests" / "config").mkdir(parents=True)
    (tmp_path / "tests" / "config" / "__init__.py").write_text("")
    assert guard.main([str(tmp_path)]) == 1
    captured = capsys.readouterr()
    assert "Remove __init__.py files from test directories" in captured.out
    assert "tests/config/__init__.py" in captured.out


def test_repository_has_no_test_init_files():
    guard = load_guard_module()
    repo_root = Path(__file__).resolve().parents[2]
    assert guard.find_test_init_files(repo_root) == []
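`load_guard_module` above loads the script by file path rather than by import name, which is why the deleted tests worked without the script being on `sys.path`. The same `spec_from_file_location`/`exec_module` pattern in a self-contained form, using a throwaway module written to a temp directory:

```python
import tempfile
from importlib.util import module_from_spec, spec_from_file_location
from pathlib import Path


def load_module(path: Path):
    """Load a Python file by path, mirroring the load_guard_module helper."""
    spec = spec_from_file_location(path.stem, path)
    assert spec is not None
    assert spec.loader is not None
    module = module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


with tempfile.TemporaryDirectory() as tmp:
    script = Path(tmp) / "guard_demo.py"  # hypothetical throwaway module
    script.write_text("MAGIC = 42\n")
    loaded = load_module(script)
```

Loading by path keeps the module out of `sys.modules` under its real package name, so repeated calls in separate tests get fresh, isolated copies of the script.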

Some files were not shown because too many files have changed in this diff.