Compare commits


19 Commits

Author SHA1 Message Date
Chandrapal Badshah 1bf91c5779 feat: add tenant context and update prompt of recommendation 2025-08-12 13:51:35 +05:30
Chandrapal Badshah 0cda7716ce fix: get latest completed scans instead of last 24 hours 2025-08-12 13:49:21 +05:30
Chandrapal Badshah e4ab6d3589 fix: update supervisor prompt to ignore bad output 2025-08-12 13:47:47 +05:30
Chandrapal Badshah 82289f28bb fix: initiate cache only when lighthouse config is active 2025-08-12 13:46:56 +05:30
Chandrapal Badshah a5b59bc346 fix: update findings prompt for better tool call 2025-08-11 19:27:07 +05:30
Chandrapal Badshah 78e59fbf78 feat: update resource agent prompt 2025-08-11 19:24:51 +05:30
Chandrapal Badshah 2aa506db92 feat: add resources agent to supervisor 2025-08-11 19:23:41 +05:30
Chandrapal Badshah e9dbf58ec5 feat: update fields in tool calling 2025-08-11 19:22:04 +05:30
Chandrapal Badshah 5b7bf307d4 feat: add fields prop for fetching necessary findings data 2025-08-11 19:20:42 +05:30
Chandrapal Badshah c3e50e3600 fix: move recommendation generator to supervisor agent 2025-08-07 11:42:07 +05:30
Chandrapal Badshah a1a8b9e17e chore: eslint and update package lock file 2025-08-05 11:22:54 +05:30
Chandrapal Badshah c3fdfddff0 chore: recreate package lock 2025-08-05 08:10:37 +05:30
Pablo Lara 78dcddee07 chore: some SSR tweaks 2025-08-05 06:27:53 +05:30
Chandrapal Badshah 11649f227a feat: add lighthouse caching 2025-08-05 06:20:59 +05:30
Chandrapal Badshah 3bb3edc0bb feat: add caching recommendations in valkey 2025-08-04 17:52:33 +05:30
Chandrapal Badshah 47cf5758f5 feat: get summary of all scans in last 24 hours 2025-08-04 17:48:21 +05:30
Chandrapal Badshah 5bd9214e4f feat: add tool to detect new failed findings in a scan 2025-08-04 17:48:06 +05:30
Chandrapal Badshah fe358d08e9 feat: add lighthouse summary generator 2025-08-04 17:47:55 +05:30
Chandrapal Badshah 2caf001e21 feat: add getLatestFindings tool 2025-08-04 17:44:05 +05:30
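Two of the commits above ("add caching recommendations in valkey", "add lighthouse caching") describe a cache layer for generated recommendations. Valkey speaks the Redis wire protocol, so a minimal sketch using the standard redis-py client could look like the following; the key layout, TTL, and function names are assumptions for illustration, not code from this compare:

```python
import json

import redis  # Valkey is Redis-protocol compatible, so redis-py works against it

# Hypothetical client and key layout -- not taken from this diff.
valkey = redis.Redis(host="valkey", port=6379, decode_responses=True)

def cached_recommendation(tenant_id: str, scan_id: str, generate) -> dict:
    """Return a cached recommendation for (tenant, scan), generating it on a miss."""
    # Tenant-scoped key, in the spirit of the "add tenant context" commit.
    key = f"recommendation:{tenant_id}:{scan_id}"
    hit = valkey.get(key)
    if hit is not None:
        return json.loads(hit)
    recommendation = generate(tenant_id, scan_id)  # the expensive generation step
    valkey.setex(key, 3600, json.dumps(recommendation))  # assumed one-hour TTL
    return recommendation
```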
199 changed files with 4335 additions and 5283 deletions
+1 -3
@@ -10,8 +10,6 @@ NEXT_PUBLIC_API_BASE_URL=${API_BASE_URL}
NEXT_PUBLIC_API_DOCS_URL=http://prowler-api:8080/api/v1/docs
AUTH_TRUST_HOST=true
UI_PORT=3000
# Temp URL for feeds need to use actual
RSS_FEED_URL=https://prowler.com/blog/rss
# openssl rand -base64 32
AUTH_SECRET="N/c6mnaS5+SWq81+819OrzQZlmx1Vxtp/orjttJSmw8="
# Google Tag Manager ID
@@ -133,7 +131,7 @@ SENTRY_ENVIRONMENT=local
SENTRY_RELEASE=local
#### Prowler release version ####
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.10.0
NEXT_PUBLIC_PROWLER_RELEASE_VERSION=v5.7.5
# Social login credentials
SOCIAL_GOOGLE_OAUTH_CALLBACK_URL="${AUTH_URL}/api/auth/callback/google"
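The `# openssl rand -base64 32` comment in the hunk above documents how the `AUTH_SECRET` value is meant to be produced. For reference, an equivalent one-liner in Python (a convenience sketch, not part of the template) yields the same shape of secret:

```python
import base64
import os

# Same output shape as `openssl rand -base64 32`: 32 random bytes, base64-encoded.
print(base64.b64encode(os.urandom(32)).decode())
```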
+2 -2
@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/api-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+1 -1
@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@a05cf0859455b5b16317ee22d809887a4043cdf0 # v3.90.2
uses: trufflesecurity/trufflehog@6641d4ba5b684fffe195b9820345de1bf19f3181 # v3.89.2
with:
path: ./
base: ${{ github.event.repository.default_branch }}
@@ -25,7 +25,6 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
@@ -37,11 +36,6 @@ jobs:
python3 -m pip install --user poetry
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Configure Git
run: |
git config --global user.name "prowler-bot"
git config --global user.email "179230569+prowler-bot@users.noreply.github.com"
- name: Parse version and determine branch
run: |
# Validate version format (reusing pattern from sdk-bump-version.yml)
@@ -153,9 +147,6 @@ jobs:
exit 1
fi
git checkout -b "$BRANCH_NAME"
# Push the new branch first so it exists remotely
git push origin "$BRANCH_NAME"
- name: Update prowler dependency in api/pyproject.toml
if: ${{ env.PATCH_VERSION == '0' }}
@@ -177,7 +168,7 @@ jobs:
# Update poetry lock file
echo "Updating poetry.lock file..."
cd api
poetry lock
poetry lock --no-update
cd ..
# Commit and push the changes
@@ -55,20 +55,29 @@ jobs:
comment-author: 'github-actions[bot]'
body-includes: '<!-- changelog-check -->'
- name: Update PR comment with changelog status
if: github.event.pull_request.head.repo.full_name == github.repository
- name: Comment on PR if changelog is missing
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs != ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
edit-mode: replace
body: |
<!-- changelog-check -->
${{ steps.check_folders.outputs.missing_changelogs != '' && format('⚠️ **Changes detected in the following folders without a corresponding update to the `CHANGELOG.md`:**
⚠️ **Changes detected in the following folders without a corresponding update to the `CHANGELOG.md`:**
{0}
${{ steps.check_folders.outputs.missing_changelogs }}
Please add an entry to the corresponding `CHANGELOG.md` file to maintain a clear history of changes.', steps.check_folders.outputs.missing_changelogs) || '✅ All necessary `CHANGELOG.md` files have been updated. Great job! 🎉' }}
Please add an entry to the corresponding `CHANGELOG.md` file to maintain a clear history of changes.
- name: Comment on PR if all changelogs are present
if: github.event.pull_request.head.repo.full_name == github.repository && steps.check_folders.outputs.missing_changelogs == ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
comment-id: ${{ steps.find_comment.outputs.comment-id }}
body: |
<!-- changelog-check -->
✅ All necessary `CHANGELOG.md` files have been updated. Great job! 🎉
- name: Fail if changelog is missing
if: steps.check_folders.outputs.missing_changelogs != ''
+2 -2
@@ -56,12 +56,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/sdk-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+2 -2
@@ -48,12 +48,12 @@ jobs:
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
config-file: ./.github/codeql/ui-codeql-config.yml
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@51f77329afa6477de8c49fc9c7046c15b9a4e79d # v3.29.5
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"
+1 -1
@@ -2,7 +2,7 @@
All notable changes to the **Prowler API** are documented in this file.
## [1.11.0] (Prowler 5.10.0)
## [1.11.0] (Prowler UNRELEASED)
### Added
- Github provider support [(#8271)](https://github.com/prowler-cloud/prowler/pull/8271)
+154 -360
@@ -150,19 +150,19 @@ typing-extensions = {version = ">=4.2", markers = "python_version < \"3.13\""}
[[package]]
name = "alive-progress"
version = "3.3.0"
version = "3.2.0"
description = "A new kind of Progress Bar, with real-time throughput, ETA, and very cool animations!"
optional = false
python-versions = "<4,>=3.9"
groups = ["main"]
files = [
{file = "alive-progress-3.3.0.tar.gz", hash = "sha256:457dd2428b48dacd49854022a46448d236a48f1b7277874071c39395307e830c"},
{file = "alive_progress-3.3.0-py3-none-any.whl", hash = "sha256:63dd33bb94cde15ad9e5b666dbba8fedf71b72a4935d6fb9a92931e69402c9ff"},
{file = "alive-progress-3.2.0.tar.gz", hash = "sha256:ede29d046ff454fe56b941f686f89dd9389430c4a5b7658e445cb0b80e0e4deb"},
{file = "alive_progress-3.2.0-py3-none-any.whl", hash = "sha256:0677929f8d3202572e9d142f08170b34dbbe256cc6d2afbf75ef187c7da964a8"},
]
[package.dependencies]
about-time = "4.2.1"
graphemeu = "0.7.2"
grapheme = "0.6.0"
[[package]]
name = "amqp"
@@ -179,18 +179,6 @@ files = [
[package.dependencies]
vine = ">=5.0.0,<6.0.0"
[[package]]
name = "annotated-types"
version = "0.7.0"
description = "Reusable constraint types to use with typing.Annotated"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
{file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
{file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
]
[[package]]
name = "anyio"
version = "4.9.0"
@@ -507,23 +495,6 @@ azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-databricks"
version = "2.0.0"
description = "Microsoft Azure Data Bricks Management Client Library for Python"
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "azure-mgmt-databricks-2.0.0.zip", hash = "sha256:70d11362dc2d17f5fb1db0cfe65c1af55b8f136f1a0db9a5b51e7acf760cf5b9"},
{file = "azure_mgmt_databricks-2.0.0-py3-none-any.whl", hash = "sha256:0c29434a7339e74231bd171a6c08dcdf8153abaebd332658d7f66b8ea143fa17"},
]
[package.dependencies]
azure-common = ">=1.1,<2.0"
azure-mgmt-core = ">=1.3.2,<2.0.0"
isodate = ">=0.6.1,<1.0.0"
[[package]]
name = "azure-mgmt-keyvault"
version = "10.3.1"
@@ -594,42 +565,6 @@ azure-common = ">=1.1,<2.0"
azure-mgmt-core = ">=1.3.0,<2.0.0"
msrest = ">=0.6.21"
[[package]]
name = "azure-mgmt-recoveryservices"
version = "3.1.0"
description = "Microsoft Azure Recovery Services Client Library for Python"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "azure_mgmt_recoveryservices-3.1.0-py3-none-any.whl", hash = "sha256:21c58afdf4ae66806783e95f8cd17e3bec31be7178c48784db21f0b05de7fa66"},
{file = "azure_mgmt_recoveryservices-3.1.0.tar.gz", hash = "sha256:7f2db98401708cf145322f50bc491caf7967bec4af3bf7b0984b9f07d3092687"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-mgmt-core = ">=1.5.0"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-recoveryservicesbackup"
version = "9.2.0"
description = "Microsoft Azure Recovery Services Backup Management Client Library for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "azure_mgmt_recoveryservicesbackup-9.2.0-py3-none-any.whl", hash = "sha256:c0002858d0166b6a10189a1fd580a49c83dc31b111e98010a5b2ea0f767dfff1"},
{file = "azure_mgmt_recoveryservicesbackup-9.2.0.tar.gz", hash = "sha256:c402b3e22a6c3879df56bc37e0063142c3352c5102599ff102d19824f1b32b29"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-resource"
version = "23.3.0"
@@ -824,34 +759,34 @@ files = [
[[package]]
name = "boto3"
version = "1.39.15"
version = "1.35.99"
description = "The AWS SDK for Python"
optional = false
python-versions = ">=3.9"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "boto3-1.39.15-py3-none-any.whl", hash = "sha256:38fc54576b925af0075636752de9974e172c8a2cf7133400e3e09b150d20fb6a"},
{file = "boto3-1.39.15.tar.gz", hash = "sha256:b4483625f0d8c35045254dee46cd3c851bbc0450814f20b9b25bee1b5c0d8409"},
{file = "boto3-1.35.99-py3-none-any.whl", hash = "sha256:83e560faaec38a956dfb3d62e05e1703ee50432b45b788c09e25107c5058bd71"},
{file = "boto3-1.35.99.tar.gz", hash = "sha256:e0abd794a7a591d90558e92e29a9f8837d25ece8e3c120e530526fe27eba5fca"},
]
[package.dependencies]
botocore = ">=1.39.15,<1.40.0"
botocore = ">=1.35.99,<1.36.0"
jmespath = ">=0.7.1,<2.0.0"
s3transfer = ">=0.13.0,<0.14.0"
s3transfer = ">=0.10.0,<0.11.0"
[package.extras]
crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
[[package]]
name = "botocore"
version = "1.39.15"
version = "1.35.99"
description = "Low-level, data-driven core of boto 3."
optional = false
python-versions = ">=3.9"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "botocore-1.39.15-py3-none-any.whl", hash = "sha256:eb9cfe918ebfbfb8654e1b153b29f0c129d586d2c0d7fb4032731d49baf04cff"},
{file = "botocore-1.39.15.tar.gz", hash = "sha256:2aa29a717f14f8c7ca058c2e297aaed0aa10ecea24b91514eee802814d1b7600"},
{file = "botocore-1.35.99-py3-none-any.whl", hash = "sha256:b22d27b6b617fc2d7342090d6129000af2efd20174215948c0d7ae2da0fab445"},
{file = "botocore-1.35.99.tar.gz", hash = "sha256:1eab44e969c39c5f3d9a3104a0836c24715579a455f12b3979a31d7cde51b3c3"},
]
[package.dependencies]
@@ -860,7 +795,7 @@ python-dateutil = ">=2.1,<3.0.0"
urllib3 = {version = ">=1.25.4,<2.2.0 || >2.2.0,<3", markers = "python_version >= \"3.10\""}
[package.extras]
crt = ["awscrt (==0.23.8)"]
crt = ["awscrt (==0.22.0)"]
[[package]]
name = "cachetools"
@@ -1207,18 +1142,6 @@ files = [
]
markers = {dev = "platform_system == \"Windows\" or sys_platform == \"win32\""}
[[package]]
name = "contextlib2"
version = "21.6.0"
description = "Backports and enhancements for the contextlib module"
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
{file = "contextlib2-21.6.0-py2.py3-none-any.whl", hash = "sha256:3fbdb64466afd23abaf6c977627b75b6139a5a3e8ce38405c5b413aed7a0471f"},
{file = "contextlib2-21.6.0.tar.gz", hash = "sha256:ab1e2bfe1d01d968e1b7e8d9023bc51ef3509bba217bb730cee3827e1ee82869"},
]
[[package]]
name = "coverage"
version = "7.5.4"
@@ -1355,18 +1278,21 @@ test-randomorder = ["pytest-randomly"]
[[package]]
name = "dash"
version = "3.1.1"
version = "2.18.2"
description = "A Python framework for building reactive web-apps. Developed by Plotly."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "dash-3.1.1-py3-none-any.whl", hash = "sha256:66fff37e79c6aa114cd55aea13683d1e9afe0e3f96b35388baca95ff6cfdad23"},
{file = "dash-3.1.1.tar.gz", hash = "sha256:916b31cec46da0a3339da0e9df9f446126aa7f293c0544e07adf9fe4ba060b18"},
{file = "dash-2.18.2-py3-none-any.whl", hash = "sha256:0ce0479d1bc958e934630e2de7023b8a4558f23ce1f9f5a4b34b65eb3903a869"},
{file = "dash-2.18.2.tar.gz", hash = "sha256:20e8404f73d0fe88ce2eae33c25bbc513cbe52f30d23a401fa5f24dbb44296c8"},
]
[package.dependencies]
Flask = ">=1.0.4,<3.2"
dash-core-components = "2.0.0"
dash-html-components = "2.0.0"
dash-table = "5.0.0"
Flask = ">=1.0.4,<3.1"
importlib-metadata = "*"
nest-asyncio = "*"
plotly = ">=5.0.0"
@@ -1374,12 +1300,11 @@ requests = "*"
retrying = "*"
setuptools = "*"
typing-extensions = ">=4.1.1"
Werkzeug = "<3.2"
Werkzeug = "<3.1"
[package.extras]
async = ["flask[async]"]
celery = ["celery[redis] (>=5.1.2,<5.4.0)", "kombu (<5.4.0)", "redis (>=3.5.3,<=5.0.4)"]
ci = ["black (==22.3.0)", "flake8 (==7.0.0)", "flaky (==3.8.1)", "flask-talisman (==1.0.0)", "ipython (<9.0.0)", "jupyterlab (<4.0.0)", "mimesis (<=11.1.0)", "mock (==4.0.3)", "mypy (==1.15.0) ; python_version >= \"3.12\"", "numpy (<=1.26.3)", "openpyxl", "orjson (==3.10.3)", "pandas (>=1.4.0)", "pyarrow", "pylint (==3.0.3)", "pyright (==1.1.398) ; python_version >= \"3.7\"", "pytest-mock", "pytest-rerunfailures", "pytest-sugar (==0.9.6)", "pyzmq (==25.1.2)", "xlrd (>=2.0.1)"]
celery = ["celery[redis] (>=5.1.2)", "redis (>=3.5.3)"]
ci = ["black (==22.3.0)", "dash-dangerously-set-inner-html", "dash-flow-example (==0.0.5)", "flake8 (==7.0.0)", "flaky (==3.8.1)", "flask-talisman (==1.0.0)", "jupyterlab (<4.0.0)", "mimesis (<=11.1.0)", "mock (==4.0.3)", "numpy (<=1.26.3)", "openpyxl", "orjson (==3.10.3)", "pandas (>=1.4.0)", "pyarrow", "pylint (==3.0.3)", "pytest-mock", "pytest-rerunfailures", "pytest-sugar (==0.9.6)", "pyzmq (==25.1.2)", "xlrd (>=2.0.1)"]
compress = ["flask-compress"]
dev = ["PyYAML (>=5.4.1)", "coloredlogs (>=15.0.1)", "fire (>=0.4.0)"]
diskcache = ["diskcache (>=5.2.1)", "multiprocess (>=0.70.12)", "psutil (>=5.8.0)"]
@@ -1387,21 +1312,57 @@ testing = ["beautifulsoup4 (>=4.8.2)", "cryptography", "dash-testing-stub (>=0.0
[[package]]
name = "dash-bootstrap-components"
version = "2.0.3"
version = "1.6.0"
description = "Bootstrap themed components for use in Plotly Dash"
optional = false
python-versions = ">=3.9"
python-versions = "<4,>=3.8"
groups = ["main"]
files = [
{file = "dash_bootstrap_components-2.0.3-py3-none-any.whl", hash = "sha256:82754d3d001ad5482b8a82b496c7bf98a1c68d2669d607a89dda7ec627304af5"},
{file = "dash_bootstrap_components-2.0.3.tar.gz", hash = "sha256:5c161b04a6e7ed19a7d54e42f070c29fd6c385d5a7797e7a82999aa2fc15b1de"},
{file = "dash_bootstrap_components-1.6.0-py3-none-any.whl", hash = "sha256:97f0f47b38363f18863e1b247462229266ce12e1e171cfb34d3c9898e6e5cd1e"},
{file = "dash_bootstrap_components-1.6.0.tar.gz", hash = "sha256:960a1ec9397574792f49a8241024fa3cecde0f5930c971a3fc81f016cbeb1095"},
]
[package.dependencies]
dash = ">=3.0.4"
dash = ">=2.0.0"
[package.extras]
pandas = ["numpy (>=2.0.2)", "pandas (>=2.2.3)"]
pandas = ["numpy", "pandas"]
[[package]]
name = "dash-core-components"
version = "2.0.0"
description = "Core component suite for Dash"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "dash_core_components-2.0.0-py3-none-any.whl", hash = "sha256:52b8e8cce13b18d0802ee3acbc5e888cb1248a04968f962d63d070400af2e346"},
{file = "dash_core_components-2.0.0.tar.gz", hash = "sha256:c6733874af975e552f95a1398a16c2ee7df14ce43fa60bb3718a3c6e0b63ffee"},
]
[[package]]
name = "dash-html-components"
version = "2.0.0"
description = "Vanilla HTML components for Dash"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "dash_html_components-2.0.0-py3-none-any.whl", hash = "sha256:b42cc903713c9706af03b3f2548bda4be7307a7cf89b7d6eae3da872717d1b63"},
{file = "dash_html_components-2.0.0.tar.gz", hash = "sha256:8703a601080f02619a6390998e0b3da4a5daabe97a1fd7a9cebc09d015f26e50"},
]
[[package]]
name = "dash-table"
version = "5.0.0"
description = "Dash table"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "dash_table-5.0.0-py3-none-any.whl", hash = "sha256:19036fa352bb1c11baf38068ec62d172f0515f73ca3276c79dee49b95ddc16c9"},
{file = "dash_table-5.0.0.tar.gz", hash = "sha256:18624d693d4c8ef2ddec99a6f167593437a7ea0bf153aa20f318c170c5bc7308"},
]
[[package]]
name = "debugpy"
@@ -1928,54 +1889,6 @@ djangorestframework-jsonapi = ">=6.0.0"
drf-extensions = ">=0.7.1"
drf-spectacular = ">=0.25.0"
[[package]]
name = "dulwich"
version = "0.23.0"
description = "Python Git Library"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "dulwich-0.23.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c13b0d5a9009cde23ecb8cb201df6e23e2a7a82c5e2d6ba6443fbb322c9befc6"},
{file = "dulwich-0.23.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:a68faf8612bf93de1285048d6ad13160f0fb3c5596a86e694e78f4e212886fa5"},
{file = "dulwich-0.23.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:d971566826f16ec67c70641c1fbdb337323aa5b533799bc5a4641f4750e73b36"},
{file = "dulwich-0.23.0-cp310-cp310-win32.whl", hash = "sha256:27d970adf539806dfc4fe3e4c9e8dc6ebf0318977a56e24d22f13413535a51ba"},
{file = "dulwich-0.23.0-cp310-cp310-win_amd64.whl", hash = "sha256:025178533e884ffdb0d9d8db4b8870745d438cbfecb782fd1b56c3b6438e86cf"},
{file = "dulwich-0.23.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d68498fdda13ab00791b483daab3bcfe9f9721c037aa458695e6ad81640c57cc"},
{file = "dulwich-0.23.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:cb7bb930b12471a1cfcea4b3d25a671dc0ad32573f0ad25684684298959a1527"},
{file = "dulwich-0.23.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:a2abbce32fd2bc7902bcc5f69b10bf22576810de21651baaa864b78fd7aec261"},
{file = "dulwich-0.23.0-cp311-cp311-win32.whl", hash = "sha256:9e3151f10ce2a9ff91bca64c74345217f53bdd947dc958032343822009832f7a"},
{file = "dulwich-0.23.0-cp311-cp311-win_amd64.whl", hash = "sha256:3ae9f1d9dc92d4e9a3f89ba2c55221f7b6442c5dd93b3f6f539a3c9eb3f37bdd"},
{file = "dulwich-0.23.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:52cdef66a7994d29528ca79ca59452518bbba3fd56a9c61c61f6c467c1c7956e"},
{file = "dulwich-0.23.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:d473888a6ab9ed5d4a4c3f053cbe5b77f72d54b6efdf5688fed76094316e571e"},
{file = "dulwich-0.23.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:19fcf20224c641a61c774da92f098fbaae9938c7e17a52841e64092adf7e78f9"},
{file = "dulwich-0.23.0-cp312-cp312-win32.whl", hash = "sha256:7fc8b76b704ef35cd001e993e3aa4e1d666a2064bf467c07c560f12b2959dcaf"},
{file = "dulwich-0.23.0-cp312-cp312-win_amd64.whl", hash = "sha256:cb0566b888b578325350b4d67c61a0de35d417e9877560e3a6df88cae4576a59"},
{file = "dulwich-0.23.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:624e2223c8b705b3a217f9c8d3bfed3a573093be0b0ba033c46cba8411fb9630"},
{file = "dulwich-0.23.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:b4eaf326d15bb3fc5316c777b0312f0fe02f6f82a4368cd971d0ce2167b7ec34"},
{file = "dulwich-0.23.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:d754afaf7c133a015c75cc2be11703138b4be932e0eeeb2c70add56083f31109"},
{file = "dulwich-0.23.0-cp313-cp313-win32.whl", hash = "sha256:ac53ec438bde3c1f479782c34240479b36cd47230d091979137b7ecc12c0242e"},
{file = "dulwich-0.23.0-cp313-cp313-win_amd64.whl", hash = "sha256:50d3b4ba45671fb8b7d2afbd02c10b4edbc3290a1f92260e64098b409e9ca35c"},
{file = "dulwich-0.23.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d8e18ea3fa49f10932077f39c0b960b5045870c550c3d7c74f3cfaac09457cd6"},
{file = "dulwich-0.23.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:3e6df0eb8cca21f210e3ddce2ccb64482646893dbec2fee9f3411d037595bf7b"},
{file = "dulwich-0.23.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:90c0064d7df8e7fe83d3a03c7d60b9e07a92698b18442f926199b2c3f0bf34d4"},
{file = "dulwich-0.23.0-cp39-cp39-win32.whl", hash = "sha256:84eef513aba501cbc1f223863f3b4b351fe732d3fb590cab9bdf5d33eb1a1248"},
{file = "dulwich-0.23.0-cp39-cp39-win_amd64.whl", hash = "sha256:dce943da48217c26e15790fd6df62d27a7f1d067102780351ebf2635fc0ba482"},
{file = "dulwich-0.23.0-py3-none-any.whl", hash = "sha256:d8da6694ca332bb48775e35ee2215aa4673821164a91b83062f699c69f7cd135"},
{file = "dulwich-0.23.0.tar.gz", hash = "sha256:0aa6c2489dd5e978b27e9b75983b7331a66c999f0efc54ebe37cab808ed322ae"},
]
[package.dependencies]
urllib3 = ">=1.25"
[package.extras]
dev = ["dissolve (>=0.1.1)", "mypy (==1.16.0)", "ruff (==0.11.13)"]
fastimport = ["fastimport"]
https = ["urllib3 (>=1.24.1)"]
merge = ["merge3"]
paramiko = ["paramiko"]
pgp = ["gpg"]
[[package]]
name = "durationpy"
version = "0.9"
@@ -2306,20 +2219,18 @@ files = [
]
[[package]]
name = "graphemeu"
version = "0.7.2"
name = "grapheme"
version = "0.6.0"
description = "Unicode grapheme helpers"
optional = false
python-versions = ">=3.7"
python-versions = "*"
groups = ["main"]
files = [
{file = "graphemeu-0.7.2-py3-none-any.whl", hash = "sha256:1444520f6899fd30114fc2a39f297d86d10fa0f23bf7579f772f8bc7efaa2542"},
{file = "graphemeu-0.7.2.tar.gz", hash = "sha256:42bbe373d7c146160f286cd5f76b1a8ad29172d7333ce10705c5cc282462a4f8"},
{file = "grapheme-0.6.0.tar.gz", hash = "sha256:44c2b9f21bbe77cfb05835fec230bd435954275267fea1858013b102f8603cca"},
]
[package.extras]
dev = ["pytest"]
docs = ["sphinx", "sphinx-autobuild"]
test = ["pytest", "sphinx", "sphinx-autobuild", "twine", "wheel"]
[[package]]
name = "gunicorn"
@@ -2458,18 +2369,6 @@ files = [
{file = "hyperframe-6.1.0.tar.gz", hash = "sha256:f630908a00854a7adeabd6382b43923a4c4cd4b821fcb527e6ab9e15382a3b08"},
]
[[package]]
name = "iamdata"
version = "0.1.202507291"
description = "IAM data for AWS actions, resources, and conditions based on IAM policy documents. Checked for updates daily."
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "iamdata-0.1.202507291-py3-none-any.whl", hash = "sha256:11dfdacc3ce0312468aa5ccafee461cd39b1deb7be112042deea91cbcd4b292b"},
{file = "iamdata-0.1.202507291.tar.gz", hash = "sha256:b386ce94819464554dc1258238ee1b232d86f0467edc13fffbf4de7332b3c7ad"},
]
[[package]]
name = "idna"
version = "3.10"
@@ -3986,7 +3885,7 @@ files = [
[[package]]
name = "prowler"
version = "5.10.0"
version = "5.8.0"
description = "Prowler is an Open Source security tool to perform AWS, GCP and Azure security best practices assessments, audits, incident response, continuous monitoring, hardening and forensics readiness. It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, FedRAMP, PCI-DSS, GDPR, HIPAA, FFIEC, SOC2, GXP, AWS Well-Architected Framework Security Pillar, AWS Foundational Technical Review (FTR), ENS (Spanish National Security Scheme) and your custom security frameworks."
optional = false
python-versions = ">3.9.1,<3.13"
@@ -3995,7 +3894,7 @@ files = []
develop = false
[package.dependencies]
alive-progress = "3.3.0"
alive-progress = "3.2.0"
awsipranges = "0.3.3"
azure-identity = "1.21.0"
azure-keyvault-keys = "4.10.0"
@@ -4005,13 +3904,10 @@ azure-mgmt-compute = "34.0.0"
azure-mgmt-containerregistry = "12.0.0"
azure-mgmt-containerservice = "34.1.0"
azure-mgmt-cosmosdb = "9.7.0"
azure-mgmt-databricks = "2.0.0"
azure-mgmt-keyvault = "10.3.1"
azure-mgmt-monitor = "6.0.2"
azure-mgmt-network = "28.1.0"
azure-mgmt-rdbms = "10.1.0"
azure-mgmt-recoveryservices = "3.1.0"
azure-mgmt-recoveryservicesbackup = "9.2.0"
azure-mgmt-resource = "23.3.0"
azure-mgmt-search = "9.1.0"
azure-mgmt-security = "7.0.0"
@@ -4020,14 +3916,13 @@ azure-mgmt-storage = "22.1.1"
azure-mgmt-subscription = "3.1.1"
azure-mgmt-web = "8.0.0"
azure-storage-blob = "12.24.1"
boto3 = "1.39.15"
botocore = "1.39.15"
boto3 = "1.35.99"
botocore = "1.35.99"
colorama = "0.4.6"
cryptography = "44.0.1"
dash = "3.1.1"
dash-bootstrap-components = "2.0.3"
dash = "2.18.2"
dash-bootstrap-components = "1.6.0"
detect-secrets = "1.5.0"
dulwich = "0.23.0"
google-api-python-client = "2.163.0"
google-auth-httplib2 = ">=0.1,<0.3"
jsonschema = "4.23.0"
@@ -4036,13 +3931,12 @@ microsoft-kiota-abstractions = "1.9.2"
msgraph-sdk = "1.23.0"
numpy = "2.0.2"
pandas = "2.2.3"
py-iam-expand = "0.1.0"
py-ocsf-models = "0.5.0"
pydantic = ">=2.0,<3.0"
py-ocsf-models = "0.3.1"
pydantic = "1.10.21"
pygithub = "2.5.0"
python-dateutil = ">=2.9.0.post0,<3.0.0"
pytz = "2025.1"
schema = "0.7.5"
schema = "0.7.7"
shodan = "1.31.0"
slack-sdk = "3.34.0"
tabulate = "0.9.0"
@@ -4051,8 +3945,8 @@ tzlocal = "5.3.1"
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "v5.10"
resolved_reference = "ff900a2a455def25eb7f5a7d25248e58eae24a34"
reference = "master"
resolved_reference = "ea97de7f43a2063476b49f7697bb6c7b51137c11"
[[package]]
name = "psutil"
@@ -4166,37 +4060,22 @@ files = [
{file = "psycopg2_binary-2.9.9-cp39-cp39-win_amd64.whl", hash = "sha256:f7ae5d65ccfbebdfa761585228eb4d0df3a8b15cfb53bd953e713e09fbb12957"},
]
[[package]]
name = "py-iam-expand"
version = "0.1.0"
description = "This is a Python package to expand and deobfuscate IAM policies."
optional = false
python-versions = "<3.14,>3.9.1"
groups = ["main"]
files = [
{file = "py_iam_expand-0.1.0-py3-none-any.whl", hash = "sha256:b845ce7b50ac895b02b4f338e09c62a68ea51849794f76e189b02009bd388510"},
{file = "py_iam_expand-0.1.0.tar.gz", hash = "sha256:5a2884dc267ac59a02c3a80fefc0b34c309dac681baa0f87c436067c6cf53a96"},
]
[package.dependencies]
iamdata = ">=0.1.202504091"
[[package]]
name = "py-ocsf-models"
version = "0.5.0"
version = "0.3.1"
description = "This is a Python implementation of the OCSF models. The models are used to represent the data of the OCSF Schema defined in https://schema.ocsf.io/."
optional = false
python-versions = "<3.14,>3.9.1"
python-versions = "<3.13,>3.9.1"
groups = ["main"]
files = [
{file = "py_ocsf_models-0.5.0-py3-none-any.whl", hash = "sha256:7933253f56782c04c412d976796db429577810b951fe4195351794500b5962d8"},
{file = "py_ocsf_models-0.5.0.tar.gz", hash = "sha256:bf05e955809d1ec3ab1007e4a4b2a8a0afa74b6e744ea8ffbf386e46b3af0a76"},
{file = "py_ocsf_models-0.3.1-py3-none-any.whl", hash = "sha256:e722d567a7f3e5190fdd053c2e75a69cf33fab6f5c0a4b7de678768ba340ae3a"},
{file = "py_ocsf_models-0.3.1.tar.gz", hash = "sha256:60defd2cc86e8882f42dc9c6dacca6dc16d6bc05f9477c2a3486a0d4b5882b94"},
]
[package.dependencies]
cryptography = "44.0.1"
email-validator = "2.2.0"
pydantic = ">=2.9.2,<3.0.0"
pydantic = "1.10.21"
[[package]]
name = "pyasn1"
@@ -4294,137 +4173,70 @@ files = [
[[package]]
name = "pydantic"
version = "2.11.7"
description = "Data validation using Python type hints"
version = "1.10.21"
description = "Data validation and settings management using python type hints"
optional = false
python-versions = ">=3.9"
python-versions = ">=3.7"
groups = ["main", "dev"]
files = [
{file = "pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b"},
{file = "pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db"},
{file = "pydantic-1.10.21-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:245e486e0fec53ec2366df9cf1cba36e0bbf066af7cd9c974bbbd9ba10e1e586"},
{file = "pydantic-1.10.21-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6c54f8d4c151c1de784c5b93dfbb872067e3414619e10e21e695f7bb84d1d1fd"},
{file = "pydantic-1.10.21-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b64708009cfabd9c2211295144ff455ec7ceb4c4fb45a07a804309598f36187"},
{file = "pydantic-1.10.21-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8a148410fa0e971ba333358d11a6dea7b48e063de127c2b09ece9d1c1137dde4"},
{file = "pydantic-1.10.21-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:36ceadef055af06e7756eb4b871cdc9e5a27bdc06a45c820cd94b443de019bbf"},
{file = "pydantic-1.10.21-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c0501e1d12df6ab1211b8cad52d2f7b2cd81f8e8e776d39aa5e71e2998d0379f"},
{file = "pydantic-1.10.21-cp310-cp310-win_amd64.whl", hash = "sha256:c261127c275d7bce50b26b26c7d8427dcb5c4803e840e913f8d9df3f99dca55f"},
{file = "pydantic-1.10.21-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8b6350b68566bb6b164fb06a3772e878887f3c857c46c0c534788081cb48adf4"},
{file = "pydantic-1.10.21-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:935b19fdcde236f4fbf691959fa5c3e2b6951fff132964e869e57c70f2ad1ba3"},
{file = "pydantic-1.10.21-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2b6a04efdcd25486b27f24c1648d5adc1633ad8b4506d0e96e5367f075ed2e0b"},
{file = "pydantic-1.10.21-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c1ba253eb5af8d89864073e6ce8e6c8dec5f49920cff61f38f5c3383e38b1c9f"},
{file = "pydantic-1.10.21-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:57f0101e6c97b411f287a0b7cf5ebc4e5d3b18254bf926f45a11615d29475793"},
{file = "pydantic-1.10.21-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:90e85834f0370d737c77a386ce505c21b06bfe7086c1c568b70e15a568d9670d"},
{file = "pydantic-1.10.21-cp311-cp311-win_amd64.whl", hash = "sha256:6a497bc66b3374b7d105763d1d3de76d949287bf28969bff4656206ab8a53aa9"},
{file = "pydantic-1.10.21-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2ed4a5f13cf160d64aa331ab9017af81f3481cd9fd0e49f1d707b57fe1b9f3ae"},
{file = "pydantic-1.10.21-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3b7693bb6ed3fbe250e222f9415abb73111bb09b73ab90d2d4d53f6390e0ccc1"},
{file = "pydantic-1.10.21-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:185d5f1dff1fead51766da9b2de4f3dc3b8fca39e59383c273f34a6ae254e3e2"},
{file = "pydantic-1.10.21-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:38e6d35cf7cd1727822c79e324fa0677e1a08c88a34f56695101f5ad4d5e20e5"},
{file = "pydantic-1.10.21-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:1d7c332685eafacb64a1a7645b409a166eb7537f23142d26895746f628a3149b"},
{file = "pydantic-1.10.21-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c9b782db6f993a36092480eeaab8ba0609f786041b01f39c7c52252bda6d85f"},
{file = "pydantic-1.10.21-cp312-cp312-win_amd64.whl", hash = "sha256:7ce64d23d4e71d9698492479505674c5c5b92cda02b07c91dfc13633b2eef805"},
{file = "pydantic-1.10.21-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0067935d35044950be781933ab91b9a708eaff124bf860fa2f70aeb1c4be7212"},
{file = "pydantic-1.10.21-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5e8148c2ce4894ce7e5a4925d9d3fdce429fb0e821b5a8783573f3611933a251"},
{file = "pydantic-1.10.21-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a4973232c98b9b44c78b1233693e5e1938add5af18042f031737e1214455f9b8"},
{file = "pydantic-1.10.21-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:662bf5ce3c9b1cef32a32a2f4debe00d2f4839fefbebe1d6956e681122a9c839"},
{file = "pydantic-1.10.21-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:98737c3ab5a2f8a85f2326eebcd214510f898881a290a7939a45ec294743c875"},
{file = "pydantic-1.10.21-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:0bb58bbe65a43483d49f66b6c8474424d551a3fbe8a7796c42da314bac712738"},
{file = "pydantic-1.10.21-cp313-cp313-win_amd64.whl", hash = "sha256:e622314542fb48542c09c7bd1ac51d71c5632dd3c92dc82ede6da233f55f4848"},
{file = "pydantic-1.10.21-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:d356aa5b18ef5a24d8081f5c5beb67c0a2a6ff2a953ee38d65a2aa96526b274f"},
{file = "pydantic-1.10.21-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:08caa8c0468172d27c669abfe9e7d96a8b1655ec0833753e117061febaaadef5"},
{file = "pydantic-1.10.21-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c677aa39ec737fec932feb68e4a2abe142682f2885558402602cd9746a1c92e8"},
{file = "pydantic-1.10.21-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:79577cc045d3442c4e845df53df9f9202546e2ba54954c057d253fc17cd16cb1"},
{file = "pydantic-1.10.21-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:b6b73ab347284719f818acb14f7cd80696c6fdf1bd34feee1955d7a72d2e64ce"},
{file = "pydantic-1.10.21-cp37-cp37m-win_amd64.whl", hash = "sha256:46cffa24891b06269e12f7e1ec50b73f0c9ab4ce71c2caa4ccf1fb36845e1ff7"},
{file = "pydantic-1.10.21-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:298d6f765e3c9825dfa78f24c1efd29af91c3ab1b763e1fd26ae4d9e1749e5c8"},
{file = "pydantic-1.10.21-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f2f4a2305f15eff68f874766d982114ac89468f1c2c0b97640e719cf1a078374"},
{file = "pydantic-1.10.21-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:35b263b60c519354afb3a60107d20470dd5250b3ce54c08753f6975c406d949b"},
{file = "pydantic-1.10.21-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e23a97a6c2f2db88995496db9387cd1727acdacc85835ba8619dce826c0b11a6"},
{file = "pydantic-1.10.21-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:3c96fed246ccc1acb2df032ff642459e4ae18b315ecbab4d95c95cfa292e8517"},
{file = "pydantic-1.10.21-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:b92893ebefc0151474f682e7debb6ab38552ce56a90e39a8834734c81f37c8a9"},
{file = "pydantic-1.10.21-cp38-cp38-win_amd64.whl", hash = "sha256:b8460bc256bf0de821839aea6794bb38a4c0fbd48f949ea51093f6edce0be459"},
{file = "pydantic-1.10.21-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5d387940f0f1a0adb3c44481aa379122d06df8486cc8f652a7b3b0caf08435f7"},
{file = "pydantic-1.10.21-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:266ecfc384861d7b0b9c214788ddff75a2ea123aa756bcca6b2a1175edeca0fe"},
{file = "pydantic-1.10.21-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61da798c05a06a362a2f8c5e3ff0341743e2818d0f530eaac0d6898f1b187f1f"},
{file = "pydantic-1.10.21-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a621742da75ce272d64ea57bd7651ee2a115fa67c0f11d66d9dcfc18c2f1b106"},
{file = "pydantic-1.10.21-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:9e3e4000cd54ef455694b8be9111ea20f66a686fc155feda1ecacf2322b115da"},
{file = "pydantic-1.10.21-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f198c8206640f4c0ef5a76b779241efb1380a300d88b1bce9bfe95a6362e674d"},
{file = "pydantic-1.10.21-cp39-cp39-win_amd64.whl", hash = "sha256:e7f0cda108b36a30c8fc882e4fc5b7eec8ef584aa43aa43694c6a7b274fb2b56"},
{file = "pydantic-1.10.21-py3-none-any.whl", hash = "sha256:db70c920cba9d05c69ad4a9e7f8e9e83011abb2c6490e561de9ae24aee44925c"},
{file = "pydantic-1.10.21.tar.gz", hash = "sha256:64b48e2b609a6c22178a56c408ee1215a7206077ecb8a193e2fda31858b2362a"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
pydantic-core = "2.33.2"
typing-extensions = ">=4.12.2"
typing-inspection = ">=0.4.0"
typing-extensions = ">=4.2.0"
[package.extras]
email = ["email-validator (>=2.0.0)"]
timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows\""]
[[package]]
name = "pydantic-core"
version = "2.33.2"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8"},
{file = "pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2"},
{file = "pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a"},
{file = "pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b"},
{file = "pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22"},
{file = "pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7"},
{file = "pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef"},
{file = "pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30"},
{file = "pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf"},
{file = "pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab"},
{file = "pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc"},
{file = "pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1"},
{file = "pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6"},
{file = "pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea"},
{file = "pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2"},
{file = "pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f"},
{file = "pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d"},
{file = "pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e"},
{file = "pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162"},
{file = "pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9"},
{file = "pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9"},
{file = "pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac"},
{file = "pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5"},
{file = "pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a2b911a5b90e0374d03813674bf0a5fbbb7741570dcd4b4e85a2e48d17def29d"},
{file = "pydantic_core-2.33.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:6fa6dfc3e4d1f734a34710f391ae822e0a8eb8559a85c6979e14e65ee6ba2954"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c54c939ee22dc8e2d545da79fc5381f1c020d6d3141d3bd747eab59164dc89fb"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53a57d2ed685940a504248187d5685e49eb5eef0f696853647bf37c418c538f7"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:09fb9dd6571aacd023fe6aaca316bd01cf60ab27240d7eb39ebd66a3a15293b4"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0e6116757f7959a712db11f3e9c0a99ade00a5bbedae83cb801985aa154f071b"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d55ab81c57b8ff8548c3e4947f119551253f4e3787a7bbc0b6b3ca47498a9d3"},
{file = "pydantic_core-2.33.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c20c462aa4434b33a2661701b861604913f912254e441ab8d78d30485736115a"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:44857c3227d3fb5e753d5fe4a3420d6376fa594b07b621e220cd93703fe21782"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:eb9b459ca4df0e5c87deb59d37377461a538852765293f9e6ee834f0435a93b9"},
{file = "pydantic_core-2.33.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9fcd347d2cc5c23b06de6d3b7b8275be558a0c90549495c699e379a80bf8379e"},
{file = "pydantic_core-2.33.2-cp39-cp39-win32.whl", hash = "sha256:83aa99b1285bc8f038941ddf598501a86f1536789740991d7d8756e34f1e74d9"},
{file = "pydantic_core-2.33.2-cp39-cp39-win_amd64.whl", hash = "sha256:f481959862f57f29601ccced557cc2e817bce7533ab8e01a797a48b49c9692b3"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c"},
{file = "pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb"},
{file = "pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:87acbfcf8e90ca885206e98359d7dca4bcbb35abdc0ff66672a293e1d7a19101"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f92c15cd1e97d4b12acd1cc9004fa092578acfa57b67ad5e43a197175d01a64"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3f26877a748dc4251cfcfda9dfb5f13fcb034f5308388066bcfe9031b63ae7d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dac89aea9af8cd672fa7b510e7b8c33b0bba9a43186680550ccf23020f32d535"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:970919794d126ba8645f3837ab6046fb4e72bbc057b3709144066204c19a455d"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3eb3fe62804e8f859c49ed20a8451342de53ed764150cb14ca71357c765dc2a6"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:3abcd9392a36025e3bd55f9bd38d908bd17962cc49bc6da8e7e96285336e2bca"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:3a1c81334778f9e3af2f8aeb7a960736e5cab1dfebfb26aabca09afd2906c039"},
{file = "pydantic_core-2.33.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2807668ba86cb38c6817ad9bc66215ab8584d1d304030ce4f0887336f28a5e27"},
{file = "pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc"},
]
[package.dependencies]
typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
dotenv = ["python-dotenv (>=0.10.4)"]
email = ["email-validator (>=1.0.3)"]
[[package]]
name = "pygithub"
@@ -5254,21 +5066,21 @@ files = [
[[package]]
name = "s3transfer"
version = "0.13.1"
version = "0.10.4"
description = "An Amazon S3 Transfer Manager"
optional = false
python-versions = ">=3.9"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "s3transfer-0.13.1-py3-none-any.whl", hash = "sha256:a981aa7429be23fe6dfc13e80e4020057cbab622b08c0315288758d67cabc724"},
{file = "s3transfer-0.13.1.tar.gz", hash = "sha256:c3fdba22ba1bd367922f27ec8032d6a1cf5f10c934fb5d68cf60fd5a23d936cf"},
{file = "s3transfer-0.10.4-py3-none-any.whl", hash = "sha256:244a76a24355363a68164241438de1b72f8781664920260c48465896b712a41e"},
{file = "s3transfer-0.10.4.tar.gz", hash = "sha256:29edc09801743c21eb5ecbc617a152df41d3c287f67b615f73e5f750583666a7"},
]
[package.dependencies]
botocore = ">=1.37.4,<2.0a.0"
botocore = ">=1.33.2,<2.0a.0"
[package.extras]
crt = ["botocore[crt] (>=1.37.4,<2.0a.0)"]
crt = ["botocore[crt] (>=1.33.2,<2.0a.0)"]
[[package]]
name = "safety"
@@ -5327,19 +5139,16 @@ typing-extensions = ">=4.7.1"
[[package]]
name = "schema"
version = "0.7.5"
version = "0.7.7"
description = "Simple data validation library"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "schema-0.7.5-py2.py3-none-any.whl", hash = "sha256:f3ffdeeada09ec34bf40d7d79996d9f7175db93b7a5065de0faa7f41083c1e6c"},
{file = "schema-0.7.5.tar.gz", hash = "sha256:f06717112c61895cabc4707752b88716e8420a8819d71404501e114f91043197"},
{file = "schema-0.7.7-py2.py3-none-any.whl", hash = "sha256:5d976a5b50f36e74e2157b47097b60002bd4d42e65425fcc9c9befadb4255dde"},
{file = "schema-0.7.7.tar.gz", hash = "sha256:7da553abd2958a19dc2547c388cde53398b39196175a9be59ea1caf5ab0a1807"},
]
[package.dependencies]
contextlib2 = ">=0.5.5"
[[package]]
name = "sentry-sdk"
version = "2.26.1"
@@ -5649,21 +5458,6 @@ files = [
{file = "typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef"},
]
[[package]]
name = "typing-inspection"
version = "0.4.1"
description = "Runtime typing introspection tools"
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51"},
{file = "typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28"},
]
[package.dependencies]
typing-extensions = ">=4.12.0"
[[package]]
name = "tzdata"
version = "2025.2"
@@ -6122,4 +5916,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.11,<3.13"
content-hash = "7aa50d0e8afd3dfa080541d0bfd7ea960720a9848d1e6f801bc082528f43c56b"
content-hash = "6802b33984c2f8438c9dc02dac0a0c14d5a78af60251bd0c80ca59bc2182c48e"
+2 -2
@@ -24,7 +24,7 @@ dependencies = [
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.10",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
@@ -38,7 +38,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.11.2"
version = "1.10.2"
[project.scripts]
celery = "src.backend.config.settings.celery"
+14 -44
View File
@@ -5167,7 +5167,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5176,7 +5175,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
- in: query
name: filter[provider_type__in]
schema:
@@ -5187,7 +5185,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5198,7 +5195,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
explode: false
style: form
- in: query
@@ -5431,7 +5427,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5440,7 +5435,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
- in: query
name: filter[provider_type__in]
schema:
@@ -5451,7 +5445,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5462,7 +5455,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
explode: false
style: form
- in: query
@@ -5701,7 +5693,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5710,7 +5701,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
- in: query
name: filter[provider_type__in]
schema:
@@ -5721,7 +5711,6 @@ paths:
- aws
- azure
- gcp
- github
- kubernetes
- m365
description: |-
@@ -5732,7 +5721,6 @@ paths:
* `gcp` - GCP
* `kubernetes` - Kubernetes
* `m365` - M365
* `github` - GitHub
explode: false
style: form
- in: query
@@ -8877,6 +8865,7 @@ components:
readOnly: true
enabled:
type: boolean
readOnly: true
connected:
type: boolean
readOnly: true
@@ -8908,16 +8897,11 @@ components:
description: The name of the S3 bucket where files will be stored.
output_directory:
type: string
description: 'The directory path within the bucket where files
will be saved. Optional - defaults to "output" if not provided.
Path will be normalized to remove excessive slashes and invalid
characters are not allowed (< > : " | ? *). Maximum length is
900 characters.'
maxLength: 900
pattern: ^[^<>:"|?*]+$
default: output
description: The directory path within the bucket where files
will be saved.
required:
- bucket_name
- output_directory
credentials:
oneOf:
- type: object
@@ -9022,6 +9006,7 @@ components:
readOnly: true
enabled:
type: boolean
readOnly: true
connected:
type: boolean
readOnly: true
@@ -9054,16 +9039,11 @@ components:
stored.
output_directory:
type: string
description: 'The directory path within the bucket where files
will be saved. Optional - defaults to "output" if not provided.
Path will be normalized to remove excessive slashes and
invalid characters are not allowed (< > : " | ? *). Maximum
length is 900 characters.'
maxLength: 900
pattern: ^[^<>:"|?*]+$
default: output
description: The directory path within the bucket where files
will be saved.
required:
- bucket_name
- output_directory
credentials:
oneOf:
- type: object
@@ -9215,16 +9195,11 @@ components:
description: The name of the S3 bucket where files will be stored.
output_directory:
type: string
description: 'The directory path within the bucket where files
will be saved. Optional - defaults to "output" if not provided.
Path will be normalized to remove excessive slashes and invalid
characters are not allowed (< > : " | ? *). Maximum length is
900 characters.'
maxLength: 900
pattern: ^[^<>:"|?*]+$
default: output
description: The directory path within the bucket where files
will be saved.
required:
- bucket_name
- output_directory
credentials:
oneOf:
- type: object
@@ -10574,16 +10549,11 @@ components:
stored.
output_directory:
type: string
description: 'The directory path within the bucket where files
will be saved. Optional - defaults to "output" if not provided.
Path will be normalized to remove excessive slashes and
invalid characters are not allowed (< > : " | ? *). Maximum
length is 900 characters.'
maxLength: 900
pattern: ^[^<>:"|?*]+$
default: output
description: The directory path within the bucket where files
will be saved.
required:
- bucket_name
- output_directory
credentials:
oneOf:
- type: object
@@ -30,6 +30,9 @@ class TestS3ConfigSerializer:
"""Test that empty values raise validation errors."""
serializer = S3ConfigSerializer()
with pytest.raises(ValidationError, match="Output directory cannot be empty"):
serializer.validate_output_directory("")
with pytest.raises(
ValidationError, match="Output directory cannot be empty or just"
):
-2
View File
@@ -5679,7 +5679,6 @@ class TestIntegrationViewSet:
"integration_type": integration_type,
"configuration": configuration,
"credentials": credentials,
"enabled": True,
},
"relationships": {
"providers": {
@@ -5697,7 +5696,6 @@ class TestIntegrationViewSet:
assert Integration.objects.count() == 1
integration = Integration.objects.first()
assert integration.configuration == data["data"]["attributes"]["configuration"]
assert integration.enabled == data["data"]["attributes"]["enabled"]
assert (
integration.integration_type
== data["data"]["attributes"]["integration_type"]
@@ -9,17 +9,15 @@ from api.v1.serializer_utils.base import BaseValidateSerializer
class S3ConfigSerializer(BaseValidateSerializer):
bucket_name = serializers.CharField()
output_directory = serializers.CharField(allow_blank=True)
output_directory = serializers.CharField()
def validate_output_directory(self, value):
"""
Validate the output_directory field to ensure it's a properly formatted path.
Prevents paths with excessive slashes like "///////test".
If empty, sets a default value.
"""
# If empty or None, set default value
if not value:
return "output"
raise serializers.ValidationError("Output directory cannot be empty.")
# Normalize the path to remove excessive slashes
normalized_path = os.path.normpath(value)
@@ -37,7 +35,7 @@ class S3ConfigSerializer(BaseValidateSerializer):
# Check for empty path after normalization
if not normalized_path or normalized_path == ".":
raise serializers.ValidationError(
"Output directory cannot be empty or just '.' or '/'."
"Output directory cannot be empty or just '.'."
)
# Check for paths that are too long (S3 key limit is 1024 characters, leave some room for filename)
@@ -138,13 +136,12 @@ class IntegrationCredentialField(serializers.JSONField):
},
"output_directory": {
"type": "string",
"description": 'The directory path within the bucket where files will be saved. Optional - defaults to "output" if not provided. Path will be normalized to remove excessive slashes and invalid characters are not allowed (< > : " | ? *). Maximum length is 900 characters.',
"description": 'The directory path within the bucket where files will be saved. Path will be normalized to remove excessive slashes and invalid characters are not allowed (< > : " | ? *). Maximum length is 900 characters.',
"maxLength": 900,
"pattern": '^[^<>:"|?*]+$',
"default": "output",
},
},
"required": ["bucket_name"],
"required": ["bucket_name", "output_directory"],
},
]
}
+3 -1
View File
@@ -2068,19 +2068,21 @@ class IntegrationCreateSerializer(BaseWriteIntegrationSerializer):
"inserted_at": {"read_only": True},
"updated_at": {"read_only": True},
"connected": {"read_only": True},
"enabled": {"read_only": True},
"connection_last_checked_at": {"read_only": True},
}
def validate(self, attrs):
super().validate(attrs)
integration_type = attrs.get("integration_type")
providers = attrs.get("providers")
configuration = attrs.get("configuration")
credentials = attrs.get("credentials")
validated_attrs = super().validate(attrs)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
validated_attrs = super().validate(attrs)
return validated_attrs
def create(self, validated_data):
+1 -1
View File
@@ -293,7 +293,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.11.2"
spectacular_settings.VERSION = "1.10.2"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
+3 -8
View File
@@ -8,12 +8,11 @@ from botocore.exceptions import ClientError, NoCredentialsError, ParamValidation
from celery.utils.log import get_task_logger
from django.conf import settings
from api.db_utils import rls_transaction
from api.models import Scan
from prowler.config.config import (
csv_file_suffix,
html_file_suffix,
json_ocsf_file_suffix,
output_file_timestamp,
)
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
@@ -249,19 +248,15 @@ def _generate_output_directory(
# Sanitize the prowler provider name to ensure it is a valid directory name
prowler_provider_sanitized = re.sub(r"[^\w\-]", "-", prowler_provider)
with rls_transaction(tenant_id):
started_at = Scan.objects.get(id=scan_id).started_at
timestamp = started_at.strftime("%Y%m%d%H%M%S")
path = (
f"{output_directory}/{tenant_id}/{scan_id}/prowler-output-"
f"{prowler_provider_sanitized}-{timestamp}"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(path.split("/")[:-1]), exist_ok=True)
compliance_path = (
f"{output_directory}/{tenant_id}/{scan_id}/compliance/prowler-output-"
f"{prowler_provider_sanitized}-{timestamp}"
f"{prowler_provider_sanitized}-{output_file_timestamp}"
)
os.makedirs("/".join(compliance_path.split("/")[:-1]), exist_ok=True)
+10 -5
View File
@@ -83,12 +83,17 @@ def upload_s3_integration(
for integration in integrations:
try:
connected, s3 = get_s3_client_from_integration(integration)
# Since many scans will be send to the same S3 bucket, we need to
# add the output directory to the S3 output directory to avoid
# overwriting the files and known the scan origin.
folder = os.getenv("OUTPUT_DIRECTORY", "/tmp/prowler_api_output")
s3._output_directory = (
f"{s3._output_directory}{output_directory.split(folder)[-1]}"
)
except Exception as e:
logger.info(
logger.error(
f"S3 connection failed for integration {integration.id}: {e}"
)
integration.connected = False
integration.save()
continue
if connected:
@@ -140,7 +145,7 @@ def upload_s3_integration(
integration.connected = False
integration.save()
logger.error(
f"S3 upload failed, connection failed for integration {integration.id}: {s3.error}"
f"S3 upload failed for integration {integration.id}: {s3.error}"
)
result = integration_executions == len(integrations)
@@ -149,7 +154,7 @@ def upload_s3_integration(
f"All the S3 integrations completed successfully for provider {provider_id}"
)
else:
logger.info(f"Some S3 integrations failed for provider {provider_id}")
logger.error(f"Some S3 integrations failed for provider {provider_id}")
return result
except Exception as e:
logger.error(f"S3 integrations failed for provider {provider_id}: {str(e)}")
+12 -38
View File
@@ -1,7 +1,5 @@
import os
import uuid
import zipfile
from datetime import datetime
from pathlib import Path
from unittest.mock import MagicMock, patch
@@ -129,26 +127,14 @@ class TestOutputs:
_upload_to_s3("tenant", str(zip_path), "scan")
mock_logger.assert_called()
@patch("tasks.jobs.export.rls_transaction")
@patch("tasks.jobs.export.Scan")
def test_generate_output_directory_creates_paths(
self, mock_scan, mock_rls_transaction, tmpdir
):
# Mock the scan object with a started_at timestamp
mock_scan_instance = MagicMock()
mock_scan_instance.started_at = datetime(2023, 6, 15, 10, 30, 45)
mock_scan.objects.get.return_value = mock_scan_instance
# Mock rls_transaction as a context manager
mock_rls_transaction.return_value.__enter__ = MagicMock()
mock_rls_transaction.return_value.__exit__ = MagicMock(return_value=False)
def test_generate_output_directory_creates_paths(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = str(uuid.uuid4())
scan_id = str(uuid.uuid4())
tenant_id = "t1"
scan_id = "s1"
provider = "aws"
expected_timestamp = "20230615103045"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
@@ -157,29 +143,17 @@ class TestOutputs:
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"{provider}-{expected_timestamp}")
assert compliance.endswith(f"{provider}-{expected_timestamp}")
assert path.endswith(f"{provider}-{output_file_timestamp}")
assert compliance.endswith(f"{provider}-{output_file_timestamp}")
@patch("tasks.jobs.export.rls_transaction")
@patch("tasks.jobs.export.Scan")
def test_generate_output_directory_invalid_character(
self, mock_scan, mock_rls_transaction, tmpdir
):
# Mock the scan object with a started_at timestamp
mock_scan_instance = MagicMock()
mock_scan_instance.started_at = datetime(2023, 6, 15, 10, 30, 45)
mock_scan.objects.get.return_value = mock_scan_instance
# Mock rls_transaction as a context manager
mock_rls_transaction.return_value.__enter__ = MagicMock()
mock_rls_transaction.return_value.__exit__ = MagicMock(return_value=False)
def test_generate_output_directory_invalid_character(self, tmpdir):
from prowler.config.config import output_file_timestamp
base_tmp = Path(str(tmpdir.mkdir("generate_output")))
base_dir = str(base_tmp)
tenant_id = str(uuid.uuid4())
scan_id = str(uuid.uuid4())
tenant_id = "t1"
scan_id = "s1"
provider = "aws/test@check"
expected_timestamp = "20230615103045"
path, compliance = _generate_output_directory(
base_dir, provider, tenant_id, scan_id
@@ -188,5 +162,5 @@ class TestOutputs:
assert os.path.isdir(os.path.dirname(path))
assert os.path.isdir(os.path.dirname(compliance))
assert path.endswith(f"aws-test-check-{expected_timestamp}")
assert compliance.endswith(f"aws-test-check-{expected_timestamp}")
assert path.endswith(f"aws-test-check-{output_file_timestamp}")
assert compliance.endswith(f"aws-test-check-{output_file_timestamp}")
@@ -161,7 +161,7 @@ class TestS3IntegrationUploads:
integration.save.assert_called_once()
assert integration.connected is False
mock_logger.error.assert_any_call(
"S3 upload failed, connection failed for integration i-1: Connection failed"
"S3 upload failed for integration i-1: Connection failed"
)
@patch("tasks.jobs.integrations.rls_transaction")
@@ -204,7 +204,7 @@ class TestS3IntegrationUploads:
result = upload_s3_integration(tenant_id, provider_id, output_directory)
assert result is False
mock_logger.info.assert_any_call(
mock_logger.error.assert_any_call(
"S3 connection failed for integration i-1: failed"
)
@@ -331,35 +331,6 @@ class TestS3IntegrationUploads:
# Verify complex path normalization
assert configuration["output_directory"] == "test/folder/subfolder"
@patch("tasks.jobs.integrations.S3")
def test_s3_client_uses_output_directory_in_object_paths(self, mock_s3_class):
"""Test that S3 client uses output_directory correctly when generating object paths."""
mock_integration = MagicMock()
mock_integration.credentials = {
"aws_access_key_id": "AKIA...",
"aws_secret_access_key": "SECRET",
}
mock_integration.configuration = {
"bucket_name": "test-bucket",
"output_directory": "my-custom-prefix/scan-results",
}
mock_s3_instance = MagicMock()
mock_connection = MagicMock()
mock_connection.is_connected = True
mock_s3_instance.test_connection.return_value = mock_connection
mock_s3_class.return_value = mock_s3_instance
connected, s3 = get_s3_client_from_integration(mock_integration)
assert connected is True
# Verify S3 was initialized with the correct output_directory
mock_s3_class.assert_called_once_with(
**mock_integration.credentials,
bucket_name="test-bucket",
output_directory="my-custom-prefix/scan-results",
)
@pytest.mark.django_db
class TestProwlerIntegrationConnectionTest:
-44
View File
@@ -50,49 +50,6 @@ The GCP provider implementation follows the general [Provider structure](./provi
- **Location:** [`prowler/providers/gcp/lib/`](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/gcp/lib/)
- **Purpose:** Helpers for argument parsing, mutelist management, and other cross-cutting concerns.
## Retry Configuration
GCP services implement automatic retry functionality for rate limiting errors (HTTP 429). This is configured centrally and must be included in all API calls:
### Required Implementation
```python
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
# In discovery.build()
client = discovery.build(
service, version, credentials=credentials,
num_retries=DEFAULT_RETRY_ATTEMPTS
)
# In request.execute()
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
```
### Configuration
- **Default Value**: 3 attempts (configurable in `prowler/providers/gcp/config.py`)
- **Command Line Flag**: `--gcp-retries-max-attempts` for runtime configuration
- **Error Types**: HTTP 429 and quota exceeded errors
- **Backoff Strategy**: Exponential backoff with randomization
### Example Service Implementation
```python
def _get_instances(self):
for project_id in self.project_ids:
try:
client = discovery.build(
"compute", "v1", credentials=self.credentials,
num_retries=DEFAULT_RETRY_ATTEMPTS
)
request = client.instances().list(project=project_id)
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
# Process response...
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")
```
## Specific Patterns in GCP Services
The generic service pattern is described in [service page](./services.md#service-structure-and-initialisation). You can find all the currently implemented services in the following locations:
@@ -112,7 +69,6 @@ The best reference to understand how to implement a new service is following the
- Resource discovery and attribute collection can be parallelized using `self.__threading_call__`, typically by region/zone or resource.
- All GCP resources are represented as Pydantic `BaseModel` classes, providing type safety and structured access to resource attributes.
- Each GCP API calls are wrapped in try/except blocks, always logging errors.
- **Retry Configuration**: All `request.execute()` calls must include `num_retries=DEFAULT_RETRY_ATTEMPTS` for automatic retry on rate limiting errors (HTTP 429).
- Tags and additional attributes that cannot be retrieved from the default call should be collected and stored for each resource using dedicated methods and threading.
## Specific Patterns in GCP Checks
+1 -1
View File
@@ -209,7 +209,7 @@ Refer to the [Create Prowler Service Principal](../tutorials/microsoft365/gettin
If the external API permissions described in the mentioned section above are not added only checks that work through MS Graph will be executed. This means that the full provider will not be executed.
???+ note
In order to scan all the checks from M365 required permissions to the service principal application must be added. Refer to the [External API Permissions Assignment](../tutorials/microsoft365/getting-started-m365.md#grant-powershell-modules-permissions) section for more information.
In order to scan all the checks from M365 required permissions to the service principal application must be added. Refer to the [Needed permissions](/docs/tutorials/microsoft365/getting-started-m365.md#needed-permissions) section for more information.
### Service Principal and User Credentials Authentication

Before

Width:  |  Height:  |  Size: 25 KiB

After

Width:  |  Height:  |  Size: 25 KiB

Before

Width:  |  Height:  |  Size: 121 KiB

After

Width:  |  Height:  |  Size: 121 KiB

Before

Width:  |  Height:  |  Size: 41 KiB

After

Width:  |  Height:  |  Size: 41 KiB

Before

Width:  |  Height:  |  Size: 124 KiB

After

Width:  |  Height:  |  Size: 124 KiB

Before

Width:  |  Height:  |  Size: 351 KiB

After

Width:  |  Height:  |  Size: 351 KiB

Before

Width:  |  Height:  |  Size: 139 KiB

After

Width:  |  Height:  |  Size: 139 KiB

Before

Width:  |  Height:  |  Size: 144 KiB

After

Width:  |  Height:  |  Size: 144 KiB

Before

Width:  |  Height:  |  Size: 119 KiB

After

Width:  |  Height:  |  Size: 119 KiB

Before

Width:  |  Height:  |  Size: 103 KiB

After

Width:  |  Height:  |  Size: 103 KiB

Before

Width:  |  Height:  |  Size: 117 KiB

After

Width:  |  Height:  |  Size: 117 KiB

Before

Width:  |  Height:  |  Size: 117 KiB

After

Width:  |  Height:  |  Size: 117 KiB

Before

Width:  |  Height:  |  Size: 359 KiB

After

Width:  |  Height:  |  Size: 359 KiB

Before

Width:  |  Height:  |  Size: 86 KiB

After

Width:  |  Height:  |  Size: 86 KiB

Before

Width:  |  Height:  |  Size: 354 KiB

After

Width:  |  Height:  |  Size: 354 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 261 KiB

Before

Width:  |  Height:  |  Size: 18 KiB

After

Width:  |  Height:  |  Size: 18 KiB

Before

Width:  |  Height:  |  Size: 38 KiB

After

Width:  |  Height:  |  Size: 38 KiB

-10
View File
@@ -671,16 +671,6 @@ prowler azure --az-cli-auth --subscription-ids <subscription ID 1> <subscription
prowler gcp --project-ids <Project ID 1> <Project ID 2> ... <Project ID N>
```
- **GCP Retry Configuration**
To configure the maximum number of retry attempts for Google Cloud SDK API calls, use the `--gcp-retries-max-attempts` flag:
```console
prowler gcp --gcp-retries-max-attempts 5
```
This is useful when experiencing quota exceeded errors (HTTP 429) to increase the number of automatic retry attempts.
#### Kubernetes
Prowler enables security scanning of Kubernetes clusters, supporting both **in-cluster** and **external** execution.
-95
View File
@@ -1,95 +0,0 @@
# GCP Retry Configuration in Prowler
Prowler's GCP Provider uses Google Cloud Python SDK's integrated retry mechanism to automatically retry API calls when encountering rate limiting errors (HTTP 429).
## Quick Configuration
### Using Command Line Flag (Recommended)
```bash
prowler gcp --gcp-retries-max-attempts 5
```
### Using Configuration File
Modify `prowler/providers/gcp/config.py`:
```python
DEFAULT_RETRY_ATTEMPTS = 5 # Default: 3
```
## How It Works
- **Automatic Detection**: Handles HTTP 429 and quota exceeded errors
- **Exponential Backoff**: Each retry uses randomized exponential backoff
- **Centralized Config**: All GCP services use the same retry configuration
- **Transparent**: No additional code needed in services
## Error Examples Handled
```
HttpError 429 when requesting https://cloudresourcemanager.googleapis.com/v1/projects/vms-uat-eiger:getIamPolicy?alt=json returned "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute'"
```
## Implementation
### Client-Level Configuration
```python
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
client = discovery.build(
service, version, credentials=credentials,
num_retries=DEFAULT_RETRY_ATTEMPTS
)
```
### Request-Level Configuration
```python
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
```
## Services with Retry Support
All major GCP services are covered:
- Cloud Resource Manager, Compute Engine, IAM
- BigQuery, KMS, Cloud Storage, Monitoring
- DNS, Logging, Cloud SQL, GKE, API Keys, DataProc
## Validation
### Debug Logging
```bash
prowler gcp --log-level DEBUG --log-file debuglogs.txt --project-id your-project-id
```
### Check for Retry Messages
```bash
grep -i "sleeping\|retry\|quota exceeded" debuglogs.txt
```
### Expected Output
```
"Sleeping 1.52 seconds before retry 1 of 3"
"Sleeping 3.23 seconds before retry 2 of 3"
```
## Testing in Real Environment
1. **Reduce API Quotas** in GCP Console:
- APIs & Services > Quotas
- Reduce "Read requests per minute" for Compute Engine API
- Reduce "Policy Read Requests per minute" for IAM API
2. **Run Prowler** with debug logging
3. **Monitor logs** for retry messages
## Troubleshooting
If experiencing rate limiting:
1. Use `--gcp-retries-max-attempts` flag to increase attempts
2. Request quota increases from Google Cloud support
3. Optimize scanning to reduce simultaneous API calls
4. Verify retry functionality with debug logging
## Official References
- [Google Cloud Python Client Libraries](https://cloud.google.com/python/docs)
- [Google Cloud Quotas](https://cloud.google.com/docs/quotas)
- [Google API Core Retry](https://googleapis.dev/python/google-api-core/latest/retry.html)
@@ -24,56 +24,7 @@ Personal Access Tokens provide the simplest GitHub authentication method and sup
- Scroll down the left sidebar
- Click "Developer settings"
3. **Generate Fine-Grained Token**
- Click "Personal access tokens"
- Select "Fine-grained tokens"
- Click "Generate new token"
4. **Configure Token Settings**
- **Token name**: Give your token a descriptive name (e.g., "Prowler Security Scanner")
- **Expiration**: Set an appropriate expiration date (recommended: 90 days or less)
- **Repository access**: Choose "All repositories" or "Only select repositories" based on your needs
???+ note "Public repositories"
Even if you select 'Only select repositories', the token will have access to the public repositories that you own or are a member of.
5. **Configure Token Permissions**
To enable Prowler functionality, configure the following permissions:
- **Repository permissions:**
- **Contents**: Read-only access
- **Metadata**: Read-only access
- **Pull requests**: Read-only access
- **Security advisories**: Read-only access
- **Statuses**: Read-only access
- **Organization permissions:**
- **Members**: Read-only access
- **Account permissions:**
- **Email addresses**: Read-only access
6. **Copy and Store the Token**
- Copy the generated token immediately (GitHub displays tokens only once)
- Store tokens securely using environment variables
![GitHub Personal Access Token Permissions](./img/github-pat-permissions.png)
#### **Option 2: Create a Classic Personal Access Token (Not Recommended)**
???+ warning "Security Risk"
Classic tokens provide broad permissions that may exceed what Prowler actually needs. Use fine-grained tokens instead for better security.
1. **Navigate to GitHub Settings**
- Open [GitHub](https://github.com) and sign in
- Click the profile picture in the top right corner
- Select "Settings" from the dropdown menu
2. **Access Developer Settings**
- Scroll down the left sidebar
- Click "Developer settings"
3. **Generate Classic Token**
3. **Generate New Token**
- Click "Personal access tokens"
- Select "Tokens (classic)"
- Click "Generate new token"
Binary file not shown.

Before

Width:  |  Height:  |  Size: 89 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 275 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 150 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 192 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 95 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 158 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 124 KiB

+1 -356
View File
@@ -56,359 +56,4 @@ If the YAML configuration is invalid, an error message will be displayed
![Check muted fidings](../img/mutelist-ui-9.png)
???+ note
The Mutelist configuration takes effect on the next scans.
## Mutelist Ready To Use Examples
Below are examples for different cloud providers supported by Prowler App. Check how the mutelist works [here](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/mutelist/#how-the-mutelist-works).
### AWS Provider
#### Basic AWS Mutelist
```yaml
Mutelist:
Accounts:
"123456789012":
Checks:
"iam_user_hardware_mfa_enabled":
Regions:
- "us-east-1"
Resources:
- "user-1"
- "user-2"
Description: "Mute MFA findings for specific users in us-east-1"
"s3_bucket_public_access":
Regions:
- "*"
Resources:
- "public-website-bucket"
Description: "Mute public access findings for website bucket"
```
#### AWS Service-Wide Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"ec2_*":
Regions:
- "*"
Resources:
- "*"
Description: "Mute all EC2-related findings across all accounts and regions"
```
#### AWS Tag-Based Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "environment=dev"
- "project=test"
Description: "Mute all findings for resources tagged with environment=dev or project=test"
```
### Azure Provider
???+ note
For Azure provider, the Account ID is the Subscription Name and the Region is the Location.
#### Basic Azure Mutelist
```yaml
Mutelist:
Accounts:
"MySubscription":
Checks:
"storage_blob_public_access_level_is_disabled":
Regions:
- "East US"
- "West US"
Resources:
- "publicstorageblob"
Description: "Mute public access findings for specific blob storage resource"
"app_function_vnet_integration_enabled":
Regions:
- "*"
Resources:
- "app-vnet-peering-*"
Description: "Mute App Function Vnet findings related with the reources pattern"
```
#### Azure Resource Group Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "rg-dev-*"
- "rg-test-*"
Tags:
- "environment=development"
Description: "Mute all findings for development resource groups"
```
### GCP Provider
???+ note
For GCP provider, the Account ID is the Project ID and the Region is the Zone.
#### Basic GCP Mutelist
```yaml
Mutelist:
Accounts:
"my-gcp-project":
Checks:
"cloudstorage_bucket_public_access":
Regions:
- "us-central1"
- "us-east1"
Resources:
- "public-bucket-*"
Description: "Mute public access findings for specific bucket pattern"
"compute_instance_public_ip":
Regions:
- "*"
Resources:
- "public-instance"
Description: "Mute public access findings for specific compute instance"
```
#### GCP Project-Wide Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "environment=staging"
Description: "Mute all GCP findings for staging environment"
```
### Kubernetes Provider
???+ note
For Kubernetes provider, the Account ID is the Cluster Name and the Region is the Namespace.
#### Basic Kubernetes Mutelist
```yaml
Mutelist:
Accounts:
"my-cluster":
Checks:
"etcd_client_cert_auth":
Regions:
- "default"
- "kube-system"
Resources:
- "system-pod-*"
Description: "Mute etcd cert authorization findings for the matching resources"
"kubelet_tls_cert_and_key":
Regions:
- "*"
Resources:
- "*"
Description: "Mute kubelet tls findings across all namespaces"
```
#### Kubernetes Namespace Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "monitoring"
- "logging"
Resources:
- "*"
Description: "Mute all findings for monitoring and logging namespaces"
```
### Microsoft 365 Provider
#### Basic Microsoft 365 Mutelist
```yaml
Mutelist:
Accounts:
"my-tenant.onmicrosoft.com":
Checks:
"entra_admin_portals_access_restriction":
Regions:
- "*"
Resources:
- "test-user"
Description: "Mute findings related with administrative roles access for test-user"
"sharepoint_external_sharing_managed":
Regions:
- "*"
Resources:
- "public-site-*"
Description: "Mute external sharing findings for public sites"
```
#### Microsoft 365 Tenant-Wide Muting
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "department=IT"
Description: "Mute all M365 findings for IT department resources"
```
### Multi-Cloud Mutelist
You can combine multiple providers in a single mutelist configuration:
```yaml
Mutelist:
Accounts:
# AWS Account
"123456789012":
Checks:
"s3_bucket_public_access":
Regions:
- "us-east-1"
Resources:
- "public-website"
Description: "Mute public access findings for AWS website bucket"
# Azure Subscription
"MyAzureSubscription":
Checks:
"storage_blob_public_access_level_is_disabled":
Regions:
- "East US"
Resources:
- "public-storage"
Description: "Mute public access findings for Azure storage account"
# GCP Project
"my-gcp-project":
Checks:
"cloudstorage_bucket_public_access":
Regions:
- "us-central1"
Resources:
- "public-bucket"
Description: "Mute public access findings for GCP storage bucket"
# Kubernetes Cluster
"my-k8s-cluster":
Checks:
"kubelet_tls_cert_and_key":
Regions:
- "default"
Resources:
- "kubelet-test"
Description: "Mute kubelet tls findings related with kubelet-test"
# Microsoft 365 Tenant
"my-tenant.onmicrosoft.com":
Checks:
"sharepoint_external_sharing_managed":
Regions:
- "*"
Resources:
- "public-site"
Description: "Mute external sharing findings for public SharePoint site"
```
### Advanced Mutelist Features
#### Using Regular Expressions
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- ".*-test-.*" # Matches any resource containing "-test-"
- "dev-.*" # Matches resources starting with "dev-"
- ".*-prod$" # Matches resources ending with "-prod"
Description: "Mute findings for test and development resources using regex"
```
#### Using Exceptions
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Exceptions:
Accounts:
- "987654321098"
Regions:
- "us-west-2"
Resources:
- "critical-resource"
Tags:
- "environment=production"
Description: "Mute all findings except for critical production resources"
```
#### Tag-Based Logic
```yaml
Mutelist:
Accounts:
"*":
Checks:
"*":
Regions:
- "*"
Resources:
- "*"
Tags:
- "environment=dev | environment=test" # OR logic
- "project=alpha" # AND logic
Description: "Mute findings for dev/test environments in alpha project"
```
### Best Practices
1. **Start Small**: Begin with specific resources and gradually expand
2. **Document Reasons**: Always include descriptions for audit trails
3. **Regular Reviews**: Periodically review muted findings
4. **Use Tags**: Leverage resource tags for better organization
5. **Test Changes**: Validate mutelist changes in non-production environments
6. **Monitor Impact**: Track how muting affects your security posture
### Validation
Prowler App validates your mutelist configuration and will display errors for:
- Invalid YAML syntax
- Missing required fields
- Invalid regular expressions
- Unsupported provider-specific configurations
The Mutelist configuration takes effect on the next scans.
+9 -9
View File
@@ -6,38 +6,38 @@ This page provides instructions for creating and configuring a Microsoft Entra I
1. From the "Enterprise Applications" page in the Azure Portal, click "+ New application".
![New application](./img/saml/saml-sso-azure-1.png)
![New application](../img/saml/saml-sso-azure-1.png)
2. At the top of the page, click "+ Create your own application".
![Create application](./img/saml/saml-sso-azure-2.png)
![Create application](../img/saml/saml-sso-azure-2.png)
3. Enter a name for the application and select the "Integrate any other application you don't find in the gallery (Non-gallery)" option.
![Enter name](./img/saml/saml-sso-azure-3.png)
![Enter name](../img/saml/saml-sso-azure-3.png)
4. Assign users and groups to the application, then proceed to "Set up single sign on" and select "SAML" as the method.
![Select SAML](./img/saml/saml-sso-azure-4.png)
![Select SAML](../img/saml/saml-sso-azure-4.png)
5. In the "Basic SAML Configuration" section, click "Edit".
![Edit](./img/saml/saml-sso-azure-5.png)
![Edit](../img/saml/saml-sso-azure-5.png)
6. Enter the "Identifier (Entity ID)" and "Reply URL (Assertion Consumer Service URL)". These values can be obtained from the SAML SSO integration setup in Prowler App. For detailed instructions, refer to the [SAML SSO Configuration](./prowler-app-sso.md) page.
![Enter data](./img/saml/saml-sso-azure-6.png)
![Enter data](../img/saml/saml-sso-azure-6.png)
7. In the "SAML Certificates" section, click "Edit".
![Edit](./img/saml/saml-sso-azure-7.png)
![Edit](../img/saml/saml-sso-azure-7.png)
8. For the "Signing Option," select "Sign SAML response and assertion", and then click "Save".
![Signing options](./img/saml/saml-sso-azure-8.png)
![Signing options](../img/saml/saml-sso-azure-8.png)
9. Once the changes are saved, the metadata XML can be downloaded from the "App Federation Metadata Url".
![Metadata XML](./img/saml/saml-sso-azure-9.png)
![Metadata XML](../img/saml/saml-sso-azure-9.png)
10. Save the downloaded Metadata XML to a file. To complete the setup, upload this file during the Prowler App integration. (See the [SAML SSO Configuration](./prowler-app-sso.md) page for details).
+28 -79
View File
@@ -1,4 +1,4 @@
# SAML Single Sign-On (SSO)
# SAML Single Sign-On (SSO) Configuration
This guide provides comprehensive instructions to configure SAML-based Single Sign-On (SSO) in Prowler App. This configuration allows users to authenticate using the organization's Identity Provider (IdP).
@@ -10,7 +10,7 @@ This document is divided into two main sections:
---
## User Guide Configuration
## User Guide: Configuring SAML SSO in Prowler App
Follow these steps to enable and configure SAML SSO for an organization.
@@ -18,12 +18,12 @@ Follow these steps to enable and configure SAML SSO for an organization.
Prowler can be integrated with SAML SSO identity providers such as Okta to enable single sign-on for the organization's users. The Prowler SAML integration currently supports the following features:
- [**IdP-Initiated SSO**](#idp-initiated-sso): Users can initiate login from their Identity Provider's dashboard.
- [**SP-Initiated SSO**](#sp-initiated-sso): Users can initiate login directly from the Prowler login page.
- **IdP-Initiated SSO**: Users can initiate login from their Identity Provider's dashboard.
- **SP-Initiated SSO**: Users can initiate login directly from the Prowler login page.
- **Just-in-Time Provisioning**: Users from the organization signing into Prowler for the first time will be automatically created.
???+ warning "Deactivate SAML"
If the SAML configuration is removed, users who previously authenticated via SAML will need to reset their password to regain access using standard login. This occurs because accounts no longer have valid authentication credentials without the SAML integration.
If the SAML configuration is removed, users who previously authenticated via SAML will need to reset their password to regain access using standard login. This is because their accounts no longer have valid authentication credentials without the SAML integration.
### Prerequisites
@@ -36,129 +36,78 @@ Prowler can be integrated with SAML SSO identity providers such as Okta to enabl
To access the account settings, click the "Account" button in the top-right corner of Prowler App, or navigate directly to `https://cloud.prowler.com/profile` (or `http://localhost:3000/profile` for local setups).
![Access Profile Settings](./img/saml/saml-step-1.png)
![Access Profile Settings](../img/saml/saml-step-1.png)
#### Step 2: Enable SAML Integration
On the profile page, find the "SAML SSO Integration" card and click "Enable" to begin the configuration process.
![Enable SAML Integration](./img/saml/saml-step-2.png)
![Enable SAML Integration](../img/saml/saml-step-2.png)
???+ info "Choose Your Method"
**Use Step 3A (Generic Method)** for any SAML 2.0 compliant Identity Provider or when you need custom configuration.
#### Step 3: Configure the Identity Provider (IdP)
**Use Step 3B (Okta App Catalog)** if you're using Okta and want a simplified setup process with pre-configured settings.
#### Step 3A: Configure the Identity Provider (IdP) - Generic
Prowler App displays the SAML configuration information needed to configure the IdP. Use this information to create a new SAML application in the IdP.
The Prowler SAML configuration panel displays the information needed to configure the IdP. This information must be used to create a new SAML application in the IdP.
1. **Assertion Consumer Service (ACS) URL**: The endpoint in Prowler that will receive the SAML assertion from the IdP.
2. **Audience URI (Entity ID)**: A unique identifier for the Prowler application (Service Provider).
To configure the IdP, copy the **ACS URL** and **Audience URI** from Prowler App and use them to set up a new SAML application.
To configure the IdP, copy the **ACS URL** and **Audience URI** from Prowler and use them to set up a new SAML application.
![IdP configuration](./img/saml/idp_config.png)
![IdP configuration](../img/saml/idp_config.png)
???+ info "IdP Configuration"
The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application. For SSO integration with Azure AD / Entra ID, see our [Entra ID configuration instructions](./prowler-app-sso-entra.md).
#### Step 3B: Configure Prowler from App Catalog - Okta
Instead of creating a custom SAML integration, Okta administrators can configure Prowler Cloud directly from Okta's application catalog:
1. **Access App Catalog**: Navigate to the IdP's application catalog (e.g., [Browse App Catalog](https://www.okta.com/integrations/) in Okta).
![Browse App Catalog](./img/saml/app-catalog-browse.png)
2. **Search for Prowler Cloud**: Use the search functionality to find "Prowler Cloud" in the app catalog. The official Prowler Cloud application will appear in the search results.
![Search for Prowler](./img/saml/app-catalog-browse-prowler.png)
3. **Select Prowler Cloud Application**: Click on the Prowler Cloud application from the search results to view its details page.
![Prowler Application Details](./img/saml/app-catalog-browse-prowler-add.png)
4. **Add Integration**: Click the "Add Integration" button to begin adding Prowler Cloud to the organization's applications.
5. **Configure General Settings**: In the "Add Prowler Cloud" configuration screen, the integration automatically configures the necessary settings.
![Add Prowler Configuration](./img/saml/app-catalog-browse-prowler-configure.png)
6. **Assign Users**: Navigate to the **Assignments** tab and assign the appropriate users or groups to the Prowler application by clicking "Assign" and selecting "Assign to People" or "Assign to Groups".
With this step, the Okta app catalog configuration is complete. Users can now access Prowler Cloud using either [IdP-initiated](#idp-initiated-sso) or [SP-initiated SSO](#sp-initiated-sso) flows.
**If you used Step 3B (Okta App Catalog)**, jump to [Step 6: Save and Verify Configuration](#step-6-save-and-verify-configuration).
#### Step 4: Configure Attribute Mapping in the IdP
For Prowler App to correctly identify and provision users, configure the IdP to send the following attributes in the SAML assertion:
For Prowler to correctly identify and provision users, the IdP must be configured to send the following attributes in the SAML assertion:
| Attribute Name | Description | Required |
|----------------|---------------------------------------------------------------------------------------------------------|----------|
| `firstName` | The user's first name. | Yes |
| `lastName` | The user's last name. | Yes |
| `userType` | The Prowler role to be assigned to the user (e.g., `admin`, `auditor`). If a role with that name already exists, it will be used; otherwise, a new role called `no_permissions` will be created with minimal permissions. Role permissions can be edited in the [RBAC Management tab](./prowler-app-rbac.md). | No |
| `userType` | The Prowler role to be assigned to the user (e.g., `admin`, `auditor`). If a role with that name already exists, it will be used; otherwise, a new role called `no_permissions` will be created with minimal permissions. You can then edit the permissions for that role in the [RBAC Management tab](./prowler-app-rbac.md). | No |
| `companyName` | The user's company name. This is automatically populated if the IdP sends an `organization` attribute. | No |
???+ info "IdP Attribute Mapping"
Note that the attribute name is just an example and may be different depending on the IdP. For instance, if the IdP provides a 'division' attribute, it can be mapped to 'userType'.
![IdP configuration](./img/saml/saml_attribute_statements.png)
Note that the attribute name is just an example and may be different in your IdP. For instance, if your IdP provides a 'division' attribute, you can map it to 'userType'.
![IdP configuration](../img/saml/saml_attribute_statements.png)
???+ warning "Dynamic Updates"
Prowler App updates these attributes each time a user logs in. Any changes made in the Identity Provider (IdP) will be reflected when the user logs in again.
These attributes are updated in Prowler each time a user logs in. Any changes made in the identity provider (IdP) will be reflected the next time the user logs in again.
#### Step 5: Upload IdP Metadata to Prowler
Once the IdP is configured, it provides a **metadata XML file**. This file contains the IdP's configuration information, such as its public key and login URL.
To complete the Prowler App configuration:
To complete the Prowler-side configuration:
1. Return to the Prowler SAML configuration page.
2. Enter the **email domain** for the organization (e.g., `mycompany.com`). Prowler App uses this to identify users who should authenticate via SAML.
2. Enter the **email domain** for the organization (e.g., `mycompany.com`). Prowler uses this to identify users who should authenticate via SAML.
3. Upload the **metadata XML file** downloaded from the IdP.
![Configure Prowler with IdP Metadata](./img/saml/saml-step-3.png)
![Configure Prowler with IdP Metadata](../img/saml/saml-step-3.png)
#### Step 6: Save and Verify Configuration
Click the "Save" button to complete the setup. The "SAML Integration" card will now display an "Active" status, indicating the configuration is complete and enabled.
Click the "Save" button to complete the setup. The "SAML Integration" card will now show an "Active" status, indicating that the configuration is complete and enabled.
![Verify Integration Status](./img/saml/saml-step-4.png)
![Verify Integration Status](../img/saml/saml-step-4.png)
???+ info "IdP Configuration"
The exact steps for configuring an IdP vary depending on the provider (Okta, Azure AD, etc.). Please refer to the IdP's documentation for instructions on creating a SAML application.
##### Remove SAML Configuration
SAML SSO can be disabled by removing the existing configuration from the integration panel.
![Remove SAML configuration](./img/saml/saml-step-remove.png)
You can disable SAML SSO by removing the existing configuration from the integration panel.
![Remove SAML configuration](../img/saml/saml-step-remove.png)
### IdP-Initiated SSO
### Signing in with SAML SSO
Once SAML SSO is configured, users can access Prowler Cloud directly from their Identity Provider's dashboard:
Once SAML SSO is enabled, users from the configured domain can sign in by entering their email address on the login page and clicking "Continue with SAML SSO". They will be redirected to the IdP to authenticate and then returned to Prowler.
1. Navigate to the IdP dashboard or portal
2. Click on the Prowler Cloud application tile
3. The system automatically authenticates users and redirects them to Prowler Cloud
This method is convenient for users who primarily work from the IdP portal and prefer a seamless single-click access.
### SP-Initiated SSO
Users can also initiate the login process directly from Prowler's login page:
1. Navigate to the Prowler login page
2. Click "Continue with SAML SSO"
![](./img/saml/saml-signin-1.png)
3. Enter their email address from the configured domain
![](./img/saml/saml-signin-2.png)
4. The system redirects users to the IdP for authentication
5. After successful authentication, users are returned to Prowler App
This method is useful when users bookmark Prowler or navigate directly to the application.
![Sign in with SAML SSO](../img/saml/saml-step-5.png)
---
@@ -227,7 +176,7 @@ When configuring the IdP for testing, use the ngrok URL for the ACS URL:
#### 4. Configure Prowler via API
To create a SAML configuration for testing, use `curl`. Replace placeholders with actual data.
To create a SAML configuration for testing, use `curl`. Make sure to replace placeholders with actual data.
```bash
curl --location 'http://localhost:8080/api/v1/saml-config' \
@@ -247,7 +196,7 @@ curl --location 'http://localhost:8080/api/v1/saml-config' \
#### 5. Initiate Login Flow
To test the end-to-end flow, construct the login URL and open it in a browser. This starts the IdP-initiated login flow.
To test the end-to-end flow, construct the login URL and open it in a browser. This will start the IdP-initiated login flow.
`https://<your-ngrok-url>/api/v1/accounts/saml/<YOUR_DOMAIN>/login/`
-1
View File
@@ -100,7 +100,6 @@ nav:
- Authentication: tutorials/gcp/authentication.md
- Projects: tutorials/gcp/projects.md
- Organization: tutorials/gcp/organization.md
- Retry Configuration: tutorials/gcp/retry-configuration.md
- Kubernetes:
- In-Cluster Execution: tutorials/kubernetes/in-cluster.md
- Non In-Cluster Execution: tutorials/kubernetes/outside-cluster.md
@@ -3,8 +3,8 @@ AWSTemplateFormatVersion: "2010-09-09"
# You can invoke CloudFormation and pass the principal ARN from a command line like this:
# aws cloudformation create-stack \
# --capabilities CAPABILITY_IAM --capabilities CAPABILITY_NAMED_IAM \
# --template-body "file://prowler-scan-role.yaml" \
# --stack-name "ProwlerScanRole" \
# --template-body "file://prowler-pro-saas-scan-role.yaml" \
# --stack-name "ProwlerProSaaSScanRole" \
# --parameters "ParameterKey=ExternalId,ParameterValue=ProvidedExternalID"
Description: |
@@ -12,8 +12,6 @@ Description: |
all read-only permissions to scan your account for security issues.
Contains two AWS managed policies (SecurityAudit and ViewOnlyAccess) and an inline policy.
It sets the trust policy on that IAM Role to permit Prowler to assume that role.
This template is designed to be used in Prowler Cloud, but can also be used in other Prowler deployments.
If you are deploying this template to be used in Prowler Cloud please do not edit the AccountId, IAMPrincipal and ExternalId parameters.
Parameters:
ExternalId:
Description: |
@@ -36,28 +34,6 @@ Parameters:
The IAM principal type and name that will be allowed to assume the role created, leave an * for all the IAM principals in your AWS account. If you are deploying this template to be used in Prowler Cloud please do not edit this.
Type: String
Default: role/prowler*
EnableS3Integration:
Description: |
Enable S3 integration for storing Prowler scan reports.
Type: String
Default: false
AllowedValues:
- true
- false
S3IntegrationBucketName:
Description: |
The S3 bucket name where Prowler will store scan reports for your cloud providers.
Type: String
Default: ""
S3IntegrationBucketAccountId:
Description: |
The AWS Account ID owner of the S3 Bucket.
Type: String
Default: ""
Conditions:
S3IntegrationEnabled: !Equals [!Ref EnableS3Integration, true]
Resources:
ProwlerScan:
@@ -140,37 +116,6 @@ Resources:
Resource:
- "arn:*:apigateway:*::/restapis/*"
- "arn:*:apigateway:*::/apis/*"
- !If
- S3IntegrationEnabled
- PolicyName: S3Integration
PolicyDocument:
Version: "2012-10-17"
Statement:
- Effect: Allow
Action:
- "s3:PutObject"
Resource:
- !Sub "arn:${AWS::Partition}:s3:::${S3IntegrationBucketName}/*"
Condition:
StringEquals:
"s3:ResourceAccount": !Sub ${S3IntegrationBucketAccountId}
- Effect: Allow
Action:
- "s3:ListBucket"
Resource:
- !Sub "arn:${AWS::Partition}:s3:::${S3IntegrationBucketName}"
Condition:
StringEquals:
"s3:ResourceAccount": !Sub ${S3IntegrationBucketAccountId}
- Effect: Allow
Action:
- "s3:DeleteObject"
Resource:
- !Sub "arn:${AWS::Partition}:s3:::${S3IntegrationBucketName}/*test-prowler-connection.txt"
Condition:
StringEquals:
"s3:ResourceAccount": !Sub ${S3IntegrationBucketAccountId}
- !Ref AWS::NoValue
Tags:
- Key: "Service"
Value: "https://prowler.com"
@@ -180,27 +125,3 @@ Resources:
Value: "true"
- Key: "Name"
Value: "ProwlerScan"
Metadata:
AWS::CloudFormation::StackName: "Prowler"
AWS::CloudFormation::Interface:
ParameterGroups:
- Label:
default: Required
Parameters:
- ExternalId
- AccountId
- IAMPrincipal
- EnableS3Integration
- Label:
default: Optional
Parameters:
- S3IntegrationBucketName
- S3IntegrationBucketAccountId
Outputs:
ProwlerScanRoleArn:
Description: "ARN of the ProwlerScan IAM Role"
Value: !GetAtt ProwlerScan.Arn
Export:
Name: !Sub "${AWS::StackName}-ProwlerScanRoleArn"
+1 -38
View File
@@ -1,47 +1,10 @@
## Deployment using Terraform
To deploy the Prowler Scan Role in order to allow scanning your AWS account from Prowler, please run the following commands in your terminal:
To deploy the Prowler Scan Role in order to allow to scan you AWS account from Prowler, please run the following commands in your terminal:
1. `terraform init`
2. `terraform plan`
3. `terraform apply`
During the `terraform plan` and `terraform apply` steps you will be asked for an External ID to be configured in the `ProwlerScan` IAM role.
### Variables
- `external_id` (required): External ID for role assumption security
- `account_id` (optional): AWS Account ID that will assume the role (defaults to Prowler Cloud: "232136659152")
- `iam_principal` (optional): IAM principal pattern allowed to assume the role (defaults to Prowler Cloud: "role/prowler*")
- `enable_s3_integration` (optional): Enable S3 integration for storing scan reports (default: false)
- `s3_integration_bucket_name` (conditional): S3 bucket name for reports (required if `enable_s3_integration` is true)
- `s3_integration_bucket_account` (conditional): S3 bucket owner account ID (required if `enable_s3_integration` is true)
### Usage Examples
#### Basic deployment (without S3 integration):
```bash
terraform apply -var="external_id=your-external-id-here"
```
#### With S3 integration enabled:
```bash
terraform apply \
-var="external_id=your-external-id-here" \
-var="enable_s3_integration=true" \
-var="s3_integration_bucket_name=your-s3-bucket-name" \
-var="s3_integration_bucket_account=123456789012"
```
#### Using terraform.tfvars file:
Create a `terraform.tfvars` file:
```hcl
external_id = "your-external-id-here"
enable_s3_integration = true
s3_integration_bucket_name = "your-s3-bucket-name"
s3_integration_bucket_account = "123456789012"
```
Then run: `terraform apply`
> Note that Terraform will use the AWS credentials of your default profile.
-4
View File
@@ -1,4 +0,0 @@
# Data Sources
###################################
data "aws_partition" "current" {}
data "aws_caller_identity" "current" {}
+55 -25
View File
@@ -1,20 +1,62 @@
# Local validation for conditional requirements
# Variables
###################################
locals {
s3_integration_validation = (
!var.enable_s3_integration ||
(var.enable_s3_integration && var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account != "")
)
}
variable "external_id" {
type = string
description = "This is the External ID that Prowler will use to assume the role ProwlerScan IAM Role."
# Validation check using check block (Terraform 1.5+)
check "s3_integration_requirements" {
assert {
condition = !var.enable_s3_integration || (var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account != "")
error_message = "When enable_s3_integration is true, both s3_integration_bucket_name and s3_integration_bucket_account must be provided and non-empty."
validation {
condition = length(var.external_id) > 0
error_message = "ExternalId must not be empty."
}
}
variable "account_id" {
type = string
description = "AWS Account ID that will assume the role created, if you are deploying this template to be used in Prowler Cloud please do not edit this."
default = "232136659152"
validation {
condition = length(var.account_id) == 12
error_message = "AccountId must be a valid AWS Account ID."
}
}
variable "iam_principal" {
type = string
description = "The IAM principal type and name that will be allowed to assume the role created, leave an * for all the IAM principals in your AWS account. If you are deploying this template to be used in Prowler Cloud please do not edit this."
default = "role/prowler*"
}
##### PLEASE, DO NOT EDIT BELOW THIS LINE #####
# Terraform Provider Configuration
###################################
terraform {
required_version = ">= 1.5"
required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 5.83"
}
}
}
provider "aws" {
region = "us-east-1"
default_tags {
tags = {
"Name" = "ProwlerScan",
"Terraform" = "true",
"Service" = "https://prowler.com",
"Support" = "support@prowler.com"
}
}
}
data "aws_partition" "current" {}
# IAM Role
###################################
data "aws_iam_policy_document" "prowler_assume_role_policy" {
@@ -44,6 +86,7 @@ data "aws_iam_policy_document" "prowler_assume_role_policy" {
resource "aws_iam_role" "prowler_scan" {
name = "ProwlerScan"
assume_role_policy = data.aws_iam_policy_document.prowler_assume_role_policy.json
}
resource "aws_iam_policy" "prowler_scan_policy" {
@@ -66,16 +109,3 @@ resource "aws_iam_role_policy_attachment" "prowler_scan_viewonly_policy_attachme
role = aws_iam_role.prowler_scan.name
policy_arn = "arn:${data.aws_partition.current.partition}:iam::aws:policy/job-function/ViewOnlyAccess"
}
# S3 Integration Module
###################################
module "s3_integration" {
count = var.enable_s3_integration ? 1 : 0
source = "./s3-integration"
s3_integration_bucket_name = var.s3_integration_bucket_name
s3_integration_bucket_account = var.s3_integration_bucket_account
prowler_role_name = aws_iam_role.prowler_scan.name
}
@@ -1,22 +0,0 @@
# Outputs
###################################
output "prowler_role_arn" {
description = "ARN of the Prowler scan role"
value = aws_iam_role.prowler_scan.arn
}
output "prowler_role_name" {
description = "Name of the Prowler scan role"
value = aws_iam_role.prowler_scan.name
}
output "external_id" {
description = "External ID used for role assumption"
value = var.external_id
sensitive = true
}
output "s3_integration_enabled" {
description = "Whether S3 integration is enabled"
value = var.enable_s3_integration
}
@@ -1 +0,0 @@
data "aws_partition" "current" {}
@@ -1,54 +0,0 @@
# S3 Integration Policy
###################################
resource "aws_iam_role_policy" "prowler_s3_integration" {
name = "ProwlerS3Integration"
role = var.prowler_role_name
policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Action = [
"s3:DeleteObject"
]
Resource = [
"arn:${data.aws_partition.current.partition}:s3:::${var.s3_integration_bucket_name}/*test-prowler-connection.txt"
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
}
}
},
{
Effect = "Allow"
Action = [
"s3:PutObject"
]
Resource = [
"arn:${data.aws_partition.current.partition}:s3:::${var.s3_integration_bucket_name}/*"
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
}
}
},
{
Effect = "Allow"
Action = [
"s3:ListBucket"
]
Resource = [
"arn:${data.aws_partition.current.partition}:s3:::${var.s3_integration_bucket_name}"
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
}
}
}
]
})
}
@@ -1,29 +0,0 @@
variable "s3_integration_bucket_name" {
type = string
description = "The S3 bucket name where Prowler will store scan reports for your cloud providers."
validation {
condition = length(var.s3_integration_bucket_name) > 0
error_message = "s3_integration_bucket_name must not be empty."
}
}
variable "s3_integration_bucket_account" {
type = string
description = "The AWS Account ID owner of the S3 Bucket."
validation {
condition = length(var.s3_integration_bucket_account) == 12 && can(tonumber(var.s3_integration_bucket_account))
error_message = "s3_integration_bucket_account must be a valid 12-digit AWS Account ID."
}
}
variable "prowler_role_name" {
type = string
description = "Name of the Prowler scan IAM role to attach the S3 policy to."
validation {
condition = length(var.prowler_role_name) > 0
error_message = "prowler_role_name must not be empty."
}
}
@@ -1,3 +0,0 @@
terraform {
required_version = ">= 1.5"
}
@@ -1,13 +0,0 @@
# Required variable
external_id = "your-unique-external-id-here"
# Optional Variables
# Prowler Cloud Account
# account_id = "232136659152"
# Prowler Cloud Role
# iam_principal = "role/prowler*"
# S3 Integration (optional)
# enable_s3_integration = true
# s3_bucket_name = "your-prowler-reports-bucket"
# s3_bucket_account = "123456789012"
@@ -1,56 +0,0 @@
# Variables
###################################
variable "external_id" {
type = string
description = "This is the External ID that Prowler will use to assume the role ProwlerScan IAM Role."
validation {
condition = length(var.external_id) > 0
error_message = "ExternalId must not be empty."
}
}
variable "account_id" {
type = string
description = "AWS Account ID that will assume the role created, if you are deploying this template to be used in Prowler Cloud please do not edit this."
default = "232136659152"
validation {
condition = length(var.account_id) == 12
error_message = "AccountId must be a valid AWS Account ID."
}
}
variable "iam_principal" {
type = string
description = "The IAM principal type and name that will be allowed to assume the role created, leave an * for all the IAM principals in your AWS account. If you are deploying this template to be used in Prowler Cloud please do not edit this."
default = "role/prowler*"
}
variable "enable_s3_integration" {
type = bool
description = "Enable S3 integration for storing Prowler scan reports."
default = false
}
variable "s3_integration_bucket_name" {
type = string
description = "The S3 bucket name where Prowler will store scan reports for your cloud providers. Required if enable_s3_integration is true."
default = ""
validation {
condition = length(var.s3_integration_bucket_name) > 0 || var.s3_integration_bucket_name == ""
error_message = "s3_integration_bucket_name must be a valid S3 bucket name."
}
}
variable "s3_integration_bucket_account" {
type = string
description = "The AWS Account ID owner of the S3 Bucket. Required if enable_s3_integration is true."
default = ""
validation {
condition = var.s3_integration_bucket_account == "" || (length(var.s3_integration_bucket_account) == 12 && can(tonumber(var.s3_integration_bucket_account)))
error_message = "s3_integration_bucket_account must be a valid 12-digit AWS Account ID or empty."
}
}
@@ -1,23 +0,0 @@
# Terraform Provider Configuration
###################################
terraform {
required_version = ">= 1.5"
required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 5.83"
}
}
}
provider "aws" {
region = "us-east-1"
default_tags {
tags = {
"Name" = "ProwlerScan",
"Terraform" = "true",
"Service" = "https://prowler.com",
"Support" = "support@prowler.com"
}
}
}
+7 -25
View File
@@ -2,28 +2,7 @@
All notable changes to the **Prowler SDK** are documented in this file.
## [v5.10.2] (Prowler 5.10.2)
### Fixed
- Order requirements by ID in Prowler ThreatScore AWS compliance framework [(#8495)](https://github.com/prowler-cloud/prowler/pull/8495)
- Add explicit resource name to GCP and Azure Defender checks [(#8352)](https://github.com/prowler-cloud/prowler/pull/8352)
- Validation errors in Azure and M365 providers [(#8353)](https://github.com/prowler-cloud/prowler/pull/8353)
- Azure `app_http_logs_enabled` check false positives [(#8507)](https://github.com/prowler-cloud/prowler/pull/8507)
- Azure `storage_geo_redundant_enabled` check false positives [(#8504)](https://github.com/prowler-cloud/prowler/pull/8504)
- AWS `kafka_cluster_is_public` check false positives [(#8514)](https://github.com/prowler-cloud/prowler/pull/8514)
- List all accessible repositories in GitHub [(#8522)](https://github.com/prowler-cloud/prowler/pull/8522)
- GitHub CIS 1.0 Compliance Reports [(#8519)](https://github.com/prowler-cloud/prowler/pull/8519)
---
## [v5.10.1] (Prowler v5.10.1)
### Fixed
- Remove invalid requirements from CIS 1.0 for GitHub provider [(#8472)](https://github.com/prowler-cloud/prowler/pull/8472)
---
## [v5.10.0] (Prowler v5.10.0)
## [v5.10.0] (Prowler UNRELEASED)
### Added
- `bedrock_api_key_no_administrative_privileges` check for AWS provider [(#8321)](https://github.com/prowler-cloud/prowler/pull/8321)
@@ -33,7 +12,6 @@ All notable changes to the **Prowler SDK** are documented in this file.
- `vm_desired_sku_size` check for Azure provider [(#8191)](https://github.com/prowler-cloud/prowler/pull/8191)
- `vm_scaleset_not_empty` check for Azure provider [(#8192)](https://github.com/prowler-cloud/prowler/pull/8192)
- GitHub repository and organization scoping support with `--repository/repositories` and `--organization/organizations` flags [(#8329)](https://github.com/prowler-cloud/prowler/pull/8329)
- GCP provider retry configuration [(#8412)](https://github.com/prowler-cloud/prowler/pull/8412)
- `s3_bucket_shadow_resource_vulnerability` check for AWS provider [(#8398)](https://github.com/prowler-cloud/prowler/pull/8398)
### Changed
@@ -48,8 +26,12 @@ All notable changes to the **Prowler SDK** are documented in this file.
- Use the correct @staticmethod decorator for `set_identity` and `set_session_config` methods in AwsProvider [(#8056)](https://github.com/prowler-cloud/prowler/pull/8056)
- Use the correct default value for `role_session_name` and `session_duration` in AwsSetUpSession [(#8056)](https://github.com/prowler-cloud/prowler/pull/8056)
- Use the correct default value for `role_session_name` and `session_duration` in S3 [(#8417)](https://github.com/prowler-cloud/prowler/pull/8417)
- GitHub App authentication fails to generate output files and HTML header sections [(#8423)](https://github.com/prowler-cloud/prowler/pull/8423)
- S3 `test_connection` uses AWS S3 API `HeadBucket` instead of `GetBucketLocation` [(#8456)](https://github.com/prowler-cloud/prowler/pull/8456)
---
## [v5.9.3] (Prowler UNRELEASED)
### Fixed
- Add more validations to Azure Storage models when some values are None to avoid serialization issues [(#8325)](https://github.com/prowler-cloud/prowler/pull/8325)
- `sns_topics_not_publicly_accessible` false positive with `aws:SourceArn` conditions [(#8326)](https://github.com/prowler-cloud/prowler/issues/8326)
- Remove typo from description req 1.2.3 - Prowler ThreatScore m365 [(#8384)](https://github.com/prowler-cloud/prowler/pull/8384)
File diff suppressed because it is too large
File diff suppressed because it is too large
+1 -1
View File
@@ -12,7 +12,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "5.10.2"
prowler_version = "5.10.0"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
square_logo_img = "https://prowler.com/wp-content/uploads/logo-html.png"
aws_logo = "https://user-images.githubusercontent.com/38561120/235953920-3e3fba08-0795-41dc-b480-9bea57db9f2e.png"
+3 -1
View File
@@ -550,7 +550,9 @@ class Check_Report_GCP(Check_Report):
or ""
)
self.resource_name = (
resource_name or getattr(resource, "name", "") or "GCP Project"
resource_name
or getattr(resource, "name", "")
or getattr(resource, "id", "")
)
self.project_id = project_id or getattr(resource, "project_id", "")
self.location = (
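The fallback chain above relies on `or` returning its first truthy operand, so an empty `name` falls through to the resource `id`. A minimal sketch of the idiom (the class and values are illustrative):
```python
class Resource:
    name = ""          # empty name should not be used for display
    id = "project-123"

resource = Resource()
resource_name = None

# `or` returns the first truthy operand, so None and empty strings fall through
display = resource_name or getattr(resource, "name", "") or getattr(resource, "id", "")
assert display == "project-123"
```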
+2 -11
View File
@@ -249,17 +249,8 @@ class Finding(BaseModel):
output_data["auth_method"] = provider.auth_method
output_data["resource_name"] = check_output.resource_name
output_data["resource_uid"] = check_output.resource_id
if hasattr(provider.identity, "account_name"):
# GithubIdentityInfo (Personal Access Token, OAuth)
output_data["account_name"] = provider.identity.account_name
output_data["account_uid"] = provider.identity.account_id
elif hasattr(provider.identity, "app_id"):
# GithubAppIdentityInfo (GitHub App)
# TODO: Get Github App name
output_data["account_name"] = f"app-{provider.identity.app_id}"
output_data["account_uid"] = provider.identity.app_id
output_data["account_name"] = provider.identity.account_name
output_data["account_uid"] = provider.identity.account_id
output_data["region"] = check_output.owner
elif provider.type == "m365":
+1 -8
View File
@@ -556,13 +556,6 @@ class HTML(Output):
str: the HTML assessment summary
"""
try:
if hasattr(provider.identity, "account_name"):
# GithubIdentityInfo (Personal Access Token, OAuth)
account_display = provider.identity.account_name
elif hasattr(provider.identity, "app_id"):
# GithubAppIdentityInfo (GitHub App)
account_display = f"app-{provider.identity.app_id}"
return f"""
<div class="col-md-2">
<div class="card">
@@ -572,7 +565,7 @@ class HTML(Output):
<ul class="list-group
list-group-flush">
<li class="list-group-item">
<b>GitHub account:</b> {account_display}
<b>GitHub account:</b> {provider.identity.account_name}
</li>
</ul>
</div>
@@ -1362,7 +1362,6 @@
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -1395,7 +1394,6 @@
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -5190,10 +5188,8 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"us-east-1",
@@ -5234,7 +5230,6 @@
"aws": [
"af-south-1",
"ap-east-1",
"ap-east-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
@@ -5635,11 +5630,9 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -5670,11 +5663,9 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -5767,11 +5758,9 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -5801,11 +5790,9 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
@@ -6104,7 +6091,6 @@
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
@@ -6267,14 +6253,11 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-5",
"ca-central-1",
"eu-central-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -6719,7 +6702,6 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -7717,7 +7699,6 @@
"aws": [
"af-south-1",
"ap-east-1",
"ap-east-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
@@ -7969,16 +7950,6 @@
"aws-us-gov": []
}
},
"odb": {
"regions": {
"aws": [
"us-east-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"omics": {
"regions": {
"aws": [
@@ -9631,7 +9602,6 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
"eu-central-1",
@@ -9645,7 +9615,6 @@
"il-central-1",
"me-central-1",
"me-south-1",
"mx-central-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -11543,7 +11512,6 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
"eu-central-1",
@@ -26,10 +26,6 @@ class S3BaseException(ProwlerException):
"message": "The specified location constraint is not valid.",
"remediation": "Check the location constraint.",
},
(6005, "S3InvalidBucketRegionError"): {
"message": "The specified bucket region is invalid.",
"remediation": "Check the bucket region.",
},
}
def __init__(self, code, file=None, original_exception=None, message=None):
@@ -79,10 +75,3 @@ class S3IllegalLocationConstraintError(S3BaseException):
super().__init__(
6004, file=file, original_exception=original_exception, message=message
)
class S3InvalidBucketRegionError(S3BaseException):
def __init__(self, file=None, original_exception=None, message=None):
super().__init__(
6005, file=file, original_exception=original_exception, message=message
)
+16 -21
View File
@@ -39,7 +39,6 @@ from prowler.providers.aws.lib.s3.exceptions.exceptions import (
S3ClientError,
S3IllegalLocationConstraintError,
S3InvalidBucketNameError,
S3InvalidBucketRegionError,
S3TestConnectionError,
)
from prowler.providers.aws.lib.session.aws_set_up_session import (
@@ -313,25 +312,28 @@ class S3:
region_name=aws_region,
profile_name=profile,
)
s3_client = session.client(__class__.__name__.lower())
if "s3://" in bucket_name:
bucket_name = bucket_name.removeprefix("s3://")
# Check bucket location, requires s3:ListBucket permission
# https://docs.aws.amazon.com/AmazonS3/latest/API/API_HeadBucket.html
bucket_region = s3_client.head_bucket(Bucket=bucket_name).get(
"BucketRegion"
)
if bucket_region is None:
exception = S3InvalidBucketRegionError()
if raise_on_exception:
raise exception
return Connection(error=exception)
# Check for the bucket location
bucket_location = s3_client.get_bucket_location(Bucket=bucket_name)
if bucket_location["LocationConstraint"] == "EU":
bucket_location["LocationConstraint"] = "eu-west-1"
if (
bucket_location["LocationConstraint"] == ""
or bucket_location["LocationConstraint"] is None
):
bucket_location["LocationConstraint"] = "us-east-1"
# If the bucket location is not the same as the session region, change the session region
if session.region_name != bucket_region:
if (
session.region_name != bucket_location["LocationConstraint"]
and bucket_location["LocationConstraint"] is not None
):
s3_client = session.client(
__class__.__name__.lower(),
region_name=bucket_region,
region_name=bucket_location["LocationConstraint"],
)
# Set a Temp file to upload
with tempfile.TemporaryFile() as temp_file:
@@ -464,19 +466,12 @@ class S3:
if raise_on_exception:
raise session_token_expired
return Connection(error=session_token_expired)
except S3InvalidBucketRegionError as invalid_bucket_region_error:
logger.error(
f"{invalid_bucket_region_error.__class__.__name__}[{invalid_bucket_region_error.__traceback__.tb_lineno}]: {invalid_bucket_region_error}"
)
if raise_on_exception:
raise invalid_bucket_region_error
return Connection(error=invalid_bucket_region_error)
except ClientError as client_error:
if raise_on_exception:
if (
"specified bucket does not exist"
in client_error.response["Error"]["Message"]
or "Not Found" in client_error.response["Error"]["Message"]
):
raise S3InvalidBucketNameError(original_exception=client_error)
elif (
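For context, `GetBucketLocation` reports legacy values for some regions: `None` for `us-east-1` and the alias `EU` for `eu-west-1`, which is why the retained code normalizes `LocationConstraint` before re-creating the client. A minimal standalone sketch with boto3 (the bucket name is illustrative):
```python
import boto3

def resolve_bucket_region(bucket_name: str) -> str:
    """Return a bucket's region, normalizing GetBucketLocation's legacy values."""
    s3 = boto3.client("s3")
    constraint = s3.get_bucket_location(Bucket=bucket_name)["LocationConstraint"]
    if constraint == "EU":  # legacy alias returned for eu-west-1 buckets
        return "eu-west-1"
    return constraint or "us-east-1"  # None means us-east-1

# print(resolve_bucket_region("my-example-bucket"))
```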
@@ -10,13 +10,13 @@ class kafka_cluster_is_public(Check):
report = Check_Report_AWS(metadata=self.metadata(), resource=cluster)
report.status = "FAIL"
report.status_extended = (
f"Kafka cluster {cluster.name} is publicly accessible."
f"Kafka cluster '{cluster.name}' is publicly accessible."
)
if not cluster.public_access:
if cluster.public_access:
report.status = "PASS"
report.status_extended = (
f"Kafka cluster {cluster.name} is not publicly accessible."
f"Kafka cluster '{cluster.name}' is not publicly accessible."
)
findings.append(report)
@@ -22,10 +22,6 @@ class app_http_logs_enabled(Check):
report.status = "PASS"
report.status_extended = f"App {app.name} has HTTP Logs enabled in diagnostic setting {diagnostic_setting.name} in subscription {subscription_name}"
break
elif log.category_group == "allLogs" and log.enabled:
report.status = "PASS"
report.status_extended = f"App {app.name} has allLogs category group which includes HTTP Logs enabled in diagnostic setting {diagnostic_setting.name} in subscription {subscription_name}"
break
findings.append(report)
return findings
@@ -14,11 +14,6 @@ class defender_additional_email_configured_with_a_security_contact(Check):
report = Check_Report_Azure(
metadata=self.metadata(), resource=contact_configuration
)
report.resource_name = (
contact_configuration.name
if contact_configuration.name
else "Security Contact"
)
report.subscription = subscription_name
if len(contact_configuration.emails) > 0:
@@ -31,11 +31,6 @@ class defender_attack_path_notifications_properly_configured(Check):
report = Check_Report_Azure(
metadata=self.metadata(), resource=contact_configuration
)
report.resource_name = (
contact_configuration.name
if contact_configuration.name
else "Security Contact"
)
report.subscription = subscription_name
actual_risk_level = getattr(
contact_configuration, "attack_path_minimal_risk_level", None
@@ -14,11 +14,6 @@ class defender_ensure_notify_alerts_severity_is_high(Check):
report = Check_Report_Azure(
metadata=self.metadata(), resource=contact_configuration
)
report.resource_name = (
contact_configuration.name
if contact_configuration.name
else "Security Contact"
)
report.subscription = subscription_name
report.status = "FAIL"
report.status_extended = f"Notifications are not enabled for alerts with a minimum severity of high or lower in subscription {subscription_name}."
@@ -12,13 +12,7 @@ class defender_ensure_notify_emails_to_owners(Check):
) in defender_client.security_contact_configurations.items():
for contact_configuration in security_contact_configurations.values():
report = Check_Report_Azure(
metadata=self.metadata(),
resource=contact_configuration,
)
report.resource_name = (
contact_configuration.name
if contact_configuration.name
else "Security Contact"
metadata=self.metadata(), resource=contact_configuration
)
report.subscription = subscription_name
if (
@@ -1,5 +1,6 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.storage.storage_client import storage_client
from prowler.providers.azure.services.storage.storage_service import ReplicationSettings
class storage_geo_redundant_enabled(Check):
@@ -26,16 +27,14 @@ class storage_geo_redundant_enabled(Check):
report.subscription = subscription
if (
storage_account.replication_settings == "Standard_GRS"
or storage_account.replication_settings == "Standard_GZRS"
or storage_account.replication_settings == "Standard_RAGRS"
or storage_account.replication_settings == "Standard_RAGZRS"
storage_account.replication_settings
== ReplicationSettings.STANDARD_GRS
):
report.status = "PASS"
report.status_extended = f"Storage account {storage_account.name} from subscription {subscription} has Geo-redundant storage {storage_account.replication_settings} enabled."
report.status_extended = f"Storage account {storage_account.name} from subscription {subscription} has Geo-redundant storage (GRS) enabled."
else:
report.status = "FAIL"
report.status_extended = f"Storage account {storage_account.name} from subscription {subscription} does not have Geo-redundant storage enabled, it has {storage_account.replication_settings} instead."
report.status_extended = f"Storage account {storage_account.name} from subscription {subscription} does not have Geo-redundant storage (GRS) enabled."
findings.append(report)
@@ -1,3 +1,4 @@
from enum import Enum
from typing import Optional
from azure.mgmt.storage import StorageManagementClient
@@ -34,6 +35,7 @@ class Storage(AzureService):
key_expiration_period_in_days = int(
storage_account.key_policy.key_expiration_period_in_days
)
replication_settings = ReplicationSettings(storage_account.sku.name)
storage_accounts[subscription].append(
Account(
id=storage_account.id,
@@ -82,7 +84,7 @@ class Storage(AzureService):
False,
)
),
replication_settings=storage_account.sku.name,
replication_settings=replication_settings,
allow_cross_tenant_replication=(
True
if getattr(
@@ -271,6 +273,17 @@ class PrivateEndpointConnection(BaseModel):
type: str
class ReplicationSettings(Enum):
STANDARD_LRS = "Standard_LRS"
STANDARD_GRS = "Standard_GRS"
STANDARD_RAGRS = "Standard_RAGRS"
STANDARD_ZRS = "Standard_ZRS"
PREMIUM_LRS = "Premium_LRS"
PREMIUM_ZRS = "Premium_ZRS"
STANDARD_GZRS = "Standard_GZRS"
STANDARD_RAGZRS = "Standard_RAGZRS"
class SMBProtocolSettings(BaseModel):
channel_encryption: list[str]
supported_versions: list[str]
@@ -297,7 +310,7 @@ class Account(BaseModel):
minimum_tls_version: str
private_endpoint_connections: list[PrivateEndpointConnection]
key_expiration_period_in_days: Optional[int] = None
replication_settings: str = "Standard_LRS"
replication_settings: ReplicationSettings = ReplicationSettings.STANDARD_LRS
allow_cross_tenant_replication: bool = True
allow_shared_key_access: bool = True
blob_properties: Optional[BlobProperties] = None
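Since `replication_settings` is now a `ReplicationSettings` enum rather than a raw string, lookups and comparisons change accordingly; in particular, a plain `Enum` member never compares equal to its string value, which is why the check above switched to comparing enum members. A minimal sketch (trimmed to two members):
```python
from enum import Enum

class ReplicationSettings(Enum):
    STANDARD_LRS = "Standard_LRS"
    STANDARD_GRS = "Standard_GRS"

# Lookup by value, as done when parsing storage_account.sku.name
settings = ReplicationSettings("Standard_GRS")
assert settings is ReplicationSettings.STANDARD_GRS

# A plain Enum member is not equal to its underlying string value
assert settings != "Standard_GRS"
assert settings.value == "Standard_GRS"
```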
-1
View File
@@ -192,7 +192,6 @@ class Provider(ABC):
)
elif "gcp" in provider_class_name.lower():
provider_class(
retries_max_attempts=arguments.gcp_retries_max_attempts,
organization_id=arguments.organization_id,
project_ids=arguments.project_id,
excluded_project_ids=arguments.excluded_project_id,
-4
View File
@@ -1,4 +0,0 @@
# GCP Provider Configuration
# Default retry configuration
DEFAULT_RETRY_ATTEMPTS = 3
+6 -45
View File
@@ -19,7 +19,6 @@ from prowler.lib.logger import logger
from prowler.lib.utils.utils import print_boxes
from prowler.providers.common.models import Audit_Metadata, Connection
from prowler.providers.common.provider import Provider
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.exceptions.exceptions import (
GCPInvalidProviderIdError,
GCPLoadADCFromDictError,
@@ -70,7 +69,6 @@ class GcpProvider(Provider):
def __init__(
self,
retries_max_attempts: int = None,
organization_id: str = None,
project_ids: list = None,
excluded_project_ids: list = None,
@@ -91,7 +89,6 @@ class GcpProvider(Provider):
GCP Provider constructor
Args:
retries_max_attempts: int -> The maximum number of retries for the Google Cloud SDK retry config (Default: 3)
organization_id: str
project_ids: list
excluded_project_ids: list
@@ -138,10 +135,6 @@ class GcpProvider(Provider):
>>> GcpProvider(
... credentials_file="credentials_file"
... )
- Using custom retry configuration:
>>> GcpProvider(
... retries_max_attempts=5
... )
- Impersonating a service account: If you want to impersonate a GCP service account, you can use the impersonate_service_account parameter. For this method user must be authenticated:
>>> GcpProvider(
... impersonate_service_account="service_account"
@@ -166,14 +159,6 @@ class GcpProvider(Provider):
... )
"""
logger.info("Instantiating GCP Provider ...")
# Update retry configuration if provided
if retries_max_attempts is not None:
import prowler.providers.gcp.config as gcp_config
gcp_config.DEFAULT_RETRY_ATTEMPTS = retries_max_attempts
logger.info(f"GCP retry attempts set to {retries_max_attempts}")
self._impersonated_service_account = impersonate_service_account
# Set the GCP credentials using the provided client_id, client_secret and refresh_token
gcp_credentials = None
@@ -659,9 +644,6 @@ class GcpProvider(Provider):
if asset["resource"]["data"].get("name")
else project_id
)
# Handle empty or null project names
if not project_name or project_name.strip() == "":
project_name = "GCP Project"
gcp_project = GCPProject(
number=project_number,
id=project_id,
@@ -694,15 +676,12 @@ class GcpProvider(Provider):
try:
# Initialize Cloud Resource Manager API for simple project listing
service = discovery.build(
"cloudresourcemanager",
"v1",
credentials=credentials,
num_retries=DEFAULT_RETRY_ATTEMPTS,
"cloudresourcemanager", "v1", credentials=credentials
)
request = service.projects().list()
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for project in response.get("projects", []):
# Extract labels and other project details
@@ -720,9 +699,6 @@ class GcpProvider(Provider):
if project.get("name")
else project_id
)
# Handle empty or null project names
if not project_name or project_name.strip() == "":
project_name = "GCP Project"
project_id = project["projectId"]
gcp_project = GCPProject(
number=project_number,
@@ -763,15 +739,9 @@ class GcpProvider(Provider):
# If no projects were able to be accessed via API, add them manually if provided by the user in arguments
if project_ids:
for input_project in project_ids:
# Handle empty or null project names
project_name = (
input_project
if input_project and input_project.strip() != ""
else "GCP Project"
)
projects[input_project] = GCPProject(
id=input_project,
name=project_name,
name=input_project,
number=0,
labels={},
lifecycle_state="ACTIVE",
@@ -780,15 +750,9 @@ class GcpProvider(Provider):
elif credentials_file:
with open(credentials_file, "r", encoding="utf-8") as file:
project_id = json.load(file)["project_id"]
# Handle empty or null project names
project_name = (
project_id
if project_id and project_id.strip() != ""
else "GCP Project"
)
projects[project_id] = GCPProject(
id=project_id,
name=project_name,
name=project_id,
number=0,
labels={},
lifecycle_state="ACTIVE",
@@ -812,10 +776,7 @@ class GcpProvider(Provider):
"""
try:
service = discovery.build(
"cloudresourcemanager",
"v1",
credentials=self._session,
num_retries=DEFAULT_RETRY_ATTEMPTS,
"cloudresourcemanager", "v1", credentials=self._session
)
# TODO: this call requires more permissions to get that data
# resourcemanager.organizations.get --> add to the docs
@@ -826,7 +787,7 @@ class GcpProvider(Provider):
)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
project.organization.display_name = response.get("displayName")
request = service.projects().list_next(
previous_request=request, previous_response=response
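The listing code above follows `googleapiclient`'s standard pagination idiom: keep calling the collection's `list_next()` with the previous request/response pair until it returns `None`. A minimal standalone sketch (credentials come from Application Default Credentials):
```python
from googleapiclient import discovery

service = discovery.build("cloudresourcemanager", "v1")  # uses ADC by default
request = service.projects().list()
while request is not None:
    response = request.execute()
    for project in response.get("projects", []):
        print(project["projectId"])
    # list_next returns None once the last page has been consumed
    request = service.projects().list_next(
        previous_request=request, previous_response=response
    )
```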
@@ -48,12 +48,3 @@ def init_parser(self):
action="store_true",
help="List available project IDs in Google Cloud which can be scanned by Prowler",
)
# GCP Config
gcp_config_subparser = gcp_parser.add_argument_group("GCP Config")
gcp_config_subparser.add_argument(
"--gcp-retries-max-attempts",
nargs="?",
default=None,
type=int,
help="Set the maximum attempts for the Google Cloud SDK retry config (Default: 3)",
)
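For reference, the retry support removed here comes from `googleapiclient` itself: both `discovery.build()` and `HttpRequest.execute()` accept a `num_retries` argument that retries transient failures with exponential backoff. A minimal sketch of what `--gcp-retries-max-attempts` used to wire through (the value is illustrative):
```python
from googleapiclient import discovery

retries = 3  # what --gcp-retries-max-attempts used to control

# num_retries on build() covers fetching the discovery document;
# num_retries on execute() retries the API call itself on transient errors.
service = discovery.build("cloudresourcemanager", "v1", num_retries=retries)
response = service.projects().list().execute(num_retries=retries)
print(len(response.get("projects", [])))
```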
+2 -8
View File
@@ -7,7 +7,6 @@ from googleapiclient import discovery
from googleapiclient.discovery import Resource
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
@@ -62,7 +61,7 @@ class GCPService:
request = client.services().get(
name=f"projects/{project_id}/services/{self.service}.googleapis.com"
)
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
if response.get("state") != "DISABLED":
project_ids.append(project_id)
else:
@@ -82,12 +81,7 @@ class GCPService:
credentials: Credentials,
) -> Resource:
try:
return discovery.build(
service,
api_version,
credentials=credentials,
num_retries=DEFAULT_RETRY_ATTEMPTS,
)
return discovery.build(service, api_version, credentials=credentials)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -25,7 +24,7 @@ class APIKeys(GCPService):
)
)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for key in response.get("keys", []):
self.keys.append(
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -20,7 +19,7 @@ class BigQuery(GCPService):
try:
request = self.client.datasets().list(projectId=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for dataset in response.get("datasets", []):
dataset_info = (
@@ -29,7 +28,7 @@ class BigQuery(GCPService):
projectId=project_id,
datasetId=dataset["datasetReference"]["datasetId"],
)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
.execute()
)
cmk_encryption = False
public = False
@@ -66,7 +65,7 @@ class BigQuery(GCPService):
projectId=dataset.project_id, datasetId=dataset.name
)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for table in response.get("tables", []):
cmk_encryption = False
@@ -77,7 +76,7 @@ class BigQuery(GCPService):
datasetId=dataset.name,
tableId=table["tableReference"]["tableId"],
)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
.execute()
.get("encryptionConfiguration")
):
cmk_encryption = True
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -20,9 +19,7 @@ class CloudResourceManager(GCPService):
for project_id in self.project_ids:
try:
policy = (
self.client.projects()
.getIamPolicy(resource=project_id)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
self.client.projects().getIamPolicy(resource=project_id).execute()
)
audit_logging = False
if policy.get("auditConfigs"):
@@ -46,11 +43,7 @@ class CloudResourceManager(GCPService):
def _get_organizations(self):
try:
if self.project_ids:
response = (
self.client.organizations()
.search()
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
)
response = self.client.organizations().search().execute()
for org in response.get("organizations", []):
self.organizations.append(
Organization(
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -17,7 +16,7 @@ class CloudSQL(GCPService):
try:
request = self.client.instances().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for instance in response.get("items", []):
public_ip = False
@@ -3,7 +3,6 @@ from typing import Optional
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -19,12 +18,12 @@ class CloudStorage(GCPService):
try:
request = self.client.buckets().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for bucket in response.get("items", []):
bucket_iam = (
self.client.buckets()
.getIamPolicy(bucket=bucket["id"])
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)["bindings"]
.execute()["bindings"]
)
public = False
if "allAuthenticatedUsers" in str(
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -34,7 +33,7 @@ class Compute(GCPService):
try:
request = self.client.regions().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for region in response.get("items", []):
self.regions.add(region["name"])
@@ -52,7 +51,7 @@ class Compute(GCPService):
try:
request = self.client.zones().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for zone in response.get("items", []):
self.zones.add(zone["name"])
@@ -69,11 +68,7 @@ class Compute(GCPService):
for project_id in self.project_ids:
try:
enable_oslogin = False
response = (
self.client.projects()
.get(project=project_id)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
)
response = self.client.projects().get(project=project_id).execute()
for item in response["commonInstanceMetadata"].get("items", []):
if item["key"] == "enable-oslogin" and item["value"] == "TRUE":
enable_oslogin = True
@@ -91,8 +86,7 @@ class Compute(GCPService):
request = self.client.instances().list(project=project_id, zone=zone)
while request is not None:
response = request.execute(
http=self.__get_AuthorizedHttp_client__(),
num_retries=DEFAULT_RETRY_ATTEMPTS,
http=self.__get_AuthorizedHttp_client__()
)
for instance in response.get("items", []):
@@ -150,7 +144,7 @@ class Compute(GCPService):
try:
request = self.client.networks().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for network in response.get("items", []):
subnet_mode = (
"legacy"
@@ -184,8 +178,7 @@ class Compute(GCPService):
)
while request is not None:
response = request.execute(
http=self.__get_AuthorizedHttp_client__(),
num_retries=DEFAULT_RETRY_ATTEMPTS,
http=self.__get_AuthorizedHttp_client__()
)
for subnet in response.get("items", []):
self.subnets.append(
@@ -215,8 +208,7 @@ class Compute(GCPService):
)
while request is not None:
response = request.execute(
http=self.__get_AuthorizedHttp_client__(),
num_retries=DEFAULT_RETRY_ATTEMPTS,
http=self.__get_AuthorizedHttp_client__()
)
for address in response.get("items", []):
self.addresses.append(
@@ -243,7 +235,7 @@ class Compute(GCPService):
try:
request = self.client.firewalls().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for firewall in response.get("items", []):
self.firewalls.append(
@@ -271,7 +263,7 @@ class Compute(GCPService):
# Global URL maps
request = self.client.urlMaps().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for urlmap in response.get("items", []):
self.load_balancers.append(
LoadBalancer(
@@ -296,7 +288,7 @@ class Compute(GCPService):
project=project_id, region=region
)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for urlmap in response.get("items", []):
self.load_balancers.append(
LoadBalancer(
@@ -330,7 +322,7 @@ class Compute(GCPService):
region=region,
backendService=backend_service_name,
)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
.execute()
)
else:
response = (
@@ -339,7 +331,7 @@ class Compute(GCPService):
project=balancer.project_id,
backendService=backend_service_name,
)
.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
.execute()
)
balancer.logging = response.get("logConfig", {}).get(
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
from prowler.providers.gcp.services.compute.compute_client import compute_client
@@ -25,8 +24,7 @@ class Dataproc(GCPService):
)
while request is not None:
response = request.execute(
http=self.__get_AuthorizedHttp_client__(),
num_retries=DEFAULT_RETRY_ATTEMPTS,
http=self.__get_AuthorizedHttp_client__()
)
for cluster in response.get("clusters", []):
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -19,7 +18,7 @@ class DNS(GCPService):
try:
request = self.client.managedZones().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for managed_zone in response.get("managedZones"):
self.managed_zones.append(
ManagedZone(
@@ -49,7 +48,7 @@ class DNS(GCPService):
try:
request = self.client.policies().list(project=project_id)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for policy in response.get("policies", []):
policy_networks = []
@@ -1,7 +1,6 @@
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
@@ -22,7 +21,7 @@ class GKE(GCPService):
.locations()
.list(parent="projects/" + project_id)
)
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for location in response["locations"]:
self.locations.append(
@@ -44,10 +43,7 @@ class GKE(GCPService):
parent=f"projects/{location.project_id}/locations/{location.name}"
)
)
response = request.execute(
http=self.__get_AuthorizedHttp_client__(),
num_retries=DEFAULT_RETRY_ATTEMPTS,
)
response = request.execute(http=self.__get_AuthorizedHttp_client__())
for cluster in response.get("clusters", []):
node_pools = []
for node_pool in cluster["nodePools"]:
@@ -13,7 +13,7 @@ class iam_no_service_roles_at_project_level(Check):
metadata=self.metadata(),
resource=binding,
resource_id=binding.role,
resource_name=binding.role if binding.role else "Service Role",
resource_name=binding.role,
location=cloudresourcemanager_client.region,
)
if binding.role in [
@@ -31,6 +31,7 @@ class iam_no_service_roles_at_project_level(Check):
metadata=self.metadata(),
resource=cloudresourcemanager_client.projects[project],
project_id=project,
resource_name=project,
location=cloudresourcemanager_client.region,
)
report.status = "PASS"
@@ -3,7 +3,6 @@ from datetime import datetime
from pydantic.v1 import BaseModel
from prowler.lib.logger import logger
from prowler.providers.gcp.config import DEFAULT_RETRY_ATTEMPTS
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.gcp.lib.service.service import GCPService
from prowler.providers.gcp.services.cloudresourcemanager.cloudresourcemanager_client import (
@@ -27,7 +26,7 @@ class IAM(GCPService):
.list(name="projects/" + project_id)
)
while request is not None:
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for account in response.get("accounts", []):
self.service_accounts.append(
@@ -64,7 +63,7 @@ class IAM(GCPService):
+ sa.email
)
)
response = request.execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
response = request.execute()
for key in response.get("keys", []):
sa.keys.append(
@@ -117,7 +116,7 @@ class AccessApproval(GCPService):
self.client.projects().getAccessApprovalSettings(
name=f"projects/{project_id}/accessApprovalSettings"
)
).execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
).execute()
self.settings[project_id] = Setting(
name=response["name"],
project_id=project_id,
@@ -148,7 +147,7 @@ class EssentialContacts(GCPService):
self.client.organizations()
.contacts()
.list(parent="organizations/" + org.id)
).execute(num_retries=DEFAULT_RETRY_ATTEMPTS)
).execute()
if len(response.get("contacts", [])) > 0:
contacts = True

Some files were not shown because too many files have changed in this diff