Compare commits

...

52 Commits

Author SHA1 Message Date
Andoni A. b48b381624 feat(image): add private registry authentication support
Add registry_username, registry_password, and registry_token params
to ImageProvider following the IaC provider pattern (explicit params
with env var fallback). Credentials are injected as environment
variables into Trivy subprocess calls.
2026-02-06 08:18:05 +01:00
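The "explicit params with env var fallback" pattern this commit describes could look roughly like the sketch below. This is a hypothetical illustration, not the actual provider code; the class shape is assumed, though `TRIVY_USERNAME`/`TRIVY_PASSWORD` are real environment variables Trivy reads for registry authentication.

```python
import os


class ImageProvider:
    """Hypothetical sketch: explicit credential params with env var fallback."""

    def __init__(self, registry_username=None, registry_password=None,
                 registry_token=None):
        # Explicit parameters take precedence; otherwise fall back to the
        # environment, matching the IaC provider pattern the commit mentions.
        self.registry_username = registry_username or os.environ.get("TRIVY_USERNAME")
        self.registry_password = registry_password or os.environ.get("TRIVY_PASSWORD")
        self.registry_token = registry_token or os.environ.get("TRIVY_TOKEN")

    def trivy_env(self):
        # Build the environment for the Trivy subprocess, injecting only the
        # credentials that were actually provided.
        env = os.environ.copy()
        if self.registry_username:
            env["TRIVY_USERNAME"] = self.registry_username
        if self.registry_password:
            env["TRIVY_PASSWORD"] = self.registry_password
        if self.registry_token:
            env["TRIVY_TOKEN"] = self.registry_token
        return env
```

The `env` dict would then be passed as the `env=` argument of the `subprocess.run` call that invokes Trivy, so credentials never appear on the command line.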
Andoni A. 9d5e3f4758 fix(image): replace sys.exit calls with exceptions, fix mutable defaults, add tests
- Create exceptions module (codes 9000-9005) following OCI provider pattern
- Replace all sys.exit(1) calls with typed exceptions
- Fix mutable default arguments in ImageProvider and CheckReportImage
- Add return type hints to all properties and methods
- Add ResourceGroup field to metadata dict
- Add test suite with 23 test cases covering initialization, finding
  processing, scan execution, error handling, and connection testing
- Update CHANGELOG with container image provider entry
2026-02-05 20:37:29 +01:00
Andoni A. beb74a6459 Merge remote-tracking branch 'origin/master' into image-scan-poc 2026-02-05 19:20:34 +01:00
Andoni A. f42de0d21b feat(image): add container image provider POC
Add initial proof of concept for a container image security scanning
provider that uses Trivy for vulnerability detection in container images.
2026-01-29 08:41:11 +01:00
Pedro Martín 17f5633a8d feat(compliance): add CIS 1.12 for Kubernetes (#9778) 2026-01-13 10:16:28 +01:00
Pedro Martín 48274f1d54 feat(compliance): add CIS 6.0 for M365 (#9779) 2026-01-13 10:07:12 +01:00
Pedro Martín 9719f9ee86 feat(compliance): add CIS 5.0 for Azure (#9777) 2026-01-13 09:39:24 +01:00
Alejandro Bailo d38be934a3 feat(ui): add new findings table (#9699) 2026-01-12 15:44:25 +01:00
Rubén De la Torre Vico 0472eb74d2 chore(aws): enhance metadata for bedrock service (#8827)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-12 14:26:37 +01:00
Rubén De la Torre Vico e5b86da6e5 chore(aws): enhance metadata for rds service (#9551)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-12 13:52:29 +01:00
Lee Trout 429c591819 chore(aws): fixup AWS EC2 SG lib (#9216)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2026-01-12 13:47:37 +01:00
Prowler Bot 87c0747174 feat(aws): Update regions for AWS services (#9771)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2026-01-12 13:00:39 +01:00
lydiavilchez 62a8540169 feat(gcp): add check to detect Compute Engine configuration changes (#9698)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2026-01-12 12:22:15 +01:00
Pepe Fagoaga 9ee77c2b97 chore(security): Remove safety check ignores as they are fixed (#9752) 2026-01-12 12:02:22 +01:00
Víctor Fernández Poyatos 5f2cb614ad feat(overviews): Compliance watchlist endpoint (#9596)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2026-01-12 11:40:36 +01:00
Chandrapal Badshah 6c01151d78 docs(lighthouse): update lighthouse architecture docs (#9576)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Andoni Alonso <14891798+andoniaf@users.noreply.github.com>
2026-01-12 10:18:58 +01:00
mchennai 05466cff22 test: Add edge case test for s3_bucket_server_access_logging_enabled (#9725)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-12 10:06:34 +01:00
Rubén De la Torre Vico a57b6d78bf docs: add audit scope column to supported providers table (#9750) 2026-01-12 09:19:29 +01:00
Adrián Peña d3eb30c066 chore: update API PR template (#9749) 2026-01-09 15:13:48 +01:00
Alan Buscaglia 7f2fa275c6 feat: add AI skills pack for Claude Code and OpenCode (#9728)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-09 15:01:18 +01:00
Pepe Fagoaga 42ae5b6e3e chore(template): PR Community Checklist (#9748) 2026-01-09 14:42:07 +01:00
Pepe Fagoaga 7c1bcfc781 fix: typo in subscription error (#9745)
Co-authored-by: Alan Buscaglia <gentlemanprogramming@gmail.com>
2026-01-09 11:32:10 +01:00
dependabot[bot] 68684b107a build(deps-dev): bump authlib from 1.6.5 to 1.6.6 in /api (#9742)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:25:25 +01:00
dependabot[bot] d04716ea95 build(deps): bump werkzeug from 3.1.4 to 3.1.5 in /api (#9743)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:23:58 +01:00
dependabot[bot] 8d8b7aad15 build(deps): bump werkzeug from 3.1.4 to 3.1.5 (#9744)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 08:22:37 +01:00
Pepe Fagoaga f3ba70dd6b docs: add warning about changes not complaining with roadmap (#9741) 2026-01-08 17:03:38 +01:00
Andoni Alonso 27492cbd42 fix(oci): validate credentials before scanning (#9738) 2026-01-08 15:47:26 +01:00
dependabot[bot] 795220e290 build(deps): bump werkzeug from 3.1.3 to 3.1.4 (#9399)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 15:41:48 +01:00
dependabot[bot] 64ab8e64b0 build(deps): bump urllib3 from 1.26.20 to 2.6.3 (#9734)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 15:41:39 +01:00
dependabot[bot] a0f9df07bd build(deps): bump pynacl from 1.5.0 to 1.6.2 (#9726)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 15:40:55 +01:00
dependabot[bot] 3d16c62f30 build(deps): bump fastmcp from 2.13.1 to 2.14.0 in /mcp_server (#9696)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 15:04:53 +01:00
dependabot[bot] fa2deef241 build(deps): bump aiohttp from 3.12.15 to 3.13.3 in /api (#9723)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 14:12:54 +01:00
dependabot[bot] 211639d849 build(deps-dev): bump marshmallow from 3.26.1 to 3.26.2 in /api (#9651)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:52:58 +01:00
dependabot[bot] 25c90f9f63 build(deps): bump urllib3 from 2.5.0 to 2.6.3 in /api (#9735)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:45:58 +01:00
dependabot[bot] bbdb230bb2 build(deps): bump filelock from 3.12.4 to 3.20.1 in /api (#9594)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:45:14 +01:00
dependabot[bot] 6e2ba66a5a build(deps): bump pynacl from 1.5.0 to 1.6.2 in /api (#9739)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:44:13 +01:00
dependabot[bot] 3332e5b891 build(deps): bump aiohttp from 3.12.14 to 3.13.3 (#9722)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:38:35 +01:00
dependabot[bot] 44d791dfe9 build(deps-dev): bump marshmallow from 3.26.1 to 3.26.2 (#9652)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:37:20 +01:00
dependabot[bot] 73375ee289 build(deps): bump tj-actions/changed-files from 47.0.0 to 47.0.1 (#9711)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 13:30:41 +01:00
Rubén De la Torre Vico 503b56188b chore(aws): enhance metadata for datasync service (#8854)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-08 13:22:59 +01:00
dependabot[bot] 7c9dd8fe89 build(deps): bump peter-evans/create-pull-request from 7.0.8 to 8.0.0 (#9705)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:19:35 +01:00
dependabot[bot] f407a24022 build(deps): bump actions/upload-artifact from 4.6.2 to 6.0.0 (#9712)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:16:15 +01:00
dependabot[bot] 8f5c43744f build(deps): bump softprops/action-gh-release from 2.4.1 to 2.5.0 (#9389)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:15:24 +01:00
Rubén De la Torre Vico 8d78831d29 chore(aws): enhance metadata for s3 service (#9552)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2026-01-08 13:13:32 +01:00
dependabot[bot] 858446c740 build(deps): bump actions/setup-node from 6.0.0 to 6.1.0 (#9707)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 13:00:44 +01:00
dependabot[bot] e9ca8bfda6 build(deps): bump trufflesecurity/trufflehog from 3.91.1 to 3.92.4 (#9710)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2026-01-08 12:56:15 +01:00
dependabot[bot] 5cd446c446 build(deps): bump codecov/codecov-action from 5.5.1 to 5.5.2 (#9708)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:56:04 +01:00
dependabot[bot] 319f5b6c38 build(deps): bump actions/cache from 4.3.0 to 5.0.1 (#9706)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:54:40 +01:00
dependabot[bot] 64c9dd4947 build(deps): bump docker/login-action from 3.4.0 to 3.6.0 (#9396)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:54:03 +01:00
dependabot[bot] 8b2dea52fa build(deps): bump docker/setup-buildx-action from 3.11.1 to 3.12.0 (#9709)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 12:52:42 +01:00
Andoni Alonso da567138fa docs(developer-guide): add missing compliance framework link (#9736) 2026-01-08 10:19:16 +01:00
Sergio Garcia 5b59986ae7 docs(azure): enhance Managed Identity authentication documentation (#9012)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2026-01-08 09:04:04 +01:00
283 changed files with 24913 additions and 3503 deletions
+19 -2
@@ -14,14 +14,26 @@ Please add a detailed description of how to review this PR.
### Checklist
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
<details>
<summary><b>Community Checklist</b></summary>
- [ ] This feature/issue is listed in [here](https://github.com/prowler-cloud/prowler/issues?q=sort%3Aupdated-desc+is%3Aissue+is%3Aopen) or roadmap.prowler.com
- [ ] Is it assigned to me, if not, request it via the issue/feature in [here](https://github.com/prowler-cloud/prowler/issues?q=sort%3Aupdated-desc+is%3Aissue+is%3Aopen) or [Prowler Community Slack](goto.prowler.com/slack)
</details>
- [ ] Review if the code is being covered by tests.
- [ ] Review if code is being documented following this specification https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
- [ ] Review if backport is needed.
- [ ] Review if is needed to change the [Readme.md](https://github.com/prowler-cloud/prowler/blob/master/README.md)
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/prowler/CHANGELOG.md), if applicable.
#### SDK/CLI
- Are there new checks included in this PR? Yes / No
- If so, do we need to update permissions for the provider? Please review this carefully.
#### UI
- [ ] All issue/task requirements work as expected on the UI
- [ ] Screenshots/Video of the functionality flow (if applicable) - Mobile (X < 640px)
@@ -30,6 +42,11 @@ Please add a detailed description of how to review this PR.
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/ui/CHANGELOG.md), if applicable.
#### API
- [ ] All issue/task requirements work as expected on the API
- [ ] Endpoint response output (if applicable)
- [ ] EXPLAIN ANALYZE output for new/modified queries or indexes (if applicable)
- [ ] Performance test results (if applicable)
- [ ] Any other relevant evidence of the implementation (if applicable)
- [ ] Verify if API specs need to be regenerated.
- [ ] Check if version updates are required (e.g., specs, Poetry, etc.).
- [ ] Ensure new entries are added to [CHANGELOG.md](https://github.com/prowler-cloud/prowler/blob/master/api/CHANGELOG.md), if applicable.
+3 -3
@@ -110,7 +110,7 @@ jobs:
git --no-pager diff
- name: Create PR for next API minor version to master
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -164,7 +164,7 @@ jobs:
git --no-pager diff
- name: Create PR for first API patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -235,7 +235,7 @@ jobs:
git --no-pager diff
- name: Create PR for next API patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+1 -1
@@ -37,7 +37,7 @@ jobs:
- name: Check for API changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -102,7 +102,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push API container for ${{ matrix.arch }}
id: container-push
@@ -125,13 +125,13 @@ jobs:
steps:
- name: Login to DockerHub
-uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+3 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: api/Dockerfile
@@ -67,7 +67,7 @@ jobs:
- name: Check for API changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: api/**
files_ignore: |
@@ -77,7 +77,7 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
+2 -4
@@ -37,7 +37,7 @@ jobs:
- name: Check for API changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -60,9 +60,7 @@ jobs:
- name: Safety
if: steps.check-changes.outputs.any_changed == 'true'
-# 76352, 76353, 77323 come from SDK, but they cannot upgrade it yet. It does not affect API
-# TODO: Botocore needs urllib3 1.X so we need to ignore these vulnerabilities 77744,77745. Remove this once we upgrade to urllib3 2.X
-run: poetry run safety check --ignore 70612,66963,74429,76352,76353,77323,77744,77745
+run: poetry run safety check
- name: Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+2 -2
@@ -77,7 +77,7 @@ jobs:
- name: Check for API changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
@@ -100,7 +100,7 @@ jobs:
- name: Upload coverage reports to Codecov
if: steps.check-changes.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
+3 -3
@@ -106,7 +106,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to master
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -161,7 +161,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -225,7 +225,7 @@ jobs:
git --no-pager diff
- name: Create PR for documentation update to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+1 -1
@@ -28,6 +28,6 @@ jobs:
fetch-depth: 0
- name: Scan for secrets with TruffleHog
-uses: trufflesecurity/trufflehog@aade3bff5594fe8808578dd4db3dfeae9bf2abdc # v3.91.1
+uses: trufflesecurity/trufflehog@ef6e76c3c4023279497fab4721ffa071a722fd05 # v3.92.4
with:
extra_args: '--results=verified,unknown'
@@ -100,7 +100,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push MCP container for ${{ matrix.arch }}
id: container-push
@@ -131,13 +131,13 @@ jobs:
steps:
- name: Login to DockerHub
-uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+3 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: mcp_server/Dockerfile
@@ -66,7 +66,7 @@ jobs:
- name: Check for MCP changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: mcp_server/**
files_ignore: |
@@ -75,7 +75,7 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build MCP container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
+1 -1
@@ -35,7 +35,7 @@ jobs:
- name: Get changed files
id: changed-files
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
api/**
+1 -1
@@ -32,7 +32,7 @@ jobs:
- name: Get changed files
id: changed-files
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: '**'
+2 -2
@@ -344,7 +344,7 @@ jobs:
- name: Create PR for API dependency update
if: ${{ env.PATCH_VERSION == '0' }}
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
commit-message: 'chore(api): update prowler dependency to ${{ env.BRANCH_NAME }} for release ${{ env.PROWLER_VERSION }}'
@@ -374,7 +374,7 @@ jobs:
no-changelog
- name: Create draft release
-uses: softprops/action-gh-release@6da8fa9354ddfdc4aeace5fc48d7f679b5214090 # v2.4.1
+uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b # v2.5.0
with:
tag_name: ${{ env.PROWLER_VERSION }}
name: Prowler ${{ env.PROWLER_VERSION }}
+3 -3
@@ -91,7 +91,7 @@ jobs:
git --no-pager diff
- name: Create PR for next minor version to master
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -139,7 +139,7 @@ jobs:
git --no-pager diff
- name: Create PR for first patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -196,7 +196,7 @@ jobs:
git --no-pager diff
- name: Create PR for next patch version to version branch
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
+3 -3
@@ -35,7 +35,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -79,11 +79,11 @@ jobs:
- name: Lint with flake8
if: steps.check-changes.outputs.any_changed == 'true'
-run: poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api
+run: poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib,ui,api,skills
- name: Check format with black
if: steps.check-changes.outputs.any_changed == 'true'
-run: poetry run black --exclude api ui --check .
+run: poetry run black --exclude api ui skills --check .
- name: Lint with pylint
if: steps.check-changes.outputs.any_changed == 'true'
@@ -169,7 +169,7 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push SDK container for ${{ matrix.arch }}
id: container-push
@@ -193,13 +193,13 @@ jobs:
steps:
- name: Login to DockerHub
-uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Public ECR
-uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: public.ecr.aws
username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
@@ -208,7 +208,7 @@ jobs:
AWS_REGION: ${{ env.AWS_REGION }}
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+3 -3
@@ -31,7 +31,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: Dockerfile
@@ -66,7 +66,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -89,7 +89,7 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
-uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build SDK container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
@@ -50,7 +50,7 @@ jobs:
- name: Create pull request
id: create-pr
-uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
+uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
author: 'prowler-bot <179230569+prowler-bot@users.noreply.github.com>'
+5 -3
@@ -28,9 +28,11 @@ jobs:
- name: Check for SDK changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
-files: ./**
+files:
+./**
+.github/workflows/sdk-security.yml
files_ignore: |
.github/**
prowler/CHANGELOG.md
@@ -70,7 +72,7 @@ jobs:
- name: Security scan with Safety
if: steps.check-changes.outputs.any_changed == 'true'
-run: poetry run safety check --ignore 70612 -r pyproject.toml
+run: poetry run safety check -r pyproject.toml
- name: Dead code detection with Vulture
if: steps.check-changes.outputs.any_changed == 'true'
+25 -25
@@ -35,7 +35,7 @@ jobs:
- name: Check for SDK changes
id: check-changes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ./**
files_ignore: |
@@ -75,7 +75,7 @@ jobs:
- name: Check if AWS files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-aws
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/aws/**
@@ -189,7 +189,7 @@ jobs:
- name: Upload AWS coverage to Codecov
if: steps.changed-aws.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -200,7 +200,7 @@ jobs:
- name: Check if Azure files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-azure
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/azure/**
@@ -213,7 +213,7 @@ jobs:
- name: Upload Azure coverage to Codecov
if: steps.changed-azure.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -224,7 +224,7 @@ jobs:
- name: Check if GCP files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-gcp
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/gcp/**
@@ -237,7 +237,7 @@ jobs:
- name: Upload GCP coverage to Codecov
if: steps.changed-gcp.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -248,7 +248,7 @@ jobs:
- name: Check if Kubernetes files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-kubernetes
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/kubernetes/**
@@ -261,7 +261,7 @@ jobs:
- name: Upload Kubernetes coverage to Codecov
if: steps.changed-kubernetes.outputs.any_changed == 'true'
-uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
+uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -272,7 +272,7 @@ jobs:
- name: Check if GitHub files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-github
-uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
+uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/github/**
@@ -285,7 +285,7 @@ jobs:
- name: Upload GitHub coverage to Codecov
if: steps.changed-github.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -296,7 +296,7 @@ jobs:
- name: Check if NHN files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-nhn
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/nhn/**
@@ -309,7 +309,7 @@ jobs:
- name: Upload NHN coverage to Codecov
if: steps.changed-nhn.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -320,7 +320,7 @@ jobs:
- name: Check if M365 files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-m365
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/m365/**
@@ -333,7 +333,7 @@ jobs:
- name: Upload M365 coverage to Codecov
if: steps.changed-m365.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -344,7 +344,7 @@ jobs:
- name: Check if IaC files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-iac
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/iac/**
@@ -357,7 +357,7 @@ jobs:
- name: Upload IaC coverage to Codecov
if: steps.changed-iac.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -368,7 +368,7 @@ jobs:
- name: Check if MongoDB Atlas files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-mongodbatlas
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/mongodbatlas/**
@@ -381,7 +381,7 @@ jobs:
- name: Upload MongoDB Atlas coverage to Codecov
if: steps.changed-mongodbatlas.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -392,7 +392,7 @@ jobs:
- name: Check if OCI files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-oraclecloud
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/**/oraclecloud/**
@@ -405,7 +405,7 @@ jobs:
- name: Upload OCI coverage to Codecov
if: steps.changed-oraclecloud.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -416,7 +416,7 @@ jobs:
- name: Check if Lib files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-lib
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/lib/**
@@ -429,7 +429,7 @@ jobs:
- name: Upload Lib coverage to Codecov
if: steps.changed-lib.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
@@ -440,7 +440,7 @@ jobs:
- name: Check if Config files changed
if: steps.check-changes.outputs.any_changed == 'true'
id: changed-config
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
./prowler/config/**
@@ -453,7 +453,7 @@ jobs:
- name: Upload Config coverage to Codecov
if: steps.changed-config.outputs.any_changed == 'true'
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
+3 -3
@@ -90,7 +90,7 @@ jobs:
git --no-pager diff
- name: Create PR for next minor version to master
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -140,7 +140,7 @@ jobs:
git --no-pager diff
- name: Create PR for first patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -199,7 +199,7 @@ jobs:
git --no-pager diff
- name: Create PR for next patch version to version branch
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
with:
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
@@ -104,7 +104,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build and push UI container for ${{ matrix.arch }}
id: container-push
@@ -130,13 +130,13 @@ jobs:
steps:
- name: Login to DockerHub
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Create and push manifests for push event
if: github.event_name == 'push'
+3 -3
@@ -32,7 +32,7 @@ jobs:
- name: Check if Dockerfile changed
id: dockerfile-changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ui/Dockerfile
@@ -67,7 +67,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: ui/**
files_ignore: |
@@ -76,7 +76,7 @@ jobs:
- name: Set up Docker Buildx
if: steps.check-changes.outputs.any_changed == 'true'
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Build UI container for ${{ matrix.arch }}
if: steps.check-changes.outputs.any_changed == 'true'
+4 -4
@@ -114,7 +114,7 @@ jobs:
echo "All database fixtures loaded successfully!"
'
- name: Setup Node.js environment
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: '20.x'
- name: Setup pnpm
@@ -126,7 +126,7 @@ jobs:
shell: bash
run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('ui/pnpm-lock.yaml') }}
@@ -139,7 +139,7 @@ jobs:
working-directory: ./ui
run: pnpm run build
- name: Cache Playwright browsers
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
id: playwright-cache
with:
path: ~/.cache/ms-playwright
@@ -154,7 +154,7 @@ jobs:
working-directory: ./ui
run: pnpm run test:e2e
- name: Upload test reports
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
if: failure()
with:
name: playwright-report
+3 -3
@@ -34,7 +34,7 @@ jobs:
- name: Check for UI changes
id: check-changes
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
uses: tj-actions/changed-files@e0021407031f5be11a464abee9a0776171c79891 # v47.0.1
with:
files: |
ui/**
@@ -45,7 +45,7 @@ jobs:
- name: Setup Node.js ${{ env.NODE_VERSION }}
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: ${{ env.NODE_VERSION }}
@@ -63,7 +63,7 @@ jobs:
- name: Setup pnpm cache
if: steps.check-changes.outputs.any_changed == 'true'
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('ui/pnpm-lock.yaml') }}
+9
@@ -82,6 +82,9 @@ continue.json
.continuerc
.continuerc.json
# AI Coding Assistants - OpenCode
opencode.json
# AI Coding Assistants - GitHub Copilot
.copilot/
.github/copilot/
@@ -152,3 +155,9 @@ CLAUDE.md
# Compliance report
*.pdf
# AI Skills symlinks (generated by skills/setup.sh)
.claude/skills
.codex/skills
.github/skills
.gemini/skills
+6 -3
@@ -34,6 +34,7 @@ repos:
rev: v2.3.1
hooks:
- id: autoflake
exclude: ^skills/
args:
[
"--in-place",
@@ -45,18 +46,20 @@ repos:
rev: 5.13.2
hooks:
- id: isort
exclude: ^skills/
args: ["--profile", "black"]
- repo: https://github.com/psf/black
rev: 24.4.2
hooks:
- id: black
exclude: ^skills/
- repo: https://github.com/pycqa/flake8
rev: 7.0.0
hooks:
- id: flake8
exclude: contrib
exclude: (contrib|^skills/)
args: ["--ignore=E266,W503,E203,E501,W605"]
- repo: https://github.com/python-poetry/poetry
@@ -109,7 +112,7 @@ repos:
- id: bandit
name: bandit
description: "Bandit is a tool for finding common security issues in Python code"
entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/' -r .'
entry: bash -c 'bandit -q -lll -x '*_test.py,./contrib/,./.venv/,./skills/' -r .'
language: system
files: '.*\.py'
@@ -123,7 +126,7 @@ repos:
- id: vulture
name: vulture
description: "Vulture finds unused code in Python programs."
entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/" --min-confidence 100 .'
entry: bash -c 'vulture --exclude "contrib,.venv,api/src/backend/api/tests/,api/src/backend/conftest.py,api/src/backend/tasks/tests/,skills/" --min-confidence 100 .'
language: system
files: '.*\.py'
+62 -88
@@ -2,109 +2,83 @@
## How to Use This Guide
- Start here for cross-project norms. Prowler is a monorepo with several components, and every component should have an `AGENTS.md` file containing the guidelines for agents in that component, located beside the code you are touching (e.g. `api/AGENTS.md`, `ui/AGENTS.md`, `prowler/AGENTS.md`).
- Follow the stricter rule when guidance conflicts; component docs override this file for their scope.
- Keep instructions synchronized. When you add new workflows or scripts, update both the relevant component `AGENTS.md` and this file if they apply broadly.
- Start here for cross-project norms. Prowler is a monorepo with several components.
- Each component has an `AGENTS.md` file with specific guidelines (e.g., `api/AGENTS.md`, `ui/AGENTS.md`).
- Component docs override this file when guidance conflicts.
## Available Skills
Use these skills for detailed patterns on-demand:
### Generic Skills (Any Project)
| Skill | Description | URL |
|-------|-------------|-----|
| `typescript` | Const types, flat interfaces, utility types | [SKILL.md](skills/typescript/SKILL.md) |
| `react-19` | No useMemo/useCallback, React Compiler | [SKILL.md](skills/react-19/SKILL.md) |
| `nextjs-15` | App Router, Server Actions, streaming | [SKILL.md](skills/nextjs-15/SKILL.md) |
| `tailwind-4` | cn() utility, no var() in className | [SKILL.md](skills/tailwind-4/SKILL.md) |
| `playwright` | Page Object Model, MCP workflow, selectors | [SKILL.md](skills/playwright/SKILL.md) |
| `pytest` | Fixtures, mocking, markers, parametrize | [SKILL.md](skills/pytest/SKILL.md) |
| `django-drf` | ViewSets, Serializers, Filters | [SKILL.md](skills/django-drf/SKILL.md) |
| `zod-4` | New API (z.email(), z.uuid()) | [SKILL.md](skills/zod-4/SKILL.md) |
| `zustand-5` | Persist, selectors, slices | [SKILL.md](skills/zustand-5/SKILL.md) |
| `ai-sdk-5` | UIMessage, streaming, LangChain | [SKILL.md](skills/ai-sdk-5/SKILL.md) |
### Prowler-Specific Skills
| Skill | Description | URL |
|-------|-------------|-----|
| `prowler` | Project overview, component navigation | [SKILL.md](skills/prowler/SKILL.md) |
| `prowler-api` | Django + RLS + JSON:API patterns | [SKILL.md](skills/prowler-api/SKILL.md) |
| `prowler-ui` | Next.js + shadcn conventions | [SKILL.md](skills/prowler-ui/SKILL.md) |
| `prowler-sdk-check` | Create new security checks | [SKILL.md](skills/prowler-sdk-check/SKILL.md) |
| `prowler-mcp` | MCP server tools and models | [SKILL.md](skills/prowler-mcp/SKILL.md) |
| `prowler-test-sdk` | SDK testing (pytest + moto) | [SKILL.md](skills/prowler-test-sdk/SKILL.md) |
| `prowler-test-api` | API testing (pytest-django + RLS) | [SKILL.md](skills/prowler-test-api/SKILL.md) |
| `prowler-test-ui` | E2E testing (Playwright) | [SKILL.md](skills/prowler-test-ui/SKILL.md) |
| `prowler-compliance` | Compliance framework structure | [SKILL.md](skills/prowler-compliance/SKILL.md) |
| `prowler-provider` | Add new cloud providers | [SKILL.md](skills/prowler-provider/SKILL.md) |
| `prowler-pr` | Pull request conventions | [SKILL.md](skills/prowler-pr/SKILL.md) |
| `prowler-docs` | Documentation style guide | [SKILL.md](skills/prowler-docs/SKILL.md) |
| `skill-creator` | Create new AI agent skills | [SKILL.md](skills/skill-creator/SKILL.md) |
---
## Project Overview
Prowler is an open-source cloud security assessment tool that supports multiple cloud providers (AWS, Azure, GCP, Kubernetes, GitHub, M365, etc.). The project consists of a monorepo with the following main components:
Prowler is an open-source cloud security assessment tool supporting AWS, Azure, GCP, Kubernetes, GitHub, M365, and more.
- **Prowler SDK**: Python SDK, includes the Prowler CLI, providers, services, checks, compliances, config, etc. (`prowler/`)
- **Prowler API**: Django-based REST API backend (`api/`)
- **Prowler UI**: Next.js frontend application (`ui/`)
- **Prowler MCP Server**: Model Context Protocol server that gives LLMs access to the entire Prowler ecosystem (`mcp_server/`)
- **Prowler Dashboard**: Prowler CLI feature that visualizes scan results in a simple dashboard (`dashboard/`)
| Component | Location | Tech Stack |
|-----------|----------|------------|
| SDK | `prowler/` | Python 3.9+, Poetry |
| API | `api/` | Django 5.1, DRF, Celery |
| UI | `ui/` | Next.js 15, React 19, Tailwind 4 |
| MCP Server | `mcp_server/` | FastMCP, Python 3.12+ |
| Dashboard | `dashboard/` | Dash, Plotly |
### Project Structure (Key Folders & Files)
- `prowler/`: Main source code for Prowler SDK (CLI, providers, services, checks, compliances, config, etc.)
- `api/`: Django-based REST API backend components
- `ui/`: Next.js frontend application
- `mcp_server/`: Model Context Protocol server that gives LLMs access to the entire Prowler ecosystem
- `dashboard/`: Prowler CLI feature that visualizes scan results in a simple dashboard
- `docs/`: Documentation
- `examples/`: Example output formats for providers and scripts
- `permissions/`: Permission-related files and policies
- `contrib/`: Community-contributed scripts or modules
- `tests/`: Prowler SDK test suite
- `docker-compose.yml`: Docker Compose file to run the Prowler App (API + UI) in a production environment
- `docker-compose-dev.yml`: Docker Compose file to run the Prowler App (API + UI) in a development environment
- `pyproject.toml`: Poetry Prowler SDK project file
- `.pre-commit-config.yaml`: Pre-commit hooks configuration
- `Makefile`: Makefile to run the project
- `LICENSE`: License file
- `README.md`: README file
- `CONTRIBUTING.md`: Contributing guide
---
## Python Development
Most of the code is written in Python, so the tooling files in the repository root are focused on Python.
### Poetry Dev Environment
For developing in Python we recommend using `poetry` (minimum version `2.1.1`) to manage the dependencies, so run all commands via `poetry run ...`.
To install the development dependencies, run `poetry install --with dev`.
### Pre-commit hooks
The project has pre-commit hooks to lint and format the code. Install them by running `poetry run pre-commit install`.
When committing a change, the hooks run automatically. Some of them are:
- Code formatting (black, isort)
- Linting (flake8, pylint)
- Security checks (bandit, safety, trufflehog)
- YAML/JSON validation
- Poetry lock file validation
### Linting and Formatting
We use the following tools to lint and format the code:
- `flake8`: for linting the code
- `black`: for formatting the code
- `pylint`: for linting the code
You can run all using the `make` command:
```bash
# Setup
poetry install --with dev
poetry run pre-commit install
# Code quality
poetry run make lint
poetry run make format
poetry run pre-commit run --all-files
```
Or they will be run automatically when you commit your changes using pre-commit hooks.
---
## Commit & Pull Request Guidelines
Commit messages and pull request titles follow the conventional-commit style.
Follow conventional-commit style: `<type>[scope]: <description>`
Before creating a pull request, complete the checklist in `.github/pull_request_template.md`. Summaries should explain deployment impact, highlight review steps, and note changelog or permission updates. Run all relevant tests and linters before requesting review, and link screenshots for UI or dashboard changes.
**Types:** `feat`, `fix`, `docs`, `chore`, `perf`, `refactor`, `style`, `test`
### Conventional Commit Style
The Conventional Commits specification is a lightweight convention on top of commit messages. It provides a simple set of rules for creating an explicit commit history, which makes it easier to build automated tooling on top of.
The commit message should be structured as follows:
```
<type>[optional scope]: <description>
<BLANK LINE>
[optional body]
<BLANK LINE>
[optional footer(s)]
```
No line of the commit message may be longer than 100 characters. This keeps the message easy to read on GitHub as well as in various git tools.
#### Commit Types
- **feat**: a code change that introduces new functionality to the application
- **fix**: a code change that solves a bug in the codebase
- **docs**: documentation-only changes
- **chore**: changes related to the build process or auxiliary tools and libraries that do not affect the application's functionality
- **perf**: a code change that improves performance
- **refactor**: a code change that neither fixes a bug nor adds a feature
- **style**: changes that do not affect the meaning of the code (white-space, formatting, missing semicolons, etc.)
- **test**: adding missing tests or correcting existing tests
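Putting the structure and the types together, a hypothetical commit message could look like:

```
feat(aws): add encryption check for a storage service

Add a check that verifies encryption at rest is enabled on the
service's resources and reports a FAIL finding otherwise.

Closes #1234
```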
Before creating a PR:
1. Complete checklist in `.github/pull_request_template.md`
2. Run all relevant tests and linters
3. Link screenshots for UI changes
+39
@@ -310,6 +310,45 @@ And many more environments.
![Architecture](docs/img/architecture.png)
# 🤖 AI Skills for Development
Prowler includes a comprehensive set of **AI Skills** that help AI coding assistants understand Prowler's codebase patterns and conventions.
## What are AI Skills?
Skills are structured instructions that give AI assistants the context they need to write code that follows Prowler's standards. They include:
- **Coding patterns** for each component (SDK, API, UI, MCP Server)
- **Testing conventions** (pytest, Playwright)
- **Architecture guidelines** (Clean Architecture, RLS patterns)
- **Framework-specific rules** (React 19, Next.js 15, Django DRF, Tailwind 4)
## Available Skills
| Category | Skills |
|----------|--------|
| **Generic** | `typescript`, `react-19`, `nextjs-15`, `tailwind-4`, `playwright`, `pytest`, `django-drf`, `zod-4`, `zustand-5`, `ai-sdk-5` |
| **Prowler** | `prowler`, `prowler-api`, `prowler-ui`, `prowler-mcp`, `prowler-sdk-check`, `prowler-test-ui`, `prowler-test-api`, `prowler-test-sdk`, `prowler-compliance`, `prowler-provider`, `prowler-pr`, `prowler-docs` |
## Setup
```bash
./skills/setup.sh
```
This configures skills for AI coding assistants that follow the [agentskills.io](https://agentskills.io) standard:
| Tool | Configuration |
|------|---------------|
| **Claude Code** | `.claude/skills/` (symlink) |
| **OpenCode** | `.claude/skills/` (symlink) |
| **Codex (OpenAI)** | `.codex/skills/` (symlink) |
| **GitHub Copilot** | `.github/skills/` (symlink) |
| **Gemini CLI** | `.gemini/skills/` (symlink) |
> **Note:** Restart your AI coding assistant after running setup to load the skills.
> Gemini CLI requires `experimental.skills` enabled in settings.
# 📖 Documentation
For installation instructions, usage details, tutorials, and the Developer Guide, visit https://docs.prowler.com/
+137
@@ -0,0 +1,137 @@
# Prowler API - AI Agent Ruleset
> **Skills Reference**: For detailed patterns, use these skills:
> - [`prowler-api`](../skills/prowler-api/SKILL.md) - Models, Serializers, Views, RLS patterns
> - [`prowler-test-api`](../skills/prowler-test-api/SKILL.md) - Testing patterns (pytest-django)
> - [`django-drf`](../skills/django-drf/SKILL.md) - Generic DRF patterns
> - [`pytest`](../skills/pytest/SKILL.md) - Generic pytest patterns
## CRITICAL RULES - NON-NEGOTIABLE
### Models
- ALWAYS: UUIDv4 PKs, `inserted_at`/`updated_at` timestamps, `JSONAPIMeta` class
- ALWAYS: Inherit from `RowLevelSecurityProtectedModel` for tenant-scoped data
- NEVER: Auto-increment integer PKs, models without tenant isolation
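The UUIDv4-PK rule can be illustrated with a plain-Python sketch; the class and field names below are hypothetical stand-ins, not the actual Django models (only `uuid` is real stdlib):

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class ProviderRecord:
    """Hypothetical stand-in for a tenant-scoped Django model."""
    tenant_id: str  # tenant scoping, as RowLevelSecurityProtectedModel enforces
    # UUIDv4 primary key generated per row, never an auto-increment integer
    id: uuid.UUID = field(default_factory=uuid.uuid4)

a = ProviderRecord(tenant_id="tenant-1")
b = ProviderRecord(tenant_id="tenant-1")
assert a.id != b.id       # each record gets its own random UUID
assert a.id.version == 4  # UUIDv4, per the rule above
```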
### Serializers
- ALWAYS: Separate serializers for Create/Update operations
- ALWAYS: Inherit from `RLSSerializer` for tenant-scoped models
- NEVER: Write logic in serializers (use services/utils)
### Views
- ALWAYS: Inherit from `BaseRLSViewSet` for tenant-scoped resources
- ALWAYS: Define `filterset_class`, use `@extend_schema` for OpenAPI
- NEVER: Raw SQL queries, business logic in views
### Row-Level Security (RLS)
- ALWAYS: Use `rls_transaction(tenant_id)` context manager
- NEVER: Query across tenants, trust client-provided tenant_id
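`rls_transaction` is Prowler project code not shown here; a minimal stdlib sketch of the pattern it implements, scoping everything inside the block to one tenant and always clearing it afterwards, might look like this (the `FakeConnection` is a hypothetical stand-in for a real database connection):

```python
from contextlib import contextmanager

class FakeConnection:
    """Stand-in for a DB connection; real code would set a Postgres session variable."""
    def __init__(self):
        self.tenant_id = None
    def set_tenant(self, tenant_id):
        self.tenant_id = tenant_id
    def reset_tenant(self):
        self.tenant_id = None

@contextmanager
def rls_transaction(connection, tenant_id):
    """Hypothetical sketch: scope queries inside the block to one tenant."""
    connection.set_tenant(tenant_id)
    try:
        yield connection
    finally:
        # Always clear the tenant, even if the block raises
        connection.reset_tenant()

conn = FakeConnection()
with rls_transaction(conn, "tenant-123"):
    assert conn.tenant_id == "tenant-123"  # queries here see only this tenant
assert conn.tenant_id is None              # scope is gone outside the block
```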
### Celery Tasks
- ALWAYS: `@shared_task` with `name`, `queue`, `RLSTask` base class
- NEVER: Long-running ops in views, request context in tasks
---
## DECISION TREES
### Serializer Selection
```
Read → <Model>Serializer
Create → <Model>CreateSerializer
Update → <Model>UpdateSerializer
Nested read → <Model>IncludeSerializer
```
### Task vs View
```
< 100ms → View
> 100ms or external API → Celery task
Needs retry → Celery task
```
---
## TECH STACK
Django 5.1.x | DRF 3.15.x | djangorestframework-jsonapi 7.x | Celery 5.4.x | PostgreSQL 16 | pytest 8.x
---
## PROJECT STRUCTURE
```
api/src/backend/
├── api/ # Main Django app
│ ├── v1/ # API version 1 (views, serializers, urls)
│ ├── models.py # Django models
│ ├── filters.py # FilterSet classes
│ ├── base_views.py # Base ViewSet classes
│ ├── rls.py # Row-Level Security
│ └── tests/ # Unit tests
├── config/ # Django configuration
└── tasks/ # Celery tasks
```
---
## COMMANDS
```bash
# Development
poetry run python src/backend/manage.py runserver
poetry run celery -A config.celery worker -l INFO
# Database
poetry run python src/backend/manage.py makemigrations
poetry run python src/backend/manage.py migrate
# Testing & Linting
poetry run pytest -x --tb=short
poetry run make lint
```
---
## QA CHECKLIST
- [ ] `poetry run pytest` passes
- [ ] `poetry run make lint` passes
- [ ] Migrations created if models changed
- [ ] New endpoints have `@extend_schema` decorators
- [ ] RLS properly applied for tenant data
- [ ] Tests cover success and error cases
---
## NAMING CONVENTIONS
| Entity | Pattern | Example |
|--------|---------|---------|
| Serializer (read) | `<Model>Serializer` | `ProviderSerializer` |
| Serializer (create) | `<Model>CreateSerializer` | `ProviderCreateSerializer` |
| Serializer (update) | `<Model>UpdateSerializer` | `ProviderUpdateSerializer` |
| Filter | `<Model>Filter` | `ProviderFilter` |
| ViewSet | `<Model>ViewSet` | `ProviderViewSet` |
| Task | `<action>_<entity>_task` | `sync_provider_resources_task` |
---
## API CONVENTIONS (JSON:API)
```json
{
"data": {
"type": "providers",
"id": "uuid",
"attributes": { "name": "value" },
"relationships": { "tenant": { "data": { "type": "tenants", "id": "uuid" } } }
}
}
```
- Content-Type: `application/vnd.api+json`
- Pagination: `?page[number]=1&page[size]=20`
- Filtering: `?filter[field]=value`, `?filter[field__in]=val1,val2`
- Sorting: `?sort=field`, `?sort=-field`
- Including: `?include=provider,findings`
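The filter, sort, and pagination conventions above are plain query-string keys; purely as an illustration (not Prowler code, and the field names are hypothetical), they can be assembled with the standard library:

```python
from urllib.parse import urlencode

# Hypothetical JSON:API query following the conventions above
params = {
    "filter[provider]": "aws",
    "filter[severity__in]": "high,critical",
    "page[number]": 1,
    "page[size]": 20,
    "sort": "-inserted_at",
    "include": "provider",
}
query = urlencode(params)  # brackets are percent-encoded as %5B / %5D
assert "page%5Bnumber%5D=1" in query
assert "sort=-inserted_at" in query
```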
+2
@@ -5,7 +5,9 @@ All notable changes to the **Prowler API** are documented in this file.
## [1.18.0] (Prowler UNRELEASED)
### Added
- `/api/v1/overviews/compliance-watchlist` to retrieve the compliance watchlist [(#9596)](https://github.com/prowler-cloud/prowler/pull/9596)
- Support AlibabaCloud provider [(#9485)](https://github.com/prowler-cloud/prowler/pull/9485)
- `provider_id` and `provider_id__in` filter aliases for findings endpoints to enable consistent frontend parameter naming [(#9701)](https://github.com/prowler-cloud/prowler/pull/9701)
---
+261 -190
@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.2.1 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -38,98 +38,132 @@ files = [
[[package]]
name = "aiohttp"
version = "3.12.15"
version = "3.13.3"
description = "Async http client/server framework (asyncio)"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "aiohttp-3.12.15-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:b6fc902bff74d9b1879ad55f5404153e2b33a82e72a95c89cec5eb6cc9e92fbc"},
{file = "aiohttp-3.12.15-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:098e92835b8119b54c693f2f88a1dec690e20798ca5f5fe5f0520245253ee0af"},
{file = "aiohttp-3.12.15-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:40b3fee496a47c3b4a39a731954c06f0bd9bd3e8258c059a4beb76ac23f8e421"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2ce13fcfb0bb2f259fb42106cdc63fa5515fb85b7e87177267d89a771a660b79"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3beb14f053222b391bf9cf92ae82e0171067cc9c8f52453a0f1ec7c37df12a77"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c39e87afe48aa3e814cac5f535bc6199180a53e38d3f51c5e2530f5aa4ec58c"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5f1b4ce5bc528a6ee38dbf5f39bbf11dd127048726323b72b8e85769319ffc4"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1004e67962efabbaf3f03b11b4c43b834081c9e3f9b32b16a7d97d4708a9abe6"},
{file = "aiohttp-3.12.15-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8faa08fcc2e411f7ab91d1541d9d597d3a90e9004180edb2072238c085eac8c2"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fe086edf38b2222328cdf89af0dde2439ee173b8ad7cb659b4e4c6f385b2be3d"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:79b26fe467219add81d5e47b4a4ba0f2394e8b7c7c3198ed36609f9ba161aecb"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:b761bac1192ef24e16706d761aefcb581438b34b13a2f069a6d343ec8fb693a5"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e153e8adacfe2af562861b72f8bc47f8a5c08e010ac94eebbe33dc21d677cd5b"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:fc49c4de44977aa8601a00edbf157e9a421f227aa7eb477d9e3df48343311065"},
{file = "aiohttp-3.12.15-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:2776c7ec89c54a47029940177e75c8c07c29c66f73464784971d6a81904ce9d1"},
{file = "aiohttp-3.12.15-cp310-cp310-win32.whl", hash = "sha256:2c7d81a277fa78b2203ab626ced1487420e8c11a8e373707ab72d189fcdad20a"},
{file = "aiohttp-3.12.15-cp310-cp310-win_amd64.whl", hash = "sha256:83603f881e11f0f710f8e2327817c82e79431ec976448839f3cd05d7afe8f830"},
{file = "aiohttp-3.12.15-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d3ce17ce0220383a0f9ea07175eeaa6aa13ae5a41f30bc61d84df17f0e9b1117"},
{file = "aiohttp-3.12.15-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:010cc9bbd06db80fe234d9003f67e97a10fe003bfbedb40da7d71c1008eda0fe"},
{file = "aiohttp-3.12.15-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3f9d7c55b41ed687b9d7165b17672340187f87a773c98236c987f08c858145a9"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bc4fbc61bb3548d3b482f9ac7ddd0f18c67e4225aaa4e8552b9f1ac7e6bda9e5"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7fbc8a7c410bb3ad5d595bb7118147dfbb6449d862cc1125cf8867cb337e8728"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:74dad41b3458dbb0511e760fb355bb0b6689e0630de8a22b1b62a98777136e16"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b6f0af863cf17e6222b1735a756d664159e58855da99cfe965134a3ff63b0b0"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5b7fe4972d48a4da367043b8e023fb70a04d1490aa7d68800e465d1b97e493b"},
{file = "aiohttp-3.12.15-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6443cca89553b7a5485331bc9bedb2342b08d073fa10b8c7d1c60579c4a7b9bd"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6c5f40ec615e5264f44b4282ee27628cea221fcad52f27405b80abb346d9f3f8"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:2abbb216a1d3a2fe86dbd2edce20cdc5e9ad0be6378455b05ec7f77361b3ab50"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:db71ce547012a5420a39c1b744d485cfb823564d01d5d20805977f5ea1345676"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:ced339d7c9b5030abad5854aa5413a77565e5b6e6248ff927d3e174baf3badf7"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:7c7dd29c7b5bda137464dc9bfc738d7ceea46ff70309859ffde8c022e9b08ba7"},
{file = "aiohttp-3.12.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:421da6fd326460517873274875c6c5a18ff225b40da2616083c5a34a7570b685"},
{file = "aiohttp-3.12.15-cp311-cp311-win32.whl", hash = "sha256:4420cf9d179ec8dfe4be10e7d0fe47d6d606485512ea2265b0d8c5113372771b"},
{file = "aiohttp-3.12.15-cp311-cp311-win_amd64.whl", hash = "sha256:edd533a07da85baa4b423ee8839e3e91681c7bfa19b04260a469ee94b778bf6d"},
{file = "aiohttp-3.12.15-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:802d3868f5776e28f7bf69d349c26fc0efadb81676d0afa88ed00d98a26340b7"},
{file = "aiohttp-3.12.15-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2800614cd560287be05e33a679638e586a2d7401f4ddf99e304d98878c29444"},
{file = "aiohttp-3.12.15-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8466151554b593909d30a0a125d638b4e5f3836e5aecde85b66b80ded1cb5b0d"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e5a495cb1be69dae4b08f35a6c4579c539e9b5706f606632102c0f855bcba7c"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6404dfc8cdde35c69aaa489bb3542fb86ef215fc70277c892be8af540e5e21c0"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3ead1c00f8521a5c9070fcb88f02967b1d8a0544e6d85c253f6968b785e1a2ab"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6990ef617f14450bc6b34941dba4f12d5613cbf4e33805932f853fbd1cf18bfb"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd736ed420f4db2b8148b52b46b88ed038d0354255f9a73196b7bbce3ea97545"},
{file = "aiohttp-3.12.15-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c5092ce14361a73086b90c6efb3948ffa5be2f5b6fbcf52e8d8c8b8848bb97c"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aaa2234bb60c4dbf82893e934d8ee8dea30446f0647e024074237a56a08c01bd"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:6d86a2fbdd14192e2f234a92d3b494dd4457e683ba07e5905a0b3ee25389ac9f"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a041e7e2612041a6ddf1c6a33b883be6a421247c7afd47e885969ee4cc58bd8d"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5015082477abeafad7203757ae44299a610e89ee82a1503e3d4184e6bafdd519"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:56822ff5ddfd1b745534e658faba944012346184fbfe732e0d6134b744516eea"},
{file = "aiohttp-3.12.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b2acbbfff69019d9014508c4ba0401822e8bae5a5fdc3b6814285b71231b60f3"},
{file = "aiohttp-3.12.15-cp312-cp312-win32.whl", hash = "sha256:d849b0901b50f2185874b9a232f38e26b9b3d4810095a7572eacea939132d4e1"},
{file = "aiohttp-3.12.15-cp312-cp312-win_amd64.whl", hash = "sha256:b390ef5f62bb508a9d67cb3bba9b8356e23b3996da7062f1a57ce1a79d2b3d34"},
{file = "aiohttp-3.12.15-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:9f922ffd05034d439dde1c77a20461cf4a1b0831e6caa26151fe7aa8aaebc315"},
{file = "aiohttp-3.12.15-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2ee8a8ac39ce45f3e55663891d4b1d15598c157b4d494a4613e704c8b43112cd"},
{file = "aiohttp-3.12.15-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3eae49032c29d356b94eee45a3f39fdf4b0814b397638c2f718e96cfadf4c4e4"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b97752ff12cc12f46a9b20327104448042fce5c33a624f88c18f66f9368091c7"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:894261472691d6fe76ebb7fcf2e5870a2ac284c7406ddc95823c8598a1390f0d"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5fa5d9eb82ce98959fc1031c28198b431b4d9396894f385cb63f1e2f3f20ca6b"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0fa751efb11a541f57db59c1dd821bec09031e01452b2b6217319b3a1f34f3d"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5346b93e62ab51ee2a9d68e8f73c7cf96ffb73568a23e683f931e52450e4148d"},
{file = "aiohttp-3.12.15-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:049ec0360f939cd164ecbfd2873eaa432613d5e77d6b04535e3d1fbae5a9e645"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b52dcf013b57464b6d1e51b627adfd69a8053e84b7103a7cd49c030f9ca44461"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9b2af240143dd2765e0fb661fd0361a1b469cab235039ea57663cda087250ea9"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ac77f709a2cde2cc71257ab2d8c74dd157c67a0558a0d2799d5d571b4c63d44d"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:47f6b962246f0a774fbd3b6b7be25d59b06fdb2f164cf2513097998fc6a29693"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:760fb7db442f284996e39cf9915a94492e1896baac44f06ae551974907922b64"},
{file = "aiohttp-3.12.15-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ad702e57dc385cae679c39d318def49aef754455f237499d5b99bea4ef582e51"},
{file = "aiohttp-3.12.15-cp313-cp313-win32.whl", hash = "sha256:f813c3e9032331024de2eb2e32a88d86afb69291fbc37a3a3ae81cc9917fb3d0"},
{file = "aiohttp-3.12.15-cp313-cp313-win_amd64.whl", hash = "sha256:1a649001580bdb37c6fdb1bebbd7e3bc688e8ec2b5c6f52edbb664662b17dc84"},
{file = "aiohttp-3.12.15-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:691d203c2bdf4f4637792efbbcdcd157ae11e55eaeb5e9c360c1206fb03d4d98"},
{file = "aiohttp-3.12.15-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8e995e1abc4ed2a454c731385bf4082be06f875822adc4c6d9eaadf96e20d406"},
{file = "aiohttp-3.12.15-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:bd44d5936ab3193c617bfd6c9a7d8d1085a8dc8c3f44d5f1dcf554d17d04cf7d"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:46749be6e89cd78d6068cdf7da51dbcfa4321147ab8e4116ee6678d9a056a0cf"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0c643f4d75adea39e92c0f01b3fb83d57abdec8c9279b3078b68a3a52b3933b6"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0a23918fedc05806966a2438489dcffccbdf83e921a1170773b6178d04ade142"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:74bdd8c864b36c3673741023343565d95bfbd778ffe1eb4d412c135a28a8dc89"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0a146708808c9b7a988a4af3821379e379e0f0e5e466ca31a73dbdd0325b0263"},
{file = "aiohttp-3.12.15-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b7011a70b56facde58d6d26da4fec3280cc8e2a78c714c96b7a01a87930a9530"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:3bdd6e17e16e1dbd3db74d7f989e8af29c4d2e025f9828e6ef45fbdee158ec75"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:57d16590a351dfc914670bd72530fd78344b885a00b250e992faea565b7fdc05"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:bc9a0f6569ff990e0bbd75506c8d8fe7214c8f6579cca32f0546e54372a3bb54"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:536ad7234747a37e50e7b6794ea868833d5220b49c92806ae2d7e8a9d6b5de02"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:f0adb4177fa748072546fb650d9bd7398caaf0e15b370ed3317280b13f4083b0"},
{file = "aiohttp-3.12.15-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:14954a2988feae3987f1eb49c706bff39947605f4b6fa4027c1d75743723eb09"},
{file = "aiohttp-3.12.15-cp39-cp39-win32.whl", hash = "sha256:b784d6ed757f27574dca1c336f968f4e81130b27595e458e69457e6878251f5d"},
{file = "aiohttp-3.12.15-cp39-cp39-win_amd64.whl", hash = "sha256:86ceded4e78a992f835209e236617bffae649371c4a50d5e5a3987f237db84b8"},
{file = "aiohttp-3.12.15.tar.gz", hash = "sha256:4fc61385e9c98d72fcdf47e6dd81833f47b2f77c114c29cd64a361be57a763a2"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d5a372fd5afd301b3a89582817fdcdb6c34124787c70dbcc616f259013e7eef7"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:147e422fd1223005c22b4fe080f5d93ced44460f5f9c105406b753612b587821"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:859bd3f2156e81dd01432f5849fc73e2243d4a487c4fd26609b1299534ee1845"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dca68018bf48c251ba17c72ed479f4dafe9dbd5a73707ad8d28a38d11f3d42af"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fee0c6bc7db1de362252affec009707a17478a00ec69f797d23ca256e36d5940"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c048058117fd649334d81b4b526e94bde3ccaddb20463a815ced6ecbb7d11160"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:215a685b6fbbfcf71dfe96e3eba7a6f58f10da1dfdf4889c7dd856abe430dca7"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2c184bb1fe2cbd2cefba613e9db29a5ab559323f994b6737e370d3da0ac455"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:75ca857eba4e20ce9f546cd59c7007b33906a4cd48f2ff6ccf1ccfc3b646f279"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:81e97251d9298386c2b7dbeb490d3d1badbdc69107fb8c9299dd04eb39bddc0e"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:c0e2d366af265797506f0283487223146af57815b388623f0357ef7eac9b209d"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4e239d501f73d6db1522599e14b9b321a7e3b1de66ce33d53a765d975e9f4808"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:0db318f7a6f065d84cb1e02662c526294450b314a02bd9e2a8e67f0d8564ce40"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:bfc1cc2fe31a6026a8a88e4ecfb98d7f6b1fec150cfd708adbfd1d2f42257c29"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:af71fff7bac6bb7508956696dce8f6eec2bbb045eceb40343944b1ae62b5ef11"},
{file = "aiohttp-3.13.3-cp310-cp310-win32.whl", hash = "sha256:37da61e244d1749798c151421602884db5270faf479cf0ef03af0ff68954c9dd"},
{file = "aiohttp-3.13.3-cp310-cp310-win_amd64.whl", hash = "sha256:7e63f210bc1b57ef699035f2b4b6d9ce096b5914414a49b0997c839b2bd2223c"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5b6073099fb654e0a068ae678b10feff95c5cae95bbfcbfa7af669d361a8aa6b"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cb93e166e6c28716c8c6aeb5f99dfb6d5ccf482d29fe9bf9a794110e6d0ab64"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:28e027cf2f6b641693a09f631759b4d9ce9165099d2b5d92af9bd4e197690eea"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3b61b7169ababd7802f9568ed96142616a9118dd2be0d1866e920e77ec8fa92a"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:80dd4c21b0f6237676449c6baaa1039abae86b91636b6c91a7f8e61c87f89540"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:65d2ccb7eabee90ce0503c17716fc77226be026dcc3e65cce859a30db715025b"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5b179331a481cb5529fca8b432d8d3c7001cb217513c94cd72d668d1248688a3"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d4c940f02f49483b18b079d1c27ab948721852b281f8b015c058100e9421dd1"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f9444f105664c4ce47a2a7171a2418bce5b7bae45fb610f4e2c36045d85911d3"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:694976222c711d1d00ba131904beb60534f93966562f64440d0c9d41b8cdb440"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f33ed1a2bf1997a36661874b017f5c4b760f41266341af36febaf271d179f6d7"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e636b3c5f61da31a92bf0d91da83e58fdfa96f178ba682f11d24f31944cdd28c"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5d2d94f1f5fcbe40838ac51a6ab5704a6f9ea42e72ceda48de5e6b898521da51"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2be0e9ccf23e8a94f6f0650ce06042cefc6ac703d0d7ab6c7a917289f2539ad4"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9af5e68ee47d6534d36791bbe9b646d2a7c7deb6fc24d7943628edfbb3581f29"},
{file = "aiohttp-3.13.3-cp311-cp311-win32.whl", hash = "sha256:a2212ad43c0833a873d0fb3c63fa1bacedd4cf6af2fee62bf4b739ceec3ab239"},
{file = "aiohttp-3.13.3-cp311-cp311-win_amd64.whl", hash = "sha256:642f752c3eb117b105acbd87e2c143de710987e09860d674e068c4c2c441034f"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b903a4dfee7d347e2d87697d0713be59e0b87925be030c9178c5faa58ea58d5c"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a45530014d7a1e09f4a55f4f43097ba0fd155089372e105e4bff4ca76cb1b168"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:27234ef6d85c914f9efeb77ff616dbf4ad2380be0cda40b4db086ffc7ddd1b7d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d32764c6c9aafb7fb55366a224756387cd50bfa720f32b88e0e6fa45b27dcf29"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b1a6102b4d3ebc07dad44fbf07b45bb600300f15b552ddf1851b5390202ea2e3"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c014c7ea7fb775dd015b2d3137378b7be0249a448a1612268b5a90c2d81de04d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2b8d8ddba8f95ba17582226f80e2de99c7a7948e66490ef8d947e272a93e9463"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ae8dd55c8e6c4257eae3a20fd2c8f41edaea5992ed67156642493b8daf3cecc"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:01ad2529d4b5035578f5081606a465f3b814c542882804e2e8cda61adf5c71bf"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bb4f7475e359992b580559e008c598091c45b5088f28614e855e42d39c2f1033"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:c19b90316ad3b24c69cd78d5c9b4f3aa4497643685901185b65166293d36a00f"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:96d604498a7c782cb15a51c406acaea70d8c027ee6b90c569baa6e7b93073679"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:084911a532763e9d3dd95adf78a78f4096cd5f58cdc18e6fdbc1b58417a45423"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7a4a94eb787e606d0a09404b9c38c113d3b099d508021faa615d70a0131907ce"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:87797e645d9d8e222e04160ee32aa06bc5c163e8499f24db719e7852ec23093a"},
{file = "aiohttp-3.13.3-cp312-cp312-win32.whl", hash = "sha256:b04be762396457bef43f3597c991e192ee7da460a4953d7e647ee4b1c28e7046"},
{file = "aiohttp-3.13.3-cp312-cp312-win_amd64.whl", hash = "sha256:e3531d63d3bdfa7e3ac5e9b27b2dd7ec9df3206a98e0b3445fa906f233264c57"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:5dff64413671b0d3e7d5918ea490bdccb97a4ad29b3f311ed423200b2203e01c"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:87b9aab6d6ed88235aa2970294f496ff1a1f9adcd724d800e9b952395a80ffd9"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:425c126c0dc43861e22cb1c14ba4c8e45d09516d0a3ae0a3f7494b79f5f233a3"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f9120f7093c2a32d9647abcaf21e6ad275b4fbec5b55969f978b1a97c7c86bf"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:697753042d57f4bf7122cab985bf15d0cef23c770864580f5af4f52023a56bd6"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6de499a1a44e7de70735d0b39f67c8f25eb3d91eb3103be99ca0fa882cdd987d"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:37239e9f9a7ea9ac5bf6b92b0260b01f8a22281996da609206a84df860bc1261"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f76c1e3fe7d7c8afad7ed193f89a292e1999608170dcc9751a7462a87dfd5bc0"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fc290605db2a917f6e81b0e1e0796469871f5af381ce15c604a3c5c7e51cb730"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4021b51936308aeea0367b8f006dc999ca02bc118a0cc78c303f50a2ff6afb91"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:49a03727c1bba9a97d3e93c9f93ca03a57300f484b6e935463099841261195d3"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3d9908a48eb7416dc1f4524e69f1d32e5d90e3981e4e37eb0aa1cd18f9cfa2a4"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2712039939ec963c237286113c68dbad80a82a4281543f3abf766d9d73228998"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:7bfdc049127717581866fa4708791220970ce291c23e28ccf3922c700740fdc0"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8057c98e0c8472d8846b9c79f56766bcc57e3e8ac7bfd510482332366c56c591"},
{file = "aiohttp-3.13.3-cp313-cp313-win32.whl", hash = "sha256:1449ceddcdbcf2e0446957863af03ebaaa03f94c090f945411b61269e2cb5daf"},
{file = "aiohttp-3.13.3-cp313-cp313-win_amd64.whl", hash = "sha256:693781c45a4033d31d4187d2436f5ac701e7bbfe5df40d917736108c1cc7436e"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:ea37047c6b367fd4bd632bff8077449b8fa034b69e812a18e0132a00fae6e808"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6fc0e2337d1a4c3e6acafda6a78a39d4c14caea625124817420abceed36e2415"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c685f2d80bb67ca8c3837823ad76196b3694b0159d232206d1e461d3d434666f"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e377758516d262bde50c2584fc6c578af272559c409eecbdd2bae1601184d6"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:34749271508078b261c4abb1767d42b8d0c0cc9449c73a4df494777dc55f0687"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:82611aeec80eb144416956ec85b6ca45a64d76429c1ed46ae1b5f86c6e0c9a26"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2fff83cfc93f18f215896e3a190e8e5cb413ce01553901aca925176e7568963a"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bbe7d4cecacb439e2e2a8a1a7b935c25b812af7a5fd26503a66dadf428e79ec1"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b928f30fe49574253644b1ca44b1b8adbd903aa0da4b9054a6c20fc7f4092a25"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7b5e8fe4de30df199155baaf64f2fcd604f4c678ed20910db8e2c66dc4b11603"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:8542f41a62bcc58fc7f11cf7c90e0ec324ce44950003feb70640fc2a9092c32a"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5e1d8c8b8f1d91cd08d8f4a3c2b067bfca6ec043d3ff36de0f3a715feeedf926"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:90455115e5da1c3c51ab619ac57f877da8fd6d73c05aacd125c5ae9819582aba"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:042e9e0bcb5fba81886c8b4fbb9a09d6b8a00245fd8d88e4d989c1f96c74164c"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2eb752b102b12a76ca02dff751a801f028b4ffbbc478840b473597fc91a9ed43"},
{file = "aiohttp-3.13.3-cp314-cp314-win32.whl", hash = "sha256:b556c85915d8efaed322bf1bdae9486aa0f3f764195a0fb6ee962e5c71ef5ce1"},
{file = "aiohttp-3.13.3-cp314-cp314-win_amd64.whl", hash = "sha256:9bf9f7a65e7aa20dd764151fb3d616c81088f91f8df39c3893a536e279b4b984"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:05861afbbec40650d8a07ea324367cb93e9e8cc7762e04dd4405df99fa65159c"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2fc82186fadc4a8316768d61f3722c230e2c1dcab4200d52d2ebdf2482e47592"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0add0900ff220d1d5c5ebbf99ed88b0c1bbf87aa7e4262300ed1376a6b13414f"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:568f416a4072fbfae453dcf9a99194bbb8bdeab718e08ee13dfa2ba0e4bebf29"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:add1da70de90a2569c5e15249ff76a631ccacfe198375eead4aadf3b8dc849dc"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:10b47b7ba335d2e9b1239fa571131a87e2d8ec96b333e68b2a305e7a98b0bae2"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3dd4dce1c718e38081c8f35f323209d4c1df7d4db4bab1b5c88a6b4d12b74587"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34bac00a67a812570d4a460447e1e9e06fae622946955f939051e7cc895cfab8"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a19884d2ee70b06d9204b2727a7b9f983d0c684c650254679e716b0b77920632"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ca7f2bb6ba8348a3614c7918cc4bb73268c5ac2a207576b7afea19d3d9f64"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:b0d95340658b9d2f11d9697f59b3814a9d3bb4b7a7c20b131df4bcef464037c0"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:a1e53262fd202e4b40b70c3aff944a8155059beedc8a89bba9dc1f9ef06a1b56"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d60ac9663f44168038586cab2157e122e46bdef09e9368b37f2d82d354c23f72"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:90751b8eed69435bac9ff4e3d2f6b3af1f57e37ecb0fbeee59c0174c9e2d41df"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fc353029f176fd2b3ec6cfc71be166aba1936fe5d73dd1992ce289ca6647a9aa"},
{file = "aiohttp-3.13.3-cp314-cp314t-win32.whl", hash = "sha256:2e41b18a58da1e474a057b3d35248d8320029f61d70a37629535b16a0c8f3767"},
{file = "aiohttp-3.13.3-cp314-cp314t-win_amd64.whl", hash = "sha256:44531a36aa2264a1860089ffd4dce7baf875ee5a6079d5fb42e261c704ef7344"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:31a83ea4aead760dfcb6962efb1d861db48c34379f2ff72db9ddddd4cda9ea2e"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:988a8c5e317544fdf0d39871559e67b6341065b87fceac641108c2096d5506b7"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9b174f267b5cfb9a7dba9ee6859cecd234e9a681841eb85068059bc867fb8f02"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:947c26539750deeaee933b000fb6517cc770bbd064bad6033f1cff4803881e43"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9ebf57d09e131f5323464bd347135a88622d1c0976e88ce15b670e7ad57e4bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4ae5b5a0e1926e504c81c5b84353e7a5516d8778fbbff00429fe7b05bb25cbce"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2ba0eea45eb5cc3172dbfc497c066f19c41bac70963ea1a67d51fc92e4cf9a80"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bae5c2ed2eae26cc382020edad80d01f36cb8e746da40b292e68fec40421dc6a"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8a60e60746623925eab7d25823329941aee7242d559baa119ca2b253c88a7bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e50a2e1404f063427c9d027378472316201a2290959a295169bcf25992d04558"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:9a9dc347e5a3dc7dfdbc1f82da0ef29e388ddb2ed281bfce9dd8248a313e62b7"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:b46020d11d23fe16551466c77823df9cc2f2c1e63cc965daf67fa5eec6ca1877"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:69c56fbc1993fa17043e24a546959c0178fe2b5782405ad4559e6c13975c15e3"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:b99281b0704c103d4e11e72a76f1b543d4946fea7dd10767e7e1b5f00d4e5704"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:40c5e40ecc29ba010656c18052b877a1c28f84344825efa106705e835c28530f"},
{file = "aiohttp-3.13.3-cp39-cp39-win32.whl", hash = "sha256:56339a36b9f1fc708260c76c87e593e2afb30d26de9ae1eb445b5e051b98a7a1"},
{file = "aiohttp-3.13.3-cp39-cp39-win_amd64.whl", hash = "sha256:c6b8568a3bb5819a0ad087f16d40e5a3fb6099f39ea1d5625a3edc1e923fc538"},
{file = "aiohttp-3.13.3.tar.gz", hash = "sha256:a949eee43d3782f2daae4f4a2819b2cb9b0c5d3b7f7a927067cc84dafdbb9f88"},
]
[package.dependencies]
@@ -142,7 +176,7 @@ propcache = ">=0.2.0"
yarl = ">=1.17.0,<2.0"
[package.extras]
speedups = ["Brotli ; platform_python_implementation == \"CPython\"", "aiodns (>=3.3.0)", "brotlicffi ; platform_python_implementation != \"CPython\""]
speedups = ["Brotli (>=1.2) ; platform_python_implementation == \"CPython\"", "aiodns (>=3.3.0)", "backports.zstd ; platform_python_implementation == \"CPython\" and python_version < \"3.14\"", "brotlicffi (>=1.2) ; platform_python_implementation != \"CPython\""]
[[package]]
name = "aiosignal"
@@ -813,14 +847,14 @@ tests-mypy = ["mypy (>=1.11.1) ; platform_python_implementation == \"CPython\" a
[[package]]
name = "authlib"
version = "1.6.5"
version = "1.6.6"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "authlib-1.6.5-py2.py3-none-any.whl", hash = "sha256:3e0e0507807f842b02175507bdee8957a1d5707fd4afb17c32fb43fee90b6e3a"},
{file = "authlib-1.6.5.tar.gz", hash = "sha256:6aaf9c79b7cc96c900f0b284061691c5d4e61221640a948fe690b556a6d6d10b"},
{file = "authlib-1.6.6-py2.py3-none-any.whl", hash = "sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd"},
{file = "authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e"},
]
[package.dependencies]
@@ -1554,84 +1588,101 @@ files = [
[[package]]
name = "cffi"
version = "1.17.1"
version = "2.0.0"
description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"},
{file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"},
{file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"},
{file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"},
{file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"},
{file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"},
{file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"},
{file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"},
{file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"},
{file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"},
{file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"},
{file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"},
{file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"},
{file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"},
{file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"},
{file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"},
{file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"},
{file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"},
{file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"},
{file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"},
{file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"},
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
{file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"},
{file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"},
{file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"},
{file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"},
{file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"},
{file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"},
{file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"},
{file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"},
{file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"},
{file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"},
{file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"},
{file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"},
{file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"},
{file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"},
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"},
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"},
{file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"},
{file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"},
{file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"},
{file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"},
{file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"},
{file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"},
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"},
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"},
{file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"},
{file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"},
{file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"},
{file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"},
{file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"},
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"},
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"},
{file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"},
{file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"},
{file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"},
{file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"},
{file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"},
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"},
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"},
{file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"},
{file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"},
{file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"},
{file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"},
{file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"},
{file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"},
{file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"},
{file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"},
{file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[package.dependencies]
pycparser = "*"
pycparser = {version = "*", markers = "implementation_name != \"PyPy\""}
[[package]]
name = "charset-normalizer"
@@ -4216,14 +4267,14 @@ files = [
[[package]]
name = "marshmallow"
version = "3.26.1"
version = "3.26.2"
description = "A lightweight library for converting complex datatypes to and from native Python datatypes."
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "marshmallow-3.26.1-py3-none-any.whl", hash = "sha256:3350409f20a70a7e4e11a27661187b77cdcaeb20abca41c1454fe33636bea09c"},
{file = "marshmallow-3.26.1.tar.gz", hash = "sha256:e6d8affb6cb61d39d26402096dc0aee12d5a26d490a121f118d2e81dc0719dc6"},
{file = "marshmallow-3.26.2-py3-none-any.whl", hash = "sha256:013fa8a3c4c276c24d26d84ce934dc964e2aa794345a0f8c7e5a7191482c8a73"},
{file = "marshmallow-3.26.2.tar.gz", hash = "sha256:bbe2adb5a03e6e3571b573f42527c6fe926e17467833660bebd11593ab8dfd57"},
]
[package.dependencies]
@@ -5678,11 +5729,11 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\" and implementation_name != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[[package]]
name = "pycurl"
@@ -5946,30 +5997,45 @@ testutils = ["gitpython (>3)"]
[[package]]
name = "pynacl"
version = "1.5.0"
version = "1.6.2"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
python-versions = ">=3.6"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
{file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
{file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
{file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
{file = "pynacl-1.6.2-cp314-cp314t-macosx_10_10_universal2.whl", hash = "sha256:622d7b07cc5c02c666795792931b50c91f3ce3c2649762efb1ef0d5684c81594"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d071c6a9a4c94d79eb665db4ce5cedc537faf74f2355e4d502591d850d3913c0"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe9847ca47d287af41e82be1dd5e23023d3c31a951da134121ab02e42ac218c9"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:04316d1fc625d860b6c162fff704eb8426b1a8bcd3abacea11142cbd99a6b574"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44081faff368d6c5553ccf55322ef2819abb40e25afaec7e740f159f74813634"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:a9f9932d8d2811ce1a8ffa79dcbdf3970e7355b5c8eb0c1a881a57e7f7d96e88"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:bc4a36b28dd72fb4845e5d8f9760610588a96d5a51f01d84d8c6ff9849968c14"},
{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3bffb6d0f6becacb6526f8f42adfb5efb26337056ee0831fb9a7044d1a964444"},
{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:2fef529ef3ee487ad8113d287a593fa26f48ee3620d92ecc6f1d09ea38e0709b"},
{file = "pynacl-1.6.2-cp314-cp314t-win32.whl", hash = "sha256:a84bf1c20339d06dc0c85d9aea9637a24f718f375d861b2668b2f9f96fa51145"},
{file = "pynacl-1.6.2-cp314-cp314t-win_amd64.whl", hash = "sha256:320ef68a41c87547c91a8b58903c9caa641ab01e8512ce291085b5fe2fcb7590"},
{file = "pynacl-1.6.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d29bfe37e20e015a7d8b23cfc8bd6aa7909c92a1b8f41ee416bbb3e79ef182b2"},
{file = "pynacl-1.6.2-cp38-abi3-macosx_10_10_universal2.whl", hash = "sha256:c949ea47e4206af7c8f604b8278093b674f7c79ed0d4719cc836902bf4517465"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8845c0631c0be43abdd865511c41eab235e0be69c81dc66a50911594198679b0"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:22de65bb9010a725b0dac248f353bb072969c94fa8d6b1f34b87d7953cf7bbe4"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:46065496ab748469cdd999246d17e301b2c24ae2fdf739132e580a0e94c94a87"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8a66d6fb6ae7661c58995f9c6435bda2b1e68b54b598a6a10247bfcdadac996c"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:26bfcd00dcf2cf160f122186af731ae30ab120c18e8375684ec2670dccd28130"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:c8a231e36ec2cab018c4ad4358c386e36eede0319a0c41fed24f840b1dac59f6"},
{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:68be3a09455743ff9505491220b64440ced8973fe930f270c8e07ccfa25b1f9e"},
{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:8b097553b380236d51ed11356c953bf8ce36a29a3e596e934ecabe76c985a577"},
{file = "pynacl-1.6.2-cp38-abi3-win32.whl", hash = "sha256:5811c72b473b2f38f7e2a3dc4f8642e3a3e9b5e7317266e4ced1fba85cae41aa"},
{file = "pynacl-1.6.2-cp38-abi3-win_amd64.whl", hash = "sha256:62985f233210dee6548c223301b6c25440852e13d59a8b81490203c3227c5ba0"},
{file = "pynacl-1.6.2-cp38-abi3-win_arm64.whl", hash = "sha256:834a43af110f743a754448463e8fd61259cd4ab5bbedcf70f9dabad1d28a394c"},
{file = "pynacl-1.6.2.tar.gz", hash = "sha256:018494d6d696ae03c7e656e5e74cdfd8ea1326962cc401bcf018f1ed8436811c"},
]
[package.dependencies]
cffi = ">=1.4.1"
cffi = {version = ">=2.0.0", markers = "platform_python_implementation != \"PyPy\" and python_version >= \"3.9\""}
[package.extras]
docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
docs = ["sphinx (<7)", "sphinx_rtd_theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
[[package]]
name = "pyopenssl"
@@ -6698,6 +6764,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f66efbc1caa63c088dead1c4170d148eabc9b80d95fb75b6c92ac0aad2437d76"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:22353049ba4181685023b25b5b51a574bce33e7f51c759371a7422dcae5402a6"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:932205970b9f9991b34f55136be327501903f7c66830e9760a8ffb15b07f05cd"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a52d48f4e7bf9005e8f0a89209bf9a73f7190ddf0489eee5eb51377385f59f2a"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win32.whl", hash = "sha256:3eac5a91891ceb88138c113f9db04f3cebdae277f5d44eaa3651a4f573e6a5da"},
{file = "ruamel.yaml.clib-0.2.12-cp310-cp310-win_amd64.whl", hash = "sha256:ab007f2f5a87bd08ab1499bdf96f3d5c6ad4dcfa364884cb4549aa0154b13a28"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-macosx_13_0_arm64.whl", hash = "sha256:4a6679521a58256a90b0d89e03992c15144c5f3858f40d7c18886023d7943db6"},
@@ -6706,6 +6773,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:811ea1594b8a0fb466172c384267a4e5e367298af6b228931f273b111f17ef52"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:cf12567a7b565cbf65d438dec6cfbe2917d3c1bdddfce84a9930b7d35ea59642"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7dd5adc8b930b12c8fc5b99e2d535a09889941aa0d0bd06f4749e9a9397c71d2"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1492a6051dab8d912fc2adeef0e8c72216b24d57bd896ea607cb90bb0c4981d3"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win32.whl", hash = "sha256:bd0a08f0bab19093c54e18a14a10b4322e1eacc5217056f3c063bd2f59853ce4"},
{file = "ruamel.yaml.clib-0.2.12-cp311-cp311-win_amd64.whl", hash = "sha256:a274fb2cb086c7a3dea4322ec27f4cb5cc4b6298adb583ab0e211a4682f241eb"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:20b0f8dc160ba83b6dcc0e256846e1a02d044e13f7ea74a3d1d56ede4e48c632"},
@@ -6714,6 +6782,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:749c16fcc4a2b09f28843cda5a193e0283e47454b63ec4b81eaa2242f50e4ccd"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bf165fef1f223beae7333275156ab2022cffe255dcc51c27f066b4370da81e31"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:32621c177bbf782ca5a18ba4d7af0f1082a3f6e517ac2a18b3974d4edf349680"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b82a7c94a498853aa0b272fd5bc67f29008da798d4f93a2f9f289feb8426a58d"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win32.whl", hash = "sha256:e8c4ebfcfd57177b572e2040777b8abc537cdef58a2120e830124946aa9b42c5"},
{file = "ruamel.yaml.clib-0.2.12-cp312-cp312-win_amd64.whl", hash = "sha256:0467c5965282c62203273b838ae77c0d29d7638c8a4e3a1c8bdd3602c10904e4"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4c8c5d82f50bb53986a5e02d1b3092b03622c02c2eb78e29bec33fd9593bae1a"},
@@ -6722,6 +6791,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:96777d473c05ee3e5e3c3e999f5d23c6f4ec5b0c38c098b3a5229085f74236c6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:3bc2a80e6420ca8b7d3590791e2dfc709c88ab9152c00eeb511c9875ce5778bf"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:e188d2699864c11c36cdfdada94d781fd5d6b0071cd9c427bceb08ad3d7c70e1"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4f6f3eac23941b32afccc23081e1f50612bdbe4e982012ef4f5797986828cd01"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win32.whl", hash = "sha256:6442cb36270b3afb1b4951f060eccca1ce49f3d087ca1ca4563a6eb479cb3de6"},
{file = "ruamel.yaml.clib-0.2.12-cp313-cp313-win_amd64.whl", hash = "sha256:e5b8daf27af0b90da7bb903a876477a9e6d7270be6146906b276605997c7e9a3"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:fc4b630cd3fa2cf7fce38afa91d7cfe844a9f75d7f0f36393fa98815e911d987"},
@@ -6730,6 +6800,7 @@ files = [
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e2f1c3765db32be59d18ab3953f43ab62a761327aafc1594a2a1fbe038b8b8a7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d85252669dc32f98ebcd5d36768f5d4faeaeaa2d655ac0473be490ecdae3c285"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:e143ada795c341b56de9418c58d028989093ee611aa27ffb9b7f609c00d813ed"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2c59aa6170b990d8d2719323e628aaf36f3bfbc1c26279c0eeeb24d05d2d11c7"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win32.whl", hash = "sha256:beffaed67936fbbeffd10966a4eb53c402fafd3d6833770516bf7314bc6ffa12"},
{file = "ruamel.yaml.clib-0.2.12-cp39-cp39-win_amd64.whl", hash = "sha256:040ae85536960525ea62868b642bdb0c2cc6021c9f9d507810c0c604e66f5a7b"},
{file = "ruamel.yaml.clib-0.2.12.tar.gz", hash = "sha256:6c8fbb13ec503f99a91901ab46e0b07ae7941cd527393187039aec586fdfd36f"},
@@ -7220,21 +7291,21 @@ files = [
[[package]]
name = "urllib3"
version = "2.5.0"
version = "2.6.3"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc"},
{file = "urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760"},
{file = "urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4"},
{file = "urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed"},
]
[package.extras]
brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""]
brotli = ["brotli (>=1.2.0) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=1.2.0.0) ; platform_python_implementation != \"CPython\""]
h2 = ["h2 (>=4,<5)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"]
zstd = ["backports-zstd (>=1.0.0) ; python_version < \"3.14\""]
[[package]]
name = "uuid6"
@@ -7303,14 +7374,14 @@ test = ["websockets"]
[[package]]
name = "werkzeug"
version = "3.1.4"
version = "3.1.5"
description = "The comprehensive WSGI web application library."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "werkzeug-3.1.4-py3-none-any.whl", hash = "sha256:2ad50fb9ed09cc3af22c54698351027ace879a0b60a3b5edf5730b2f7d876905"},
{file = "werkzeug-3.1.4.tar.gz", hash = "sha256:cd3cd98b1b92dc3b7b3995038826c68097dcb16f9baa63abe35f20eafeb9fe5e"},
{file = "werkzeug-3.1.5-py3-none-any.whl", hash = "sha256:5111e36e91086ece91f93268bb39b4a35c1e6f1feac762c9c822ded0a4e322dc"},
{file = "werkzeug-3.1.5.tar.gz", hash = "sha256:6a548b0e88955dd07ccb25539d7d0cc97417ee9e179677d22c7041c8f078ce67"},
]
[package.dependencies]
@@ -37,6 +37,7 @@ from api.models import (
PermissionChoices,
Processor,
Provider,
ProviderComplianceScore,
ProviderGroup,
ProviderSecret,
Resource,
@@ -92,10 +93,62 @@ class ChoiceInFilter(BaseInFilter, ChoiceFilter):
pass
class BaseProviderFilter(FilterSet):
"""
Abstract base filter for models with direct FK to Provider.
Provides standard provider_id and provider_type filters.
Subclasses must define Meta.model.
"""
provider_id = UUIDFilter(field_name="provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
abstract = True
fields = {}
class BaseScanProviderFilter(FilterSet):
"""
Abstract base filter for models with FK to Scan (and Scan has FK to Provider).
Provides standard provider_id and provider_type filters via scan relationship.
Subclasses must define Meta.model.
"""
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
abstract = True
fields = {}
class CommonFindingFilters(FilterSet):
# We filter providers from the scan in findings
# Both 'provider' and 'provider_id' parameters are supported for API consistency
# Frontend uses 'provider_id' uniformly across all endpoints
provider = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
choices=Provider.ProviderChoices.choices, field_name="scan__provider__provider"
)
@@ -1086,39 +1139,25 @@ class ThreatScoreSnapshotFilter(FilterSet):
}
class AttackSurfaceOverviewFilter(FilterSet):
class AttackSurfaceOverviewFilter(BaseScanProviderFilter):
"""Filter for attack surface overview aggregations by provider."""
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class Meta:
class Meta(BaseScanProviderFilter.Meta):
model = AttackSurfaceOverview
fields = {}
class CategoryOverviewFilter(FilterSet):
provider_id = UUIDFilter(field_name="scan__provider__id", lookup_expr="exact")
provider_id__in = UUIDInFilter(field_name="scan__provider__id", lookup_expr="in")
provider_type = ChoiceFilter(
field_name="scan__provider__provider", choices=Provider.ProviderChoices.choices
)
provider_type__in = ChoiceInFilter(
field_name="scan__provider__provider",
choices=Provider.ProviderChoices.choices,
lookup_expr="in",
)
class CategoryOverviewFilter(BaseScanProviderFilter):
"""Filter for category overview aggregations by provider."""
category = CharFilter(field_name="category", lookup_expr="exact")
category__in = CharInFilter(field_name="category", lookup_expr="in")
class Meta:
class Meta(BaseScanProviderFilter.Meta):
model = ScanCategorySummary
fields = {}
class ComplianceWatchlistFilter(BaseProviderFilter):
"""Filter for compliance watchlist overview by provider."""
class Meta(BaseProviderFilter.Meta):
model = ProviderComplianceScore
@@ -0,0 +1,94 @@
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.db_utils
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0065_alibabacloud_provider"),
]
operations = [
migrations.CreateModel(
name="ProviderComplianceScore",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("compliance_id", models.TextField()),
("requirement_id", models.TextField()),
(
"requirement_status",
api.db_utils.StatusEnumField(
choices=[
("FAIL", "Fail"),
("PASS", "Pass"),
("MANUAL", "Manual"),
]
),
),
("scan_completed_at", models.DateTimeField()),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
to="api.provider",
),
),
(
"scan",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
to="api.scan",
),
),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.tenant",
),
),
],
options={
"db_table": "provider_compliance_scores",
"abstract": False,
},
),
migrations.AddConstraint(
model_name="providercompliancescore",
constraint=models.UniqueConstraint(
fields=("tenant_id", "provider_id", "compliance_id", "requirement_id"),
name="unique_provider_compliance_req",
),
),
migrations.AddConstraint(
model_name="providercompliancescore",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_providercompliancescore",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
migrations.AddIndex(
model_name="providercompliancescore",
index=models.Index(
fields=["tenant_id", "provider_id", "compliance_id"],
name="pcs_tenant_prov_comp_idx",
),
),
]
@@ -0,0 +1,61 @@
import uuid
import django.db.models.deletion
from django.db import migrations, models
import api.rls
class Migration(migrations.Migration):
dependencies = [
("api", "0066_provider_compliance_score"),
]
operations = [
migrations.CreateModel(
name="TenantComplianceSummary",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("compliance_id", models.TextField()),
("requirements_passed", models.IntegerField(default=0)),
("requirements_failed", models.IntegerField(default=0)),
("requirements_manual", models.IntegerField(default=0)),
("total_requirements", models.IntegerField(default=0)),
("updated_at", models.DateTimeField(auto_now=True)),
(
"tenant",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="api.tenant",
),
),
],
options={
"db_table": "tenant_compliance_summaries",
"abstract": False,
},
),
migrations.AddConstraint(
model_name="tenantcompliancesummary",
constraint=models.UniqueConstraint(
fields=("tenant_id", "compliance_id"),
name="unique_tenant_compliance_summary",
),
),
migrations.AddConstraint(
model_name="tenantcompliancesummary",
constraint=api.rls.RowLevelSecurityConstraint(
"tenant_id",
name="rls_on_tenantcompliancesummary",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
),
]
@@ -2605,3 +2605,92 @@ class AttackSurfaceOverview(RowLevelSecurityProtectedModel):
class JSONAPIMeta:
resource_name = "attack-surface-overviews"
class ProviderComplianceScore(RowLevelSecurityProtectedModel):
"""
Compliance requirement status from latest completed scan per provider.
Used for efficient compliance watchlist queries with FAIL-dominant aggregation
across multiple providers. Updated via atomic upsert after each scan completion.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
scan = models.ForeignKey(
Scan,
on_delete=models.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
)
provider = models.ForeignKey(
Provider,
on_delete=models.CASCADE,
related_name="compliance_scores",
related_query_name="compliance_score",
)
compliance_id = models.TextField()
requirement_id = models.TextField()
requirement_status = StatusEnumField(choices=StatusChoices)
scan_completed_at = models.DateTimeField()
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "provider_compliance_scores"
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "provider_id", "compliance_id", "requirement_id"),
name="unique_provider_compliance_req",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
indexes = [
models.Index(
fields=["tenant_id", "provider_id", "compliance_id"],
name="pcs_tenant_prov_comp_idx",
),
]
class TenantComplianceSummary(RowLevelSecurityProtectedModel):
"""
Pre-aggregated compliance counts per tenant with FAIL-dominant logic applied.
One row per (tenant, compliance_id). Used for fast watchlist queries when
no provider filter is applied. Recalculated after each scan by aggregating
across all providers with FAIL-dominant logic at requirement level.
"""
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
compliance_id = models.TextField()
requirements_passed = models.IntegerField(default=0)
requirements_failed = models.IntegerField(default=0)
requirements_manual = models.IntegerField(default=0)
total_requirements = models.IntegerField(default=0)
updated_at = models.DateTimeField(auto_now=True)
class Meta(RowLevelSecurityProtectedModel.Meta):
db_table = "tenant_compliance_summaries"
constraints = [
models.UniqueConstraint(
fields=("tenant_id", "compliance_id"),
name="unique_tenant_compliance_summary",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
statements=["SELECT", "INSERT", "UPDATE", "DELETE"],
),
]
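The FAIL-dominant aggregation described in both model docstrings can be sketched in plain Python. This is a minimal illustration of the rule only (any FAIL across providers marks a requirement failed, then MANUAL, otherwise PASS); the function and variable names are hypothetical and not part of this PR:

```python
from collections import Counter

def fail_dominant_status(statuses: list[str]) -> str:
    """Collapse one requirement's statuses across providers:
    any FAIL wins, then MANUAL, otherwise PASS."""
    if "FAIL" in statuses:
        return "FAIL"
    if "MANUAL" in statuses:
        return "MANUAL"
    return "PASS"

def summarize(rows: list[tuple[str, str]]) -> Counter:
    """rows: (requirement_id, status) pairs for one compliance_id."""
    per_req: dict[str, list[str]] = {}
    for req_id, req_status in rows:
        per_req.setdefault(req_id, []).append(req_status)
    # One collapsed status per requirement, then count by status.
    return Counter(fail_dominant_status(s) for s in per_req.values())

# Two providers report on req_1: the single FAIL dominates the PASS.
counts = summarize([("req_1", "PASS"), ("req_1", "FAIL"), ("req_2", "MANUAL")])
# counts == Counter({"FAIL": 1, "MANUAL": 1})
```

This is the same collapse the watchlist endpoint performs per `(compliance_id, requirement_id)` group before counting.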
File diff suppressed because it is too large
@@ -1,9 +1,21 @@
from datetime import datetime, timezone
import pytest
from allauth.socialaccount.models import SocialApp
from django.core.exceptions import ValidationError
from django.db import IntegrityError
from api.db_router import MainRouter
from api.models import Resource, ResourceTag, SAMLConfiguration, SAMLDomainIndex
from api.models import (
ProviderComplianceScore,
Resource,
ResourceTag,
SAMLConfiguration,
SAMLDomainIndex,
StateChoices,
StatusChoices,
TenantComplianceSummary,
)
@pytest.mark.django_db
@@ -324,3 +336,159 @@ class TestSAMLConfigurationModel:
errors = exc_info.value.message_dict
assert "metadata_xml" in errors
assert "There is a problem with your metadata." in errors["metadata_xml"][0]
@pytest.mark.django_db
class TestProviderComplianceScoreModel:
def test_create_provider_compliance_score(self, providers_fixture, scans_fixture):
provider = providers_fixture[0]
scan = scans_fixture[0]
scan.completed_at = datetime.now(timezone.utc)
scan.save()
score = ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan.completed_at,
)
assert score.compliance_id == "aws_cis_2.0"
assert score.requirement_id == "req_1"
assert score.requirement_status == StatusChoices.PASS
def test_unique_constraint_per_provider_compliance_requirement(
self, providers_fixture, scans_fixture
):
provider = providers_fixture[0]
scan = scans_fixture[0]
scan.completed_at = datetime.now(timezone.utc)
scan.save()
ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan.completed_at,
)
with pytest.raises(IntegrityError):
ProviderComplianceScore.objects.create(
tenant_id=provider.tenant_id,
provider=provider,
scan=scan,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan.completed_at,
)
def test_different_providers_same_requirement_allowed(
self, providers_fixture, scans_fixture
):
provider1, provider2, *_ = providers_fixture
scan1 = scans_fixture[0]
scan1.completed_at = datetime.now(timezone.utc)
scan1.save()
scan2 = scans_fixture[2]
scan2.state = StateChoices.COMPLETED
scan2.completed_at = datetime.now(timezone.utc)
scan2.save()
score1 = ProviderComplianceScore.objects.create(
tenant_id=provider1.tenant_id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
)
score2 = ProviderComplianceScore.objects.create(
tenant_id=provider2.tenant_id,
provider=provider2,
scan=scan2,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan2.completed_at,
)
assert score1.id != score2.id
assert score1.requirement_status != score2.requirement_status
@pytest.mark.django_db
class TestTenantComplianceSummaryModel:
def test_create_tenant_compliance_summary(self, tenants_fixture):
tenant = tenants_fixture[0]
summary = TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
assert summary.compliance_id == "aws_cis_2.0"
assert summary.requirements_passed == 5
assert summary.requirements_failed == 2
assert summary.requirements_manual == 1
assert summary.total_requirements == 8
assert summary.updated_at is not None
def test_unique_constraint_per_tenant_compliance(self, tenants_fixture):
tenant = tenants_fixture[0]
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
with pytest.raises(IntegrityError):
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=3,
requirements_failed=4,
requirements_manual=1,
total_requirements=8,
)
def test_different_tenants_same_compliance_allowed(self, tenants_fixture):
tenant1, tenant2, *_ = tenants_fixture
summary1 = TenantComplianceSummary.objects.create(
tenant_id=tenant1.id,
compliance_id="aws_cis_2.0",
requirements_passed=5,
requirements_failed=2,
requirements_manual=1,
total_requirements=8,
)
summary2 = TenantComplianceSummary.objects.create(
tenant_id=tenant2.id,
compliance_id="aws_cis_2.0",
requirements_passed=3,
requirements_failed=4,
requirements_manual=1,
total_requirements=8,
)
assert summary1.id != summary2.id
assert summary1.requirements_passed != summary2.requirements_passed
@@ -4110,6 +4110,37 @@ class TestFindingViewSet:
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 2
def test_finding_filter_by_provider_id_alias(
self, authenticated_client, findings_fixture
):
"""Test that provider_id filter alias works identically to provider filter."""
response = authenticated_client.get(
reverse("finding-list"),
{
"filter[provider_id]": findings_fixture[0].scan.provider.id,
"filter[inserted_at]": TODAY,
},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 2
def test_finding_filter_by_provider_id_in_alias(
self, authenticated_client, findings_fixture
):
"""Test that provider_id__in filter alias works identically to provider__in filter."""
response = authenticated_client.get(
reverse("finding-list"),
{
"filter[provider_id__in]": [
findings_fixture[0].scan.provider.id,
findings_fixture[1].scan.provider.id,
],
"filter[inserted_at]": TODAY,
},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 2
@pytest.mark.parametrize(
"filter_name",
(
@@ -4331,6 +4362,28 @@ class TestFindingViewSet:
== latest_scan_finding.status
)
def test_findings_latest_filter_by_provider_id_alias(
self, authenticated_client, latest_scan_finding
):
"""Test that provider_id filter alias works on latest findings endpoint."""
response = authenticated_client.get(
reverse("finding-latest"),
{"filter[provider_id]": latest_scan_finding.scan.provider.id},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
def test_findings_latest_filter_by_provider_id_in_alias(
self, authenticated_client, latest_scan_finding
):
"""Test that provider_id__in filter alias works on latest findings endpoint."""
response = authenticated_client.get(
reverse("finding-latest"),
{"filter[provider_id__in]": str(latest_scan_finding.scan.provider.id)},
)
assert response.status_code == status.HTTP_200_OK
assert len(response.json()["data"]) == 1
def test_findings_metadata_latest(self, authenticated_client, latest_scan_finding):
response = authenticated_client.get(
reverse("finding-metadata_latest"),
@@ -7956,6 +8009,97 @@ class TestOverviewViewSet:
assert data[0]["attributes"]["failed_findings"] == 13
assert data[0]["attributes"]["new_failed_findings"] == 5
def test_compliance_watchlist_no_filters_uses_tenant_summary(
self, authenticated_client, tenant_compliance_summary_fixture
):
response = authenticated_client.get(reverse("overview-compliance-watchlist"))
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert len(data) == 2
by_id = {item["id"]: item["attributes"] for item in data}
assert "aws_cis_2.0" in by_id
assert by_id["aws_cis_2.0"]["requirements_passed"] == 1
assert by_id["aws_cis_2.0"]["requirements_failed"] == 2
assert by_id["aws_cis_2.0"]["requirements_manual"] == 1
assert by_id["aws_cis_2.0"]["total_requirements"] == 4
assert "gdpr_aws" in by_id
assert by_id["gdpr_aws"]["requirements_passed"] == 5
assert by_id["gdpr_aws"]["requirements_failed"] == 0
assert by_id["gdpr_aws"]["total_requirements"] == 7
def test_compliance_watchlist_with_provider_filter_uses_provider_scores(
self,
authenticated_client,
provider_compliance_scores_fixture,
providers_fixture,
):
provider1 = providers_fixture[0]
url = f"{reverse('overview-compliance-watchlist')}?filter[provider_id]={provider1.id}"
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert len(data) == 2
by_id = {item["id"]: item["attributes"] for item in data}
assert by_id["aws_cis_2.0"]["requirements_passed"] == 1
assert by_id["aws_cis_2.0"]["requirements_failed"] == 1
assert by_id["aws_cis_2.0"]["requirements_manual"] == 1
assert by_id["aws_cis_2.0"]["total_requirements"] == 3
def test_compliance_watchlist_fail_dominant_logic(
self, authenticated_client, provider_compliance_scores_fixture
):
response = authenticated_client.get(
f"{reverse('overview-compliance-watchlist')}?filter[provider_type]=aws"
)
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
by_id = {item["id"]: item["attributes"] for item in data}
aws_cis = by_id["aws_cis_2.0"]
assert aws_cis["requirements_failed"] == 2
assert aws_cis["requirements_passed"] == 0
assert aws_cis["requirements_manual"] == 1
assert aws_cis["total_requirements"] == 3
def test_compliance_watchlist_provider_id_in_filter(
self,
authenticated_client,
provider_compliance_scores_fixture,
providers_fixture,
):
provider1, provider2, *_ = providers_fixture
url = (
f"{reverse('overview-compliance-watchlist')}"
f"?filter[provider_id__in]={provider1.id},{provider2.id}"
)
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert len(data) >= 1
def test_compliance_watchlist_empty_result(self, authenticated_client):
response = authenticated_client.get(reverse("overview-compliance-watchlist"))
assert response.status_code == status.HTTP_200_OK
data = response.json()["data"]
assert data == []
@pytest.mark.parametrize(
"invalid_provider_type",
["invalid", "not_a_provider", "AWS", "awss"],
)
def test_compliance_watchlist_invalid_provider_type_filter(
self, authenticated_client, invalid_provider_type
):
url = f"{reverse('overview-compliance-watchlist')}?filter[provider_type]={invalid_provider_type}"
response = authenticated_client.get(url)
assert response.status_code == status.HTTP_400_BAD_REQUEST
@pytest.mark.django_db
class TestScheduleViewSet:
@@ -2303,6 +2303,20 @@ class CategoryOverviewSerializer(BaseSerializerV1):
resource_name = "category-overviews"
class ComplianceWatchlistOverviewSerializer(BaseSerializerV1):
"""Serializer for compliance watchlist overview with FAIL-dominant aggregation."""
id = serializers.CharField(source="compliance_id")
compliance_id = serializers.CharField()
requirements_passed = serializers.IntegerField()
requirements_failed = serializers.IntegerField()
requirements_manual = serializers.IntegerField()
total_requirements = serializers.IntegerField()
class JSONAPIMeta:
resource_name = "compliance-watchlist-overviews"
class OverviewRegionSerializer(serializers.Serializer):
id = serializers.SerializerMethodField()
provider_type = serializers.CharField()
@@ -101,6 +101,7 @@ from api.filters import (
AttackSurfaceOverviewFilter,
CategoryOverviewFilter,
ComplianceOverviewFilter,
ComplianceWatchlistFilter,
CustomDjangoFilterBackend,
DailySeveritySummaryFilter,
FindingFilter,
@@ -144,6 +145,7 @@ from api.models import (
MuteRule,
Processor,
Provider,
ProviderComplianceScore,
ProviderGroup,
ProviderGroupMembership,
ProviderSecret,
@@ -163,6 +165,7 @@ from api.models import (
StateChoices,
Task,
TenantAPIKey,
TenantComplianceSummary,
ThreatScoreSnapshot,
User,
UserRoleRelationship,
@@ -185,6 +188,7 @@ from api.v1.serializers import (
ComplianceOverviewDetailThreatscoreSerializer,
ComplianceOverviewMetadataSerializer,
ComplianceOverviewSerializer,
ComplianceWatchlistOverviewSerializer,
FindingDynamicFilterSerializer,
FindingMetadataSerializer,
FindingSerializer,
@@ -4142,6 +4146,8 @@ class OverviewViewSet(BaseRLSViewSet):
return AttackSurfaceOverviewSerializer
elif self.action == "categories":
return CategoryOverviewSerializer
elif self.action == "compliance_watchlist":
return ComplianceWatchlistOverviewSerializer
return super().get_serializer_class()
def get_filterset_class(self):
@@ -4157,6 +4163,8 @@ class OverviewViewSet(BaseRLSViewSet):
return CategoryOverviewFilter
elif self.action == "attack_surface":
return AttackSurfaceOverviewFilter
elif self.action == "compliance_watchlist":
return ComplianceWatchlistFilter
return None
def filter_queryset(self, queryset):
@@ -4240,6 +4248,8 @@ class OverviewViewSet(BaseRLSViewSet):
self.request.query_params, exclude_keys=set(exclude_keys or [])
)
filterset = filterset_class(normalized_params, queryset=queryset)
if not filterset.is_valid():
raise ValidationError(filterset.errors)
return filterset.qs
def _latest_scan_ids_for_allowed_providers(self, tenant_id, provider_filters=None):
@@ -4256,9 +4266,10 @@ class OverviewViewSet(BaseRLSViewSet):
)
def _extract_provider_filters_from_params(self):
"""Extract provider filters from query params to apply on Scan queryset."""
"""Extract and validate provider filters from query params."""
params = self.request.query_params
filters = {}
valid_provider_types = {c[0] for c in Provider.ProviderChoices.choices}
provider_id = params.get("filter[provider_id]")
if provider_id:
@@ -4270,11 +4281,21 @@ class OverviewViewSet(BaseRLSViewSet):
provider_type = params.get("filter[provider_type]")
if provider_type:
if provider_type not in valid_provider_types:
raise ValidationError(
{"provider_type": f"Invalid choice: {provider_type}"}
)
filters["provider__provider"] = provider_type
provider_type_in = params.get("filter[provider_type__in]")
if provider_type_in:
filters["provider__provider__in"] = provider_type_in.split(",")
types = provider_type_in.split(",")
invalid = [t for t in types if t not in valid_provider_types]
if invalid:
raise ValidationError(
{"provider_type__in": f"Invalid choices: {', '.join(invalid)}"}
)
filters["provider__provider__in"] = types
return filters
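The `provider_type__in` validation above can be exercised in isolation. The sketch below is a stand-alone version of the same check; `VALID_PROVIDER_TYPES` is an assumed subset of `Provider.ProviderChoices` and `parse_provider_type_in` is a hypothetical helper, not code from this PR:

```python
VALID_PROVIDER_TYPES = {"aws", "azure", "gcp", "kubernetes", "m365"}  # assumed subset

def parse_provider_type_in(raw: str) -> list[str]:
    """Split a comma-separated filter value and reject unknown provider
    types, mirroring the ValidationError raised in the view."""
    types = raw.split(",")
    invalid = [t for t in types if t not in VALID_PROVIDER_TYPES]
    if invalid:
        raise ValueError(f"Invalid choices: {', '.join(invalid)}")
    return types

print(parse_provider_type_in("aws,gcp"))  # ['aws', 'gcp']
```

Values such as `"AWS"` or `"awss"` (as in the parametrized test below) are rejected before any filtering is applied, which is what surfaces as the HTTP 400 in `test_compliance_watchlist_invalid_provider_type_filter`.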
@@ -4984,6 +5005,92 @@ class OverviewViewSet(BaseRLSViewSet):
status=status.HTTP_200_OK,
)
@action(
detail=False,
methods=["get"],
url_name="compliance-watchlist",
url_path="compliance-watchlist",
)
def compliance_watchlist(self, request):
"""
Get compliance watchlist overview with FAIL-dominant aggregation.
Without filters: uses pre-aggregated TenantComplianceSummary (~70 rows).
With provider filters: queries ProviderComplianceScore with FAIL-dominant logic.
"""
tenant_id = request.tenant_id
rbac_filter = self._get_provider_filter()
query_params = request.query_params
has_provider_filter = any(
key.startswith("filter[provider") for key in query_params.keys()
)
has_rbac_restriction = bool(rbac_filter)
if not has_provider_filter and not has_rbac_restriction:
response_data = list(
TenantComplianceSummary.objects.filter(tenant_id=tenant_id)
.values(
"compliance_id",
"requirements_passed",
"requirements_failed",
"requirements_manual",
"total_requirements",
)
.order_by("compliance_id")
)
else:
base_queryset = ProviderComplianceScore.objects.filter(
tenant_id=tenant_id, **rbac_filter
)
filtered_queryset = self._apply_filterset(
base_queryset, ComplianceWatchlistFilter
)
aggregation = (
filtered_queryset.values("compliance_id", "requirement_id")
.annotate(
has_fail=Sum(
Case(When(requirement_status="FAIL", then=1), default=0)
),
has_manual=Sum(
Case(When(requirement_status="MANUAL", then=1), default=0)
),
)
.values("compliance_id", "requirement_id", "has_fail", "has_manual")
)
compliance_data = defaultdict(
lambda: {
"requirements_passed": 0,
"requirements_failed": 0,
"requirements_manual": 0,
"total_requirements": 0,
}
)
for row in aggregation:
cid = row["compliance_id"]
compliance_data[cid]["total_requirements"] += 1
if row["has_fail"] and row["has_fail"] > 0:
compliance_data[cid]["requirements_failed"] += 1
elif row["has_manual"] and row["has_manual"] > 0:
compliance_data[cid]["requirements_manual"] += 1
else:
compliance_data[cid]["requirements_passed"] += 1
response_data = [
{"compliance_id": cid, **data}
for cid, data in sorted(compliance_data.items())
]
return Response(
self.get_serializer(response_data, many=True).data,
status=status.HTTP_200_OK,
)
@extend_schema(tags=["Schedule"])
@extend_schema_view(
@@ -30,6 +30,7 @@ from api.models import (
MuteRule,
Processor,
Provider,
ProviderComplianceScore,
ProviderGroup,
ProviderSecret,
Resource,
@@ -45,6 +46,7 @@ from api.models import (
StatusChoices,
Task,
TenantAPIKey,
TenantComplianceSummary,
User,
UserRoleRelationship,
)
@@ -1631,6 +1633,108 @@ def get_authorization_header(access_token: str) -> dict:
return {"Authorization": f"Bearer {access_token}"}
@pytest.fixture
def provider_compliance_scores_fixture(
tenants_fixture, providers_fixture, scans_fixture
):
"""Create ProviderComplianceScore entries for compliance watchlist tests."""
tenant = tenants_fixture[0]
provider1, provider2, *_ = providers_fixture
scan1, _, scan3 = scans_fixture
scan1.completed_at = datetime.now(timezone.utc) - timedelta(hours=1)
scan1.save()
scan3.state = StateChoices.COMPLETED
scan3.completed_at = datetime.now(timezone.utc)
scan3.save()
scores = [
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_2",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="aws_cis_2.0",
requirement_id="req_3",
requirement_status=StatusChoices.MANUAL,
scan_completed_at=scan1.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider2,
scan=scan3,
compliance_id="aws_cis_2.0",
requirement_id="req_1",
requirement_status=StatusChoices.FAIL,
scan_completed_at=scan3.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider2,
scan=scan3,
compliance_id="aws_cis_2.0",
requirement_id="req_2",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan3.completed_at,
),
ProviderComplianceScore.objects.create(
tenant_id=tenant.id,
provider=provider1,
scan=scan1,
compliance_id="gdpr_aws",
requirement_id="gdpr_req_1",
requirement_status=StatusChoices.PASS,
scan_completed_at=scan1.completed_at,
),
]
return scores
@pytest.fixture
def tenant_compliance_summary_fixture(tenants_fixture):
"""Create TenantComplianceSummary entries for compliance watchlist tests."""
tenant = tenants_fixture[0]
summaries = [
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="aws_cis_2.0",
requirements_passed=1,
requirements_failed=2,
requirements_manual=1,
total_requirements=4,
),
TenantComplianceSummary.objects.create(
tenant_id=tenant.id,
compliance_id="gdpr_aws",
requirements_passed=5,
requirements_failed=0,
requirements_manual=2,
total_requirements=7,
),
]
return summaries
def pytest_collection_modifyitems(items):
"""Ensure test_rbac.py is executed first."""
items.sort(key=lambda item: 0 if "test_rbac.py" in item.nodeid else 1)
+126 -2
@@ -1,17 +1,28 @@
from collections import defaultdict
from datetime import timedelta
from celery.utils.log import get_task_logger
from django.db.models import Sum
from django.utils import timezone
from tasks.jobs.queries import (
COMPLIANCE_UPSERT_PROVIDER_SCORE_SQL,
COMPLIANCE_UPSERT_TENANT_SUMMARY_ALL_SQL,
)
from tasks.jobs.scan import aggregate_category_counts
from api.db_router import READ_REPLICA_ALIAS
from api.db_utils import rls_transaction
from api.db_router import READ_REPLICA_ALIAS, MainRouter
from api.db_utils import (
POSTGRES_TENANT_VAR,
SET_CONFIG_QUERY,
psycopg_connection,
rls_transaction,
)
from api.models import (
ComplianceOverviewSummary,
ComplianceRequirementOverview,
DailySeveritySummary,
Finding,
ProviderComplianceScore,
Resource,
ResourceFindingMapping,
ResourceScanSummary,
@@ -21,6 +32,8 @@ from api.models import (
StateChoices,
)
logger = get_task_logger(__name__)
def backfill_resource_scan_summaries(tenant_id: str, scan_id: str):
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
@@ -341,3 +354,114 @@ def backfill_scan_category_summaries(tenant_id: str, scan_id: str):
)
return {"status": "backfilled", "categories_count": len(category_counts)}
def backfill_provider_compliance_scores(tenant_id: str) -> dict:
"""
Backfill ProviderComplianceScore from latest completed scan per provider.
For each provider with completed scans, finds the most recent scan and
upserts compliance requirement statuses with FAIL-dominant aggregation.
Args:
tenant_id: Target tenant UUID
Returns:
dict: Statistics about the backfill operation
"""
with rls_transaction(tenant_id, using=READ_REPLICA_ALIAS):
completed_scans = Scan.all_objects.filter(
tenant_id=tenant_id,
state=StateChoices.COMPLETED,
completed_at__isnull=False,
)
if not completed_scans.exists():
return {"status": "no completed scans"}
existing_providers = set(
ProviderComplianceScore.objects.filter(tenant_id=tenant_id)
.values_list("provider_id", flat=True)
.distinct()
)
if existing_providers:
completed_scans = completed_scans.exclude(
provider_id__in=existing_providers
)
scan_info = list(
completed_scans.order_by("provider_id", "-completed_at")
.distinct("provider_id")
.values("id", "provider_id", "completed_at")
)
if not scan_info:
return {"status": "no scans to process"}
total_upserted = 0
providers_processed = 0
providers_skipped = 0
for scan in scan_info:
provider_id = scan["provider_id"]
scan_id = scan["id"]
try:
with psycopg_connection(MainRouter.default_db) as connection:
connection.autocommit = False
try:
with connection.cursor() as cursor:
cursor.execute(
SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id]
)
cursor.execute(
COMPLIANCE_UPSERT_PROVIDER_SCORE_SQL,
[tenant_id, str(scan_id)],
)
upserted = cursor.rowcount
connection.commit()
total_upserted += upserted
providers_processed += 1
except Exception:
connection.rollback()
raise
except Exception as e:
providers_skipped += 1
logger.exception(
"Error backfilling provider %s for tenant %s: %s",
provider_id,
tenant_id,
e,
)
# Recalculate tenant summary after all providers are backfilled
if providers_processed > 0:
with psycopg_connection(MainRouter.default_db) as connection:
connection.autocommit = False
try:
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])
# Advisory lock to prevent race conditions
cursor.execute(
"SELECT pg_advisory_xact_lock(hashtext(%s))", [tenant_id]
)
cursor.execute(
COMPLIANCE_UPSERT_TENANT_SUMMARY_ALL_SQL,
[tenant_id, tenant_id],
)
tenant_summary_count = cursor.rowcount
connection.commit()
except Exception:
connection.rollback()
raise
else:
tenant_summary_count = 0
return {
"status": "backfilled",
"providers_processed": providers_processed,
"providers_skipped": providers_skipped,
"total_upserted": total_upserted,
"tenant_summary_count": tenant_summary_count,
}
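The `order_by("provider_id", "-completed_at").distinct("provider_id")` queryset above relies on PostgreSQL's `DISTINCT ON` to pick one scan per provider. A plain-Python sketch of the same selection logic (names are illustrative, not from the codebase):

```python
def latest_scan_per_provider(scans):
    """Pick the most recent completed scan per provider.

    Mirrors order_by("provider_id", "-completed_at").distinct("provider_id"):
    for each provider_id, keep only the scan with the latest completed_at.
    """
    latest = {}
    # Walk scans newest-first; the first scan seen per provider wins.
    for scan in sorted(scans, key=lambda s: s["completed_at"], reverse=True):
        latest.setdefault(scan["provider_id"], scan)
    return list(latest.values())
```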
+134
@@ -0,0 +1,134 @@
"""
Shared SQL queries for tasks.
This module centralizes raw SQL queries used across multiple task modules
to ensure consistency and maintainability.
"""
# =============================================================================
# COMPLIANCE SCORE QUERIES
# =============================================================================
# Upsert provider compliance scores from a scan's compliance requirements.
# Uses FAIL-dominant aggregation: FAIL > MANUAL > PASS
# Parameters: [tenant_id, scan_id]
COMPLIANCE_UPSERT_PROVIDER_SCORE_SQL = """
INSERT INTO provider_compliance_scores
(id, tenant_id, provider_id, scan_id, compliance_id, requirement_id,
requirement_status, scan_completed_at)
SELECT
gen_random_uuid(),
agg.tenant_id,
agg.provider_id,
agg.scan_id,
agg.compliance_id,
agg.requirement_id,
agg.requirement_status,
agg.completed_at
FROM (
SELECT DISTINCT ON (cro.compliance_id, cro.requirement_id)
cro.tenant_id,
s.provider_id,
cro.scan_id,
cro.compliance_id,
cro.requirement_id,
(CASE
WHEN bool_or(cro.requirement_status = 'FAIL')
OVER (PARTITION BY cro.compliance_id, cro.requirement_id) THEN 'FAIL'
WHEN bool_or(cro.requirement_status = 'MANUAL')
OVER (PARTITION BY cro.compliance_id, cro.requirement_id) THEN 'MANUAL'
ELSE 'PASS'
END)::status as requirement_status,
s.completed_at
FROM compliance_requirements_overviews cro
JOIN scans s ON s.id = cro.scan_id
WHERE cro.tenant_id = %s AND cro.scan_id = %s
ORDER BY cro.compliance_id, cro.requirement_id
) agg
ON CONFLICT (tenant_id, provider_id, compliance_id, requirement_id)
DO UPDATE SET
requirement_status = EXCLUDED.requirement_status,
scan_id = EXCLUDED.scan_id,
scan_completed_at = EXCLUDED.scan_completed_at
WHERE EXCLUDED.scan_completed_at > provider_compliance_scores.scan_completed_at
"""
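The FAIL-dominant `CASE` in the query above collapses all rows for a `(compliance_id, requirement_id)` pair into one status. In plain Python (an illustrative sketch, not part of the migration):

```python
def fail_dominant(statuses):
    """Collapse a set of requirement statuses into one.

    Mirrors the SQL CASE: any FAIL wins, then MANUAL, otherwise PASS.
    """
    if "FAIL" in statuses:
        return "FAIL"
    if "MANUAL" in statuses:
        return "MANUAL"
    return "PASS"
```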
# Upsert tenant compliance summary for specific compliance IDs.
# Aggregates across all providers with FAIL-dominant logic at requirement level.
# Parameters: [tenant_id, tenant_id, compliance_ids_array]
COMPLIANCE_UPSERT_TENANT_SUMMARY_SQL = """
INSERT INTO tenant_compliance_summaries
(id, tenant_id, compliance_id,
requirements_passed, requirements_failed, requirements_manual,
total_requirements, updated_at)
SELECT
gen_random_uuid(),
%s as tenant_id,
compliance_id,
COUNT(*) FILTER (WHERE req_status = 'PASS') as requirements_passed,
COUNT(*) FILTER (WHERE req_status = 'FAIL') as requirements_failed,
COUNT(*) FILTER (WHERE req_status = 'MANUAL') as requirements_manual,
COUNT(*) as total_requirements,
NOW() as updated_at
FROM (
SELECT
compliance_id,
requirement_id,
CASE
WHEN bool_or(requirement_status = 'FAIL') THEN 'FAIL'
WHEN bool_or(requirement_status = 'MANUAL') THEN 'MANUAL'
ELSE 'PASS'
END as req_status
FROM provider_compliance_scores
WHERE tenant_id = %s AND compliance_id = ANY(%s)
GROUP BY compliance_id, requirement_id
) req_agg
GROUP BY compliance_id
ON CONFLICT (tenant_id, compliance_id)
DO UPDATE SET
requirements_passed = EXCLUDED.requirements_passed,
requirements_failed = EXCLUDED.requirements_failed,
requirements_manual = EXCLUDED.requirements_manual,
total_requirements = EXCLUDED.total_requirements,
updated_at = NOW()
"""
# Upsert tenant compliance summary for ALL compliance IDs in tenant.
# Used by backfill when recalculating entire tenant summary.
# Parameters: [tenant_id, tenant_id]
COMPLIANCE_UPSERT_TENANT_SUMMARY_ALL_SQL = """
INSERT INTO tenant_compliance_summaries
(id, tenant_id, compliance_id,
requirements_passed, requirements_failed, requirements_manual,
total_requirements, updated_at)
SELECT
gen_random_uuid(),
%s as tenant_id,
compliance_id,
COUNT(*) FILTER (WHERE req_status = 'PASS') as requirements_passed,
COUNT(*) FILTER (WHERE req_status = 'FAIL') as requirements_failed,
COUNT(*) FILTER (WHERE req_status = 'MANUAL') as requirements_manual,
COUNT(*) as total_requirements,
NOW() as updated_at
FROM (
SELECT
compliance_id,
requirement_id,
CASE
WHEN bool_or(requirement_status = 'FAIL') THEN 'FAIL'
WHEN bool_or(requirement_status = 'MANUAL') THEN 'MANUAL'
ELSE 'PASS'
END as req_status
FROM provider_compliance_scores
WHERE tenant_id = %s
GROUP BY compliance_id, requirement_id
) req_agg
GROUP BY compliance_id
ON CONFLICT (tenant_id, compliance_id)
DO UPDATE SET
requirements_passed = EXCLUDED.requirements_passed,
requirements_failed = EXCLUDED.requirements_failed,
requirements_manual = EXCLUDED.requirements_manual,
total_requirements = EXCLUDED.total_requirements,
updated_at = NOW()
"""
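Both tenant-summary queries perform a two-level aggregation: collapse statuses per requirement across providers (FAIL-dominant), then count requirements per compliance framework. A pure-Python sketch of that nested aggregation, useful for reasoning about expected counts (illustrative only):

```python
from collections import Counter, defaultdict

def tenant_summary(rows):
    """rows: (compliance_id, requirement_id, status) tuples across providers.

    Inner level: FAIL-dominant collapse per (compliance_id, requirement_id).
    Outer level: count collapsed statuses per compliance framework.
    """
    statuses_by_req = defaultdict(set)
    for compliance_id, requirement_id, status in rows:
        statuses_by_req[(compliance_id, requirement_id)].add(status)

    summary = defaultdict(Counter)
    for (compliance_id, _req), statuses in statuses_by_req.items():
        if "FAIL" in statuses:
            collapsed = "FAIL"
        elif "MANUAL" in statuses:
            collapsed = "MANUAL"
        else:
            collapsed = "PASS"
        summary[compliance_id][collapsed] += 1
    return summary
```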
+141
@@ -14,6 +14,10 @@ from config.env import env
from config.settings.celery import CELERY_DEADLOCK_ATTEMPTS
from django.db import IntegrityError, OperationalError
from django.db.models import Case, Count, IntegerField, Prefetch, Q, Sum, When
from tasks.jobs.queries import (
COMPLIANCE_UPSERT_PROVIDER_SCORE_SQL,
COMPLIANCE_UPSERT_TENANT_SUMMARY_SQL,
)
from tasks.utils import CustomEncoder
from api.compliance import PROWLER_COMPLIANCE_OVERVIEW_TEMPLATE
@@ -1489,3 +1493,140 @@ def aggregate_daily_severity(tenant_id: str, scan_id: str):
"date": str(scan_date),
"severity_data": severity_data,
}
def update_provider_compliance_scores(tenant_id: str, scan_id: str):
"""
Update ProviderComplianceScore with requirement statuses from a completed scan.
Uses atomic SQL upsert with ON CONFLICT for concurrency safety. Only updates
if the new scan is more recent than existing data. Also cleans up stale
requirements that no longer exist in the new scan.
Reads from primary DB (not replica) to avoid replication lag issues since
this runs immediately after create_compliance_requirements_task.
Args:
tenant_id: Tenant that owns the scan.
scan_id: Scan UUID whose compliance data should be materialized.
Returns:
dict: Statistics about the upsert operation.
"""
with rls_transaction(tenant_id):
scan = (
Scan.all_objects.filter(
tenant_id=tenant_id,
id=scan_id,
state=StateChoices.COMPLETED,
)
.select_related("provider")
.first()
)
if not scan:
logger.warning(
f"Scan {scan_id} not found or not completed for compliance score update"
)
return {"status": "skipped", "reason": "scan not completed"}
if not scan.completed_at:
logger.warning(f"Scan {scan_id} has no completed_at timestamp")
return {"status": "skipped", "reason": "no completed_at"}
provider_id = str(scan.provider_id)
scan_completed_at = scan.completed_at
delete_stale_sql = """
DELETE FROM provider_compliance_scores pcs
WHERE pcs.tenant_id = %s
AND pcs.provider_id = %s
AND pcs.scan_completed_at < %s
AND NOT EXISTS (
SELECT 1 FROM compliance_requirements_overviews cro
WHERE cro.tenant_id = pcs.tenant_id
AND cro.scan_id = %s
AND cro.compliance_id = pcs.compliance_id
AND cro.requirement_id = pcs.requirement_id
)
RETURNING compliance_id
"""
compliance_ids_sql = """
SELECT DISTINCT compliance_id
FROM compliance_requirements_overviews
WHERE tenant_id = %s AND scan_id = %s
"""
try:
with psycopg_connection(MainRouter.default_db) as connection:
connection.autocommit = False
try:
with connection.cursor() as cursor:
cursor.execute(SET_CONFIG_QUERY, [POSTGRES_TENANT_VAR, tenant_id])
# Update requirement-level scores per provider
cursor.execute(
COMPLIANCE_UPSERT_PROVIDER_SCORE_SQL, [tenant_id, scan_id]
)
upserted_count = cursor.rowcount
cursor.execute(compliance_ids_sql, [tenant_id, scan_id])
scan_rows = cursor.fetchall()
if not isinstance(scan_rows, (list, tuple)):
scan_rows = []
scan_compliance_ids = {row[0] for row in scan_rows}
cursor.execute(
delete_stale_sql,
[tenant_id, provider_id, scan_completed_at, scan_id],
)
deleted_rows = cursor.fetchall()
if not isinstance(deleted_rows, (list, tuple)):
deleted_rows = []
deleted_ids = {row[0] for row in deleted_rows}
stale_deleted = len(deleted_ids)
impacted_compliance_ids = sorted(scan_compliance_ids | deleted_ids)
if impacted_compliance_ids:
# Advisory lock on tenant to prevent race conditions when
# multiple scans complete simultaneously for the same tenant
cursor.execute(
"SELECT pg_advisory_xact_lock(hashtext(%s))", [tenant_id]
)
# Recalculate tenant-level summary (FAIL-dominant across all providers)
cursor.execute(
COMPLIANCE_UPSERT_TENANT_SUMMARY_SQL,
[tenant_id, tenant_id, impacted_compliance_ids],
)
tenant_summary_count = cursor.rowcount
else:
tenant_summary_count = 0
connection.commit()
except Exception:
connection.rollback()
raise
logger.info(
f"Provider compliance scores updated for scan {scan_id}: "
f"{upserted_count} upserted, {stale_deleted} stale deleted, "
f"{tenant_summary_count} tenant summaries upserted"
)
return {
"status": "completed",
"scan_id": str(scan_id),
"provider_id": provider_id,
"upserted": upserted_count,
"stale_deleted": stale_deleted,
"tenant_summary_count": tenant_summary_count,
}
except Exception as e:
logger.error(
f"Error updating provider compliance scores for scan {scan_id}: {e}"
)
raise
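The `pg_advisory_xact_lock(hashtext(tenant_id))` call above serializes summary recalculation per tenant while leaving other tenants unblocked. An in-process analogue of that idea (hypothetical helper, not how the real code works — the real lock lives in Postgres and is released on commit/rollback):

```python
import threading
from collections import defaultdict

# One lock per tenant key, mirroring the per-tenant advisory lock.
_locks = defaultdict(threading.Lock)
_locks_guard = threading.Lock()

def with_tenant_lock(tenant_id, fn):
    """Run fn() while holding this tenant's lock."""
    with _locks_guard:  # guard the defaultdict mutation itself
        lock = _locks[tenant_id]
    with lock:
        return fn()
```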
+35 -3
@@ -11,6 +11,7 @@ from django_celery_beat.models import PeriodicTask
from tasks.jobs.backfill import (
backfill_compliance_summaries,
backfill_daily_severity_summaries,
backfill_provider_compliance_scores,
backfill_resource_scan_summaries,
backfill_scan_category_summaries,
)
@@ -44,6 +45,7 @@ from tasks.jobs.scan import (
aggregate_findings,
create_compliance_requirements,
perform_prowler_scan,
update_provider_compliance_scores,
)
from tasks.utils import batched, get_next_execution_datetime
@@ -122,9 +124,10 @@ def _perform_scan_complete_tasks(tenant_id: str, scan_id: str, provider_id: str)
scan_id (str): The ID of the scan that was performed.
provider_id (str): The primary key of the Provider instance that was scanned.
"""
create_compliance_requirements_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
chain(
create_compliance_requirements_task.si(tenant_id=tenant_id, scan_id=scan_id),
update_provider_compliance_scores_task.si(tenant_id=tenant_id, scan_id=scan_id),
).apply_async()
aggregate_attack_surface_task.apply_async(
kwargs={"tenant_id": tenant_id, "scan_id": scan_id}
)
@@ -610,6 +613,20 @@ def backfill_scan_category_summaries_task(tenant_id: str, scan_id: str):
return backfill_scan_category_summaries(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(name="backfill-provider-compliance-scores", queue="backfill")
def backfill_provider_compliance_scores_task(tenant_id: str):
"""
Backfill ProviderComplianceScore from latest completed scan per provider.
Used to populate the compliance watchlist materialized table for tenants
that had scans before the feature was deployed.
Args:
tenant_id: Target tenant UUID.
"""
return backfill_provider_compliance_scores(tenant_id=tenant_id)
@shared_task(base=RLSTask, name="scan-compliance-overviews", queue="compliance")
@handle_provider_deletion
def create_compliance_requirements_task(tenant_id: str, scan_id: str):
@@ -643,6 +660,21 @@ def aggregate_attack_surface_task(tenant_id: str, scan_id: str):
return aggregate_attack_surface(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(name="scan-provider-compliance-scores", queue="compliance")
def update_provider_compliance_scores_task(tenant_id: str, scan_id: str):
"""
Update provider compliance scores from a completed scan.
This task materializes compliance requirement statuses into ProviderComplianceScore
for efficient watchlist queries. Uses atomic upsert with concurrency protection.
Args:
tenant_id (str): The tenant ID for which to update scores.
scan_id (str): The ID of the scan whose data should be materialized.
"""
return update_provider_compliance_scores(tenant_id=tenant_id, scan_id=scan_id)
@shared_task(name="scan-daily-severity", queue="overview")
@handle_provider_deletion
def aggregate_daily_severity_task(tenant_id: str, scan_id: str):
@@ -1,8 +1,11 @@
from datetime import datetime, timezone
from unittest.mock import MagicMock, patch
from uuid import uuid4
import pytest
from tasks.jobs.backfill import (
backfill_compliance_summaries,
backfill_provider_compliance_scores,
backfill_resource_scan_summaries,
backfill_scan_category_summaries,
)
@@ -260,3 +263,62 @@ class TestBackfillScanCategorySummaries:
assert summary.total_findings == 1
assert summary.failed_findings == 1
assert summary.new_failed_findings == 1
@pytest.mark.django_db
class TestBackfillProviderComplianceScores:
def test_no_completed_scans(self, tenants_fixture):
tenant = tenants_fixture[2]
result = backfill_provider_compliance_scores(str(tenant.id))
assert result == {"status": "no completed scans"}
def test_no_scans_to_process(self, tenants_fixture, scans_fixture):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
scan.completed_at = None
scan.save()
result = backfill_provider_compliance_scores(str(tenant.id))
assert result == {"status": "no completed scans"}
@patch("tasks.jobs.backfill.psycopg_connection")
def test_successful_backfill_executes_sql_queries(
self,
mock_psycopg_connection,
tenants_fixture,
scans_fixture,
settings,
):
"""Test successful backfill executes SQL queries and returns correct stats."""
settings.DATABASES.setdefault("admin", settings.DATABASES["default"])
tenant = tenants_fixture[0]
scan = scans_fixture[0]
# Set completed_at to make the scan eligible for backfill
scan.completed_at = datetime.now(timezone.utc)
scan.save()
connection = MagicMock()
cursor = MagicMock()
cursor_context = MagicMock()
cursor_context.__enter__.return_value = cursor
cursor_context.__exit__.return_value = False
connection.cursor.return_value = cursor_context
connection.__enter__.return_value = connection
connection.__exit__.return_value = False
connection.autocommit = True
context_manager = MagicMock()
context_manager.__enter__.return_value = connection
context_manager.__exit__.return_value = False
mock_psycopg_connection.return_value = context_manager
cursor.rowcount = 5
result = backfill_provider_compliance_scores(str(tenant.id))
assert result["status"] == "backfilled"
assert result["providers_processed"] == 1
assert result["providers_skipped"] == 0
assert result["total_upserted"] == 5
assert result["tenant_summary_count"] == 5
+121
View File
@@ -24,6 +24,7 @@ from tasks.jobs.scan import (
aggregate_findings,
create_compliance_requirements,
perform_prowler_scan,
update_provider_compliance_scores,
)
from tasks.utils import CustomEncoder
@@ -4022,3 +4023,123 @@ class TestAggregateCategoryCounts:
assert len(cache) == 3
for cat in ["security", "compliance", "data-protection"]:
assert cache[(cat, "low")] == {"total": 1, "failed": 1, "new_failed": 1}
@pytest.mark.django_db
class TestUpdateProviderComplianceScores:
@patch("tasks.jobs.scan.psycopg_connection")
def test_update_provider_compliance_scores_basic(
self,
mock_psycopg_connection,
tenants_fixture,
scans_fixture,
settings,
):
settings.DATABASES.setdefault("admin", settings.DATABASES["default"])
tenant = tenants_fixture[0]
scan = scans_fixture[0]
tenant_id = str(tenant.id)
scan_id = str(scan.id)
scan.state = StateChoices.COMPLETED
scan.completed_at = datetime.now(timezone.utc)
scan.save()
connection = MagicMock()
cursor = MagicMock()
cursor_context = MagicMock()
cursor_context.__enter__.return_value = cursor
cursor_context.__exit__.return_value = False
connection.cursor.return_value = cursor_context
connection.__enter__.return_value = connection
connection.__exit__.return_value = False
connection.autocommit = True
context_manager = MagicMock()
context_manager.__enter__.return_value = connection
context_manager.__exit__.return_value = False
mock_psycopg_connection.return_value = context_manager
cursor.rowcount = 2
result = update_provider_compliance_scores(tenant_id, scan_id)
assert result["status"] == "completed"
assert result["upserted"] == 2
assert cursor.execute.call_count >= 3
connection.commit.assert_called_once()
def test_update_provider_compliance_scores_skips_incomplete_scan(
self, tenants_fixture, scans_fixture
):
tenant = tenants_fixture[0]
scan = scans_fixture[1]
tenant_id = str(tenant.id)
scan_id = str(scan.id)
result = update_provider_compliance_scores(tenant_id, scan_id)
assert result["status"] == "skipped"
assert result["reason"] == "scan not completed"
def test_update_provider_compliance_scores_skips_no_completed_at(
self, tenants_fixture, scans_fixture
):
tenant = tenants_fixture[0]
scan = scans_fixture[0]
tenant_id = str(tenant.id)
scan_id = str(scan.id)
scan.state = StateChoices.COMPLETED
scan.completed_at = None
scan.save()
result = update_provider_compliance_scores(tenant_id, scan_id)
assert result["status"] == "skipped"
assert result["reason"] == "no completed_at"
@patch("tasks.jobs.scan.psycopg_connection")
def test_update_provider_compliance_scores_executes_sql_queries(
self,
mock_psycopg_connection,
tenants_fixture,
providers_fixture,
scans_fixture,
settings,
):
settings.DATABASES.setdefault("admin", settings.DATABASES["default"])
tenant = tenants_fixture[0]
scan = scans_fixture[0]
tenant_id = str(tenant.id)
scan_id = str(scan.id)
scan.state = StateChoices.COMPLETED
scan.completed_at = datetime.now(timezone.utc)
scan.save()
connection = MagicMock()
cursor = MagicMock()
cursor_context = MagicMock()
cursor_context.__enter__.return_value = cursor
cursor_context.__exit__.return_value = False
connection.cursor.return_value = cursor_context
connection.__enter__.return_value = connection
connection.__exit__.return_value = False
context_manager = MagicMock()
context_manager.__enter__.return_value = connection
context_manager.__exit__.return_value = False
mock_psycopg_connection.return_value = context_manager
cursor.rowcount = 1
cursor.fetchall.side_effect = [[("aws_cis_2.0",)], []]
result = update_provider_compliance_scores(tenant_id, scan_id)
assert result["status"] == "completed"
calls = [str(c) for c in cursor.execute.call_args_list]
assert any("provider_compliance_scores" in c for c in calls)
assert any("tenant_compliance_summaries" in c for c in calls)
assert any("pg_advisory_xact_lock" in c for c in calls)
+12 -3
@@ -730,7 +730,9 @@ class TestGenerateOutputs:
class TestScanCompleteTasks:
@patch("tasks.tasks.aggregate_attack_surface_task.apply_async")
@patch("tasks.tasks.create_compliance_requirements_task.apply_async")
@patch("tasks.tasks.chain")
@patch("tasks.tasks.create_compliance_requirements_task.si")
@patch("tasks.tasks.update_provider_compliance_scores_task.si")
@patch("tasks.tasks.perform_scan_summary_task.si")
@patch("tasks.tasks.generate_outputs_task.si")
@patch("tasks.tasks.generate_compliance_reports_task.si")
@@ -741,15 +743,22 @@ class TestScanCompleteTasks:
mock_compliance_reports_task,
mock_outputs_task,
mock_scan_summary_task,
mock_update_compliance_scores_task,
mock_compliance_requirements_task,
mock_chain,
mock_attack_surface_task,
):
"""Test that scan complete tasks are properly orchestrated with optimized reports."""
_perform_scan_complete_tasks("tenant-id", "scan-id", "provider-id")
# Verify compliance requirements task is called
# Verify compliance requirements task is called via chain
mock_compliance_requirements_task.assert_called_once_with(
kwargs={"tenant_id": "tenant-id", "scan_id": "scan-id"},
tenant_id="tenant-id", scan_id="scan-id"
)
# Verify update provider compliance scores task is called via chain
mock_update_compliance_scores_task.assert_called_once_with(
tenant_id="tenant-id", scan_id="scan-id"
)
# Verify attack surface task is called
@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)
+25
View File
@@ -0,0 +1,25 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)
+24
@@ -0,0 +1,24 @@
import warnings
from dashboard.common_methods import get_section_containers_cis
warnings.filterwarnings("ignore")
def get_table(data):
aux = data[
[
"REQUIREMENTS_ID",
"REQUIREMENTS_DESCRIPTION",
"REQUIREMENTS_ATTRIBUTES_SECTION",
"CHECKID",
"STATUS",
"REGION",
"ACCOUNTID",
"RESOURCEID",
]
].copy()
return get_section_containers_cis(
aux, "REQUIREMENTS_ID", "REQUIREMENTS_ATTRIBUTES_SECTION"
)
+216
@@ -0,0 +1,216 @@
---
title: 'AI Skills System'
---
This guide explains the AI Skills system that provides on-demand context and patterns to AI agents working with the Prowler codebase.
<Info>
**What are AI Skills?** Skills are structured instructions that help AI agents (Claude Code, Cursor, Copilot, etc.) understand Prowler's conventions, patterns, and best practices.
</Info>
## Architecture Overview
```mermaid
graph LR
subgraph FLOW["AI Skills Architecture"]
A["AI Agent"] -->|"1. matches trigger"| B["AGENTS.md"]
B -->|"2. loads"| C["Skill"]
C -->|"3. provides"| D["Patterns<br/>Templates<br/>Commands"]
C -->|"4. references"| E["Local Docs"]
D --> F["Correct Output"]
E --> F
end
style A fill:#1e3a5f,stroke:#4a9eff,color:#fff
style B fill:#5c4d1a,stroke:#ffd700,color:#fff
style C fill:#1a4d1a,stroke:#4caf50,color:#fff
style E fill:#4a1a4d,stroke:#ba68c8,color:#fff
style F fill:#1a4d2e,stroke:#66bb6a,color:#fff
```
## How It Works
```mermaid
sequenceDiagram
participant U as User
participant A as AI Agent
participant R as AGENTS.md
participant S as Skill
participant AS as assets/
participant RF as references/
participant D as Local Docs
U->>A: "Create an AWS security check"
Note over A: Analyze request context
A->>R: Find matching skill trigger
R-->>A: prowler-sdk-check matches
A->>S: Load SKILL.md
S-->>A: Patterns, rules, templates, commands
Note over A: Need code template?
A->>AS: Read assets/aws_check.py
AS-->>A: Check implementation template
Note over A: Need more details?
A->>RF: Read references/metadata-docs.md
RF-->>A: Points to local docs
A->>D: Read docs/developer-guide/checks.mdx
D-->>A: Full documentation
Note over A: Execute with full context
A->>U: Creates check with correct patterns
```
## Before vs After
```mermaid
graph TD
subgraph COMPARISON["BEFORE vs AFTER"]
direction LR
subgraph BEFORE["Without Skills"]
B1["AI guesses conventions"]
B2["Wrong structure"]
B3["Multiple iterations"]
B4["Web searches for docs"]
B5["Inconsistent patterns"]
end
subgraph AFTER["With Skills"]
A1["AI loads exact patterns"]
A2["Correct structure"]
A3["First-time right"]
A4["Local docs referenced"]
A5["Consistent patterns"]
end
end
style BEFORE fill:#5c1a1a,stroke:#ef5350,color:#fff
style AFTER fill:#1a4d1a,stroke:#66bb6a,color:#fff
```
## Complete Architecture
```mermaid
flowchart TB
subgraph ENTRY["ENTRY POINT"]
AGENTS["AGENTS.md<br/>━━━━━━━━━━━━━━━━━<br/>• Available skills registry<br/>• Skill → Trigger mapping<br/>• Component navigation"]
end
subgraph SKILLS["SKILLS LIBRARY"]
direction TB
subgraph GENERIC["Generic Skills"]
G1["typescript"]
G2["react-19"]
G3["nextjs-15"]
G4["tailwind-4"]
G5["pytest"]
G6["playwright"]
G7["django-drf"]
G8["zod-4"]
G9["zustand-5"]
G10["ai-sdk-5"]
end
subgraph PROWLER["Prowler Skills"]
P1["prowler"]
P2["prowler-sdk-check"]
P3["prowler-api"]
P4["prowler-ui"]
P5["prowler-mcp"]
P6["prowler-provider"]
P7["prowler-compliance"]
P8["prowler-docs"]
P9["prowler-pr"]
end
subgraph TESTING["Testing Skills"]
T1["prowler-test-sdk"]
T2["prowler-test-api"]
T3["prowler-test-ui"]
end
subgraph META["Meta Skills"]
M1["skill-creator"]
end
end
subgraph STRUCTURE["SKILL STRUCTURE"]
direction LR
SKILLMD["SKILL.md<br/>━━━━━━━━━━━━━━<br/>• Frontmatter<br/>• Critical patterns<br/>• Decision trees<br/>• Code examples<br/>• Commands<br/>• Keywords"]
ASSETS["assets/<br/>━━━━━━━━━━━━━━<br/>• Code templates<br/>• JSON schemas<br/>• Config examples"]
REFS["references/<br/>━━━━━━━━━━━━━━<br/>• Local doc paths<br/>• No web URLs<br/>• Single source"]
end
subgraph DOCS["DOCUMENTATION"]
direction TB
DD["docs/developer-guide/"]
D1["checks.mdx"]
D2["unit-testing.mdx"]
D3["provider.mdx"]
D4["mcp-server.mdx"]
D5["..."]
DD --> D1
DD --> D2
DD --> D3
DD --> D4
DD --> D5
end
ENTRY --> SKILLS
SKILLS --> STRUCTURE
SKILLMD --> ASSETS
SKILLMD --> REFS
REFS -.->|"points to"| DOCS
style ENTRY fill:#1e3a5f,stroke:#4a9eff,color:#fff
style GENERIC fill:#5c4d1a,stroke:#ffd700,color:#fff
style PROWLER fill:#1a4d1a,stroke:#66bb6a,color:#fff
style TESTING fill:#4d1a3d,stroke:#f06292,color:#fff
style META fill:#4a1a4d,stroke:#ba68c8,color:#fff
style STRUCTURE fill:#5c3d1a,stroke:#ffb74d,color:#fff
style DOCS fill:#1a3d4d,stroke:#4dd0e1,color:#fff
```
## Skills Included
| Type | Skills |
|------|--------|
| **Generic** | typescript, react-19, nextjs-15, tailwind-4, pytest, playwright, django-drf, zod-4, zustand-5, ai-sdk-5 |
| **Prowler** | prowler, prowler-sdk-check, prowler-api, prowler-ui, prowler-mcp, prowler-provider, prowler-compliance, prowler-docs, prowler-pr |
| **Testing** | prowler-test-sdk, prowler-test-api, prowler-test-ui |
| **Meta** | skill-creator |
## Skill Structure
Each skill follows the [Agent Skills spec](https://agentskills.io):
```
skills/{skill-name}/
├── SKILL.md # Patterns, rules, decision trees
├── assets/ # Code templates, schemas
└── references/ # Links to local docs (single source of truth)
```
## Key Design Decisions
1. **Self-contained skills** - Critical patterns inline for fast loading
2. **Local doc references** - No web URLs, points to `docs/developer-guide/*.mdx`
3. **Single source of truth** - Skills reference docs, no duplication
4. **On-demand loading** - AI loads only what's needed for the task
## Creating New Skills
Use the `skill-creator` meta-skill to create new skills that follow the Agent Skills spec. See `AGENTS.md` for the full list of available skills and their triggers.
+8 -1
@@ -6,6 +6,10 @@ Thanks for your interest in contributing to Prowler!
Prowler can be extended in various ways. This guide outlines the different ways to contribute and how to get started.
<Warning>
Maintainers will assess whether a change fits the project roadmap and scope before merging.
</Warning>
## Contributing to Prowler
### Review Current Issues
@@ -32,6 +36,9 @@ Prowler is constantly evolving. Contributions to checks, services, or integratio
<Card title="Adding New Providers" icon="cloud" href="/developer-guide/provider">
If you would like to extend Prowler to work with a new cloud provider, this typically involves setting up new services and checks to ensure compatibility.
</Card>
<Card title="Adding New Compliance Frameworks" icon="scale-balanced" href="/developer-guide/security-compliance-framework">
Need to ensure Prowler supports a specific compliance framework? Add new security compliance frameworks to map checks against regulatory or industry standards.
</Card>
<Card title="Adding New Output Formats" icon="file" href="/developer-guide/outputs">
Want to tailor how results are displayed or exported? You can add custom output formats.
</Card>
@@ -213,4 +220,4 @@ pipx install "git+https://github.com/prowler-cloud/prowler.git@branch-name"
Replace `branch-name` with the name of the branch you want to test. This will install Prowler in an isolated environment, allowing you to try out the changes safely.
For more details on testing, go to the [Testing section](/developer-guide/unit-testing) of this documentation.
@@ -0,0 +1,407 @@
---
title: 'Lighthouse AI Architecture'
---
This document describes the internal architecture of Prowler Lighthouse AI, enabling developers to understand how components interact and where to add new functionality.
<Info>
**Looking for user documentation?** See:
- [Lighthouse AI Overview](/getting-started/products/prowler-lighthouse-ai) - Capabilities and FAQs
- [How Lighthouse AI Works](/user-guide/tutorials/prowler-app-lighthouse) - Configuration and usage
- [Multi-LLM Provider Setup](/user-guide/tutorials/prowler-app-lighthouse-multi-llm) - Provider configuration
</Info>
## Architecture Overview
Lighthouse AI operates as a Langchain-based agent that connects Large Language Models (LLMs) with Prowler security data through the Model Context Protocol (MCP).
<img className="block dark:hidden" src="/images/lighthouse-architecture-light.png" alt="Prowler Lighthouse Architecture" />
<img className="hidden dark:block" src="/images/lighthouse-architecture-dark.png" alt="Prowler Lighthouse Architecture" />
### Three-Tier Architecture
The system follows a three-tier architecture:
1. **Frontend (Next.js)**: Chat interface, message rendering, model selection
2. **API Route**: Request handling, authentication, stream transformation
3. **Langchain Agent**: LLM orchestration, tool calling through MCP
### Request Flow
When a user sends a message through the Lighthouse chat interface, the system processes it through several stages:
1. **User Submits a Message**.
The chat component (`ui/components/lighthouse/chat.tsx`) captures the user's question (e.g., "What are my critical findings in AWS?") and sends it as an HTTP POST request to the backend API route.
2. **Authentication and Context Assembly**.
The API route (`ui/app/api/lighthouse/analyst/route.ts`) validates the user's session, extracts the JWT token (stored via `auth-context.ts`), and gathers context including the tenant's business context and current security posture data (assembled in `data.ts`).
3. **Agent Initialization**.
The workflow orchestrator (`ui/lib/lighthouse/workflow.ts`) creates a Langchain agent configured with:
- The selected LLM, instantiated through the factory (`llm-factory.ts`)
- A system prompt containing available tools and instructions (`system-prompt.ts`)
- Two meta-tools (`describe_tool` and `execute_tool`) for accessing Prowler data
4. **LLM Reasoning and Tool Calling**.
The agent sends the conversation to the LLM, which decides whether to respond directly or call tools to fetch data. When tools are needed, the meta-tools in `ui/lib/lighthouse/tools/meta-tool.ts` interact with the MCP client (`mcp-client.ts`) to:
- First call `describe_tool` to understand the tool's parameters
- Then call `execute_tool` to retrieve data from the MCP Server
- Continue reasoning with the returned data
5. **Streaming Response**.
As the LLM generates its response, the stream handler (`ui/lib/lighthouse/analyst-stream.ts`) transforms Langchain events into UI-compatible messages and streams tokens back to the browser in real time using Server-Sent Events. The stream includes both text tokens and tool execution events (displayed as "chain of thought").
6. **Message Rendering**.
The frontend receives the stream and renders it through `message-item.tsx` with markdown formatting. Any tool calls that occurred during reasoning are displayed via `chain-of-thought-display.tsx`.
## Frontend Components
Frontend components reside in `ui/components/lighthouse/` and handle the chat interface and configuration workflows.
### Core Components
| Component | Location | Purpose |
|-----------|----------|---------|
| `chat.tsx` | `ui/components/lighthouse/` | Main chat interface managing message history and input handling |
| `message-item.tsx` | `ui/components/lighthouse/` | Individual message rendering with markdown support |
| `select-model.tsx` | `ui/components/lighthouse/` | Model and provider selection dropdown |
| `chain-of-thought-display.tsx` | `ui/components/lighthouse/` | Displays tool calls and reasoning steps during execution |
### Configuration Components
| Component | Location | Purpose |
|-----------|----------|---------|
| `lighthouse-settings.tsx` | `ui/components/lighthouse/` | Settings panel for business context and preferences |
| `connect-llm-provider.tsx` | `ui/components/lighthouse/` | Provider connection workflow |
| `llm-providers-table.tsx` | `ui/components/lighthouse/` | Provider management table |
| `forms/delete-llm-provider-form.tsx` | `ui/components/lighthouse/forms/` | Provider deletion confirmation dialog |
### Supporting Components
| Component | Location | Purpose |
|-----------|----------|---------|
| `banner.tsx` / `banner-client.tsx` | `ui/components/lighthouse/` | Status banners and notifications |
| `workflow/` | `ui/components/lighthouse/workflow/` | Multi-step configuration workflows |
| `ai-elements/` | `ui/components/lighthouse/ai-elements/` | Custom UI primitives for chat interface (input, select, dropdown, tooltip) |
## Library Code
Core library code resides in `ui/lib/lighthouse/` and handles agent orchestration, MCP communication, and stream processing.
### Workflow Orchestrator
**Location:** `ui/lib/lighthouse/workflow.ts`
The workflow module serves as the core orchestrator, responsible for:
- Initializing the Langchain agent with system prompt and tools
- Loading tenant configuration (default provider, model, business context)
- Creating the LLM instance through the factory
- Generating dynamic tool listings from available MCP tools
```typescript
// Simplified workflow initialization
export async function initLighthouseWorkflow(runtimeConfig?: RuntimeConfig) {
  await initializeMCPClient();

  const toolListing = generateToolListing();
  const systemPrompt = LIGHTHOUSE_SYSTEM_PROMPT_TEMPLATE.replace(
    "{{TOOL_LISTING}}",
    toolListing,
  );

  const llm = createLLM({
    provider: providerType,
    model: modelId,
    credentials,
    // ...
  });

  return createAgent({
    model: llm,
    tools: [describeTool, executeTool],
    systemPrompt,
  });
}
```
### MCP Client Manager
**Location:** `ui/lib/lighthouse/mcp-client.ts`
The MCP client manages connections to the Prowler MCP Server using a singleton pattern:
- **Connection Management**: Retry logic with configurable attempts and delays
- **Tool Discovery**: Fetches available tools from MCP server on initialization
- **Authentication Injection**: Automatically adds JWT tokens to `prowler_app_*` tool calls
- **Reconnection**: Supports forced reconnection after server restarts
Key constants:
- `MAX_RETRY_ATTEMPTS`: 3 connection attempts
- `RETRY_DELAY_MS`: 2000ms between retries
- `RECONNECT_INTERVAL_MS`: 5 minutes before retry after failure
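The retry policy above can be sketched as follows. This is a simplified, synchronous illustration of the policy only; the real client in `mcp-client.ts` is asynchronous, and the function and parameter names here are illustrative:

```typescript
// Sketch of the connection retry policy (constants mirror the ones above;
// the real MCP client is async — this sync version just shows the policy).
const MAX_RETRY_ATTEMPTS = 3;
const RETRY_DELAY_MS = 2000;

export function connectWithRetry<T>(
  connect: () => T,
  wait: (ms: number) => void = () => {},
): T {
  let lastError: unknown;
  for (let attempt = 1; attempt <= MAX_RETRY_ATTEMPTS; attempt++) {
    try {
      return connect();
    } catch (err) {
      lastError = err;
      // Pause between attempts, but not after the final failure.
      if (attempt < MAX_RETRY_ATTEMPTS) wait(RETRY_DELAY_MS);
    }
  }
  throw lastError;
}
```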
```typescript
// Authentication injection for Prowler App tools
private handleBeforeToolCall = ({ name, args }) => {
  // Only inject auth for prowler_app_* tools (user-specific data)
  if (!name.startsWith("prowler_app_")) {
    return { args };
  }

  const accessToken = getAuthContext();
  return {
    args,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
};
```
### Meta-Tools
**Location:** `ui/lib/lighthouse/tools/meta-tool.ts`
Instead of registering all MCP tools directly with the agent, Lighthouse uses two meta-tools for dynamic tool discovery and execution:
| Tool | Purpose |
|------|---------|
| `describe_tool` | Retrieves full schema and parameter details for a specific tool |
| `execute_tool` | Executes a tool with provided parameters |
This pattern reduces the number of tools the LLM must track while maintaining access to all MCP capabilities.
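A minimal sketch of the meta-tool pattern, assuming a registry populated from MCP tool discovery. The registry contents, schema shape, and tool body here are illustrative stand-ins, not the actual `meta-tool.ts` implementation:

```typescript
// Hypothetical registry standing in for tools discovered from the MCP server.
type ToolSchema = { description: string; parameters: Record<string, string> };
type ToolImpl = (args: Record<string, unknown>) => unknown;

const registry = new Map<string, { schema: ToolSchema; run: ToolImpl }>([
  [
    "prowler_app_search_findings",
    {
      schema: {
        description: "Search security findings",
        parameters: { severity: "string", provider: "string" },
      },
      run: (args) => [{ id: "finding-1", severity: args.severity }],
    },
  ],
]);

// describe_tool: return the schema so the LLM learns the parameters first.
export function describeTool(name: string): ToolSchema {
  const entry = registry.get(name);
  if (!entry) throw new Error(`Unknown tool: ${name}`);
  return entry.schema;
}

// execute_tool: run the named tool with the parameters the LLM chose.
export function executeTool(
  name: string,
  args: Record<string, unknown>,
): unknown {
  const entry = registry.get(name);
  if (!entry) throw new Error(`Unknown tool: ${name}`);
  return entry.run(args);
}
```

Because the LLM only ever sees `describe_tool` and `execute_tool`, adding a new MCP tool changes the registry, not the agent's tool list.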
### Additional Library Modules
| Module | Location | Purpose |
|--------|----------|---------|
| `analyst-stream.ts` | `ui/lib/lighthouse/` | Transforms Langchain stream events to UI message format |
| `llm-factory.ts` | `ui/lib/lighthouse/` | Creates LLM instances for OpenAI, Bedrock, and OpenAI-compatible providers |
| `system-prompt.ts` | `ui/lib/lighthouse/` | System prompt template with dynamic tool listing injection |
| `auth-context.ts` | `ui/lib/lighthouse/` | AsyncLocalStorage for JWT token propagation across async boundaries |
| `types.ts` | `ui/lib/lighthouse/` | TypeScript type definitions |
| `constants.ts` | `ui/lib/lighthouse/` | Configuration constants and error messages |
| `utils.ts` | `ui/lib/lighthouse/` | Message conversion and model parameter extraction |
| `validation.ts` | `ui/lib/lighthouse/` | Input validation utilities |
| `data.ts` | `ui/lib/lighthouse/` | Current data section generation for context enrichment |
## API Route
**Location:** `ui/app/api/lighthouse/analyst/route.ts`
The API route handles chat requests and manages the streaming response pipeline:
1. **Request Parsing**: Extracts messages, model, and provider from request body
2. **Authentication**: Validates session and extracts access token
3. **Context Assembly**: Gathers business context and current data
4. **Agent Initialization**: Creates Langchain agent with runtime configuration
5. **Stream Processing**: Transforms agent events to UI-compatible format
6. **Error Handling**: Captures errors with Sentry integration
```typescript
export async function POST(req: Request) {
  const { messages, model, provider } = await req.json();

  const session = await auth();
  if (!session?.accessToken) {
    return Response.json({ error: "Unauthorized" }, { status: 401 });
  }

  return await authContextStorage.run(accessToken, async () => {
    const app = await initLighthouseWorkflow(runtimeConfig);
    const agentStream = app.streamEvents({ messages }, { version: "v2" });

    // Transform stream events to UI format
    const stream = new ReadableStream({
      async start(controller) {
        for await (const streamEvent of agentStream) {
          // Handle on_chat_model_stream, on_tool_start, on_tool_end, etc.
        }
      },
    });

    return createUIMessageStreamResponse({ stream });
  });
}
```
## Backend Components
Backend components handle LLM provider configuration, model management, and credential storage.
### Database Models
**Location:** `api/src/backend/api/models.py`
| Model | Purpose |
|-------|---------|
| `LighthouseProviderConfiguration` | Per-tenant LLM provider credentials (encrypted with Fernet) |
| `LighthouseTenantConfiguration` | Tenant-level settings including business context and default provider/model |
| `LighthouseProviderModels` | Available models per provider configuration |
All models implement Row-Level Security (RLS) for tenant isolation.
#### LighthouseProviderConfiguration
Stores provider-specific credentials for each tenant:
- **provider_type**: `openai`, `bedrock`, or `openai_compatible`
- **credentials**: Encrypted JSON containing API keys or AWS credentials
- **base_url**: Custom endpoint for OpenAI-compatible providers
- **is_active**: Connection validation status
#### LighthouseTenantConfiguration
Stores tenant-wide Lighthouse settings:
- **business_context**: Optional context for personalized responses
- **default_provider**: Default LLM provider type
- **default_models**: JSON mapping provider types to default model IDs
#### LighthouseProviderModels
Catalogs available models for each provider:
- **model_id**: Provider-specific model identifier
- **model_name**: Human-readable display name
- **default_parameters**: Optional model-specific parameters
### Background Jobs
**Location:** `api/src/backend/tasks/jobs/lighthouse_providers.py`
#### check_lighthouse_provider_connection
Validates provider credentials by making a test API call:
- OpenAI: Lists models via `client.models.list()`
- Bedrock: Lists foundation models via `bedrock_client.list_foundation_models()`
- OpenAI-compatible: Lists models via custom base URL
Updates `is_active` status based on connection result.
#### refresh_lighthouse_provider_models
Synchronizes available models from provider APIs:
- Fetches current model catalog from provider
- Filters out non-chat models (DALL-E, Whisper, TTS, embeddings)
- Upserts model records in `LighthouseProviderModels`
- Removes stale models no longer available
**Excluded OpenAI model prefixes:**
```python
EXCLUDED_OPENAI_MODEL_PREFIXES = (
    "dall-e", "whisper", "tts-", "sora",
    "text-embedding", "text-moderation",
    # Legacy models
    "text-davinci", "davinci", "curie", "babbage", "ada",
)
```
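The filtering step amounts to a prefix check. Sketched in TypeScript for illustration (the actual implementation is the Python job above; the function name here is made up):

```typescript
// Prefixes mirror EXCLUDED_OPENAI_MODEL_PREFIXES in lighthouse_providers.py.
const EXCLUDED_OPENAI_MODEL_PREFIXES = [
  "dall-e", "whisper", "tts-", "sora",
  "text-embedding", "text-moderation",
  // Legacy models
  "text-davinci", "davinci", "curie", "babbage", "ada",
];

// Keep only model IDs that do not match any excluded prefix.
export function filterChatModels(modelIds: string[]): string[] {
  return modelIds.filter(
    (id) => !EXCLUDED_OPENAI_MODEL_PREFIXES.some((p) => id.startsWith(p)),
  );
}
```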
## MCP Server Integration
Lighthouse AI communicates with the Prowler MCP Server to access security data. For detailed MCP Server architecture, see [Extending the MCP Server](/developer-guide/mcp-server).
### Tool Namespacing
MCP tools are organized into three namespaces based on authentication requirements:
| Namespace | Auth Required | Description |
|-----------|---------------|-------------|
| `prowler_app_*` | Yes (JWT) | Prowler Cloud/App tools for findings, providers, scans, resources |
| `prowler_hub_*` | No | Security checks catalog, compliance frameworks |
| `prowler_docs_*` | No | Documentation search and retrieval |
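The namespace check itself is a simple prefix test. A sketch of the routing logic (helper names are illustrative, not the actual client code):

```typescript
// Only prowler_app_* tools touch tenant-specific data and need the user's JWT;
// prowler_hub_* and prowler_docs_* tools are public.
export function requiresJwt(toolName: string): boolean {
  return toolName.startsWith("prowler_app_");
}

// Build the extra headers for a tool call, if any.
export function buildHeaders(
  toolName: string,
  accessToken: string,
): Record<string, string> {
  return requiresJwt(toolName)
    ? { Authorization: `Bearer ${accessToken}` }
    : {};
}
```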
### Authentication Flow
1. User authenticates with Prowler App, receiving a JWT token
2. Token is stored in session and propagated via `authContextStorage`
3. MCP client injects `Authorization: Bearer <token>` header for `prowler_app_*` calls
4. MCP Server validates token and applies RLS filtering
### Tool Execution Pattern
The agent uses meta-tools rather than direct tool registration:
```
Agent needs data → describe_tool("prowler_app_search_findings")
→ Returns parameter schema → execute_tool with parameters
→ MCP client adds auth header → MCP Server executes
→ Results returned to agent → Agent continues reasoning
```
## Extension Points
### Adding New LLM Providers
To add a new LLM provider:
1. **Frontend**: Update `ui/lib/lighthouse/llm-factory.ts` with provider-specific initialization
2. **Backend**: Add provider type to `LighthouseProviderConfiguration.LLMProviderChoices`
3. **Jobs**: Add credential extraction and model fetching in `lighthouse_providers.py`
4. **UI**: Add connection workflow in `ui/components/lighthouse/workflow/`
### Modifying System Prompt
The system prompt template lives in `ui/lib/lighthouse/system-prompt.ts`. The `{{TOOL_LISTING}}` placeholder is dynamically replaced with available MCP tools during agent initialization.
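A minimal sketch of the substitution step (the template text below is a placeholder, not the real prompt):

```typescript
// Illustrative template; the real one lives in system-prompt.ts.
const LIGHTHOUSE_SYSTEM_PROMPT_TEMPLATE =
  "You are Lighthouse AI.\nAvailable tools:\n{{TOOL_LISTING}}";

// Replace the placeholder with the tool listing generated at agent init.
export function renderSystemPrompt(toolListing: string): string {
  return LIGHTHOUSE_SYSTEM_PROMPT_TEMPLATE.replace("{{TOOL_LISTING}}", toolListing);
}
```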
### Adding Stream Events
To handle new Langchain stream events, modify `ui/lib/lighthouse/analyst-stream.ts`. Current handlers include:
- `on_chat_model_stream`: Token-by-token text streaming
- `on_chat_model_end`: Model completion with tool call detection
- `on_tool_start`: Tool execution started
- `on_tool_end`: Tool execution completed
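As a rough sketch, the transformation is a dispatch on the event type. The event and message shapes below are illustrative, not the actual `analyst-stream.ts` types:

```typescript
// Hypothetical shapes; the real transformation lives in analyst-stream.ts.
type LangchainEvent = { event: string; data?: any; name?: string };
type UIMessage =
  | { type: "text-delta"; text: string }
  | { type: "tool-start"; tool: string }
  | { type: "tool-end"; tool: string }
  | null;

export function toUIMessage(e: LangchainEvent): UIMessage {
  switch (e.event) {
    case "on_chat_model_stream":
      // Token-by-token text streaming.
      return { type: "text-delta", text: e.data?.chunk?.content ?? "" };
    case "on_tool_start":
      return { type: "tool-start", tool: e.name ?? "unknown" };
    case "on_tool_end":
      return { type: "tool-end", tool: e.name ?? "unknown" };
    default:
      // Events the UI does not render (e.g. on_chat_model_end bookkeeping).
      return null;
  }
}
```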
### Adding MCP Tools
See [Extending the MCP Server](/developer-guide/mcp-server) for detailed instructions on adding new tools to the Prowler MCP Server.
## Configuration
### Environment Variables
| Variable | Description |
|----------|-------------|
| `PROWLER_MCP_SERVER_URL` | MCP server endpoint (e.g., `https://mcp.prowler.com/mcp`) |
### Database Configuration
Provider credentials are stored encrypted in `LighthouseProviderConfiguration`:
- **OpenAI**: `{"api_key": "sk-..."}`
- **Bedrock**: `{"access_key_id": "...", "secret_access_key": "...", "region": "us-east-1"}` or `{"api_key": "...", "region": "us-east-1"}`
- **OpenAI-compatible**: `{"api_key": "..."}` with `base_url` field
### Tenant Configuration
Business context and default settings are stored in `LighthouseTenantConfiguration`:
```python
{
    "business_context": "Optional organization context for personalized responses",
    "default_provider": "openai",
    "default_models": {
        "openai": "gpt-4o",
        "bedrock": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    },
}
```
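A hypothetical helper showing how a request's provider and model might fall back to these tenant defaults. This resolver is illustrative only, not actual Prowler code:

```typescript
// Shape mirrors the tenant configuration example above.
type TenantConfig = {
  default_provider: string;
  default_models: Record<string, string>;
};

// Resolve the provider/model pair for a request, falling back to defaults.
export function resolveModel(
  config: TenantConfig,
  provider?: string,
  model?: string,
): { provider: string; model: string } {
  const p = provider ?? config.default_provider;
  const m = model ?? config.default_models[p];
  if (!m) throw new Error(`No default model configured for provider: ${p}`);
  return { provider: p, model: m };
}
```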
## Related Documentation
<CardGroup cols={2}>
<Card title="MCP Server Extension" icon="wrench" href="/developer-guide/mcp-server">
Adding new tools to the Prowler MCP Server
</Card>
<Card title="Lighthouse AI Overview" icon="robot" href="/getting-started/products/prowler-lighthouse-ai">
Capabilities, FAQs, and limitations
</Card>
<Card title="Multi-LLM Setup" icon="sliders" href="/user-guide/tutorials/prowler-app-lighthouse-multi-llm">
Configuring multiple LLM providers
</Card>
<Card title="How Lighthouse Works" icon="gear" href="/user-guide/tutorials/prowler-app-lighthouse">
User-facing architecture and setup guide
</Card>
</CardGroup>
@@ -1,140 +0,0 @@
---
title: 'Extending Prowler Lighthouse AI'
---
This guide helps developers customize and extend Prowler Lighthouse AI by adding or modifying AI agents.
## Understanding AI Agents
AI agents combine Large Language Models (LLMs) with specialized tools that provide environmental context. These tools can include API calls, system command execution, or any function-wrapped capability.
### Types of AI Agents
AI agents fall into two main categories:
- **Autonomous Agents**: Freely choose from available tools to complete tasks, adapting their approach based on context. They decide which tools to use and when.
- **Workflow Agents**: Follow structured paths with predefined logic. They execute specific tool sequences and can include conditional logic.
Prowler Lighthouse AI is an autonomous agent, selecting the right tool(s) based on the user's query.
<Note>
To learn more about AI agents, read [Anthropic's blog post on building effective agents](https://www.anthropic.com/engineering/building-effective-agents).
</Note>
### LLM Dependency
The autonomous nature of agents depends on the underlying LLM. Autonomous agents using identical system prompts and tools but powered by different LLM providers may approach user queries differently. An agent with one LLM might solve a problem efficiently, while with another it might take a different route or fail entirely.
After evaluating multiple LLM providers (OpenAI, Gemini, Claude, Llama) based on tool-calling features and response accuracy, we recommend using the `gpt-4o` model.
## Prowler Lighthouse AI Architecture
Prowler Lighthouse AI uses a multi-agent architecture orchestrated by the [Langgraph-Supervisor](https://www.npmjs.com/package/@langchain/langgraph-supervisor) library.
### Architecture Components
<img src="/images/prowler-app/lighthouse-architecture.png" alt="Prowler Lighthouse architecture" />
Prowler Lighthouse AI integrates with the NextJS application:
- The [Langgraph-Supervisor](https://www.npmjs.com/package/@langchain/langgraph-supervisor) library integrates directly with NextJS
- The system uses the authenticated user session to interact with the Prowler API server
- Agents only access data the current user is authorized to view
- Session management operates automatically, ensuring Role-Based Access Control (RBAC) is maintained
## Available Prowler AI Agents
The following specialized AI agents are available in Prowler:
### Agent Overview
- **provider_agent**: Fetches information about cloud providers connected to Prowler
- **user_info_agent**: Retrieves information about Prowler users
- **scans_agent**: Fetches information about Prowler scans
- **compliance_agent**: Retrieves compliance overviews across scans
- **findings_agent**: Fetches information about individual findings across scans
- **overview_agent**: Retrieves overview information (providers, findings by status and severity, etc.)
## How to Add New Capabilities
### Updating the Supervisor Prompt
The supervisor agent controls system behavior, tone, and capabilities. You can find the supervisor prompt at: [https://github.com/prowler-cloud/prowler/blob/master/ui/lib/lighthouse/prompts.ts](https://github.com/prowler-cloud/prowler/blob/master/ui/lib/lighthouse/prompts.ts)
#### Supervisor Prompt Modifications
Modifying the supervisor prompt allows you to:
- Change personality or response style
- Add new high-level capabilities
- Modify task delegation to specialized agents
- Set up guardrails (query types to answer or decline)
<Note>
The supervisor agent should not have its own tools. This design keeps the system modular and maintainable.
</Note>
### How to Create New Specialized Agents
The supervisor agent and all specialized agents are defined in the `route.ts` file. The supervisor agent uses [langgraph-supervisor](https://www.npmjs.com/package/@langchain/langgraph-supervisor), while other agents use the prebuilt [create-react-agent](https://langchain-ai.github.io/langgraphjs/how-tos/create-react-agent/).
To add new capabilities or to allow Lighthouse AI to interact with other APIs, create additional specialized agents:
1. First determine what the new agent would do. Create a detailed prompt defining the agent's purpose and capabilities. You can see an example from [here](https://github.com/prowler-cloud/prowler/blob/master/ui/lib/lighthouse/prompts.ts#L359-L385).
<Note>
Ensure that the new agent's capabilities don't collide with existing agents. For example, if there's already a *findings_agent* that talks to findings APIs don't create a new agent to do the same.
</Note>
2. Create the tools the agents need to access specific data or perform actions. A tool is a specialized function that extends the capabilities of the LLM by allowing it to access external data or APIs. The LLM triggers a tool based on the tool's description and the user's query.
For example, the description of `getScanTool` is "Fetches detailed information about a specific scan by its ID." If the description doesn't convey what the tool is capable of doing, the LLM will not invoke the function. If the description of `getScanTool` were set to something random, or not set at all, the LLM would not answer queries like "Give me the critical issues from the scan ID xxxxxxxxxxxxxxx".
<Note>
Ensure that one tool is added to one agent only. Adding tools is optional. There can be agents with no tools at all.
</Note>
3. Use the `createReactAgent` function to define a new agent. For example, the `rolesAgent` below is named "roles_agent" and can call the *getRolesTool* and *getRoleTool* tools:
```js
const rolesAgent = createReactAgent({
  llm: llm,
  tools: [getRolesTool, getRoleTool],
  name: "roles_agent",
  prompt: rolesAgentPrompt,
});
```
4. Create a detailed prompt defining the agent's purpose and capabilities.
5. Add the new agent to the available agents list:
```js
const agents = [
  userInfoAgent,
  providerAgent,
  overviewAgent,
  scansAgent,
  complianceAgent,
  findingsAgent,
  rolesAgent, // New agent added here
];

// Create supervisor workflow
const workflow = createSupervisor({
  agents: agents,
  llm: supervisorllm,
  prompt: supervisorPrompt,
  outputMode: "last_message",
});
```
6. Update the supervisor's system prompt to summarize the new agent's capabilities.
### Best Practices for Agent Development
When developing new agents or capabilities:
- **Clear Responsibility Boundaries**: Each agent should have a defined purpose with minimal overlap. No two agents should share the same tools, and no two tools should access the same Prowler APIs.
- **Minimal Data Access**: Agents should only request the data they need, keeping requests specific to minimize context window usage, cost, and response time.
- **Thorough Prompting:** Ensure agent prompts include clear instructions about:
- The agent's purpose and limitations
- How to use its tools
- How to format responses for the supervisor
- Error handling procedures (Optional)
- **Security Considerations:** Agents should never modify data or access sensitive information like secrets or credentials.
- **Testing:** Thoroughly test new agents with various queries before deploying to production.
@@ -284,8 +284,9 @@
"developer-guide/outputs",
"developer-guide/integrations",
"developer-guide/security-compliance-framework",
"developer-guide/lighthouse",
"developer-guide/mcp-server"
"developer-guide/lighthouse-architecture",
"developer-guide/mcp-server",
"developer-guide/ai-skills"
]
},
{
@@ -59,6 +59,14 @@ Prowler Lighthouse AI is powerful, but there are limitations:
- **NextJS session dependence**: If your Prowler application session expires or you log out, Lighthouse AI will error out. Refresh and log back in to continue.
- **Response quality**: The response quality depends on the selected LLM provider and model. Choose models with strong tool-calling capabilities for best results. We recommend the `gpt-5` model from OpenAI.
## Extending Lighthouse AI
Lighthouse AI retrieves data through Prowler MCP. To add new capabilities, extend the Prowler MCP Server with additional tools; Lighthouse AI discovers them automatically.
For development details, see:
- [Lighthouse AI Architecture](/developer-guide/lighthouse-architecture) - Internal architecture and extension points
- [Extending the MCP Server](/developer-guide/mcp-server) - Adding new tools to Prowler MCP
### Getting Help
If you encounter issues with Prowler Lighthouse AI or have suggestions for improvements, please [reach out through our Slack channel](https://goto.prowler.com/slack).
@@ -67,94 +75,6 @@ If you encounter issues with Prowler Lighthouse AI or have suggestions for impro
The following API endpoints are accessible to Prowler Lighthouse AI. Data from these endpoints may be shared with the LLM provider, depending on the scope of the user's query:
#### Accessible API Endpoints
**User Management:**
- List all users - `/api/v1/users`
- Retrieve the current user's information - `/api/v1/users/me`
**Provider Management:**
- List all providers - `/api/v1/providers`
- Retrieve data from a provider - `/api/v1/providers/{id}`
**Scan Management:**
- List all scans - `/api/v1/scans`
- Retrieve data from a specific scan - `/api/v1/scans/{id}`
**Resource Management:**
- List all resources - `/api/v1/resources`
- Retrieve data for a resource - `/api/v1/resources/{id}`
**Findings Management:**
- List all findings - `/api/v1/findings`
- Retrieve data from a specific finding - `/api/v1/findings/{id}`
- Retrieve metadata values from findings - `/api/v1/findings/metadata`
**Overview Data:**
- Get aggregated findings data - `/api/v1/overviews/findings`
- Get findings data by severity - `/api/v1/overviews/findings_severity`
- Get aggregated provider data - `/api/v1/overviews/providers`
- Get findings data by service - `/api/v1/overviews/services`
**Compliance Management:**
- List compliance overviews (optionally filter by scan) - `/api/v1/compliance-overviews`
- Retrieve data from a specific compliance overview - `/api/v1/compliance-overviews/{id}`
#### Excluded API Endpoints
Not all Prowler API endpoints are integrated with Lighthouse AI. Some are intentionally excluded for the following reasons:
- OpenAI and other LLM providers shouldn't have access to sensitive data (such as provider secrets and other sensitive configuration)
- User queries don't need responses from those API endpoints (e.g., tasks, tenant details, downloading zip files)
**Excluded Endpoints:**
**User Management:**
- List specific users information - `/api/v1/users/{id}`
- List user memberships - `/api/v1/users/{user_pk}/memberships`
- Retrieve membership data from the user - `/api/v1/users/{user_pk}/memberships/{id}`
**Tenant Management:**
- List all tenants - `/api/v1/tenants`
- Retrieve data from a tenant - `/api/v1/tenants/{id}`
- List tenant memberships - `/api/v1/tenants/{tenant_pk}/memberships`
- List all invitations - `/api/v1/tenants/invitations`
- Retrieve data from tenant invitation - `/api/v1/tenants/invitations/{id}`
**Security and Configuration:**
- List all secrets - `/api/v1/providers/secrets`
- Retrieve data from a secret - `/api/v1/providers/secrets/{id}`
- List all provider groups - `/api/v1/provider-groups`
- Retrieve data from a provider group - `/api/v1/provider-groups/{id}`
**Reports and Tasks:**
- Download zip report - `/api/v1/scans/{v1}/report`
- List all tasks - `/api/v1/tasks`
- Retrieve data from a specific task - `/api/v1/tasks/{id}`
**Lighthouse AI Configuration:**
- List LLM providers - `/api/v1/lighthouse/providers`
- Retrieve LLM provider - `/api/v1/lighthouse/providers/{id}`
- List available models - `/api/v1/lighthouse/models`
- Retrieve tenant configuration - `/api/v1/lighthouse/configuration`
<Note>
Agents can only call GET endpoints. They don't have access to other HTTP methods.
</Note>
## FAQs
**1. Which LLM providers are supported?**
@@ -167,13 +87,21 @@ Lighthouse AI supports three providers:
For detailed configuration instructions, see [Using Multiple LLM Providers with Lighthouse](/user-guide/tutorials/prowler-app-lighthouse-multi-llm).
**2. Why a multi-agent supervisor model?**
**2. Why do some models not appear in Lighthouse AI?**
Context windows are limited. While demo data fits inside the context window, querying real-world data often exceeds it. A multi-agent architecture is used so different agents fetch different sizes of data and respond with the minimum required data to the supervisor. This spreads the context window usage across agents.
LLM providers offer different types of models. Not every model can be integrated with Lighthouse AI (for example, text-to-speech, vision, embedding, or computer-use models).
Lighthouse AI requires models that support:
- Text input
- Text output
- Tool calling
Lighthouse AI [automatically filters](https://github.com/prowler-cloud/prowler/blob/master/api/src/backend/tasks/jobs/lighthouse_providers.py#L341-L353) out models that do not support these capabilities, so some provider models may not appear in the Lighthouse AI model list.
**3. Is my security data shared with LLM providers?**
Minimal data is shared to generate useful responses. Agents can access security findings and remediation details when needed. Provider secrets are protected by design and cannot be read. The LLM provider credentials configured with Lighthouse AI are only accessible to our NextJS server and are never sent to the LLM providers. Resource metadata (names, tags, account/project IDs, etc) may be shared with the configured LLM provider based on query requirements.
Minimal data is shared to generate useful responses. Agents can access security findings and remediation details when needed. Provider secrets are protected by design and cannot be read. The LLM provider credentials configured with Lighthouse AI are only accessible to the Next.js server and are never sent to the LLM providers. Resource metadata (names, tags, account/project IDs, etc.) may be shared with the configured LLM provider based on query requirements.
**4. Can the Lighthouse AI change my cloud environment?**
@@ -23,19 +23,19 @@
The supported providers right now are:
| Provider | Support | Interface |
| -------------------------------------------------------------------------------- | ---------- | ------------ |
| [AWS](/user-guide/providers/aws/getting-started-aws) | Official | UI, API, CLI |
| [Azure](/user-guide/providers/azure/getting-started-azure) | Official | UI, API, CLI |
| [Google Cloud](/user-guide/providers/gcp/getting-started-gcp) | Official | UI, API, CLI |
| [Kubernetes](/user-guide/providers/kubernetes/getting-started-k8s) | Official | UI, API, CLI |
| [M365](/user-guide/providers/microsoft365/getting-started-m365) | Official | UI, API, CLI |
| [Github](/user-guide/providers/github/getting-started-github) | Official | UI, API, CLI |
| [Oracle Cloud](/user-guide/providers/oci/getting-started-oci) | Official | UI, API, CLI |
| [Infra as Code](/user-guide/providers/iac/getting-started-iac) | Official | UI, API, CLI |
| [MongoDB Atlas](/user-guide/providers/mongodbatlas/getting-started-mongodbatlas) | Official | UI, API, CLI |
| [LLM](/user-guide/providers/llm/getting-started-llm) | Official | CLI |
| **NHN** | Unofficial | CLI |
| Provider | Support | Audit Scope/Entities | Interface |
| -------------------------------------------------------------------------------- | ---------- | ---------------------------- | ------------ |
| [AWS](/user-guide/providers/aws/getting-started-aws) | Official | Accounts | UI, API, CLI |
| [Azure](/user-guide/providers/azure/getting-started-azure) | Official | Subscriptions | UI, API, CLI |
| [Google Cloud](/user-guide/providers/gcp/getting-started-gcp) | Official | Projects | UI, API, CLI |
| [Kubernetes](/user-guide/providers/kubernetes/getting-started-k8s) | Official | Clusters | UI, API, CLI |
| [M365](/user-guide/providers/microsoft365/getting-started-m365) | Official | Tenants | UI, API, CLI |
| [Github](/user-guide/providers/github/getting-started-github) | Official | Organizations / Repositories | UI, API, CLI |
| [Oracle Cloud](/user-guide/providers/oci/getting-started-oci) | Official | Tenancies / Compartments | UI, API, CLI |
| [Infra as Code](/user-guide/providers/iac/getting-started-iac) | Official | Repositories | UI, API, CLI |
| [MongoDB Atlas](/user-guide/providers/mongodbatlas/getting-started-mongodbatlas) | Official | Organizations | UI, API, CLI |
| [LLM](/user-guide/providers/llm/getting-started-llm) | Official | Models | CLI |
| **NHN** | Unofficial | Tenants | CLI |
For more information about the checks and compliance frameworks of each provider, visit [Prowler Hub](https://hub.prowler.com).
@@ -97,6 +97,7 @@ The following list includes all the GCP checks with configurable variables that
| Check Name | Value | Type |
|---------------------------------------------------------------|--------------------------------------------------|-----------------|
| `compute_configuration_changes` | `compute_audit_log_lookback_days` | Integer |
| `compute_instance_group_multiple_zones` | `mig_min_zones` | Integer |
## Kubernetes
@@ -553,6 +554,9 @@ gcp:
# GCP Compute Configuration
# gcp.compute_public_address_shodan
shodan_api_key: null
# gcp.compute_configuration_changes
# Number of days to look back for Compute Engine configuration changes in audit logs
compute_audit_log_lookback_days: 1
# gcp.compute_instance_group_multiple_zones
# Minimum number of zones a MIG should span for high availability
mig_min_zones: 2
@@ -246,12 +246,223 @@ prowler azure --az-cli-auth
*Available only for Prowler CLI*
Authenticate via Azure Managed Identity when running Prowler on Azure resources (VMs, Container Instances, Azure Functions, etc.):
```console
prowler azure --managed-identity-auth
```
### Prerequisites
Before using Managed Identity authentication, the following steps are required:
1. **Enable Managed Identity** on the Azure resource (e.g., VM, Container Instance)
2. **Assign the required permissions** to the Managed Identity on the target subscription(s) to scan
<Warning>
A common misconception is that enabling a Managed Identity on a resource automatically grants it permissions. **This is not the case.** Without explicit role assignments, Prowler will be unable to scan subscriptions and will return authorization errors, resulting in incomplete security assessments. The Managed Identity itself is a service principal that must be explicitly granted Reader and ProwlerRole permissions on each subscription to scan.
</Warning>
### Step-by-Step Setup Guide
#### Step 1: Enable Managed Identity on the Azure Resource
<Tabs>
<Tab title="Azure VM">
**Via Azure Portal:**
1. Navigate to the VM in Azure Portal
2. Select "Identity" from the left menu under "Security"
3. Under "System assigned" tab, set Status to "On"
4. Click "Save"
5. Note the "Object (principal) ID" - this value is required for permission assignment
**Via Azure CLI:**
```console
# Enable system-assigned managed identity
az vm identity assign --name <vm-name> --resource-group <resource-group>
# Get the principal ID
az vm identity show --name <vm-name> --resource-group <resource-group> --query principalId -o tsv
```
</Tab>
<Tab title="Azure Container Instance">
**Via Azure CLI:**
```console
# Enable system-assigned managed identity
az container create \
  --resource-group <resource-group> \
  --name <container-name> \
  --image <image> \
  --assign-identity
# Get the principal ID
az container show --resource-group <resource-group> --name <container-name> --query identity.principalId -o tsv
```
</Tab>
</Tabs>
#### Step 2: Assign Reader Role to the Managed Identity
The Managed Identity needs the **Reader** role on each subscription to scan. This role must be assigned to the **Managed Identity's principal ID**, not the VM or resource itself.
<Tabs>
<Tab title="Azure Portal">
1. Navigate to the **target subscription** to scan (not the VM's resource group)
2. Select "Access control (IAM)" from the left menu
3. Click "+ Add" > "Add role assignment"
4. Select "Reader" role, click "Next"
5. Click "+ Select members"
6. Search for the VM name or paste the Managed Identity's Object/Principal ID
7. Select it and click "Select"
8. Click "Review + assign"
<Note>
When scanning a subscription different from where the VM is located, ensure the role is assigned on the **target subscription**, not the VM's subscription.
</Note>
</Tab>
<Tab title="Azure CLI">
```console
# Get the principal ID of the resource's managed identity
PRINCIPAL_ID=$(az vm identity show --name <vm-name> --resource-group <resource-group> --query principalId -o tsv)
# Assign Reader role on the target subscription
az role assignment create \
  --role "Reader" \
  --assignee-object-id $PRINCIPAL_ID \
  --assignee-principal-type ServicePrincipal \
  --scope /subscriptions/<target-subscription-id>
```
</Tab>
</Tabs>
#### Step 3: Create and Assign ProwlerRole to the Managed Identity
The ProwlerRole is a custom role required for specific security checks. First, create the role if it does not exist, then assign it to the Managed Identity.
<Tabs>
<Tab title="Azure CLI">
**Create the ProwlerRole:**
```console
az role definition create --role-definition '{
  "Name": "ProwlerRole",
  "IsCustom": true,
  "Description": "Role used for checks that require read-only access to Azure resources and are not covered by the Reader role.",
  "AssignableScopes": ["/subscriptions/<target-subscription-id>"],
  "Actions": [
    "Microsoft.Web/sites/host/listkeys/action",
    "Microsoft.Web/sites/config/list/Action"
  ]
}'
```
**Assign ProwlerRole to the Managed Identity:**
```console
# Get the principal ID if not already available
PRINCIPAL_ID=$(az vm identity show --name <vm-name> --resource-group <resource-group> --query principalId -o tsv)
# Assign ProwlerRole on the target subscription
az role assignment create \
  --role "ProwlerRole" \
  --assignee-object-id $PRINCIPAL_ID \
  --assignee-principal-type ServicePrincipal \
  --scope /subscriptions/<target-subscription-id>
```
</Tab>
<Tab title="Azure Portal">
Follow the same process as creating the ProwlerRole in the [Assigning ProwlerRole Permissions](/user-guide/providers/azure/authentication#assigning-prowlerrole-permissions-at-the-subscription-level) section, then assign it to the Managed Identity using the same steps as the Reader role assignment.
</Tab>
</Tabs>
#### Step 4: (Optional) Assign Microsoft Graph Permissions
For Entra ID (Azure AD) checks, the Managed Identity needs Microsoft Graph API permissions: `Directory.Read.All`, `Policy.Read.All`, and optionally `UserAuthenticationMethod.Read.All`.
<Note>
Assigning Microsoft Graph API permissions to a Managed Identity requires Azure CLI or PowerShell - it cannot be done through the Azure Portal's standard role assignment interface.
</Note>
```console
# Get the Managed Identity's principal ID
PRINCIPAL_ID=$(az vm identity show --name <vm-name> --resource-group <resource-group> --query principalId -o tsv)
# Get Microsoft Graph's service principal ID
GRAPH_SP_ID=$(az ad sp list --display-name "Microsoft Graph" --query [0].id -o tsv)
# Assign Directory.Read.All permission (App Role ID: 7ab1d382-f21e-4acd-a863-ba3e13f7da61)
az rest --method POST \
  --uri "https://graph.microsoft.com/v1.0/servicePrincipals/$PRINCIPAL_ID/appRoleAssignments" \
  --headers "Content-Type=application/json" \
  --body "{\"principalId\": \"$PRINCIPAL_ID\", \"resourceId\": \"$GRAPH_SP_ID\", \"appRoleId\": \"7ab1d382-f21e-4acd-a863-ba3e13f7da61\"}"

# Assign Policy.Read.All permission (App Role ID: 246dd0d5-5bd0-4def-940b-0421030a5b68)
az rest --method POST \
  --uri "https://graph.microsoft.com/v1.0/servicePrincipals/$PRINCIPAL_ID/appRoleAssignments" \
  --headers "Content-Type=application/json" \
  --body "{\"principalId\": \"$PRINCIPAL_ID\", \"resourceId\": \"$GRAPH_SP_ID\", \"appRoleId\": \"246dd0d5-5bd0-4def-940b-0421030a5b68\"}"
```
#### Step 5: Run Prowler
SSH or connect to the Azure resource and run Prowler:
```console
# Scan all accessible subscriptions
prowler azure --managed-identity-auth
# Scan specific subscription(s)
prowler azure --managed-identity-auth --subscription-ids <subscription-id>
```
<Note>
Wait a few minutes after assigning roles for Azure to propagate permissions. Role assignments are not always immediately effective.
</Note>
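When Prowler is launched by automation immediately after the role assignments, the propagation delay can be absorbed with a simple polling loop. A minimal sketch; the timings and the propagation check are illustrative placeholders, not an official Prowler or Azure recommendation:

```python
import time

def wait_until(check, timeout=300, interval=15):
    """Poll `check()` until it returns True or `timeout` seconds elapse.

    Returns True as soon as the check passes, False on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False

# Example: treat the role assignment as propagated once a (hypothetical)
# lookup starts returning a non-empty list of assignments.
attempts = iter([[], [], [{"role": "Reader"}]])

def assignment_visible():
    return bool(next(attempts))

print(wait_until(assignment_visible, timeout=60, interval=0))  # prints: True
```

In practice the check would shell out to `az role assignment list --assignee <principal-id>` and test for a non-empty result before starting the scan.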
### Troubleshooting
#### Error: "No subscriptions were found, please check your permission assignments"
**Cause:** The Managed Identity does not have the Reader role assigned on any subscription.
**Solution:**
- Verify the Managed Identity has the Reader role assigned on at least one subscription.
- Wait a few minutes after role assignment for Azure to propagate permissions.
- Verify role assignments:
```console
az role assignment list --assignee <principal-id> --all
```
#### Error: "does not have authorization to perform action 'Microsoft.Resources/subscriptions/read'"
**Cause:** The Managed Identity lacks the Reader role on the target subscription.
**Solution:**
- Ensure the Reader role is assigned to the **Managed Identity's principal ID**, not the VM resource.
- Verify the role is assigned on the **target subscription** to scan, not just the VM's resource group.
- Check role assignments:
```console
az role assignment list --assignee <principal-id> --scope /subscriptions/<subscription-id>
```
#### Error: "CredentialUnavailableError: ManagedIdentityCredential authentication unavailable"
**Cause:** Managed Identity is not enabled on the resource, or Prowler is running outside of Azure.
**Solution:**
- Verify Managed Identity is enabled on the Azure resource.
- Ensure Prowler is running from within the Azure resource (not a local machine).
- Check Managed Identity status:
```console
az vm identity show --name <vm-name> --resource-group <resource-group>
```
#### Error: Access token validation failure for Entra ID checks
**Cause:** The Managed Identity lacks Microsoft Graph API permissions.
**Solution:**
- Assign the required Graph API permissions as shown in Step 4.
- These permissions are optional for basic resource scanning but required for Entra ID security checks.
## Browser Authentication
*Available only for Prowler CLI*
@@ -164,7 +164,7 @@ prowler oci --profile PRODUCTION
Use a config file from a custom location:
```bash
prowler oci --oci-config-file /path/to/custom/config
```
### Setting Up API Keys
@@ -377,7 +377,7 @@ ls -la ~/.oci/config
mkdir -p ~/.oci
# Specify custom location
prowler oci --oci-config-file /path/to/config
```
#### Error: "InvalidKeyOrSignature"
@@ -122,7 +122,7 @@ prowler oci --profile production
##### Using a Custom Config File
```bash
prowler oci --oci-config-file /path/to/custom/config
```
#### Instance Principal Authentication
@@ -22,7 +22,7 @@ For Lighthouse AI to work properly, models **must** support all of the following
- **Text input**: Ability to receive text prompts.
- **Text output**: Ability to generate text responses.
- **Tool calling**: Ability to invoke tools and functions to retrieve data from Prowler.
If any of these capabilities are missing, the model will not be compatible with Lighthouse AI.
@@ -8,24 +8,33 @@ import { VersionBadge } from "/snippets/version-badge.mdx"
Prowler Lighthouse AI integrates Large Language Models (LLMs) with Prowler security findings data.
Behind the scenes, Lighthouse AI works as follows:
- Lighthouse AI runs as a [Langchain agent](https://docs.langchain.com/oss/javascript/langchain/agents) in NextJS
- The agent connects to the configured LLM provider to understand the prompt and decide what data is needed
- The agent accesses Prowler data through [Prowler MCP](https://docs.prowler.com/getting-started/products/prowler-mcp), which exposes tools from multiple sources, including:
- Prowler Hub
- Prowler Docs
- Prowler App
- Instead of calling every tool directly, the agent uses two meta-tools:
- `describe_tool` to retrieve a tool schema and parameter requirements.
- `execute_tool` to run the selected tool with the required input.
- The system uses a multi-agent architecture built with [LanggraphJS](https://github.com/langchain-ai/langgraphjs) for LLM logic and [Vercel AI SDK UI](https://sdk.vercel.ai/docs/ai-sdk-ui/overview) for the frontend chatbot.
- It uses a ["supervisor" architecture](https://langchain-ai.lang.chat/langgraphjs/tutorials/multi_agent/agent_supervisor/) that interacts with different agents for specialized tasks. For example, `findings_agent` can analyze detected security findings, while `overview_agent` provides a summary of connected cloud accounts.
- The system connects to the configured LLM provider to understand user's query, fetches the right data, and responds to the query.
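The describe-then-execute loop above can be pictured with a small simulation. This is only an illustrative sketch of the meta-tool contract; the tool name, schema, and data below are hypothetical, not the actual Lighthouse or Prowler MCP implementation:

```python
# Sample findings the simulated tool will filter over (hypothetical data).
FINDINGS = [
    {"id": "f-1", "severity": "critical"},
    {"id": "f-2", "severity": "low"},
]

# Registry mapping tool names to a parameter schema and a handler.
TOOL_REGISTRY = {
    "prowler_app_search_security_findings": {
        "schema": {"severity": "list[str]"},
        "handler": lambda severity: [f for f in FINDINGS if f["severity"] in severity],
    },
}

def describe_tool(name):
    """Return the parameter schema the agent needs to build a valid call."""
    return TOOL_REGISTRY[name]["schema"]

def execute_tool(name, **kwargs):
    """Run the selected tool with the arguments the agent assembled."""
    return TOOL_REGISTRY[name]["handler"](**kwargs)

schema = describe_tool("prowler_app_search_security_findings")
result = execute_tool("prowler_app_search_security_findings", severity=["critical"])
print(schema)   # {'severity': 'list[str]'}
print(result)   # [{'id': 'f-1', 'severity': 'critical'}]
```

The two meta-tools keep the LLM's context small: only the schemas it actually asks about are loaded, instead of every tool definition up front.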
<Note>
Lighthouse AI supports multiple LLM providers including OpenAI, Amazon Bedrock, and OpenAI-compatible services. For configuration details, see [Using Multiple LLM Providers with Lighthouse](/user-guide/tutorials/prowler-app-lighthouse-multi-llm).
</Note>
- The supervisor agent is the main contact point. It is what users interact with directly from the chat interface. It coordinates with other agents to answer users' questions comprehensively.
<img className="block dark:hidden" src="/images/lighthouse-architecture-light.png" alt="Prowler Lighthouse Architecture" />
<img className="hidden dark:block" src="/images/lighthouse-architecture-dark.png" alt="Prowler Lighthouse Architecture" />
<Note>
Lighthouse AI can only read relevant security data. It cannot modify data or access sensitive information such as configured secrets or tenant details.
</Note>
## Set Up
Getting started with Prowler Lighthouse AI is easy:
@@ -43,11 +52,11 @@ For detailed configuration instructions for each provider, see [Using Multiple L
### Adding Business Context
The optional business context field lets teams provide additional information to help Lighthouse AI understand environment priorities, including:
- Organization cloud security goals
- Information about account owners or responsible teams
- Compliance requirements
- Current security initiatives or focus areas
Better context leads to more relevant responses and prioritization that aligns with your needs.
@@ -1,310 +1,88 @@
# Prowler MCP Server - AI Agent Ruleset
**Complete guide for AI agents and developers working on the Prowler MCP Server - the Model Context Protocol server that provides AI agents access to the Prowler ecosystem.**
> **Skills Reference**: For detailed patterns, use the [`prowler-mcp`](../skills/prowler-mcp/SKILL.md) skill.
## Project Overview
The Prowler MCP Server brings the entire Prowler ecosystem to AI assistants through
the Model Context Protocol (MCP). It enables seamless integration with AI tools
like Claude Desktop, Cursor, and other MCP hosts, allowing interaction with
Prowler's security capabilities through natural language.
---
## Critical Rules
### Tool Implementation
- **ALWAYS**: Extend `BaseTool` ABC for new Prowler App tools (auto-registration)
- **ALWAYS**: Use `@mcp.tool()` decorator for Hub/Docs tools (manual registration)
- **NEVER**: Manually register BaseTool subclasses (auto-discovered via `load_all_tools()`)
- **NEVER**: Import tools directly in server.py (tool_loader handles discovery)
### Models
- **ALWAYS**: Use `MinimalSerializerMixin` for LLM-optimized responses
- **ALWAYS**: Implement `from_api_response()` factory method for API transformations
- **ALWAYS**: Use two-tier models (Simplified for lists, Detailed for single items)
- **NEVER**: Return raw API responses (transform to simplified models)
### API Client
- **ALWAYS**: Use singleton `ProwlerAPIClient` via `self.api_client` in tools
- **ALWAYS**: Use `build_filter_params()` for query parameter normalization
- **NEVER**: Create new httpx clients in tools (use shared client)
---
## Architecture
### Three Sub-Servers Pattern
The main server (`server.py`) orchestrates three independent sub-servers with prefixed tool namespacing:
```python
# server.py imports sub-servers with prefixes
await prowler_mcp_server.import_server(hub_mcp_server, prefix="prowler_hub")
await prowler_mcp_server.import_server(app_mcp_server, prefix="prowler_app")
await prowler_mcp_server.import_server(docs_mcp_server, prefix="prowler_docs")
```
This pattern ensures:
- Failures in one sub-server do not block others
- Clear tool namespacing for LLM disambiguation
- Independent development and testing
### Tool Naming Convention
All tools follow a consistent naming pattern with prefixes:
- `prowler_hub_*` - Prowler Hub catalog and compliance tools (no auth required)
- `prowler_docs_*` - Prowler documentation search and retrieval (no auth required)
- `prowler_app_*` - Prowler Cloud and App (Self-Managed) management tools (auth required)
### Tool Registration Patterns
**Pattern 1: Prowler Hub/Docs (Direct Decorators)**
```python
# prowler_hub/server.py or prowler_documentation/server.py
hub_mcp_server = FastMCP("prowler-hub")

@hub_mcp_server.tool()
async def get_checks(providers: str | None = None) -> dict:
    """Tool docstring becomes LLM description."""
    # Direct implementation
    params = {"providers": providers} if providers else None
    response = prowler_hub_client.get("/check", params=params)
    return response.json()
```
**Pattern 2: Prowler App (BaseTool Auto-Registration)**
```python
# prowler_app/tools/findings.py
class FindingsTools(BaseTool):
    async def search_security_findings(
        self,
        severity: list[str] = Field(default=[], description="Filter by severity")
    ) -> dict:
        """Docstring becomes LLM description."""
        response = await self.api_client.get("/api/v1/findings")
        return SimplifiedFinding.from_api_response(response).model_dump()
```
NOTE: Only public methods of `BaseTool` subclasses are registered as tools.
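The auto-discovery behind this note can be approximated as follows. A simplified sketch of what a tool loader might do, assuming only the public-method rule above; it is not the actual `load_all_tools()` code:

```python
import inspect

class BaseTool:
    """Stand-in base class; public async methods become tools."""

registered = {}

def load_all_tools():
    """Register every public coroutine method of each BaseTool subclass."""
    for tool_class in BaseTool.__subclasses__():
        instance = tool_class()
        for name, member in inspect.getmembers(instance, inspect.iscoroutinefunction):
            if not name.startswith("_"):
                registered[name] = member

class FindingsTools(BaseTool):
    async def search_security_findings(self) -> dict:
        return {"findings": []}

    async def _internal_helper(self) -> None:  # private: not registered
        pass

load_all_tools()
print(sorted(registered))  # ['search_security_findings']
```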
---
## Tech Stack
- **Language**: Python 3.12+
- **MCP Framework**: FastMCP 2.13.1
- **HTTP Client**: httpx (async)
- **Validation**: Pydantic with MinimalSerializerMixin
- **Package Manager**: uv
---
## Project Structure
```
mcp_server/
├── README.md                    # User documentation
├── AGENTS.md                    # This file - AI agent guidelines
├── CHANGELOG.md                 # Version history
├── pyproject.toml               # Project metadata and dependencies
├── Dockerfile                   # Container image definition
├── entrypoint.sh                # Docker entrypoint script
└── prowler_mcp_server/
    ├── __init__.py              # Version info
    ├── main.py                  # CLI entry point
    ├── server.py                # Main FastMCP server orchestration
    ├── lib/
    │   └── logger.py            # Structured logging
    ├── prowler_hub/
    │   └── server.py            # Hub tools (10 tools, no auth)
    ├── prowler_app/
    │   ├── server.py            # App server initialization
    │   ├── tools/
    │   │   ├── base.py          # BaseTool abstract class
    │   │   ├── findings.py      # Findings tools
    │   │   ├── providers.py     # Provider tools
    │   │   ├── scans.py         # Scan tools
    │   │   ├── resources.py     # Resource tools
    │   │   └── muting.py        # Muting tools
    │   ├── models/
    │   │   ├── base.py          # MinimalSerializerMixin
    │   │   ├── findings.py      # Finding models
    │   │   ├── providers.py     # Provider models
    │   │   ├── scans.py         # Scan models
    │   │   ├── resources.py     # Resource models
    │   │   └── muting.py        # Muting models
    │   └── utils/
    │       ├── api_client.py    # ProwlerAPIClient singleton
    │       ├── auth.py          # ProwlerAppAuth (STDIO/HTTP)
    │       └── tool_loader.py   # Auto-discovery and registration
    └── prowler_documentation/
        ├── server.py            # Documentation tools (2 tools, no auth)
        └── search_engine.py     # Mintlify API integration
```
---
## Commands
NOTE: To run a Python command, always use `uv run <command>` from within the `mcp_server/` directory.
### Development
```bash
# Navigate to MCP server directory
cd mcp_server
# Run in STDIO mode (default)
uv run prowler-mcp
# Run in HTTP mode
uv run prowler-mcp --transport http --host 0.0.0.0 --port 8000
# Run from anywhere using uvx
uvx /path/to/prowler/mcp_server/
```
---
## Development Patterns
### Adding New Tools to Prowler App
1. **Create or extend a tool class** in `prowler_app/tools/`:
```python
# prowler_app/tools/new_feature.py
from pydantic import Field

from prowler_mcp_server.prowler_app.tools.base import BaseTool
from prowler_mcp_server.prowler_app.models.new_feature import FeatureResponse


class NewFeatureTools(BaseTool):
    async def list_features(
        self,
        status: str | None = Field(default=None, description="Filter by status")
    ) -> dict:
        """List all features with optional filtering.

        Returns a simplified list of features optimized for LLM consumption.
        """
        params = {}
        if status:
            params["filter[status]"] = status
        clean_params = self.api_client.build_filter_params(params)
        response = await self.api_client.get("/api/v1/features", params=clean_params)
        return FeatureResponse.from_api_response(response).model_dump()
```
2. **Create corresponding models** in `prowler_app/models/`:
```python
# prowler_app/models/new_feature.py
from prowler_mcp_server.prowler_app.models.base import MinimalSerializerMixin


class SimplifiedFeature(MinimalSerializerMixin):
    """Lightweight feature for list operations."""
    id: str
    name: str
    status: str


class DetailedFeature(SimplifiedFeature):
    """Extended feature with complete details."""
    description: str | None = None
    created_at: str
    updated_at: str

    @classmethod
    def from_api_response(cls, data: dict) -> "DetailedFeature":
        """Transform API response to model."""
        attributes = data.get("attributes", {})
        return cls(
            id=data["id"],
            name=attributes["name"],
            status=attributes["status"],
            description=attributes.get("description"),
            created_at=attributes["created_at"],
            updated_at=attributes["updated_at"],
        )
```
3. **No registration needed** - the tool loader auto-discovers BaseTool subclasses
### Adding Tools to Prowler Hub/Docs
Use the `@mcp.tool()` decorator directly:
```python
# prowler_hub/server.py
@hub_mcp_server.tool()
async def new_hub_tool(param: str) -> dict:
    """Tool description for LLM."""
    response = prowler_hub_client.get("/endpoint")
    return response.json()
```
---
## Code Quality Standards
### Tool Docstrings
Tool docstrings become AI agent descriptions. Write them in a clear, concise manner focusing on LLM-relevant behavior:
```python
async def search_security_findings(
    self,
    severity: list[str] = Field(default=[], description="Filter by severity levels")
) -> dict:
    """Search security findings with advanced filtering.

    Returns a lightweight list of findings optimized for LLM consumption.
    Use get_finding_details for complete information about a specific finding.
    """
```
### Model Design
- Use `MinimalSerializerMixin` to exclude None/empty values
- Implement `from_api_response()` for consistent API transformation
- Create two-tier models: Simplified (lists) and Detailed (single items)
### Error Handling
Return structured error responses rather than raising exceptions:
```python
try:
    response = await self.api_client.get(f"/api/v1/items/{item_id}")
    return DetailedItem.from_api_response(response["data"]).model_dump()
except Exception as e:
    self.logger.error(f"Failed to get item {item_id}: {e}")
    return {"error": str(e), "status": "failed"}
```
---
## QA Checklist Before Commit
- [ ] Tool docstrings are clear and describe LLM-relevant behavior
- [ ] Models use `MinimalSerializerMixin` for LLM optimization
- [ ] API responses are transformed to simplified models
- [ ] No hardcoded secrets or API keys
- [ ] Error handling returns structured responses
- [ ] New tools are auto-discovered (BaseTool subclass) or properly decorated
- [ ] Parameter descriptions use Pydantic `Field()` with clear descriptions
---
## References
- **Root Project Guide**: `../AGENTS.md`
- **FastMCP Documentation**: https://gofastmcp.com/llms.txt
- **Prowler API Documentation**: https://api.prowler.com/api/v1/docs
@@ -4,7 +4,7 @@ requires = ["setuptools>=61.0", "wheel"]
[project]
dependencies = [
"fastmcp==2.13.1",
"fastmcp==2.14.0",
"httpx>=0.28.0"
]
description = "MCP server for Prowler ecosystem"
@@ -38,98 +38,132 @@ files = [
[[package]]
name = "aiohttp"
version = "3.12.14"
version = "3.13.3"
description = "Async http client/server framework (asyncio)"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "aiohttp-3.12.14-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:906d5075b5ba0dd1c66fcaaf60eb09926a9fef3ca92d912d2a0bbdbecf8b1248"},
{file = "aiohttp-3.12.14-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c875bf6fc2fd1a572aba0e02ef4e7a63694778c5646cdbda346ee24e630d30fb"},
{file = "aiohttp-3.12.14-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fbb284d15c6a45fab030740049d03c0ecd60edad9cd23b211d7e11d3be8d56fd"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38e360381e02e1a05d36b223ecab7bc4a6e7b5ab15760022dc92589ee1d4238c"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:aaf90137b5e5d84a53632ad95ebee5c9e3e7468f0aab92ba3f608adcb914fa95"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e532a25e4a0a2685fa295a31acf65e027fbe2bea7a4b02cdfbbba8a064577663"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eab9762c4d1b08ae04a6c77474e6136da722e34fdc0e6d6eab5ee93ac29f35d1"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:abe53c3812b2899889a7fca763cdfaeee725f5be68ea89905e4275476ffd7e61"},
{file = "aiohttp-3.12.14-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5760909b7080aa2ec1d320baee90d03b21745573780a072b66ce633eb77a8656"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:02fcd3f69051467bbaa7f84d7ec3267478c7df18d68b2e28279116e29d18d4f3"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:4dcd1172cd6794884c33e504d3da3c35648b8be9bfa946942d353b939d5f1288"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:224d0da41355b942b43ad08101b1b41ce633a654128ee07e36d75133443adcda"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e387668724f4d734e865c1776d841ed75b300ee61059aca0b05bce67061dcacc"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:dec9cde5b5a24171e0b0a4ca064b1414950904053fb77c707efd876a2da525d8"},
{file = "aiohttp-3.12.14-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:bbad68a2af4877cc103cd94af9160e45676fc6f0c14abb88e6e092b945c2c8e3"},
{file = "aiohttp-3.12.14-cp310-cp310-win32.whl", hash = "sha256:ee580cb7c00bd857b3039ebca03c4448e84700dc1322f860cf7a500a6f62630c"},
{file = "aiohttp-3.12.14-cp310-cp310-win_amd64.whl", hash = "sha256:cf4f05b8cea571e2ccc3ca744e35ead24992d90a72ca2cf7ab7a2efbac6716db"},
{file = "aiohttp-3.12.14-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f4552ff7b18bcec18b60a90c6982049cdb9dac1dba48cf00b97934a06ce2e597"},
{file = "aiohttp-3.12.14-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8283f42181ff6ccbcf25acaae4e8ab2ff7e92b3ca4a4ced73b2c12d8cd971393"},
{file = "aiohttp-3.12.14-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:040afa180ea514495aaff7ad34ec3d27826eaa5d19812730fe9e529b04bb2179"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b413c12f14c1149f0ffd890f4141a7471ba4b41234fe4fd4a0ff82b1dc299dbb"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:1d6f607ce2e1a93315414e3d448b831238f1874b9968e1195b06efaa5c87e245"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:565e70d03e924333004ed101599902bba09ebb14843c8ea39d657f037115201b"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4699979560728b168d5ab63c668a093c9570af2c7a78ea24ca5212c6cdc2b641"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ad5fdf6af93ec6c99bf800eba3af9a43d8bfd66dce920ac905c817ef4a712afe"},
{file = "aiohttp-3.12.14-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4ac76627c0b7ee0e80e871bde0d376a057916cb008a8f3ffc889570a838f5cc7"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:798204af1180885651b77bf03adc903743a86a39c7392c472891649610844635"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:4f1205f97de92c37dd71cf2d5bcfb65fdaed3c255d246172cce729a8d849b4da"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:76ae6f1dd041f85065d9df77c6bc9c9703da9b5c018479d20262acc3df97d419"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:a194ace7bc43ce765338ca2dfb5661489317db216ea7ea700b0332878b392cab"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:16260e8e03744a6fe3fcb05259eeab8e08342c4c33decf96a9dad9f1187275d0"},
{file = "aiohttp-3.12.14-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8c779e5ebbf0e2e15334ea404fcce54009dc069210164a244d2eac8352a44b28"},
{file = "aiohttp-3.12.14-cp311-cp311-win32.whl", hash = "sha256:a289f50bf1bd5be227376c067927f78079a7bdeccf8daa6a9e65c38bae14324b"},
{file = "aiohttp-3.12.14-cp311-cp311-win_amd64.whl", hash = "sha256:0b8a69acaf06b17e9c54151a6c956339cf46db4ff72b3ac28516d0f7068f4ced"},
{file = "aiohttp-3.12.14-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:a0ecbb32fc3e69bc25efcda7d28d38e987d007096cbbeed04f14a6662d0eee22"},
{file = "aiohttp-3.12.14-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0400f0ca9bb3e0b02f6466421f253797f6384e9845820c8b05e976398ac1d81a"},
{file = "aiohttp-3.12.14-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a56809fed4c8a830b5cae18454b7464e1529dbf66f71c4772e3cfa9cbec0a1ff"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27f2e373276e4755691a963e5d11756d093e346119f0627c2d6518208483fb6d"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ca39e433630e9a16281125ef57ece6817afd1d54c9f1bf32e901f38f16035869"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9c748b3f8b14c77720132b2510a7d9907a03c20ba80f469e58d5dfd90c079a1c"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0a568abe1b15ce69d4cc37e23020720423f0728e3cb1f9bcd3f53420ec3bfe7"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9888e60c2c54eaf56704b17feb558c7ed6b7439bca1e07d4818ab878f2083660"},
{file = "aiohttp-3.12.14-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3006a1dc579b9156de01e7916d38c63dc1ea0679b14627a37edf6151bc530088"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:aa8ec5c15ab80e5501a26719eb48a55f3c567da45c6ea5bb78c52c036b2655c7"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:39b94e50959aa07844c7fe2206b9f75d63cc3ad1c648aaa755aa257f6f2498a9"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:04c11907492f416dad9885d503fbfc5dcb6768d90cad8639a771922d584609d3"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:88167bd9ab69bb46cee91bd9761db6dfd45b6e76a0438c7e884c3f8160ff21eb"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:791504763f25e8f9f251e4688195e8b455f8820274320204f7eafc467e609425"},
{file = "aiohttp-3.12.14-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2785b112346e435dd3a1a67f67713a3fe692d288542f1347ad255683f066d8e0"},
{file = "aiohttp-3.12.14-cp312-cp312-win32.whl", hash = "sha256:15f5f4792c9c999a31d8decf444e79fcfd98497bf98e94284bf390a7bb8c1729"},
{file = "aiohttp-3.12.14-cp312-cp312-win_amd64.whl", hash = "sha256:3b66e1a182879f579b105a80d5c4bd448b91a57e8933564bf41665064796a338"},
{file = "aiohttp-3.12.14-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:3143a7893d94dc82bc409f7308bc10d60285a3cd831a68faf1aa0836c5c3c767"},
{file = "aiohttp-3.12.14-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:3d62ac3d506cef54b355bd34c2a7c230eb693880001dfcda0bf88b38f5d7af7e"},
{file = "aiohttp-3.12.14-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:48e43e075c6a438937c4de48ec30fa8ad8e6dfef122a038847456bfe7b947b63"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:077b4488411a9724cecc436cbc8c133e0d61e694995b8de51aaf351c7578949d"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d8c35632575653f297dcbc9546305b2c1133391089ab925a6a3706dfa775ccab"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6b8ce87963f0035c6834b28f061df90cf525ff7c9b6283a8ac23acee6502afd4"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f0a2cf66e32a2563bb0766eb24eae7e9a269ac0dc48db0aae90b575dc9583026"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdea089caf6d5cde975084a884c72d901e36ef9c2fd972c9f51efbbc64e96fbd"},
{file = "aiohttp-3.12.14-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8a7865f27db67d49e81d463da64a59365ebd6b826e0e4847aa111056dcb9dc88"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0ab5b38a6a39781d77713ad930cb5e7feea6f253de656a5f9f281a8f5931b086"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:9b3b15acee5c17e8848d90a4ebc27853f37077ba6aec4d8cb4dbbea56d156933"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:e4c972b0bdaac167c1e53e16a16101b17c6d0ed7eac178e653a07b9f7fad7151"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:7442488b0039257a3bdbc55f7209587911f143fca11df9869578db6c26feeeb8"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:f68d3067eecb64c5e9bab4a26aa11bd676f4c70eea9ef6536b0a4e490639add3"},
{file = "aiohttp-3.12.14-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f88d3704c8b3d598a08ad17d06006cb1ca52a1182291f04979e305c8be6c9758"},
{file = "aiohttp-3.12.14-cp313-cp313-win32.whl", hash = "sha256:a3c99ab19c7bf375c4ae3debd91ca5d394b98b6089a03231d4c580ef3c2ae4c5"},
{file = "aiohttp-3.12.14-cp313-cp313-win_amd64.whl", hash = "sha256:3f8aad695e12edc9d571f878c62bedc91adf30c760c8632f09663e5f564f4baa"},
{file = "aiohttp-3.12.14-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b8cc6b05e94d837bcd71c6531e2344e1ff0fb87abe4ad78a9261d67ef5d83eae"},
{file = "aiohttp-3.12.14-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d1dcb015ac6a3b8facd3677597edd5ff39d11d937456702f0bb2b762e390a21b"},
{file = "aiohttp-3.12.14-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3779ed96105cd70ee5e85ca4f457adbce3d9ff33ec3d0ebcdf6c5727f26b21b3"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:717a0680729b4ebd7569c1dcd718c46b09b360745fd8eb12317abc74b14d14d0"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b5dd3a2ef7c7e968dbbac8f5574ebeac4d2b813b247e8cec28174a2ba3627170"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4710f77598c0092239bc12c1fcc278a444e16c7032d91babf5abbf7166463f7b"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f3e9f75ae842a6c22a195d4a127263dbf87cbab729829e0bd7857fb1672400b2"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f9c8d55d6802086edd188e3a7d85a77787e50d56ce3eb4757a3205fa4657922"},
{file = "aiohttp-3.12.14-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:79b29053ff3ad307880d94562cca80693c62062a098a5776ea8ef5ef4b28d140"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:23e1332fff36bebd3183db0c7a547a1da9d3b4091509f6d818e098855f2f27d3"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:a564188ce831fd110ea76bcc97085dd6c625b427db3f1dbb14ca4baa1447dcbc"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:a7a1b4302f70bb3ec40ca86de82def532c97a80db49cac6a6700af0de41af5ee"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:1b07ccef62950a2519f9bfc1e5b294de5dd84329f444ca0b329605ea787a3de5"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:938bd3ca6259e7e48b38d84f753d548bd863e0c222ed6ee6ace3fd6752768a84"},
{file = "aiohttp-3.12.14-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:8bc784302b6b9f163b54c4e93d7a6f09563bd01ff2b841b29ed3ac126e5040bf"},
{file = "aiohttp-3.12.14-cp39-cp39-win32.whl", hash = "sha256:a3416f95961dd7d5393ecff99e3f41dc990fb72eda86c11f2a60308ac6dcd7a0"},
{file = "aiohttp-3.12.14-cp39-cp39-win_amd64.whl", hash = "sha256:196858b8820d7f60578f8b47e5669b3195c21d8ab261e39b1d705346458f445f"},
{file = "aiohttp-3.12.14.tar.gz", hash = "sha256:6e06e120e34d93100de448fd941522e11dafa78ef1a893c179901b7d66aa29f2"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d5a372fd5afd301b3a89582817fdcdb6c34124787c70dbcc616f259013e7eef7"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:147e422fd1223005c22b4fe080f5d93ced44460f5f9c105406b753612b587821"},
{file = "aiohttp-3.13.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:859bd3f2156e81dd01432f5849fc73e2243d4a487c4fd26609b1299534ee1845"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:dca68018bf48c251ba17c72ed479f4dafe9dbd5a73707ad8d28a38d11f3d42af"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fee0c6bc7db1de362252affec009707a17478a00ec69f797d23ca256e36d5940"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c048058117fd649334d81b4b526e94bde3ccaddb20463a815ced6ecbb7d11160"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:215a685b6fbbfcf71dfe96e3eba7a6f58f10da1dfdf4889c7dd856abe430dca7"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:de2c184bb1fe2cbd2cefba613e9db29a5ab559323f994b6737e370d3da0ac455"},
{file = "aiohttp-3.13.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:75ca857eba4e20ce9f546cd59c7007b33906a4cd48f2ff6ccf1ccfc3b646f279"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:81e97251d9298386c2b7dbeb490d3d1badbdc69107fb8c9299dd04eb39bddc0e"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:c0e2d366af265797506f0283487223146af57815b388623f0357ef7eac9b209d"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4e239d501f73d6db1522599e14b9b321a7e3b1de66ce33d53a765d975e9f4808"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:0db318f7a6f065d84cb1e02662c526294450b314a02bd9e2a8e67f0d8564ce40"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:bfc1cc2fe31a6026a8a88e4ecfb98d7f6b1fec150cfd708adbfd1d2f42257c29"},
{file = "aiohttp-3.13.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:af71fff7bac6bb7508956696dce8f6eec2bbb045eceb40343944b1ae62b5ef11"},
{file = "aiohttp-3.13.3-cp310-cp310-win32.whl", hash = "sha256:37da61e244d1749798c151421602884db5270faf479cf0ef03af0ff68954c9dd"},
{file = "aiohttp-3.13.3-cp310-cp310-win_amd64.whl", hash = "sha256:7e63f210bc1b57ef699035f2b4b6d9ce096b5914414a49b0997c839b2bd2223c"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5b6073099fb654e0a068ae678b10feff95c5cae95bbfcbfa7af669d361a8aa6b"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cb93e166e6c28716c8c6aeb5f99dfb6d5ccf482d29fe9bf9a794110e6d0ab64"},
{file = "aiohttp-3.13.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:28e027cf2f6b641693a09f631759b4d9ce9165099d2b5d92af9bd4e197690eea"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3b61b7169ababd7802f9568ed96142616a9118dd2be0d1866e920e77ec8fa92a"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:80dd4c21b0f6237676449c6baaa1039abae86b91636b6c91a7f8e61c87f89540"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:65d2ccb7eabee90ce0503c17716fc77226be026dcc3e65cce859a30db715025b"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5b179331a481cb5529fca8b432d8d3c7001cb217513c94cd72d668d1248688a3"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d4c940f02f49483b18b079d1c27ab948721852b281f8b015c058100e9421dd1"},
{file = "aiohttp-3.13.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f9444f105664c4ce47a2a7171a2418bce5b7bae45fb610f4e2c36045d85911d3"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:694976222c711d1d00ba131904beb60534f93966562f64440d0c9d41b8cdb440"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f33ed1a2bf1997a36661874b017f5c4b760f41266341af36febaf271d179f6d7"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e636b3c5f61da31a92bf0d91da83e58fdfa96f178ba682f11d24f31944cdd28c"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:5d2d94f1f5fcbe40838ac51a6ab5704a6f9ea42e72ceda48de5e6b898521da51"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2be0e9ccf23e8a94f6f0650ce06042cefc6ac703d0d7ab6c7a917289f2539ad4"},
{file = "aiohttp-3.13.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9af5e68ee47d6534d36791bbe9b646d2a7c7deb6fc24d7943628edfbb3581f29"},
{file = "aiohttp-3.13.3-cp311-cp311-win32.whl", hash = "sha256:a2212ad43c0833a873d0fb3c63fa1bacedd4cf6af2fee62bf4b739ceec3ab239"},
{file = "aiohttp-3.13.3-cp311-cp311-win_amd64.whl", hash = "sha256:642f752c3eb117b105acbd87e2c143de710987e09860d674e068c4c2c441034f"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b903a4dfee7d347e2d87697d0713be59e0b87925be030c9178c5faa58ea58d5c"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a45530014d7a1e09f4a55f4f43097ba0fd155089372e105e4bff4ca76cb1b168"},
{file = "aiohttp-3.13.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:27234ef6d85c914f9efeb77ff616dbf4ad2380be0cda40b4db086ffc7ddd1b7d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d32764c6c9aafb7fb55366a224756387cd50bfa720f32b88e0e6fa45b27dcf29"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b1a6102b4d3ebc07dad44fbf07b45bb600300f15b552ddf1851b5390202ea2e3"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c014c7ea7fb775dd015b2d3137378b7be0249a448a1612268b5a90c2d81de04d"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2b8d8ddba8f95ba17582226f80e2de99c7a7948e66490ef8d947e272a93e9463"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ae8dd55c8e6c4257eae3a20fd2c8f41edaea5992ed67156642493b8daf3cecc"},
{file = "aiohttp-3.13.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:01ad2529d4b5035578f5081606a465f3b814c542882804e2e8cda61adf5c71bf"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bb4f7475e359992b580559e008c598091c45b5088f28614e855e42d39c2f1033"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:c19b90316ad3b24c69cd78d5c9b4f3aa4497643685901185b65166293d36a00f"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:96d604498a7c782cb15a51c406acaea70d8c027ee6b90c569baa6e7b93073679"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:084911a532763e9d3dd95adf78a78f4096cd5f58cdc18e6fdbc1b58417a45423"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7a4a94eb787e606d0a09404b9c38c113d3b099d508021faa615d70a0131907ce"},
{file = "aiohttp-3.13.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:87797e645d9d8e222e04160ee32aa06bc5c163e8499f24db719e7852ec23093a"},
{file = "aiohttp-3.13.3-cp312-cp312-win32.whl", hash = "sha256:b04be762396457bef43f3597c991e192ee7da460a4953d7e647ee4b1c28e7046"},
{file = "aiohttp-3.13.3-cp312-cp312-win_amd64.whl", hash = "sha256:e3531d63d3bdfa7e3ac5e9b27b2dd7ec9df3206a98e0b3445fa906f233264c57"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:5dff64413671b0d3e7d5918ea490bdccb97a4ad29b3f311ed423200b2203e01c"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:87b9aab6d6ed88235aa2970294f496ff1a1f9adcd724d800e9b952395a80ffd9"},
{file = "aiohttp-3.13.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:425c126c0dc43861e22cb1c14ba4c8e45d09516d0a3ae0a3f7494b79f5f233a3"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f9120f7093c2a32d9647abcaf21e6ad275b4fbec5b55969f978b1a97c7c86bf"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:697753042d57f4bf7122cab985bf15d0cef23c770864580f5af4f52023a56bd6"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6de499a1a44e7de70735d0b39f67c8f25eb3d91eb3103be99ca0fa882cdd987d"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:37239e9f9a7ea9ac5bf6b92b0260b01f8a22281996da609206a84df860bc1261"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f76c1e3fe7d7c8afad7ed193f89a292e1999608170dcc9751a7462a87dfd5bc0"},
{file = "aiohttp-3.13.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fc290605db2a917f6e81b0e1e0796469871f5af381ce15c604a3c5c7e51cb730"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4021b51936308aeea0367b8f006dc999ca02bc118a0cc78c303f50a2ff6afb91"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:49a03727c1bba9a97d3e93c9f93ca03a57300f484b6e935463099841261195d3"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3d9908a48eb7416dc1f4524e69f1d32e5d90e3981e4e37eb0aa1cd18f9cfa2a4"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2712039939ec963c237286113c68dbad80a82a4281543f3abf766d9d73228998"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:7bfdc049127717581866fa4708791220970ce291c23e28ccf3922c700740fdc0"},
{file = "aiohttp-3.13.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8057c98e0c8472d8846b9c79f56766bcc57e3e8ac7bfd510482332366c56c591"},
{file = "aiohttp-3.13.3-cp313-cp313-win32.whl", hash = "sha256:1449ceddcdbcf2e0446957863af03ebaaa03f94c090f945411b61269e2cb5daf"},
{file = "aiohttp-3.13.3-cp313-cp313-win_amd64.whl", hash = "sha256:693781c45a4033d31d4187d2436f5ac701e7bbfe5df40d917736108c1cc7436e"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:ea37047c6b367fd4bd632bff8077449b8fa034b69e812a18e0132a00fae6e808"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:6fc0e2337d1a4c3e6acafda6a78a39d4c14caea625124817420abceed36e2415"},
{file = "aiohttp-3.13.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c685f2d80bb67ca8c3837823ad76196b3694b0159d232206d1e461d3d434666f"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e377758516d262bde50c2584fc6c578af272559c409eecbdd2bae1601184d6"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:34749271508078b261c4abb1767d42b8d0c0cc9449c73a4df494777dc55f0687"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:82611aeec80eb144416956ec85b6ca45a64d76429c1ed46ae1b5f86c6e0c9a26"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2fff83cfc93f18f215896e3a190e8e5cb413ce01553901aca925176e7568963a"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bbe7d4cecacb439e2e2a8a1a7b935c25b812af7a5fd26503a66dadf428e79ec1"},
{file = "aiohttp-3.13.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b928f30fe49574253644b1ca44b1b8adbd903aa0da4b9054a6c20fc7f4092a25"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7b5e8fe4de30df199155baaf64f2fcd604f4c678ed20910db8e2c66dc4b11603"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:8542f41a62bcc58fc7f11cf7c90e0ec324ce44950003feb70640fc2a9092c32a"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5e1d8c8b8f1d91cd08d8f4a3c2b067bfca6ec043d3ff36de0f3a715feeedf926"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:90455115e5da1c3c51ab619ac57f877da8fd6d73c05aacd125c5ae9819582aba"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:042e9e0bcb5fba81886c8b4fbb9a09d6b8a00245fd8d88e4d989c1f96c74164c"},
{file = "aiohttp-3.13.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2eb752b102b12a76ca02dff751a801f028b4ffbbc478840b473597fc91a9ed43"},
{file = "aiohttp-3.13.3-cp314-cp314-win32.whl", hash = "sha256:b556c85915d8efaed322bf1bdae9486aa0f3f764195a0fb6ee962e5c71ef5ce1"},
{file = "aiohttp-3.13.3-cp314-cp314-win_amd64.whl", hash = "sha256:9bf9f7a65e7aa20dd764151fb3d616c81088f91f8df39c3893a536e279b4b984"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:05861afbbec40650d8a07ea324367cb93e9e8cc7762e04dd4405df99fa65159c"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2fc82186fadc4a8316768d61f3722c230e2c1dcab4200d52d2ebdf2482e47592"},
{file = "aiohttp-3.13.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0add0900ff220d1d5c5ebbf99ed88b0c1bbf87aa7e4262300ed1376a6b13414f"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:568f416a4072fbfae453dcf9a99194bbb8bdeab718e08ee13dfa2ba0e4bebf29"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:add1da70de90a2569c5e15249ff76a631ccacfe198375eead4aadf3b8dc849dc"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:10b47b7ba335d2e9b1239fa571131a87e2d8ec96b333e68b2a305e7a98b0bae2"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3dd4dce1c718e38081c8f35f323209d4c1df7d4db4bab1b5c88a6b4d12b74587"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34bac00a67a812570d4a460447e1e9e06fae622946955f939051e7cc895cfab8"},
{file = "aiohttp-3.13.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a19884d2ee70b06d9204b2727a7b9f983d0c684c650254679e716b0b77920632"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:5f8ca7f2bb6ba8348a3614c7918cc4bb73268c5ac2a207576b7afea19d3d9f64"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:b0d95340658b9d2f11d9697f59b3814a9d3bb4b7a7c20b131df4bcef464037c0"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:a1e53262fd202e4b40b70c3aff944a8155059beedc8a89bba9dc1f9ef06a1b56"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d60ac9663f44168038586cab2157e122e46bdef09e9368b37f2d82d354c23f72"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:90751b8eed69435bac9ff4e3d2f6b3af1f57e37ecb0fbeee59c0174c9e2d41df"},
{file = "aiohttp-3.13.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fc353029f176fd2b3ec6cfc71be166aba1936fe5d73dd1992ce289ca6647a9aa"},
{file = "aiohttp-3.13.3-cp314-cp314t-win32.whl", hash = "sha256:2e41b18a58da1e474a057b3d35248d8320029f61d70a37629535b16a0c8f3767"},
{file = "aiohttp-3.13.3-cp314-cp314t-win_amd64.whl", hash = "sha256:44531a36aa2264a1860089ffd4dce7baf875ee5a6079d5fb42e261c704ef7344"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:31a83ea4aead760dfcb6962efb1d861db48c34379f2ff72db9ddddd4cda9ea2e"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:988a8c5e317544fdf0d39871559e67b6341065b87fceac641108c2096d5506b7"},
{file = "aiohttp-3.13.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9b174f267b5cfb9a7dba9ee6859cecd234e9a681841eb85068059bc867fb8f02"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:947c26539750deeaee933b000fb6517cc770bbd064bad6033f1cff4803881e43"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9ebf57d09e131f5323464bd347135a88622d1c0976e88ce15b670e7ad57e4bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4ae5b5a0e1926e504c81c5b84353e7a5516d8778fbbff00429fe7b05bb25cbce"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2ba0eea45eb5cc3172dbfc497c066f19c41bac70963ea1a67d51fc92e4cf9a80"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bae5c2ed2eae26cc382020edad80d01f36cb8e746da40b292e68fec40421dc6a"},
{file = "aiohttp-3.13.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8a60e60746623925eab7d25823329941aee7242d559baa119ca2b253c88a7bd6"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:e50a2e1404f063427c9d027378472316201a2290959a295169bcf25992d04558"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:9a9dc347e5a3dc7dfdbc1f82da0ef29e388ddb2ed281bfce9dd8248a313e62b7"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:b46020d11d23fe16551466c77823df9cc2f2c1e63cc965daf67fa5eec6ca1877"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:69c56fbc1993fa17043e24a546959c0178fe2b5782405ad4559e6c13975c15e3"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:b99281b0704c103d4e11e72a76f1b543d4946fea7dd10767e7e1b5f00d4e5704"},
{file = "aiohttp-3.13.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:40c5e40ecc29ba010656c18052b877a1c28f84344825efa106705e835c28530f"},
{file = "aiohttp-3.13.3-cp39-cp39-win32.whl", hash = "sha256:56339a36b9f1fc708260c76c87e593e2afb30d26de9ae1eb445b5e051b98a7a1"},
{file = "aiohttp-3.13.3-cp39-cp39-win_amd64.whl", hash = "sha256:c6b8568a3bb5819a0ad087f16d40e5a3fb6099f39ea1d5625a3edc1e923fc538"},
{file = "aiohttp-3.13.3.tar.gz", hash = "sha256:a949eee43d3782f2daae4f4a2819b2cb9b0c5d3b7f7a927067cc84dafdbb9f88"},
]
[package.dependencies]
@@ -143,7 +177,7 @@ propcache = ">=0.2.0"
yarl = ">=1.17.0,<2.0"
[package.extras]
speedups = ["Brotli ; platform_python_implementation == \"CPython\"", "aiodns (>=3.3.0)", "brotlicffi ; platform_python_implementation != \"CPython\""]
speedups = ["Brotli (>=1.2) ; platform_python_implementation == \"CPython\"", "aiodns (>=3.3.0)", "backports.zstd ; platform_python_implementation == \"CPython\" and python_version < \"3.14\"", "brotlicffi (>=1.2) ; platform_python_implementation != \"CPython\""]
[[package]]
name = "aiosignal"
@@ -1545,84 +1579,101 @@ files = [
[[package]]
name = "cffi"
version = "1.17.1"
version = "2.0.0"
description = "Foreign Function Interface for Python calling C code."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\""
files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"},
{file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"},
{file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"},
{file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"},
{file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"},
{file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"},
{file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"},
{file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"},
{file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"},
{file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"},
{file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"},
{file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"},
{file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"},
{file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"},
{file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"},
{file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"},
{file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"},
{file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"},
{file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"},
{file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"},
{file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"},
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"},
{file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"},
{file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"},
{file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"},
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"},
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"},
{file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"},
{file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"},
{file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"},
{file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"},
{file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"},
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"},
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"},
{file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"},
{file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"},
{file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"},
{file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"},
{file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"},
{file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"},
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"},
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"},
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"},
{file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"},
{file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"},
{file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"},
{file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"},
{file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"},
{file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"},
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"},
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"},
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"},
{file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"},
{file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"},
{file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"},
{file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"},
{file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"},
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"},
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"},
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"},
{file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"},
{file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"},
{file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"},
{file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"},
{file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"},
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"},
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"},
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"},
{file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"},
{file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"},
{file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"},
{file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"},
{file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"},
{file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"},
{file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"},
{file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"},
{file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[package.dependencies]
pycparser = "*"
pycparser = {version = "*", markers = "implementation_name != \"PyPy\""}
[[package]]
name = "cfgv"
@@ -3238,14 +3289,14 @@ files = [
[[package]]
name = "marshmallow"
version = "3.26.1"
version = "3.26.2"
description = "A lightweight library for converting complex datatypes to and from native Python datatypes."
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "marshmallow-3.26.1-py3-none-any.whl", hash = "sha256:3350409f20a70a7e4e11a27661187b77cdcaeb20abca41c1454fe33636bea09c"},
{file = "marshmallow-3.26.1.tar.gz", hash = "sha256:e6d8affb6cb61d39d26402096dc0aee12d5a26d490a121f118d2e81dc0719dc6"},
{file = "marshmallow-3.26.2-py3-none-any.whl", hash = "sha256:013fa8a3c4c276c24d26d84ce934dc964e2aa794345a0f8c7e5a7191482c8a73"},
{file = "marshmallow-3.26.2.tar.gz", hash = "sha256:bbe2adb5a03e6e3571b573f42527c6fe926e17467833660bebd11593ab8dfd57"},
]
[package.dependencies]
@@ -4529,11 +4580,11 @@ description = "C parser in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
markers = "platform_python_implementation != \"PyPy\" and implementation_name != \"PyPy\""
files = [
{file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"},
{file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"},
]
markers = {dev = "platform_python_implementation != \"PyPy\""}
[[package]]
name = "pydantic"
@@ -4770,30 +4821,45 @@ testutils = ["gitpython (>3)"]
[[package]]
name = "pynacl"
version = "1.5.0"
version = "1.6.2"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
python-versions = ">=3.6"
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
{file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
{file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
{file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
{file = "pynacl-1.6.2-cp314-cp314t-macosx_10_10_universal2.whl", hash = "sha256:622d7b07cc5c02c666795792931b50c91f3ce3c2649762efb1ef0d5684c81594"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d071c6a9a4c94d79eb665db4ce5cedc537faf74f2355e4d502591d850d3913c0"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe9847ca47d287af41e82be1dd5e23023d3c31a951da134121ab02e42ac218c9"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:04316d1fc625d860b6c162fff704eb8426b1a8bcd3abacea11142cbd99a6b574"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44081faff368d6c5553ccf55322ef2819abb40e25afaec7e740f159f74813634"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:a9f9932d8d2811ce1a8ffa79dcbdf3970e7355b5c8eb0c1a881a57e7f7d96e88"},
{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:bc4a36b28dd72fb4845e5d8f9760610588a96d5a51f01d84d8c6ff9849968c14"},
{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3bffb6d0f6becacb6526f8f42adfb5efb26337056ee0831fb9a7044d1a964444"},
{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:2fef529ef3ee487ad8113d287a593fa26f48ee3620d92ecc6f1d09ea38e0709b"},
{file = "pynacl-1.6.2-cp314-cp314t-win32.whl", hash = "sha256:a84bf1c20339d06dc0c85d9aea9637a24f718f375d861b2668b2f9f96fa51145"},
{file = "pynacl-1.6.2-cp314-cp314t-win_amd64.whl", hash = "sha256:320ef68a41c87547c91a8b58903c9caa641ab01e8512ce291085b5fe2fcb7590"},
{file = "pynacl-1.6.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d29bfe37e20e015a7d8b23cfc8bd6aa7909c92a1b8f41ee416bbb3e79ef182b2"},
{file = "pynacl-1.6.2-cp38-abi3-macosx_10_10_universal2.whl", hash = "sha256:c949ea47e4206af7c8f604b8278093b674f7c79ed0d4719cc836902bf4517465"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8845c0631c0be43abdd865511c41eab235e0be69c81dc66a50911594198679b0"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:22de65bb9010a725b0dac248f353bb072969c94fa8d6b1f34b87d7953cf7bbe4"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:46065496ab748469cdd999246d17e301b2c24ae2fdf739132e580a0e94c94a87"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8a66d6fb6ae7661c58995f9c6435bda2b1e68b54b598a6a10247bfcdadac996c"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:26bfcd00dcf2cf160f122186af731ae30ab120c18e8375684ec2670dccd28130"},
{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:c8a231e36ec2cab018c4ad4358c386e36eede0319a0c41fed24f840b1dac59f6"},
{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:68be3a09455743ff9505491220b64440ced8973fe930f270c8e07ccfa25b1f9e"},
{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:8b097553b380236d51ed11356c953bf8ce36a29a3e596e934ecabe76c985a577"},
{file = "pynacl-1.6.2-cp38-abi3-win32.whl", hash = "sha256:5811c72b473b2f38f7e2a3dc4f8642e3a3e9b5e7317266e4ced1fba85cae41aa"},
{file = "pynacl-1.6.2-cp38-abi3-win_amd64.whl", hash = "sha256:62985f233210dee6548c223301b6c25440852e13d59a8b81490203c3227c5ba0"},
{file = "pynacl-1.6.2-cp38-abi3-win_arm64.whl", hash = "sha256:834a43af110f743a754448463e8fd61259cd4ab5bbedcf70f9dabad1d28a394c"},
{file = "pynacl-1.6.2.tar.gz", hash = "sha256:018494d6d696ae03c7e656e5e74cdfd8ea1326962cc401bcf018f1ed8436811c"},
]
[package.dependencies]
cffi = ">=1.4.1"
cffi = {version = ">=2.0.0", markers = "platform_python_implementation != \"PyPy\" and python_version >= \"3.9\""}
[package.extras]
docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
docs = ["sphinx (<7)", "sphinx_rtd_theme"]
tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
[[package]]
name = "pyopenssl"
@@ -5980,22 +6046,22 @@ socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]]
name = "urllib3"
version = "2.5.0"
version = "2.6.3"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
markers = "python_version >= \"3.10\""
files = [
{file = "urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc"},
{file = "urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760"},
{file = "urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4"},
{file = "urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed"},
]
[package.extras]
brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""]
brotli = ["brotli (>=1.2.0) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=1.2.0.0) ; platform_python_implementation != \"CPython\""]
h2 = ["h2 (>=4,<5)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"]
zstd = ["backports-zstd (>=1.0.0) ; python_version < \"3.14\""]
[[package]]
name = "virtualenv"
@@ -6052,18 +6118,18 @@ test = ["websockets"]
[[package]]
name = "werkzeug"
version = "3.1.3"
version = "3.1.5"
description = "The comprehensive WSGI web application library."
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "werkzeug-3.1.3-py3-none-any.whl", hash = "sha256:54b78bf3716d19a65be4fceccc0d1d7b89e608834989dfae50ea87564639213e"},
{file = "werkzeug-3.1.3.tar.gz", hash = "sha256:60723ce945c19328679790e3282cc758aa4a6040e4bb330f53d30fa546d44746"},
{file = "werkzeug-3.1.5-py3-none-any.whl", hash = "sha256:5111e36e91086ece91f93268bb39b4a35c1e6f1feac762c9c822ded0a4e322dc"},
{file = "werkzeug-3.1.5.tar.gz", hash = "sha256:6a548b0e88955dd07ccb25539d7d0cc97417ee9e179677d22c7041c8f078ce67"},
]
[package.dependencies]
MarkupSafe = ">=2.1.1"
markupsafe = ">=2.1.1"
[package.extras]
watchdog = ["watchdog (>=2.3)"]
@@ -1,366 +1,125 @@
# Prowler SDK Agent Guide
**Complete guide for AI agents and developers working on the Prowler SDK - the core Python security scanning engine.**
> **Skills Reference**: For detailed patterns, use these skills:
> - [`prowler-sdk-check`](../skills/prowler-sdk-check/SKILL.md) - Create new security checks (step-by-step)
> - [`prowler-provider`](../skills/prowler-provider/SKILL.md) - Add new cloud providers
> - [`prowler-test-sdk`](../skills/prowler-test-sdk/SKILL.md) - pytest patterns for SDK
> - [`prowler-compliance`](../skills/prowler-compliance/SKILL.md) - Compliance framework structure
> - [`pytest`](../skills/pytest/SKILL.md) - Generic pytest patterns
## Project Overview
The Prowler SDK is the core Python engine that powers Prowler's cloud security assessment capabilities. It provides:
- **Multi-cloud Security Scanning**: AWS, Azure, GCP, Kubernetes, GitHub, M365, Oracle Cloud, MongoDB Atlas, and more
- **Compliance Frameworks**: 30+ frameworks including CIS, NIST, PCI-DSS, SOC2, GDPR
- **1000+ Security Checks**: Comprehensive coverage across all supported providers
- **Multiple Output Formats**: JSON, CSV, HTML, ASFF, OCSF, and compliance-specific formats
## Mission & Scope
- Maintain and enhance the core Prowler SDK functionality with security and stability as top priorities
- Follow best practices for Python patterns, code style, security, and comprehensive testing
- For more information about development guidelines, refer to the Prowler Developer Guide in `docs/developer-guide/`
The Prowler SDK is the core Python engine powering cloud security assessments across AWS, Azure, GCP, Kubernetes, GitHub, M365, and more. It includes 1000+ security checks and 30+ compliance frameworks.
---
## Architecture Rules
## CRITICAL RULES
### 1. Provider Architecture Pattern
All Prowler providers MUST follow the established pattern:
### Provider Architecture
```
prowler/providers/{provider}/
├── {provider}_provider.py        # Main provider class
├── models.py                     # Provider-specific models
├── config.py                     # Provider configuration
├── exceptions/                   # Provider-specific exceptions
├── lib/                          # Provider libraries (must contain at least: service, arguments, mutelist)
│   ├── service/                  # Provider-specific service class to be inherited by all services of the provider
│   ├── arguments/                # Provider-specific CLI arguments parser
│   └── mutelist/                 # Provider-specific mutelist functionality
└── services/                     # All provider services to be audited
    ├── {service}/                # Individual service
    │   ├── {service}_service.py  # Class to fetch the needed resources from the API and store them to be used by the checks
    │   ├── {service}_client.py   # Python instance of the service class to be used by the checks
    │   ├── {check_name}/         # Individual check folder
    │   │   ├── {check_name}.py             # Python class to implement the check logic
    │   │   └── {check_name}.metadata.json  # JSON file to store the check metadata
    │   └── {check_name_2}/       # Other checks can be added to the same service folder
    │       ├── {check_name_2}.py
    │       └── {check_name_2}.metadata.json
    └── {service_2}/              # Other services can be added to the same provider folder
        ...
├── {provider}_provider.py        # Main provider class
├── models.py                     # Provider-specific models
├── lib/                          # service/, arguments/, mutelist/
└── services/{service}/
    ├── {service}_service.py      # Resource fetcher
    ├── {service}_client.py       # Singleton instance
    └── {check_name}/             # Individual checks
        ├── {check_name}.py
        └── {check_name}.metadata.json
```
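The `{service}_service.py` / `{service}_client.py` split above can be sketched as follows. This is an illustrative minimal example, not real Prowler code: `ExampleService`, `Bucket`, and `example_client` are hypothetical names standing in for a concrete service.

```python
from dataclasses import dataclass


@dataclass
class Bucket:
    """A resource model as it might appear in a provider's models.py."""

    name: str
    encrypted: bool = False


class ExampleService:
    """Fetches resources once at init; checks then read from the shared client."""

    def __init__(self) -> None:
        self.buckets: list[Bucket] = self._list_buckets()

    def _list_buckets(self) -> list[Bucket]:
        # A real service would call the provider API here; stubbed for the sketch.
        return [Bucket("logs", encrypted=True), Bucket("tmp", encrypted=False)]


# {service}_client.py exposes one shared instance so every check reuses
# the resources fetched at scan start instead of re-querying the API.
example_client = ExampleService()
```

The singleton client is the key design choice: API calls happen once per scan, and all checks against the service iterate over the same in-memory resource list.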
### 2. Check Implementation Standards
Every security check MUST follow this structure:
```python
from prowler.lib.check.models import Check, CheckReport{Provider}
from prowler.providers.{provider}.services.{service}.{service}_client import {service}_client


class {check_name}(Check):
    """Ensure that {resource} meets {security_requirement}."""

    def execute(self) -> list[CheckReport{Provider}]:
        """Execute the check logic.

        Returns:
            A list of reports containing the result of the check.
        """
        findings = []
        for resource in {service}_client.{resources}:
            # Security validation logic
            report = CheckReport{Provider}(metadata=self.metadata(), resource=resource)
            report.status = "PASS" if resource.is_compliant else "FAIL"
            report.status_extended = "Detailed explanation"
            findings.append(report)
        return findings
```
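As a self-contained illustration of this PASS/FAIL flow, the sketch below replaces the real Prowler classes (`Check`, `CheckReport{Provider}`, the service client singleton) with hypothetical stand-ins; only the control flow mirrors the template above:

```python
from dataclasses import dataclass, field


@dataclass
class Resource:
    name: str
    is_compliant: bool


@dataclass
class Report:
    # Stand-in for CheckReport{Provider}; the real class carries full metadata.
    resource: Resource
    status: str = ""
    status_extended: str = ""


@dataclass
class FakeClient:
    # Stand-in for the real {service}_client singleton.
    resources: list = field(default_factory=list)


def execute(client: FakeClient) -> list[Report]:
    findings = []
    for resource in client.resources:
        report = Report(resource=resource)
        report.status = "PASS" if resource.is_compliant else "FAIL"
        report.status_extended = (
            f"Resource {resource.name} is "
            f"{'compliant' if resource.is_compliant else 'not compliant'}."
        )
        findings.append(report)
    return findings


client = FakeClient(resources=[Resource("good", True), Resource("bad", False)])
statuses = [report.status for report in execute(client)]
print(statuses)  # ['PASS', 'FAIL']
```

Note that a check always returns one report per inspected resource, including compliant ones, so an empty resource list yields an empty findings list.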
### 3. Compliance Framework Integration
All compliance frameworks must be defined in `prowler/compliance/{provider}/{framework}.json` and must:
- Follow the established Compliance model structure
- Include proper requirement mappings and metadata
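A framework file can be sanity-checked with a short script; the required keys below are inferred from the example framework shown later in this guide and are an assumption, not Prowler's authoritative schema:

```python
import json

# Top-level keys inferred from example framework files (illustrative).
REQUIRED_TOP_LEVEL = {"Framework", "Name", "Version", "Provider", "Description", "Requirements"}


def validate_framework(raw: str) -> list[str]:
    """Return a list of problems found in a compliance framework JSON document."""
    problems = []
    data = json.loads(raw)
    missing = REQUIRED_TOP_LEVEL - data.keys()
    if missing:
        problems.append(f"missing top-level keys: {sorted(missing)}")
    for requirement in data.get("Requirements", []):
        if not requirement.get("Id"):
            problems.append("requirement without Id")
        if not isinstance(requirement.get("Checks"), list):
            problems.append(f"requirement {requirement.get('Id')} has no Checks list")
    return problems


doc = json.dumps({
    "Framework": "CIS",
    "Name": "Example Benchmark",
    "Version": "1.0",
    "Provider": "AWS",
    "Description": "Example framework",
    "Requirements": [{"Id": "1.1", "Checks": ["account_contact_details_configured"]}],
})
print(validate_framework(doc))  # []
```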
---
## Tech Stack
- **Language**: Python 3.9+
- **Dependency Management**: Poetry 2+
- **CLI Framework**: Custom argument parser with provider-specific subcommands
- **Testing**: Pytest with extensive unit and integration tests; moto for AWS mocking
- **Code Quality**: Pre-commit hooks running Black, Flake8, Pylint, and Bandit (security scanning)
## Commands
### Development Environment
```bash
# Core development setup
poetry install --with dev # Install all dependencies
poetry run pre-commit install # Install pre-commit hooks
# Code quality
poetry run pre-commit run --all-files
# Run tests
poetry run pytest -n auto -vvv -s -x tests/
# Run tests for a single service
poetry run pytest tests/providers/<provider>/services/<service>/ -v
```
### Running Prowler CLI
```bash
# Run Prowler
poetry run python prowler-cli.py --help
# Run Prowler with a specific provider
poetry run python prowler-cli.py <provider>
# Run Prowler with error logging
poetry run python prowler-cli.py <provider> --log-level ERROR --verbose
# Run specific checks
poetry run python prowler-cli.py <provider> --check <check_name_1> <check_name_2>
# List all available checks for a provider
poetry run python prowler-cli.py <provider> --list-checks
```
## Project Structure
```
prowler/
├── __main__.py                # Main CLI entry point
├── config/                    # Global configuration
│   ├── config.py              # Core configuration settings
│   └── __init__.py
├── lib/                       # Core library functions
│   ├── check/                 # Check execution engine
│   │   ├── check.py           # Check execution logic
│   │   ├── checks_loader.py   # Dynamic check loading
│   │   ├── compliance.py      # Compliance framework handling
│   │   └── models.py          # Check and report models
│   ├── cli/                   # Command-line interface
│   │   └── parser.py          # Argument parsing
│   ├── outputs/               # Output format handlers
│   │   ├── csv/               # CSV output
│   │   ├── html/              # HTML reports
│   │   ├── json/              # JSON formats (OCSF, ASFF)
│   │   └── compliance/        # Compliance reports
│   ├── scan/                  # Scan orchestration
│   ├── utils/                 # Utility functions
│   └── mutelist/              # Mute list functionality
├── providers/                 # Cloud provider implementations
│   ├── aws/                   # AWS provider
│   ├── azure/                 # Azure provider
│   ├── gcp/                   # Google Cloud provider
│   ├── kubernetes/            # Kubernetes provider
│   ├── github/                # GitHub provider
│   ├── m365/                  # Microsoft 365 provider
│   ├── mongodbatlas/          # MongoDB Atlas provider
│   ├── oci/                   # Oracle Cloud provider
│   ├── ...
│   └── common/                # Shared provider utilities
├── compliance/                # Compliance framework definitions (CIS, NIST, PCI-DSS, SOC2...)
│   ├── aws/                   # AWS compliance frameworks
│   ├── azure/                 # Azure compliance frameworks
│   ├── gcp/                   # GCP compliance frameworks
│   └── ...
└── exceptions/                # Global exception definitions
```
## Key Components
### 1. Provider System
Each cloud provider implements:
```python
class Provider:
    """Base provider class."""

    def __init__(self, arguments):
        self.session = self._setup_session(arguments)
        self.regions = self._get_regions()
        # Initialize all services

    def _setup_session(self, arguments):
        """Provider-specific authentication."""
        pass

    def _get_regions(self):
        """Get available regions for the provider."""
        pass
```
### 2. Check Engine
The check execution system provides:
- **Dynamic Loading**: Automatically discovers and loads checks
- **Parallel Execution**: Runs checks in parallel for performance
- **Error Isolation**: Individual check failures don't affect others
- **Comprehensive Reporting**: Detailed findings with remediation guidance
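The parallel-execution and error-isolation ideas can be sketched as follows; this is illustrative only, not the actual engine (which lives in `prowler/lib/check/check.py`):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed


def run_checks(checks):
    """Run each check callable in parallel; one failing check never aborts the scan."""
    findings, errors = [], []
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(check): name for name, check in checks.items()}
        for future in as_completed(futures):
            name = futures[future]
            try:
                findings.extend(future.result())
            except Exception as error:  # error isolation: record and continue
                errors.append((name, str(error)))
    return findings, errors


checks = {
    "check_ok": lambda: ["PASS"],
    "check_broken": lambda: 1 / 0,  # raises ZeroDivisionError
}
findings, errors = run_checks(checks)
print(findings, [name for name, _ in errors])  # ['PASS'] ['check_broken']
```

The key property is that the broken check contributes an error record instead of crashing the whole scan.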
### 3. Compliance Framework Engine
Compliance frameworks are defined as JSON files mapping checks to requirements:
```json
{
  "Framework": "CIS",
  "Name": "CIS Amazon Web Services Foundations Benchmark v2.0.0",
  "Version": "2.0",
  "Provider": "AWS",
  "Description": "The CIS Amazon Web Services Foundations Benchmark provides prescriptive guidance for configuring security options for a subset of Amazon Web Services with an emphasis on foundational, testable, and architecture agnostic settings.",
  "Requirements": [
    {
      "Id": "1.1",
      "Description": "Maintain current contact details",
      "Checks": ["account_contact_details_configured"]
    }
  ]
}
```
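One way to aggregate mapped check results into a per-requirement status is sketched below; the PASS/FAIL/MANUAL rules are illustrative assumptions, not Prowler's actual aggregation logic:

```python
def requirement_status(requirement: dict, results: dict) -> str:
    """Derive a requirement's status from the statuses of its mapped checks.

    results maps check_id -> "PASS" | "FAIL" for executed checks.
    """
    statuses = [results.get(check) for check in requirement.get("Checks", [])]
    if not statuses or any(status is None for status in statuses):
        return "MANUAL"  # unmapped or unexecuted checks need human review
    return "FAIL" if "FAIL" in statuses else "PASS"


requirement = {"Id": "1.1", "Checks": ["account_contact_details_configured"]}
print(requirement_status(requirement, {"account_contact_details_configured": "PASS"}))  # PASS
print(requirement_status(requirement, {"account_contact_details_configured": "FAIL"}))  # FAIL
print(requirement_status({"Id": "1.2", "Checks": []}, {}))  # MANUAL
```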
### 4. Output System
Multiple output formats are supported:
- **JSON**: Machine-readable findings
- **CSV**: Spreadsheet-compatible format
- **HTML**: Interactive web reports
- **ASFF**: AWS Security Finding Format
- **OCSF**: Open Cybersecurity Schema Framework
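To see how a single finding can feed several of these formats, here is a minimal sketch using only Python's standard library; the field names are illustrative, not Prowler's exact output schema:

```python
import csv
import io
import json

# Hypothetical flattened finding (not the real Finding model).
finding = {
    "check_id": "account_contact_details_configured",
    "status": "FAIL",
    "severity": "high",
    "resource_uid": "arn:aws:iam::123456789012:root",
}

# JSON lines: one machine-readable object per finding
json_line = json.dumps(finding)

# CSV: the same fields, spreadsheet-compatible
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=finding.keys())
writer.writeheader()
writer.writerow(finding)

print(json_line)
print(buffer.getvalue().strip())
```

In Prowler, each format handler consumes the same finding objects, so adding a new output format does not require changes to the checks themselves.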
## Development Patterns
### Adding New Cloud Providers
1. **Create Provider Structure**:
```bash
mkdir -p prowler/providers/{provider}
mkdir -p prowler/providers/{provider}/services
mkdir -p prowler/providers/{provider}/lib/{service,arguments,mutelist}
mkdir -p prowler/providers/{provider}/exceptions
```
2. **Implement Provider Class**:
```python
from prowler.providers.common.provider import Provider


class NewProvider(Provider):
    def __init__(self, arguments):
        super().__init__(arguments)
        # Provider-specific initialization
```
3. **Add Provider to CLI**:
Update `prowler/lib/cli/parser.py` to include the new provider arguments.

### Adding New Security Checks
**For detailed guidance, use the `prowler-sdk-check` skill.**
The most common high-level steps to create a new check are:
1. Prerequisites:
- Verify the check does not already exist by searching the service folder: `prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>/`.
- Ensure the required provider and service exist. If not, create them first.
- Confirm the service exposes all the methods and attributes the check needs (in most cases, you will need to add or modify service methods to fetch the required data).
2. Navigate to the service directory: `prowler/providers/<provider>/services/<service>`.
3. Create a check-specific folder following the pattern `prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>`, adhering to the [Naming Format for Checks](/developer-guide/checks#naming-format-for-checks).
4. Create the check files; you can use the following commands:
```bash
mkdir -p prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>
touch prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>/__init__.py
touch prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>/<check_name_want_to_implement>.py
touch prowler/providers/<provider>/services/<service>/<check_name_want_to_implement>/<check_name_want_to_implement>.metadata.json
```
5. Run the check locally to ensure it works as expected. You can use the CLI as follows:
- To verify the check is detected by Prowler: `poetry run python prowler-cli.py <provider> --list-checks | grep <check_name>`.
- To run the check and surface possible issues: `poetry run python prowler-cli.py <provider> --log-level ERROR --verbose --check <check_name>`.
6. Create comprehensive tests for the check that cover multiple scenarios including both PASS (compliant) and FAIL (non-compliant) cases. For detailed information about test structure and implementation guidelines, refer to the [Testing](/developer-guide/unit-testing) documentation.
7. If the check and its corresponding tests are working as expected, you can submit a PR to Prowler.
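The metadata file created in step 4 can be linted with a small script. The required keys and severity values below are inferred from existing metadata files and should be treated as assumptions; the no-hyphen rule for `CheckID`, however, is enforced by Prowler's metadata validation:

```python
import json

# Inferred from existing {check_name}.metadata.json files; illustrative, not authoritative.
REQUIRED_KEYS = {
    "Provider", "CheckID", "CheckTitle", "Severity",
    "ResourceType", "Description", "Risk", "Remediation",
}
VALID_SEVERITIES = {"critical", "high", "medium", "low", "informational"}


def lint_metadata(raw: str) -> list[str]:
    """Return a list of problems found in a check metadata JSON document."""
    data = json.loads(raw)
    problems = [f"missing key: {key}" for key in sorted(REQUIRED_KEYS - data.keys())]
    if data.get("Severity") not in VALID_SEVERITIES:
        problems.append(f"invalid Severity: {data.get('Severity')!r}")
    if "-" in data.get("CheckID", ""):
        problems.append("CheckID must use underscores, not hyphens")
    return problems


metadata = {
    "Provider": "aws",
    "CheckID": "service_check_example",
    "CheckTitle": "Example title",
    "Severity": "high",
    "ResourceType": "Other",
    "Description": "Example description",
    "Risk": "Example risk",
    "Remediation": {"Code": {}, "Recommendation": {}},
}
print(lint_metadata(json.dumps(metadata)))  # []
```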
### Adding Compliance Frameworks
1. **Create Framework File**:
```bash
# Create prowler/compliance/{provider}/{framework}.json
```
2. **Define Requirements**:
Map framework requirements to existing checks.
3. **Test Compliance**:
```bash
poetry run python prowler-cli.py {provider} --compliance {framework}
```
## Code Quality Standards
### 1. Python Style
- **PEP 8 Compliance**: Enforced by black and flake8
- **Type Hints**: Required for all public functions
- **Docstrings**: Required for all classes and methods
- **Import Organization**: Use isort for consistent import ordering
```python
import standard_library

from third_party import library

from prowler.lib import internal_module


class ExampleClass:
    """Class docstring."""

    def method(self, param: str) -> dict | list | None:
        """Method docstring.

        Args:
            param: Description of parameter

        Returns:
            Description of return value
        """
        return None
```
### 2. Error Handling
```python
from prowler.lib.logger import logger

try:
    # Risky operation
    result = api_call()
except ProviderSpecificException as e:
    logger.error(f"Provider error: {e}")
    # Graceful handling
except Exception as e:
    logger.error(f"Unexpected error: {e}")
    # Never let checks crash the entire scan
```
### 3. Security Practices
- **No Hardcoded Secrets**: Use environment variables or secure credential management
- **Input Validation**: Validate all external inputs
- **Principle of Least Privilege**: Request minimal necessary permissions
- **Secure Defaults**: Default to secure configurations
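The no-hardcoded-secrets rule can be sketched as follows; the variable names are hypothetical, and the point is that secrets come from the environment (or a secret manager), never from source code:

```python
import os


def load_registry_credentials() -> dict:
    """Read optional registry credentials from the environment.

    Raises if only one half of the credential pair is set, which usually
    indicates a misconfigured environment rather than intentional anonymity.
    """
    username = os.environ.get("REGISTRY_USERNAME")
    password = os.environ.get("REGISTRY_PASSWORD")
    if bool(username) != bool(password):
        raise ValueError("REGISTRY_USERNAME and REGISTRY_PASSWORD must be set together")
    return {"username": username, "password": password}


os.environ["REGISTRY_USERNAME"] = "ci-bot"
os.environ["REGISTRY_PASSWORD"] = "example-token"  # demo value only
print(load_registry_credentials()["username"])  # ci-bot
```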
## Testing Guidelines
### Unit Tests
- **100% Coverage Goal**: Aim for complete test coverage
- **Mock External Services**: Use mock objects (e.g., moto for AWS) to simulate external services
- **Test Edge Cases**: Include error conditions and boundary cases
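A minimal sketch of the mocking approach; the names here are hypothetical (real Prowler tests patch the actual service client, and AWS tests use moto), but the shape is the same: build a fake client, feed it resources, and assert on statuses:

```python
from unittest import mock


class FakeResource:
    """Hypothetical resource; real tests build the service's own model objects."""

    def __init__(self, name: str, encrypted: bool):
        self.name = name
        self.encrypted = encrypted


def execute_check(client) -> list[str]:
    """Toy check logic: FAIL for every unencrypted resource."""
    return ["PASS" if resource.encrypted else "FAIL" for resource in client.resources]


def test_fail_case():
    client = mock.MagicMock()
    client.resources = [FakeResource("vol-1", encrypted=False)]
    assert execute_check(client) == ["FAIL"]


def test_pass_case():
    client = mock.MagicMock()
    client.resources = [FakeResource("vol-2", encrypted=True)]
    assert execute_check(client) == ["PASS"]


test_fail_case()
test_pass_case()
print("ok")  # both scenarios covered
```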
## References
- **Root Project Guide**: `../AGENTS.md` (takes priority for cross-component guidance)
- **Provider Examples**: Reference existing providers for implementation patterns
- **Check Examples**: Study existing checks for proper implementation patterns
- **Compliance Framework Examples**: Review existing frameworks for structure
## QA Checklist
- [ ] `poetry run pytest` passes
- [ ] `poetry run pre-commit run --all-files` passes
- [ ] Check metadata JSON is valid
- [ ] Tests cover PASS, FAIL, and empty resource scenarios
- [ ] Docstrings follow Google style
+18
@@ -5,6 +5,7 @@ All notable changes to the **Prowler SDK** are documented in this file.
## [5.17.0] (Prowler UNRELEASED)
### Added
- AI Skills pack for AI coding assistants (Claude Code, OpenCode, Codex) following agentskills.io standard [(#9728)](https://github.com/prowler-cloud/prowler/pull/9728)
- Add Prowler ThreatScore for the Alibaba Cloud provider [(#9511)](https://github.com/prowler-cloud/prowler/pull/9511)
- `compute_instance_group_multiple_zones` check for GCP provider [(#9566)](https://github.com/prowler-cloud/prowler/pull/9566)
- `compute_instance_group_autohealing_enabled` check for GCP provider [(#9690)](https://github.com/prowler-cloud/prowler/pull/9690)
@@ -12,9 +13,14 @@ All notable changes to the **Prowler SDK** are documented in this file.
- `compute_instance_disk_auto_delete_disabled` check for GCP provider [(#9604)](https://github.com/prowler-cloud/prowler/pull/9604)
- Bedrock service pagination [(#9606)](https://github.com/prowler-cloud/prowler/pull/9606)
- `ResourceGroup` field to all check metadata for resource classification [(#9656)](https://github.com/prowler-cloud/prowler/pull/9656)
- `compute_configuration_changes` check for GCP provider to detect Compute Engine configuration changes in Cloud Audit Logs [(#9698)](https://github.com/prowler-cloud/prowler/pull/9698)
- `compute_instance_group_load_balancer_attached` check for GCP provider [(#9695)](https://github.com/prowler-cloud/prowler/pull/9695)
- `compute_instance_single_network_interface` check for GCP provider [(#9702)](https://github.com/prowler-cloud/prowler/pull/9702)
- `compute_image_not_publicly_shared` check for GCP provider [(#9718)](https://github.com/prowler-cloud/prowler/pull/9718)
- CIS 1.12 compliance framework for Kubernetes [(#9778)](https://github.com/prowler-cloud/prowler/pull/9778)
- CIS 6.0 for M365 provider [(#9779)](https://github.com/prowler-cloud/prowler/pull/9779)
- CIS 5.0 compliance framework for the Azure provider [(#9777)](https://github.com/prowler-cloud/prowler/pull/9777)
- Container Image provider (POC) using Trivy for vulnerability and secret scanning
### Changed
- Update AWS Step Functions service metadata to new format [(#9432)](https://github.com/prowler-cloud/prowler/pull/9432)
@@ -37,6 +43,18 @@ All notable changes to the **Prowler SDK** are documented in this file.
- Update AWS OpenSearch service metadata to new format [(#9383)](https://github.com/prowler-cloud/prowler/pull/9383)
- Update AWS VPC service metadata to new format [(#9479)](https://github.com/prowler-cloud/prowler/pull/9479)
- Update AWS Transfer service metadata to new format [(#9434)](https://github.com/prowler-cloud/prowler/pull/9434)
- Update AWS S3 service metadata to new format [(#9552)](https://github.com/prowler-cloud/prowler/pull/9552)
- Update AWS DataSync service metadata to new format [(#8854)](https://github.com/prowler-cloud/prowler/pull/8854)
- Update AWS RDS service metadata to new format [(#9551)](https://github.com/prowler-cloud/prowler/pull/9551)
- Update AWS Bedrock service metadata to new format [(#8827)](https://github.com/prowler-cloud/prowler/pull/8827)
---
## [5.16.2] (Prowler v5.16.2) (UNRELEASED)
### Fixed
- Fix OCI authentication error handling and validation [(#9738)](https://github.com/prowler-cloud/prowler/pull/9738)
- Fixup AWS EC2 SG library [(#9216)](https://github.com/prowler-cloud/prowler/pull/9216)
---
+9 -6
@@ -118,6 +118,7 @@ from prowler.providers.common.quick_inventory import run_provider_quick_inventor
from prowler.providers.gcp.models import GCPOutputOptions
from prowler.providers.github.models import GithubOutputOptions
from prowler.providers.iac.models import IACOutputOptions
from prowler.providers.image.models import ImageOutputOptions
from prowler.providers.kubernetes.models import KubernetesOutputOptions
from prowler.providers.llm.models import LLMOutputOptions
from prowler.providers.m365.models import M365OutputOptions
@@ -204,8 +205,8 @@ def prowler():
# Load compliance frameworks
logger.debug("Loading compliance frameworks from .json files")
# Skip compliance frameworks for IAC and LLM providers
if provider != "iac" and provider != "llm":
# Skip compliance frameworks for IAC, LLM, and Image providers
if provider not in ("iac", "llm", "image"):
bulk_compliance_frameworks = Compliance.get_bulk(provider)
# Complete checks metadata with the compliance framework specification
bulk_checks_metadata = update_checks_metadata_with_compliance(
@@ -262,8 +263,8 @@ def prowler():
if not args.only_logs:
global_provider.print_credentials()
# Skip service and check loading for IAC and LLM providers
if provider != "iac" and provider != "llm":
# Skip service and check loading for IAC, LLM, and Image providers
if provider not in ("iac", "llm", "image"):
# Import custom checks from folder
if checks_folder:
custom_checks = parse_checks_from_folder(global_provider, checks_folder)
@@ -346,6 +347,8 @@ def prowler():
)
elif provider == "iac":
output_options = IACOutputOptions(args, bulk_checks_metadata)
elif provider == "image":
output_options = ImageOutputOptions(args, bulk_checks_metadata)
elif provider == "llm":
output_options = LLMOutputOptions(args, bulk_checks_metadata)
elif provider == "oraclecloud":
@@ -365,8 +368,8 @@ def prowler():
# Execute checks
findings = []
if provider == "iac" or provider == "llm":
# For IAC and LLM providers, run the scan directly
if provider in ("iac", "llm", "image"):
# For IAC, LLM, and Image providers, run the scan directly
if provider == "llm":
def streaming_callback(findings_batch):
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
File diff suppressed because it is too large
+4 -1
@@ -73,7 +73,10 @@ def get_available_compliance_frameworks(provider=None):
if provider:
providers = [provider]
for provider in providers:
with os.scandir(f"{actual_directory}/../compliance/{provider}") as files:
compliance_dir = f"{actual_directory}/../compliance/{provider}"
if not os.path.isdir(compliance_dir):
continue
with os.scandir(compliance_dir) as files:
for file in files:
if file.is_file() and file.name.endswith(".json"):
available_compliance_frameworks.append(
+40
@@ -163,6 +163,7 @@ class CheckMetadata(BaseModel):
check_id
and values.get("Provider") != "iac"
and values.get("Provider") != "llm"
and values.get("Provider") != "image"
):
service_from_check_id = check_id.split("_")[0]
if service_name != service_from_check_id:
@@ -183,6 +184,7 @@ class CheckMetadata(BaseModel):
check_id
and values.get("Provider") != "iac"
and values.get("Provider") != "llm"
and values.get("Provider") != "image"
):
if "-" in check_id:
raise ValueError(
@@ -791,6 +793,44 @@ class CheckReportIAC(Check_Report):
)
@dataclass
class CheckReportImage(Check_Report):
"""Contains the Container Image Check's finding information using Trivy."""
resource_name: str
image_digest: str
package_name: str
installed_version: str
fixed_version: str
def __init__(
self,
metadata: Optional[dict] = None,
finding: Optional[dict] = None,
image_name: str = "",
) -> None:
"""
Initialize the Container Image Check's finding information from a Trivy vulnerability/secret dict.
Args:
metadata (Dict): Check metadata.
finding (dict): A single vulnerability/secret result from Trivy's JSON output.
image_name (str): The container image name being scanned.
"""
if metadata is None:
metadata = {}
if finding is None:
finding = {}
super().__init__(metadata, finding)
self.resource = finding
self.resource_name = image_name
self.image_digest = finding.get("PkgID", "")
self.package_name = finding.get("PkgName", "")
self.installed_version = finding.get("InstalledVersion", "")
self.fixed_version = finding.get("FixedVersion", "")
@dataclass
class CheckReportLLM(Check_Report):
"""Contains the LLM Check's finding information."""
+2 -2
@@ -14,8 +14,8 @@ def recover_checks_from_provider(
Returns a list of tuples with the following format (check_name, check_path)
"""
try:
# Bypass check loading for IAC provider since it uses Trivy directly
if provider == "iac" or provider == "llm":
# Bypass check loading for IAC, LLM, and Image providers since they use external tools directly
if provider in ("iac", "llm", "image"):
return []
checks = []
+3 -2
@@ -27,10 +27,10 @@ class ProwlerArgumentParser:
self.parser = argparse.ArgumentParser(
prog="prowler",
formatter_class=RawTextHelpFormatter,
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,m365,github,nhn,mongodbatlas,oraclecloud,alibabacloud,dashboard,iac} ...",
usage="prowler [-h] [--version] {aws,azure,gcp,kubernetes,m365,github,nhn,mongodbatlas,oraclecloud,alibabacloud,dashboard,iac,image} ...",
epilog="""
Available Cloud Providers:
{aws,azure,gcp,kubernetes,m365,github,iac,llm,nhn,mongodbatlas,oraclecloud,alibabacloud}
{aws,azure,gcp,kubernetes,m365,github,iac,llm,image,nhn,mongodbatlas,oraclecloud,alibabacloud}
aws AWS Provider
azure Azure Provider
gcp GCP Provider
@@ -41,6 +41,7 @@ Available Cloud Providers:
alibabacloud Alibaba Cloud Provider
iac IaC Provider (Beta)
llm LLM Provider (Beta)
image Container Image Provider (PoC)
nhn NHN Provider (Unofficial)
mongodbatlas MongoDB Atlas Provider (Beta)
+17
@@ -358,6 +358,23 @@ class Finding(BaseModel):
)
output_data["region"] = check_output.region
elif provider.type == "image":
output_data["auth_method"] = provider.auth_method
output_data["account_uid"] = "image"
output_data["account_name"] = "image"
output_data["resource_name"] = getattr(
check_output, "resource_name", ""
)
output_data["resource_uid"] = getattr(check_output, "resource_name", "")
output_data["region"] = getattr(check_output, "region", "container")
output_data["package_name"] = getattr(check_output, "package_name", "")
output_data["installed_version"] = getattr(
check_output, "installed_version", ""
)
output_data["fixed_version"] = getattr(
check_output, "fixed_version", ""
)
# check_output Unique ID
# TODO: move this to a function
# TODO: in Azure, GCP and K8s there are findings without resource_name
+6
@@ -77,6 +77,12 @@ def display_summary_table(
elif provider.type == "alibabacloud":
entity_type = "Account"
audited_entities = provider.identity.account_id
elif provider.type == "image":
entity_type = "Image"
if len(provider.images) == 1:
audited_entities = provider.images[0]
else:
audited_entities = f"{len(provider.images)} images"
# Check if there are findings and that they are not all MANUAL
if findings and not all(finding.status == "MANUAL" for finding in findings):
@@ -1023,6 +1023,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -3714,6 +3715,7 @@
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-5",
"ap-southeast-7",
"ca-central-1",
@@ -4418,23 +4420,6 @@
]
}
},
"elastictranscoder": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-west-1",
"us-east-1",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-eusc": [],
"aws-us-gov": []
}
},
"elb": {
"regions": {
"aws": [
@@ -4847,24 +4832,6 @@
]
}
},
"evidently": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-eusc": [],
"aws-us-gov": []
}
},
"evs": {
"regions": {
"aws": [
@@ -5582,6 +5549,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -6563,6 +6531,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -6613,6 +6582,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -6639,7 +6609,10 @@
"cn-northwest-1"
],
"aws-eusc": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
},
"kendra": {
@@ -6883,6 +6856,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -7340,24 +7314,6 @@
"aws-us-gov": []
}
},
"lookoutmetrics": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-eusc": [],
"aws-us-gov": []
}
},
"lumberyard": {
"regions": {
"aws": [
@@ -8618,7 +8574,10 @@
"odb": {
"regions": {
"aws": [
"ap-northeast-1",
"eu-central-1",
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
@@ -9422,26 +9381,6 @@
"aws-us-gov": []
}
},
"qldb": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-eusc": [],
"aws-us-gov": []
}
},
"quicksight": {
"regions": {
"aws": [
@@ -9781,6 +9720,7 @@
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"eu-central-1",
@@ -12533,6 +12473,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -12980,6 +12921,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -13073,6 +13015,7 @@
"ap-southeast-3",
"ap-southeast-4",
"ap-southeast-5",
"ap-southeast-6",
"ap-southeast-7",
"ca-central-1",
"ca-west-1",
@@ -13360,4 +13303,4 @@
}
}
}
}
}
@@ -1,27 +1,36 @@
{
"Provider": "aws",
"CheckID": "bedrock_agent_guardrail_enabled",
"CheckTitle": "Ensure that Guardrails are enabled for Amazon Bedrock agent sessions.",
"CheckType": [],
"CheckTitle": "Amazon Bedrock agent uses a guardrail to protect agent sessions",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices/Runtime Behavior Analysis",
"Effects/Data Exposure"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:bedrock:region:account-id:agent/resource-id",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Other",
"ResourceGroup": "ai_ml",
"Description": "This check ensures that Guardrails are enabled to protect Amazon Bedrock agent sessions. Guardrails help mitigate security risks by filtering and blocking harmful or sensitive content during interactions with AI models.",
"Risk": "Without guardrails enabled, Amazon Bedrock agent sessions are vulnerable to harmful prompts or inputs that could expose sensitive information or generate inappropriate content. This could lead to privacy violations, data leaks, or other security risks.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html",
"Description": "**Bedrock agents** should have an associated **guardrail** for their sessions. The evaluation identifies agents without a guardrail linked for input/output screening during interactions.",
"Risk": "Without **guardrails**, agent exchanges may expose **PII** or internal data (confidentiality), accept **prompt injections** that manipulate tool calls or outputs (integrity), and produce unsafe or out-of-scope responses that erode trust and cause policy violations.",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails-create.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/agents-guardrail.html",
"https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/protect-agent-sessions-with-guardrails.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html"
],
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/protect-agent-sessions-with-guardrails.html",
"Terraform": ""
"CLI": "aws bedrock-agent update-agent --agent-id <agent_id> --agent-name <agent_name> --agent-resource-role-arn <role_arn> --foundation-model <model_id> --guardrail-configuration guardrailIdentifier=<guardrail_id>,guardrailVersion=DRAFT",
"NativeIaC": "```yaml\n# CloudFormation: Associate a guardrail with a Bedrock Agent\nResources:\n ExampleAgent:\n Type: AWS::Bedrock::Agent\n Properties:\n AgentName: <example_resource_name>\n AgentResourceRoleArn: <example_resource_id>\n FoundationModel: <example_resource_name>\n Instruction: \"<example_resource_name>\"\n GuardrailConfiguration: # CRITICAL: associates a guardrail to protect sessions\n GuardrailIdentifier: <example_resource_id> # CRITICAL: guardrail ID used by the agent\n GuardrailVersion: DRAFT # CRITICAL: version applied\n```",
"Other": "1. Open the Amazon Bedrock console and go to Agents\n2. Select the agent and click Edit\n3. In Guardrail details, select an existing guardrail and its version (e.g., DRAFT)\n4. Click Save (deploy changes if prompted)\n5. Verify the agent now shows the selected guardrail",
"Terraform": "```hcl\n# Terraform (AWS Cloud Control): Associate a guardrail with a Bedrock Agent\nresource \"awscc_bedrock_agent\" \"example\" {\n agent_name = \"<example_resource_name>\"\n agent_resource_role_arn = \"<example_resource_id>\"\n foundation_model = \"<example_resource_name>\"\n instruction = \"<example_resource_name>\"\n\n # CRITICAL: associates a guardrail to protect agent sessions\n guardrail_configuration {\n guardrail_identifier = \"<example_resource_id>\" # CRITICAL: guardrail ID\n guardrail_version = \"DRAFT\" # CRITICAL: version applied\n }\n}\n```"
},
"Recommendation": {
"Text": "Enable Guardrails for Amazon Bedrock agent sessions to protect against harmful inputs and outputs during interactions.",
"Url": "https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails-create.html"
"Text": "Associate a **guardrail** with every agent and tailor policies to your use case:\n- Enable content/word filters, denied topics, and sensitive-data masking\n- Use contextual grounding for RAG where relevant\n- Test and iterate across versions\nApply **least privilege** to agent tools and use **defense in depth** with monitoring and review.",
"Url": "https://hub.prowler.com/check/bedrock_agent_guardrail_enabled"
}
},
"Categories": [
@@ -1,35 +1,42 @@
{
"Provider": "aws",
"CheckID": "bedrock_api_key_no_administrative_privileges",
"CheckTitle": "Ensure Amazon Bedrock API keys do not have administrative privileges or privilege escalation",
"CheckTitle": "Amazon Bedrock API key does not have administrative privileges, privilege escalation paths, or full Bedrock service access",
"CheckType": [
"Software and Configuration Checks",
"Industry and Regulatory Standards"
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark",
"TTPs/Privilege Escalation"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:iam:region:account-id:user/{user-name}/credential/{api-key-id}",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "AwsIamServiceSpecificCredential",
"ResourceType": "AwsIamAccessKey",
"ResourceGroup": "IAM",
"Description": "Ensure that Amazon Bedrock API keys do not have administrative privileges or privilege escalation capabilities. API keys with administrative privileges can perform any action on any resource in your AWS environment, while privilege escalation allows users to grant themselves additional permissions, both posing significant security risks.",
"Risk": "Amazon Bedrock API keys with administrative privileges can perform any action on any resource in your AWS environment. Privilege escalation capabilities allow users to grant themselves additional permissions beyond their intended scope. Both violations of the principle of least privilege can lead to security vulnerabilities, data leaks, data loss, or unexpected charges if the API key is compromised or misused.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html",
"Description": "**Bedrock API keys** linked to IAM users are evaluated for excessive permissions, including policies that grant full access (`*` or `bedrock:*`) or enable **privilege escalation**. The finding highlights keys whose attached or inline policies provide broad or escalating capabilities.",
"Risk": "Over-privileged Bedrock API keys weaken confidentiality, integrity, and availability. If compromised, an attacker could:\n- Escalate IAM rights and persist access\n- Invoke models at scale to exfiltrate data or incur high costs\n- Modify Bedrock settings, disrupting operations",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#grant-least-privilege",
"https://docs.aws.amazon.com/IAM/latest/UserGuide/getting-started-reduce-permissions.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html"
],
"Remediation": {
"Code": {
"CLI": "aws iam delete-service-specific-credential --user-name <username> --service-specific-credential-id <credential-id>",
"NativeIaC": "",
"Other": "",
"Terraform": ""
"NativeIaC": "```yaml\n# CloudFormation: attach least-privilege policy to the IAM user owning the Bedrock API key\nResources:\n <example_resource_name>:\n Type: AWS::IAM::Policy\n Properties:\n PolicyName: least-priv-bedrock\n Users:\n - <example_resource_name>\n PolicyDocument:\n Version: '2012-10-17'\n Statement:\n - Effect: Allow\n Action:\n - bedrock:InvokeModel # CRITICAL: allow only needed Bedrock action to avoid admin or bedrock:* permissions\n Resource: \"*\" # Limits access scope to only InvokeModel on any resource\n```",
"Other": "1. Open the AWS Console and go to IAM > Users\n2. Select the user that owns the Bedrock service-specific credential (Security credentials > Service-specific credentials shows bedrock.amazonaws.com)\n3. In the Permissions tab, detach any policy granting AdministratorAccess or bedrock:* (e.g., AmazonBedrockFullAccess)\n4. In the same tab, delete any inline policy that provides admin/privilege-escalation permissions or bedrock:* access\n5. If Bedrock access is needed, add a minimal policy allowing only bedrock:InvokeModel\n6. Save changes",
"Terraform": "```hcl\n# Attach a minimal inline policy to the IAM user owning the Bedrock API key\nresource \"aws_iam_user_policy\" \"<example_resource_name>\" {\n name = \"least-priv-bedrock\"\n user = \"<example_resource_name>\"\n\n # CRITICAL: allow only the specific action required; avoids admin or bedrock:* full access\n policy = jsonencode({\n Version = \"2012-10-17\"\n Statement = [{\n Effect = \"Allow\"\n Action = [\"bedrock:InvokeModel\"]\n Resource = \"*\"\n }]\n })\n}\n```"
},
"Recommendation": {
"Text": "Apply the principle of least privilege to Amazon Bedrock API keys. Instead of granting administrative privileges or privilege escalation capabilities, assign only the permissions necessary for specific tasks. Create custom IAM policies with minimal permissions based on the principle of least privilege. Regularly review and audit API key permissions to ensure they cannot be used for privilege escalation.",
"Url": "https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#grant-least-privilege"
"Text": "Enforce **least privilege** on Bedrock keys:\n- Avoid wildcards like `*` and `bedrock:*`; allow only required actions\n- Prevent identity changes by disallowing `iam:*`\n- Prefer short-term credentials with rotation and MFA\n- Use permissions boundaries and SCPs as guardrails\n- Review usage and tighten policies via access analysis",
"Url": "https://hub.prowler.com/check/bedrock_api_key_no_administrative_privileges"
}
},
"Categories": [
"gen-ai",
"trust-boundaries"
"identity-access"
],
"DependsOn": [],
"RelatedTo": [],
@@ -1,35 +1,42 @@
{
"Provider": "aws",
"CheckID": "bedrock_api_key_no_long_term_credentials",
"CheckTitle": "Ensure Amazon Bedrock API keys are not long-term credentials",
"CheckTitle": "Amazon Bedrock API key is expired",
"CheckType": [
"Software and Configuration Checks",
"Industry and Regulatory Standards"
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark",
"TTPs/Initial Access/Valid Accounts"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:iam:region:account-id:user/{user-name}/credential/{api-key-id}",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "AwsIamServiceSpecificCredential",
"ResourceType": "AwsIamUser",
"ResourceGroup": "IAM",
"Description": "Ensure that Amazon Bedrock API keys have expiration dates set to prevent long-term credential exposure. Long-term credentials pose a significant security risk as they remain valid indefinitely and can be used for unauthorized access if compromised.",
"Risk": "Amazon Bedrock API keys without expiration dates are long-term credentials that remain valid indefinitely. This increases the risk of unauthorized access if the credentials are compromised, as they cannot be automatically invalidated. Long-term credentials violate the principle of credential rotation and can lead to security vulnerabilities, data breaches, or unauthorized usage of Bedrock services.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html",
"Description": "**Bedrock API keys** are evaluated for **lifetime** and **expiration**.\n\nThe finding identifies keys that are long-lived, set to expire far in the future, or configured to `never expire`, and distinguishes them from keys that have already expired.",
"Risk": "Long-lived or non-expiring keys enable persistent access if compromised.\n- Confidentiality: unauthorized inference and exposure of prompts/outputs\n- Availability/Cost: uncontrolled usage and spend spikes\n- Integrity: actions can continue without timely revocation or rotation",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/ja_jp/bedrock/latest/userguide/getting-started-api-keys.html",
"https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#rotate-credentials",
"https://docs.aws.amazon.com/bedrock/latest/userguide/api-keys.html"
],
"Remediation": {
"Code": {
"CLI": "aws iam delete-service-specific-credential --user-name <username> --service-specific-credential-id <credential-id>",
"NativeIaC": "",
"Other": "",
"Other": "1. Sign in to the AWS Management Console and open IAM\n2. Go to Users > select <example_resource_name> > Security credentials\n3. In \"API keys for Amazon Bedrock\", find the non-expired key and click Delete\n4. Confirm deletion to remove the key (removes the long-term credential so the check passes)",
"Terraform": ""
},
"Recommendation": {
"Text": "Delete the long-term API keys for Amazon Bedrock. Instead, use temporary credentials, IAM roles, or create new API keys with appropriate expiration dates. Implement a credential rotation policy to ensure all API keys have reasonable expiration periods. Consider using AWS STS for temporary credentials when possible.",
"Url": "https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#rotate-credentials"
"Text": "Prefer **short-term credentials** and **IAM roles**; avoid `never expire`.\n\nEnforce **least privilege**, strict **rotation**, and automatic **expiration** for any long-term key. Store secrets securely, monitor with audit logs, and revoke unused or stale keys quickly.",
"Url": "https://hub.prowler.com/check/bedrock_api_key_no_long_term_credentials"
}
},
"Categories": [
"gen-ai",
"trust-boundaries"
"secrets",
"gen-ai"
],
"DependsOn": [],
"RelatedTo": [],
@@ -1,27 +1,37 @@
{
"Provider": "aws",
"CheckID": "bedrock_guardrail_prompt_attack_filter_enabled",
"CheckTitle": "Configure Prompt Attack Filter with the highest strength for Amazon Bedrock Guardrails.",
"CheckType": [],
"CheckTitle": "Amazon Bedrock guardrail has prompt attack filter strength set to HIGH",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices/Runtime Behavior Analysis",
"TTPs/Defense Evasion",
"Effects/Data Exposure"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:bedrock:region:account-id:guardrails/resource-id",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Other",
"ResourceGroup": "ai_ml",
"Description": "Ensure that prompt attack filter strength is set to HIGH for Amazon Bedrock guardrails to mitigate prompt injection and bypass techniques.",
"Risk": "If prompt attack filter strength is not set to HIGH, Bedrock models may be more vulnerable to prompt injection attacks or jailbreak attempts, which could allow harmful or sensitive content to bypass filters and reach end users.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html",
"Description": "**Bedrock guardrails** have the **Prompt attack** filter set to `HIGH` strength to detect and block injection and jailbreak patterns. Guardrails missing this setting or using lower strengths are identified.",
"Risk": "Without **HIGH** prompt-attack filtering, models are exposed to **prompt injection/jailbreaks**:\n- Confidentiality: coerced disclosure of sensitive data\n- Integrity: policy evasion and manipulated outputs\n- Operations: unintended tool execution and workflow tampering",
"RelatedUrl": "",
"AdditionalURLs": [
"https://trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/prompt-attack-strength.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-injection.html",
"https://support.icompaas.com/support/solutions/articles/62000233535-ensure-prompt-attack-filter-is-configured-at-highest-strength-for-amazon-bedrock-guardrails",
"https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html"
],
"Remediation": {
"Code": {
"CLI": "aws bedrock put-guardrails-configuration --guardrails-config 'promptAttackStrength=HIGH'",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/prompt-attack-strength.html",
"Terraform": ""
"CLI": "aws bedrock update-guardrail --guardrail-identifier <guardrail_id> --content-policy-config 'filtersConfig=[{type=PROMPT_ATTACK,inputStrength=HIGH}]'",
"NativeIaC": "```yaml\nResources:\n <example_resource_name>:\n Type: AWS::Bedrock::Guardrail\n Properties:\n Name: <example_resource_name>\n BlockedInputMessaging: \"Blocked\"\n BlockedOutputsMessaging: \"Blocked\"\n ContentPolicyConfig:\n FiltersConfig:\n - Type: PROMPT_ATTACK # Critical: enables the Prompt Attack filter\n InputStrength: HIGH # Critical: sets filter strength to HIGH to pass the check\n```",
"Other": "1. Open the AWS Console and go to Amazon Bedrock\n2. Select Guardrails, then choose your guardrail\n3. In Content filters, find Prompt attacks\n4. Set Strength to High\n5. Click Save",
"Terraform": "```hcl\nresource \"aws_bedrock_guardrail\" \"<example_resource_name>\" {\n name = \"<example_resource_name>\"\n blocked_input_messaging = \"Blocked\"\n blocked_outputs_messaging = \"Blocked\"\n\n content_policy_config {\n filters_config {\n type = \"PROMPT_ATTACK\" # Critical: enables the Prompt Attack filter\n input_strength = \"HIGH\" # Critical: sets filter strength to HIGH to pass the check\n }\n }\n}\n```"
},
"Recommendation": {
"Text": "Set the prompt attack filter strength to HIGH for Amazon Bedrock guardrails to prevent prompt injection attacks and ensure robust protection against content manipulation.",
"Url": "https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-injection.html"
"Text": "Set the **Prompt attack** filter to `HIGH` and apply **defense in depth**:\n- Tag user/external inputs as untrusted for evaluation\n- Combine with denied topics and sensitive-info filters\n- Enforce **least privilege** and approvals for risky actions\n- Monitor guardrail hits and tune to reduce false negatives",
"Url": "https://hub.prowler.com/check/bedrock_guardrail_prompt_attack_filter_enabled"
}
},
"Categories": [
@@ -1,27 +1,37 @@
{
"Provider": "aws",
"CheckID": "bedrock_guardrail_sensitive_information_filter_enabled",
"CheckTitle": "Configure Sensitive Information Filters for Amazon Bedrock Guardrails.",
"CheckType": [],
"CheckTitle": "Amazon Bedrock guardrail blocks or masks sensitive information",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/AWS Security Best Practices/Runtime Behavior Analysis",
"Effects/Data Exposure",
"Sensitive Data Identifications/PII"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:bedrock:region:account-id:guardrails/resource-id",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Other",
"ResourceGroup": "ai_ml",
"Description": "Ensure that sensitive information filters are enabled for Amazon Bedrock guardrails to prevent the leakage of sensitive data such as personally identifiable information (PII), financial data, or confidential corporate information.",
"Risk": "If sensitive information filters are not enabled, Bedrock models may inadvertently generate or expose confidential or sensitive information in responses, leading to data breaches, regulatory violations, or reputational damage.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html",
"Description": "**Bedrock guardrails** use **sensitive information filters** to `block` or `mask` detected PII and custom pattern matches in prompts and responses.\n\nThe evaluation looks for guardrails with this filtering configured.",
"Risk": "Absent filtering, prompts or outputs can reveal **PII**, credentials, or financial records, compromising **confidentiality**.\n- Exposed tokens enable unauthorized access and data tampering (integrity)\n- Disclosed customer details facilitate fraud and identity theft, with potential lateral movement",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails-sensitive-filters.html",
"https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/guardrails-with-pii-mask-block.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails.html"
],
"Remediation": {
"Code": {
"CLI": "aws bedrock put-guardrails-configuration --guardrails-config 'sensitiveInformationFilter=true'",
"CLI": "aws bedrock update-guardrail --guardrail-identifier <example_resource_id> --sensitive-information-policy-config '{\"piiEntitiesConfig\":[{\"type\":\"EMAIL\",\"action\":\"ANONYMIZE\"}]}'",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/guardrails-with-pii-mask-block.html",
"Other": "1. Sign in to the AWS Console and open Amazon Bedrock\n2. Go to Guardrails and select <example_resource_name>\n3. Click Edit (or Open draft) and open Sensitive information filters\n4. Add PII type EMAIL and set action to Mask (or Block)\n5. Click Save",
"Terraform": ""
},
"Recommendation": {
"Text": "Enable sensitive information filters for Amazon Bedrock guardrails to prevent the exposure of sensitive or confidential information.",
"Url": "https://docs.aws.amazon.com/bedrock/latest/userguide/guardrails-sensitive-filters.html"
"Text": "Enable and tune **sensitive information filters** for inputs and outputs.\n- Use `BLOCK` for high-risk disclosures; `ANONYMIZE` when context is needed\n- Add custom regex for org-specific IDs\n- Apply least privilege and data minimization\n- Test regularly and monitor outcomes as part of defense-in-depth",
"Url": "https://hub.prowler.com/check/bedrock_guardrail_sensitive_information_filter_enabled"
}
},
"Categories": [
@@ -1,27 +1,35 @@
{
"Provider": "aws",
"CheckID": "bedrock_model_invocation_logging_enabled",
"CheckTitle": "Ensure that model invocation logging is enabled for Amazon Bedrock.",
"CheckType": [],
"CheckTitle": "Amazon Bedrock model invocation logging is enabled",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:bedrock:region:account-id:model/resource-id",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "Other",
"ResourceGroup": "ai_ml",
"Description": "Ensure that model invocation logging is enabled for Amazon Bedrock service in order to collect metadata, requests, and responses for all model invocations in your AWS cloud account.",
"Risk": "In Amazon Bedrock, model invocation logging enables you to collect the invocation request and response data, along with metadata, for all 'Converse', 'ConverseStream', 'InvokeModel', and 'InvokeModelWithResponseStream' API calls in your AWS account. Each log entry includes important details such as the timestamp, request ID, model ID, and token usage. Invocation logs can be utilized for troubleshooting, performance enhancements, abuse detection, and security auditing. By default, model invocation logging is disabled.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/model-invocation-logging.html",
"Description": "**Bedrock** model invocation logging captures request, response, and metadata for `Converse`, `ConverseStream`, `InvokeModel`, and `InvokeModelWithResponseStream` calls per Region, delivering records to **CloudWatch Logs** and/or **S3** when configured.",
"Risk": "Without **invocation logs**, you lose **auditability** and **forensic visibility** into model activity.\n\nCredential misuse or **prompt injection/jailbreak** attempts may go unnoticed, enabling data exfiltration and unauthorized spend. Missing traceability weakens **integrity** controls and slows incident response.",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/bedrock/latest/userguide/model-invocation-logging.html#model-invocation-logging-console",
"https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/enable-model-invocation-logging.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/model-invocation-logging.html"
],
"Remediation": {
"Code": {
"CLI": "aws bedrock put-model-invocation-logging-configuration --logging-config 's3Config={bucketName='tm-bedrock-logging-data',keyPrefix='invocation-logs'},textDataDeliveryEnabled=true,imageDataDeliveryEnabled=true,embeddingDataDeliveryEnabled=true'",
"CLI": "aws bedrock put-model-invocation-logging-configuration --logging-config '{\"s3Config\":{\"bucketName\":\"<example_resource_name>\"},\"textDataDeliveryEnabled\":true}'",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/Bedrock/enable-model-invocation-logging.html",
"Other": "1. Open the Amazon Bedrock console in the target Region\n2. Go to Settings > Model invocation logging\n3. Toggle Logging to On\n4. Select Amazon S3 as the destination and choose <example_resource_name> bucket\n5. Under Data types, select Text\n6. Click Save",
"Terraform": ""
},
"Recommendation": {
"Text": "Enable model invocation logging for Amazon Bedrock service in order to collect metadata, requests, and responses for all model invocations in your AWS cloud account.",
"Url": "https://docs.aws.amazon.com/bedrock/latest/userguide/model-invocation-logging.html#model-invocation-logging-console"
"Text": "Enable **model invocation logging** and route events to **CloudWatch Logs** and/or **S3**.\n\nEnforce **least privilege** on log access, use encryption, and set retention/lifecycle policies. Monitor for anomalies and alerts to support **defense in depth** and **separation of duties**.",
"Url": "https://hub.prowler.com/check/bedrock_model_invocation_logging_enabled"
}
},
"Categories": [
@@ -1,27 +1,38 @@
{
"Provider": "aws",
"CheckID": "bedrock_model_invocation_logs_encryption_enabled",
"CheckTitle": "Ensure that Amazon Bedrock model invocation logs are encrypted with KMS.",
"CheckType": [],
"CheckTitle": "Amazon Bedrock model invocation logs are encrypted in the S3 bucket and KMS-encrypted in the CloudWatch log group",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/CIS AWS Foundations Benchmark",
"Effects/Data Exposure"
],
"ServiceName": "bedrock",
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:bedrock:region:account-id:model/resource-id",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Other",
"ResourceType": "AwsS3Bucket",
"ResourceGroup": "ai_ml",
"Description": "Ensure that Amazon Bedrock model invocation logs are encrypted using AWS KMS to protect sensitive data in the request and response logs for all model invocations.",
"Risk": "If Amazon Bedrock model invocation logs are not encrypted, sensitive data such as prompts, responses, and token usage could be exposed to unauthorized parties. This may lead to data breaches, security vulnerabilities, or unintended use of sensitive information.",
"RelatedUrl": "https://docs.aws.amazon.com/bedrock/latest/userguide/data-protection.html",
"Description": "**Bedrock model invocation logs** are stored in encrypted destinations: **S3 buckets** with bucket encryption and **CloudWatch Logs** groups protected by an AWS KMS key.\n\nThis evaluates whether configured log targets enforce encryption at rest for request/response content and associated metadata.",
"Risk": "Without encryption at rest, prompts, outputs, images, and token/usage metadata in logs can be read if S3 or CloudWatch storage or replicas are accessed by unauthorized principals.\n\nImpacts:\n- Loss of data **confidentiality**\n- Secret or PII exposure enabling **account abuse**\n- Operational intel for **lateral movement**",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/bedrock/latest/userguide/data-protection.html",
"https://docs.aws.amazon.com/bedrock/latest/userguide/model-invocation-logging.html",
"https://support.icompaas.com/support/solutions/articles/62000233532-ensure-that-amazon-bedrock-model-invocation-logs-are-encrypted-with-kms",
"https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/configure-bedrock-invocation-logging-cloudformation.html"
],
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
"NativeIaC": "```yaml\nResources:\n EncryptedLogsBucket:\n Type: AWS::S3::Bucket\n Properties:\n BucketName: <example_resource_name>\n BucketEncryption: # critical: enables default encryption for the bucket (SSE-S3)\n ServerSideEncryptionConfiguration:\n - ServerSideEncryptionByDefault:\n SSEAlgorithm: AES256\n\n EncryptedLogGroup:\n Type: AWS::Logs::LogGroup\n Properties:\n LogGroupName: <example_resource_name>\n KmsKeyId: <example_kms_key_arn> # critical: encrypts the CloudWatch log group with a KMS key\n```",
"Other": "1. In the Bedrock console, go to Settings and note the S3 bucket and CloudWatch log group used for Model invocation logging.\n2. S3 bucket: AWS Console > S3 > Buckets > <bucket> > Properties > Default encryption > Enable > Choose SSE-S3 (AES-256) > Save.\n3. CloudWatch Logs: AWS Console > CloudWatch > Logs > Log groups > select <log group> > Actions > Edit > KMS encryption > select <KMS key> > Save.",
"Terraform": "```hcl\nresource \"aws_s3_bucket\" \"example\" {\n bucket = \"<example_resource_name>\"\n}\n\nresource \"aws_s3_bucket_server_side_encryption_configuration\" \"example\" {\n bucket = aws_s3_bucket.example.id\n rule {\n apply_server_side_encryption_by_default {\n sse_algorithm = \"AES256\" # critical: enables default encryption for the S3 bucket\n }\n }\n}\n\nresource \"aws_cloudwatch_log_group\" \"example\" {\n name = \"<example_resource_name>\"\n kms_key_id = \"<example_kms_key_arn>\" # critical: encrypts the log group with a KMS key\n}\n```"
},
"Recommendation": {
"Text": "Ensure that model invocation logs for Amazon Bedrock are encrypted using AWS KMS to prevent unauthorized access to sensitive log data.",
"Url": "hhttps://docs.aws.amazon.com/bedrock/latest/userguide/data-protection.html"
"Text": "Ensure all invocation logs are encrypted end to end:\n- Enable S3 default encryption, preferably `SSE-KMS`, and restrict key usage\n- Assign a KMS key to CloudWatch log groups\n- Enforce **least privilege** on keys and logs, rotate keys, and monitor access for **defense in depth**",
"Url": "https://hub.prowler.com/check/bedrock_model_invocation_logs_encryption_enabled"
}
},
"Categories": [
@@ -1,29 +1,34 @@
{
"Provider": "aws",
"CheckID": "datasync_task_logging_enabled",
"CheckTitle": "DataSync tasks should have logging enabled",
"CheckTitle": "DataSync task has CloudWatch Logs log group configured for logging",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices"
"Software and Configuration Checks/AWS Security Best Practices",
"Software and Configuration Checks/Industry and Regulatory Standards/AWS Foundational Security Best Practices"
],
"ServiceName": "datasync",
"SubServiceName": "",
"ResourceIdTemplate": "arn:aws:datasync:{region}:{account-id}:task/{task-id}",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "AwsDataSyncTask",
"ResourceType": "Other",
"ResourceGroup": "storage",
"Description": "This control checks if AWS DataSync tasks have logging enabled. The control fails if the task doesn't have the CloudWatchLogGroupArn property defined.",
"Risk": "Without logging enabled, important operational data may be lost, making it difficult to troubleshoot issues, monitor performance, and ensure compliance with auditing requirements.",
"RelatedUrl": "https://docs.aws.amazon.com/datasync/latest/userguide/monitor-datasync.html#enable-logging",
"Description": "**AWS DataSync tasks** are evaluated for a configured **CloudWatch Logs** destination (`CloudWatchLogGroupArn`).\n\nTasks that specify a log group are recognized as logging-enabled; those without one are identified as not publishing execution events.",
"Risk": "**Absent DataSync task logs** create blind spots, preventing timely detection of **failed or partial transfers**, unexpected deletions, or anomalies. This undermines data **integrity** verification, obscures potential **exfiltration** indicators, and slows forensics and recovery, reducing **availability** during incidents.",
"RelatedUrl": "",
"AdditionalURLs": [
"https://support.icompaas.com/support/solutions/articles/62000233637-ensure-datasync-tasks-should-have-logging-enabled",
"https://docs.aws.amazon.com/datasync/latest/userguide/monitor-datasync.html#enable-logging"
],
"Remediation": {
"Code": {
"CLI": "aws datasync update-task --task-arn <task-arn> --cloud-watch-log-group-arn <log-group-arn>",
"NativeIaC": "",
"Other": "https://docs.aws.amazon.com/datasync/latest/userguide/monitor-datasync.html#enable-logging",
"Terraform": ""
"NativeIaC": "```yaml\n# CloudFormation: Enable CloudWatch Logs for a DataSync task\nResources:\n <example_resource_name>:\n Type: AWS::DataSync::Task\n Properties:\n SourceLocationArn: <example_source_location_arn>\n DestinationLocationArn: <example_destination_location_arn>\n CloudWatchLogGroupArn: <example_log_group_arn> # Critical: attaches a CloudWatch Logs group to enable task logging\n```",
"Other": "1. In the AWS Console, go to DataSync > Tasks\n2. Select the task and click Edit\n3. In the Logging section, set CloudWatch Log group to an existing log group\n4. Click Save",
"Terraform": "```hcl\n# Enable CloudWatch Logs for a DataSync task\nresource \"aws_datasync_task\" \"<example_resource_name>\" {\n source_location_arn = \"<example_source_location_arn>\"\n destination_location_arn = \"<example_destination_location_arn>\"\n cloudwatch_log_group_arn = \"<example_log_group_arn>\" # Critical: attaches a CloudWatch Logs group to enable task logging\n}\n```"
},
"Recommendation": {
"Text": "Configure logging for your DataSync tasks to ensure that operational data is captured and available for debugging, monitoring, and auditing purposes.",
"Url": "https://docs.aws.amazon.com/datasync/latest/userguide/monitor-datasync.html#enable-logging"
"Text": "Configure each task to publish logs to a dedicated CloudWatch Logs group. Select an appropriate log level (e.g., `BASIC` or `TRANSFER`), enforce **least privilege** for log access, set **retention** and immutability, and integrate alerts. Centralize and monitor logs to support **defense in depth** and incident response.",
"Url": "https://hub.prowler.com/check/datasync_task_logging_enabled"
}
},
"Categories": [
@@ -25,8 +25,8 @@ class dms_instance_no_public_access(Check):
if check_security_group(
ingress_rule,
"-1",
ports=None,
any_address=True,
all_ports=True,
):
report.status = "FAIL"
report.status_extended = f"DMS Replication Instance {instance.id} is set as publicly accessible and security group {security_group.name} ({security_group.id}) is open to the Internet."
@@ -31,7 +31,7 @@ class ec2_securitygroup_allow_ingress_from_internet_to_any_port(Check):
report.status_extended = f"Security group {security_group.name} ({security_group.id}) does not have any port open to the Internet."
for ingress_rule in security_group.ingress_rules:
if check_security_group(
ingress_rule, "-1", ports=None, any_address=True
ingress_rule, "-1", any_address=True, all_ports=True
):
self.check_enis(
report=report,
@@ -3,10 +3,14 @@ from typing import Any
def check_security_group(
ingress_rule: Any, protocol: str, ports: list = [], any_address: bool = False
ingress_rule: Any,
protocol: str,
ports: list | None = None,
any_address: bool = False,
all_ports: bool = False,
) -> bool:
"""
Check if the security group ingress rule has public access to the check_ports using the protocol
Check if the security group ingress rule has public access to the check_ports using the protocol.
@param ingress_rule: AWS Security Group IpPermissions Ingress Rule
{
@@ -29,13 +33,17 @@ def check_security_group(
@param protocol: Protocol to check. If -1, all protocols will be checked.
@param ports: List of ports to check. If empty, any port will be checked. If None, any port will be checked. (Default: [])
@param ports: List of ports to check. If None or empty, no specific ports are matched; set all_ports=True to treat an empty list as all ports. (Default: None)
@param any_address: If True, only 0.0.0.0/0 or "::/0" will be public and do not search for public addresses. (Default: False)
@param all_ports: If True, empty ports list will be treated as all ports. (Default: False)
@return: True if the security group has public access to the check_ports using the protocol
"""
if ports is None:
ports = []
# Check for all traffic ingress rules regardless of the protocol
if ingress_rule["IpProtocol"] == "-1":
for ip_ingress_rule in ingress_rule["IpRanges"]:
@@ -54,54 +62,42 @@ def check_security_group(
# Check for specific ports in ingress rules
if "FromPort" in ingress_rule:
# If there is a port range
# If the ports are not the same create a covering range.
# Note range is exclusive of the end value so we add 1 to the ToPort.
if ingress_rule["FromPort"] != ingress_rule["ToPort"]:
# Calculate port range, adding 1
diff = (ingress_rule["ToPort"] - ingress_rule["FromPort"]) + 1
ingress_port_range = []
for x in range(diff):
ingress_port_range.append(int(ingress_rule["FromPort"]) + x)
# If FromPort and ToPort are the same
ingress_port_range = set(
range(ingress_rule["FromPort"], ingress_rule["ToPort"] + 1)
)
else:
ingress_port_range = []
ingress_port_range.append(int(ingress_rule["FromPort"]))
ingress_port_range = {int(ingress_rule["FromPort"])}
# Test Security Group
# Combine IPv4 and IPv6 ranges to facilitate a single check loop.
all_ingress_rules = []
all_ingress_rules.extend(ingress_rule["IpRanges"])
all_ingress_rules.extend(ingress_rule["Ipv6Ranges"])
for ip_ingress_rule in all_ingress_rules:
# We only check public CIDRs
if _is_cidr_public(
ip_ingress_rule.get("CidrIp", ip_ingress_rule.get("CidrIpv6")),
any_address,
):
for port in ports:
if port in ingress_port_range and (
ingress_rule["IpProtocol"] == protocol or protocol == "-1"
):
# Direct match for a port in the specified port range
return True
# We did not find a specific port for the given protocol for
# a public cidr so let's see if all the ports are open
all_ports_open = len(ingress_port_range) == 65536
# Use the all_ports flag to determine if empty ports should be treated as all ports.
empty_ports_same_as_all_ports_open = all_ports and not ports
return all_ports_open or empty_ports_same_as_all_ports_open
return False
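To see how the port-range expansion and the combined IPv4/IPv6 loop interact, here is a minimal standalone sketch of the matching logic for a single ingress rule. The `check_ingress_rule` name and the simplified rule dict are illustrative only (the real function takes more context and also handles the `IpProtocol == "-1"` all-traffic case first):

```python
import ipaddress


def _is_cidr_public(cidr: str, any_address: bool = False) -> bool:
    # 0.0.0.0/0 and ::/0 are always public; otherwise fall back to is_global
    if cidr in ("0.0.0.0/0", "::/0"):
        return True
    if not any_address:
        return ipaddress.ip_network(cidr).is_global
    return False


def check_ingress_rule(ingress_rule, protocol, ports=None, all_ports=False, any_address=False):
    # Simplified re-implementation of the matching logic for one rule
    if ports is None:
        ports = []
    # Expand FromPort..ToPort into a set (range() excludes the end, hence +1)
    if ingress_rule["FromPort"] != ingress_rule["ToPort"]:
        ingress_port_range = set(
            range(ingress_rule["FromPort"], ingress_rule["ToPort"] + 1)
        )
    else:
        ingress_port_range = {int(ingress_rule["FromPort"])}
    # Combine IPv4 and IPv6 ranges into a single loop
    all_ranges = ingress_rule["IpRanges"] + ingress_rule["Ipv6Ranges"]
    for ip_rule in all_ranges:
        if _is_cidr_public(ip_rule.get("CidrIp", ip_rule.get("CidrIpv6")), any_address):
            for port in ports:
                if port in ingress_port_range and (
                    ingress_rule["IpProtocol"] == protocol or protocol == "-1"
                ):
                    return True
            # No specific port matched: pass only for full-range rules,
            # or when an empty ports list is meant to mean "all ports"
            return len(ingress_port_range) == 65536 or (all_ports and not ports)
    return False


rule = {
    "IpProtocol": "tcp", "FromPort": 20, "ToPort": 25,
    "IpRanges": [{"CidrIp": "0.0.0.0/0"}], "Ipv6Ranges": [],
}
print(check_ingress_rule(rule, "tcp", ports=[22]))                # True: 22 is inside 20-25
print(check_ingress_rule(rule, "tcp", ports=[443]))               # False: 443 is outside the range
print(check_ingress_rule(rule, "tcp", ports=[], all_ports=True))  # True: empty ports == all ports
```

Note the early `return` on the first public CIDR: once a public range is found, the rule's verdict is decided by that range's ports alone.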
@@ -120,3 +116,4 @@ def _is_cidr_public(cidr: str, any_address: bool = False) -> bool:
return True
if not any_address:
return ipaddress.ip_network(cidr).is_global
return False
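For reference, `is_global` in the stdlib `ipaddress` module excludes RFC 1918 and other IANA special-purpose ranges, which is why the helper short-circuits the catch-all CIDRs before consulting it. A quick demonstration:

```python
import ipaddress

# Globally routable ranges report is_global == True;
# private and other special-purpose ranges do not.
print(ipaddress.ip_network("8.8.8.0/24").is_global)      # True: routable internet range
print(ipaddress.ip_network("10.0.0.0/8").is_global)      # False: RFC 1918 private
print(ipaddress.ip_network("192.168.1.0/24").is_global)  # False: RFC 1918 private
```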
@@ -1,30 +1,40 @@
{
"Provider": "aws",
"CheckID": "rds_cluster_backtrack_enabled",
"CheckTitle": "Check if RDS Aurora MySQL Clusters have backtrack enabled.",
"CheckType": [],
"CheckTitle": "RDS Aurora MySQL cluster has Backtrack enabled",
"CheckType": [
"Software and Configuration Checks/AWS Security Best Practices"
],
"ServiceName": "rds",
"SubServiceName": "",
"ResourceIdTemplate": "arn:aws:rds:region:account-id:db-cluster",
"Severity": "medium",
"ResourceIdTemplate": "",
"Severity": "low",
"ResourceType": "AwsRdsDbCluster",
"ResourceGroup": "database",
"Description": "Ensure that the Backtrack feature is enabled for your Amazon Aurora (with MySQL compatibility) database clusters in order to backtrack your clusters to a specific time, without using backups. Backtrack is an Amazon RDS feature that allows you to specify the amount of time that an Aurora MySQL database cluster needs to retain change records, in order to have a fast way to recover from user errors, such as dropping the wrong table or deleting the wrong row by moving your MySQL database to a prior point in time without the need to restore from a recent backup.",
"Risk": "Once the Backtrack feature is enabled, Amazon RDS can quickly 'rewind' your Aurora MySQL database cluster to a point in time that you specify. In contrast to the backup and restore method, with Backtrack you can easily undo a destructive action, such as a DELETE query without a WHERE clause, with minimal downtime, you can rewind your Aurora cluster in just few minutes, and you can repeatedly backtrack a database cluster back and forth in time to help determine when a particular data change occurred.",
"RelatedUrl": "https://docs.aws.amazon.com/securityhub/latest/userguide/rds-controls.html#rds-14",
"Description": "**Aurora MySQL DB clusters** have **Backtrack** configured with a non-zero `BacktrackWindow`, retaining change records to allow rewinding to a consistent earlier time. *Applies to `aurora-mysql` engines only.*",
"Risk": "Without **Backtrack**, destructive queries or admin mistakes can't be quickly undone, forcing snapshot/point-in-time restores. This increases recovery time, disrupts availability, and risks data **integrity** from partial restores or rollbacks.\n\nAdversaries who alter data can cause longer impact windows before containment.",
"RelatedUrl": "",
"AdditionalURLs": [
"https://docs.aws.amazon.com/config/latest/developerguide/aurora-mysql-backtracking-enabled.html",
"https://docs.aws.amazon.com/securityhub/latest/userguide/rds-controls.html#rds-14",
"https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Managing.Backtrack.html",
"https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/RDS/backtrack.html#"
],
"Remediation": {
"Code": {
"CLI": "aws rds restore-db-cluster-to-point-in-time --region <REGION> --source-db-cluster-identifier <SOURCE_DB_CLUSTER_ID> --db-cluster-identifier <DB_CLUSTER_ID> --restore-type copy-on-write --use-latest-restorable-time --backtrack-window 86400",
"NativeIaC": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/RDS/backtrack.html#",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/RDS/backtrack.html#",
"Terraform": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/aws/RDS/backtrack.html#"
"CLI": "aws rds restore-db-cluster-to-point-in-time --source-db-cluster-identifier <SOURCE_DB_CLUSTER_ID> --db-cluster-identifier <DB_CLUSTER_ID> --use-latest-restorable-time --backtrack-window 3600",
"NativeIaC": "```yaml\n# CloudFormation: Create Aurora MySQL cluster with Backtrack enabled\nResources:\n <example_resource_name>:\n Type: AWS::RDS::DBCluster\n Properties:\n Engine: aurora-mysql\n MasterUsername: \"<MASTER_USERNAME>\"\n MasterUserPassword: \"<MASTER_PASSWORD>\"\n BacktrackWindow: 3600 # CRITICAL: Enables Backtrack (seconds) so the check passes\n```",
"Other": "1. In the AWS Console, go to RDS > Databases and select the Aurora MySQL cluster\n2. Click Actions > Restore to point in time\n3. Choose Use latest restorable time\n4. Set Backtrack window (seconds) to a value > 0 (e.g., 3600)\n5. Enter a new DB cluster identifier and click Restore DB cluster\n6. Cut over applications to the new cluster",
"Terraform": "```hcl\n# Terraform: Aurora MySQL cluster with Backtrack enabled\nresource \"aws_rds_cluster\" \"<example_resource_name>\" {\n engine = \"aurora-mysql\"\n master_username = \"<MASTER_USERNAME>\"\n master_password = \"<MASTER_PASSWORD>\"\n backtrack_window = 3600 # CRITICAL: Enables Backtrack (seconds) so the check passes\n}\n```"
},
"Recommendation": {
"Text": "Backups help you to recover more quickly from a security incident. They also strengthens the resilience of your systems. Aurora backtracking reduces the time to recover a database to a point in time. It does not require a database restore to do so. You cannot enable backtracking on an existing cluster. Instead, you can create a clone that has backtracking enabled.",
"Url": "https://docs.aws.amazon.com/securityhub/latest/userguide/rds-controls.html#rds-14"
"Text": "Enable **Backtrack** on Aurora MySQL clusters and set `BacktrackWindow` to meet RTO while balancing cost and workload. Use it with automated backups for **defense in depth** and resilience.\n\n*For clusters without Backtrack*, provision a clone or new cluster with it enabled; monitor usage and adjust the window as change rates evolve.",
"Url": "https://hub.prowler.com/check/rds_cluster_backtrack_enabled"
}
},
"Categories": [],
"Categories": [
"resilience"
],
"DependsOn": [],
"RelatedTo": [],
"Notes": ""
