Compare commits


107 Commits

Author SHA1 Message Date
Chandrapal Badshah 77451d5cf8 chore: update changelog 2025-09-23 16:45:21 +05:30
Chandrapal Badshah 5b54b47930 fix(lighthouse): allow scrolling during AI response streaming (#8669)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
(cherry picked from commit 3949ab736d)
2025-09-23 09:09:41 +00:00
Prowler Bot a1168e3082 fix: handle 4XX and 204 properly (#8732)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2025-09-15 17:43:18 +02:00
Prowler Bot f2341c9878 chore(changelog): remove whitespace in links (#8718)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-09-12 18:10:48 +05:45
Prowler Bot 67b8e925e5 chore(release): Bump version to v5.12.2 (#8713)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-09-12 13:24:37 +02:00
Prowler Bot ad4475efc9 fix(firehose): false positive in firehose_stream_encrypted_at_rest (#8707)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-09-12 10:02:25 +02:00
Prowler Bot 4dd6547b9c fix(auth): validate email field (#8706)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-09-11 16:25:35 +02:00
Prowler Bot cc4d759f47 fix(auth): add method attribute to form for proper submission handling (#8705)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
2025-09-11 19:36:24 +05:45
Prowler Bot e9aca866c8 fix(defender): change policies rules key (#8703)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: HugoPBrito <hugopbrit@gmail.com>
2025-09-11 14:00:52 +02:00
Prowler Bot 12f9e477a3 fix(compliance): replace old check id with new one (#8686)
Co-authored-by: Daniel Barranquero <74871504+danibarranqueroo@users.noreply.github.com>
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2025-09-09 15:34:09 +02:00
Prowler Bot a2a3b7c125 chore(release): Bump version to v5.12.1 (#8680)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-09-09 16:35:54 +05:45
Pepe Fagoaga 31f34fd15e chore(api): Use Prowler from v5.12 (#8678) 2025-09-09 14:29:31 +05:45
Adrián Jesús Peña Rodríguez 3f5178bffb chore: update api changelog (#8677) 2025-09-09 10:23:55 +02:00
Josema Camacho e08b272a1d fix(login): add DRF throttle option for dj-rest-auth lib (#8672) 2025-09-09 09:34:02 +02:00
Pedro Martín 64c43a288d feat(jira): add force accept language for requests (#8674) 2025-09-09 13:17:25 +05:45
Daniel Barranquero 74bf0e6b47 fix(aws): nonetype errors in opensearch, firehose and cognito (#8670) 2025-09-09 13:12:57 +05:45
Andoni Alonso 02b7c5328f docs: update providers table (#8676)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2025-09-09 09:25:20 +02:00
Alejandro Bailo bb02004e7c fix: social auth buttons showed for sign-up (#8673) 2025-09-09 09:23:56 +02:00
Andoni Alonso 82cf216a74 feat(mongodbatlas): add MongoDB Atlas provider PoC (#8312)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2025-09-09 09:18:37 +02:00
Daniel Barranquero 7916425ed4 fix(memorydb): handle clusters with no security groups (#8666) 2025-09-08 15:05:13 -04:00
Andoni Alonso d98063ed47 docs: add interface column to providers (#8675) 2025-09-08 15:03:17 -04:00
Andoni Alonso 27bf78a3a1 docs: update providers list (#8671) 2025-09-08 17:12:16 +02:00
Andoni Alonso f50bd50d60 docs: add SSO with SAML Entra ID video link (#8668) 2025-09-08 14:57:38 +02:00
Alejandro Bailo 80665e0396 feat(ui): send a finding to Jira (#8649) 2025-09-08 14:15:23 +02:00
Pedro Martín 4b259fa8dd chore(changelog): update with latest changes (#8665) 2025-09-08 17:24:31 +05:45
Hugo Pereira Brito 10db2ed6d8 chore(docs): add notes regarding gov accounts support (#8656) 2025-09-08 11:07:00 +02:00
Chandrapal Badshah 422a8a0f62 fix: change title in lighthouse settings (#8615)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-09-08 10:34:09 +02:00
Daniel Barranquero 906a2cc651 fix(entra): add metadata description for check entra_admin_users_phishing_resistant_mfa_enabled (#8654) 2025-09-08 08:11:46 +02:00
Víctor Fernández Poyatos 43fe9c6860 feat(integrations): allow sending findings to Jira from the API (#8645) 2025-09-05 14:28:34 +02:00
Andoni Alonso f87b2089fb docs: remove llms.txt (#8653) 2025-09-05 17:08:42 +05:45
Samuele Pasini 1884874ab6 fix: typo ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_* CheckID (#8294)
Co-authored-by: Daniel Barranquero <danielbo2001@gmail.com>
2025-09-05 13:16:12 +02:00
Andoni Alonso cd6d29e176 docs: reorg tutorials (#8652) 2025-09-05 16:49:14 +05:45
Pedro Martín 0b7055e983 feat(jira): add send_finding method with specific finding fields (#8648) 2025-09-05 12:25:53 +02:00
Josema Camacho ae53b76d78 feat(login): add DJANGO_THROTTLE_TOKEN_OBTAIN to main .env file (#8650) 2025-09-05 16:01:48 +05:45
Josema Camacho 406e473b5c feat(login): add throttling option for the /api/v1/tokens endpoint (#8647) 2025-09-05 14:37:31 +05:45
Pedro Martín 1a2bf461f0 feat(jira): support labels in jira tickets (#8603) 2025-09-05 09:53:24 +02:00
Samuele Pasini 1b49c0b27f feat: add --excluded-checks-file flag (#8301)
Co-authored-by: pedrooot <pedromarting3@gmail.com>
2025-09-05 09:33:21 +02:00
Pablo Lara 12ada66978 feat: add status filter to /overviews endpoint (#8186)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-09-04 18:46:14 +02:00
Alejandro Bailo daa2536005 feat: Jira UI integration - pages and server actions (#8640) 2025-09-04 15:59:37 +02:00
Chandrapal Badshah 69a62db19a chore: rename to lighthouse ai (#8614)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-09-04 15:30:07 +05:45
Pedro Martín 79450d6977 fix(securityhub): resolve TypeError from Python3.9 (#8619)
Co-authored-by: Hugo Pereira Brito <101209179+HugoPBrito@users.noreply.github.com>
2025-09-03 17:52:09 +02:00
Víctor Fernández Poyatos 0463fd0830 refactor(integrations-jira): Move domain to credentials and retrieve metadata during connection test (#8637) 2025-09-03 17:24:42 +02:00
Alejandro Bailo b15e3d339c fix(saml): remove validation call on email domain change (#8638) 2025-09-03 17:04:51 +02:00
Pedro Martín 1fc12952ba feat(jira): add color for manual status (#8642) 2025-09-03 16:53:31 +02:00
sumit-tft 088a6bcbda feat(ui): handle no-permissions on scan page (#8624)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-09-03 15:51:14 +02:00
Hugo Pereira Brito a3b0bb6d4b refactor(models): rename AdditionalUrls to AdditionalURLs (#8639) 2025-09-03 19:34:06 +05:45
Pedro Martín 3c819f8875 chore(changelog): update with latest changes (#8636) 2025-09-03 12:54:50 +02:00
Pedro Martín cdf0292bbc feat(jira): add get_metadata (#8630) 2025-09-03 10:59:07 +02:00
César Arroba 987121051b chore(sdk): comment push readme to dockerhub steps (#8628) 2025-09-02 21:48:42 +05:45
Hugo Pereira Brito c9ed7773d2 feat(models): add AdditionalUrls field to check metadata (#8590) 2025-09-02 21:27:21 +05:45
Pepe Fagoaga fdf45aac51 fix(img): prowler architecture (#8635) 2025-09-02 21:15:40 +05:45
Alejandro Bailo 3ded224a4b fix: new errors detected through the app (#8629) 2025-09-02 12:35:06 +02:00
sumit-tft 230a085c76 fix(ui): display NoProvidersAdded when no cloud providers are configured (#8626) 2025-09-02 12:33:58 +02:00
Chandrapal Badshah 8cd90e07dc chore(ui): eslint nextjs files (#8627)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-09-02 12:15:48 +02:00
Pedro Martín 06ded98d05 feat(jira): add data to table and error handling (#8601) 2025-09-02 11:48:52 +02:00
Pedro Martín a5066326bd chore(changelog): update with latests changes (#8620) 2025-09-02 11:27:13 +02:00
Alejandro Bailo 83a9ac2109 chore(ui): update CHANGELOG (#8625) 2025-09-02 10:45:34 +02:00
Alejandro Bailo 136eb4facd feat: 50X errors handler (#8621) 2025-09-02 10:12:03 +02:00
Víctor Fernández Poyatos d4eb4bdca7 feat(integrations): Support JIRA integration in the API (#8622) 2025-09-02 09:53:36 +02:00
Alejandro Bailo 665c9d878a chore(ui): update Next.js and ESLint dependencies to version 14.2.32 (#8623) 2025-09-01 18:38:39 +02:00
Hugo Pereira Brito a064e43302 chore(ui): render attributes as markdown (#8604)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-09-01 16:43:36 +02:00
Daniel Barranquero fdb76e7820 feat(docs): update mfa enforcement date for m365 (#8610) 2025-09-01 09:48:21 +02:00
Pepe Fagoaga 1259bb85e3 fix: remove dot (#8613) 2025-08-29 14:46:19 +05:45
Pepe Fagoaga 0db9ab91b2 chore(docs): review stats, imgs and update copy (#8612) 2025-08-29 14:44:01 +05:45
César Arroba f6ea314ec0 chore(sdk): push readme file to docker hub (#8611) 2025-08-29 14:43:53 +05:45
Alejandro Bailo 9e02da342b docs: Security Hub API and UI documentation (#8576)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
2025-08-28 20:43:42 +05:45
Prowler Bot 358d4239c7 chore(release): Bump version to v5.12.0 (#8605)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-28 16:56:24 +02:00
Víctor Fernández Poyatos b003fca377 fix(docs): remove empty sections (#8600) 2025-08-28 12:55:46 +02:00
Víctor Fernández Poyatos b4deda3c3f docs(api): fix API response samples (#8592) 2025-08-28 12:39:07 +02:00
Sergio Garcia 338bb74c0c fix(azure): query API management logs with not empty operations (#8598) 2025-08-28 12:03:35 +02:00
Alejandro Bailo 7342a8901f chore: update CHANGELOG.md for Prowler v5.11.0 release (#8597) 2025-08-28 11:43:24 +02:00
Sergio Garcia f484b83f15 feat(azure): Add APIM threat detection for LLM jacking attacks (#8571)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-08-28 11:42:07 +02:00
Adrián Jesús Peña Rodríguez c69187f484 chore: prepare api changelog for 5.11 (#8596) 2025-08-28 10:25:08 +02:00
Alejandro Bailo 5038afeb26 fix(security-hub): copy updated (#8594) 2025-08-27 18:42:34 +02:00
Sergio Garcia fce43cea16 chore: update changelog (#8593) 2025-08-27 17:57:07 +02:00
Andoni Alonso 43a14b89bc fix(github): provider always scans user instead of organization when using provider UID (#8587) 2025-08-27 17:45:13 +02:00
Tom 24364bd73e feat(gcp): Add support for skipping APIs check (#8575)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-08-27 14:44:34 +02:00
Adrián Jesús Peña Rodríguez a1abe6dd2d fix(sh): reset regions information if connection fails (#8588) 2025-08-27 14:15:09 +02:00
César Arroba 25098bc82a chore(gha): fix conflict checker action (#8586) 2025-08-27 13:41:39 +02:00
sumit-tft 20f2f45610 feat(ui): add S3 bucket link with folder for each integration (#8554)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-08-27 12:40:37 +02:00
Alejandro Bailo 06c2608a05 feat(integrations): external links and copies changed (#8574) 2025-08-27 12:40:25 +02:00
Alejandro Bailo 329ac113f2 chore(docs): update CHANGELOG properly (#8585) 2025-08-27 11:57:12 +02:00
Hugo Pereira Brito 97179d2b43 fix(docs): incorrect permission in sp creation guide (#8581) 2025-08-27 11:01:37 +02:00
sumit-tft 8317ea783f feat(ui): show all provider UIDs in scan page filter regardless of co… (#8375) 2025-08-27 10:50:16 +02:00
Andoni Alonso 65e7e89d61 fix(github): GitHub Personal Access Token authentication fails without user:email scope (#8580) 2025-08-27 09:57:32 +02:00
Víctor Fernández Poyatos 26a4dd4e8d chore: bump h2 to 4.3.0 (#8573) 2025-08-26 15:17:06 +02:00
Alejandro Bailo dab0cea2dd feat(ui): Security Hub (#8552) 2025-08-26 14:30:45 +02:00
Daniel Barranquero 3b42eb3818 fix(s3): resource metadata error in s3_bucket_shadow_resource_vulnerability (#8572) 2025-08-26 13:30:49 +02:00
Prowler Bot a5ba950627 chore(regions_update): Changes in regions for AWS services (#8567)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-26 09:57:45 +02:00
Andoni Alonso a1232446c1 docs: refactor several sections (#8570) 2025-08-26 09:55:18 +02:00
Pedro Martín aa6f851887 docs(aws): deploying prowler iam roles across aws organizations (#8427)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-08-26 09:45:14 +02:00
Adrián Jesús Peña Rodríguez 25f972e910 feat(sh): create asff of there is an enabled SecurityHub integration (#8569) 2025-08-25 16:58:21 +02:00
Pedro Martín 7216e5ce3d chore(github): improve pull request template (#7910) 2025-08-25 16:22:55 +02:00
Adrián Jesús Peña Rodríguez 83242da0ab feat(integrations): implement AWS Security Hub integration (#8365) 2025-08-25 15:53:48 +02:00
Alejandro Bailo d457166a0c fix(ui): AWS form selector default values (#8553) 2025-08-25 12:30:02 +02:00
Daniel Barranquero 88f38b2d2a feat(docs): remove old requirements links (#8561) 2025-08-22 14:22:50 +02:00
Pepe Fagoaga c2e0849d5f fix(conflict-checker): use prowler-bot (#8560) 2025-08-22 17:27:44 +05:45
Andoni Alonso 1fdebfa295 docs: remove "Requirements" page (#8559) 2025-08-22 15:55:25 +05:45
Sergio Garcia ea6d04ed3a chore(securityhub): add static credentials and role assumption support (#8539)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-08-22 11:58:35 +02:00
Sergio Garcia 2167683851 feat(aws): add Resource Explorer enumeration actions (#8557) 2025-08-22 11:47:51 +02:00
Pepe Fagoaga 6324be31ab fix(api): poetry lock up to date with the SDK (#8558) 2025-08-22 11:05:14 +02:00
Alejandro Bailo 525f152e51 fix(ui): update authorization logic to match right paths (#8556) 2025-08-22 10:35:28 +02:00
Sergio Garcia c3a2d79234 chore(iac): change engine to trivy (#8466)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-08-22 10:17:51 +02:00
Andoni Alonso cefa708322 docs: add provider bulk provisioning (#8551) 2025-08-21 16:33:45 +02:00
Andoni Alonso 1a9e14ab2a chore(bulk-provisioning-tool): add script to bulk provision providers (#8540) 2025-08-21 13:11:46 +02:00
Chandrapal Badshah b1c6094b6d fix: Remove temperature for GPT-5 models (#8550)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-21 12:40:49 +02:00
Pablo Lara 1038b11fe3 docs: update changelog (#8549) 2025-08-21 12:22:27 +02:00
355 changed files with 27261 additions and 6048 deletions
+1
@@ -127,6 +127,7 @@ jQIDAQAB
DJANGO_SECRETS_ENCRYPTION_KEY="oE/ltOhp/n1TdbHjVmzcjDPLcLA41CVI/4Rk+UB5ESc="
DJANGO_BROKER_VISIBILITY_TIMEOUT=86400
DJANGO_SENTRY_DSN=
DJANGO_THROTTLE_TOKEN_OBTAIN=50/minute
# Sentry settings
SENTRY_ENVIRONMENT=local
+7
@@ -37,6 +37,11 @@ provider/iac:
- any-glob-to-any-file: "prowler/providers/iac/**"
- any-glob-to-any-file: "tests/providers/iac/**"
provider/mongodbatlas:
- changed-files:
- any-glob-to-any-file: "prowler/providers/mongodbatlas/**"
- any-glob-to-any-file: "tests/providers/mongodbatlas/**"
github_actions:
- changed-files:
- any-glob-to-any-file: ".github/workflows/*"
@@ -52,11 +57,13 @@ mutelist:
- any-glob-to-any-file: "prowler/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "prowler/providers/mongodbatlas/lib/mutelist/**"
- any-glob-to-any-file: "tests/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/aws/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/azure/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/gcp/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/kubernetes/lib/mutelist/**"
- any-glob-to-any-file: "tests/providers/mongodbatlas/lib/mutelist/**"
integration/s3:
- changed-files:
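The labeler rules above map changed file paths to PR labels: a label applies when any changed file matches any of its globs ("any-glob-to-any-file"). A minimal sketch of that semantics, approximated with Python's `fnmatch` (an assumption for illustration — the real action uses its own glob engine, so `**` handling may differ):

```python
from fnmatch import fnmatch

def label_applies(globs, changed_files):
    # "any-glob-to-any-file": true if any changed file matches any glob.
    # fnmatch's "*" also matches "/", so a single "*" approximates "**".
    return any(fnmatch(f, g) for f in changed_files for g in globs)

mongodb_globs = [
    "prowler/providers/mongodbatlas/*",
    "tests/providers/mongodbatlas/*",
]
print(label_applies(mongodb_globs, ["prowler/providers/mongodbatlas/models.py"]))  # True
print(label_applies(mongodb_globs, ["prowler/providers/aws/ec2.py"]))  # False
```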
+4
@@ -8,6 +8,10 @@ If fixes an issue please add it with `Fix #XXXX`
Please include a summary of the change and which issue is fixed. List any dependencies that are required for this change.
### Steps to review
Please add a detailed description of how to review this PR.
### Checklist
- Are there new checks included in this PR? Yes / No
+2 -1
@@ -71,7 +71,7 @@ jobs:
if: steps.conflict-check.outputs.has_conflicts == 'true'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
github-token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
@@ -97,6 +97,7 @@ jobs:
if: steps.conflict-check.outputs.has_conflicts == 'false'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
script: |
try {
await github.rest.issues.removeLabel({
@@ -157,6 +157,22 @@ jobs:
cache-from: type=gha
cache-to: type=gha,mode=max
# - name: Push README to Docker Hub (toniblyx)
# uses: peter-evans/dockerhub-description@432a30c9e07499fd01da9f8a49f0faf9e0ca5b77 # v4.0.2
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_TOKEN }}
# repository: ${{ env.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}
# readme-filepath: ./README.md
#
# - name: Push README to Docker Hub (prowlercloud)
# uses: peter-evans/dockerhub-description@432a30c9e07499fd01da9f8a49f0faf9e0ca5b77 # v4.0.2
# with:
# username: ${{ secrets.DOCKERHUB_USERNAME }}
# password: ${{ secrets.DOCKERHUB_TOKEN }}
# repository: ${{ env.PROWLERCLOUD_DOCKERHUB_REPOSITORY }}/${{ env.PROWLERCLOUD_DOCKERHUB_IMAGE }}
# readme-filepath: ./README.md
dispatch-action:
needs: container-build-push
runs-on: ubuntu-latest
+15
@@ -234,6 +234,21 @@ jobs:
run: |
poetry run pytest -n auto --cov=./prowler/providers/iac --cov-report=xml:iac_coverage.xml tests/providers/iac
# Test MongoDB Atlas
- name: MongoDB Atlas - Check if any file has changed
id: mongodb-atlas-changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
./prowler/providers/mongodbatlas/**
./tests/providers/mongodbatlas/**
.poetry.lock
- name: MongoDB Atlas - Test
if: steps.mongodb-atlas-changed-files.outputs.any_changed == 'true'
run: |
poetry run pytest -n auto --cov=./prowler/providers/mongodbatlas --cov-report=xml:mongodb_atlas_coverage.xml tests/providers/mongodbatlas
# Common Tests
- name: Lib - Test
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
+6
@@ -75,3 +75,9 @@ node_modules
# Persistent data
_data/
# Claude
CLAUDE.md
# LLM's (Until we have a standard one)
AGENTS.md
+27 -26
@@ -19,19 +19,16 @@
<a href="https://goto.prowler.com/slack"><img alt="Slack Shield" src="https://img.shields.io/badge/slack-prowler-brightgreen.svg?logo=slack"></a>
<a href="https://pypi.org/project/prowler/"><img alt="Python Version" src="https://img.shields.io/pypi/v/prowler.svg"></a>
<a href="https://pypi.python.org/pypi/prowler/"><img alt="Python Version" src="https://img.shields.io/pypi/pyversions/prowler.svg"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg?label=prowler%20downloads"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Downloads" src="https://img.shields.io/pypi/dw/prowler.svg?label=downloads"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/cloud/build/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/image-size/toniblyx/prowler"></a>
<a href="https://gallery.ecr.aws/prowler-cloud/prowler"><img width="120" height=19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
<a href="https://codecov.io/gh/prowler-cloud/prowler"><img src="https://codecov.io/gh/prowler-cloud/prowler/graph/badge.svg?token=OflBGsdpDl"/></a>
</p>
<p align="center">
<a href="https://github.com/prowler-cloud/prowler"><img alt="Repo size" src="https://img.shields.io/github/repo-size/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler/issues"><img alt="Issues" src="https://img.shields.io/github/issues/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler/releases"><img alt="Version" src="https://img.shields.io/github/v/release/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler/releases"><img alt="Version" src="https://img.shields.io/github/v/release/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler/releases"><img alt="Version" src="https://img.shields.io/github/release-date/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler"><img alt="Contributors" src="https://img.shields.io/github/contributors-anon/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler/issues"><img alt="Issues" src="https://img.shields.io/github/issues/prowler-cloud/prowler"></a>
<a href="https://github.com/prowler-cloud/prowler"><img alt="License" src="https://img.shields.io/github/license/prowler-cloud/prowler"></a>
<a href="https://twitter.com/ToniBlyx"><img alt="Twitter" src="https://img.shields.io/twitter/follow/toniblyx?style=social"></a>
<a href="https://twitter.com/prowlercloud"><img alt="Twitter" src="https://img.shields.io/twitter/follow/prowlercloud?style=social"></a>
@@ -55,15 +52,11 @@ Prowler includes hundreds of built-in controls to ensure compliance with standar
- **National Security Standards:** ENS (Spanish National Security Scheme)
- **Custom Security Frameworks:** Tailored to your needs
## Prowler CLI and Prowler Cloud
Prowler offers a Command Line Interface (CLI), known as Prowler Open Source, and an additional service built on top of it, called <a href="https://prowler.com">Prowler Cloud</a>.
## Prowler App
Prowler App is a web-based application that simplifies running Prowler across your cloud provider accounts. It provides a user-friendly interface to visualize the results and streamline your security assessments.
![Prowler App](docs/img/overview.png)
![Prowler App](docs/products/img/overview.png)
> For more details, refer to the [Prowler App Documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#prowler-app-installation)
@@ -80,28 +73,36 @@ prowler <provider>
```console
prowler dashboard
```
![Prowler Dashboard](docs/img/dashboard.png)
![Prowler Dashboard](docs/products/img/dashboard.png)
# Prowler at a Glance
> [!Tip]
> For the most accurate and up-to-date information about checks, services, frameworks, and categories, visit [**Prowler Hub**](https://hub.prowler.com).
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
| AWS | 571 | 82 | 36 | 10 |
| GCP | 79 | 13 | 10 | 3 |
| Azure | 162 | 19 | 11 | 4 |
| Kubernetes | 83 | 7 | 5 | 7 |
| GitHub | 17 | 2 | 1 | 0 |
| M365 | 70 | 7 | 3 | 2 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) | Support | Stage | Interface |
|---|---|---|---|---|---|---|---|
| AWS | 576 | 82 | 36 | 10 | Official | Stable | UI, API, CLI |
| GCP | 79 | 13 | 10 | 3 | Official | Stable | UI, API, CLI |
| Azure | 162 | 19 | 11 | 4 | Official | Stable | UI, API, CLI |
| Kubernetes | 83 | 7 | 5 | 7 | Official | Stable | UI, API, CLI |
| GitHub | 17 | 2 | 1 | 0 | Official | Stable | UI, API, CLI |
| M365 | 70 | 7 | 3 | 2 | Official | Stable | UI, API, CLI |
| IaC | [See `trivy` docs.](https://trivy.dev/latest/docs/coverage/iac/) | N/A | N/A | N/A | Official | Beta | CLI |
| MongoDB Atlas | 10 | 3 | 0 | 0 | Official | Beta | CLI |
| NHN | 6 | 2 | 1 | 0 | Unofficial | Beta | CLI |
> [!Note]
> The numbers in the table are updated periodically.
> [!Tip]
> For the most accurate and up-to-date information about checks, services, frameworks, and categories, visit [**Prowler Hub**](https://hub.prowler.com).
> [!Note]
> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories: `prowler <provider> --list-checks`, `prowler <provider> --list-services`, `prowler <provider> --list-compliance` and `prowler <provider> --list-categories`.
> Use the following commands to list Prowler's available checks, services, compliance frameworks, and categories:
> - `prowler <provider> --list-checks`
> - `prowler <provider> --list-services`
> - `prowler <provider> --list-compliance`
> - `prowler <provider> --list-categories`
# 💻 Installation
@@ -239,7 +240,7 @@ The following versions of Prowler CLI are available, depending on your requireme
The container images are available here:
- Prowler CLI:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [DockerHub](https://hub.docker.com/r/prowlercloud/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
- Prowler App:
- [DockerHub - Prowler UI](https://hub.docker.com/r/prowlercloud/prowler-ui/tags)
@@ -274,7 +275,7 @@ python prowler-cli.py -v
- **Prowler API**: A backend service, developed with Django REST Framework, responsible for running Prowler scans and storing the generated results.
- **Prowler SDK**: A Python SDK designed to extend the functionality of the Prowler CLI for advanced capabilities.
![Prowler App Architecture](docs/img/prowler-app-architecture.png)
![Prowler App Architecture](docs/products/img/prowler-app-architecture.png)
## Prowler CLI
+57 -15
@@ -1,23 +1,65 @@
# Security Policy
# Security
## Software Security
As an **AWS Partner**, we have passed the [AWS Foundation Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/), and we use the following tools and automation to make sure our code is secure and its dependencies are up to date:
## Reporting Vulnerabilities
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
At Prowler, we consider the security of our open source software and systems a top priority. But no matter how much effort we put into system security, there can still be vulnerabilities present.
## Reporting a Vulnerability
If you discover a vulnerability, we would like to know about it so we can address it as quickly as possible. We ask for your help in better protecting our users, our clients, and our systems.
If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or the ProwlerPro service, please submit the information by contacting https://support.prowler.com.
When reporting vulnerabilities, please consider (1) attack scenario / exploitability, and (2) the security impact of the bug. The following issues are considered out of scope:
The information you share with ProwlerPro as part of this process is kept confidential within ProwlerPro. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
- Social engineering support or attacks requiring social engineering.
- Clickjacking on pages with no sensitive actions.
- Cross-Site Request Forgery (CSRF) on unauthenticated forms or forms with no sensitive actions.
- Attacks requiring Man-In-The-Middle (MITM) or physical access to a user's device.
- Previously known vulnerable libraries without a working Proof of Concept (PoC).
- Comma Separated Values (CSV) injection without demonstrating a vulnerability.
- Missing best practices in SSL/TLS configuration.
- Any activity that could lead to the disruption of service (DoS).
- Rate limiting or brute force issues on non-authentication endpoints.
- Missing best practices in Content Security Policy (CSP).
- Missing HttpOnly or Secure flags on cookies.
- Configuration of or missing security headers.
- Missing email best practices, such as invalid, incomplete, or missing SPF/DKIM/DMARC records.
- Vulnerabilities only affecting users of outdated or unpatched browsers (less than two stable versions behind).
- Software version disclosure, banner identification issues, or descriptive error messages.
- Tabnabbing.
- Issues that require unlikely user interaction.
- Improper logout functionality and improper session timeout.
- CORS misconfiguration without an exploitation scenario.
- Broken link hijacking.
- Automated scanning results (e.g., sqlmap, Burp active scanner) that have not been manually verified.
- Content spoofing and text injection issues without a clear attack vector.
- Email spoofing without exploiting security flaws.
- Dead links or broken links.
- User enumeration.
We will review the submitted report, and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
Testing guidelines:
- Do not run automated scanners on other customers' projects. Automated scanners can run up costs for our users, and aggressively configured scanners might inadvertently disrupt services, exploit vulnerabilities, cause system instability or breaches, and violate the Terms of Service of our upstream providers. Our own security systems cannot distinguish hostile reconnaissance from whitehat research. If you wish to run an automated scanner, notify us at support@prowler.com and only run it against your own Prowler app project. Do NOT attack Prowler deployments in use by other customers.
- Do not take advantage of the vulnerability or problem you have discovered, for example by downloading more data than necessary to demonstrate the vulnerability or deleting or modifying other people's data.
You will receive a non-automated response to your initial contact within 24 hours, confirming receipt of your reported vulnerability.
Reporting guidelines:
- File a report through our Support Desk at https://support.prowler.com
- If your report concerns a missing security feature, please file a feature request instead at https://github.com/prowler-cloud/prowler/issues
- Do provide sufficient information to reproduce the problem, so we will be able to resolve it as quickly as possible.
- If you have further questions and want direct interaction with the Prowler team, please contact us via our Community Slack at goto.prowler.com/slack.
We will coordinate public notification of any validated vulnerability with you. Where possible, we prefer that our respective public disclosures be posted simultaneously.
Disclosure guidelines:
- In order to protect our users and customers, do not reveal the problem to others until we have researched, addressed and informed our affected customers.
- If you want to publicly share your research about Prowler at a conference, in a blog or any other public forum, you should share a draft with us for review and approval at least 30 days prior to the publication date. Please note that the following should not be included:
- Data regarding any Prowler user or customer projects.
- Prowler customers' data.
- Information about Prowler employees, contractors or partners.
What we promise:
- We will respond to your report within 5 business days with our evaluation of the report and an expected resolution date.
- If you have followed the instructions above, we will not take any legal action against you in regard to the report.
- We will handle your report with strict confidentiality, and not pass on your personal details to third parties without your permission.
- We will keep you informed of the progress towards resolving the problem.
- In the public information concerning the problem reported, we will give your name as the discoverer of the problem (unless you desire otherwise).
We strive to resolve all problems as quickly as possible, and we would like to play an active role in any eventual publication about the problem once it is resolved.
---
For more information about our security policies, please refer to our [Security](https://docs.prowler.com/projects/prowler-open-source/en/latest/security/) section in our documentation.
@@ -19,6 +19,8 @@ DJANGO_REFRESH_TOKEN_LIFETIME=1440
DJANGO_CACHE_MAX_AGE=3600
DJANGO_STALE_WHILE_REVALIDATE=60
DJANGO_SECRETS_ENCRYPTION_KEY=""
# Throttle, two options: Empty means no throttle; or if desired use one in DRF format: https://www.django-rest-framework.org/api-guide/throttling/#setting-the-throttling-policy
DJANGO_THROTTLE_TOKEN_OBTAIN=50/minute
# Decide whether to allow Django manage database table partitions
DJANGO_MANAGE_DB_PARTITIONS=[True|False]
DJANGO_CELERY_DEADLOCK_ATTEMPTS=5
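The throttle setting uses DRF's rate-string syntax. As a rough sketch of how such a string is interpreted, this standalone reimplementation mirrors the logic of DRF's `SimpleRateThrottle.parse_rate` (the function below is illustrative, not the actual DRF code):

```python
# Hedged sketch: how a DRF-style rate string such as the
# DJANGO_THROTTLE_TOKEN_OBTAIN value "50/minute" is interpreted.
def parse_rate(rate):
    if not rate:
        # An empty value disables throttling entirely.
        return (None, None)
    num, period = rate.split("/")
    # Only the first letter of the period matters: s/m/h/d.
    duration = {"s": 1, "m": 60, "h": 3600, "d": 86400}[period[0]]
    return (int(num), duration)

print(parse_rate("50/minute"))  # (50, 60): 50 requests per 60 seconds
```

An empty `DJANGO_THROTTLE_TOKEN_OBTAIN` therefore means no throttle, matching the comment in the env file.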
@@ -2,10 +2,24 @@
All notable changes to the **Prowler API** are documented in this file.
-## [1.12.0] (Prowler 5.11.0 - UNRELEASED)
+## [1.13.0] (Prowler 5.12.0)
### Added
- Integration with JIRA, enabling sending findings to a JIRA project [(#8622)](https://github.com/prowler-cloud/prowler/pull/8622), [(#8637)](https://github.com/prowler-cloud/prowler/pull/8637)
- `GET /overviews/findings_severity` now supports `filter[status]` and `filter[status__in]` to aggregate by specific statuses (`FAIL`, `PASS`)[(#8186)](https://github.com/prowler-cloud/prowler/pull/8186)
- Throttling options for `/api/v1/tokens` using the `DJANGO_THROTTLE_TOKEN_OBTAIN` environment variable [(#8647)](https://github.com/prowler-cloud/prowler/pull/8647)
---
## [1.12.0] (Prowler 5.11.0)
### Added
- Lighthouse support for OpenAI GPT-5 [(#8527)](https://github.com/prowler-cloud/prowler/pull/8527)
- Integration with Amazon Security Hub, enabling sending findings to Security Hub [(#8365)](https://github.com/prowler-cloud/prowler/pull/8365)
- Generate ASFF output for AWS providers with SecurityHub integration enabled [(#8569)](https://github.com/prowler-cloud/prowler/pull/8569)
### Fixed
- GitHub provider always scans user instead of organization when using provider UID [(#8587)](https://github.com/prowler-cloud/prowler/pull/8587)
## [1.11.0] (Prowler 5.10.0)
File diff suppressed because it is too large.
@@ -24,13 +24,14 @@ dependencies = [
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@master",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.12",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10",
"openai (>=1.82.0,<2.0.0)",
"xmlsec==1.3.14"
"xmlsec==1.3.14",
"h2 (==4.3.0)"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -38,7 +39,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
version = "1.11.0"
version = "1.13.0"
[project.scripts]
celery = "src.backend.config.settings.celery"
@@ -2,7 +2,7 @@ from datetime import date, datetime, timedelta, timezone
from dateutil.parser import parse
from django.conf import settings
-from django.db.models import Q
+from django.db.models import F, Q
from django_filters.rest_framework import (
BaseInFilter,
BooleanFilter,
@@ -28,6 +28,7 @@ from api.models import (
Integration,
Invitation,
Membership,
OverviewStatusChoices,
PermissionChoices,
Processor,
Provider,
@@ -750,6 +751,72 @@ class ScanSummaryFilter(FilterSet):
}
class ScanSummarySeverityFilter(ScanSummaryFilter):
"""Filter for findings_severity ScanSummary endpoint - includes status filters"""
# Custom status filters - only for severity grouping endpoint
status = ChoiceFilter(method="filter_status", choices=OverviewStatusChoices.choices)
status__in = CharInFilter(method="filter_status_in", lookup_expr="in")
def filter_status(self, queryset, name, value):
# Validate the status value
if value not in [choice[0] for choice in OverviewStatusChoices.choices]:
raise ValidationError(f"Invalid status value: {value}")
# Apply the filter by annotating the queryset with the status field
if value == OverviewStatusChoices.FAIL:
return queryset.annotate(status_count=F("fail"))
elif value == OverviewStatusChoices.PASS:
return queryset.annotate(status_count=F("_pass"))
else:
return queryset.annotate(status_count=F("total"))
def filter_status_in(self, queryset, name, value):
# Validate the status values
valid_statuses = [choice[0] for choice in OverviewStatusChoices.choices]
for status_val in value:
if status_val not in valid_statuses:
raise ValidationError(f"Invalid status value: {status_val}")
# If all statuses or no valid statuses, use total
if (
set(value)
>= {
OverviewStatusChoices.FAIL,
OverviewStatusChoices.PASS,
}
or not value
):
return queryset.annotate(status_count=F("total"))
# Build the sum expression based on status values
sum_expression = None
for status in value:
if status == OverviewStatusChoices.FAIL:
field_expr = F("fail")
elif status == OverviewStatusChoices.PASS:
field_expr = F("_pass")
else:
continue
if sum_expression is None:
sum_expression = field_expr
else:
sum_expression = sum_expression + field_expr
if sum_expression is None:
return queryset.annotate(status_count=F("total"))
return queryset.annotate(status_count=sum_expression)
class Meta:
model = ScanSummary
fields = {
"inserted_at": ["date", "gte", "lte"],
"region": ["exact", "icontains", "in"],
}
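The `filter_status_in` branching above can be mirrored in plain Python. A hedged sketch of which counter a given `filter[status__in]` value resolves to (the field names are taken from the filter; this is an illustrative analogue, not the actual ORM annotation):

```python
# Hedged pure-Python analogue of filter_status_in: given one summary row,
# pick the counter that filter[status__in] would aggregate on.
def resolve_status_count(row, statuses):
    # row is a dict with "fail", "pass" and "total" counters.
    # Requesting both statuses (or none) is equivalent to the total.
    if not statuses or {"FAIL", "PASS"} <= set(statuses):
        return row["total"]
    picked = [row["fail"] if s == "FAIL" else row["pass"]
              for s in statuses if s in ("FAIL", "PASS")]
    # No recognized status falls back to the total, as in the filter.
    return sum(picked) if picked else row["total"]

row = {"fail": 3, "pass": 7, "total": 10}
print(resolve_status_count(row, ["FAIL"]))          # 3
print(resolve_status_count(row, ["FAIL", "PASS"]))  # 10
```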
class ServiceOverviewFilter(ScanSummaryFilter):
def is_valid(self):
# Check if at least one of the inserted_at filters is present
@@ -793,3 +860,23 @@ class ProcessorFilter(FilterSet):
field_name="processor_type",
lookup_expr="in",
)
class IntegrationJiraFindingsFilter(FilterSet):
# To be expanded as needed
finding_id = UUIDFilter(field_name="id", lookup_expr="exact")
finding_id__in = UUIDInFilter(field_name="id", lookup_expr="in")
class Meta:
model = Finding
fields = {}
def filter_queryset(self, queryset):
# Validate that there is at least one filter provided
if not self.data:
raise ValidationError(
{
"findings": "No finding filters provided. At least one filter is required."
}
)
return super().filter_queryset(queryset)
File diff suppressed because one or more lines are too long
@@ -0,0 +1,33 @@
# Generated by Django 5.1.10 on 2025-08-20 09:04
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0045_alter_scan_output_location"),
]
operations = [
migrations.AlterField(
model_name="lighthouseconfiguration",
name="model",
field=models.CharField(
choices=[
("gpt-4o-2024-11-20", "GPT-4o v2024-11-20"),
("gpt-4o-2024-08-06", "GPT-4o v2024-08-06"),
("gpt-4o-2024-05-13", "GPT-4o v2024-05-13"),
("gpt-4o", "GPT-4o Default"),
("gpt-4o-mini-2024-07-18", "GPT-4o Mini v2024-07-18"),
("gpt-4o-mini", "GPT-4o Mini Default"),
("gpt-5-2025-08-07", "GPT-5 v2025-08-07"),
("gpt-5", "GPT-5 Default"),
("gpt-5-mini-2025-08-07", "GPT-5 Mini v2025-08-07"),
("gpt-5-mini", "GPT-5 Mini Default"),
],
default="gpt-4o-2024-08-06",
help_text="Must be one of the supported model names",
max_length=50,
),
),
]
@@ -0,0 +1,16 @@
# Generated by Django 5.1.10 on 2025-08-20 08:24
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0046_lighthouse_gpt5"),
]
operations = [
migrations.RemoveConstraint(
model_name="integration",
name="unique_configuration_per_tenant",
),
]
@@ -74,6 +74,15 @@ class StatusChoices(models.TextChoices):
MANUAL = "MANUAL", _("Manual")
class OverviewStatusChoices(models.TextChoices):
"""
Status filters allowed in overview/severity endpoints.
"""
FAIL = "FAIL", _("Fail")
PASS = "PASS", _("Pass")
class StateChoices(models.TextChoices):
AVAILABLE = "available", _("Available")
SCHEDULED = "scheduled", _("Scheduled")
@@ -1372,10 +1381,6 @@ class Integration(RowLevelSecurityProtectedModel):
db_table = "integrations"
constraints = [
models.UniqueConstraint(
fields=("configuration", "tenant"),
name="unique_configuration_per_tenant",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
@@ -0,0 +1,95 @@
def _pick_task_response_component(components):
schemas = components.get("schemas", {}) or {}
for candidate in ("TaskResponse",):
if candidate in schemas:
return candidate
return None
def _extract_task_example_from_components(components):
schemas = components.get("schemas", {}) or {}
candidate = "TaskResponse"
doc = schemas.get(candidate)
if isinstance(doc, dict) and "example" in doc:
return doc["example"]
res = schemas.get(candidate)
if isinstance(res, dict) and "example" in res:
example = res["example"]
return example if "data" in example else {"data": example}
# Fallback
return {
"data": {
"type": "tasks",
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"attributes": {
"inserted_at": "2019-08-24T14:15:22Z",
"completed_at": "2019-08-24T14:15:22Z",
"name": "string",
"state": "available",
"result": None,
"task_args": None,
"metadata": None,
},
}
}
def attach_task_202_examples(result, generator, request, public): # noqa: F841
if not isinstance(result, dict):
return result
components = result.get("components", {}) or {}
task_resp_component = _pick_task_response_component(components)
task_example = _extract_task_example_from_components(components)
paths = result.get("paths", {}) or {}
for path_item in paths.values():
if not isinstance(path_item, dict):
continue
for method_obj in path_item.values():
if not isinstance(method_obj, dict):
continue
responses = method_obj.get("responses", {}) or {}
resp_202 = responses.get("202")
if not isinstance(resp_202, dict):
continue
content = resp_202.get("content", {}) or {}
jsonapi = content.get("application/vnd.api+json")
if not isinstance(jsonapi, dict):
continue
# Inject example if missing
if "examples" not in jsonapi and "example" not in jsonapi:
jsonapi["examples"] = {
"Task queued": {
"summary": "Task queued",
"value": task_example,
}
}
# Rewrite schema $ref if needed
if task_resp_component:
schema = jsonapi.get("schema")
must_replace = False
if not isinstance(schema, dict):
must_replace = True
else:
ref = schema.get("$ref")
if not ref:
must_replace = True
else:
current = ref.split("/")[-1]
if current != task_resp_component:
must_replace = True
if must_replace:
jsonapi["schema"] = {
"$ref": f"#/components/schemas/{task_resp_component}"
}
return result
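The `(result, generator, request, public)` signature above matches drf-spectacular's postprocessing-hook contract, so registering the hook would presumably look like the sketch below (the dotted module path is hypothetical; adjust it to wherever `attach_task_202_examples` actually lives):

```python
# settings.py (sketch) - register the hook with drf-spectacular so it runs
# over the generated OpenAPI document.
SPECTACULAR_SETTINGS = {
    "TITLE": "Prowler API",
    "POSTPROCESSING_HOOKS": [
        # Hypothetical module path for illustration only.
        "api.v1.spectacular.attach_task_202_examples",
    ],
}
```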
File diff suppressed because it is too large.
@@ -6,16 +6,18 @@ from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
-from api.models import Invitation, Provider
+from api.models import Integration, Invitation, Provider
from api.utils import (
get_prowler_provider_kwargs,
initialize_prowler_provider,
merge_dicts,
prowler_integration_connection_test,
prowler_provider_connection_test,
return_prowler_provider,
validate_invitation,
)
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHubConnection
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
@@ -197,6 +199,10 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.M365.value,
{},
),
(
Provider.ProviderChoices.GITHUB.value,
{"organizations": ["provider_uid"]},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):
@@ -390,3 +396,258 @@ class TestValidateInvitation:
mock_db.get.assert_called_once_with(
token="VALID_TOKEN", email__iexact="user@example.com"
)
class TestProwlerIntegrationConnectionTest:
"""Test prowler_integration_connection_test function for SecurityHub regions reset."""
@patch("api.utils.SecurityHub")
def test_security_hub_connection_failure_resets_regions(
self, mock_security_hub_class
):
"""Test that SecurityHub connection failure resets regions to empty dict."""
# Create integration with existing regions configuration
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.AWS_SECURITY_HUB
integration.credentials = {
"aws_access_key_id": "test_key",
"aws_secret_access_key": "test_secret",
}
integration.configuration = {
"send_only_fails": True,
"regions": {
"us-east-1": True,
"us-west-2": True,
"eu-west-1": False,
"ap-south-1": False,
},
}
# Mock provider relationship
mock_provider = MagicMock()
mock_provider.uid = "123456789012"
mock_relationship = MagicMock()
mock_relationship.provider = mock_provider
integration.integrationproviderrelationship_set.first.return_value = (
mock_relationship
)
# Mock failed SecurityHub connection
mock_connection = SecurityHubConnection(
is_connected=False,
error=Exception("SecurityHub testing"),
enabled_regions=set(),
disabled_regions=set(),
)
mock_security_hub_class.test_connection.return_value = mock_connection
# Call the function
result = prowler_integration_connection_test(integration)
# Assertions
assert result.is_connected is False
assert str(result.error) == "SecurityHub testing"
# Verify regions were completely reset to empty dict
assert integration.configuration["regions"] == {}
# Verify save was called to persist the change
integration.save.assert_called_once()
# Verify test_connection was called with correct parameters
mock_security_hub_class.test_connection.assert_called_once_with(
aws_account_id="123456789012",
raise_on_exception=False,
aws_access_key_id="test_key",
aws_secret_access_key="test_secret",
)
@patch("api.utils.SecurityHub")
def test_security_hub_connection_success_saves_regions(
self, mock_security_hub_class
):
"""Test that successful SecurityHub connection saves regions correctly."""
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.AWS_SECURITY_HUB
integration.credentials = {
"aws_access_key_id": "valid_key",
"aws_secret_access_key": "valid_secret",
}
integration.configuration = {"send_only_fails": False}
# Mock provider relationship
mock_provider = MagicMock()
mock_provider.uid = "123456789012"
mock_relationship = MagicMock()
mock_relationship.provider = mock_provider
integration.integrationproviderrelationship_set.first.return_value = (
mock_relationship
)
# Mock successful SecurityHub connection with regions
mock_connection = SecurityHubConnection(
is_connected=True,
error=None,
enabled_regions={"us-east-1", "eu-west-1"},
disabled_regions={"ap-south-1"},
)
mock_security_hub_class.test_connection.return_value = mock_connection
result = prowler_integration_connection_test(integration)
assert result.is_connected is True
# Verify regions were saved correctly
assert integration.configuration["regions"]["us-east-1"] is True
assert integration.configuration["regions"]["eu-west-1"] is True
assert integration.configuration["regions"]["ap-south-1"] is False
integration.save.assert_called_once()
@patch("api.utils.rls_transaction")
@patch("api.utils.Jira")
def test_jira_connection_success_basic_auth(
self, mock_jira_class, mock_rls_transaction
):
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.JIRA
integration.tenant_id = "test-tenant-id"
integration.credentials = {
"user_mail": "test@example.com",
"api_token": "test_api_token",
"domain": "example.atlassian.net",
}
integration.configuration = {}
# Mock successful JIRA connection with projects
mock_connection = MagicMock()
mock_connection.is_connected = True
mock_connection.error = None
mock_connection.projects = {"PROJ1": "Project 1", "PROJ2": "Project 2"}
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
mock_rls_transaction.return_value.__enter__ = MagicMock()
mock_rls_transaction.return_value.__exit__ = MagicMock()
result = prowler_integration_connection_test(integration)
assert result.is_connected is True
assert result.error is None
# Verify JIRA connection was called with correct parameters including domain from credentials
mock_jira_class.test_connection.assert_called_once_with(
user_mail="test@example.com",
api_token="test_api_token",
domain="example.atlassian.net",
raise_on_exception=False,
)
# Verify rls_transaction was called with correct tenant_id
mock_rls_transaction.assert_called_once_with("test-tenant-id")
# Verify projects were saved to integration configuration
assert integration.configuration["projects"] == {
"PROJ1": "Project 1",
"PROJ2": "Project 2",
}
# Verify integration.save() was called
integration.save.assert_called_once()
@patch("api.utils.rls_transaction")
@patch("api.utils.Jira")
def test_jira_connection_failure_invalid_credentials(
self, mock_jira_class, mock_rls_transaction
):
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.JIRA
integration.tenant_id = "test-tenant-id"
integration.credentials = {
"user_mail": "invalid@example.com",
"api_token": "invalid_token",
"domain": "invalid.atlassian.net",
}
integration.configuration = {}
# Mock failed JIRA connection
mock_connection = MagicMock()
mock_connection.is_connected = False
mock_connection.error = Exception("Authentication failed: Invalid credentials")
mock_connection.projects = {} # Empty projects when connection fails
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
mock_rls_transaction.return_value.__enter__ = MagicMock()
mock_rls_transaction.return_value.__exit__ = MagicMock()
result = prowler_integration_connection_test(integration)
assert result.is_connected is False
assert "Authentication failed: Invalid credentials" in str(result.error)
# Verify JIRA connection was called with correct parameters
mock_jira_class.test_connection.assert_called_once_with(
user_mail="invalid@example.com",
api_token="invalid_token",
domain="invalid.atlassian.net",
raise_on_exception=False,
)
# Verify rls_transaction was called even on failure
mock_rls_transaction.assert_called_once_with("test-tenant-id")
# Verify empty projects dict was saved to integration configuration
assert integration.configuration["projects"] == {}
# Verify integration.save() was called even on connection failure
integration.save.assert_called_once()
@patch("api.utils.rls_transaction")
@patch("api.utils.Jira")
def test_jira_connection_projects_update_with_existing_configuration(
self, mock_jira_class, mock_rls_transaction
):
"""Test that projects are properly updated when integration already has configuration data"""
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.JIRA
integration.tenant_id = "test-tenant-id"
integration.credentials = {
"user_mail": "test@example.com",
"api_token": "test_api_token",
"domain": "example.atlassian.net",
}
integration.configuration = {
"issue_types": ["Task"], # Existing configuration
"projects": {"OLD_PROJ": "Old Project"}, # Will be overwritten
}
# Mock successful JIRA connection with new projects
mock_connection = MagicMock()
mock_connection.is_connected = True
mock_connection.error = None
mock_connection.projects = {
"NEW_PROJ1": "New Project 1",
"NEW_PROJ2": "New Project 2",
}
mock_jira_class.test_connection.return_value = mock_connection
# Mock rls_transaction context manager
mock_rls_transaction.return_value.__enter__ = MagicMock()
mock_rls_transaction.return_value.__exit__ = MagicMock()
result = prowler_integration_connection_test(integration)
assert result.is_connected is True
assert result.error is None
# Verify projects were updated (old projects replaced with new ones)
assert integration.configuration["projects"] == {
"NEW_PROJ1": "New Project 1",
"NEW_PROJ2": "New Project 2",
}
# Verify other configuration fields were preserved
assert integration.configuration["issue_types"] == ["Task"]
# Verify integration.save() was called
integration.save.assert_called_once()
@@ -5708,6 +5708,47 @@ class TestIntegrationViewSet:
== data["data"]["relationships"]["providers"]["data"][0]["id"]
)
def test_integrations_create_valid_jira(
self,
authenticated_client,
):
"""Jira integrations are special"""
data = {
"data": {
"type": "integrations",
"attributes": {
"integration_type": Integration.IntegrationChoices.JIRA,
"configuration": {},
"credentials": {
"domain": "prowlerdomain",
"api_token": "this-is-an-api-token-for-jira-that-works-for-sure",
"user_mail": "testing@prowler.com",
},
"enabled": True,
},
}
}
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_201_CREATED
assert Integration.objects.count() == 1
integration = Integration.objects.first()
integration_configuration = response.json()["data"]["attributes"][
"configuration"
]
assert "projects" in integration_configuration
assert "issue_types" in integration_configuration
assert "domain" in integration_configuration
assert integration.enabled == data["data"]["attributes"]["enabled"]
assert (
integration.integration_type
== data["data"]["attributes"]["integration_type"]
)
assert "credentials" not in response.json()["data"]["attributes"]
def test_integrations_create_valid_relationships(
self,
authenticated_client,
@@ -5806,6 +5847,46 @@ class TestIntegrationViewSet:
"invalid",
None,
),
(
{
"integration_type": "jira",
"configuration": {
"projects": ["JIRA"],
},
"credentials": {"domain": "prowlerdomain"},
},
"invalid",
"configuration",
),
(
{
"integration_type": "jira",
"credentialss": {
"domain": "prowlerdomain",
"api_token": "api-token",
"user_mail": "test@prowler.com",
},
},
"required",
"configuration",
),
(
{
"integration_type": "jira",
"configuration": {},
},
"required",
"credentials",
),
(
{
"integration_type": "jira",
"configuration": {},
"credentials": {"api_token": "api-token"},
},
"invalid",
"credentials",
),
]
),
)
@@ -5995,6 +6076,217 @@ class TestIntegrationViewSet:
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
def test_integrations_create_duplicate_amazon_s3(
self, authenticated_client, providers_fixture
):
provider = providers_fixture[0]
# Create first S3 integration
data = {
"data": {
"type": "integrations",
"attributes": {
"integration_type": Integration.IntegrationChoices.AMAZON_S3,
"configuration": {
"bucket_name": "test-bucket",
"output_directory": "test-output",
},
"credentials": {
"role_arn": "arn:aws:iam::123456789012:role/test-role",
"external_id": "test-external-id",
},
"enabled": True,
},
"relationships": {
"providers": {
"data": [{"type": "providers", "id": str(provider.id)}]
}
},
}
}
# First creation should succeed
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_201_CREATED
# Attempt to create duplicate should return 409
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_409_CONFLICT
assert (
"This integration already exists" in response.json()["errors"][0]["detail"]
)
assert (
response.json()["errors"][0]["source"]["pointer"]
== "/data/attributes/configuration"
)
def test_integrations_create_duplicate_jira(self, authenticated_client):
# Create first JIRA integration
data = {
"data": {
"type": "integrations",
"attributes": {
"integration_type": Integration.IntegrationChoices.JIRA,
"configuration": {},
"credentials": {
"user_mail": "test@example.com",
"api_token": "test-api-token",
"domain": "prowlerdomain",
},
"enabled": True,
},
}
}
# First creation should succeed
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_201_CREATED
# Attempt to create duplicate should return 409
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_409_CONFLICT
assert (
"This integration already exists" in response.json()["errors"][0]["detail"]
)
assert (
response.json()["errors"][0]["source"]["pointer"]
== "/data/attributes/configuration"
)
def test_integrations_update_jira_configuration_readonly(
self, authenticated_client
):
# Create JIRA integration first
create_data = {
"data": {
"type": "integrations",
"attributes": {
"integration_type": Integration.IntegrationChoices.JIRA,
"configuration": {},
"credentials": {
"user_mail": "test@example.com",
"api_token": "test-api-token",
"domain": "initial-domain",
},
"enabled": True,
},
}
}
# Create the integration
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(create_data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_201_CREATED
integration_id = response.json()["data"]["id"]
# Attempt to update configuration - should be ignored/not allowed
update_data = {
"data": {
"type": "integrations",
"id": integration_id,
"attributes": {
"configuration": {
"projects": {"NEW_PROJECT": "New Project"},
"issue_types": ["Epic", "Story"],
"domain": "malicious-domain",
}
},
}
}
response = authenticated_client.patch(
reverse("integration-detail", kwargs={"pk": integration_id}),
data=json.dumps(update_data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
def test_integrations_update_jira_credentials_domain_reflects_in_configuration(
self, authenticated_client
):
# Create JIRA integration first
create_data = {
"data": {
"type": "integrations",
"attributes": {
"integration_type": Integration.IntegrationChoices.JIRA,
"configuration": {},
"credentials": {
"user_mail": "test@example.com",
"api_token": "test-api-token",
"domain": "original-domain",
},
"enabled": True,
},
}
}
# Create the integration
response = authenticated_client.post(
reverse("integration-list"),
data=json.dumps(create_data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_201_CREATED
integration_id = response.json()["data"]["id"]
# Verify initial domain in configuration
initial_integration = response.json()["data"]
assert (
initial_integration["attributes"]["configuration"]["domain"]
== "original-domain"
)
# Update credentials with new domain
update_data = {
"data": {
"type": "integrations",
"id": integration_id,
"attributes": {
"credentials": {
"user_mail": "updated@example.com",
"api_token": "updated-api-token",
"domain": "updated-domain",
}
},
}
}
response = authenticated_client.patch(
reverse("integration-detail", kwargs={"pk": integration_id}),
data=json.dumps(update_data),
content_type="application/vnd.api+json",
)
assert response.status_code == status.HTTP_200_OK
# Verify the new domain is reflected in configuration
updated_integration = response.json()["data"]
configuration = updated_integration["attributes"]["configuration"]
assert configuration["domain"] == "updated-domain"
# Verify other configuration fields are preserved
assert "projects" in configuration
assert "issue_types" in configuration
@pytest.mark.django_db
class TestSAMLTokenValidation:
@@ -6,11 +6,14 @@ from django.db.models import Subquery
from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.db_utils import rls_transaction
from api.exceptions import InvitationTokenExpiredException
from api.models import Integration, Invitation, Processor, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from prowler.lib.outputs.jira.jira import Jira, JiraBasicAuthError
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
@@ -119,6 +122,12 @@ def get_prowler_provider_kwargs(
}
elif provider.provider == Provider.ProviderChoices.KUBERNETES.value:
prowler_provider_kwargs = {**prowler_provider_kwargs, "context": provider.uid}
elif provider.provider == Provider.ProviderChoices.GITHUB.value:
if provider.uid:
prowler_provider_kwargs = {
**prowler_provider_kwargs,
"organizations": [provider.uid],
}
if mutelist_processor:
mutelist_content = mutelist_processor.configuration.get("Mutelist", {})
@@ -192,13 +201,53 @@ def prowler_integration_connection_test(integration: Integration) -> Connection:
raise_on_exception=False,
)
# TODO: It is possible that we can unify the connection test for all integrations, but need refactoring
-# to avoid code duplication. Actually the AWS integrations are similar, so SecurityHub and S3 can be unified making some changes in the SDK.
+# to avoid code duplication. Actually the AWS integrations are similar, so SecurityHub and S3 can be unified
+# making some changes in the SDK.
elif (
integration.integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
):
pass
# Get the provider associated with this integration
provider_relationship = integration.integrationproviderrelationship_set.first()
if not provider_relationship:
return Connection(
is_connected=False, error="No provider associated with this integration"
)
credentials = (
integration.credentials
if integration.credentials
else provider_relationship.provider.secret.secret
)
connection = SecurityHub.test_connection(
aws_account_id=provider_relationship.provider.uid,
raise_on_exception=False,
**credentials,
)
# Only save regions if connection is successful
if connection.is_connected:
regions_status = {r: True for r in connection.enabled_regions}
regions_status.update({r: False for r in connection.disabled_regions})
# Save regions information in the integration configuration
integration.configuration["regions"] = regions_status
integration.save()
else:
# Reset regions information if connection fails
integration.configuration["regions"] = {}
integration.save()
return connection
elif integration.integration_type == Integration.IntegrationChoices.JIRA:
pass
jira_connection = Jira.test_connection(
**integration.credentials,
raise_on_exception=False,
)
project_keys = jira_connection.projects if jira_connection.is_connected else {}
with rls_transaction(str(integration.tenant_id)):
integration.configuration["projects"] = project_keys
integration.save()
return jira_connection
elif integration.integration_type == Integration.IntegrationChoices.SLACK:
pass
else:
@@ -298,3 +347,17 @@ def get_findings_metadata_no_aggregations(tenant_id: str, filtered_queryset):
serializer.is_valid(raise_exception=True)
return serializer.data
def initialize_prowler_integration(integration: Integration) -> Jira:
# TODO Refactor other integrations to use this function
if integration.integration_type == Integration.IntegrationChoices.JIRA:
try:
return Jira(**integration.credentials)
except JiraBasicAuthError as jira_auth_error:
with rls_transaction(str(integration.tenant_id)):
integration.configuration["projects"] = {}
integration.connected = False
integration.connection_last_checked_at = datetime.now(tz=timezone.utc)
integration.save()
raise jira_auth_error
@@ -52,6 +52,32 @@ class S3ConfigSerializer(BaseValidateSerializer):
resource_name = "integrations"
class SecurityHubConfigSerializer(BaseValidateSerializer):
send_only_fails = serializers.BooleanField(default=False)
archive_previous_findings = serializers.BooleanField(default=False)
regions = serializers.DictField(default=dict, read_only=True)
def to_internal_value(self, data):
validated_data = super().to_internal_value(data)
# Always initialize regions as empty dict
validated_data["regions"] = {}
return validated_data
class Meta:
resource_name = "integrations"
class JiraConfigSerializer(BaseValidateSerializer):
domain = serializers.CharField(read_only=True)
issue_types = serializers.ListField(
read_only=True, child=serializers.CharField(), default=["Task"]
)
projects = serializers.DictField(read_only=True)
class Meta:
resource_name = "integrations"
class AWSCredentialSerializer(BaseValidateSerializer):
role_arn = serializers.CharField(required=False)
external_id = serializers.CharField(required=False)
@@ -67,6 +93,15 @@ class AWSCredentialSerializer(BaseValidateSerializer):
resource_name = "integrations"
class JiraCredentialSerializer(BaseValidateSerializer):
user_mail = serializers.EmailField(required=True)
api_token = serializers.CharField(required=True)
domain = serializers.CharField(required=True)
class Meta:
resource_name = "integrations"
@extend_schema_field(
{
"oneOf": [
@@ -118,6 +153,27 @@ class AWSCredentialSerializer(BaseValidateSerializer):
},
},
},
{
"type": "object",
"title": "JIRA Credentials",
"properties": {
"user_mail": {
"type": "string",
"format": "email",
"description": "The email address of the JIRA user account.",
},
"api_token": {
"type": "string",
"description": "The API token for authentication with JIRA. This can be generated from your "
"Atlassian account settings.",
},
"domain": {
"type": "string",
"description": "The JIRA domain/instance URL (e.g., 'your-domain.atlassian.net').",
},
},
"required": ["user_mail", "api_token", "domain"],
},
]
}
)
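The schema above declares `user_mail`, `api_token`, and `domain` as required. As a minimal stand-alone sketch of that rule (not Prowler's actual validator — DRF's `JiraCredentialSerializer` handles this), the check reduces to:

```python
# Required fields taken from the JIRA Credentials schema above.
REQUIRED_JIRA_FIELDS = ("user_mail", "api_token", "domain")


def missing_jira_fields(payload: dict) -> list:
    """Return the required JIRA credential fields absent or empty in payload."""
    return [field for field in REQUIRED_JIRA_FIELDS if not payload.get(field)]


print(missing_jira_fields({"user_mail": "a@b.c", "domain": "x.atlassian.net"}))
# → ['api_token']
```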
@@ -138,7 +194,10 @@ class IntegrationCredentialField(serializers.JSONField):
},
"output_directory": {
"type": "string",
"description": "The directory path within the bucket where files will be saved. Optional - "
'defaults to "output" if not provided. Path will be normalized to remove '
'excessive slashes and invalid characters are not allowed (< > : " | ? *). '
"Maximum length is 900 characters.",
"maxLength": 900,
"pattern": '^[^<>:"|?*]+$',
"default": "output",
@@ -146,6 +205,30 @@ class IntegrationCredentialField(serializers.JSONField):
},
"required": ["bucket_name"],
},
{
"type": "object",
"title": "AWS Security Hub",
"properties": {
"send_only_fails": {
"type": "boolean",
"default": False,
"description": "If true, only findings with status 'FAIL' will be sent to Security Hub.",
},
"archive_previous_findings": {
"type": "boolean",
"default": False,
"description": "If true, archives findings that are not present in the current execution.",
},
},
},
{
"type": "object",
"title": "JIRA",
"description": "JIRA integration does not accept any configuration in the payload. Leave it as an "
"empty JSON object (`{}`).",
"properties": {},
"additionalProperties": False,
},
]
}
)
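The `output_directory` description above promises slash normalization, an invalid-character blocklist (`< > : " | ? *`), a 900-character limit, and an `"output"` default. A hypothetical helper mirroring those documented constraints (the real serializer logic may differ) could look like:

```python
import re

# Invalid characters copied from the "output_directory" schema pattern above.
_INVALID = re.compile(r'[<>:"|?*]')


def normalize_output_directory(path: str, default: str = "output") -> str:
    """Collapse repeated slashes and enforce the documented constraints,
    falling back to the default when the path is empty."""
    if not path:
        return default
    if _INVALID.search(path):
        raise ValueError("output_directory contains invalid characters")
    normalized = re.sub(r"/+", "/", path).strip("/")
    if len(normalized) > 900:
        raise ValueError("output_directory exceeds 900 characters")
    return normalized or default


print(normalize_output_directory("reports//aws///prod/"))  # → reports/aws/prod
```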
@@ -15,6 +15,7 @@ from rest_framework_simplejwt.exceptions import TokenError
from rest_framework_simplejwt.serializers import TokenObtainPairSerializer
from rest_framework_simplejwt.tokens import RefreshToken
from api.exceptions import ConflictException
from api.models import (
Finding,
Integration,
@@ -45,7 +46,10 @@ from api.v1.serializer_utils.integrations import (
AWSCredentialSerializer,
IntegrationConfigField,
IntegrationCredentialField,
JiraConfigSerializer,
JiraCredentialSerializer,
S3ConfigSerializer,
SecurityHubConfigSerializer,
)
from api.v1.serializer_utils.processors import ProcessorConfigField
from api.v1.serializer_utils.providers import ProviderSecretField
@@ -1951,13 +1955,59 @@ class ScheduleDailyCreateSerializer(serializers.Serializer):
class BaseWriteIntegrationSerializer(BaseWriteSerializer):
def validate(self, attrs):
integration_type = attrs.get("integration_type")
if (
integration_type == Integration.IntegrationChoices.AMAZON_S3
and Integration.objects.filter(
configuration=attrs.get("configuration")
).exists()
):
raise ConflictException(
detail="This integration already exists.",
pointer="/data/attributes/configuration",
)
if (
integration_type == Integration.IntegrationChoices.JIRA
and Integration.objects.filter(
configuration__contains={
"domain": attrs.get("configuration").get("domain")
}
).exists()
):
raise ConflictException(
detail="This integration already exists.",
pointer="/data/attributes/configuration",
)
# Check if any provider already has a SecurityHub integration
if hasattr(self, "instance") and self.instance and not integration_type:
integration_type = self.instance.integration_type
if (
integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
and "providers" in attrs
):
providers = attrs.get("providers", [])
tenant_id = self.context.get("tenant_id")
for provider in providers:
# For updates, exclude the current instance from the check
query = IntegrationProviderRelationship.objects.filter(
provider=provider,
integration__integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
tenant_id=tenant_id,
)
if hasattr(self, "instance") and self.instance:
query = query.exclude(integration=self.instance)
if query.exists():
raise ConflictException(
detail=f"Provider {provider.id} already has a Security Hub integration. Only one "
"Security Hub integration is allowed per provider.",
pointer="/data/relationships/providers",
)
return super().validate(attrs)
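The validator above applies a different uniqueness rule per integration type: S3 integrations collide on the full configuration, while JIRA integrations collide on the domain alone. A plain-Python sketch of that rule, using an in-memory list instead of the Django ORM (names are illustrative):

```python
def is_duplicate(integration_type: str, configuration: dict, existing: list) -> bool:
    """Mirror the per-type uniqueness checks: full-configuration match
    for S3, domain-only match for JIRA."""
    if integration_type == "amazon_s3":
        return any(e["configuration"] == configuration for e in existing)
    if integration_type == "jira":
        domain = configuration.get("domain")
        return any(e["configuration"].get("domain") == domain for e in existing)
    return False


existing = [{"configuration": {"domain": "acme.atlassian.net"}}]
print(is_duplicate("jira", {"domain": "acme.atlassian.net"}, existing))  # → True
```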
@staticmethod
@@ -1970,14 +2020,46 @@ class BaseWriteIntegrationSerializer(BaseWriteSerializer):
if integration_type == Integration.IntegrationChoices.AMAZON_S3:
config_serializer = S3ConfigSerializer
credentials_serializers = [AWSCredentialSerializer]
# TODO: This will be required for AWS Security Hub
# if providers and not all(
# provider.provider == Provider.ProviderChoices.AWS
# for provider in providers
# ):
# raise serializers.ValidationError(
# {"providers": "All providers must be AWS for the S3 integration."}
# )
elif integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB:
if providers:
if len(providers) > 1:
raise serializers.ValidationError(
{
"providers": "Only one provider is supported for the Security Hub integration."
}
)
if providers[0].provider != Provider.ProviderChoices.AWS:
raise serializers.ValidationError(
{
"providers": "The provider must be AWS type for the Security Hub integration."
}
)
config_serializer = SecurityHubConfigSerializer
credentials_serializers = [AWSCredentialSerializer]
elif integration_type == Integration.IntegrationChoices.JIRA:
if providers:
raise serializers.ValidationError(
{
"providers": "Relationship field is not accepted. This integration applies to all providers."
}
)
if configuration:
raise serializers.ValidationError(
{
"configuration": "This integration does not support custom configuration."
}
)
config_serializer = JiraConfigSerializer
# Create non-editable configuration for JIRA integration
default_jira_issue_types = ["Task"]
configuration.update(
{
"projects": {},
"issue_types": default_jira_issue_types,
"domain": credentials.get("domain"),
}
)
credentials_serializers = [JiraCredentialSerializer]
else:
raise serializers.ValidationError(
{
@@ -2041,6 +2123,10 @@ class IntegrationSerializer(RLSSerializer):
for provider in representation["providers"]
if provider["id"] in allowed_provider_ids
]
if instance.integration_type == Integration.IntegrationChoices.JIRA:
representation["configuration"].update(
{"domain": instance.credentials.get("domain")}
)
return representation
@@ -2077,6 +2163,14 @@ class IntegrationCreateSerializer(BaseWriteIntegrationSerializer):
configuration = attrs.get("configuration")
credentials = attrs.get("credentials")
if (
not providers
and integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
):
raise serializers.ValidationError(
{"providers": "At least one provider is required for this integration."}
)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
@@ -2131,16 +2225,18 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
}
def validate(self, attrs):
super().validate(attrs)
integration_type = self.instance.integration_type
providers = attrs.get("providers")
if integration_type != Integration.IntegrationChoices.JIRA:
configuration = attrs.get("configuration") or self.instance.configuration
else:
configuration = attrs.get("configuration", {})
credentials = attrs.get("credentials") or self.instance.credentials
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
validated_attrs = super().validate(attrs)
return validated_attrs
def update(self, instance, validated_data):
@@ -2155,8 +2251,62 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
]
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
# Preserve regions field for Security Hub integrations
if instance.integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB:
if "configuration" in validated_data:
# Preserve the existing regions field if it exists
existing_regions = instance.configuration.get("regions", {})
validated_data["configuration"]["regions"] = existing_regions
return super().update(instance, validated_data)
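The update above overwrites the incoming configuration's `regions` with the stored one, since that map is maintained by the connection check rather than by the client. On plain dicts, the merge is a sketch like:

```python
def merge_configuration(existing: dict, incoming: dict) -> dict:
    """Apply the incoming configuration while preserving the existing
    'regions' map (maintained server-side by the connection check)."""
    merged = dict(incoming)
    merged["regions"] = existing.get("regions", {})
    return merged


print(merge_configuration({"regions": {"us-east-1": True}},
                          {"send_only_fails": True, "regions": {}}))
```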
def to_representation(self, instance):
representation = super().to_representation(instance)
# Ensure JIRA integrations show updated domain in configuration from credentials
if instance.integration_type == Integration.IntegrationChoices.JIRA:
representation["configuration"].update(
{"domain": instance.credentials.get("domain")}
)
return representation
class IntegrationJiraDispatchSerializer(serializers.Serializer):
"""
Serializer for dispatching findings to JIRA integration.
"""
project_key = serializers.CharField(required=True)
issue_type = serializers.ChoiceField(required=True, choices=["Task"])
class JSONAPIMeta:
resource_name = "integrations-jira-dispatches"
def validate(self, attrs):
validated_attrs = super().validate(attrs)
integration_instance = Integration.objects.get(
id=self.context.get("integration_id")
)
if integration_instance.integration_type != Integration.IntegrationChoices.JIRA:
raise ValidationError(
{"integration_type": "The given integration is not a JIRA integration"}
)
if not integration_instance.enabled:
raise ValidationError(
{"integration": "The given integration is not enabled"}
)
project_key = attrs.get("project_key")
if project_key not in integration_instance.configuration.get("projects", {}):
raise ValidationError(
{
"project_key": "The given project key is not available for this JIRA integration. Refresh the "
"connection if this is an error."
}
)
return validated_attrs
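Decoupled from DRF, the dispatch pre-checks above amount to three guards; a hedged sketch (the real serializer raises `ValidationError` instead of returning messages):

```python
def check_jira_dispatch(integration: dict, project_key: str):
    """Return an error message, or None when the dispatch is allowed."""
    if integration.get("integration_type") != "jira":
        return "The given integration is not a JIRA integration"
    if not integration.get("enabled"):
        return "The given integration is not enabled"
    if project_key not in integration.get("configuration", {}).get("projects", {}):
        return "The given project key is not available for this JIRA integration"
    return None


integration = {"integration_type": "jira", "enabled": True,
               "configuration": {"projects": {"SEC": "Security"}}}
print(check_jira_dispatch(integration, "OPS"))
# → "The given project key is not available for this JIRA integration"
```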
# Processors
@@ -12,6 +12,7 @@ from api.v1.views import (
FindingViewSet,
GithubSocialLoginView,
GoogleSocialLoginView,
IntegrationJiraViewSet,
IntegrationViewSet,
InvitationAcceptViewSet,
InvitationViewSet,
@@ -73,6 +74,13 @@ tenants_router.register(
users_router = routers.NestedSimpleRouter(router, r"users", lookup="user")
users_router.register(r"memberships", MembershipViewSet, basename="user-membership")
integrations_router = routers.NestedSimpleRouter(
router, r"integrations", lookup="integration"
)
integrations_router.register(
r"jira", IntegrationJiraViewSet, basename="integration-jira"
)
urlpatterns = [
path("tokens", CustomTokenObtainView.as_view(), name="token-obtain"),
path("tokens/refresh", CustomTokenRefreshView.as_view(), name="token-refresh"),
@@ -162,6 +170,7 @@ urlpatterns = [
path("", include(router.urls)),
path("", include(tenants_router.urls)),
path("", include(users_router.urls)),
path("", include(integrations_router.urls)),
path("schema", SchemaView.as_view(), name="schema"),
path("docs", SpectacularRedocView.as_view(url_name="schema"), name="docs"),
]
@@ -62,6 +62,7 @@ from tasks.tasks import (
check_provider_connection_task,
delete_provider_task,
delete_tenant_task,
jira_integration_task,
perform_scan_task,
)
@@ -75,8 +76,10 @@ from api.db_utils import rls_transaction
from api.exceptions import TaskFailedException
from api.filters import (
ComplianceOverviewFilter,
CustomDjangoFilterBackend,
FindingFilter,
IntegrationFilter,
IntegrationJiraFindingsFilter,
InvitationFilter,
LatestFindingFilter,
LatestResourceFilter,
@@ -89,6 +92,7 @@ from api.filters import (
RoleFilter,
ScanFilter,
ScanSummaryFilter,
ScanSummarySeverityFilter,
ServiceOverviewFilter,
TaskFilter,
TenantFilter,
@@ -142,6 +146,7 @@ from api.v1.serializers import (
FindingMetadataSerializer,
FindingSerializer,
IntegrationCreateSerializer,
IntegrationJiraDispatchSerializer,
IntegrationSerializer,
IntegrationUpdateSerializer,
InvitationAcceptSerializer,
@@ -214,6 +219,8 @@ class RelationshipViewSchema(JsonApiAutoSchema):
description="Obtain a token by providing valid credentials and an optional tenant ID.",
)
class CustomTokenObtainView(GenericAPIView):
throttle_scope = "token-obtain"
resource_name = "tokens"
serializer_class = TokenSerializer
http_method_names = ["post"]
@@ -293,7 +300,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
spectacular_settings.VERSION = "1.13.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -313,6 +320,11 @@ class SchemaView(SpectacularAPIView):
"description": "Endpoints for tenant invitations management, allowing retrieval and filtering of "
"invitations, creating new invitations, accepting and revoking them.",
},
{
"name": "Role",
"description": "Endpoints for managing RBAC roles within tenants, allowing creation, retrieval, "
"updating, and deletion of role configurations and permissions.",
},
{
"name": "Provider",
"description": "Endpoints for managing providers (AWS, GCP, Azure, etc...).",
@@ -321,10 +333,20 @@ class SchemaView(SpectacularAPIView):
"name": "Provider Group",
"description": "Endpoints for managing provider groups.",
},
{
"name": "Task",
"description": "Endpoints for task management, allowing retrieval of task status and "
"revoking tasks that have not started.",
},
{
"name": "Scan",
"description": "Endpoints for triggering manual scans and viewing scan results.",
},
{
"name": "Schedule",
"description": "Endpoints for managing scan schedules, allowing configuration of automated "
"scans with different scheduling options.",
},
{
"name": "Resource",
"description": "Endpoints for managing resources discovered by scans, allowing "
@@ -336,8 +358,9 @@ class SchemaView(SpectacularAPIView):
"findings that result from scans.",
},
{
"name": "Processor",
"description": "Endpoints for managing post-processors used to process Prowler findings, including "
"registration, configuration, and deletion of post-processing actions.",
},
{
"name": "Compliance Overview",
@@ -345,9 +368,8 @@ class SchemaView(SpectacularAPIView):
" compliance framework ID.",
},
{
"name": "Overview",
"description": "Endpoints for retrieving aggregated summaries of resources from the system.",
},
{
"name": "Integration",
@@ -355,14 +377,15 @@ class SchemaView(SpectacularAPIView):
" retrieval, and deletion of integrations such as S3, JIRA, or other services.",
},
{
"name": "Lighthouse AI",
"description": "Endpoints for managing Lighthouse AI configurations, including creation, retrieval, "
"updating, and deletion of configurations such as OpenAI keys, models, and business "
"context.",
},
{
"name": "SAML",
"description": "Endpoints for Single Sign-On authentication management via SAML for seamless user "
"authentication.",
},
]
return super().get(request, *args, **kwargs)
@@ -3033,7 +3056,9 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Compliance overviews metadata obtained successfully",
response=ComplianceOverviewMetadataSerializer,
),
202: OpenApiResponse(
description="The task is in progress", response=TaskSerializer
),
500: OpenApiResponse(
description="Compliance overviews generation task failed"
),
@@ -3065,7 +3090,9 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Compliance requirement details obtained successfully",
response=ComplianceOverviewDetailSerializer(many=True),
),
202: OpenApiResponse(
description="The task is in progress", response=TaskSerializer
),
500: OpenApiResponse(
description="Compliance overviews generation task failed"
),
@@ -3526,8 +3553,10 @@ class OverviewViewSet(BaseRLSViewSet):
def get_filterset_class(self):
if self.action == "providers":
return None
elif self.action == "findings":
return ScanSummaryFilter
elif self.action == "findings_severity":
return ScanSummarySeverityFilter
elif self.action == "services":
return ServiceOverviewFilter
return None
@@ -3649,7 +3678,12 @@ class OverviewViewSet(BaseRLSViewSet):
@action(detail=False, methods=["get"], url_name="findings_severity")
def findings_severity(self, request):
tenant_id = self.request.tenant_id
# Load only required fields
queryset = self.get_queryset().only(
"tenant_id", "scan_id", "severity", "fail", "_pass", "total"
)
filtered_queryset = self.filter_queryset(queryset)
provider_filter = (
{"provider__in": self.allowed_providers}
@@ -3669,16 +3703,22 @@ class OverviewViewSet(BaseRLSViewSet):
tenant_id=tenant_id, scan_id__in=latest_scan_ids
)
# The filter will have added a status_count annotation if any status filter was used
if "status_count" in filtered_queryset.query.annotations:
sum_expression = Sum("status_count")
else:
sum_expression = Sum("total")
severity_counts = (
filtered_queryset.values("severity")
.annotate(count=sum_expression)
.order_by("severity")
)
severity_data = {sev[0]: 0 for sev in SeverityChoices}
severity_data.update(
{item["severity"]: item["count"] for item in severity_counts}
)
serializer = self.get_serializer(severity_data)
return Response(serializer.data, status=status.HTTP_200_OK)
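Stripped of the ORM, the severity aggregation above seeds every severity with zero and then overwrites with the aggregated counts, so absent severities still appear in the response. A stand-alone sketch (severity names here are illustrative):

```python
# Illustrative severity names; the view uses SeverityChoices.
SEVERITIES = ("critical", "high", "medium", "low", "informational")


def severity_summary(rows: list) -> dict:
    """Seed all severities with 0, then apply aggregated counts."""
    data = {severity: 0 for severity in SEVERITIES}
    data.update({row["severity"]: row["count"] for row in rows})
    return data


print(severity_summary([{"severity": "high", "count": 3}]))
```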
@@ -3866,31 +3906,111 @@ class IntegrationViewSet(BaseRLSViewSet):
)
@extend_schema_view(
dispatches=extend_schema(
tags=["Integration"],
summary="Send findings to a Jira integration",
description="Send a set of filtered findings to the given integration. At least one finding filter must be "
"provided.",
responses={202: OpenApiResponse(response=TaskSerializer)},
filters=True,
)
)
class IntegrationJiraViewSet(BaseRLSViewSet):
queryset = Finding.all_objects.all()
serializer_class = IntegrationJiraDispatchSerializer
http_method_names = ["post"]
filter_backends = [CustomDjangoFilterBackend]
filterset_class = IntegrationJiraFindingsFilter
# RBAC required permissions
required_permissions = [Permissions.MANAGE_INTEGRATIONS]
@extend_schema(exclude=True)
def create(self, request, *args, **kwargs):
raise MethodNotAllowed(method="POST")
def get_queryset(self):
tenant_id = self.request.tenant_id
user_roles = get_role(self.request.user)
if user_roles.unlimited_visibility:
# User has unlimited visibility, return all findings
queryset = Finding.all_objects.filter(tenant_id=tenant_id)
else:
# User lacks permission, filter findings based on provider groups associated with the role
queryset = Finding.all_objects.filter(
scan__provider__in=get_providers(user_roles)
)
return queryset
@action(detail=False, methods=["post"], url_name="dispatches")
def dispatches(self, request, integration_pk=None):
get_object_or_404(Integration, pk=integration_pk)
serializer = self.get_serializer(
data=request.data, context={"integration_id": integration_pk}
)
serializer.is_valid(raise_exception=True)
if self.filter_queryset(self.get_queryset()).count() == 0:
raise ValidationError(
{"findings": "No findings match the provided filters"}
)
finding_ids = [
str(finding_id)
for finding_id in self.filter_queryset(self.get_queryset()).values_list(
"id", flat=True
)
]
project_key = serializer.validated_data["project_key"]
issue_type = serializer.validated_data["issue_type"]
with transaction.atomic():
task = jira_integration_task.delay(
tenant_id=self.request.tenant_id,
integration_id=integration_pk,
project_key=project_key,
issue_type=issue_type,
finding_ids=finding_ids,
)
prowler_task = Task.objects.get(id=task.id)
serializer = TaskSerializer(prowler_task)
return Response(
data=serializer.data,
status=status.HTTP_202_ACCEPTED,
headers={
"Content-Location": reverse(
"task-detail", kwargs={"pk": prowler_task.id}
)
},
)
@extend_schema_view(
list=extend_schema(
tags=["Lighthouse AI"],
summary="List all Lighthouse AI configurations",
description="Retrieve a list of all Lighthouse AI configurations.",
),
create=extend_schema(
tags=["Lighthouse AI"],
summary="Create a new Lighthouse AI configuration",
description="Create a new Lighthouse AI configuration with the specified details.",
),
partial_update=extend_schema(
tags=["Lighthouse AI"],
summary="Partially update a Lighthouse AI configuration",
description="Update certain fields of an existing Lighthouse AI configuration.",
),
destroy=extend_schema(
tags=["Lighthouse AI"],
summary="Delete a Lighthouse AI configuration",
description="Remove a Lighthouse AI configuration by its ID.",
),
connection=extend_schema(
tags=["Lighthouse AI"],
summary="Check the connection to the OpenAI API",
description="Verify the connection to the OpenAI API for a specific Lighthouse AI configuration.",
request=None,
responses={202: OpenApiResponse(response=TaskSerializer)},
),
@@ -108,6 +108,13 @@ REST_FRAMEWORK = {
),
"TEST_REQUEST_DEFAULT_FORMAT": "vnd.api+json",
"JSON_API_UNIFORM_EXCEPTIONS": True,
"DEFAULT_THROTTLE_CLASSES": [
"rest_framework.throttling.ScopedRateThrottle",
],
"DEFAULT_THROTTLE_RATES": {
"token-obtain": env("DJANGO_THROTTLE_TOKEN_OBTAIN", default=None),
"dj_rest_auth": None,
},
}
SPECTACULAR_SETTINGS = {
@@ -116,6 +123,9 @@ SPECTACULAR_SETTINGS = {
"PREPROCESSING_HOOKS": [
"drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
],
"POSTPROCESSING_HOOKS": [
"api.schema_hooks.attach_task_202_examples",
],
"TITLE": "API Reference - Prowler",
}
@@ -13,8 +13,10 @@ from api.models import Scan
from prowler.config.config import (
csv_file_suffix,
html_file_suffix,
json_asff_file_suffix,
json_ocsf_file_suffix,
)
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
@@ -109,6 +111,7 @@ OUTPUT_FORMATS_MAPPING = {
"kwargs": {},
},
"json-ocsf": {"class": OCSF, "suffix": json_ocsf_file_suffix, "kwargs": {}},
"json-asff": {"class": ASFF, "suffix": json_asff_file_suffix, "kwargs": {}},
"html": {"class": HTML, "suffix": html_file_suffix, "kwargs": {"stats": {}}},
}
@@ -2,15 +2,21 @@ import os
from glob import glob
from celery.utils.log import get_task_logger
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE
from tasks.utils import batched
from api.db_utils import rls_transaction
from api.models import Finding, Integration, Provider
from api.utils import initialize_prowler_integration, initialize_prowler_provider
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.finding import Finding as FindingOutput
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.common.models import Connection
logger = get_task_logger(__name__)
@@ -154,3 +160,347 @@ def upload_s3_integration(
except Exception as e:
logger.error(f"S3 integrations failed for provider {provider_id}: {str(e)}")
return False
def get_security_hub_client_from_integration(
integration: Integration, tenant_id: str, findings: list
) -> tuple[bool, SecurityHub | Connection]:
"""
Create and return a SecurityHub client using AWS credentials from an integration.
Args:
integration (Integration): The integration to get the Security Hub client from.
tenant_id (str): The tenant identifier.
findings (list): List of findings in ASFF format to send to Security Hub.
Returns:
tuple[bool, SecurityHub | Connection]: A tuple containing a boolean indicating
if the connection was successful and the SecurityHub client or connection object.
"""
# Get the provider associated with this integration
with rls_transaction(tenant_id):
provider_relationship = integration.integrationproviderrelationship_set.first()
if not provider_relationship:
return Connection(
is_connected=False, error="No provider associated with this integration"
)
provider_uid = provider_relationship.provider.uid
provider_secret = provider_relationship.provider.secret.secret
credentials = (
integration.credentials if integration.credentials else provider_secret
)
connection = SecurityHub.test_connection(
aws_account_id=provider_uid,
raise_on_exception=False,
**credentials,
)
if connection.is_connected:
all_security_hub_regions = AwsProvider.get_available_aws_service_regions(
"securityhub", connection.partition
)
# Create regions status dictionary
regions_status = {}
for region in set(all_security_hub_regions):
regions_status[region] = region in connection.enabled_regions
# Save regions information in the integration configuration
with rls_transaction(tenant_id):
integration.configuration["regions"] = regions_status
integration.save()
# Create SecurityHub client with all necessary parameters
security_hub = SecurityHub(
aws_account_id=provider_uid,
findings=findings,
send_only_fails=integration.configuration.get("send_only_fails", False),
aws_security_hub_available_regions=list(connection.enabled_regions),
**credentials,
)
return True, security_hub
else:
# Reset regions information if connection fails
with rls_transaction(tenant_id):
integration.configuration["regions"] = {}
integration.save()
return False, connection
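The per-region enablement map built above reduces to a membership test over the available regions; as a stand-alone sketch (region names below are illustrative):

```python
def build_regions_status(available: set, enabled: set) -> dict:
    """Mark each available Security Hub region True when it is enabled
    for the account, False otherwise."""
    return {region: region in enabled for region in available}


status = build_regions_status({"us-east-1", "eu-west-1"}, {"us-east-1"})
print(status)
```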
def upload_security_hub_integration(
tenant_id: str, provider_id: str, scan_id: str
) -> bool:
"""
Upload findings to AWS Security Hub using configured integrations.
This function retrieves findings from the database, transforms them to ASFF format,
and sends them to AWS Security Hub using the configured integration credentials.
Args:
tenant_id (str): The tenant identifier.
provider_id (str): The provider identifier.
scan_id (str): The scan identifier for which to send findings.
Returns:
bool: True if all integrations executed successfully, False otherwise.
"""
logger.info(f"Processing Security Hub integrations for provider {provider_id}")
try:
with rls_transaction(tenant_id):
# Get Security Hub integrations for this provider
integrations = list(
Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
)
if not integrations:
logger.error(
f"No Security Hub integrations found for provider {provider_id}"
)
return False
# Get the provider object
provider = Provider.objects.get(id=provider_id)
# Initialize prowler provider for finding transformation
prowler_provider = initialize_prowler_provider(provider)
# Process each Security Hub integration
integration_executions = 0
total_findings_sent = {} # Track findings sent per integration
for integration in integrations:
try:
# Initialize Security Hub client for this integration
# We'll create the client once and reuse it for all batches
security_hub_client = None
send_only_fails = integration.configuration.get(
"send_only_fails", False
)
total_findings_sent[integration.id] = 0
# Process findings in batches to avoid memory issues
has_findings = False
batch_number = 0
with rls_transaction(tenant_id):
qs = (
Finding.all_objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.order_by("uid")
.iterator()
)
for batch, _ in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
batch_number += 1
has_findings = True
# Transform findings for this batch
transformed_findings = [
FindingOutput.transform_api_finding(
finding, prowler_provider
)
for finding in batch
]
# Convert to ASFF format
asff_transformer = ASFF(
findings=transformed_findings,
file_path="",
file_extension="json",
)
asff_transformer.transform(transformed_findings)
# Get the batch of ASFF findings
batch_asff_findings = asff_transformer.data
if batch_asff_findings:
# Create Security Hub client for first batch or reuse existing
if not security_hub_client:
connected, security_hub = (
get_security_hub_client_from_integration(
integration, tenant_id, batch_asff_findings
)
)
if not connected:
logger.error(
f"Security Hub connection failed for integration {integration.id}: "
f"{security_hub.error}"
)
integration.connected = False
integration.save()
break # Skip this integration
security_hub_client = security_hub
logger.info(
f"Sending {'fail' if send_only_fails else 'all'} findings to Security Hub via "
f"integration {integration.id}"
)
else:
# Update findings in existing client for this batch
security_hub_client._findings_per_region = (
security_hub_client.filter(
batch_asff_findings, send_only_fails
)
)
# Send this batch to Security Hub
try:
findings_sent = (
security_hub_client.batch_send_to_security_hub()
)
total_findings_sent[integration.id] += findings_sent
if findings_sent > 0:
logger.debug(
f"Sent batch {batch_number} with {findings_sent} findings to Security Hub"
)
except Exception as batch_error:
logger.error(
f"Failed to send batch {batch_number} to Security Hub: {str(batch_error)}"
)
# Clear memory after processing each batch
asff_transformer._data.clear()
del batch_asff_findings
del transformed_findings
if not has_findings:
logger.info(
f"No findings to send to Security Hub for scan {scan_id}"
)
integration_executions += 1
elif security_hub_client:
if total_findings_sent[integration.id] > 0:
logger.info(
f"Successfully sent {total_findings_sent[integration.id]} total findings to Security Hub via integration {integration.id}"
)
integration_executions += 1
else:
logger.warning(
f"No findings were sent to Security Hub via integration {integration.id}"
)
# Archive previous findings if configured to do so
if integration.configuration.get(
"archive_previous_findings", False
):
logger.info(
f"Archiving previous findings in Security Hub via integration {integration.id}"
)
try:
findings_archived = (
security_hub_client.archive_previous_findings()
)
logger.info(
f"Successfully archived {findings_archived} previous findings in Security Hub"
)
except Exception as archive_error:
logger.warning(
f"Failed to archive previous findings: {str(archive_error)}"
)
except Exception as e:
logger.error(
f"Security Hub integration {integration.id} failed: {str(e)}"
)
continue
result = integration_executions == len(integrations)
if result:
logger.info(
f"All Security Hub integrations completed successfully for provider {provider_id}"
)
else:
logger.error(
f"Some Security Hub integrations failed for provider {provider_id}"
)
return result
except Exception as e:
logger.error(
f"Security Hub integrations failed for provider {provider_id}: {str(e)}"
)
return False
def send_findings_to_jira(
tenant_id: str,
integration_id: str,
project_key: str,
issue_type: str,
finding_ids: list[str],
):
with rls_transaction(tenant_id):
integration = Integration.objects.get(id=integration_id)
jira_integration = initialize_prowler_integration(integration)
num_tickets_created = 0
for finding_id in finding_ids:
with rls_transaction(tenant_id):
finding_instance = (
Finding.all_objects.select_related("scan__provider")
.prefetch_related("resources")
.get(id=finding_id)
)
# Extract resource information
resource = (
finding_instance.resources.first()
if finding_instance.resources.exists()
else None
)
resource_uid = resource.uid if resource else ""
resource_name = resource.name if resource else ""
resource_tags = {}
if resource and hasattr(resource, "tags"):
resource_tags = resource.get_tags(tenant_id)
# Get region
region = resource.region if resource and resource.region else ""
# Extract remediation information from check_metadata
check_metadata = finding_instance.check_metadata
remediation = check_metadata.get("remediation", {})
recommendation = remediation.get("recommendation", {})
remediation_code = remediation.get("code", {})
# Send the individual finding to Jira
result = jira_integration.send_finding(
check_id=finding_instance.check_id,
check_title=check_metadata.get("checktitle", ""),
severity=finding_instance.severity,
status=finding_instance.status,
status_extended=finding_instance.status_extended or "",
provider=finding_instance.scan.provider.provider,
region=region,
resource_uid=resource_uid,
resource_name=resource_name,
risk=check_metadata.get("risk", ""),
recommendation_text=recommendation.get("text", ""),
recommendation_url=recommendation.get("url", ""),
remediation_code_native_iac=remediation_code.get("nativeiac", ""),
remediation_code_terraform=remediation_code.get("terraform", ""),
remediation_code_cli=remediation_code.get("cli", ""),
remediation_code_other=remediation_code.get("other", ""),
resource_tags=resource_tags,
compliance=finding_instance.compliance or {},
project_key=project_key,
issue_type=issue_type,
)
if result:
num_tickets_created += 1
else:
logger.error(f"Failed to send finding {finding_id} to Jira")
return {
"created_count": num_tickets_created,
"failed_count": len(finding_ids) - num_tickets_created,
}
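For context on the return value, `send_findings_to_jira` hands back a summary dict with `created_count` and `failed_count`; a caller could render that dict like this (a minimal sketch for illustration, not part of the codebase above):

```python
def summarize_jira_result(result: dict) -> str:
    """Render the created/failed counts returned by send_findings_to_jira."""
    created = result.get("created_count", 0)
    failed = result.get("failed_count", 0)
    return f"{created} Jira tickets created, {failed} failed"

print(summarize_jira_result({"created_count": 3, "failed_count": 1}))
# → 3 Jira tickets created, 1 failed
```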
@@ -21,7 +21,11 @@ from tasks.jobs.export import (
_generate_output_directory,
_upload_to_s3,
)
from tasks.jobs.integrations import upload_s3_integration
from tasks.jobs.integrations import (
send_findings_to_jira,
upload_s3_integration,
upload_security_hub_integration,
)
from tasks.jobs.scan import (
aggregate_findings,
create_compliance_requirements,
@@ -62,6 +66,7 @@ def _perform_scan_complete_tasks(tenant_id: str, scan_id: str, provider_id: str)
check_integrations_task.si(
tenant_id=tenant_id,
provider_id=provider_id,
scan_id=scan_id,
),
).apply_async()
@@ -323,12 +328,30 @@ def generate_outputs_task(scan_id: str, provider_id: str, tenant_id: str):
ScanSummary.objects.filter(scan_id=scan_id)
)
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
# Check if we need to generate ASFF output for AWS providers with SecurityHub integration
generate_asff = False
if provider_type == "aws":
security_hub_integrations = Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
generate_asff = security_hub_integrations.exists()
qs = (
Finding.all_objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.order_by("uid")
.iterator()
)
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
# Skip ASFF generation if not needed
if mode == "json-asff" and not generate_asff:
continue
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
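The hunk above consumes `batched(qs, DJANGO_FINDINGS_BATCH_SIZE)`, which yields each batch of findings together with a flag marking the final one. A minimal sketch of such a helper, assuming only the `(batch, is_last)` contract visible in the loop (the real implementation lives elsewhere in the repository):

```python
from itertools import islice

def batched(iterable, batch_size):
    """Yield (batch, is_last) pairs; is_last is True only for the final batch."""
    iterator = iter(iterable)
    batch = list(islice(iterator, batch_size))
    while batch:
        # Peek at the next batch to know whether the current one is the last.
        next_batch = list(islice(iterator, batch_size))
        yield batch, not next_batch
        batch = next_batch

print(list(batched(range(5), 2)))
# → [([0, 1], False), ([2, 3], False), ([4], True)]
```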
@@ -474,17 +497,19 @@ def check_lighthouse_connection_task(lighthouse_config_id: str, tenant_id: str =
@shared_task(name="integration-check")
def check_integrations_task(tenant_id: str, provider_id: str):
def check_integrations_task(tenant_id: str, provider_id: str, scan_id: str = None):
"""
Check and execute all configured integrations for a provider.
Args:
tenant_id (str): The tenant identifier
provider_id (str): The provider identifier
scan_id (str, optional): The scan identifier for integrations that need scan data
"""
logger.info(f"Checking integrations for provider {provider_id}")
try:
integration_tasks = []
with rls_transaction(tenant_id):
integrations = Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
@@ -495,7 +520,16 @@ def check_integrations_task(tenant_id: str, provider_id: str):
logger.info(f"No integrations configured for provider {provider_id}")
return {"integrations_processed": 0}
integration_tasks = []
# Security Hub integration
security_hub_integrations = integrations.filter(
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB
)
if security_hub_integrations.exists():
integration_tasks.append(
security_hub_integration_task.s(
tenant_id=tenant_id, provider_id=provider_id, scan_id=scan_id
)
)
# TODO: Add other integration types here
# slack_integrations = integrations.filter(
@@ -541,3 +575,41 @@ def s3_integration_task(
output_directory (str): Path to the directory containing output files
"""
return upload_s3_integration(tenant_id, provider_id, output_directory)
@shared_task(
base=RLSTask,
name="integration-security-hub",
queue="integrations",
)
def security_hub_integration_task(
tenant_id: str,
provider_id: str,
scan_id: str,
):
"""
Process Security Hub integrations for a provider.
Args:
tenant_id (str): The tenant identifier
provider_id (str): The provider identifier
scan_id (str): The scan identifier
"""
return upload_security_hub_integration(tenant_id, provider_id, scan_id)
@shared_task(
base=RLSTask,
name="integration-jira",
queue="integrations",
)
def jira_integration_task(
tenant_id: str,
integration_id: str,
project_key: str,
issue_type: str,
finding_ids: list[str],
):
return send_findings_to_jira(
tenant_id, integration_id, project_key, issue_type, finding_ids
)
@@ -7,6 +7,7 @@ from tasks.tasks import (
check_integrations_task,
generate_outputs_task,
s3_integration_task,
security_hub_integration_task,
)
from api.models import Integration
@@ -521,31 +522,68 @@ class TestCheckIntegrationsTask:
enabled=True,
)
@patch("tasks.tasks.security_hub_integration_task")
@patch("tasks.tasks.group")
@patch("tasks.tasks.rls_transaction")
@patch("tasks.tasks.Integration.objects.filter")
def test_check_integrations_s3_success(
self, mock_integration_filter, mock_rls, mock_group
def test_check_integrations_security_hub_success(
self, mock_integration_filter, mock_rls, mock_group, mock_security_hub_task
):
# Mock that we have some integrations
mock_integration_filter.return_value.exists.return_value = True
"""Test that SecurityHub integrations are processed correctly."""
# Mock that we have SecurityHub integrations
mock_integrations = MagicMock()
mock_integrations.exists.return_value = True
# Mock SecurityHub integrations to return existing integrations
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = True
# Set up the filter chain
mock_integration_filter.return_value = mock_integrations
mock_integrations.filter.return_value = mock_security_hub_integrations
# Mock the task signature
mock_task_signature = MagicMock()
mock_security_hub_task.s.return_value = mock_task_signature
# Mock group job
mock_job = MagicMock()
mock_group.return_value = mock_job
# Ensure rls_transaction is mocked
mock_rls.return_value.__enter__.return_value = None
# Since the current implementation doesn't actually create tasks yet (TODO comment),
# we test that no tasks are created but the function returns the correct count
# Execute the function
result = check_integrations_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id="test-scan-id",
)
assert result == {"integrations_processed": 0}
# Should process 1 SecurityHub integration
assert result == {"integrations_processed": 1}
# Verify the integration filter was called
mock_integration_filter.assert_called_once_with(
integrationproviderrelationship__provider_id=self.provider_id,
enabled=True,
)
# group should not be called since no integration tasks are created yet
mock_group.assert_not_called()
# Verify SecurityHub integrations were filtered
mock_integrations.filter.assert_called_once_with(
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB
)
# Verify SecurityHub task was created with correct parameters
mock_security_hub_task.s.assert_called_once_with(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id="test-scan-id",
)
# Verify group was called and job was executed
mock_group.assert_called_once_with([mock_task_signature])
mock_job.apply_async.assert_called_once()
@patch("tasks.tasks.rls_transaction")
@patch("tasks.tasks.Integration.objects.filter")
@@ -567,6 +605,369 @@ class TestCheckIntegrationsTask:
enabled=True,
)
@patch("tasks.tasks.s3_integration_task")
@patch("tasks.tasks.Integration.objects.filter")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_with_asff_for_aws_with_security_hub(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
mock_integration_filter,
mock_s3_task,
):
"""Test that ASFF output is generated for AWS providers with SecurityHub integration."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock AWS provider
mock_provider = MagicMock()
mock_provider.uid = "aws-account-123"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
# Mock SecurityHub integration exists
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = True
mock_integration_filter.return_value = mock_security_hub_integrations
# Mock s3_integration_task
mock_s3_task.apply_async.return_value.get.return_value = True
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was created for AWS with SecurityHub
assert "asff" in created_writers, "ASFF writer should be created"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
# Verify SecurityHub integration was checked
assert mock_integration_filter.call_count == 2
mock_integration_filter.assert_any_call(
integrationproviderrelationship__provider_id=self.provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
assert result == {"upload": True}
@patch("tasks.tasks.s3_integration_task")
@patch("tasks.tasks.Integration.objects.filter")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_no_asff_for_aws_without_security_hub(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
mock_integration_filter,
mock_s3_task,
):
"""Test that ASFF output is NOT generated for AWS providers without SecurityHub integration."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock AWS provider
mock_provider = MagicMock()
mock_provider.uid = "aws-account-123"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
# Mock NO SecurityHub integration
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = False
mock_integration_filter.return_value = mock_security_hub_integrations
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was NOT created when no SecurityHub integration
assert "asff" not in created_writers, "ASFF writer should NOT be created"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
# Verify SecurityHub integration was checked
assert mock_integration_filter.call_count == 2
mock_integration_filter.assert_any_call(
integrationproviderrelationship__provider_id=self.provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
assert result == {"upload": True}
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_no_asff_for_non_aws_provider(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
):
"""Test that ASFF output is NOT generated for non-AWS providers (e.g., Azure, GCP)."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock Azure provider (non-AWS)
mock_provider = MagicMock()
mock_provider.uid = "azure-subscription-123"
mock_provider.provider = "azure" # Non-AWS provider
mock_provider_get.return_value = mock_provider
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was NOT created for non-AWS provider
assert (
"asff" not in created_writers
), "ASFF writer should NOT be created for non-AWS providers"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
assert result == {"upload": True}
@patch("tasks.tasks.upload_s3_integration")
def test_s3_integration_task_success(self, mock_upload):
mock_upload.return_value = True
@@ -598,3 +999,33 @@ class TestCheckIntegrationsTask:
mock_upload.assert_called_once_with(
self.tenant_id, self.provider_id, output_directory
)
@patch("tasks.tasks.upload_security_hub_integration")
def test_security_hub_integration_task_success(self, mock_upload):
"""Test successful SecurityHub integration task execution."""
mock_upload.return_value = True
scan_id = "test-scan-123"
result = security_hub_integration_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id=scan_id,
)
assert result is True
mock_upload.assert_called_once_with(self.tenant_id, self.provider_id, scan_id)
@patch("tasks.tasks.upload_security_hub_integration")
def test_security_hub_integration_task_failure(self, mock_upload):
"""Test SecurityHub integration task handling failure."""
mock_upload.return_value = False
scan_id = "test-scan-123"
result = security_hub_integration_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id=scan_id,
)
assert result is False
mock_upload.assert_called_once_with(self.tenant_id, self.provider_id, scan_id)
@@ -1,24 +0,0 @@
---
hide:
- toc
---
# About
## Author
Prowler was created by **Toni de la Fuente** in 2016.
| ![](img/toni.png)<br>[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/toniblyx.svg?style=social&label=Follow%20%40toniblyx)](https://twitter.com/toniblyx) [![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/prowlercloud.svg?style=social&label=Follow%20%40prowlercloud)](https://twitter.com/prowlercloud)|
|:--:|
| <b>Toni de la Fuente </b>|
## Maintainers
Prowler is maintained by the Engineers of the **Prowler Team** :
| ![](img/nacho.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/NachoRivCor.svg?style=social&label=Follow%20%40NachoRivCor)](https://twitter.com/NachoRivCor) | ![](img/sergio.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/sergargar1.svg?style=social&label=Follow%20%40sergargar1)](https://twitter.com/sergargar1) |![](img/pepe.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/jfagoagas.svg?style=social&label=Follow%20%40jfagoagas)](https://twitter.com/jfagoagas) |
|:--:|:--:|:--:
| <b>Nacho Rivera</b>| <b>Sergio Garcia</b>| <b>Pepe Fagoaga</b>|
## License
Prowler is licensed as **Apache License 2.0** as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>
@@ -50,7 +50,7 @@ Click `Go to Scans` to monitor progress.
Review findings during scan execution in the following sections:
- **Overview** Provides a high-level summary of your scans.
<img src="../../img/overview.png" alt="Overview" width="700"/>
<img src="../../products/img/overview.png" alt="Overview" width="700"/>
- **Compliance** Displays compliance insights based on security frameworks.
<img src="../../img/compliance.png" alt="Compliance" width="700"/>
@@ -1,6 +1,6 @@
## Running Prowler
Running Prowler requires specifying the provider (e.g `aws`, `gcp`, `azure`, `m365`, `github` or `kubernetes`):
Running Prowler requires specifying the provider (e.g. `aws`, `gcp`, `azure`, `kubernetes`, `m365`, `github`, `iac` or `mongodbatlas`):
???+ note
If no provider is specified, AWS is used by default for backward compatibility with Prowler v2.
@@ -11,7 +11,7 @@ prowler <provider>
![Prowler Execution](../img/short-display.png)
???+ note
Running the `prowler` command without options will uses environment variable credentials. Refer to the [Requirements](../getting-started/requirements.md) section for credential configuration details.
Running the `prowler` command without options will use environment variable credentials. Refer to the Authentication section of each provider for credential configuration details.
## Verbose Output
@@ -77,7 +77,7 @@ prowler aws --profile custom-profile -f us-east-1 eu-south-2
???+ note
By default, `prowler` will scan all AWS regions.
See more details about AWS Authentication in the [Requirements](../getting-started/requirements.md#aws) section.
See more details about AWS Authentication in the [Authentication](../tutorials/aws/authentication.md) section.
## Azure
@@ -97,7 +97,7 @@ prowler azure --browser-auth --tenant-id "XXXXXXXX"
prowler azure --managed-identity-auth
```
See more details about Azure Authentication in [Requirements](../getting-started/requirements.md#azure)
See more details about Azure Authentication in the [Authentication Section](../tutorials/azure/authentication.md)
By default, Prowler scans all accessible subscriptions. Scan specific subscriptions using the following flag (using az cli auth as example):
@@ -193,7 +193,7 @@ prowler m365 --browser-auth --tenant-id "XXXXXXXX"
```
See more details about M365 Authentication in the [Requirements](../getting-started/requirements.md#microsoft-365) section.
See more details about M365 Authentication in the [Authentication](../tutorials/microsoft365/authentication.md) section.
## GitHub
@@ -223,7 +223,7 @@ Prowler enables security scanning of your **GitHub account**, including **Reposi
## Infrastructure as Code (IaC)
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Checkov](https://www.checkov.io/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
```console
# Scan a directory for IaC files
@@ -252,6 +252,31 @@ prowler iac --scan-path ./my-iac-directory --exclude-path ./my-iac-directory/tes
- For remote repository scans, authentication can be provided via CLI flags or environment variables (`GITHUB_OAUTH_APP_TOKEN`, `GITHUB_USERNAME`, `GITHUB_PERSONAL_ACCESS_TOKEN`). CLI flags take precedence.
- The IaC provider does not require cloud authentication for local scans.
- It is ideal for CI/CD pipelines and local development environments.
- For more details on supported frameworks and rules, see the [Checkov documentation](https://www.checkov.io/1.Welcome/Quick%20Start.html)
- For more details on supported scanners, see the [Trivy documentation](https://trivy.dev/latest/docs/scanner/vulnerability/)
See more details about IaC scanning in the [IaC Tutorial](../tutorials/iac/getting-started-iac.md) section.
## MongoDB Atlas
Prowler allows you to scan your MongoDB Atlas cloud database deployments for security and compliance issues.
Authentication is done using MongoDB Atlas API key pairs:
```console
# Using command-line arguments
prowler mongodbatlas --atlas-public-key <public_key> --atlas-private-key <private_key>
# Using environment variables
export ATLAS_PUBLIC_KEY=<public_key>
export ATLAS_PRIVATE_KEY=<private_key>
prowler mongodbatlas
```
You can filter scans to specific organizations or projects:
```console
# Scan specific project
prowler mongodbatlas --atlas-project-id <project_id>
```
See more details about MongoDB Atlas Authentication in [Requirements](../getting-started/requirements.md#mongodb-atlas)
@@ -2,7 +2,7 @@
In this page you can find all the details about [Amazon Web Services (AWS)](https://aws.amazon.com/) provider implementation in Prowler.
By default, Prowler will audit just one account and organization settings per scan. To configure it, follow the [getting started](../index.md#aws) page.
By default, Prowler will audit just one account and organization settings per scan. To configure it, follow the [AWS getting started guide](../tutorials/aws/getting-started-aws.md).
## AWS Provider Classes Architecture
@@ -2,7 +2,7 @@
In this page you can find all the details about [Microsoft Azure](https://azure.microsoft.com/) provider implementation in Prowler.
By default, Prowler will audit all the subscriptions that it is able to list in the Microsoft Entra tenant, and tenant Entra ID service. To configure it, follow the [getting started](../index.md#azure) page.
By default, Prowler will audit all the subscriptions that it is able to list in the Microsoft Entra tenant, and tenant Entra ID service. To configure it, follow the [Azure getting started guide](../tutorials/azure/getting-started-azure.md).
## Azure Provider Classes Architecture
@@ -265,7 +265,7 @@ Below is a generic example of a check metadata file. **Do not include comments i
- For AWS this field must follow the [AWS Security Hub Types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-required-attributes.html#Types) format, so the common pattern to follow is `namespace/category/classifier`; refer to the linked documentation for the valid values for these fields.
- **ServiceName** — The name of the provider service being audited. This field **must** be in lowercase and match with the service folder name. For supported services refer to [Prowler Hub](https://hub.prowler.com/check) or directly to [Prowler Code](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers).
- **SubServiceName** — The subservice or resource within the service, if applicable. For more information refer to the [Naming Format for Checks](#naming-format-for-checks) section.
- **ResourceIdTemplate** — A template for the unique resource identifier. For more information refer to the [Prowler's Resource Identification](#prowlers-resource-identification) section.
- **ResourceIdTemplate** — A template for the unique resource identifier. For more information refer to the [Resource Identification in Prowler](#resource-identification-in-prowler) section.
- **Severity** — The severity of the finding if the check fails. Must be one of: `critical`, `high`, `medium`, `low`, or `informational`, this field **must** be in lowercase. To get more information about the severity levels refer to the [Prowler's Check Severity Levels](#prowlers-check-severity-levels) section.
- **ResourceType** — The type of resource being audited. *For now this field is only standardized for the AWS provider*.
- For AWS use the [Security Hub resource types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-resources.html) or, if not available, the PascalCase version of the [CloudFormation type](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html) (e.g., `AwsEc2Instance`). Use "Other" if no match exists.
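The field constraints listed above (lowercase `ServiceName`, a fixed `Severity` vocabulary) lend themselves to a mechanical sanity check; a hedged sketch for illustration, not part of Prowler's actual tooling:

```python
VALID_SEVERITIES = {"critical", "high", "medium", "low", "informational"}

def validate_metadata(metadata: dict) -> list[str]:
    """Return a list of constraint violations for a check metadata dict."""
    errors = []
    service = metadata.get("ServiceName", "")
    if service != service.lower():
        errors.append("ServiceName must be lowercase")
    if metadata.get("Severity") not in VALID_SEVERITIES:
        errors.append(
            "Severity must be one of: " + ", ".join(sorted(VALID_SEVERITIES))
        )
    return errors

print(validate_metadata({"ServiceName": "S3", "Severity": "High"}))
```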
@@ -2,7 +2,7 @@
This page details the [Google Cloud Platform (GCP)](https://cloud.google.com/) provider implementation in Prowler.
By default, Prowler will audit all the GCP projects that the authenticated identity can access. To configure it, follow the [getting started](../index.md#google-cloud) page.
By default, Prowler will audit all the GCP projects that the authenticated identity can access. To configure it, follow the [GCP getting started guide](../tutorials/gcp/getting-started-gcp.md).
## GCP Provider Classes Architecture
@@ -2,7 +2,7 @@
This page details the [GitHub](https://github.com/) provider implementation in Prowler.
By default, Prowler will audit the GitHub account - scanning all repositories, organizations, and applications that your configured credentials can access. To configure it, follow the [getting started](../index.md#github) page.
By default, Prowler will audit the GitHub account - scanning all repositories, organizations, and applications that your configured credentials can access. To configure it, follow the [GitHub getting started guide](../tutorials/github/getting-started-github.md).
## GitHub Provider Classes Architecture
This page details the [Kubernetes](https://kubernetes.io/) provider implementation in Prowler.
By default, Prowler will audit all namespaces in the Kubernetes cluster accessible by the configured context. To configure it, see the [In-Cluster Execution](../tutorials/kubernetes/in-cluster.md) or [Non In-Cluster Execution](../tutorials/kubernetes/outside-cluster.md) guides.
## Kubernetes Provider Classes Architecture
This page details the [Microsoft 365 (M365)](https://www.microsoft.com/en-us/microsoft-365) provider implementation in Prowler.
By default, Prowler will audit the Microsoft Entra ID tenant and its supported services. To configure it, follow the [M365 getting started guide](../tutorials/microsoft365/getting-started-m365.md).
---
- **Required modules:**
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (≥ 3.6.0)
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (≥ 6.6.0)
- If you use Prowler Cloud or the official containers, PowerShell is pre-installed. For local or pip installations, you must install PowerShell and the modules yourself. See [Authentication: Supported PowerShell Versions](../tutorials/microsoft365/authentication.md#supported-powershell-versions) and [Needed PowerShell Modules](../tutorials/microsoft365/authentication.md#required-powershell-modules).
- For more details and troubleshooting, see [Use of PowerShell in M365](../tutorials/microsoft365/use-of-powershell.md).
---
- Software as a Service (SaaS) Platforms (like Microsoft 365)
- Development Platforms (like GitHub)
- Container Orchestration Platforms (like Kubernetes)
- Database-as-a-Service Platforms (like MongoDB Atlas)
For providers supported by Prowler, refer to [Prowler Hub](https://hub.prowler.com/).
- [Kubernetes](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/kubernetes/kubernetes_provider.py)
- [Microsoft365](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/microsoft365/microsoft365_provider.py)
- [GitHub](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/github/github_provider.py)
- [MongoDB Atlas](https://github.com/prowler-cloud/prowler/blob/master/prowler/providers/mongodbatlas/mongodbatlas_provider.py)
### Basic Provider Implementation: Pseudocode Example
Provider-Specific Permissions Documentation:
- [AWS](../tutorials/aws/authentication.md#required-permissions)
- [Azure](../tutorials/azure/authentication.md#required-permissions)
- [GCP](../tutorials/gcp/authentication.md#required-permissions)
- [M365](../tutorials/microsoft365/authentication.md#required-permissions)
- [GitHub](../tutorials/github/authentication.md)
## Best Practices
### Prerequisites
If you have not installed Prowler yet, refer to the [developer guide introduction](./introduction.md#getting-the-code-and-installing-all-dependencies).
### Executing Tests
While service tests resemble *Integration Tests*, as they assess how the service interacts with the provider, they ultimately fall under *Unit Tests*, due to the use of Moto or custom mock objects.
For detailed guidance on test creation and existing service tests, check the current [AWS checks implementation](https://github.com/prowler-cloud/prowler/tree/master/tests/providers/aws/services).
## GCP
# Prowler Requirements
Prowler is built in Python and utilizes the following SDKs:
- [AWS SDK (Boto3)](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html#)
- [Azure SDK](https://azure.github.io/azure-sdk-for-python/)
- [GCP API Python Client](https://github.com/googleapis/google-api-python-client/)
- [Kubernetes SDK](https://github.com/kubernetes-client/python)
- [M365 Graph SDK](https://github.com/microsoftgraph/msgraph-sdk-python)
- [GitHub REST API SDK](https://github.com/PyGithub/PyGithub)
## AWS
Prowler requires AWS credentials to function properly. You can authenticate using any method outlined in the [AWS CLI configuration guide](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html#cli-configure-quickstart-precedence).
### Authentication Steps
Ensure your AWS CLI is correctly configured with valid credentials and region settings. You can achieve this via:
```console
aws configure
```
or
```console
export AWS_ACCESS_KEY_ID="ASXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
export AWS_SESSION_TOKEN="XXXXXXXXX"
```
#### Required IAM Permissions
The credentials used must be associated with a user or role that has appropriate permissions for security checks. Attach the following AWS managed policies to ensure access:
- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`
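As a sketch, the two managed policies could be attached to a role with the AWS CLI. The role name below is hypothetical, and the commands are printed rather than executed so they can be reviewed first:

```sh
# Hypothetical role assumed by Prowler; the loop prints the commands for review
role="ProwlerScanRole"
for policy in "arn:aws:iam::aws:policy/SecurityAudit" \
              "arn:aws:iam::aws:policy/job-function/ViewOnlyAccess"; do
  echo "aws iam attach-role-policy --role-name $role --policy-arn $policy"
done
```

Remove the `echo` to run the commands directly against an account where you have IAM permissions.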
#### Additional Permissions
For certain checks, additional read-only permissions are required. Attach the following custom policy to your role:
[prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json)
If you intend to send findings to
[AWS Security Hub](https://aws.amazon.com/security-hub), attach the following custom policy:
[prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
### Multi-Factor Authentication (MFA)
If your IAM entity requires Multi-Factor Authentication (MFA), you can use the `--mfa` flag. Prowler will prompt you to enter the following values to initiate a new session:
- **ARN of your MFA device**
- **TOTP (Time-Based One-Time Password)**
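A minimal sketch of the values Prowler prompts for with `--mfa`; the account ID and user name are hypothetical, and the format check simply mirrors the shape of an IAM MFA device ARN:

```sh
# Example MFA device ARN (hypothetical account and user)
mfa_arn="arn:aws:iam::123456789012:mfa/prowler-user"
case "$mfa_arn" in
  arn:aws:iam::*:mfa/*) arn_ok=yes ;;
  *) arn_ok=no ;;
esac
echo "arn_ok=$arn_ok"
# Then run: prowler aws --mfa
# Prowler prompts for this ARN and the current 6-digit TOTP code.
```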
## Azure
Prowler for Azure supports multiple authentication types. To use a specific method, pass the appropriate flag during execution:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- Existing **AZ CLI credentials**
- **Interactive browser authentication**
- [**Managed Identity**](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
> ⚠️ **Important:** For Prowler App, only Service Principal authentication is supported.
### Service Principal Application Authentication
To allow Prowler to authenticate using a Service Principal Application, set up the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXX"
```
If you execute Prowler with the `--sp-env-auth` flag and these variables are not set or exported, execution will fail.
Refer to the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md#how-to-create-prowler-service-principal-application) guide for detailed setup instructions.
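As a pre-flight sketch, you can confirm all three variables are present before invoking Prowler with `--sp-env-auth`; the values below are placeholders, not real credentials:

```sh
# Placeholder values; replace with your Entra application's real credentials
export AZURE_CLIENT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_TENANT_ID="11111111-1111-1111-1111-111111111111"
export AZURE_CLIENT_SECRET="example-secret"

# --sp-env-auth fails if any of the three variables is unset or empty
missing=""
for var in AZURE_CLIENT_ID AZURE_TENANT_ID AZURE_CLIENT_SECRET; do
  [ -n "$(printenv "$var")" ] || missing="$missing $var"
done
echo "missing:$missing"
```

An empty `missing` list means the environment is ready for `--sp-env-auth`.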
### Azure Authentication Methods
Prowler for Azure supports the following authentication methods:
- **AZ CLI Authentication (`--az-cli-auth`)** Automated authentication using stored AZ CLI credentials.
- **Managed Identity Authentication (`--managed-identity-auth`)** Automated authentication via Azure Managed Identity.
- **Browser Authentication (`--browser-auth`)** Requires the user to authenticate using the default browser. The `tenant-id` parameter is mandatory for this method.
### Required Permissions
Prowler for Azure requires two types of permission scopes:
#### Microsoft Entra ID Permissions
These permissions allow Prowler to retrieve metadata from the assumed identity and perform specific Entra checks. While not mandatory for execution, they enhance functionality.
Required permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All` (used for Entra multifactor authentication checks)
???+ note
You can replace `Directory.Read.All` with `Domain.Read.All`, which is a more restrictive permission, but then you won't be able to run the Entra checks related to DirectoryRoles and GetUsers.
#### Subscription Scope Permissions
These permissions are required to perform security checks against Azure resources. The following **RBAC roles** must be assigned per subscription to the entity used by Prowler:
- `Reader` Grants read-only access to Azure resources.
- `ProwlerRole` A custom role with minimal permissions, defined in the [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json).
???+ note
The `assignableScopes` field in the JSON custom role file must be updated to reflect the correct subscription or management group. Use one of the following formats: `/subscriptions/<subscription-id>` or `/providers/Microsoft.Management/managementGroups/<management-group-id>`.
### Assigning Permissions
To properly configure permissions, follow these guides:
- [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions)
- [Azure subscription permissions](../tutorials/azure/subscriptions.md#assign-the-appropriate-permissions-to-the-identity-that-is-going-to-be-assumed-by-prowler)
???+ warning
Some permissions in `ProwlerRole` involve **write access**. If a `ReadOnly` lock is attached to certain resources, you may encounter errors, and findings for those checks will not be available.
#### Checks Requiring `ProwlerRole`
The following security checks require the `ProwlerRole` permissions for execution. Ensure the role is assigned to the identity assumed by Prowler before running these checks:
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`
## Google Cloud
### Authentication
Prowler follows the same credential discovery process as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order):
1. **Environment Variable Authentication** Uses the [`GOOGLE_APPLICATION_CREDENTIALS` environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC).
2. **Google Cloud CLI Credentials** Uses credentials configured via the [Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal).
3. **Service Account Authentication** Retrieves the attached service account credentials from the metadata server. More details [here](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa).
### Required Permissions
Prowler for Google Cloud requires the following permissions:
#### IAM Roles
- **Reader (`roles/reader`)** Must be granted at the **project, folder, or organization** level to allow scanning of target projects.
#### Project-Level Settings
At least one project must have the following configurations:
- **Identity and Access Management (IAM) API (`iam.googleapis.com`)** Must be enabled via:
- The [Google Cloud API UI](https://console.cloud.google.com/apis/api/iam.googleapis.com/metrics), or
- The `gcloud` CLI:
```sh
gcloud services enable iam.googleapis.com --project <your-project-id>
```
- **Service Usage Consumer (`roles/serviceusage.serviceUsageConsumer`)** IAM role: Required for resource scanning.
- **Quota Project Setting** Define a quota project using either:
- The `gcloud` CLI:
```sh
gcloud auth application-default set-quota-project <project-id>
```
- Setting an environment variable:
```sh
export GOOGLE_CLOUD_QUOTA_PROJECT=<project-id>
```
### Default Project Scanning
By default, Prowler scans **all accessible GCP projects**. To limit the scan to specific projects, use the `--project-ids` flag.
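A short sketch of restricting the scan scope; the project IDs are hypothetical, and the command is built as a string so it can be inspected before running:

```sh
# Hypothetical project IDs; without --project-ids Prowler scans every accessible project
projects="my-project-1 my-project-2"
cmd="prowler gcp --project-ids $projects"
echo "$cmd"
```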
## Microsoft 365
Prowler for Microsoft 365 (M365) supports the following authentication methods:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- **Service Principal Application with Microsoft User Credentials**
- **Stored AZ CLI credentials**
- **Interactive browser authentication**
???+ warning
Prowler App supports the **Service Principal** and **Service Principal with User Credentials** authentication methods, but the latter will be deprecated in September, when Microsoft enforces MFA in all tenants and no longer allows non-interactive user authentication.
### Service Principal Authentication (Recommended)
**Authentication flag:** `--sp-env-auth`
To enable Prowler to authenticate as the **Service Principal Application**, configure the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
```
If these variables are not set or exported, execution using `--sp-env-auth` will fail.
Refer to the [Create Prowler Service Principal](../tutorials/microsoft365/getting-started-m365.md#create-the-service-principal-app) guide for setup instructions.
If the external API permissions described in that section are not added, only checks that work through MS Graph will be executed, so the full provider will not run.
???+ note
To scan all the M365 checks, the required permissions must be added to the service principal application. Refer to the [External API Permissions Assignment](../tutorials/microsoft365/getting-started-m365.md#grant-powershell-modules-permissions) section for more information.
### Service Principal and User Credentials Authentication
Authentication flag: `--env-auth`
???+ warning
This method is no longer recommended; use the **Service Principal Application** authentication method instead.
This method builds upon the Service Principal authentication by adding User Credentials. Configure the following environment variables: `M365_USER` and `M365_PASSWORD`.
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export M365_USER="your_email@example.com"
export M365_PASSWORD="examplepassword"
```
These two new environment variables are **required** in this authentication method to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to Microsoft PowerShell modules.
- `M365_USER` should be your Microsoft account email using the **assigned domain in the tenant**. This means it must look like `example@YourCompany.onmicrosoft.com` or `example@YourCompany.com`, but it must be the exact domain assigned to that user in the tenant.
???+ warning
If the user is newly created, you need to sign in with that account first, as Microsoft will prompt you to change the password. If you don't complete this step, user authentication will fail because Microsoft marks the initial password as expired.
???+ warning
The user must not be MFA capable. Microsoft does not allow MFA capable users to authenticate programmatically. See [Microsoft documentation](https://learn.microsoft.com/en-us/entra/identity-platform/scenario-desktop-acquire-token-username-password?tabs=dotnet) for more information.
???+ warning
Using a tenant domain other than the one assigned — even if it belongs to the same tenant — will cause Prowler to fail, as Microsoft authentication will not succeed.
Ensure you are using the right domain for the user you are trying to authenticate with.
![User Domains](../tutorials/microsoft365/img/user-domains.png)
- `M365_PASSWORD` must be the user password.
???+ note
Previously, an encrypted password was required; now the plain user password is expected, and Prowler handles the encryption for you.
### Interactive Browser Authentication
**Authentication flag:** `--browser-auth`
This authentication method requires the user to authenticate against Azure using the default browser to start the scan. The `--tenant-id` flag is also required.
With these credentials, you will only be able to run checks that rely on Microsoft Graph. This means you won't be able to run the entire provider. To perform a full M365 security scan, use the **recommended authentication method**.
Since this is a **delegated permission** authentication method, necessary permissions should be assigned to the user rather than the application.
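The steps above can be sketched as a single invocation; the tenant ID is a placeholder, and the command is built as a string so it can be reviewed before running:

```sh
# Hypothetical tenant ID; --tenant-id is mandatory with --browser-auth
tenant_id="11111111-1111-1111-1111-111111111111"
cmd="prowler m365 --browser-auth --tenant-id $tenant_id"
echo "$cmd"
```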
### Required Permissions
To run the full Prowler provider, including PowerShell checks, two types of permission scopes must be set in **Microsoft Entra ID**.
#### For Service Principal Authentication (`--sp-env-auth`) - Recommended
When using service principal authentication, you need to add the following **Application Permissions**:
**Microsoft Graph API Permissions:**
- `AuditLog.Read.All`: Required for Entra service.
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
**External API Permissions:**
- `Exchange.ManageAsApp` from external API `Office 365 Exchange Online`: Required for Exchange PowerShell module app authentication. You also need to assign the `Global Reader` role to the app.
- `application_access` from external API `Skype and Teams Tenant Admin API`: Required for Teams PowerShell module app authentication.
???+ note
`Directory.Read.All` can be replaced with `Domain.Read.All`, which is a more restrictive permission, but then you won't be able to run the Entra checks related to DirectoryRoles and GetUsers.
> If you do this, you will also need to add the `Organization.Read.All` permission to the service principal application in order to authenticate.
???+ note
This is the **recommended authentication method** because it allows you to run the full M365 provider, including PowerShell checks, providing complete coverage of all available security checks. It offers the same coverage as Service Principal + User Credentials authentication, but that method will be deprecated in September, when Microsoft enforces MFA in all tenants and no longer allows non-interactive user authentication.
#### For Service Principal + User Credentials Authentication (`--env-auth`)
When using service principal with user credentials authentication, you need **both** sets of permissions:
**1. Service Principal Application Permissions**:
- You **will need** all the Microsoft Graph API permissions listed above.
- You **won't need** the External API permissions listed above.
**2. User-Level Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
- `Global Reader` (recommended): allows reading everything Prowler needs.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles; with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a Global Reader (since only read access is needed, Global Reader is recommended).
#### For Browser Authentication (`--browser-auth`)
When using browser authentication, permissions are delegated to the user, so the user must have the appropriate permissions rather than the application.
???+ warning
With browser authentication, you will only be able to run checks that work through MS Graph API. PowerShell module checks will not be executed.
### Assigning Permissions and Roles
For guidance on assigning the necessary permissions and roles, follow these instructions:
- [Grant API Permissions](../tutorials/microsoft365/getting-started-m365.md#grant-required-graph-api-permissions)
- [Assign Required Roles](../tutorials/microsoft365/getting-started-m365.md#if-using-user-authentication)
### Supported PowerShell Versions
PowerShell is required to run certain M365 checks.
**Supported versions:**
- **PowerShell 7.4 or higher** (7.5 is recommended)
#### Why Is PowerShell 7.4+ Required?
- **PowerShell 5.1** (default on some Windows systems) does not support required cmdlets.
- Older [cross-platform PowerShell versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are **unsupported**, leading to potential errors.
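The 7.4 minimum can be enforced with a simple version gate. This sketch hardcodes an illustrative version string; in practice you would obtain it from `pwsh` itself, as noted in the comment:

```sh
# Sketch of the version gate; in practice obtain the version with:
#   version=$(pwsh -NoProfile -Command '$PSVersionTable.PSVersion.ToString()')
version="7.5.2"   # illustrative value
major=${version%%.*}
rest=${version#*.}
minor=${rest%%.*}
if [ "$major" -gt 7 ] || { [ "$major" -eq 7 ] && [ "$minor" -ge 4 ]; }; then
  supported=yes
else
  supported=no
fi
echo "supported=$supported"
```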
???+ note
Installing PowerShell is only necessary if you install Prowler via **pip or other sources**. **SDK and API containers include PowerShell by default.**
### Installing PowerShell
Installing PowerShell is different depending on your OS.
- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): you will need PowerShell 7.4+ to run Prowler; otherwise some checks will not show findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, Windows Server 2016, and higher versions.
```console
winget install --id Microsoft.PowerShell --source winget
```
- [MacOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): installing PowerShell on macOS requires [brew](https://brew.sh/); once you have it, just run the command below. Pwsh is only supported on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64.
```console
brew install powershell/tap/powershell
```
Once it's installed run `pwsh` on your terminal to verify it's working.
- Linux: installing PowerShell on Linux depends on the distro you are using:
- [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): PowerShell 7.4+ requires Ubuntu 22.04 or Ubuntu 24.04. The recommended way to install it is via the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget apt-transport-https software-properties-common
# Get the version of Ubuntu
source /etc/os-release
# Download the Microsoft repository keys
wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): the only supported version for PowerShell 7.4+ on Alpine is Alpine 3.20. The only way to install it is by downloading the tar.gz package available on [PowerShell GitHub](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz). Follow these steps:
```console
# Install the requirements
sudo apk add --no-cache \
ca-certificates \
less \
ncurses-terminfo-base \
krb5-libs \
libgcc \
libintl \
libssl3 \
libstdc++ \
tzdata \
userspace-rcu \
zlib \
icu-libs \
curl
apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
lttng-ust \
openssh-client
# Download the powershell '.tar.gz' archive
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz
# Create the target folder where powershell will be placed
sudo mkdir -p /opt/microsoft/powershell/7
# Expand powershell to the target folder
sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7
# Set execute permissions
sudo chmod +x /opt/microsoft/powershell/7/pwsh
# Create the symbolic link that points to pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
# Start PowerShell
pwsh
```
- [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): PowerShell 7.4+ requires Debian 11 or Debian 12. The recommended way to install it is via the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget
# Get the version of Debian
source /etc/os-release
# Download the Microsoft repository GPG keys
wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository GPG keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): PowerShell 7.4+ requires RHEL 8 or RHEL 9. The recommended way to install it is via the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Get version of RHEL
source /etc/os-release
if [ ${VERSION_ID%.*} -lt 8 ]
then majorver=7
elif [ ${VERSION_ID%.*} -lt 9 ]
then majorver=8
else majorver=9
fi
# Download the Microsoft RedHat repository package
curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm
# Register the Microsoft RedHat repository
sudo rpm -i packages-microsoft-prod.rpm
# Delete the downloaded package after installing
rm packages-microsoft-prod.rpm
# Update package index files
sudo dnf update
# Install PowerShell
sudo dnf install powershell -y
```
- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): the following command pulls an image that includes the latest stable version of PowerShell:
```console
docker pull mcr.microsoft.com/dotnet/sdk:9.0
```
To start an interactive pwsh shell, run:
```console
docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
```
### Required PowerShell Modules
Prowler relies on several PowerShell cmdlets to retrieve necessary data.
These cmdlets come from different modules that must be installed.
#### Automatic Installation
The required modules are automatically installed when running Prowler with the `--init-modules` flag.
Example command:
```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```
If the modules are already installed, running this command will not cause issues; it simply verifies that the necessary modules are available.
???+ note
Prowler installs the modules using `-Scope CurrentUser`.
If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
The command needed to install a module manually is:
```powershell
Install-Module -Name "ModuleName" -Scope AllUsers -Force
```
#### Modules Version
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (Minimum version: 3.6.0): Required for checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (Minimum version: 6.6.0): Required for all Teams checks.
- [MSAL.PS](https://www.powershellgallery.com/packages/MSAL.PS/4.32.0): Required for the Exchange module via application authentication.
## GitHub
Prowler supports multiple [authentication methods for GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api).
### Supported Authentication Methods
- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**
These options provide flexibility for scanning and analyzing your GitHub account, repositories, organizations, and applications. Choose the authentication method that best suits your security needs.
???+ note
GitHub App Credentials support fewer checks than the other authentication methods.
## Infrastructure as Code (IaC)
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Checkov](https://www.checkov.io/). This provider supports a wide range of IaC frameworks and requires no cloud authentication for local scans.
### Authentication
- For local scans, no authentication is required.
- For remote repository scans, authentication can be provided via:
- [**GitHub Username and Personal Access Token (PAT)**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)
- [**GitHub OAuth App Token**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-fine-grained-personal-access-token)
- [**Git URL**](https://git-scm.com/docs/git-clone#_git_urls)
### Supported Frameworks
The IaC provider leverages Checkov to support multiple frameworks, including:
- Terraform
- CloudFormation
- Kubernetes
- ARM (Azure Resource Manager)
- Serverless
- Dockerfile
- YAML/JSON (generic IaC)
- Bicep
- Helm
- GitHub Actions, GitLab CI, Bitbucket Pipelines, Azure Pipelines, CircleCI, Argo Workflows
- Ansible
- Kustomize
- OpenAPI
- SAST, SCA (Software Composition Analysis)
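To illustrate how a scanner decides which framework applies to a given file, here is a minimal, hypothetical Python sketch. The mapping and function below are illustrative assumptions, not Checkov's actual (much richer, content-aware) detection logic:

```python
# Hypothetical sketch: map a file name to a likely IaC framework.
# The mapping is an illustrative assumption, not Checkov's real logic.
from pathlib import Path

FRAMEWORK_HINTS = {
    ".tf": "terraform",
    ".bicep": "bicep",
    "Dockerfile": "dockerfile",
    "Chart.yaml": "helm",
    "kustomization.yaml": "kustomize",
}

def guess_framework(path: str) -> str:
    """Return a framework name for a path, or 'unknown'."""
    p = Path(path)
    if p.name in FRAMEWORK_HINTS:
        return FRAMEWORK_HINTS[p.name]
    return FRAMEWORK_HINTS.get(p.suffix, "unknown")

print(guess_framework("main.tf"))          # terraform
print(guess_framework("infra/Dockerfile")) # dockerfile
```

In practice, detection also inspects file contents (for example, distinguishing a Kubernetes manifest from a generic YAML file), which is why name-based hints alone are not enough.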
Prowler supports the following providers:
| Provider | Support | Stage | Interface |
|----------|--------|-------|----------|
| **AWS** | Official | Stable | UI, API, CLI |
| **Azure** | Official | Stable | UI, API, CLI |
| **Google Cloud** | Official | Stable | UI, API, CLI |
| **Kubernetes** | Official | Stable | UI, API, CLI |
| **M365** | Official | Stable | UI, API, CLI |
| **Github** | Official | Stable | UI, API, CLI |
| **IaC** | Official | Beta | CLI |
| **MongoDB Atlas** | Official | Beta | CLI |
| **NHN** | Unofficial | Beta | CLI |
Prowler supports **auditing, incident response, continuous monitoring, hardening, forensic readiness, and remediation**.
# Security
## Compliance and Trust
We publish our live SOC 2 Type 2 Compliance data at [https://trust.prowler.com](https://trust.prowler.com)
As an **AWS Partner**, we have passed the [AWS Foundation Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/).
## Encryption (Prowler Cloud)
We use encryption everywhere possible. The data and communications used by **Prowler Cloud** are **encrypted at-rest** and **in-transit**.
## Data Retention Policy (Prowler Cloud)
Prowler Cloud is GDPR compliant in regards to personal data and the ["right to be forgotten"](https://gdpr.eu/right-to-be-forgotten/). When a user deletes their account their user information will be deleted from Prowler Cloud online and backup systems within 10 calendar days.
## Software Security
We follow a **security-by-design approach** throughout our software development lifecycle. All changes go through automated checks at every stage, from local development to production deployment. We use the following tools and automation to keep our code secure and our dependencies up to date:
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for container security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
We enforce [pre-commit](https://github.com/prowler-cloud/prowler/blob/master/.pre-commit-config.yaml) validations to catch issues early, and [our CI/CD pipelines](https://github.com/prowler-cloud/prowler/tree/master/.github) include multiple security gates to ensure code quality, secure configurations, and compliance with internal standards.
Our container registries are continuously scanned for vulnerabilities, with findings automatically reported to our security team for assessment and remediation. This process evolves alongside our stack as we adopt new languages, frameworks, and technologies, ensuring our security practices remain comprehensive, proactive, and adaptable.
## Reporting Vulnerabilities
At Prowler, we consider the security of our open source software and systems a top priority. But no matter how much effort we put into system security, there can still be vulnerabilities present.
If you discover a vulnerability, we would like to know about it so we can take steps to address it as quickly as possible. We would like to ask you to help us better protect our users, our clients and our systems.
If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or the Prowler Cloud service, please submit the information by contacting us via [**support.prowler.com**](http://support.prowler.com).
The information you share with the Prowler team as part of this process is kept confidential within Prowler. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
We will review the submitted report and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process. You will receive a non-automated response to your initial contact within 24 hours, confirming receipt of your reported vulnerability.
We will coordinate public notification of any validated vulnerability with you. Where possible, we prefer that our respective public disclosures be posted simultaneously.
When reporting vulnerabilities, please consider (1) attack scenario / exploitability, and (2) the security impact of the bug. The following issues are considered out of scope:
- Social engineering support or attacks requiring social engineering.
- Clickjacking on pages with no sensitive actions.
- Cross-Site Request Forgery (CSRF) on unauthenticated forms or forms with no sensitive actions.
- Attacks requiring Man-In-The-Middle (MITM) or physical access to a user's device.
- Previously known vulnerable libraries without a working Proof of Concept (PoC).
- Comma Separated Values (CSV) injection without demonstrating a vulnerability.
- Missing best practices in SSL/TLS configuration.
- Any activity that could lead to the disruption of service (DoS).
- Rate limiting or brute force issues on non-authentication endpoints.
- Missing best practices in Content Security Policy (CSP).
- Missing HttpOnly or Secure flags on cookies.
- Configuration of or missing security headers.
- Missing email best practices, such as invalid, incomplete, or missing SPF/DKIM/DMARC records.
- Vulnerabilities only affecting users of outdated or unpatched browsers (less than two stable versions behind).
- Software version disclosure, banner identification issues, or descriptive error messages.
- Tabnabbing.
- Issues that require unlikely user interaction.
- Improper logout functionality and improper session timeout.
- CORS misconfiguration without an exploitation scenario.
- Broken link hijacking.
- Automated scanning results (e.g., sqlmap, Burp active scanner) that have not been manually verified.
- Content spoofing and text injection issues without a clear attack vector.
- Email spoofing without exploiting security flaws.
- Dead links or broken links.
- User enumeration.
Testing guidelines:
- Do not run automated scanners on other customers' projects. Automated scanners can run up costs for our users, and aggressively configured scanners might inadvertently disrupt services, exploit vulnerabilities, cause system instability or breaches, and violate the Terms of Service of our upstream providers. Our own security systems cannot distinguish hostile reconnaissance from white-hat research. If you wish to run an automated scanner, notify us at support@prowler.com and only run it against your own Prowler App project. Do NOT attack Prowler resources in use by other customers.
- Do not take advantage of the vulnerability or problem you have discovered, for example by downloading more data than necessary to demonstrate the vulnerability or deleting or modifying other people's data.
Reporting guidelines:
- File a report through our Support Desk at https://support.prowler.com
- If it is about a lack of a security functionality, please file a feature request instead at https://github.com/prowler-cloud/prowler/issues
- Do provide sufficient information to reproduce the problem, so we will be able to resolve it as quickly as possible.
- If you have further questions and want direct interaction with the Prowler team, please contact us via our Community Slack at goto.prowler.com/slack.
Disclosure guidelines:
- In order to protect our users and customers, do not reveal the problem to others until we have researched, addressed and informed our affected customers.
- If you want to publicly share your research about Prowler at a conference, in a blog or any other public forum, you should share a draft with us for review and approval at least 30 days prior to the publication date. Please note that the following should not be included:
- Data regarding any Prowler user or customer projects.
- Prowler customers' data.
- Information about Prowler employees, contractors or partners.
What we promise:
- We will respond to your report within 5 business days with our evaluation of the report and an expected resolution date.
- If you have followed the instructions above, we will not take any legal action against you in regard to the report.
- We will handle your report with strict confidentiality, and not pass on your personal details to third parties without your permission.
- We will keep you informed of the progress towards resolving the problem.
- In the public information concerning the problem reported, we will give your name as the discoverer of the problem (unless you desire otherwise).
We strive to resolve all problems as quickly as possible, and we would like to play an active role in the ultimate publication on the problem after it is resolved.
# AWS Authentication in Prowler
Prowler requires AWS credentials to function properly. Ensure that the AWS CLI is correctly configured or declare AWS credentials manually before running scans.
## Required Permissions
To ensure full functionality, attach the following AWS managed policies to the designated user or role:
- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`
### Additional Permissions
For certain checks, additional read-only permissions are required. Attach the following custom policy to your role: [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json)
## Configure AWS Credentials
```console
export AWS_ACCESS_KEY_ID="XXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
export AWS_SESSION_TOKEN="XXXXXXXXX"
```
These credentials must be associated with a user or role with the necessary permissions to perform security checks.
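As a pre-flight sanity check, the presence of these variables can be verified before launching a scan. A minimal Python sketch (not part of Prowler; the variable names are the standard AWS SDK ones):

```python
# Pre-flight sanity check (not part of Prowler): confirm the static AWS
# credential variables are exported before launching a scan.
# AWS_SESSION_TOKEN is omitted on purpose: it is only needed for
# temporary credentials.
import os

REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def missing_aws_credentials(env=os.environ):
    """Return the names of required credential variables that are unset."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

missing = missing_aws_credentials()
if missing:
    print("Export these variables before running `prowler aws`:", missing)
```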
???+ note
    Some security checks require additional read-only permissions. Attach the following custom policy to the role: [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json). If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure to also attach the custom policy: [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
## AWS Profiles and Service Scanning in Prowler
Prowler supports authentication and security assessments using custom AWS profiles and can optionally scan unused services.
**Using Custom AWS Profiles**
Specify a custom AWS profile using the following command:
```console
prowler aws -p/--profile <profile_name>
```
## Multi-Factor Authentication (MFA)
For IAM entities requiring Multi-Factor Authentication (MFA), use the `--mfa` flag. Prowler prompts for the following values to initiate a new session:
- **ARN of your MFA device**
- **TOTP (Time-Based One-Time Password)**
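For illustration, the TOTP value is derived from a shared secret per RFC 6238 (HMAC-SHA-1, 30-second time steps, 6 digits by default). Your authenticator app does this for you; this sketch only shows what Prowler is asking for:

```python
# Illustration of how a TOTP value is derived (RFC 6238: HMAC-SHA-1,
# 30-second time steps, 6 digits). Authenticator apps do this for you.
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6):
    """Compute the current (or given-time) TOTP for a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    step = int((time.time() if for_time is None else for_time) // 30)
    mac = hmac.digest(key, struct.pack(">Q", step), "sha1")
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32), time = 59s:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))  # 287082
```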
This method grants permanent access and is the recommended setup for production.
![External ID](./img/prowler-cloud-external-id.png)
![Stack Data](./img/fill-stack-data.png)
!!! info
    An **External ID** is required when assuming the *ProwlerScan* role to comply with AWS [confused deputy prevention](https://docs.aws.amazon.com/IAM/latest/UserGuide/confused-deputy.html).
6. Acknowledge the IAM resource creation warning and proceed
![Stack Creation Second Step](./img/stack-creation-second-step.png)
# AWS Organizations in Prowler
Prowler can integrate with AWS Organizations to manage the visibility and onboarding of accounts centrally.
When trusted access is enabled with the Organization, Prowler can discover accounts as they are created and even automate deployment of the Prowler Scan IAM Role.
> Trusted access can be enabled in the Management Account from the AWS Console under **AWS Organizations → Settings → Trusted access for AWS CloudFormation StackSets**.
If you are not using StackSets or Prowler Cloud and only need to scan AWS Organization accounts with the CLI, you can assume a role in each account manually or automate that logic with custom scripts.
## Retrieving AWS Account Details
If AWS Organizations is enabled, Prowler can fetch detailed account information during scans, including:
Prowler will scan the AWS account and get the account details from AWS Organizations.
### Handling JSON Output
In Prowler's JSON output, tags are encoded in Base64 to prevent formatting errors in CSV or JSON outputs. This ensures compatibility when exporting findings.
```json
"Account Email": "my-prod-account@domain.com",
```

The additional fields in CSV header output are as follows:
- ACCOUNT\_DETAILS\_ORG
- ACCOUNT\_DETAILS\_TAGS
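Because the tags field is Base64-encoded, it must be decoded before post-processing. A small Python sketch (the sample value is illustrative, not taken from a real scan):

```python
# Decode the Base64-encoded account tags found in Prowler's output.
# The sample value below is illustrative.
import base64

def decode_tags(encoded: str) -> str:
    """Return the original tag string from its Base64 form."""
    return base64.b64decode(encoded).decode("utf-8")

sample = base64.b64encode(b"env=prod | team=security").decode("ascii")
print(decode_tags(sample))  # env=prod | team=security
```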
## Deploying Prowler IAM Roles Across AWS Organizations
When onboarding multiple AWS accounts into Prowler Cloud, the Prowler Scan IAM Role must be deployed in each account. The most efficient way to do this across an AWS Organization is to leverage AWS CloudFormation StackSets, which roll out infrastructure such as IAM roles to all accounts centrally from the Management or Delegated Admin account.
When using Infrastructure as Code (IaC), Terraform is recommended to manage this deployment systematically.
### Recommended Approach
- **Use StackSets** from the **Management Account** (or a Delegated Admin/Security Account).
- **Use Terraform** to orchestrate the deployment.
- **Use the official CloudFormation template** provided by Prowler.
- Target specific Organizational Units (OUs) or the entire Organization.
???+ note
    A detailed community article that this implementation is based on is available here:
    [Deploy IAM Roles Across an AWS Organization as Code (Unicrons)](https://unicrons.cloud/en/2024/10/14/deploy-iam-roles-across-an-aws-organization-as-code/)
    This guide has been adapted with permission and aligned with Prowler's IAM role requirements.
---
### Step-by-Step Guide Using Terraform
Below is a ready-to-use Terraform snippet that deploys the [Prowler Scan IAM Role CloudFormation template](https://github.com/prowler-cloud/prowler/blob/master/permissions/templates/cloudformation/prowler-scan-role.yml) across the AWS Organization using StackSets:
```hcl title="main.tf"
data "aws_caller_identity" "this" {}
data "aws_organizations_organization" "this" {}

module "prowler-scan-role" {
  source = "unicrons/organization-iam-role/aws"

  stack_set_name        = "prowler-scan-role"
  stack_set_description = "Deploy Prowler Scan IAM Role across all organization accounts"
  template_path         = "${path.root}/prowler-scan-role.yaml"

  template_parameters = {
    ExternalId = "<< external ID >>" # Replace with the External ID provided by Prowler Cloud
  }

  # Specific OU IDs can be specified instead of root
  organizational_unit_ids = [data.aws_organizations_organization.this.roots[0].id]
}
```
#### `prowler-scan-role.yaml`
Download or reference the official CloudFormation template directly from GitHub:
- [prowler-scan-role.yml](https://github.com/prowler-cloud/prowler/blob/master/permissions/templates/cloudformation/prowler-scan-role.yml)
---
### IAM Role: External ID Support
Include the `ExternalId` parameter in the StackSet if required by the organization's Prowler Cloud setup. This ensures secure cross-account access for scanning.
---
When encountering issues during deployment or needing to target specific OUs or environments (e.g., dev/staging/prod), reach out to the Prowler team via [Slack Community](https://prowler.com/slack) or [Support](mailto:support@prowler.com).
## Running Prowler Across All AWS Organization Accounts by Assuming Roles
AWS Security Hub can be enabled using either of the following methods:
#### Enabling AWS Security Hub for Prowler Integration
If AWS Security Hub is already enabled, you can proceed to the [next section](#enabling-prowler-integration-in-aws-security-hub).
1. Enable AWS Security Hub via Console: Open the **AWS Security Hub** console: https://console.aws.amazon.com/securityhub/.
#### Enabling Prowler Integration in AWS Security Hub
If the Prowler integration is already enabled in AWS Security Hub, you can proceed to the [next section](#sending-findings-to-aws-security-hub) and begin sending findings.
Once **AWS Security Hub** is activated, **Prowler** must be enabled as a partner integration to allow security findings to be sent to it.
```console
prowler --security-hub --role arn:aws:iam::123456789012:role/ProwlerExecutionRole
```
???+ note
    The specified IAM role must have the necessary permissions to send findings to Security Hub. For details on the required permissions, refer to the IAM policy: [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json)
## Sending Only Failed Findings to AWS Security Hub
# Azure Authentication in Prowler
Prowler for Azure supports multiple authentication types. To use a specific method, pass the appropriate flag during execution:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- Existing **AZ CLI credentials**
- **Interactive browser authentication**
- [**Managed Identity**](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
> ⚠️ **Important:** For Prowler App, only Service Principal authentication is supported.
### Service Principal Application Authentication
Enable Prowler authentication using a Service Principal Application by setting up the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXX"
```
Execution with the `--sp-env-auth` flag fails if these variables are not set or exported.

Refer to the [Create Prowler Service Principal](create-prowler-service-principal.md) guide for detailed setup instructions.
### Azure Authentication Methods
Prowler for Azure supports the following authentication methods:
- **AZ CLI Authentication (`--az-cli-auth`)**: Automated authentication using stored AZ CLI credentials.
- **Managed Identity Authentication (`--managed-identity-auth`)**: Automated authentication via Azure Managed Identity.
- **Browser Authentication (`--browser-auth`)**: Requires the user to authenticate using the default browser. The `tenant-id` parameter is mandatory for this method.
### Required Permissions
Prowler for Azure requires two types of permission scopes:
#### Microsoft Entra ID Permissions
These permissions allow Prowler to retrieve metadata from the assumed identity and perform specific Entra checks. While not mandatory for execution, they enhance functionality.
Required permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All` (used for Entra multifactor authentication checks)
???+ note
    Replace `Directory.Read.All` with `Domain.Read.All` for more restrictive permissions. Note that Entra checks related to DirectoryRoles and GetUsers will not run with this permission.
#### Subscription Scope Permissions
These permissions are required to perform security checks against Azure resources. The following **RBAC roles** must be assigned per subscription to the entity used by Prowler:
- `Reader`: Grants read-only access to Azure resources.
- `ProwlerRole`: A custom role with minimal permissions, defined in the [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json).
???+ note
    The `assignableScopes` field in the JSON custom role file must be updated to reflect the correct subscription or management group. Use one of the following formats: `/subscriptions/<subscription-id>` or `/providers/Microsoft.Management/managementGroups/<management-group-id>`.
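Updating that field can be scripted. A small Python sketch (a hypothetical helper, not an official Prowler utility):

```python
# Hypothetical helper (not an official Prowler utility): point the custom
# role's assignableScopes at a single subscription before creating the role.
import json

def set_assignable_scope(role_json: str, subscription_id: str) -> str:
    """Rewrite assignableScopes to target a single subscription."""
    role = json.loads(role_json)
    role["assignableScopes"] = [f"/subscriptions/{subscription_id}"]
    return json.dumps(role, indent=2)

sample = '{"Name": "ProwlerRole", "assignableScopes": []}'
print(set_assignable_scope(sample, "00000000-1111-2222-3333-444444444444"))
```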
### Assigning Permissions
To properly configure permissions, follow these guides:
- [Microsoft Entra ID permissions](create-prowler-service-principal.md#assigning-proper-permissions)
- [Azure subscription permissions](subscriptions.md)
???+ warning
    Some permissions in `ProwlerRole` involve **write access**. If a `ReadOnly` lock is attached to certain resources, you may encounter errors, and findings for those checks will not be available.
#### Checks Requiring `ProwlerRole`
The following security checks require the `ProwlerRole` permissions for execution. Ensure the role is assigned to the identity assumed by Prowler before running these checks:
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`
Permissions can be assigned via the Azure Portal or the Azure CLI.
7. Finally, search for "Directory", "Policy" and "UserAuthenticationMethod" and select the following permissions:
- `Directory.Read.All`
- `Policy.Read.All`
Set up your Azure subscription to enable security scanning using Prowler Cloud/App.
???+ note "Government Cloud Support"
    Government cloud subscriptions (Azure Government) are not currently supported, but we expect to add support for them in the near future.
## Requirements
To configure your Azure subscription, you'll need:
Prowler allows you to specify one or more subscriptions for scanning (up to N).
To perform scans, ensure that the identity assumed by Prowler has the appropriate permissions.
By default, Prowler scans all accessible subscriptions. If you need to audit specific subscriptions, you must assign the necessary role `Reader` for each one. For streamlined and less repetitive role assignments in multi-subscription environments, refer to the [following section](#recommendation-for-managing-multiple-subscriptions).
### Assigning the Reader Role in Azure Portal
Navigate to the subscription you want to audit with Prowler.
Some read-only permissions required for specific security checks are not included in the built-in Reader role. To support these checks, Prowler utilizes a custom role, defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json). Once created, this role can be assigned following the same process as the `Reader` role.
The checks requiring this `ProwlerRole` can be found in this [section](../../tutorials/azure/authentication.md#checks-requiring-prowlerrole).
#### Create ProwlerRole via Azure Portal
Scanning multiple subscriptions requires creating and assigning roles for each.
![Create management group](../../img/create-management-group.gif)
2. **Assign Roles**: Assign necessary roles to the management group, similar to the [role assignment process](#assigning-permissions-for-subscription-scans).
Role assignment should be done at the management group level instead of per subscription.
# Bulk Provider Provisioning in Prowler
Prowler enables automated provisioning of multiple cloud providers through the Bulk Provider Provisioning tool. This approach streamlines the onboarding process for organizations managing numerous cloud accounts, subscriptions, and projects across AWS, Azure, GCP, Kubernetes, Microsoft 365, and GitHub.
The tool is available in the Prowler repository at: [util/prowler-bulk-provisioning](https://github.com/prowler-cloud/prowler/tree/master/util/prowler-bulk-provisioning)
![](./img/bulk-provider-provisioning.png)
## Overview
The Bulk Provider Provisioning tool automates the creation of cloud providers in Prowler App or Prowler Cloud by:
* Reading provider configurations from YAML files
* Creating providers with appropriate authentication credentials
* Testing connections to verify successful authentication
* Processing multiple providers concurrently for efficiency
## Prerequisites
### Requirements
* Python 3.7 or higher
* Prowler API token (from Prowler Cloud or self-hosted Prowler App)
    * For self-hosted Prowler App, remember to [point to your API base URL](#custom-api-endpoints)
* Authentication credentials for target cloud providers
### Installation
Clone the repository and install the required dependencies:
```bash
git clone https://github.com/prowler-cloud/prowler.git
cd prowler/util/prowler-bulk-provisioning
pip install -r requirements.txt
```
### Authentication Setup
Configure your Prowler API token:
```bash
export PROWLER_API_TOKEN="your-prowler-api-token"
```
To obtain an API token programmatically:
```bash
export PROWLER_API_TOKEN=$(curl --location 'https://api.prowler.com/api/v1/tokens' \
  --header 'Content-Type: application/vnd.api+json' \
  --header 'Accept: application/vnd.api+json' \
  --data-raw '{
    "data": {
      "type": "tokens",
      "attributes": {
        "email": "your@email.com",
        "password": "your-password"
      }
    }
  }' | jq -r .data.attributes.access)
```
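The same token request can be made from Python using only the standard library. A sketch mirroring the curl call above (use it only with your own Prowler account credentials):

```python
# Standard-library Python equivalent of the documented token request.
import json
import urllib.request

def build_token_request(email: str, password: str) -> dict:
    """Build the JSON:API payload the /tokens endpoint expects."""
    return {"data": {"type": "tokens",
                     "attributes": {"email": email, "password": password}}}

def fetch_token(email, password, url="https://api.prowler.com/api/v1/tokens"):
    """POST the payload and return the access token string."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_token_request(email, password)).encode(),
        headers={"Content-Type": "application/vnd.api+json",
                 "Accept": "application/vnd.api+json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]["attributes"]["access"]
```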
## Configuration File Structure
Create a YAML file listing your cloud providers and credentials:
```yaml
# providers.yaml
- provider: aws
  uid: "123456789012" # AWS Account ID
  alias: "production-account"
  auth_method: role
  credentials:
    role_arn: "arn:aws:iam::123456789012:role/ProwlerScanRole"
    external_id: "prowler-external-id"

- provider: azure
  uid: "00000000-1111-2222-3333-444444444444" # Subscription ID
  alias: "azure-production"
  auth_method: service_principal
  credentials:
    tenant_id: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
    client_id: "ffffffff-1111-2222-3333-444444444444"
    client_secret: "your-client-secret"

- provider: gcp
  uid: "my-gcp-project" # Project ID
  alias: "gcp-production"
  auth_method: service_account
  credentials:
    service_account_key_json_path: "./service-account.json"
```
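Before running the tool, entries can be sanity-checked for the fields used in the examples above. A minimal Python sketch (the required-key set is an assumption based on this documentation, not the tool's actual schema):

```python
# Illustrative pre-check, not the tool's real schema: verify each provider
# entry carries the fields used throughout the examples on this page.
REQUIRED_KEYS = {"provider", "uid", "auth_method", "credentials"}

def validate_entries(entries):
    """Return (index, sorted missing keys) for every incomplete entry."""
    problems = []
    for i, entry in enumerate(entries):
        missing = REQUIRED_KEYS - set(entry)
        if missing:
            problems.append((i, sorted(missing)))
    return problems

entries = [
    {"provider": "aws", "uid": "123456789012", "auth_method": "role",
     "credentials": {"role_arn": "arn:aws:iam::123456789012:role/ProwlerScanRole"}},
    {"provider": "gcp", "uid": "my-gcp-project"},  # incomplete on purpose
]
print(validate_entries(entries))  # [(1, ['auth_method', 'credentials'])]
```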
## Running the Bulk Provisioning Tool
### Basic Usage
To provision all providers from your configuration file:
```bash
python prowler_bulk_provisioning.py providers.yaml
```
The tool automatically tests each provider connection after creation (enabled by default).
### Dry Run Mode
Test your configuration without making API calls:
```bash
python prowler_bulk_provisioning.py providers.yaml --dry-run
```
### Skip Connection Testing
To provision providers without testing connections:
```bash
python prowler_bulk_provisioning.py providers.yaml --test-provider false
```
### Test Existing Providers Only
To verify connections for already provisioned providers:
```bash
python prowler_bulk_provisioning.py providers.yaml --test-provider-only
```
## Provider-Specific Configuration
### AWS Provider Configuration
#### Using IAM Role (Recommended)
```yaml
- provider: aws
  uid: "123456789012"
  alias: "aws-production"
  auth_method: role
  credentials:
    role_arn: "arn:aws:iam::123456789012:role/ProwlerScanRole"
    external_id: "optional-external-id"
    session_name: "prowler-scan-session" # optional
    duration_seconds: 3600 # optional
```
#### Using Access Keys
```yaml
- provider: aws
  uid: "123456789012"
  alias: "aws-development"
  auth_method: credentials
  credentials:
    access_key_id: "AKIA..."
    secret_access_key: "..."
    session_token: "..." # optional for temporary credentials
```
### Azure Provider Configuration
```yaml
- provider: azure
  uid: "subscription-uuid"
  alias: "azure-production"
  auth_method: service_principal
  credentials:
    tenant_id: "tenant-uuid"
    client_id: "client-uuid"
    client_secret: "client-secret"
```
### GCP Provider Configuration
#### Using Service Account JSON
```yaml
- provider: gcp
  uid: "project-id"
  alias: "gcp-production"
  auth_method: service_account
  credentials:
    service_account_key_json_path: "/path/to/key.json"
```
#### Using OAuth2 Credentials
```yaml
- provider: gcp
uid: "project-id"
alias: "gcp-production"
auth_method: oauth2
credentials:
client_id: "123456789.apps.googleusercontent.com"
client_secret: "GOCSPX-xxxx"
refresh_token: "1//0exxxxxx"
```
### Kubernetes Provider Configuration
```yaml
- provider: kubernetes
uid: "context-name"
alias: "eks-production"
auth_method: kubeconfig
credentials:
kubeconfig_path: "~/.kube/config"
# OR inline configuration:
# kubeconfig_inline: |
# apiVersion: v1
# clusters: ...
```
### Microsoft 365 Provider Configuration
```yaml
- provider: m365
uid: "domain.onmicrosoft.com"
alias: "m365-tenant"
auth_method: service_principal
credentials:
tenant_id: "tenant-uuid"
client_id: "client-uuid"
client_secret: "client-secret"
```
### GitHub Provider Configuration
#### Using Personal Access Token
```yaml
- provider: github
uid: "organization-name"
alias: "github-org"
auth_method: personal_access_token
credentials:
token: "ghp_..."
```
#### Using GitHub App
```yaml
- provider: github
uid: "organization-name"
alias: "github-org"
auth_method: github_app
credentials:
app_id: "123456"
private_key_path: "/path/to/private-key.pem"
```
## Advanced Configuration
### Concurrent Processing
Adjust the number of concurrent provider creations:
```bash
python prowler_bulk_provisioning.py providers.yaml --concurrency 10
```
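A worker pool bounded by the concurrency limit is the usual way to implement this. A minimal sketch with Python's standard `concurrent.futures` (the script's real implementation may differ):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def provision_concurrently(entries, provision_one, concurrency=4):
    """Run provision_one over all entries with at most `concurrency` in flight."""
    results = {}
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        # Submit every entry; the pool caps how many run at once.
        futures = {pool.submit(provision_one, e): e["alias"] for e in entries}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```

Raising the limit speeds up large batches but puts more simultaneous load on the API, so increase it gradually.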
### Custom API Endpoints
For self-hosted Prowler App installations:
```bash
python prowler_bulk_provisioning.py providers.yaml \
--base-url http://localhost:8080/api/v1
```
### Timeout Configuration
Set custom timeout for API requests:
```bash
python prowler_bulk_provisioning.py providers.yaml --timeout 120
```
## Bulk Provider Management
### Deleting Multiple Providers
To remove all providers from your Prowler account:
```bash
python nuke_providers.py --confirm
```
Filter deletions by provider type:
```bash
python nuke_providers.py --confirm --filter-provider aws
```
Filter deletions by alias pattern:
```bash
python nuke_providers.py --confirm --filter-alias "test-*"
```
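Patterns like `test-*` are shell-style globs, which Python's `fnmatch` module matches directly. A sketch of how such filtering might work (`select_providers` and its parameters are illustrative, not the script's API):

```python
from fnmatch import fnmatch

def select_providers(providers, provider_type=None, alias_pattern=None):
    """Return providers matching the optional type and glob alias filters."""
    selected = []
    for p in providers:
        if provider_type and p["provider"] != provider_type:
            continue  # wrong provider type
        if alias_pattern and not fnmatch(p["alias"], alias_pattern):
            continue  # alias does not match the glob
        selected.append(p)
    return selected
```

Both filters can be combined, mirroring `--filter-provider aws --filter-alias "test-*"`.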
## Configuration File Format
The tool uses YAML format for provider configuration files. Each provider entry requires:
* `provider`: The cloud provider type (aws, azure, gcp, kubernetes, m365, github)
* `uid`: Unique identifier for the provider (account ID, subscription ID, project ID, etc.)
* `alias`: A friendly name for the provider
* `auth_method`: Authentication method to use
* `credentials`: Authentication credentials specific to the provider and method
Example YAML structure:
```yaml
- provider: aws
uid: "123456789012"
alias: "production"
auth_method: role
credentials:
role_arn: "arn:aws:iam::123456789012:role/ProwlerScan"
```
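Before calling the API, a loader can validate these required fields up front. A minimal sketch using the field names listed above (the helper name and error wording are illustrative):

```python
REQUIRED_FIELDS = ("provider", "uid", "alias", "auth_method", "credentials")
VALID_PROVIDERS = {"aws", "azure", "gcp", "kubernetes", "m365", "github"}

def validate_entry(entry: dict) -> list:
    """Return a list of problems with a single provider entry (empty if valid)."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in entry]
    if entry.get("provider") not in VALID_PROVIDERS:
        problems.append(f"unknown provider: {entry.get('provider')}")
    return problems
```

Running this over every entry before provisioning surfaces all configuration mistakes in one pass instead of failing mid-batch.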
## Example Output
Successful provider provisioning:
```
[1] ✅ Created provider (id=db9a8985-f9ec-4dd8-b5a0-e05ab3880bed)
[1] ✅ Created secret (id=466f76c6-5878-4602-a4bc-13f9522c1fd2)
[1] ✅ Connection test: Connected
[2] ✅ Created provider (id=7a99f789-0cf5-4329-8279-2d443a962676)
[2] ✅ Created secret (id=c5702180-f7c4-40fd-be0e-f6433479b126)
[2] ⚠️ Connection test: Not connected
Done. Success: 2 Failures: 0
```
## Troubleshooting
### Invalid API Token
```
Error: 401 Unauthorized
Solution: Verify your PROWLER_API_TOKEN or --token parameter
```
### Network Timeouts
```
Error: Connection timeout
Solution: Increase timeout with --timeout 120
```
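Beyond raising `--timeout`, transient timeouts are commonly handled with retries and exponential backoff. A generic sketch (the bulk provisioning script itself may not implement retries):

```python
import time

def with_retries(call, attempts=3, base_delay=1.0, retry_on=(TimeoutError,)):
    """Call `call()`, retrying on the given exceptions with exponential backoff."""
    for attempt in range(attempts):
        try:
            return call()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Wrapping each API request this way smooths over brief network blips without masking persistent failures.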
### Provider Already Exists
```
Error: Provider with this UID already exists
Solution: Use different UID or delete existing provider first
```
### Authentication Failures
```
Connection test: Not connected
Solution: Verify credentials and IAM permissions
```
## List Available Compliance Frameworks
To see which compliance frameworks are covered by Prowler, use the `--list-compliance` option:
```sh
prowler <provider> --list-compliance
```
### AWS (36 frameworks)
- `aws_account_security_onboarding_aws`
- `aws_audit_manager_control_tower_guardrails_aws`
- `aws_foundational_security_best_practices_aws`
- `aws_foundational_technical_review_aws`
- `aws_well_architected_framework_reliability_pillar_aws`
- `aws_well_architected_framework_security_pillar_aws`
- `cis_1.4_aws`
- `cis_1.5_aws`
- `cis_2.0_aws`
- `cis_3.0_aws`
- `cis_4.0_aws`
- `cis_5.0_aws`
- `cisa_aws`
- `ens_rd2022_aws`
- `fedramp_low_revision_4_aws`
- `fedramp_moderate_revision_4_aws`
- `ffiec_aws`
- `gdpr_aws`
- `gxp_21_cfr_part_11_aws`
- `gxp_eu_annex_11_aws`
- `hipaa_aws`
- `iso27001_2013_aws`
- `iso27001_2022_aws`
- `kisa_isms_p_2023_aws`
- `kisa_isms_p_2023_korean_aws`
- `mitre_attack_aws`
- `nis2_aws`
- `nist_800_171_revision_2_aws`
- `nist_800_53_revision_4_aws`
- `nist_800_53_revision_5_aws`
- `nist_csf_1.1_aws`
- `pci_3.2.1_aws`
- `pci_4.0_aws`
- `prowler_threatscore_aws`
- `rbi_cyber_security_framework_aws`
- `soc2_aws`
### Azure (10 frameworks)
- `cis_2.0_azure`
- `cis_2.1_azure`
- `cis_3.0_azure`
- `ens_rd2022_azure`
- `iso27001_2022_azure`
- `mitre_attack_azure`
- `nis2_azure`
- `pci_4.0_azure`
- `prowler_threatscore_azure`
- `soc2_azure`
### GCP (10 frameworks)
- `cis_2.0_gcp`
- `cis_3.0_gcp`
- `cis_4.0_gcp`
- `ens_rd2022_gcp`
- `iso27001_2022_gcp`
- `mitre_attack_gcp`
- `nis2_gcp`
- `pci_4.0_gcp`
- `prowler_threatscore_gcp`
- `soc2_gcp`
### Kubernetes (5 frameworks)
- `cis_1.10_kubernetes`
- `cis_1.11_kubernetes`
- `cis_1.8_kubernetes`
- `iso27001_2022_kubernetes`
- `pci_4.0_kubernetes`
### M365 (3 frameworks)
- `cis_4.0_m365`
- `iso27001_2022_m365`
- `prowler_threatscore_m365`
### GitHub (1 framework)
- `cis_1.0_github`
Alternatively, you can visit [Prowler Hub](https://hub.prowler.com/compliance).
## List Requirements of Compliance Frameworks
To list requirements for a compliance framework, use the `--list-compliance-requirements` option:
```sh
prowler <provider> --list-compliance-requirements <compliance_framework(s)>
```
The following list includes all the Azure checks with configurable variables:

| Check Name | Variable | Type |
|------------|----------|------|
| `vm_sufficient_daily_backup_retention_period` | `vm_backup_min_daily_retention_days` | Integer |
| `vm_desired_sku_size` | `desired_vm_sku_sizes` | List of Strings |
| `defender_attack_path_notifications_properly_configured` | `defender_attack_path_minimal_risk_level` | String |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_threshold` | Float |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_minutes` | Integer |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_actions` | List of Strings |
## GCP
"Standard_DS3_v2",
"Standard_D4s_v3",
]
# Azure VM Backup Configuration
# azure.vm_sufficient_daily_backup_retention_period
vm_backup_min_daily_retention_days: 7
# Azure API Management Threat Detection Configuration
# azure.apim_threat_detection_llm_jacking
apim_threat_detection_llm_jacking_threshold: 0.1
apim_threat_detection_llm_jacking_minutes: 1440
apim_threat_detection_llm_jacking_actions:
[
# OpenAI API endpoints
"ImageGenerations_Create",
"ChatCompletions_Create",
"Completions_Create",
"Embeddings_Create",
"FineTuning_Jobs_Create",
"Models_List",
# Azure OpenAI endpoints
"Deployments_List",
"Deployments_Get",
"Deployments_Create",
"Deployments_Delete",
# Anthropic endpoints
"Messages_Create",
"Claude_Create",
# Google AI endpoints
"GenerateContent",
"GenerateText",
"GenerateImage",
# Meta AI endpoints
"Llama_Create",
"CodeLlama_Create",
# Other LLM endpoints
"Gemini_Generate",
"Claude_Generate",
"Llama_Generate"
]
# GCP Configuration
gcp:
# Prowler Fixers (Remediations)
Prowler allows you to fix some of the failed findings it identifies. You can use the `--fixer` flag to run the fixes that are available for the checks that failed.
# GCP Authentication in Prowler
## Required Permissions
Prowler for Google Cloud requires the following permissions:
### IAM Roles
- **Reader (`roles/reader`)**: Must be granted at the **project, folder, or organization** level to allow scanning of target projects.
### Project-Level Settings
At least one project must have the following configurations:
- **Identity and Access Management (IAM) API (`iam.googleapis.com`)**: Must be enabled via:
    - The [Google Cloud API UI](https://console.cloud.google.com/apis/api/iam.googleapis.com/metrics), or
    - The `gcloud` CLI:
    ```sh
    gcloud services enable iam.googleapis.com --project <your-project-id>
    ```
- **Service Usage Consumer (`roles/serviceusage.serviceUsageConsumer`)**: IAM role required for resource scanning.
- **Quota Project Setting**: Define a quota project using either:
    - The `gcloud` CLI:
    ```sh
    gcloud auth application-default set-quota-project <project-id>
    ```
    - An environment variable:
    ```sh
    export GOOGLE_CLOUD_QUOTA_PROJECT=<project-id>
    ```
???+ note
`prowler` will scan the GCP project associated with the credentials.
## Credentials lookup order
Prowler follows the same credential search process as [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order), checking credentials in this order:
Prowler will use the enabled Google Cloud APIs to get the information needed to perform the checks.
## Using an Access Token
For existing access tokens (e.g., generated with `gcloud auth print-access-token`), run Prowler with:
```bash
export CLOUDSDK_AUTH_ACCESS_TOKEN=$(gcloud auth print-access-token)
prowler gcp --project-ids <project-id>
```
???+ note
When using this method, also set the default project explicitly:
```bash
export GOOGLE_CLOUD_PROJECT=<project-id>
```
## Impersonating a GCP Service Account
To impersonate a GCP service account, use the `--impersonate-service-account` argument followed by the service account email:
# GitHub Authentication in Prowler
Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:
- **OAuth App Token**
- **GitHub App Credentials**
This flexibility enables scanning and analysis of GitHub accounts, including repositories, organizations, and applications, using the method that best suits the use case.
## Supported Login Methods
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using the following environment variables, in this order:
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY` (where the key is the content of the private key file)
???+ note
Ensure the corresponding environment variables are set up before running Prowler for automatic detection when not specifying the login method.
This guide explains how to set up authentication with GitHub for Prowler.
### 1. Personal Access Token (PAT)
Personal Access Tokens provide the simplest GitHub authentication method, but they can only access resources owned by a single user or organization.
???+ warning "Classic Tokens Deprecated"
GitHub has deprecated Personal Access Tokens (classic) in favor of fine-grained Personal Access Tokens. We recommend using fine-grained tokens as they provide better security through more granular permissions and resource-specific access control.
To enable Prowler functionality, configure the following permissions:
- **Repository permissions:**
- **Administration**: Read-only access
- **Contents**: Read-only access
- **Metadata**: Read-only access
- **Pull requests**: Read-only access
- **Security advisories**: Read-only access
- **Statuses**: Read-only access
- **Organization permissions:**
- **Administration**: Read-only access
- **Members**: Read-only access
- **Account permissions:**
GitHub Apps provide the recommended integration method for accessing multiple repositories.
- **Account permissions**:
- Email addresses (Read)
4. **Where can this GitHub App be installed?**
- Select "Any account" to be able to install the GitHub App in any organization.
5. **Generate Private Key**
- Scroll to the "Private keys" section after app creation
- Click "Generate a private key"
- Download the `.pem` file and store securely
# IaC Authentication in Prowler
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks and requires no cloud authentication for local scans.
### Authentication
- For local scans, no authentication is required.
- For remote repository scans, authentication can be provided via:
- [**GitHub Username and Personal Access Token (PAT)**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)
- [**GitHub OAuth App Token**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-fine-grained-personal-access-token)
- [**Git URL**](https://git-scm.com/docs/git-clone#_git_urls)
# Getting Started with the IaC Provider
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
## Supported Scanners
The IaC provider leverages Trivy to support multiple scanners, including:
- Vulnerability
- Misconfiguration
- Secret
- License
## How It Works
- The IaC provider scans your local directory (or a specified path) for supported IaC files, or scans a remote repository.
- No cloud credentials or authentication are required for local scans.
- For remote repository scans, authentication can be provided via [git URL](https://git-scm.com/docs/git-clone#_git_urls), CLI flags or environment variables.
- Mutelist logic is handled by Trivy, not Prowler.
- Results are output in the same formats as other Prowler providers (CSV, JSON, HTML, etc.).
## Usage
#### Mutually Exclusive Flags
- `--scan-path` and `--scan-repository-url` are mutually exclusive. Only one can be specified at a time.
### Specify Scanners
Run only the vulnerability and misconfiguration scanners:
```sh
prowler iac --scan-path ./my-iac-directory --scanners vuln misconfig
```
### Exclude Paths
- For remote repository scans, authentication is optional but required for private repos.
- CLI flags override environment variables for authentication.
- It is ideal for CI/CD pipelines and local development environments.
- For more details on supported scanners, see the [Trivy documentation](https://trivy.dev/latest/docs/scanner/vulnerability/).
# Microsoft 365 Authentication for Prowler
Prowler for Microsoft 365 (M365) supports the following authentication methods:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- **Service Principal Application with Microsoft User Credentials**
- **Stored AZ CLI credentials**
- **Interactive browser authentication**
???+ warning
Prowler App supports the **Service Principal** and **Service Principal with User Credentials** authentication methods, but the latter will be deprecated in October, once Microsoft enforces MFA in all tenants and no longer allows user authentication without an interactive method.
### Service Principal Authentication (Recommended)
**Authentication flag:** `--sp-env-auth`
Enable Prowler authentication as the **Service Principal Application** by configuring the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
```
If these variables are not set or exported, execution using `--sp-env-auth` will fail.
Refer to the [Create Prowler Service Principal](getting-started-m365.md#create-the-service-principal-app) guide for setup instructions.
???+ note
To run all M365 checks, the required permissions must be added to the service principal application. Refer to the [External API Permissions Assignment](getting-started-m365.md#grant-powershell-modules-permissions) section for more information.
### Service Principal and User Credentials Authentication
**Authentication flag:** `--env-auth`
???+ warning
This method is no longer recommended; use the **Service Principal Application** authentication method instead.
This method builds upon the Service Principal authentication by adding User Credentials. Configure the following environment variables: `M365_USER` and `M365_PASSWORD`.
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export M365_USER="your_email@example.com"
export M365_PASSWORD="examplepassword"
```
These two new environment variables are **required** in this authentication method to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to Microsoft PowerShell modules.
- `M365_USER` should be your Microsoft account email using the **assigned domain in the tenant**. This means it must look like `example@YourCompany.onmicrosoft.com` or `example@YourCompany.com`, but it must be the exact domain assigned to that user in the tenant.
???+ warning
Newly created users must sign in with the account first, as Microsoft prompts for password change. Without completing this step, user authentication fails because Microsoft marks the initial password as expired.
???+ warning
The user must not be MFA capable. Microsoft does not allow MFA capable users to authenticate programmatically. See [Microsoft documentation](https://learn.microsoft.com/en-us/entra/identity-platform/scenario-desktop-acquire-token-username-password?tabs=dotnet) for more information.
???+ warning
Using a tenant domain other than the one assigned — even if it belongs to the same tenant — will cause Prowler to fail, as Microsoft authentication will not succeed.
Ensure the correct domain is used for the authenticating user.
![User Domains](img/user-domains.png)
- `M365_PASSWORD` must be the user password.
???+ note
Previously an encrypted password was required, but now the user password is accepted directly. Prowler handles the password encryption.
### Interactive Browser Authentication
**Authentication flag:** `--browser-auth`
This method authenticates against Azure using the default browser before starting the scan. The `--tenant-id` flag is also required.
These credentials only enable checks that rely on Microsoft Graph. The entire provider cannot be run with this method. To perform a full M365 security scan, use the **recommended authentication method**.
Since this is a **delegated permission** authentication method, necessary permissions should be assigned to the user rather than the application.
### Required Permissions
To run the full Prowler provider, including PowerShell checks, two types of permission scopes must be set in **Microsoft Entra ID**.
#### Service Principal Authentication (`--sp-env-auth`) - Recommended
When using service principal authentication, add the following **Application Permissions**:
**Microsoft Graph API Permissions:**
- `AuditLog.Read.All`: Required for Entra service.
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
**External API Permissions:**
- `Exchange.ManageAsApp` from external API `Office 365 Exchange Online`: Required for Exchange PowerShell module app authentication. You also need to assign the `Global Reader` role to the app.
- `application_access` from external API `Skype and Teams Tenant Admin API`: Required for Teams PowerShell module app authentication.
???+ note
`Directory.Read.All` can be replaced with `Domain.Read.All`, a more restrictive permission, but then the Entra checks related to DirectoryRoles and GetUsers will not run.
> In that case, you will also need to add the `Organization.Read.All` permission to the service principal application in order to authenticate.
???+ note
This is the **recommended authentication method** because it runs the full M365 provider, including PowerShell checks, for complete coverage of all available security checks. Service Principal + User Credentials authentication provides the same coverage, but it will be deprecated in October, once Microsoft enforces MFA in all tenants and no longer allows user authentication without an interactive method.
#### Service Principal + User Credentials Authentication (`--env-auth`)
When using service principal with user credentials authentication, you need **both** sets of permissions:
**1. Service Principal Application Permissions**:
- You **will need** all the Microsoft Graph API permissions listed above.
- You **won't need** the External API permissions listed above.
**2. User-Level Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
- `Global Reader` (recommended): provides read access to everything needed.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles; together these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) grant access to the same information as Global Reader (since only read access is needed, Global Reader is recommended).
#### Browser Authentication (`--browser-auth`)
When using browser authentication, permissions are delegated to the user, so the user must have the appropriate permissions rather than the application.
???+ warning
With browser authentication, you will only be able to run checks that work through MS Graph API. PowerShell module checks will not be executed.
### Assigning Permissions and Roles
For guidance on assigning the necessary permissions and roles, follow these instructions:
- [Grant API Permissions](getting-started-m365.md#grant-required-graph-api-permissions)
- [Assign Required Roles](getting-started-m365.md#if-using-user-authentication)
### Supported PowerShell Versions
PowerShell is required to run certain M365 checks.
**Supported versions:**
- **PowerShell 7.4 or higher** (7.5 is recommended)
#### Why Is PowerShell 7.4+ Required?
- **PowerShell 5.1** (default on some Windows systems) does not support required cmdlets.
- Older [cross-platform PowerShell versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are **unsupported**, leading to potential errors.
???+ note
Installing PowerShell is only necessary if you install Prowler via **pip or other sources**. **SDK and API containers include PowerShell by default.**
### Installing PowerShell
Installation differs depending on your operating system.
- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): update PowerShell to 7.4+ to run Prowler; otherwise some checks will not return findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, and Windows Server 2016 or later.
```console
winget install --id Microsoft.PowerShell --source winget
```
- [macOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): requires [brew](https://brew.sh/); once brew is installed, run the command below. PowerShell is supported only on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64.
```console
brew install powershell/tap/powershell
```
Once it's installed run `pwsh` on your terminal to verify it's working.
- Linux: installing PowerShell on Linux depends on the distro you are using:
- [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): PowerShell 7.4+ requires Ubuntu 22.04 or Ubuntu 24.04. The recommended installation method is the package available on PMC:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget apt-transport-https software-properties-common
# Get the version of Ubuntu
source /etc/os-release
# Download the Microsoft repository keys
wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): PowerShell 7.4+ is supported only on Alpine 3.20. The only installation method is the tar.gz package available on [PowerShell GitHub](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz):
```console
# Install the requirements
sudo apk add --no-cache \
ca-certificates \
less \
ncurses-terminfo-base \
krb5-libs \
libgcc \
libintl \
libssl3 \
libstdc++ \
tzdata \
userspace-rcu \
zlib \
icu-libs \
curl
apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
lttng-ust \
openssh-client
# Download the powershell '.tar.gz' archive
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz
# Create the target folder where powershell will be placed
sudo mkdir -p /opt/microsoft/powershell/7
# Expand powershell to the target folder
sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7
# Set execute permissions
sudo chmod +x /opt/microsoft/powershell/7/pwsh
# Create the symbolic link that points to pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
# Start PowerShell
pwsh
```
- [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): PowerShell 7.4+ requires Debian 11 or Debian 12. The recommended way to install it is through the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget
# Get the version of Debian
source /etc/os-release
# Download the Microsoft repository GPG keys
wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository GPG keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): PowerShell 7.4+ requires RHEL 8 or RHEL 9. The recommended way to install it is through the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Get version of RHEL
source /etc/os-release
if [ ${VERSION_ID%.*} -lt 8 ]
then majorver=7
elif [ ${VERSION_ID%.*} -lt 9 ]
then majorver=8
else majorver=9
fi
# Download the Microsoft RedHat repository package
curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm
# Register the Microsoft RedHat repository
sudo rpm -i packages-microsoft-prod.rpm
# Delete the downloaded package after installing
rm packages-microsoft-prod.rpm
# Update package index files
sudo dnf update
# Install PowerShell
sudo dnf install powershell -y
```
- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): The following command downloads the latest stable release of PowerShell (bundled in the .NET SDK image):
```console
docker pull mcr.microsoft.com/dotnet/sdk:9.0
```
To start an interactive `pwsh` session, run:
```console
docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
```
### Required PowerShell Modules
Prowler relies on several PowerShell cmdlets to retrieve necessary data.
These cmdlets come from different modules that must be installed.
#### Automatic Installation
The required modules are automatically installed when running Prowler with the `--init-modules` flag.
Example command:
```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```
If the modules are already installed, running this command will not cause issues—it will simply verify that the necessary modules are available.
???+ note
Prowler installs the modules using `-Scope CurrentUser`.
If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
The command needed to install a module manually is:
```powershell
Install-Module -Name "ModuleName" -Scope AllUsers -Force
```
#### Modules Version
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (Minimum version: 3.6.0): Required for checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (Minimum version: 6.6.0): Required for all Teams checks.
- [MSAL.PS](https://www.powershellgallery.com/packages/MSAL.PS/4.32.0): Required for Exchange module via application authentication.
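If you prefer to prepare the modules outside of Prowler, both can be installed in one pass from a regular shell, assuming `pwsh` is on the `PATH`. The module names and minimum versions are taken from the list above; the one-liner itself is an illustrative sketch, not Prowler's own installer:

```console
# Compose the PowerShell command that installs both required modules at
# their documented minimum versions for the current user.
INSTALL_CMD='Install-Module -Name ExchangeOnlineManagement -MinimumVersion 3.6.0 -Scope CurrentUser -Force;'
INSTALL_CMD="$INSTALL_CMD Install-Module -Name MicrosoftTeams -MinimumVersion 6.6.0 -Scope CurrentUser -Force"
echo "$INSTALL_CMD"
# Execute it once pwsh is available:
# pwsh -NoProfile -Command "$INSTALL_CMD"
```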
Set up your M365 account to enable security scanning using Prowler Cloud/App.
???+ note "Government Cloud Support"
Government cloud accounts or tenants (Microsoft 365 Government) are not currently supported, but we expect to add support for them in the near future.
## Requirements
To configure your M365 account, you'll need:
#### If using user authentication
This method is not recommended because it requires a user with MFA enabled, and Microsoft will not allow MFA-capable users to authenticate programmatically after 1st October 2025. See [Microsoft documentation](https://learn.microsoft.com/en-us/entra/identity/authentication/concept-mandatory-multifactor-authentication?tabs=dotnet) for more information.
???+ warning
Remember that if the user is newly created, you need to sign in with that account first, as Microsoft will prompt you to change the password. If you don't complete this step, user authentication will fail because Microsoft marks the initial password as expired.
- `AZURE_CLIENT_SECRET` from earlier
If you are using user authentication, also add:
- `M365_USER`: the user, using the correct assigned domain; more info [here](../../tutorials/microsoft365/authentication.md#service-principal-and-user-credentials-authentication)
- `M365_PASSWORD`: the user's password
![Prowler Cloud M365 Credentials](./img/m365-credentials.png)
PowerShell is required by this provider because it is the only way to retrieve data for several checks.
If you are using Prowler Cloud, you don't need to worry about PowerShell — it is already installed in our infrastructure.
However, if you want to run Prowler on your own, you must have PowerShell installed to execute the full M365 provider and retrieve all findings.
To learn more about how to install PowerShell and which versions are supported, click [here](../../tutorials/microsoft365/authentication.md#supported-powershell-versions).
## Required Modules
The necessary modules will not be installed automatically by Prowler. Nevertheless, if you want Prowler to install them for you, you can execute the provider with the flag `--init-modules`, which will run the script to install and import them.
If you want to learn more about this process, or you are running into issues with it, click [here](../../tutorials/microsoft365/authentication.md#required-powershell-modules).
Each check must reside in a dedicated subfolder.
???+ note
The check name must start with the service name followed by an underscore (e.g., ec2\_instance\_public\_ip).
To see more information about how to write checks, refer to the [Developer Guide](../developer-guide/checks.md#creating-a-check).
???+ note
If you want to run ONLY your custom check(s), import it with -x (--checks-folder) and then run it with -c (--checks), e.g.: `prowler aws -x s3://bucket/prowler/providers/aws/services/s3/s3_bucket_policy/ -c s3_bucket_policy`
# MongoDB Atlas Authentication
The MongoDB Atlas provider uses [HTTP Digest Authentication with API key pairs consisting of a public key and a private key](https://www.mongodb.com/docs/atlas/configure-api-access/#grant-programmatic-access-to-service).
## Authentication Methods
### Command-Line Arguments
```bash
prowler mongodbatlas --atlas-public-key <public_key> --atlas-private-key <private_key>
```
### Environment Variables
```bash
export ATLAS_PUBLIC_KEY=<public_key>
export ATLAS_PRIVATE_KEY=<private_key>
prowler mongodbatlas
```
## Creating API Keys
### Step-by-Step Guide
1. **Log into MongoDB Atlas**
- Access the MongoDB Atlas console
2. **Navigate to Access Manager**
- Go to the organization or project access management section
3. **Select API Keys Tab**
- Click on the "API Keys" tab
4. **Create API Key**
- Click "Create API Key"
- Provide a description for the key
5. **Set Permissions**
- Grant minimum required permissions
6. **Save Credentials**
- Note the public key and private key
- Store credentials securely
For more details about MongoDB Atlas, see the [MongoDB Atlas Tutorial](../tutorials/mongodbatlas/getting-started-mongodbatlas.md).
# Getting Started with MongoDB Atlas
The MongoDB Atlas provider enables security assessments of MongoDB Atlas cloud database deployments.
## Features
- **Authentication**: Supports MongoDB Atlas API key authentication
- **Services**: Projects and clusters services
- **Checks**: Network access security and encryption at rest validation
## Creating API Keys
To create MongoDB Atlas API keys:
1. **Log into MongoDB Atlas**: Access the MongoDB Atlas console
2. **Navigate to Access Manager**: Go to the organization access management section:
- Click on Access Manager and Organization Access:
![Organization Access](./img/organization-access.png)
- After that click on the Applications tab inside the Access Manager:
![Project Access](./img/access-manager.png)
3. **Select API Keys Tab**: Click on the "API Keys" tab that appears in the image above
4. **Create API Key**: Click "Create API Key" and provide a description
![Create API Key](./img/create-api-key.png)
5. **Set Permissions**: Project permissions are recommended for security; you can modify them after creating the key
![Set Permissions](./img/modify-permission.png)
6. **Save Credentials**: Note the public key and private key and store them securely
![Save Credentials](./img/copy-key.png)
7. **Add IP Access List**: Add the IP address where you are running Prowler to the API key's IP Access List. If you want to skip this step and use your API key from any IP address, uncheck the `Require IP Access List for the Atlas Administration API` option in the [Organization Settings](#needed-permissions); however, this is not recommended.
![Organization Settings](./img/add-ip.png)
## Basic Usage
### Scan All Projects and Clusters
After storing your API keys, you can run Prowler with the following command:
```bash
prowler mongodbatlas --atlas-public-key <key> --atlas-private-key <secret>
```
Also, you can set your API keys as environment variables:
```bash
export ATLAS_PUBLIC_KEY=<key>
export ATLAS_PRIVATE_KEY=<secret>
```
And then just run Prowler with the following command:
```bash
prowler mongodbatlas
```
### Scanning a Specific Project
If you want to scan a specific project, add the following argument to the commands above:
```bash
prowler mongodbatlas --atlas-project-id <project-id>
```
### Needed Permissions
MongoDB Atlas API keys require appropriate permissions to perform security checks:
- **Organization Read Only**: Provides read-only access to everything in the organization, including all projects in the organization.
- If you want to be able to [audit the Auditing configuration for the project](https://www.mongodb.com/docs/api/doc/atlas-admin-api-v2/group/endpoint-auditing), **Organization Owner** is needed.
Also, note that the IP address where you are running Prowler must be added to the IP Access List of the MongoDB Atlas organization API key. If you want to skip this step and use your API key from any IP address, uncheck the `Require IP Access List for the Atlas Administration API` option in the Organization Settings; that setting is [enabled by default](https://www.mongodb.com/docs/atlas/configure-api-access/#optional--require-an-ip-access-list-for-the-atlas-administration-api).
???+ warning
If you want the check `organizations_api_access_list_required` to pass, you need to enable the API access list for the organization; to make sure your API key keeps working, add your IP to the organization's IP Access List. If you are running the check from Prowler Cloud, you will need to add our IP to the IP Access List.
![Organization Settings](./img/ip-access-list.png)
# Managing Users and Role-Based Access Control (RBAC)
**Prowler App** supports multiple users within a single tenant, enabling seamless collaboration by allowing team members to easily share insights and manage security findings.
# AWS Security Hub Integration
Prowler App enables automatic export of security findings to AWS Security Hub, providing seamless integration with AWS's native security and compliance service. This comprehensive guide demonstrates how to configure and manage AWS Security Hub integrations to centralize security findings and enhance compliance tracking across AWS environments.
Integrating Prowler App with AWS Security Hub provides:
* **Centralized security visibility:** Consolidate findings from multiple AWS accounts and regions
* **Native AWS integration:** Leverage existing AWS security workflows and compliance frameworks
* **Automated finding management:** Archive resolved findings and filter results based on severity
* **Cost optimization:** Send only failed findings to reduce AWS Security Hub costs
* **Real-time updates:** Automatically export findings after each scan completion
## How It Works
When enabled and configured:
1. Scan results are automatically sent to AWS Security Hub after each scan completes
2. Findings are formatted in [AWS Security Finding Format](https://docs.aws.amazon.com/securityhub/latest/userguide/securityhub-findings-format.html) (ASFF)
3. The integration automatically detects newly enabled AWS regions and sends findings to them, as long as the Prowler partner integration is enabled in those regions
4. Previously resolved findings are archived to maintain clean Security Hub dashboards
???+ note
Refer to [AWS Security Hub pricing](https://aws.amazon.com/security-hub/pricing/) for cost information.
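To make step 2 concrete, here is a minimal sketch of a single finding in ASFF. The field values are illustrative placeholders, not output from a real scan, and the `ProductArn` shape is an assumption based on the Security Hub partner-product convention:

```bash
# Write a minimal, illustrative ASFF finding to a local file.
cat > finding.json <<'EOF'
{
  "SchemaVersion": "2018-10-08",
  "Id": "example-finding-id",
  "ProductArn": "arn:aws:securityhub:us-east-1::product/prowler/prowler",
  "GeneratorId": "prowler-s3_bucket_public_access",
  "AwsAccountId": "123456789012",
  "Types": ["Software and Configuration Checks"],
  "CreatedAt": "2025-01-01T00:00:00Z",
  "UpdatedAt": "2025-01-01T00:00:00Z",
  "Severity": {"Label": "HIGH"},
  "Title": "Example failed check",
  "Description": "Illustrative placeholder finding.",
  "Resources": [{"Type": "AwsS3Bucket", "Id": "arn:aws:s3:::example-bucket"}]
}
EOF
# Importing a batch manually would look like (requires AWS credentials):
# aws securityhub batch-import-findings --findings file://finding.json
```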
## Prerequisites
Before configuring AWS Security Hub Integration in Prowler App, complete these steps:
### AWS Security Hub Setup
Enable the Prowler partner integration in AWS Security Hub by following the [AWS Security Hub setup documentation](./aws/securityhub.md#enabling-aws-security-hub-for-prowler-integration).
### AWS Authentication
Configure AWS credentials by following the [AWS authentication setup guide](./aws/getting-started-aws.md#step-3-set-up-aws-authentication).
## Configuration
To configure AWS Security Hub integration in Prowler App:
1. Navigate to **Integrations** in the Prowler App interface
2. Locate the **AWS Security Hub** card and click **Manage**, then select **Add integration**
![Integrations tab](./img/security-hub/integrations-tab.png)
3. Complete the integration settings
* **AWS Provider:** Select the AWS provider whose findings should be exported to Security Hub
* **Send Only Failed Findings:** Filter out `PASS` findings to reduce AWS Security Hub costs (enabled by default)
* **Archive Previous Findings:** Automatically archive findings resolved since the last scan to maintain clean Security Hub dashboards
![Integration settings](./img/security-hub/integration-settings.png)
4. Configure authentication:
Choose the appropriate authentication method:
* **Use Provider Credentials** (recommended): Leverages the AWS provider's existing credentials
???+ tip "Simplified Credential Management"
Using provider credentials reduces administrative complexity by managing a single set of credentials instead of maintaining separate authentication mechanisms. This approach minimizes security risks and provides the most efficient integration path when the AWS account has sufficient permissions to export findings to Security Hub.
* **Custom Credentials:** Configure separate credentials specifically for Security Hub access
5. Click **Create integration** to enable the integration
![Create integration](./img/security-hub/create-integration.png)
Once configured successfully, findings from subsequent scans will automatically appear in AWS Security Hub.
### Integration Status
Once the integration is active, monitor its status and make adjustments as needed through the integrations management interface.
1. Review configured integrations in the management interface
2. Each integration displays:
- **Connection Status:** Connected or Disconnected indicator.
- **Provider Information:** Selected AWS provider name.
- **Finding Filters:** Status of failed-only and archive settings.
- **Last Checked:** Timestamp of the most recent connection test.
- **Regions:** List of regions where the integration is active.
#### Actions
Each Security Hub integration provides several management actions accessible through dedicated buttons:
| Button | Purpose | Available Actions | Notes |
|--------|---------|------------------|-------|
| **Test** | Verify integration connectivity | • Test AWS credential validity<br/>• Check Security Hub accessibility<br/>• Detect enabled regions automatically<br/>• Validate finding export capability | Results displayed in notification message |
| **Config** | Modify integration settings | • Update AWS provider selection<br/>• Change finding filter settings<br/>• Modify archive preferences | Click "Update Configuration" to save changes |
| **Credentials** | Update authentication settings | • Switch between provider/custom credentials<br/>• Update AWS access keys<br/>• Change IAM role configuration | Click "Update Credentials" to save changes |
| **Enable/Disable** | Toggle integration status | • Enable integration to start exporting findings<br/>• Disable integration to pause exports | Status change takes effect immediately |
| **Delete** | Remove integration permanently | • Permanently delete integration<br/>• Remove all configuration data | ⚠️ **Cannot be undone** - confirm before deleting |
???+ tip "Management Best Practices"
- Test the integration after any configuration changes
- Use the Enable/Disable toggle for temporary changes instead of deleting
- Monitor the Last Checked timestamp to ensure recent connectivity
## Viewing Findings in AWS Security Hub
After successful configuration and scan completion, Prowler findings automatically appear in AWS Security Hub. For detailed information about accessing and interpreting findings in the Security Hub console, refer to the [AWS Security Hub findings documentation](./aws/securityhub.md#viewing-prowler-findings-in-aws-security-hub).
## Troubleshooting
**Connection test fails:**
- Verify AWS Security Hub is enabled in target regions
- Confirm Prowler integration is accepted in Security Hub
- Check IAM permissions include required Security Hub actions
- If using IAM Role, verify trust policy and External ID
**No findings in Security Hub:**
- Ensure integration shows "Connected" status
- Verify a scan has completed after enabling integration
- Check Security Hub console in the correct region
- Confirm finding filters match expectations
**Authentication errors:**
- For provider credentials, verify provider configuration
- For custom credentials, check access key validity
- For IAM roles, confirm role ARN and External ID match
This page provides instructions for creating and configuring a Microsoft Entra ID (formerly Azure AD) application to use SAML SSO with Prowler App.
You can find a walkthrough video [here](https://youtu.be/UtcjDh5cAjI).
## Creating and Configuring the Enterprise Application
1. From the "Enterprise Applications" page in the Azure Portal, click "+ New application".
## Accessing Prowler App and API Documentation
After [installing](../installation/prowler-app.md) **Prowler App**, access it at [http://localhost:3000](http://localhost:3000). To view the auto-generated **Prowler API** documentation, navigate to [http://localhost:8080/api/v1/docs](http://localhost:8080/api/v1/docs). This documentation provides details on available endpoints, parameters, and responses.
???+ note
If you are a [Prowler Cloud](https://cloud.prowler.com/sign-in) user, you can access API docs at [https://api.prowler.com/api/v1/docs](https://api.prowler.com/api/v1/docs)
## **Step 3: Add a Provider**
To perform security scans, link a cloud provider account. Prowler supports the following providers and more:
- **AWS**
- **M365**
- **GitHub**
Steps to add a provider:
1. Navigate to `Settings > Cloud Providers`.
### **Step 4.2: Azure Credentials**
For Azure, Prowler App uses a service principal application to authenticate. For more information about the process of creating and adding permissions to a service principal refer to this [section](../tutorials/azure/authentication.md). When you finish creating and adding the [Entra](./azure/create-prowler-service-principal.md#assigning-proper-permissions) and [Subscription](./azure/subscriptions.md) scope permissions to the service principal, enter the `Tenant ID`, `Client ID` and `Client Secret` of the service principal application.
<img src="../../img/azure-credentials.png" alt="Azure Credentials" width="700"/>
nav:
- Products:
- Prowler App: products/prowler-app.md
- Prowler CLI: products/prowler-cli.md
- Prowler Cloud 🔗: https://cloud.prowler.com
- Prowler Hub 🔗: https://hub.prowler.com
- Installation:
- Prowler App: installation/prowler-app.md
- Prowler CLI: installation/prowler-cli.md
- Basic Usage:
- Prowler App: basic-usage/prowler-app.md
- Prowler CLI: basic-usage/prowler-cli.md
- Requirements: getting-started/requirements.md
- Tutorials:
- User Guide:
- Prowler App:
- Getting Started: tutorials/prowler-app.md
- Authentication:
- Social Login: tutorials/prowler-app-social-login.md
- SSO with SAML: tutorials/prowler-app-sso.md
- Role-Based Access Control: tutorials/prowler-app-rbac.md
- Mutelist: tutorials/prowler-app-mute-findings.md
- Integrations:
- Amazon S3: tutorials/prowler-app-s3-integration.md
- AWS Security Hub: tutorials/prowler-app-security-hub-integration.md
- Lighthouse AI: tutorials/prowler-app-lighthouse.md
- Tutorials:
- SSO with Entra: tutorials/prowler-app-sso-entra.md
- Bulk Provider Provisioning: tutorials/bulk-provider-provisioning.md
- CLI:
- Miscellaneous: tutorials/misc.md
- Reporting: tutorials/reporting.md
- Compliance: tutorials/compliance.md
- Dashboard: tutorials/dashboard.md
- Fixer (remediations): tutorials/fixer.md
- Quick Inventory: tutorials/quick-inventory.md
- Slack Integration: tutorials/integrations.md
- Configuration File: tutorials/configuration_file.md
- Logging: tutorials/logging.md
- Mutelist: tutorials/mutelist.md
- Integrations:
- AWS Security Hub: tutorials/aws/securityhub.md
- Slack: tutorials/integrations.md
- Send reports to AWS S3: tutorials/aws/s3.md
- Fixers (Remediations): tutorials/fixer.md
- Check Aliases: tutorials/check-aliases.md
- Custom Metadata: tutorials/custom-checks-metadata.md
- Scan Unused Services: tutorials/scan-unused-services.md
- Pentesting: tutorials/pentesting.md
- Parallel Execution: tutorials/parallel-execution.md
- Developer Guide: developer-guide/introduction.md
- Prowler Check Kreator: tutorials/prowler-check-kreator.md
- Scan Unused Services: tutorials/scan-unused-services.md
- Quick Inventory: tutorials/quick-inventory.md
- Tutorials:
- Parallel Execution: tutorials/parallel-execution.md
- Providers:
- AWS:
- Getting Started: tutorials/aws/getting-started-aws.md
- Authentication: tutorials/aws/authentication.md
- Assume Role: tutorials/aws/role-assumption.md
- AWS Organizations: tutorials/aws/organizations.md
- AWS Regions and Partitions: tutorials/aws/regions-and-partitions.md
- Tag-based Scan: tutorials/aws/tag-based-scan.md
- Resource ARNs based Scan: tutorials/aws/resource-arn-based-scan.md
- Boto3 Configuration: tutorials/aws/boto3-configuration.md
- Threat Detection: tutorials/aws/threat-detection.md
- Tutorial > AWS CloudShell: tutorials/aws/cloudshell.md
- Tutorial > Scan Multiple AWS Accounts: tutorials/aws/multiaccount.md
- Azure:
- Getting Started: tutorials/azure/getting-started-azure.md
- Authentication: tutorials/azure/authentication.md
- Non default clouds: tutorials/azure/use-non-default-cloud.md
- Subscriptions: tutorials/azure/subscriptions.md
- Create Prowler Service Principal: tutorials/azure/create-prowler-service-principal.md
- Google Cloud:
- Getting Started: tutorials/gcp/getting-started-gcp.md
- Authentication: tutorials/gcp/authentication.md
- Projects: tutorials/gcp/projects.md
- Organization: tutorials/gcp/organization.md
- Retry Configuration: tutorials/gcp/retry-configuration.md
- Kubernetes:
- In-Cluster Execution: tutorials/kubernetes/in-cluster.md
- Non In-Cluster Execution: tutorials/kubernetes/outside-cluster.md
- Miscellaneous: tutorials/kubernetes/misc.md
- Microsoft 365:
- Getting Started: tutorials/microsoft365/getting-started-m365.md
- Authentication: tutorials/microsoft365/authentication.md
- Use of PowerShell: tutorials/microsoft365/use-of-powershell.md
- GitHub:
- Getting Started: tutorials/github/getting-started-github.md
- Authentication: tutorials/github/authentication.md
- IaC:
- Getting Started: tutorials/iac/getting-started-iac.md
- Authentication: tutorials/iac/authentication.md
- MongoDB Atlas:
- Getting Started: tutorials/mongodbatlas/getting-started-mongodbatlas.md
- Authentication: tutorials/mongodbatlas/authentication.md
- Developer Guide:
- Concepts:
- Introduction: developer-guide/introduction.md
- Providers: developer-guide/provider.md
- Services: developer-guide/services.md
- Integrations: developer-guide/integrations.md
- Compliance: developer-guide/security-compliance-framework.md
- Lighthouse: developer-guide/lighthouse.md
- Providers:
- AWS: developer-guide/aws-details.md
- Azure: developer-guide/azure-details.md
- Google Cloud: developer-guide/gcp-details.md
- Security: security.md
- Contact Us: contact.md
- Troubleshooting: troubleshooting.md
- About: about.md
- Prowler Cloud: https://prowler.com
- About 🔗: https://prowler.com/about#team
- Release Notes 🔗: https://github.com/prowler-cloud/prowler/releases
# Customization
extra:
@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -394,6 +394,24 @@ cryptography = ">=2.1.4"
isodate = ">=0.6.1"
typing-extensions = ">=4.0.1"
[[package]]
name = "azure-mgmt-apimanagement"
version = "5.0.0"
description = "Microsoft Azure API Management Client Library for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "azure_mgmt_apimanagement-5.0.0-py3-none-any.whl", hash = "sha256:b88c42a392333b60722fb86f15d092dfc19a8d67510dccd15c217381dff4e6ec"},
{file = "azure_mgmt_apimanagement-5.0.0.tar.gz", hash = "sha256:0ab7fe17e70fe3154cd840ff47d19d7a4610217003eaa7c21acf3511a6e57999"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-applicationinsights"
version = "4.1.0"
@@ -551,6 +569,23 @@ azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-loganalytics"
version = "12.0.0"
description = "Microsoft Azure Log Analytics Management Client Library for Python"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "azure-mgmt-loganalytics-12.0.0.zip", hash = "sha256:da128a7e0291be7fa2063848df92a9180cf5c16d42adc09d2bc2efd711536bfb"},
{file = "azure_mgmt_loganalytics-12.0.0-py2.py3-none-any.whl", hash = "sha256:75ac1d47dd81179905c40765be8834643d8994acff31056ddc1863017f3faa02"},
]
[package.dependencies]
azure-common = ">=1.1,<2.0"
azure-mgmt-core = ">=1.2.0,<2.0.0"
msrest = ">=0.6.21"
[[package]]
name = "azure-mgmt-monitor"
version = "6.0.2"
@@ -761,6 +796,23 @@ azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-monitor-query"
version = "2.0.0"
description = "Microsoft Corporation Azure Monitor Query Client Library for Python"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "azure_monitor_query-2.0.0-py3-none-any.whl", hash = "sha256:8f52d581271d785e12f49cd5aaa144b8910fb843db2373855a7ef94c7fc462ea"},
{file = "azure_monitor_query-2.0.0.tar.gz", hash = "sha256:7b05f2fcac4fb67fc9f77a7d4c5d98a0f3099fb73b57c69ec1b080773994671b"},
]
[package.dependencies]
azure-core = ">=1.30.0"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-storage-blob"
version = "12.24.1"
@@ -2060,14 +2112,14 @@ files = [
[[package]]
name = "h2"
version = "4.2.0"
version = "4.3.0"
description = "Pure-Python HTTP/2 protocol implementation"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "h2-4.2.0-py3-none-any.whl", hash = "sha256:479a53ad425bb29af087f3458a61d30780bc818e4ebcf01f0b536ba916462ed0"},
{file = "h2-4.2.0.tar.gz", hash = "sha256:c8a52129695e88b1a0578d8d2cc6842bbd79128ac685463b887ee278126ad01f"},
{file = "h2-4.3.0-py3-none-any.whl", hash = "sha256:c438f029a25f7945c69e0ccf0fb951dc3f73a5f6412981daee861431b70e2bdd"},
{file = "h2-4.3.0.tar.gz", hash = "sha256:6c59efe4323fa18b47a632221a1888bd7fde6249819beda254aeca909f221bf1"},
]
[package.dependencies]
@@ -5839,4 +5891,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">3.9.1,<3.13"
content-hash = "a0635a7bb99427a5169b126b429d603079fc24c39ce6759a648fdffe74e50d6c"
content-hash = "aea38b0311bfabac00d4bf9ee5d2fa0a7f3e32dd2ee5c5d27eb54c69a80b35e9"
@@ -2,7 +2,50 @@
All notable changes to the **Prowler SDK** are documented in this file.
## [v5.11.0] (Prowler UNRELEASED)
## [v5.12.1] (Prowler v5.12.1)
### Fixed
- Replaced old check id with new ones for compliance files [(#8682)](https://github.com/prowler-cloud/prowler/pull/8682)
- `firehose_stream_encrypted_at_rest` check false positives and new api call in kafka service [(#8599)](https://github.com/prowler-cloud/prowler/pull/8599)
- Replace defender rules policies key to use old name [(#8702)](https://github.com/prowler-cloud/prowler/pull/8702)
## [v5.12.0] (Prowler v5.12.0)
### Added
- Add more fields for the Jira ticket and handle custom fields errors [(#8601)](https://github.com/prowler-cloud/prowler/pull/8601)
- Support labels on Jira tickets [(#8603)](https://github.com/prowler-cloud/prowler/pull/8603)
- Add finding url and tenant info inside Jira tickets [(#8607)](https://github.com/prowler-cloud/prowler/pull/8607)
- Get Jira Project's metadata [(#8630)](https://github.com/prowler-cloud/prowler/pull/8630)
- Get Jira projects from test_connection [(#8634)](https://github.com/prowler-cloud/prowler/pull/8634)
- `AdditionalUrls` field in CheckMetadata [(#8590)](https://github.com/prowler-cloud/prowler/pull/8590)
- Support color for MANUAL findings in Jira tickets [(#8642)](https://github.com/prowler-cloud/prowler/pull/8642)
- `--excluded-checks-file` flag [(#8301)](https://github.com/prowler-cloud/prowler/pull/8301)
- Send finding in Jira integration with the needed values [(#8648)](https://github.com/prowler-cloud/prowler/pull/8648)
- Add language enforcement for Jira requests [(#8674)](https://github.com/prowler-cloud/prowler/pull/8674)
- MongoDB Atlas provider with 10 security checks [(#8312)](https://github.com/prowler-cloud/prowler/pull/8312)
- `clusters_authentication_enabled` - Ensure clusters have authentication enabled
- `clusters_backup_enabled` - Ensure clusters have backup enabled
- `clusters_encryption_at_rest_enabled` - Ensure clusters have encryption at rest enabled
- `clusters_tls_enabled` - Ensure clusters have TLS authentication required
- `organizations_api_access_list_required` - Ensure organization requires API access list
- `organizations_mfa_required` - Ensure organization requires MFA
- `organizations_security_contact_defined` - Ensure organization has security contact defined
- `organizations_service_account_secrets_expiration` - Ensure organization has maximum period expiration for service account secrets
- `projects_auditing_enabled` - Ensure database auditing is enabled
- `projects_network_access_list_exposed_to_internet` - Ensure project network access list is not exposed to internet
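
The `--excluded-checks-file` flag listed above reads check IDs to skip from a file and removes them from the execution set. A minimal sketch of that parse-and-exclude logic (the function names mirror `parse_checks_from_file` and `exclude_checks_to_run` from the CLI diff below; the one-check-ID-per-line format with `#` comments is an assumption, not the documented format):

```python
import tempfile

def parse_checks_from_file(path):
    """Return the set of check IDs listed in the file.

    Assumed format: one check ID per line; blank lines and
    lines starting with '#' are ignored.
    """
    checks = set()
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if line and not line.startswith("#"):
                checks.add(line)
    return checks

def exclude_checks_to_run(checks_to_execute, excluded_checks):
    """Drop every excluded check from the set of checks to run."""
    return {c for c in checks_to_execute if c not in excluded_checks}

# Usage: exclude two checks listed in a temporary file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("# checks to skip\ns3_bucket_public_access\n\nec2_ami_public\n")
    path = f.name

to_run = {"s3_bucket_public_access", "ec2_ami_public", "iam_root_mfa_enabled"}
remaining = exclude_checks_to_run(to_run, parse_checks_from_file(path))
print(sorted(remaining))  # ['iam_root_mfa_enabled']
```

This matches how the CLI wires the flag: the parsed set is passed to the same exclusion helper already used for `--excluded-check`.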
### Changed
- Rename ftp and mongo checks to follow pattern `ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_*` [(#8293)](https://github.com/prowler-cloud/prowler/pull/8293)
### Fixed
- Renamed `AdditionalUrls` to `AdditionalURLs` field in CheckMetadata [(#8639)](https://github.com/prowler-cloud/prowler/pull/8639)
- TypeError from Python 3.9 in Security Hub module by updating type annotations [(#8619)](https://github.com/prowler-cloud/prowler/pull/8619)
- KeyError when SecurityGroups field is missing in MemoryDB check [(#8666)](https://github.com/prowler-cloud/prowler/pull/8666)
- NoneType error in Opensearch, Firehose and Cognito checks [(#8670)](https://github.com/prowler-cloud/prowler/pull/8670)
---
## [v5.11.0] (Prowler v5.11.0)
### Added
- Certificate authentication for M365 provider [(#8404)](https://github.com/prowler-cloud/prowler/pull/8404)
@@ -11,24 +54,25 @@ All notable changes to the **Prowler SDK** are documented in this file.
- Bedrock AgentCore privilege escalation combination for AWS provider [(#8526)](https://github.com/prowler-cloud/prowler/pull/8526)
- Add User Email and APP name/installations information in GitHub provider [(#8501)](https://github.com/prowler-cloud/prowler/pull/8501)
- Remove standalone iam:PassRole from privesc detection and add missing patterns [(#8530)](https://github.com/prowler-cloud/prowler/pull/8530)
- Support session/profile/role/static credentials in Security Hub integration [(#8539)](https://github.com/prowler-cloud/prowler/pull/8539)
- `eks_cluster_deletion_protection_enabled` check for AWS provider [(#8536)](https://github.com/prowler-cloud/prowler/pull/8536)
- ECS privilege escalation patterns (StartTask and RunTask) for AWS provider [(#8541)](https://github.com/prowler-cloud/prowler/pull/8541)
- Resource Explorer enumeration v2 API actions in `cloudtrail_threat_detection_enumeration` check [(#8557)](https://github.com/prowler-cloud/prowler/pull/8557)
- `apim_threat_detection_llm_jacking` check for Azure provider [(#8571)](https://github.com/prowler-cloud/prowler/pull/8571)
- GCP `--skip-api-check` command line flag [(#8575)](https://github.com/prowler-cloud/prowler/pull/8575)
### Changed
- Refine kisa isms-p compliance mapping [(#8479)](https://github.com/prowler-cloud/prowler/pull/8479)
- Improve AWS Security Hub region check using multiple threads [(#8365)](https://github.com/prowler-cloud/prowler/pull/8365)
### Fixed
---
## [v5.10.3] (Prowler UNRELEASED)
### Fixed
- Resource metadata error in `s3_bucket_shadow_resource_vulnerability` check [(#8572)](https://github.com/prowler-cloud/prowler/pull/8572)
- GitHub App authentication through API fails with auth_method validation error [(#8587)](https://github.com/prowler-cloud/prowler/pull/8587)
- AWS resource-arn filtering [(#8533)](https://github.com/prowler-cloud/prowler/pull/8533)
- GitHub App authentication for GitHub provider [(#8529)](https://github.com/prowler-cloud/prowler/pull/8529)
- List all accessible organizations in GitHub provider [(#8535)](https://github.com/prowler-cloud/prowler/pull/8535)
- Only evaluate enabled accounts in `entra_users_mfa_capable` check [(#8544)](https://github.com/prowler-cloud/prowler/pull/8544)
- GitHub Personal Access Token authentication fails without `user:email` scope [(#8580)](https://github.com/prowler-cloud/prowler/pull/8580)
---
@@ -65,6 +109,7 @@ All notable changes to the **Prowler SDK** are documented in this file.
- GitHub repository and organization scoping support with `--repository/repositories` and `--organization/organizations` flags [(#8329)](https://github.com/prowler-cloud/prowler/pull/8329)
- GCP provider retry configuration [(#8412)](https://github.com/prowler-cloud/prowler/pull/8412)
- `s3_bucket_shadow_resource_vulnerability` check for AWS provider [(#8398)](https://github.com/prowler-cloud/prowler/pull/8398)
- Use `trivy` as engine for IaC provider [(#8466)](https://github.com/prowler-cloud/prowler/pull/8466)
### Changed
- Handle some AWS errors as warnings instead of errors [(#8347)](https://github.com/prowler-cloud/prowler/pull/8347)
@@ -23,6 +23,7 @@ from prowler.lib.check.check import (
list_checks_json,
list_fixers,
list_services,
parse_checks_from_file,
parse_checks_from_folder,
print_categories,
print_checks,
@@ -102,6 +103,7 @@ from prowler.providers.github.models import GithubOutputOptions
from prowler.providers.iac.models import IACOutputOptions
from prowler.providers.kubernetes.models import KubernetesOutputOptions
from prowler.providers.m365.models import M365OutputOptions
from prowler.providers.mongodbatlas.models import MongoDBAtlasOutputOptions
from prowler.providers.nhn.models import NHNOutputOptions
@@ -121,6 +123,7 @@ def prowler():
checks = args.check
excluded_checks = args.excluded_check
excluded_checks_file = args.excluded_checks_file
excluded_services = args.excluded_service
services = args.service
categories = args.category
@@ -257,6 +260,15 @@ def prowler():
checks_to_execute, excluded_checks
)
# Exclude checks if --excluded-checks-file
if excluded_checks_file:
excluded_checks_from_file = parse_checks_from_file(
excluded_checks_file, provider
)
checks_to_execute = exclude_checks_to_run(
checks_to_execute, list(excluded_checks_from_file)
)
# Exclude services if --excluded-services
if excluded_services:
checks_to_execute = exclude_services_to_run(
@@ -300,6 +312,10 @@ def prowler():
output_options = M365OutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "mongodbatlas":
output_options = MongoDBAtlasOutputOptions(
args, bulk_checks_metadata, global_provider.identity
)
elif provider == "nhn":
output_options = NHNOutputOptions(
args, bulk_checks_metadata, global_provider.identity
@@ -364,8 +364,8 @@
"ec2_ami_public",
"ec2_instance_public_ip",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -721,8 +721,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1510,8 +1510,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1604,8 +1604,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1698,8 +1698,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1558,8 +1558,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1682,7 +1682,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_elasticsearch_kibana_9200_9300_5601",
@@ -1814,7 +1814,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_memcached_11211",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mysql_3306",
@@ -1917,7 +1917,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_telnet_23",
@@ -3024,8 +3024,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -4588,4 +4588,4 @@
]
}
]
}
}
@@ -1557,8 +1557,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1682,7 +1682,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_elasticsearch_kibana_9200_9300_5601",
@@ -1816,7 +1816,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_memcached_11211",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mysql_3306",
@@ -1919,7 +1919,7 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_telnet_23",
@@ -3028,8 +3028,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -4603,4 +4603,4 @@
]
}
]
}
}
@@ -107,8 +107,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1024,8 +1024,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1470,8 +1470,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1650,8 +1650,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -1902,8 +1902,8 @@
"ec2_networkacl_allow_ingress_tcp_port_22",
"ec2_networkacl_allow_ingress_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -553,8 +553,8 @@
"Description": "Ensure that ec2 security groups do not allow ingress from internet to common ports",
"Checks": [
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
@@ -66,7 +66,7 @@
"elbv2_ssl_listeners",
"ssm_documents_set_as_public",
"vpc_subnet_no_public_ip_by_default",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mysql_3306",
"s3_account_level_public_access_blocks"
@@ -253,8 +253,8 @@
"ec2_securitygroup_allow_ingress_from_internet_to_all_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_any_port",
"ec2_securitygroup_allow_ingress_from_internet_to_high_risk_tcp_ports",
"ec2_securitygroup_allow_ingress_from_internet_to_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_ftp_port_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_mongodb_27017_27018",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_ftp_20_21",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_22",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_3389",
"ec2_securitygroup_allow_ingress_from_internet_to_tcp_port_cassandra_7199_9160_8888",
Some files were not shown because too many files have changed in this diff.