Compare commits

..

92 Commits

Author SHA1 Message Date
César Arroba
6511f98e87 chore(api): update prowler version (#8602) 2025-08-28 15:14:39 +02:00
Víctor Fernández Poyatos
b003fca377 fix(docs): remove empty sections (#8600) 2025-08-28 12:55:46 +02:00
Víctor Fernández Poyatos
b4deda3c3f docs(api): fix API response samples (#8592) 2025-08-28 12:39:07 +02:00
Sergio Garcia
338bb74c0c fix(azure): query API management logs with not empty operations (#8598) 2025-08-28 12:03:35 +02:00
Alejandro Bailo
7342a8901f chore: update CHANGELOG.md for Prowler v5.11.0 release (#8597) 2025-08-28 11:43:24 +02:00
Sergio Garcia
f484b83f15 feat(azure): Add APIM threat detection for LLM jacking attacks (#8571)
Co-authored-by: Rubén De la Torre Vico <ruben@prowler.com>
2025-08-28 11:42:07 +02:00
Adrián Jesús Peña Rodríguez
c69187f484 chore: prepare api changelog for 5.11 (#8596) 2025-08-28 10:25:08 +02:00
Alejandro Bailo
5038afeb26 fix(security-hub): copy updated (#8594) 2025-08-27 18:42:34 +02:00
Sergio Garcia
fce43cea16 chore: update changelog (#8593) 2025-08-27 17:57:07 +02:00
Andoni Alonso
43a14b89bc fix(github): provider always scans user instead of organization when using provider UID (#8587) 2025-08-27 17:45:13 +02:00
Tom
24364bd73e feat(gcp): Add support for skipping APIs check (#8575)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2025-08-27 14:44:34 +02:00
Adrián Jesús Peña Rodríguez
a1abe6dd2d fix(sh): reset regions information if connection fails (#8588) 2025-08-27 14:15:09 +02:00
César Arroba
25098bc82a chore(gha): fix conflict checker action (#8586) 2025-08-27 13:41:39 +02:00
sumit-tft
20f2f45610 feat(ui): add S3 bucket link with folder for each integration (#8554)
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-08-27 12:40:37 +02:00
Alejandro Bailo
06c2608a05 feat(integrations): external links and copies changed (#8574) 2025-08-27 12:40:25 +02:00
Alejandro Bailo
329ac113f2 chore(docs): update CHANGELOG properly (#8585) 2025-08-27 11:57:12 +02:00
Hugo Pereira Brito
97179d2b43 fix(docs): incorrect permission in sp creation guide (#8581) 2025-08-27 11:01:37 +02:00
sumit-tft
8317ea783f feat(ui): show all provider UIDs in scan page filter regardless of co… (#8375) 2025-08-27 10:50:16 +02:00
Andoni Alonso
65e7e89d61 fix(github): GitHub Personal Access Token authentication fails without user:email scope (#8580) 2025-08-27 09:57:32 +02:00
Víctor Fernández Poyatos
26a4dd4e8d chore: bump h2 to 4.3.0 (#8573) 2025-08-26 15:17:06 +02:00
Alejandro Bailo
dab0cea2dd feat(ui): Security Hub (#8552) 2025-08-26 14:30:45 +02:00
Daniel Barranquero
3b42eb3818 fix(s3): resource metadata error in s3_bucket_shadow_resource_vulnerability (#8572) 2025-08-26 13:30:49 +02:00
Prowler Bot
a5ba950627 chore(regions_update): Changes in regions for AWS services (#8567)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-26 09:57:45 +02:00
Andoni Alonso
a1232446c1 docs: refactor several sections (#8570) 2025-08-26 09:55:18 +02:00
Pedro Martín
aa6f851887 docs(aws): deploying prowler iam roles across aws organizations (#8427)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-08-26 09:45:14 +02:00
Adrián Jesús Peña Rodríguez
25f972e910 feat(sh): create asff of there is an enabled SecurityHub integration (#8569) 2025-08-25 16:58:21 +02:00
Pedro Martín
7216e5ce3d chore(github): improve pull request template (#7910) 2025-08-25 16:22:55 +02:00
Adrián Jesús Peña Rodríguez
83242da0ab feat(integrations): implement AWS Security Hub integration (#8365) 2025-08-25 15:53:48 +02:00
Alejandro Bailo
d457166a0c fix(ui): AWS form selector default values (#8553) 2025-08-25 12:30:02 +02:00
Daniel Barranquero
88f38b2d2a feat(docs): remove old requirements links (#8561) 2025-08-22 14:22:50 +02:00
Pepe Fagoaga
c2e0849d5f fix(conflict-checker): use prowler-bot (#8560) 2025-08-22 17:27:44 +05:45
Andoni Alonso
1fdebfa295 docs: remove "Requirements" page (#8559) 2025-08-22 15:55:25 +05:45
Sergio Garcia
ea6d04ed3a chore(securityhub): add static credentials and role assumption support (#8539)
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-08-22 11:58:35 +02:00
Sergio Garcia
2167683851 feat(aws): add Resource Explorer enumeration actions (#8557) 2025-08-22 11:47:51 +02:00
Pepe Fagoaga
6324be31ab fix(api): poetry lock up to date with the SDK (#8558) 2025-08-22 11:05:14 +02:00
Alejandro Bailo
525f152e51 fix(ui): update authorization logic to match right paths (#8556) 2025-08-22 10:35:28 +02:00
Sergio Garcia
c3a2d79234 chore(iac): change engine to trivy (#8466)
Co-authored-by: Andoni A. <14891798+andoniaf@users.noreply.github.com>
2025-08-22 10:17:51 +02:00
Andoni Alonso
cefa708322 docs: add provider bulk provisioning (#8551) 2025-08-21 16:33:45 +02:00
Andoni Alonso
1a9e14ab2a chore(bulk-provisioning-tool): add script to bulk provision providers (#8540) 2025-08-21 13:11:46 +02:00
Chandrapal Badshah
b1c6094b6d fix: Remove temperature for GPT-5 models (#8550)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-21 12:40:49 +02:00
Pablo Lara
1038b11fe3 docs: update changelog (#8549) 2025-08-21 12:22:27 +02:00
Chandrapal Badshah
d54e3b25db fix: Refactor getting lighthouse config (#8546)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-21 11:14:21 +02:00
Pepe Fagoaga
6a8e8750bb chore(actions): conflict checker (#8547) 2025-08-21 14:28:18 +05:45
Hugo Pereira Brito
ad3d4536fb fix(m365): only evaluate enabled users in entra_users_mfa_capable (#8544) 2025-08-20 16:45:00 +02:00
Andoni Alonso
46c24055ee docs: refactor Overview into several files (#8543) 2025-08-20 17:44:06 +05:45
Pepe Fagoaga
4c6a1592ac chore(actions): update docs comment with link (#8448) 2025-08-20 17:42:32 +05:45
Hugo Pereira Brito
89e657561c feat(github): add User Email and APP name/installations information (#8501)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-08-20 12:26:38 +02:00
Hugo Pereira Brito
55099abc86 fix(organization): list all accessible organizations (#8535)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-08-20 12:13:01 +02:00
Andoni Alonso
3c599a75cc feat(iam): add ECS privilege escalation patterns to IAM checks (#8541) 2025-08-20 09:23:30 +02:00
Chandrapal Badshah
f77897f813 feat: gpt-5 and gpt-5-mini integration with lighthouse (#8527)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-08-19 16:49:21 +02:00
Sergio Garcia
30518f2e0e feat(aws): new check eks_cluster_deletion_protection_enabled (#8536) 2025-08-19 10:25:24 +02:00
Chandrapal Badshah
efdeb431ba feat: Add resource agent to supervisor (#8509)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-19 09:40:14 +02:00
Sergio Garcia
bb07cf9147 fix(aws): exact match in resource-arn filtering (#8533) 2025-08-18 12:11:13 +02:00
Prowler Bot
9214b5c26f chore(regions_update): Changes in regions for AWS services (#8531)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-18 11:58:41 +02:00
dependabot[bot]
d57df3cc28 chore(deps): bump actions/upload-artifact from 4.5.0 to 4.6.2 (#8154)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-08-18 11:43:41 +02:00
Andoni Alonso
2f5fce41dc feat(iam): remove standalone iam:PassRole from privesc detection and add missing patterns (#8530) 2025-08-18 11:35:14 +02:00
Chandrapal Badshah
6918a75449 fix: add business context to lighthouse chat (#8528)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-18 09:49:23 +02:00
Pablo Lara
3aeaa3d992 feat(filters): improve provider connection filter UX (#8520) 2025-08-18 09:10:16 +02:00
Sergio Garcia
fd833eecf0 fix(github): solve Github APP auth method (#8529) 2025-08-18 08:35:19 +02:00
Andoni Alonso
39e4d20b24 feat(iam): add Bedrock AgentCore privilege escalation combo (#8526) 2025-08-15 13:25:15 +02:00
Sergio Garcia
dfdd45e4d0 fix(github): list all accessible repositories (#8522) 2025-08-14 10:38:38 +02:00
Hugo Pereira Brito
81478dfed3 fix(compliance): GitHub CIS 1.0 (#8519) 2025-08-13 16:45:36 +02:00
Chandrapal Badshah
2854f8405c fix: simplify error handling to use only error.message (#8518)
Co-authored-by: Chandrapal Badshah <12944530+Chan9390@users.noreply.github.com>
2025-08-13 10:59:47 +02:00
Jaen-923
0e1578cfbc chore(aws): Refine kisa isms-p compliance mapping (#8479)
Co-authored-by: ghkim583 <203069125+ghkim583@users.noreply.github.com>
2025-08-13 09:08:37 +02:00
Hugo Pereira Brito
f5b1532647 fix(kafka): false positives in kafka_cluster_is_public check (#8514) 2025-08-13 09:05:09 +02:00
Sergio Garcia
d9f3a6b88e docs(github): add Github onboarding documentation (#8510) 2025-08-12 17:11:30 +02:00
Hugo Pereira Brito
b0c386fc60 fix(app): fix false positives in app_http_logs_enabled (#8507)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-08-12 14:47:17 +02:00
Hugo Pereira Brito
72b06261df fix(storage): fall positives in storage_geo_redundant_enabled (#8504) 2025-08-12 12:30:43 +02:00
sumit-tft
1562b77581 fix(ui): redirection after deleting providers group and improve erro… (#8389)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-08-12 11:31:45 +02:00
Daniel Barranquero
10e38ca407 fix: missing resource_name in GCP and Azure Defender checks (#8352) 2025-08-11 16:16:08 +02:00
Rubén De la Torre Vico
5842f2df37 feat(azure/vm): add new check vm_jit_access_enabled (#8202)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-08-11 13:12:36 +02:00
Prowler Bot
8b3b9ffd99 chore(regions_update): Changes in regions for AWS services (#8499)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-11 12:00:02 +02:00
Rubén De la Torre Vico
d238050065 feat(azure/vm): add new check vm_sufficient_daily_backup_retention_period (#8200)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-08-11 11:44:45 +02:00
sumit-tft
5572d476ad fix(ui): adjust table headers to be single-line and consistent (#8480) 2025-08-11 10:47:10 +02:00
sumit-tft
3c94d3a56f fix(ui): disable See Compliance button until scan completes (#8487)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-08-11 10:37:35 +02:00
Hugo Pereira Brito
85af4ff77c feat(m365): add certificate auth method to cli (#8404)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-08-11 09:47:56 +02:00
Daniel Barranquero
dcee114ef3 fix: validation errors in azure and m365 (#8368) 2025-08-11 09:42:30 +02:00
Pedro Martín
760723874c fix(prowler-threatscore): order the requirements by id (#8495)
Co-authored-by: Sergio Garcia <hello@mistercloudsec.com>
2025-08-11 08:20:10 +02:00
Pedro Martín
c0a4898074 chore(changelog): update (#8496) 2025-08-11 07:48:23 +02:00
Alejandro Bailo
03c0533b58 feat(ui): overview charts display improved (#8491)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-08-08 10:59:15 +02:00
sumit-tft
c8dcb0edb0 feat(ui): add GitHub submenu under High Risk Findings (#8488)
Co-authored-by: Pablo Lara <larabjj@gmail.com>
2025-08-08 10:36:36 +02:00
Pablo Lara
82171ee916 docs: update changelog (#8489) 2025-08-08 10:20:53 +02:00
Pablo Lara
df4bf18b97 feat(ui): add Mutelist menu item under Configuration (#8444)
Co-authored-by: Alejandro Bailo <59607668+alejandrobailo@users.noreply.github.com>
Co-authored-by: alejandrobailo <alejandrobailo94@gmail.com>
2025-08-08 09:09:37 +02:00
Alejandro Bailo
94e60f7329 fix(ui): assume role fields shown (#8484) 2025-08-07 17:44:46 +02:00
Rubén De la Torre Vico
f1ba5abbec chore(docs): update provider statistics in README.md (#8483)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-07 17:10:56 +02:00
Hugo Pereira Brito
6cc1a9a2cb fix(compliance): delete invalid requirements for GitHub CIS 1.0 (#8472)
Co-authored-by: MrCloudSec <hello@mistercloudsec.com>
2025-08-07 20:51:20 +07:00
Pablo Lara
31f98092bf feat(ui): add provider type filter to providers page (#8473) 2025-08-07 14:34:04 +02:00
Pepe Fagoaga
85197036ca chore(env): Update NEXT_PUBLIC_PROWLER_RELEASE_VERSION (#8476) 2025-08-07 17:50:18 +05:45
Pepe Fagoaga
be43025f00 fix(actions): always get latest SDK reference (#8474) 2025-08-07 17:38:40 +05:45
César Arroba
c6b34f0a85 chore(api): open PR with API prowler version (#8475) 2025-08-07 13:49:39 +02:00
Prowler Bot
675698a26a chore(release): Bump version to v5.11.0 (#8470)
Co-authored-by: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
2025-08-07 12:40:55 +02:00
Alejandro Bailo
8d9bf2384f docs: S3 tutorial documentation (#8414)
Co-authored-by: Pepe Fagoaga <pepe@prowler.com>
Co-authored-by: Adrián Jesús Peña Rodríguez <adrianjpr@gmail.com>
2025-08-07 16:04:42 +05:45
250 changed files with 19742 additions and 7060 deletions


@@ -8,6 +8,10 @@ If fixes an issue please add it with `Fix #XXXX`
Please include a summary of the change and which issue is fixed. List any dependencies that are required for this change.
### Steps to review
Please add a detailed description of how to review this PR.
### Checklist
- Are there new checks included in this PR? Yes / No


@@ -13,6 +13,7 @@ on:
- "master"
- "v5.*"
paths:
- ".github/workflows/api-pull-request.yml"
- "api/**"
env:
@@ -81,7 +82,9 @@ jobs:
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
-files: api/**
+files: |
+  api/**
+  .github/workflows/api-pull-request.yml
files_ignore: ${{ env.IGNORE_FILES }}
- name: Replace @master with current branch in pyproject.toml
@@ -105,6 +108,23 @@ jobs:
run: |
poetry lock
- name: Update SDK's poetry.lock resolved_reference to latest commit - Only for push events to `master`
working-directory: ./api
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true' && github.event_name == 'push'
run: |
# Get the latest commit hash from the prowler-cloud/prowler repository
LATEST_COMMIT=$(curl -s "https://api.github.com/repos/prowler-cloud/prowler/commits/master" | jq -r '.sha')
echo "Latest commit hash: $LATEST_COMMIT"
# Update the resolved_reference specifically for prowler-cloud/prowler repository
sed -i '/url = "https:\/\/github\.com\/prowler-cloud\/prowler\.git"/,/resolved_reference = / {
s/resolved_reference = "[a-f0-9]\{40\}"/resolved_reference = "'"$LATEST_COMMIT"'"/
}' poetry.lock
# Verify the change was made
echo "Updated resolved_reference:"
grep -A2 -B2 "resolved_reference" poetry.lock
- name: Set up Python ${{ matrix.python-version }}
if: steps.are-non-ignored-files-changed.outputs.any_changed == 'true'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
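The `sed` address-range step above can be exercised locally. The sketch below runs the same substitution against a hypothetical `poetry.lock` excerpt; the file path, sample contents, and commit hash are illustrative, and GNU `sed` is assumed for `-i` without a suffix:

```shell
# Mirror of the workflow step: replace resolved_reference only inside the
# prowler git-dependency block of a poetry.lock file (GNU sed assumed).
# /tmp/poetry.lock.sample and NEW_COMMIT are illustrative values.
set -eu

NEW_COMMIT="0123456789abcdef0123456789abcdef01234567"

cat > /tmp/poetry.lock.sample <<'EOF'
[package.source]
type = "git"
url = "https://github.com/prowler-cloud/prowler.git"
reference = "master"
resolved_reference = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
EOF

# Address range: only lines between the prowler repo URL and the next
# resolved_reference line are candidates for the substitution.
sed -i '/url = "https:\/\/github\.com\/prowler-cloud\/prowler\.git"/,/resolved_reference = / {
s/resolved_reference = "[a-f0-9]\{40\}"/resolved_reference = "'"$NEW_COMMIT"'"/
}' /tmp/poetry.lock.sample

grep "resolved_reference" /tmp/poetry.lock.sample
# → resolved_reference = "0123456789abcdef0123456789abcdef01234567"
```

The address range is what keeps the substitution from touching `resolved_reference` lines belonging to other git dependencies elsewhere in the lock file.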


@@ -7,6 +7,7 @@ on:
- 'v3'
paths:
- 'docs/**'
- '.github/workflows/build-documentation-on-pr.yml'
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
@@ -16,9 +17,20 @@ jobs:
name: Documentation Link
runs-on: ubuntu-latest
steps:
- name: Leave PR comment with the Prowler Documentation URI
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
- name: Find existing documentation comment
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e # v3.1.0
id: find-comment
with:
issue-number: ${{ env.PR_NUMBER }}
comment-author: 'github-actions[bot]'
body-includes: '<!-- prowler-docs-link -->'
- name: Create or update PR comment with the Prowler Documentation URI
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
comment-id: ${{ steps.find-comment.outputs.comment-id }}
issue-number: ${{ env.PR_NUMBER }}
body: |
<!-- prowler-docs-link -->
You can check the documentation for this PR here -> [Prowler Documentation](https://prowler-prowler-docs--${{ env.PR_NUMBER }}.com.readthedocs.build/projects/prowler-open-source/en/${{ env.PR_NUMBER }}/)
edit-mode: replace


@@ -1,4 +1,4 @@
-name: Create Backport Label
+name: Prowler - Create Backport Label
on:
release:


@@ -0,0 +1,167 @@
name: Prowler - PR Conflict Checker
on:
pull_request:
types:
- opened
- synchronize
- reopened
branches:
- "master"
- "v5.*"
pull_request_target:
types:
- opened
- synchronize
- reopened
branches:
- "master"
- "v5.*"
jobs:
conflict-checker:
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
with:
files: |
**
- name: Check for conflict markers
id: conflict-check
run: |
echo "Checking for conflict markers in changed files..."
CONFLICT_FILES=""
HAS_CONFLICTS=false
# Check each changed file for conflict markers
for file in ${{ steps.changed-files.outputs.all_changed_files }}; do
if [ -f "$file" ]; then
echo "Checking file: $file"
# Look for conflict markers
if grep -l "^<<<<<<<\|^=======\|^>>>>>>>" "$file" 2>/dev/null; then
echo "Conflict markers found in: $file"
CONFLICT_FILES="$CONFLICT_FILES$file "
HAS_CONFLICTS=true
fi
fi
done
if [ "$HAS_CONFLICTS" = true ]; then
echo "has_conflicts=true" >> $GITHUB_OUTPUT
echo "conflict_files=$CONFLICT_FILES" >> $GITHUB_OUTPUT
echo "Conflict markers detected in files: $CONFLICT_FILES"
else
echo "has_conflicts=false" >> $GITHUB_OUTPUT
echo "No conflict markers found in changed files"
fi
- name: Add conflict label
if: steps.conflict-check.outputs.has_conflicts == 'true'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
script: |
const { data: labels } = await github.rest.issues.listLabelsOnIssue({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
const hasConflictLabel = labels.some(label => label.name === 'has-conflicts');
if (!hasConflictLabel) {
await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
labels: ['has-conflicts']
});
console.log('Added has-conflicts label');
} else {
console.log('has-conflicts label already exists');
}
- name: Remove conflict label
if: steps.conflict-check.outputs.has_conflicts == 'false'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
with:
github-token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
script: |
try {
await github.rest.issues.removeLabel({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
name: 'has-conflicts'
});
console.log('Removed has-conflicts label');
} catch (error) {
if (error.status === 404) {
console.log('has-conflicts label was not present');
} else {
throw error;
}
}
- name: Find existing conflict comment
if: steps.conflict-check.outputs.has_conflicts == 'true'
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e # v3.1.0
id: find-comment
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-regex: '(⚠️ \*\*Conflict Markers Detected\*\*|✅ \*\*Conflict Markers Resolved\*\*)'
- name: Create or update conflict comment
if: steps.conflict-check.outputs.has_conflicts == 'true'
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
comment-id: ${{ steps.find-comment.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }}
edit-mode: replace
body: |
⚠️ **Conflict Markers Detected**
This pull request contains unresolved conflict markers in the following files:
```
${{ steps.conflict-check.outputs.conflict_files }}
```
Please resolve these conflicts by:
1. Locating the conflict markers: `<<<<<<<`, `=======`, and `>>>>>>>`
2. Manually editing the files to resolve the conflicts
3. Removing all conflict markers
4. Committing and pushing the changes
- name: Find existing conflict comment when resolved
if: steps.conflict-check.outputs.has_conflicts == 'false'
uses: peter-evans/find-comment@3eae4d37986fb5a8592848f6a574fdf654e61f9e # v3.1.0
id: find-resolved-comment
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: 'github-actions[bot]'
body-regex: '(⚠️ \*\*Conflict Markers Detected\*\*|✅ \*\*Conflict Markers Resolved\*\*)'
- name: Update comment when conflicts resolved
if: steps.conflict-check.outputs.has_conflicts == 'false' && steps.find-resolved-comment.outputs.comment-id != ''
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
comment-id: ${{ steps.find-resolved-comment.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }}
edit-mode: replace
body: |
✅ **Conflict Markers Resolved**
All conflict markers have been successfully resolved in this pull request.
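The detection logic in the "Check for conflict markers" step can be sketched in isolation; the file below is a hypothetical example of leftover markers, and the grep pattern is the same one the workflow uses:

```shell
# Same grep pattern as the workflow: lines beginning with one of the three
# Git conflict markers. /tmp/conflicted.txt is an illustrative file.
cat > /tmp/conflicted.txt <<'EOF'
shared line
<<<<<<< HEAD
our change
=======
their change
>>>>>>> feature-branch
EOF

if grep -q "^<<<<<<<\|^=======\|^>>>>>>>" /tmp/conflicted.txt; then
  echo "has_conflicts=true"
else
  echo "has_conflicts=false"
fi
# → prints "has_conflicts=true"
```

Note that `^=======` also matches legitimate content such as setext-style Markdown underlines, so a check built on this pattern can produce false positives on documentation files.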


@@ -1,4 +1,4 @@
-name: Prowler Release Preparation
+name: Prowler - Release Preparation
run-name: Prowler Release Preparation for ${{ inputs.prowler_version }}
@@ -144,27 +144,34 @@ jobs:
fi
echo "✓ api/src/backend/api/v1/views.py version: $CURRENT_API_VERSION"
- name: Create release branch for minor release
- name: Checkout existing release branch for minor release
if: ${{ env.PATCH_VERSION == '0' }}
run: |
echo "Minor release detected (patch = 0), creating new branch $BRANCH_NAME..."
if git show-ref --verify --quiet "refs/heads/$BRANCH_NAME" || git show-ref --verify --quiet "refs/remotes/origin/$BRANCH_NAME"; then
echo "ERROR: Branch $BRANCH_NAME already exists for minor release $PROWLER_VERSION"
echo "Minor release detected (patch = 0), checking out existing branch $BRANCH_NAME..."
if git show-ref --verify --quiet "refs/remotes/origin/$BRANCH_NAME"; then
echo "Branch $BRANCH_NAME exists remotely, checking out..."
git checkout -b "$BRANCH_NAME" "origin/$BRANCH_NAME"
else
echo "ERROR: Branch $BRANCH_NAME should exist for minor release $PROWLER_VERSION. Please create it manually first."
exit 1
fi
git checkout -b "$BRANCH_NAME"
# Push the new branch first so it exists remotely
git push origin "$BRANCH_NAME"
- name: Update prowler dependency in api/pyproject.toml
- name: Prepare prowler dependency update for minor release
if: ${{ env.PATCH_VERSION == '0' }}
run: |
CURRENT_PROWLER_REF=$(grep 'prowler @ git+https://github.com/prowler-cloud/prowler.git@' api/pyproject.toml | sed -E 's/.*@([^"]+)".*/\1/' | tr -d '[:space:]')
BRANCH_NAME_TRIMMED=$(echo "$BRANCH_NAME" | tr -d '[:space:]')
# Minor release: update the dependency to use the new branch
echo "Minor release detected - updating prowler dependency from '$CURRENT_PROWLER_REF' to '$BRANCH_NAME_TRIMMED'"
# Create a temporary branch for the PR
TEMP_BRANCH="update-api-dependency-$BRANCH_NAME_TRIMMED-$(date +%s)"
echo "TEMP_BRANCH=$TEMP_BRANCH" >> $GITHUB_ENV
# Switch back to master and create temp branch
git checkout master
git checkout -b "$TEMP_BRANCH"
# Minor release: update the dependency to use the release branch
echo "Updating prowler dependency from '$CURRENT_PROWLER_REF' to '$BRANCH_NAME_TRIMMED'"
sed -i "s|prowler @ git+https://github.com/prowler-cloud/prowler.git@[^\"]*\"|prowler @ git+https://github.com/prowler-cloud/prowler.git@$BRANCH_NAME_TRIMMED\"|" api/pyproject.toml
# Verify the change was made
@@ -180,12 +187,39 @@ jobs:
poetry lock
cd ..
# Commit and push the changes
# Commit and push the temporary branch
git add api/pyproject.toml api/poetry.lock
git commit -m "chore(api): update prowler dependency to $BRANCH_NAME_TRIMMED for release $PROWLER_VERSION"
git push origin "$BRANCH_NAME"
git push origin "$TEMP_BRANCH"
echo "✓ api/pyproject.toml prowler dependency updated to: $UPDATED_PROWLER_REF"
echo "✓ Prepared prowler dependency update to: $UPDATED_PROWLER_REF"
- name: Create Pull Request against release branch
if: ${{ env.PATCH_VERSION == '0' }}
uses: peter-evans/create-pull-request@271a8d0340265f705b14b6d32b9829c1cb33d45e # v7.0.8
with:
token: ${{ secrets.PROWLER_BOT_ACCESS_TOKEN }}
branch: ${{ env.TEMP_BRANCH }}
base: ${{ env.BRANCH_NAME }}
title: "chore(api): Update prowler dependency to ${{ env.BRANCH_NAME }} for release ${{ env.PROWLER_VERSION }}"
body: |
### Description
Updates the API prowler dependency for release ${{ env.PROWLER_VERSION }}.
**Changes:**
- Updates `api/pyproject.toml` prowler dependency from `@master` to `@${{ env.BRANCH_NAME }}`
- Updates `api/poetry.lock` file with resolved dependencies
This PR should be merged into the `${{ env.BRANCH_NAME }}` release branch.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
author: prowler-bot <179230569+prowler-bot@users.noreply.github.com>
labels: |
component/api
no-changelog
- name: Extract changelog entries
run: |
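The branch-existence guard above (`git show-ref --verify --quiet`) can be exercised in a throwaway repository. The repo path and branch name below are illustrative, and the check is adapted to a local `refs/heads/...` ref rather than the workflow's `refs/remotes/origin/...`:

```shell
# Create a scratch repo with a release-style branch, then test for it the
# way the workflow does. Requires git >= 2.28 for `git init -b`.
# /tmp/relrepo and the v5.11 branch name are illustrative.
set -eu
rm -rf /tmp/relrepo
git init -q -b master /tmp/relrepo
cd /tmp/relrepo
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "init"
git branch v5.11

if git show-ref --verify --quiet "refs/heads/v5.11"; then
  echo "branch exists, checking out..."
  git checkout -q "v5.11"
else
  echo "ERROR: branch v5.11 does not exist" >&2
  exit 1
fi
```

`--verify` makes `show-ref` treat the argument as an exact ref path instead of a pattern, which is why the workflow can distinguish a local branch from a remote-tracking one.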


@@ -1,4 +1,4 @@
-name: Check Changelog
+name: Prowler - Check Changelog
on:
pull_request:


@@ -84,7 +84,7 @@ jobs:
working-directory: ./ui
run: npm run test:e2e
- name: Upload test reports
-uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b # v4.5.0
+uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
if: failure()
with:
name: playwright-report

.gitignore (vendored, 3 additions)

@@ -75,3 +75,6 @@ node_modules
# Persistent data
_data/
# Claude
CLAUDE.md


@@ -86,12 +86,12 @@ prowler dashboard
| Provider | Checks | Services | [Compliance Frameworks](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/compliance/) | [Categories](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/misc/#categories) |
|---|---|---|---|---|
-| AWS | 567 | 82 | 36 | 10 |
+| AWS | 571 | 82 | 36 | 10 |
| GCP | 79 | 13 | 10 | 3 |
-| Azure | 142 | 18 | 11 | 3 |
+| Azure | 162 | 19 | 11 | 4 |
| Kubernetes | 83 | 7 | 5 | 7 |
-| GitHub | 16 | 2 | 1 | 0 |
-| M365 | 69 | 7 | 3 | 2 |
+| GitHub | 17 | 2 | 1 | 0 |
+| M365 | 70 | 7 | 3 | 2 |
| NHN (Unofficial) | 6 | 2 | 1 | 0 |
> [!Note]


@@ -1,23 +1,65 @@
# Security Policy
# Security
## Software Security
As an **AWS Partner** and we have passed the [AWS Foundation Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/) and we use the following tools and automation to make sure our code is secure and dependencies up-to-dated:
## Reporting Vulnerabilities
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
At Prowler, we consider the security of our open source software and systems a top priority. But no matter how much effort we put into system security, there can still be vulnerabilities present.
## Reporting a Vulnerability
If you discover a vulnerability, we would like to know about it so we can take steps to address it as quickly as possible. We would like to ask you to help us better protect our users, our clients and our systems.
If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or ProwlerPro service, please submit the information by contacting to https://support.prowler.com.
When reporting vulnerabilities, please consider (1) attack scenario / exploitability, and (2) the security impact of the bug. The following issues are considered out of scope:
The information you share with ProwlerPro as part of this process is kept confidential within ProwlerPro. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
- Social engineering support or attacks requiring social engineering.
- Clickjacking on pages with no sensitive actions.
- Cross-Site Request Forgery (CSRF) on unauthenticated forms or forms with no sensitive actions.
- Attacks requiring Man-In-The-Middle (MITM) or physical access to a user's device.
- Previously known vulnerable libraries without a working Proof of Concept (PoC).
- Comma Separated Values (CSV) injection without demonstrating a vulnerability.
- Missing best practices in SSL/TLS configuration.
- Any activity that could lead to the disruption of service (DoS).
- Rate limiting or brute force issues on non-authentication endpoints.
- Missing best practices in Content Security Policy (CSP).
- Missing HttpOnly or Secure flags on cookies.
- Configuration of or missing security headers.
- Missing email best practices, such as invalid, incomplete, or missing SPF/DKIM/DMARC records.
- Vulnerabilities only affecting users of outdated or unpatched browsers (less than two stable versions behind).
- Software version disclosure, banner identification issues, or descriptive error messages.
- Tabnabbing.
- Issues that require unlikely user interaction.
- Improper logout functionality and improper session timeout.
- CORS misconfiguration without an exploitation scenario.
- Broken link hijacking.
- Automated scanning results (e.g., sqlmap, Burp active scanner) that have not been manually verified.
- Content spoofing and text injection issues without a clear attack vector.
- Email spoofing without exploiting security flaws.
- Dead links or broken links.
- User enumeration.
We will review the submitted report, and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
Testing guidelines:
- Do not run automated scanners on other customer projects. Running automated scanners can run up costs for our users. Aggressively configured scanners might inadvertently disrupt services, exploit vulnerabilities, lead to system instability or breaches and violate Terms of Service from our upstream providers. Our own security systems won't be able to distinguish hostile reconnaissance from whitehat research. If you wish to run an automated scanner, notify us at support@prowler.com and only run it on your own Prowler app project. Do NOT attack Prowler in usage of other customers.
- Do not take advantage of the vulnerability or problem you have discovered, for example by downloading more data than necessary to demonstrate the vulnerability or deleting or modifying other people's data.
You will receive a non-automated response to your initial contact within 24 hours, confirming receipt of your reported vulnerability.
Reporting guidelines:
- File a report through our Support Desk at https://support.prowler.com
- If it is about a lack of a security functionality, please file a feature request instead at https://github.com/prowler-cloud/prowler/issues
- Do provide sufficient information to reproduce the problem, so we will be able to resolve it as quickly as possible.
- If you have further questions and want direct interaction with the Prowler team, please contact us at via our Community Slack at goto.prowler.com/slack.
We will coordinate public notification of any validated vulnerability with you. Where possible, we prefer that our respective public disclosures be posted simultaneously.
Disclosure guidelines:
- In order to protect our users and customers, do not reveal the problem to others until we have researched, addressed and informed our affected customers.
- If you want to publicly share your research about Prowler at a conference, in a blog or any other public forum, you should share a draft with us for review and approval at least 30 days prior to the publication date. Please note that the following should not be included:
- Data regarding any Prowler user or customer projects.
- Prowler customers' data.
- Information about Prowler employees, contractors or partners.
What we promise:
- We will respond to your report within 5 business days with our evaluation of the report and an expected resolution date.
- If you have followed the instructions above, we will not take any legal action against you in regard to the report.
- We will handle your report with strict confidentiality, and not pass on your personal details to third parties without your permission.
- We will keep you informed of the progress towards resolving the problem.
- In the public information concerning the problem reported, we will give your name as the discoverer of the problem (unless you desire otherwise).
We strive to resolve all problems as quickly as possible, and we would like to play an active role in any eventual publication about the problem once it is resolved.
---
For more information about our security policies, please refer to our [Security](https://docs.prowler.com/projects/prowler-open-source/en/latest/security/) section in our documentation.

View File

@@ -2,6 +2,16 @@
All notable changes to the **Prowler API** are documented in this file.
## [1.12.0] (Prowler 5.11.0)
### Added
- Lighthouse support for OpenAI GPT-5 [(#8527)](https://github.com/prowler-cloud/prowler/pull/8527)
- Integration with Amazon Security Hub, enabling sending findings to Security Hub [(#8365)](https://github.com/prowler-cloud/prowler/pull/8365)
- Generate ASFF output for AWS providers with SecurityHub integration enabled [(#8569)](https://github.com/prowler-cloud/prowler/pull/8569)
### Fixed
- GitHub provider always scans user instead of organization when using provider UID [(#8587)](https://github.com/prowler-cloud/prowler/pull/8587)
## [1.11.0] (Prowler 5.10.0)
### Added

api/poetry.lock (generated, 2267 changed lines)

File diff suppressed because it is too large.

View File

@@ -24,13 +24,14 @@ dependencies = [
"drf-spectacular-jsonapi==0.5.1",
"gunicorn==23.0.0",
"lxml==5.3.2",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.10",
"prowler @ git+https://github.com/prowler-cloud/prowler.git@v5.11",
"psycopg2-binary==2.9.9",
"pytest-celery[redis] (>=1.0.1,<2.0.0)",
"sentry-sdk[django] (>=2.20.0,<3.0.0)",
"uuid6==2024.7.10",
"openai (>=1.82.0,<2.0.0)",
"xmlsec==1.3.14"
"xmlsec==1.3.14",
"h2 (==4.3.0)"
]
description = "Prowler's API (Django/DRF)"
license = "Apache-2.0"
@@ -38,7 +39,7 @@ name = "prowler-api"
package-mode = false
# Needed for the SDK compatibility
requires-python = ">=3.11,<3.13"
- version = "1.11.2"
+ version = "1.12.0"
[project.scripts]
celery = "src.backend.config.settings.celery"

View File

@@ -0,0 +1,33 @@
# Generated by Django 5.1.10 on 2025-08-20 09:04
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0045_alter_scan_output_location"),
]
operations = [
migrations.AlterField(
model_name="lighthouseconfiguration",
name="model",
field=models.CharField(
choices=[
("gpt-4o-2024-11-20", "GPT-4o v2024-11-20"),
("gpt-4o-2024-08-06", "GPT-4o v2024-08-06"),
("gpt-4o-2024-05-13", "GPT-4o v2024-05-13"),
("gpt-4o", "GPT-4o Default"),
("gpt-4o-mini-2024-07-18", "GPT-4o Mini v2024-07-18"),
("gpt-4o-mini", "GPT-4o Mini Default"),
("gpt-5-2025-08-07", "GPT-5 v2025-08-07"),
("gpt-5", "GPT-5 Default"),
("gpt-5-mini-2025-08-07", "GPT-5 Mini v2025-08-07"),
("gpt-5-mini", "GPT-5 Mini Default"),
],
default="gpt-4o-2024-08-06",
help_text="Must be one of the supported model names",
max_length=50,
),
),
]

View File

@@ -0,0 +1,16 @@
# Generated by Django 5.1.10 on 2025-08-20 08:24
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("api", "0046_lighthouse_gpt5"),
]
operations = [
migrations.RemoveConstraint(
model_name="integration",
name="unique_configuration_per_tenant",
),
]

View File

@@ -1372,10 +1372,6 @@ class Integration(RowLevelSecurityProtectedModel):
db_table = "integrations"
constraints = [
models.UniqueConstraint(
fields=("configuration", "tenant"),
name="unique_configuration_per_tenant",
),
RowLevelSecurityConstraint(
field="tenant_id",
name="rls_on_%(class)s",
@@ -1752,6 +1748,10 @@ class LighthouseConfiguration(RowLevelSecurityProtectedModel):
GPT_4O = "gpt-4o", _("GPT-4o Default")
GPT_4O_MINI_2024_07_18 = "gpt-4o-mini-2024-07-18", _("GPT-4o Mini v2024-07-18")
GPT_4O_MINI = "gpt-4o-mini", _("GPT-4o Mini Default")
GPT_5_2025_08_07 = "gpt-5-2025-08-07", _("GPT-5 v2025-08-07")
GPT_5 = "gpt-5", _("GPT-5 Default")
GPT_5_MINI_2025_08_07 = "gpt-5-mini-2025-08-07", _("GPT-5 Mini v2025-08-07")
GPT_5_MINI = "gpt-5-mini", _("GPT-5 Mini Default")
id = models.UUIDField(primary_key=True, default=uuid4, editable=False)
inserted_at = models.DateTimeField(auto_now_add=True, editable=False)

View File

@@ -0,0 +1,95 @@
def _pick_task_response_component(components):
schemas = components.get("schemas", {}) or {}
for candidate in ("TaskResponse",):
if candidate in schemas:
return candidate
return None
def _extract_task_example_from_components(components):
    schemas = components.get("schemas", {}) or {}
    doc = schemas.get("TaskResponse")
    if isinstance(doc, dict) and "example" in doc:
        # Normalize so the example always carries a top-level "data" key
        example = doc["example"]
        return example if "data" in example else {"data": example}
# Fallback
return {
"data": {
"type": "tasks",
"id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
"attributes": {
"inserted_at": "2019-08-24T14:15:22Z",
"completed_at": "2019-08-24T14:15:22Z",
"name": "string",
"state": "available",
"result": None,
"task_args": None,
"metadata": None,
},
}
}
def attach_task_202_examples(result, generator, request, public): # noqa: F841
if not isinstance(result, dict):
return result
components = result.get("components", {}) or {}
task_resp_component = _pick_task_response_component(components)
task_example = _extract_task_example_from_components(components)
paths = result.get("paths", {}) or {}
for path_item in paths.values():
if not isinstance(path_item, dict):
continue
for method_obj in path_item.values():
if not isinstance(method_obj, dict):
continue
responses = method_obj.get("responses", {}) or {}
resp_202 = responses.get("202")
if not isinstance(resp_202, dict):
continue
content = resp_202.get("content", {}) or {}
jsonapi = content.get("application/vnd.api+json")
if not isinstance(jsonapi, dict):
continue
# Inject example if missing
if "examples" not in jsonapi and "example" not in jsonapi:
jsonapi["examples"] = {
"Task queued": {
"summary": "Task queued",
"value": task_example,
}
}
# Rewrite schema $ref if needed
if task_resp_component:
schema = jsonapi.get("schema")
must_replace = False
if not isinstance(schema, dict):
must_replace = True
else:
ref = schema.get("$ref")
if not ref:
must_replace = True
else:
current = ref.split("/")[-1]
if current != task_resp_component:
must_replace = True
if must_replace:
jsonapi["schema"] = {
"$ref": f"#/components/schemas/{task_resp_component}"
}
return result
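The hook above can be exercised independently of drf-spectacular; a condensed, stdlib-only sketch of the same 202-example injection (the names and the minimal spec here are illustrative, not part of the Prowler API):

```python
# Condensed sketch of the 202-example injection performed by
# attach_task_202_examples above; operates on a plain OpenAPI dict and
# omits the drf-spectacular plumbing and schema $ref rewriting.
TASK_EXAMPLE = {
    "data": {
        "type": "tasks",
        "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
        "attributes": {"state": "available"},
    }
}

def inject_202_examples(spec: dict) -> dict:
    for path_item in spec.get("paths", {}).values():
        for method_obj in path_item.values():
            if not isinstance(method_obj, dict):
                continue
            resp_202 = method_obj.get("responses", {}).get("202")
            if not isinstance(resp_202, dict):
                continue
            jsonapi = resp_202.get("content", {}).get("application/vnd.api+json")
            # Only inject when no example is present yet
            if (
                isinstance(jsonapi, dict)
                and "examples" not in jsonapi
                and "example" not in jsonapi
            ):
                jsonapi["examples"] = {
                    "Task queued": {"summary": "Task queued", "value": TASK_EXAMPLE}
                }
    return spec

spec = {
    "paths": {
        "/api/v1/scans": {
            "post": {
                "responses": {
                    "202": {"content": {"application/vnd.api+json": {"schema": {}}}}
                }
            }
        }
    }
}
inject_202_examples(spec)
```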

File diff suppressed because it is too large.

View File

@@ -6,16 +6,18 @@ from rest_framework.exceptions import NotFound, ValidationError
from api.db_router import MainRouter
from api.exceptions import InvitationTokenExpiredException
- from api.models import Invitation, Provider
+ from api.models import Integration, Invitation, Provider
from api.utils import (
get_prowler_provider_kwargs,
initialize_prowler_provider,
merge_dicts,
prowler_integration_connection_test,
prowler_provider_connection_test,
return_prowler_provider,
validate_invitation,
)
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHubConnection
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.gcp.gcp_provider import GcpProvider
from prowler.providers.kubernetes.kubernetes_provider import KubernetesProvider
@@ -197,6 +199,10 @@ class TestGetProwlerProviderKwargs:
Provider.ProviderChoices.M365.value,
{},
),
(
Provider.ProviderChoices.GITHUB.value,
{"organizations": ["provider_uid"]},
),
],
)
def test_get_prowler_provider_kwargs(self, provider_type, expected_extra_kwargs):
@@ -390,3 +396,109 @@ class TestValidateInvitation:
mock_db.get.assert_called_once_with(
token="VALID_TOKEN", email__iexact="user@example.com"
)
class TestProwlerIntegrationConnectionTest:
"""Test prowler_integration_connection_test function for SecurityHub regions reset."""
@patch("api.utils.SecurityHub")
def test_security_hub_connection_failure_resets_regions(
self, mock_security_hub_class
):
"""Test that SecurityHub connection failure resets regions to empty dict."""
# Create integration with existing regions configuration
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.AWS_SECURITY_HUB
integration.credentials = {
"aws_access_key_id": "test_key",
"aws_secret_access_key": "test_secret",
}
integration.configuration = {
"send_only_fails": True,
"regions": {
"us-east-1": True,
"us-west-2": True,
"eu-west-1": False,
"ap-south-1": False,
},
}
# Mock provider relationship
mock_provider = MagicMock()
mock_provider.uid = "123456789012"
mock_relationship = MagicMock()
mock_relationship.provider = mock_provider
integration.integrationproviderrelationship_set.first.return_value = (
mock_relationship
)
# Mock failed SecurityHub connection
mock_connection = SecurityHubConnection(
is_connected=False,
error=Exception("SecurityHub testing"),
enabled_regions=set(),
disabled_regions=set(),
)
mock_security_hub_class.test_connection.return_value = mock_connection
# Call the function
result = prowler_integration_connection_test(integration)
# Assertions
assert result.is_connected is False
assert str(result.error) == "SecurityHub testing"
# Verify regions were completely reset to empty dict
assert integration.configuration["regions"] == {}
# Verify save was called to persist the change
integration.save.assert_called_once()
# Verify test_connection was called with correct parameters
mock_security_hub_class.test_connection.assert_called_once_with(
aws_account_id="123456789012",
raise_on_exception=False,
aws_access_key_id="test_key",
aws_secret_access_key="test_secret",
)
@patch("api.utils.SecurityHub")
def test_security_hub_connection_success_saves_regions(
self, mock_security_hub_class
):
"""Test that successful SecurityHub connection saves regions correctly."""
integration = MagicMock()
integration.integration_type = Integration.IntegrationChoices.AWS_SECURITY_HUB
integration.credentials = {
"aws_access_key_id": "valid_key",
"aws_secret_access_key": "valid_secret",
}
integration.configuration = {"send_only_fails": False}
# Mock provider relationship
mock_provider = MagicMock()
mock_provider.uid = "123456789012"
mock_relationship = MagicMock()
mock_relationship.provider = mock_provider
integration.integrationproviderrelationship_set.first.return_value = (
mock_relationship
)
# Mock successful SecurityHub connection with regions
mock_connection = SecurityHubConnection(
is_connected=True,
error=None,
enabled_regions={"us-east-1", "eu-west-1"},
disabled_regions={"ap-south-1"},
)
mock_security_hub_class.test_connection.return_value = mock_connection
result = prowler_integration_connection_test(integration)
assert result.is_connected is True
# Verify regions were saved correctly
assert integration.configuration["regions"]["us-east-1"] is True
assert integration.configuration["regions"]["eu-west-1"] is True
assert integration.configuration["regions"]["ap-south-1"] is False
integration.save.assert_called_once()

View File

@@ -11,6 +11,7 @@ from api.models import Integration, Invitation, Processor, Provider, Resource
from api.v1.serializers import FindingMetadataSerializer
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.azure.azure_provider import AzureProvider
from prowler.providers.common.models import Connection
from prowler.providers.gcp.gcp_provider import GcpProvider
@@ -119,6 +120,12 @@ def get_prowler_provider_kwargs(
}
elif provider.provider == Provider.ProviderChoices.KUBERNETES.value:
prowler_provider_kwargs = {**prowler_provider_kwargs, "context": provider.uid}
elif provider.provider == Provider.ProviderChoices.GITHUB.value:
if provider.uid:
prowler_provider_kwargs = {
**prowler_provider_kwargs,
"organizations": [provider.uid],
}
if mutelist_processor:
mutelist_content = mutelist_processor.configuration.get("Mutelist", {})
@@ -196,7 +203,38 @@ def prowler_integration_connection_test(integration: Integration) -> Connection:
elif (
integration.integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
):
- pass
# Get the provider associated with this integration
provider_relationship = integration.integrationproviderrelationship_set.first()
if not provider_relationship:
return Connection(
is_connected=False, error="No provider associated with this integration"
)
credentials = (
integration.credentials
if integration.credentials
else provider_relationship.provider.secret.secret
)
connection = SecurityHub.test_connection(
aws_account_id=provider_relationship.provider.uid,
raise_on_exception=False,
**credentials,
)
# Only save regions if connection is successful
if connection.is_connected:
regions_status = {r: True for r in connection.enabled_regions}
regions_status.update({r: False for r in connection.disabled_regions})
# Save regions information in the integration configuration
integration.configuration["regions"] = regions_status
integration.save()
else:
# Reset regions information if connection fails
integration.configuration["regions"] = {}
integration.save()
return connection
elif integration.integration_type == Integration.IntegrationChoices.JIRA:
pass
elif integration.integration_type == Integration.IntegrationChoices.SLACK:

View File

@@ -52,6 +52,21 @@ class S3ConfigSerializer(BaseValidateSerializer):
resource_name = "integrations"
class SecurityHubConfigSerializer(BaseValidateSerializer):
send_only_fails = serializers.BooleanField(default=False)
archive_previous_findings = serializers.BooleanField(default=False)
regions = serializers.DictField(default=dict, read_only=True)
def to_internal_value(self, data):
validated_data = super().to_internal_value(data)
# Always initialize regions as empty dict
validated_data["regions"] = {}
return validated_data
class Meta:
resource_name = "integrations"
class AWSCredentialSerializer(BaseValidateSerializer):
role_arn = serializers.CharField(required=False)
external_id = serializers.CharField(required=False)
@@ -146,6 +161,22 @@ class IntegrationCredentialField(serializers.JSONField):
},
"required": ["bucket_name"],
},
{
"type": "object",
"title": "AWS Security Hub",
"properties": {
"send_only_fails": {
"type": "boolean",
"default": False,
"description": "If true, only findings with status 'FAIL' will be sent to Security Hub.",
},
"archive_previous_findings": {
"type": "boolean",
"default": False,
"description": "If true, archives findings that are not present in the current execution.",
},
},
},
]
}
)
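Outside of DRF, the effect of `SecurityHubConfigSerializer.to_internal_value` can be sketched in plain Python; `normalize_security_hub_config` is a hypothetical helper used only for illustration:

```python
# Plain-Python sketch of the normalization the serializer performs on write:
# client-supplied "regions" is discarded and later repopulated by the
# connection test. This helper is NOT part of the Prowler API.
def normalize_security_hub_config(data: dict) -> dict:
    return {
        "send_only_fails": bool(data.get("send_only_fails", False)),
        "archive_previous_findings": bool(data.get("archive_previous_findings", False)),
        # Always reset: "regions" is read-only from the client's point of view.
        "regions": {},
    }
```

Even if a client posts a `regions` mapping, it is dropped on the way in, which is why the serializer declares the field as `read_only`.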

View File

@@ -46,6 +46,7 @@ from api.v1.serializer_utils.integrations import (
IntegrationConfigField,
IntegrationCredentialField,
S3ConfigSerializer,
SecurityHubConfigSerializer,
)
from api.v1.serializer_utils.processors import ProcessorConfigField
from api.v1.serializer_utils.providers import ProviderSecretField
@@ -1951,13 +1952,44 @@ class ScheduleDailyCreateSerializer(serializers.Serializer):
class BaseWriteIntegrationSerializer(BaseWriteSerializer):
def validate(self, attrs):
- if Integration.objects.filter(
-     configuration=attrs.get("configuration")
- ).exists():
+ if (
+     attrs.get("integration_type") == Integration.IntegrationChoices.AMAZON_S3
+     and Integration.objects.filter(
+         configuration=attrs.get("configuration")
+     ).exists()
+ ):
raise serializers.ValidationError(
-     {"name": "This integration already exists."}
+     {"configuration": "This integration already exists."}
)
# Check if any provider already has a SecurityHub integration
integration_type = attrs.get("integration_type")
if hasattr(self, "instance") and self.instance and not integration_type:
integration_type = self.instance.integration_type
if (
integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
and "providers" in attrs
):
providers = attrs.get("providers", [])
tenant_id = self.context.get("tenant_id")
for provider in providers:
# For updates, exclude the current instance from the check
query = IntegrationProviderRelationship.objects.filter(
provider=provider,
integration__integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
tenant_id=tenant_id,
)
if hasattr(self, "instance") and self.instance:
query = query.exclude(integration=self.instance)
if query.exists():
raise serializers.ValidationError(
{
"providers": f"Provider {provider.id} already has a Security Hub integration. Only one Security Hub integration is allowed per provider."
}
)
return super().validate(attrs)
@staticmethod
@@ -1970,14 +2002,22 @@ class BaseWriteIntegrationSerializer(BaseWriteSerializer):
if integration_type == Integration.IntegrationChoices.AMAZON_S3:
config_serializer = S3ConfigSerializer
credentials_serializers = [AWSCredentialSerializer]
- # TODO: This will be required for AWS Security Hub
- # if providers and not all(
- #     provider.provider == Provider.ProviderChoices.AWS
- #     for provider in providers
- # ):
- #     raise serializers.ValidationError(
- #         {"providers": "All providers must be AWS for the S3 integration."}
- #     )
+ elif integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB:
+     if providers:
+         if len(providers) > 1:
+             raise serializers.ValidationError(
+                 {
+                     "providers": "Only one provider is supported for the Security Hub integration."
+                 }
+             )
+         if providers[0].provider != Provider.ProviderChoices.AWS:
+             raise serializers.ValidationError(
+                 {
+                     "providers": "The provider must be AWS type for the Security Hub integration."
+                 }
+             )
+     config_serializer = SecurityHubConfigSerializer
+     credentials_serializers = [AWSCredentialSerializer]
else:
raise serializers.ValidationError(
{
@@ -2077,6 +2117,16 @@ class IntegrationCreateSerializer(BaseWriteIntegrationSerializer):
configuration = attrs.get("configuration")
credentials = attrs.get("credentials")
if (
not providers
and integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB
):
raise serializers.ValidationError(
{
"providers": "At least one provider is required for the Security Hub integration."
}
)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
@@ -2131,16 +2181,15 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
}
def validate(self, attrs):
- super().validate(attrs)
integration_type = self.instance.integration_type
providers = attrs.get("providers")
configuration = attrs.get("configuration") or self.instance.configuration
credentials = attrs.get("credentials") or self.instance.credentials
+ validated_attrs = super().validate(attrs)
self.validate_integration_data(
integration_type, providers, configuration, credentials
)
- validated_attrs = super().validate(attrs)
return validated_attrs
def update(self, instance, validated_data):
@@ -2155,6 +2204,13 @@ class IntegrationUpdateSerializer(BaseWriteIntegrationSerializer):
]
IntegrationProviderRelationship.objects.bulk_create(new_relationships)
# Preserve regions field for Security Hub integrations
if instance.integration_type == Integration.IntegrationChoices.AWS_SECURITY_HUB:
if "configuration" in validated_data:
# Preserve the existing regions field if it exists
existing_regions = instance.configuration.get("regions", {})
validated_data["configuration"]["regions"] = existing_regions
return super().update(instance, validated_data)

View File

@@ -293,7 +293,7 @@ class SchemaView(SpectacularAPIView):
def get(self, request, *args, **kwargs):
spectacular_settings.TITLE = "Prowler API"
- spectacular_settings.VERSION = "1.11.2"
+ spectacular_settings.VERSION = "1.12.0"
spectacular_settings.DESCRIPTION = (
"Prowler API specification.\n\nThis file is auto-generated."
)
@@ -313,6 +313,11 @@ class SchemaView(SpectacularAPIView):
"description": "Endpoints for tenant invitations management, allowing retrieval and filtering of "
"invitations, creating new invitations, accepting and revoking them.",
},
{
"name": "Role",
"description": "Endpoints for managing RBAC roles within tenants, allowing creation, retrieval, "
"updating, and deletion of role configurations and permissions.",
},
{
"name": "Provider",
"description": "Endpoints for managing providers (AWS, GCP, Azure, etc...).",
@@ -321,10 +326,20 @@ class SchemaView(SpectacularAPIView):
"name": "Provider Group",
"description": "Endpoints for managing provider groups.",
},
{
"name": "Task",
"description": "Endpoints for task management, allowing retrieval of task status and "
"revoking tasks that have not started.",
},
{
"name": "Scan",
"description": "Endpoints for triggering manual scans and viewing scan results.",
},
{
"name": "Schedule",
"description": "Endpoints for managing scan schedules, allowing configuration of automated "
"scans with different scheduling options.",
},
{
"name": "Resource",
"description": "Endpoints for managing resources discovered by scans, allowing "
@@ -336,8 +351,9 @@ class SchemaView(SpectacularAPIView):
"findings that result from scans.",
},
{
"name": "Overview",
"description": "Endpoints for retrieving aggregated summaries of resources from the system.",
"name": "Processor",
"description": "Endpoints for managing post-processors used to process Prowler findings, including "
"registration, configuration, and deletion of post-processing actions.",
},
{
"name": "Compliance Overview",
@@ -345,9 +361,8 @@ class SchemaView(SpectacularAPIView):
" compliance framework ID.",
},
{
"name": "Task",
"description": "Endpoints for task management, allowing retrieval of task status and "
"revoking tasks that have not started.",
"name": "Overview",
"description": "Endpoints for retrieving aggregated summaries of resources from the system.",
},
{
"name": "Integration",
@@ -357,12 +372,13 @@ class SchemaView(SpectacularAPIView):
{
"name": "Lighthouse",
"description": "Endpoints for managing Lighthouse configurations, including creation, retrieval, "
"updating, and deletion of configurations such as OpenAI keys, models, and business context.",
"updating, and deletion of configurations such as OpenAI keys, models, and business "
"context.",
},
{
"name": "Processor",
"description": "Endpoints for managing post-processors used to process Prowler findings, including "
"registration, configuration, and deletion of post-processing actions.",
"name": "SAML",
"description": "Endpoints for Single Sign-On authentication management via SAML for seamless user "
"authentication.",
},
]
return super().get(request, *args, **kwargs)
@@ -3033,7 +3049,9 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Compliance overviews metadata obtained successfully",
response=ComplianceOverviewMetadataSerializer,
),
- 202: OpenApiResponse(description="The task is in progress"),
+ 202: OpenApiResponse(
+     description="The task is in progress", response=TaskSerializer
+ ),
500: OpenApiResponse(
description="Compliance overviews generation task failed"
),
@@ -3065,7 +3083,9 @@ class RoleProviderGroupRelationshipView(RelationshipView, BaseRLSViewSet):
description="Compliance requirement details obtained successfully",
response=ComplianceOverviewDetailSerializer(many=True),
),
- 202: OpenApiResponse(description="The task is in progress"),
+ 202: OpenApiResponse(
+     description="The task is in progress", response=TaskSerializer
+ ),
500: OpenApiResponse(
description="Compliance overviews generation task failed"
),

View File

@@ -116,6 +116,9 @@ SPECTACULAR_SETTINGS = {
"PREPROCESSING_HOOKS": [
"drf_spectacular_jsonapi.hooks.fix_nested_path_parameters",
],
"POSTPROCESSING_HOOKS": [
"api.schema_hooks.attach_task_202_examples",
],
"TITLE": "API Reference - Prowler",
}

View File

@@ -13,8 +13,10 @@ from api.models import Scan
from prowler.config.config import (
csv_file_suffix,
html_file_suffix,
json_asff_file_suffix,
json_ocsf_file_suffix,
)
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.aws_well_architected.aws_well_architected import (
AWSWellArchitected,
)
@@ -109,6 +111,7 @@ OUTPUT_FORMATS_MAPPING = {
"kwargs": {},
},
"json-ocsf": {"class": OCSF, "suffix": json_ocsf_file_suffix, "kwargs": {}},
"json-asff": {"class": ASFF, "suffix": json_asff_file_suffix, "kwargs": {}},
"html": {"class": HTML, "suffix": html_file_suffix, "kwargs": {"stats": {}}},
}
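A mapping like `OUTPUT_FORMATS_MAPPING` lets the export task dispatch to one writer class per format; a minimal sketch with a stand-in writer class (not Prowler's real CSV/OCSF/ASFF classes):

```python
# Sketch of how a format mapping can drive output generation.
# DummyWriter stands in for Prowler's real output classes.
class DummyWriter:
    def __init__(self, findings, **kwargs):
        self.findings = findings
        self.kwargs = kwargs

FORMATS = {
    "csv": {"class": DummyWriter, "suffix": ".csv", "kwargs": {}},
    "json-asff": {"class": DummyWriter, "suffix": ".asff.json", "kwargs": {}},
    "html": {"class": DummyWriter, "suffix": ".html", "kwargs": {"stats": {}}},
}

def build_writers(findings, formats):
    # Instantiate one writer per format, passing its format-specific kwargs
    writers = {}
    for mode, cfg in formats.items():
        writers[mode] = (cfg["class"](findings, **cfg["kwargs"]), cfg["suffix"])
    return writers

writers = build_writers(findings=[{"uid": "f-1"}], formats=FORMATS)
```

New formats (such as the `json-asff` entry added here) only need a new mapping entry, not a change to the export loop itself.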

View File

@@ -2,15 +2,21 @@ import os
from glob import glob
from celery.utils.log import get_task_logger
from config.django.base import DJANGO_FINDINGS_BATCH_SIZE
from tasks.utils import batched
from api.db_utils import rls_transaction
from api.models import Integration
from api.models import Finding, Integration, Provider
from api.utils import initialize_prowler_provider
from prowler.lib.outputs.asff.asff import ASFF
from prowler.lib.outputs.compliance.generic.generic import GenericCompliance
from prowler.lib.outputs.csv.csv import CSV
from prowler.lib.outputs.finding import Finding as FindingOutput
from prowler.lib.outputs.html.html import HTML
from prowler.lib.outputs.ocsf.ocsf import OCSF
from prowler.providers.aws.aws_provider import AwsProvider
from prowler.providers.aws.lib.s3.s3 import S3
from prowler.providers.aws.lib.security_hub.security_hub import SecurityHub
from prowler.providers.common.models import Connection
logger = get_task_logger(__name__)
@@ -154,3 +160,270 @@ def upload_s3_integration(
except Exception as e:
logger.error(f"S3 integrations failed for provider {provider_id}: {str(e)}")
return False
def get_security_hub_client_from_integration(
integration: Integration, tenant_id: str, findings: list
) -> tuple[bool, SecurityHub | Connection]:
"""
Create and return a SecurityHub client using AWS credentials from an integration.
Args:
integration (Integration): The integration to get the Security Hub client from.
tenant_id (str): The tenant identifier.
findings (list): List of findings in ASFF format to send to Security Hub.
Returns:
tuple[bool, SecurityHub | Connection]: A tuple containing a boolean indicating
if the connection was successful and the SecurityHub client or connection object.
"""
# Get the provider associated with this integration
with rls_transaction(tenant_id):
provider_relationship = integration.integrationproviderrelationship_set.first()
if not provider_relationship:
return False, Connection(
    is_connected=False, error="No provider associated with this integration"
)
provider_uid = provider_relationship.provider.uid
provider_secret = provider_relationship.provider.secret.secret
credentials = (
integration.credentials if integration.credentials else provider_secret
)
connection = SecurityHub.test_connection(
aws_account_id=provider_uid,
raise_on_exception=False,
**credentials,
)
if connection.is_connected:
all_security_hub_regions = AwsProvider.get_available_aws_service_regions(
"securityhub", connection.partition
)
# Create regions status dictionary
regions_status = {}
for region in set(all_security_hub_regions):
regions_status[region] = region in connection.enabled_regions
# Save regions information in the integration configuration
with rls_transaction(tenant_id):
integration.configuration["regions"] = regions_status
integration.save()
# Create SecurityHub client with all necessary parameters
security_hub = SecurityHub(
aws_account_id=provider_uid,
findings=findings,
send_only_fails=integration.configuration.get("send_only_fails", False),
aws_security_hub_available_regions=list(connection.enabled_regions),
**credentials,
)
return True, security_hub
else:
# Reset regions information if connection fails
with rls_transaction(tenant_id):
integration.configuration["regions"] = {}
integration.save()
return False, connection
def upload_security_hub_integration(
tenant_id: str, provider_id: str, scan_id: str
) -> bool:
"""
Upload findings to AWS Security Hub using configured integrations.
This function retrieves findings from the database, transforms them to ASFF format,
and sends them to AWS Security Hub using the configured integration credentials.
Args:
tenant_id (str): The tenant identifier.
provider_id (str): The provider identifier.
scan_id (str): The scan identifier for which to send findings.
Returns:
bool: True if all integrations executed successfully, False otherwise.
"""
logger.info(f"Processing Security Hub integrations for provider {provider_id}")
try:
with rls_transaction(tenant_id):
# Get Security Hub integrations for this provider
integrations = list(
Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
)
if not integrations:
logger.error(
f"No Security Hub integrations found for provider {provider_id}"
)
return False
# Get the provider object
provider = Provider.objects.get(id=provider_id)
# Initialize prowler provider for finding transformation
prowler_provider = initialize_prowler_provider(provider)
# Process each Security Hub integration
integration_executions = 0
total_findings_sent = {} # Track findings sent per integration
for integration in integrations:
try:
# Initialize Security Hub client for this integration
# We'll create the client once and reuse it for all batches
security_hub_client = None
send_only_fails = integration.configuration.get(
"send_only_fails", False
)
total_findings_sent[integration.id] = 0
# Process findings in batches to avoid memory issues
has_findings = False
batch_number = 0
with rls_transaction(tenant_id):
qs = (
Finding.all_objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.order_by("uid")
.iterator()
)
for batch, _ in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
batch_number += 1
has_findings = True
# Transform findings for this batch
transformed_findings = [
FindingOutput.transform_api_finding(
finding, prowler_provider
)
for finding in batch
]
# Convert to ASFF format
asff_transformer = ASFF(
findings=transformed_findings,
file_path="",
file_extension="json",
)
asff_transformer.transform(transformed_findings)
# Get the batch of ASFF findings
batch_asff_findings = asff_transformer.data
if batch_asff_findings:
# Create Security Hub client for first batch or reuse existing
if not security_hub_client:
connected, security_hub = (
get_security_hub_client_from_integration(
integration, tenant_id, batch_asff_findings
)
)
if not connected:
logger.error(
f"Security Hub connection failed for integration {integration.id}: {security_hub.error}"
)
integration.connected = False
integration.save()
break # Skip this integration
security_hub_client = security_hub
logger.info(
f"Sending {'fail' if send_only_fails else 'all'} findings to Security Hub via integration {integration.id}"
)
else:
# Update findings in existing client for this batch
security_hub_client._findings_per_region = (
security_hub_client.filter(
batch_asff_findings, send_only_fails
)
)
# Send this batch to Security Hub
try:
findings_sent = (
security_hub_client.batch_send_to_security_hub()
)
total_findings_sent[integration.id] += findings_sent
if findings_sent > 0:
logger.debug(
f"Sent batch {batch_number} with {findings_sent} findings to Security Hub"
)
except Exception as batch_error:
logger.error(
f"Failed to send batch {batch_number} to Security Hub: {str(batch_error)}"
)
# Clear memory after processing each batch
asff_transformer._data.clear()
del batch_asff_findings
del transformed_findings
if not has_findings:
logger.info(
f"No findings to send to Security Hub for scan {scan_id}"
)
integration_executions += 1
elif security_hub_client:
if total_findings_sent[integration.id] > 0:
logger.info(
f"Successfully sent {total_findings_sent[integration.id]} total findings to Security Hub via integration {integration.id}"
)
integration_executions += 1
else:
logger.warning(
f"No findings were sent to Security Hub via integration {integration.id}"
)
# Archive previous findings if configured to do so
if integration.configuration.get(
"archive_previous_findings", False
):
logger.info(
f"Archiving previous findings in Security Hub via integration {integration.id}"
)
try:
findings_archived = (
security_hub_client.archive_previous_findings()
)
logger.info(
f"Successfully archived {findings_archived} previous findings in Security Hub"
)
except Exception as archive_error:
logger.warning(
f"Failed to archive previous findings: {str(archive_error)}"
)
except Exception as e:
logger.error(
f"Security Hub integration {integration.id} failed: {str(e)}"
)
continue
result = integration_executions == len(integrations)
if result:
logger.info(
f"All Security Hub integrations completed successfully for provider {provider_id}"
)
else:
logger.error(
f"Some Security Hub integrations failed for provider {provider_id}"
)
return result
except Exception as e:
logger.error(
f"Security Hub integrations failed for provider {provider_id}: {str(e)}"
)
return False
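The per-batch send loop above isolates failures: an exception while sending one batch is logged and the remaining batches are still processed, rather than aborting the whole integration. A minimal sketch of that pattern, using a hypothetical `client.send` method in place of `batch_send_to_security_hub`:

```python
import logging

logger = logging.getLogger(__name__)

def send_batches(client, batches):
    """Send each batch independently; a failure in one batch is logged
    and skipped rather than aborting the remaining batches."""
    total = 0
    for number, batch in enumerate(batches, start=1):
        try:
            total += client.send(batch)  # hypothetical client method
        except Exception as exc:
            logger.error("Failed to send batch %d: %s", number, exc)
    return total
```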


@@ -21,7 +21,10 @@ from tasks.jobs.export import (
_generate_output_directory,
_upload_to_s3,
)
from tasks.jobs.integrations import upload_s3_integration
from tasks.jobs.integrations import (
upload_s3_integration,
upload_security_hub_integration,
)
from tasks.jobs.scan import (
aggregate_findings,
create_compliance_requirements,
@@ -62,6 +65,7 @@ def _perform_scan_complete_tasks(tenant_id: str, scan_id: str, provider_id: str)
check_integrations_task.si(
tenant_id=tenant_id,
provider_id=provider_id,
scan_id=scan_id,
),
).apply_async()
@@ -323,12 +327,30 @@ def generate_outputs_task(scan_id: str, provider_id: str, tenant_id: str):
ScanSummary.objects.filter(scan_id=scan_id)
)
qs = Finding.all_objects.filter(scan_id=scan_id).order_by("uid").iterator()
# Check if we need to generate ASFF output for AWS providers with SecurityHub integration
generate_asff = False
if provider_type == "aws":
security_hub_integrations = Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
generate_asff = security_hub_integrations.exists()
qs = (
Finding.all_objects.filter(tenant_id=tenant_id, scan_id=scan_id)
.order_by("uid")
.iterator()
)
for batch, is_last in batched(qs, DJANGO_FINDINGS_BATCH_SIZE):
fos = [FindingOutput.transform_api_finding(f, prowler_provider) for f in batch]
# Outputs
for mode, cfg in OUTPUT_FORMATS_MAPPING.items():
# Skip ASFF generation if not needed
if mode == "json-asff" and not generate_asff:
continue
cls = cfg["class"]
suffix = cfg["suffix"]
extra = cfg.get("kwargs", {}).copy()
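The loop above consumes findings through `batched(qs, DJANGO_FINDINGS_BATCH_SIZE)`, which yields each batch together with an is-last flag so the writers can finalize output files on the final batch. A minimal sketch of such a helper (the actual implementation lives elsewhere in the codebase):

```python
from itertools import islice

def batched(iterator, batch_size):
    """Yield (batch, is_last) pairs from an iterator, where is_last is
    True only for the final batch. Illustrative sketch of the helper
    used by generate_outputs_task."""
    batch = list(islice(iterator, batch_size))
    while batch:
        # Peek ahead one batch so we know whether the current one is last
        next_batch = list(islice(iterator, batch_size))
        yield batch, not next_batch
        batch = next_batch
```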
@@ -474,17 +496,19 @@ def check_lighthouse_connection_task(lighthouse_config_id: str, tenant_id: str =
@shared_task(name="integration-check")
def check_integrations_task(tenant_id: str, provider_id: str):
def check_integrations_task(tenant_id: str, provider_id: str, scan_id: str = None):
"""
Check and execute all configured integrations for a provider.
Args:
tenant_id (str): The tenant identifier
provider_id (str): The provider identifier
scan_id (str, optional): The scan identifier for integrations that need scan data
"""
logger.info(f"Checking integrations for provider {provider_id}")
try:
integration_tasks = []
with rls_transaction(tenant_id):
integrations = Integration.objects.filter(
integrationproviderrelationship__provider_id=provider_id,
@@ -495,7 +519,16 @@ def check_integrations_task(tenant_id: str, provider_id: str):
logger.info(f"No integrations configured for provider {provider_id}")
return {"integrations_processed": 0}
integration_tasks = []
# Security Hub integration
security_hub_integrations = integrations.filter(
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB
)
if security_hub_integrations.exists():
integration_tasks.append(
security_hub_integration_task.s(
tenant_id=tenant_id, provider_id=provider_id, scan_id=scan_id
)
)
# TODO: Add other integration types here
# slack_integrations = integrations.filter(
@@ -541,3 +574,24 @@ def s3_integration_task(
output_directory (str): Path to the directory containing output files
"""
return upload_s3_integration(tenant_id, provider_id, output_directory)
@shared_task(
base=RLSTask,
name="integration-security-hub",
queue="integrations",
)
def security_hub_integration_task(
tenant_id: str,
provider_id: str,
scan_id: str,
):
"""
Process Security Hub integrations for a provider.
Args:
tenant_id (str): The tenant identifier
provider_id (str): The provider identifier
scan_id (str): The scan identifier
"""
return upload_security_hub_integration(tenant_id, provider_id, scan_id)
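`check_integrations_task` builds one Celery signature per matching integration type and dispatches them as a group. Setting Celery aside, the selection logic can be sketched with a plain dict-based representation (names here are illustrative, not the actual API):

```python
def plan_integration_tasks(integrations, tenant_id, provider_id, scan_id):
    """Sketch of the dispatch logic: return the kwargs of each
    integration task that would be enqueued."""
    tasks = []
    # Security Hub integration
    if any(i["type"] == "aws_security_hub" for i in integrations):
        tasks.append({
            "task": "integration-security-hub",
            "tenant_id": tenant_id,
            "provider_id": provider_id,
            "scan_id": scan_id,
        })
    # Other integration types (S3, Slack, ...) would be appended here
    return tasks
```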

File diff suppressed because it is too large


@@ -7,6 +7,7 @@ from tasks.tasks import (
check_integrations_task,
generate_outputs_task,
s3_integration_task,
security_hub_integration_task,
)
from api.models import Integration
@@ -521,31 +522,68 @@ class TestCheckIntegrationsTask:
enabled=True,
)
@patch("tasks.tasks.security_hub_integration_task")
@patch("tasks.tasks.group")
@patch("tasks.tasks.rls_transaction")
@patch("tasks.tasks.Integration.objects.filter")
def test_check_integrations_s3_success(
self, mock_integration_filter, mock_rls, mock_group
def test_check_integrations_security_hub_success(
self, mock_integration_filter, mock_rls, mock_group, mock_security_hub_task
):
# Mock that we have some integrations
mock_integration_filter.return_value.exists.return_value = True
"""Test that SecurityHub integrations are processed correctly."""
# Mock that we have SecurityHub integrations
mock_integrations = MagicMock()
mock_integrations.exists.return_value = True
# Mock SecurityHub integrations to return existing integrations
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = True
# Set up the filter chain
mock_integration_filter.return_value = mock_integrations
mock_integrations.filter.return_value = mock_security_hub_integrations
# Mock the task signature
mock_task_signature = MagicMock()
mock_security_hub_task.s.return_value = mock_task_signature
# Mock group job
mock_job = MagicMock()
mock_group.return_value = mock_job
# Ensure rls_transaction is mocked
mock_rls.return_value.__enter__.return_value = None
# Since the current implementation doesn't actually create tasks yet (TODO comment),
# we test that no tasks are created but the function returns the correct count
# Execute the function
result = check_integrations_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id="test-scan-id",
)
assert result == {"integrations_processed": 0}
# Should process 1 SecurityHub integration
assert result == {"integrations_processed": 1}
# Verify the integration filter was called
mock_integration_filter.assert_called_once_with(
integrationproviderrelationship__provider_id=self.provider_id,
enabled=True,
)
# group should not be called since no integration tasks are created yet
mock_group.assert_not_called()
# Verify SecurityHub integrations were filtered
mock_integrations.filter.assert_called_once_with(
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB
)
# Verify SecurityHub task was created with correct parameters
mock_security_hub_task.s.assert_called_once_with(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id="test-scan-id",
)
# Verify group was called and job was executed
mock_group.assert_called_once_with([mock_task_signature])
mock_job.apply_async.assert_called_once()
@patch("tasks.tasks.rls_transaction")
@patch("tasks.tasks.Integration.objects.filter")
@@ -567,6 +605,369 @@ class TestCheckIntegrationsTask:
enabled=True,
)
@patch("tasks.tasks.s3_integration_task")
@patch("tasks.tasks.Integration.objects.filter")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_with_asff_for_aws_with_security_hub(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
mock_integration_filter,
mock_s3_task,
):
"""Test that ASFF output is generated for AWS providers with SecurityHub integration."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock AWS provider
mock_provider = MagicMock()
mock_provider.uid = "aws-account-123"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
# Mock SecurityHub integration exists
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = True
mock_integration_filter.return_value = mock_security_hub_integrations
# Mock s3_integration_task
mock_s3_task.apply_async.return_value.get.return_value = True
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was created for AWS with SecurityHub
assert "asff" in created_writers, "ASFF writer should be created"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
# Verify SecurityHub integration was checked
assert mock_integration_filter.call_count == 2
mock_integration_filter.assert_any_call(
integrationproviderrelationship__provider_id=self.provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
assert result == {"upload": True}
@patch("tasks.tasks.s3_integration_task")
@patch("tasks.tasks.Integration.objects.filter")
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_no_asff_for_aws_without_security_hub(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
mock_integration_filter,
mock_s3_task,
):
"""Test that ASFF output is NOT generated for AWS providers without SecurityHub integration."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock AWS provider
mock_provider = MagicMock()
mock_provider.uid = "aws-account-123"
mock_provider.provider = "aws"
mock_provider_get.return_value = mock_provider
# Mock NO SecurityHub integration
mock_security_hub_integrations = MagicMock()
mock_security_hub_integrations.exists.return_value = False
mock_integration_filter.return_value = mock_security_hub_integrations
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was NOT created when no SecurityHub integration
assert "asff" not in created_writers, "ASFF writer should NOT be created"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
# Verify SecurityHub integration was checked
assert mock_integration_filter.call_count == 2
mock_integration_filter.assert_any_call(
integrationproviderrelationship__provider_id=self.provider_id,
integration_type=Integration.IntegrationChoices.AWS_SECURITY_HUB,
enabled=True,
)
assert result == {"upload": True}
@patch("tasks.tasks.ScanSummary.objects.filter")
@patch("tasks.tasks.Provider.objects.get")
@patch("tasks.tasks.initialize_prowler_provider")
@patch("tasks.tasks.Compliance.get_bulk")
@patch("tasks.tasks.get_compliance_frameworks")
@patch("tasks.tasks.Finding.all_objects.filter")
@patch("tasks.tasks._generate_output_directory")
@patch("tasks.tasks.FindingOutput._transform_findings_stats")
@patch("tasks.tasks.FindingOutput.transform_api_finding")
@patch("tasks.tasks._compress_output_files")
@patch("tasks.tasks._upload_to_s3")
@patch("tasks.tasks.Scan.all_objects.filter")
@patch("tasks.tasks.rmtree")
def test_generate_outputs_no_asff_for_non_aws_provider(
self,
mock_rmtree,
mock_scan_update,
mock_upload,
mock_compress,
mock_transform_finding,
mock_transform_stats,
mock_generate_dir,
mock_findings,
mock_get_frameworks,
mock_compliance_bulk,
mock_initialize_provider,
mock_provider_get,
mock_scan_summary,
):
"""Test that ASFF output is NOT generated for non-AWS providers (e.g., Azure, GCP)."""
# Setup
mock_scan_summary_qs = MagicMock()
mock_scan_summary_qs.exists.return_value = True
mock_scan_summary.return_value = mock_scan_summary_qs
# Mock Azure provider (non-AWS)
mock_provider = MagicMock()
mock_provider.uid = "azure-subscription-123"
mock_provider.provider = "azure" # Non-AWS provider
mock_provider_get.return_value = mock_provider
# Mock other necessary components
mock_initialize_provider.return_value = MagicMock()
mock_compliance_bulk.return_value = {}
mock_get_frameworks.return_value = []
mock_generate_dir.return_value = ("out-dir", "comp-dir")
mock_transform_stats.return_value = {"stats": "data"}
# Mock findings
mock_finding = MagicMock()
mock_findings.return_value.order_by.return_value.iterator.return_value = [
[mock_finding],
True,
]
mock_transform_finding.return_value = MagicMock(compliance={})
# Track which output formats were created
created_writers = {}
def track_writer_creation(cls_type):
def factory(*args, **kwargs):
writer = MagicMock()
writer._data = []
writer.transform = MagicMock()
writer.batch_write_data_to_file = MagicMock()
created_writers[cls_type] = writer
return writer
return factory
# Mock OUTPUT_FORMATS_MAPPING with tracking
with patch(
"tasks.tasks.OUTPUT_FORMATS_MAPPING",
{
"csv": {
"class": track_writer_creation("csv"),
"suffix": ".csv",
"kwargs": {},
},
"json-asff": {
"class": track_writer_creation("asff"),
"suffix": ".asff.json",
"kwargs": {},
},
"json-ocsf": {
"class": track_writer_creation("ocsf"),
"suffix": ".ocsf.json",
"kwargs": {},
},
},
):
mock_compress.return_value = "/tmp/compressed.zip"
mock_upload.return_value = "s3://bucket/file.zip"
# Execute
result = generate_outputs_task(
scan_id=self.scan_id,
provider_id=self.provider_id,
tenant_id=self.tenant_id,
)
# Verify ASFF was NOT created for non-AWS provider
assert (
"asff" not in created_writers
), "ASFF writer should NOT be created for non-AWS providers"
assert "csv" in created_writers, "CSV writer should be created"
assert "ocsf" in created_writers, "OCSF writer should be created"
assert result == {"upload": True}
@patch("tasks.tasks.upload_s3_integration")
def test_s3_integration_task_success(self, mock_upload):
mock_upload.return_value = True
@@ -598,3 +999,33 @@ class TestCheckIntegrationsTask:
mock_upload.assert_called_once_with(
self.tenant_id, self.provider_id, output_directory
)
@patch("tasks.tasks.upload_security_hub_integration")
def test_security_hub_integration_task_success(self, mock_upload):
"""Test successful SecurityHub integration task execution."""
mock_upload.return_value = True
scan_id = "test-scan-123"
result = security_hub_integration_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id=scan_id,
)
assert result is True
mock_upload.assert_called_once_with(self.tenant_id, self.provider_id, scan_id)
@patch("tasks.tasks.upload_security_hub_integration")
def test_security_hub_integration_task_failure(self, mock_upload):
"""Test SecurityHub integration task handling failure."""
mock_upload.return_value = False
scan_id = "test-scan-123"
result = security_hub_integration_task(
tenant_id=self.tenant_id,
provider_id=self.provider_id,
scan_id=scan_id,
)
assert result is False
mock_upload.assert_called_once_with(self.tenant_id, self.provider_id, scan_id)


@@ -1,24 +0,0 @@
---
hide:
- toc
---
# About
## Author
Prowler was created by **Toni de la Fuente** in 2016.
| ![](img/toni.png)<br>[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/toniblyx.svg?style=social&label=Follow%20%40toniblyx)](https://twitter.com/toniblyx) [![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/prowlercloud.svg?style=social&label=Follow%20%40prowlercloud)](https://twitter.com/prowlercloud)|
|:--:|
| <b>Toni de la Fuente </b>|
## Maintainers
Prowler is maintained by the Engineers of the **Prowler Team** :
| ![](img/nacho.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/NachoRivCor.svg?style=social&label=Follow%20%40NachoRivCor)](https://twitter.com/NachoRivCor) | ![](img/sergio.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/sergargar1.svg?style=social&label=Follow%20%40sergargar1)](https://twitter.com/sergargar1) |![](img/pepe.png)[![Twitter URL](https://img.shields.io/twitter/url/https/twitter.com/jfagoagas.svg?style=social&label=Follow%20%40jfagoagas)](https://twitter.com/jfagoagas) |
|:--:|:--:|:--:
| <b>Nacho Rivera</b>| <b>Sergio Garcia</b>| <b>Pepe Fagoaga</b>|
## License
Prowler is licensed as **Apache License 2.0** as specified in each file. You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>


@@ -0,0 +1,61 @@
## Access Prowler App
After [installation](../installation/prowler-app.md), navigate to [http://localhost:3000](http://localhost:3000) and sign up with email and password.
<img src="../../img/sign-up-button.png" alt="Sign Up Button" width="320"/>
<img src="../../img/sign-up.png" alt="Sign Up" width="285"/>
???+ note "User creation and default tenant behavior"
When creating a new user, the behavior depends on whether an invitation is provided:
- **Without an invitation**:
- A new tenant is automatically created.
- The new user is assigned to this tenant.
- A set of **RBAC admin permissions** is generated and assigned to the user for the newly-created tenant.
- **With an invitation**: The user is added to the specified tenant with the permissions defined in the invitation.
This mechanism ensures that the first user in a newly created tenant has administrative permissions within that tenant.
## Log In
Access Prowler App by logging in with **email and password**.
<img src="../../img/log-in.png" alt="Log In" width="285"/>
## Add Cloud Provider
Configure a cloud provider for scanning:
1. Navigate to `Settings > Cloud Providers` and click `Add Account`.
2. Select the cloud provider.
3. Enter the provider's identifier (Optional: Add an alias):
- **AWS**: Account ID
- **GCP**: Project ID
- **Azure**: Subscription ID
- **Kubernetes**: Cluster ID
- **M365**: Domain ID
4. Follow the guided instructions to add and authenticate your credentials.
## Start a Scan
Once credentials are successfully added and validated, Prowler initiates a scan of your cloud environment.
Click `Go to Scans` to monitor progress.
## View Results
Review findings during scan execution in the following sections:
- **Overview**: Provides a high-level summary of your scans.
<img src="../../img/overview.png" alt="Overview" width="700"/>
- **Compliance**: Displays compliance insights based on security frameworks.
<img src="../../img/compliance.png" alt="Compliance" width="700"/>
> For detailed usage instructions, refer to the [Prowler App Guide](../tutorials/prowler-app.md).
???+ note
Prowler will automatically scan all configured providers every **24 hours**, ensuring your cloud environment stays continuously monitored.


@@ -0,0 +1,257 @@
## Running Prowler
Running Prowler requires specifying the provider (e.g., `aws`, `gcp`, `azure`, `m365`, `github`, or `kubernetes`):
???+ note
If no provider is specified, AWS is used by default for backward compatibility with Prowler v2.
```console
prowler <provider>
```
![Prowler Execution](../img/short-display.png)
???+ note
Running the `prowler` command without options uses environment variable credentials. Refer to the Authentication section of each provider for credential configuration details.
## Verbose Output
If you prefer the former verbose output, use `--verbose`. Minimal output is displayed unless verbosity is enabled, so this flag lets you see more information while Prowler is running.
## Report Generation
By default, Prowler generates CSV, JSON-OCSF, and HTML reports. To generate a JSON-ASFF report (used by AWS Security Hub), specify `-M` or `--output-modes`:
```console
prowler <provider> -M csv json-asff json-ocsf html
```
The HTML report is saved in the output directory, alongside other reports. It will look like this:
![Prowler Execution](../img/html-output.png)
## Listing Available Checks and Services
List all available checks or services within a provider using `-l`/`--list-checks` or `--list-services`.
```console
prowler <provider> --list-checks
prowler <provider> --list-services
```
## Running Specific Checks or Services
Execute specific checks or services using `-c`/`--checks` or `-s`/`--services`:
```console
prowler azure --checks storage_blob_public_access_level_is_disabled
prowler aws --services s3 ec2
prowler gcp --services iam compute
prowler kubernetes --services etcd apiserver
```
## Excluding Checks and Services
Checks and services can be excluded with `-e`/`--excluded-checks` or `--excluded-services`:
```console
prowler aws --excluded-checks s3_bucket_public_access
prowler azure --excluded-services defender iam
prowler gcp --excluded-services kms
prowler kubernetes --excluded-services controllermanager
```
## Additional Options
Explore more advanced time-saving execution methods in the [Miscellaneous](../tutorials/misc.md) section.
Access the help menu and view all available options with `-h`/`--help`:
```console
prowler --help
```
## AWS
Use a custom AWS profile with `-p`/`--profile` and/or specific AWS regions with `-f`/`--filter-region`:
```console
prowler aws --profile custom-profile -f us-east-1 eu-south-2
```
???+ note
By default, `prowler` will scan all AWS regions.
See more details about AWS Authentication in the [Authentication](../tutorials/aws/authentication.md) section.
## Azure
Azure requires specifying an authentication method:
```console
# To use service principal authentication
prowler azure --sp-env-auth
# To use az cli authentication
prowler azure --az-cli-auth
# To use browser authentication
prowler azure --browser-auth --tenant-id "XXXXXXXX"
# To use managed identity auth
prowler azure --managed-identity-auth
```
See more details about Azure Authentication in the [Authentication Section](../tutorials/azure/authentication.md)
By default, Prowler scans all accessible subscriptions. Scan specific subscriptions using the following flag (using az cli auth as an example):
```console
prowler azure --az-cli-auth --subscription-ids <subscription ID 1> <subscription ID 2> ... <subscription ID N>
```
## Google Cloud
- **User Account Credentials**
By default, Prowler uses **User Account credentials**. Configure accounts using:
- `gcloud init`: set up a new account.
- `gcloud config set account <account>`: switch to an existing account.
Once configured, obtain access credentials using: `gcloud auth application-default login`.
- **Service Account Authentication**
Alternatively, you can use Service Account credentials:
Generate and download Service Account keys in JSON format. Refer to [Google IAM documentation](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) for details.
Provide the key file location using this argument:
```console
prowler gcp --credentials-file path
```
- **Scanning Specific GCP Projects**
By default, Prowler scans all accessible GCP projects. Scan specific projects with the `--project-ids` flag:
```console
prowler gcp --project-ids <Project ID 1> <Project ID 2> ... <Project ID N>
```
- **GCP Retry Configuration**
Configure the maximum number of retry attempts for Google Cloud SDK API calls with the `--gcp-retries-max-attempts` flag:
```console
prowler gcp --gcp-retries-max-attempts 5
```
This is useful for increasing the number of automatic retry attempts when experiencing quota-exceeded errors (HTTP 429).
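The retries themselves are handled by the Google Cloud SDK; `--gcp-retries-max-attempts` only raises the attempt cap. For intuition, the underlying pattern is exponential backoff with jitter, sketched here with a hypothetical `QuotaExceeded` exception standing in for an HTTP 429 error:

```python
import random
import time

class QuotaExceeded(Exception):
    """Stand-in for an HTTP 429 quota error from the SDK."""

def call_with_retries(fn, max_attempts=5, sleep=time.sleep):
    """Illustrative exponential backoff with jitter; Prowler itself
    delegates retrying to the Google Cloud SDK."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except QuotaExceeded:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            sleep(2 ** attempt + random.random())
```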
## Kubernetes
Prowler enables security scanning of Kubernetes clusters, supporting both **in-cluster** and **external** execution.
- **Non-In-Cluster Execution**
```console
prowler kubernetes --kubeconfig-file path
```
???+ note
If no `--kubeconfig-file` is provided, Prowler will use the default KubeConfig file location (`~/.kube/config`).
- **In-Cluster Execution**
To run Prowler inside the cluster, apply the provided YAML configuration to deploy a job in a new namespace:
```console
kubectl apply -f kubernetes/prowler-sa.yaml
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
kubectl get pods --namespace prowler-ns --> prowler-XXXXX
kubectl logs prowler-XXXXX --namespace prowler-ns
```
???+ note
By default, Prowler scans all namespaces in the active Kubernetes context. Use the `--context` flag to specify the context to be scanned and `--namespaces` to restrict scanning to specific namespaces.
## Microsoft 365
Microsoft 365 requires specifying an authentication method:
```console
# To use service principal authentication for MSGraph and PowerShell modules
prowler m365 --sp-env-auth
# To use both service principal (for MSGraph) and user credentials (for PowerShell modules)
prowler m365 --env-auth
# To use az cli authentication
prowler m365 --az-cli-auth
# To use browser authentication
prowler m365 --browser-auth --tenant-id "XXXXXXXX"
```
See more details about M365 Authentication in the [Authentication](../tutorials/microsoft365/authentication.md) section.
## GitHub
Prowler enables security scanning of your **GitHub account**, including **Repositories**, **Organizations**, and **Applications**.
- **Supported Authentication Methods**
Authenticate using one of the following methods:
```console
# Personal Access Token (PAT):
prowler github --personal-access-token pat
# OAuth App Token:
prowler github --oauth-app-token oauth_token
# GitHub App Credentials:
prowler github --github-app-id app_id --github-app-key app_key
```
???+ note
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence:
1. `GITHUB_PERSONAL_ACCESS_TOKEN`
2. `OAUTH_APP_TOKEN`
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`
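The fallback order above can be sketched as a small resolver. This is an illustrative reading of the documented precedence, not Prowler's actual implementation:

```python
import os

def resolve_github_auth(env=None):
    """Sketch of the documented precedence when no login method is
    given explicitly: PAT, then OAuth App token, then GitHub App
    credentials; None if nothing is configured."""
    env = os.environ if env is None else env
    if env.get("GITHUB_PERSONAL_ACCESS_TOKEN"):
        return ("pat", env["GITHUB_PERSONAL_ACCESS_TOKEN"])
    if env.get("OAUTH_APP_TOKEN"):
        return ("oauth", env["OAUTH_APP_TOKEN"])
    if env.get("GITHUB_APP_ID") and env.get("GITHUB_APP_KEY"):
        return ("app", (env["GITHUB_APP_ID"], env["GITHUB_APP_KEY"]))
    return None
```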
## Infrastructure as Code (IaC)
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
```console
# Scan a directory for IaC files
prowler iac --scan-path ./my-iac-directory
# Scan a remote GitHub repository (public or private)
prowler iac --scan-repository-url https://github.com/user/repo.git
# Authenticate to a private repo with GitHub username and PAT
prowler iac --scan-repository-url https://github.com/user/repo.git \
--github-username <username> --personal-access-token <token>
# Authenticate to a private repo with OAuth App Token
prowler iac --scan-repository-url https://github.com/user/repo.git \
--oauth-app-token <oauth_token>
# Specify frameworks to scan (default: all)
prowler iac --scan-path ./my-iac-directory --frameworks terraform kubernetes
# Exclude specific paths
prowler iac --scan-path ./my-iac-directory --exclude-path ./my-iac-directory/test,./my-iac-directory/examples
```
???+ note
- `--scan-path` and `--scan-repository-url` are mutually exclusive; only one can be specified at a time.
- For remote repository scans, authentication can be provided via CLI flags or environment variables (`GITHUB_OAUTH_APP_TOKEN`, `GITHUB_USERNAME`, `GITHUB_PERSONAL_ACCESS_TOKEN`). CLI flags take precedence.
- The IaC provider does not require cloud authentication for local scans.
- It is ideal for CI/CD pipelines and local development environments.
- For more details on supported scanners, see the [Trivy documentation](https://trivy.dev/latest/docs/scanner/vulnerability/)
See more details about IaC scanning in the [IaC Tutorial](../tutorials/iac/getting-started-iac.md) section.
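The mutual-exclusivity rule between `--scan-path` and `--scan-repository-url` can be expressed as a small validator. This is a sketch of the documented behavior, not Prowler's argument-parsing code:

```python
def validate_iac_target(scan_path=None, scan_repository_url=None):
    """Sketch of the documented rule: --scan-path and
    --scan-repository-url are mutually exclusive; only one may be
    specified at a time."""
    if scan_path and scan_repository_url:
        raise ValueError(
            "--scan-path and --scan-repository-url are mutually exclusive"
        )
    return scan_repository_url or scan_path
```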


@@ -2,7 +2,7 @@
In this page you can find all the details about [Amazon Web Services (AWS)](https://aws.amazon.com/) provider implementation in Prowler.
By default, Prowler will audit just one account and organization settings per scan. To configure it, follow the [getting started](../index.md#aws) page.
By default, Prowler will audit just one account and organization settings per scan. To configure it, follow the [AWS getting started guide](../tutorials/aws/getting-started-aws.md).
## AWS Provider Classes Architecture


@@ -2,7 +2,7 @@
In this page you can find all the details about [Microsoft Azure](https://azure.microsoft.com/) provider implementation in Prowler.
By default, Prowler will audit all the subscriptions that it is able to list in the Microsoft Entra tenant, and tenant Entra ID service. To configure it, follow the [getting started](../index.md#azure) page.
By default, Prowler will audit all the subscriptions that it is able to list in the Microsoft Entra tenant, and tenant Entra ID service. To configure it, follow the [Azure getting started guide](../tutorials/azure/getting-started-azure.md).
## Azure Provider Classes Architecture

View File

@@ -265,7 +265,7 @@ Below is a generic example of a check metadata file. **Do not include comments i
- For AWS this field must follow the [AWS Security Hub Types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-required-attributes.html#Types) format. The common pattern is `namespace/category/classifier`; refer to the linked documentation for the valid values for these fields.
- **ServiceName** — The name of the provider service being audited. This field **must** be in lowercase and match with the service folder name. For supported services refer to [Prowler Hub](https://hub.prowler.com/check) or directly to [Prowler Code](https://github.com/prowler-cloud/prowler/tree/master/prowler/providers).
- **SubServiceName** — The subservice or resource within the service, if applicable. For more information refer to the [Naming Format for Checks](#naming-format-for-checks) section.
- **ResourceIdTemplate** — A template for the unique resource identifier. For more information refer to the [Prowler's Resource Identification](#prowlers-resource-identification) section.
- **ResourceIdTemplate** — A template for the unique resource identifier. For more information refer to the [Resource Identification in Prowler](#resource-identification-in-prowler) section.
- **Severity** — The severity of the finding if the check fails. Must be one of: `critical`, `high`, `medium`, `low`, or `informational`; this field **must** be in lowercase. To get more information about the severity levels refer to the [Prowler's Check Severity Levels](#prowlers-check-severity-levels) section.
- **ResourceType** — The type of resource being audited. *For now this field is only standardized for the AWS provider*.
- For AWS use the [Security Hub resource types](https://docs.aws.amazon.com/securityhub/latest/userguide/asff-resources.html) or, if not available, the PascalCase version of the [CloudFormation type](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html) (e.g., `AwsEc2Instance`). Use "Other" if no match exists.

View File

@@ -2,7 +2,7 @@
This page details the [Google Cloud Platform (GCP)](https://cloud.google.com/) provider implementation in Prowler.
By default, Prowler will audit all the GCP projects that the authenticated identity can access. To configure it, follow the [getting started](../index.md#google-cloud) page.
By default, Prowler will audit all the GCP projects that the authenticated identity can access. To configure it, follow the [GCP getting started guide](../tutorials/gcp/getting-started-gcp.md).
## GCP Provider Classes Architecture

View File

@@ -2,7 +2,7 @@
This page details the [GitHub](https://github.com/) provider implementation in Prowler.
By default, Prowler will audit the GitHub account - scanning all repositories, organizations, and applications that your configured credentials can access. To configure it, follow the [getting started](../index.md#github) page.
By default, Prowler will audit the GitHub account - scanning all repositories, organizations, and applications that your configured credentials can access. To configure it, follow the [GitHub getting started guide](../tutorials/github/getting-started-github.md).
## GitHub Provider Classes Architecture

View File

@@ -2,7 +2,7 @@
This page details the [Kubernetes](https://kubernetes.io/) provider implementation in Prowler.
By default, Prowler will audit all namespaces in the Kubernetes cluster accessible by the configured context. To configure it, follow the [getting started](../index.md#kubernetes) page.
By default, Prowler will audit all namespaces in the Kubernetes cluster accessible by the configured context. To configure it, see the [In-Cluster Execution](../tutorials/kubernetes/in-cluster.md) or [Non In-Cluster Execution](../tutorials/kubernetes/outside-cluster.md) guides.
## Kubernetes Provider Classes Architecture

View File

@@ -2,7 +2,7 @@
This page details the [Microsoft 365 (M365)](https://www.microsoft.com/en-us/microsoft-365) provider implementation in Prowler.
By default, Prowler will audit the Microsoft Entra ID tenant and its supported services. To configure it, follow the [getting started](../index.md#microsoft-365) page.
By default, Prowler will audit the Microsoft Entra ID tenant and its supported services. To configure it, follow the [M365 getting started guide](../tutorials/microsoft365/getting-started-m365.md).
---
@@ -15,7 +15,7 @@ By default, Prowler will audit the Microsoft Entra ID tenant and its supported s
- **Required modules:**
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (≥ 3.6.0)
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (≥ 6.6.0)
- If you use Prowler Cloud or the official containers, PowerShell is pre-installed. For local or pip installations, you must install PowerShell and the modules yourself. See [Requirements: Supported PowerShell Versions](../getting-started/requirements.md#supported-powershell-versions) and [Needed PowerShell Modules](../getting-started/requirements.md#needed-powershell-modules).
- If you use Prowler Cloud or the official containers, PowerShell is pre-installed. For local or pip installations, you must install PowerShell and the modules yourself. See [Authentication: Supported PowerShell Versions](../tutorials/microsoft365/authentication.md#supported-powershell-versions) and [Needed PowerShell Modules](../tutorials/microsoft365/authentication.md#required-powershell-modules).
- For more details and troubleshooting, see [Use of PowerShell in M365](../tutorials/microsoft365/use-of-powershell.md).
---

View File

@@ -231,11 +231,11 @@ Before implementing a new service, verify that Prowler's existing permissions fo
Provider-Specific Permissions Documentation:
- [AWS](../getting-started/requirements.md#authentication)
- [Azure](../getting-started/requirements.md#needed-permissions)
- [GCP](../getting-started/requirements.md#needed-permissions_1)
- [M365](../getting-started/requirements.md#needed-permissions_2)
- [GitHub](../getting-started/requirements.md#authentication_2)
- [AWS](../tutorials/aws/authentication.md#required-permissions)
- [Azure](../tutorials/azure/authentication.md#required-permissions)
- [GCP](../tutorials/gcp/authentication.md#required-permissions)
- [M365](../tutorials/microsoft365/authentication.md#required-permissions)
- [GitHub](../tutorials/github/authentication.md)
## Best Practices

View File

@@ -39,7 +39,7 @@ To execute the Prowler test suite, install the necessary dependencies listed in
### Prerequisites
If you have not installed Prowler yet, refer to the [developer guide introduction](./introduction.md#get-the-code-and-install-all-dependencies).
If you have not installed Prowler yet, refer to the [developer guide introduction](./introduction.md#getting-the-code-and-installing-all-dependencies).
### Executing Tests
@@ -520,7 +520,7 @@ Execute tests on the service `__init__` to ensure correct information retrieval.
While service tests resemble *Integration Tests*, as they assess how the service interacts with the provider, they ultimately fall under *Unit Tests*, due to the use of Moto or custom mock objects.
For detailed guidance on test creation and existing service tests, refer to the [AWS checks test](./unit-testing.md#checks) [documentation](https://github.com/prowler-cloud/prowler/tree/master/tests/providers/aws/services).
For detailed guidance on test creation and existing service tests, check the current [AWS checks implementation](https://github.com/prowler-cloud/prowler/tree/master/tests/providers/aws/services).
## GCP

View File

@@ -1,591 +0,0 @@
# Prowler Requirements
Prowler is built in Python and utilizes the following SDKs:
- [AWS SDK (Boto3)](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html#)
- [Azure SDK](https://azure.github.io/azure-sdk-for-python/)
- [GCP API Python Client](https://github.com/googleapis/google-api-python-client/)
- [Kubernetes SDK](https://github.com/kubernetes-client/python)
- [M365 Graph SDK](https://github.com/microsoftgraph/msgraph-sdk-python)
- [GitHub REST API SDK (PyGithub)](https://github.com/PyGithub/PyGithub)
## AWS
Prowler requires AWS credentials to function properly. You can authenticate using any method outlined in the [AWS CLI configuration guide](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html#cli-configure-quickstart-precedence).
### Authentication Steps
Ensure your AWS CLI is correctly configured with valid credentials and region settings. You can achieve this via:
```console
aws configure
```
or
```console
export AWS_ACCESS_KEY_ID="ASXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
export AWS_SESSION_TOKEN="XXXXXXXXX"
```
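Before launching a scan, it can help to fail fast when one of these variables is missing. A minimal pre-flight sketch (the `check_env` helper is hypothetical, not part of Prowler or the AWS CLI):

```console
# Hypothetical helper: verify that credential variables are exported
# before starting a scan; prints the first missing variable, if any.
check_env() {
  for var in "$@"; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing: $var"
      return 1
    fi
  done
  echo "credentials present"
}

export AWS_ACCESS_KEY_ID="ASXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
check_env AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
```

The same pattern works for any of the provider credential variables described on this page.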
#### Required IAM Permissions
The credentials used must be associated with a user or role that has appropriate permissions for security checks. Attach the following AWS managed policies to ensure access:
- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`
#### Additional Permissions
For certain checks, additional read-only permissions are required. Attach the following custom policy to your role:
[prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json)
If you intend to send findings to
[AWS Security Hub](https://aws.amazon.com/security-hub), attach the following custom policy:
[prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
### Multi-Factor Authentication (MFA)
If your IAM entity requires Multi-Factor Authentication (MFA), you can use the `--mfa` flag. Prowler will prompt you to enter the following values to initiate a new session:
- **ARN of your MFA device**
- **TOTP (Time-Based One-Time Password)**
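For example, the flag is passed on the command line and Prowler then prompts interactively for both values:

```console
prowler aws --mfa
```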
## Azure
Prowler for Azure supports multiple authentication types. To use a specific method, pass the appropriate flag during execution:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- Existing **AZ CLI credentials**
- **Interactive browser authentication**
- [**Managed Identity**](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
> ⚠️ **Important:** For Prowler App, only Service Principal authentication is supported.
### Service Principal Application Authentication
To allow Prowler to authenticate using a Service Principal Application, set up the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXX"
```
If you execute Prowler with the `--sp-env-auth` flag and these variables are not set or exported, execution will fail.
Refer to the [Create Prowler Service Principal](../tutorials/azure/create-prowler-service-principal.md#how-to-create-prowler-service-principal-application) guide for detailed setup instructions.
### Azure Authentication Methods
Prowler for Azure supports the following authentication methods:
- **AZ CLI Authentication (`--az-cli-auth`)**: Automated authentication using stored AZ CLI credentials.
- **Managed Identity Authentication (`--managed-identity-auth`)**: Automated authentication via Azure Managed Identity.
- **Browser Authentication (`--browser-auth`)**: Requires the user to authenticate using the default browser. The `--tenant-id` parameter is mandatory for this method.
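Each method maps to a single flag on the command line; for instance:

```console
prowler azure --az-cli-auth
prowler azure --managed-identity-auth
prowler azure --browser-auth --tenant-id "XXXXXXXXX"
```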
### Required Permissions
Prowler for Azure requires two types of permission scopes:
#### Microsoft Entra ID Permissions
These permissions allow Prowler to retrieve metadata from the assumed identity and perform specific Entra checks. While not mandatory for execution, they enhance functionality.
Required permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All` (used for Entra multifactor authentication checks)
???+ note
You can replace `Directory.Read.All` with `Domain.Read.All`, which is a more restrictive permission, but then you won't be able to run the Entra checks related to DirectoryRoles and GetUsers.
#### Subscription Scope Permissions
These permissions are required to perform security checks against Azure resources. The following **RBAC roles** must be assigned per subscription to the entity used by Prowler:
- `Reader`: Grants read-only access to Azure resources.
- `ProwlerRole`: A custom role with minimal permissions, defined in the [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json).
???+ note
The `assignableScopes` field in the JSON custom role file must be updated to reflect the correct subscription or management group. Use one of the following formats: `/subscriptions/<subscription-id>` or `/providers/Microsoft.Management/managementGroups/<management-group-id>`.
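As an abbreviated sketch, the relevant fragment of the custom role file would look like this (only the scope field is shown; the subscription ID is a placeholder):

```json
{
  "name": "ProwlerRole",
  "assignableScopes": [
    "/subscriptions/<subscription-id>"
  ]
}
```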
### Assigning Permissions
To properly configure permissions, follow these guides:
- [Microsoft Entra ID permissions](../tutorials/azure/create-prowler-service-principal.md#assigning-the-proper-permissions)
- [Azure subscription permissions](../tutorials/azure/subscriptions.md#assign-the-appropriate-permissions-to-the-identity-that-is-going-to-be-assumed-by-prowler)
???+ warning
Some permissions in `ProwlerRole` involve **write access**. If a `ReadOnly` lock is attached to certain resources, you may encounter errors, and findings for those checks will not be available.
#### Checks Requiring `ProwlerRole`
The following security checks require the `ProwlerRole` permissions for execution. Ensure the role is assigned to the identity assumed by Prowler before running these checks:
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`
## Google Cloud
### Authentication
Prowler follows the same credential discovery process as the [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order):
1. **Environment Variable Authentication**: Uses the [`GOOGLE_APPLICATION_CREDENTIALS` environment variable](https://cloud.google.com/docs/authentication/application-default-credentials#GAC).
2. **Google Cloud CLI Credentials**: Uses credentials configured via the [Google Cloud CLI](https://cloud.google.com/docs/authentication/application-default-credentials#personal).
3. **Service Account Authentication**: Retrieves the attached service account credentials from the metadata server. More details [here](https://cloud.google.com/docs/authentication/application-default-credentials#attached-sa).
### Required Permissions
Prowler for Google Cloud requires the following permissions:
#### IAM Roles
- **Reader (`roles/reader`)**: Must be granted at the **project, folder, or organization** level to allow scanning of target projects.
#### Project-Level Settings
At least one project must have the following configurations:
- **Identity and Access Management (IAM) API (`iam.googleapis.com`)** Must be enabled via:
- The [Google Cloud API UI](https://console.cloud.google.com/apis/api/iam.googleapis.com/metrics), or
- The `gcloud` CLI:
```sh
gcloud services enable iam.googleapis.com --project <your-project-id>
```
- **Service Usage Consumer (`roles/serviceusage.serviceUsageConsumer`)** IAM role: Required for resource scanning.
- **Quota Project Setting**: Define a quota project using either:
- The `gcloud` CLI:
```sh
gcloud auth application-default set-quota-project <project-id>
```
- Setting an environment variable:
```sh
export GOOGLE_CLOUD_QUOTA_PROJECT=<project-id>
```
### Default Project Scanning
By default, Prowler scans **all accessible GCP projects**. To limit the scan to specific projects, use the `--project-ids` flag.
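For example, to restrict a scan to two projects (the project IDs below are placeholders):

```console
prowler gcp --project-ids my-project-one my-project-two
```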
## Microsoft 365
Prowler for Microsoft 365 (M365) supports the following authentication methods:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- **Service Principal Application with Microsoft User Credentials**
- **Stored AZ CLI credentials**
- **Interactive browser authentication**
???+ warning
Prowler App supports the **Service Principal** authentication method and the **Service Principal with User Credentials** authentication method. The latter will be deprecated in September, when Microsoft enforces MFA on all tenants and no longer allows user authentication without an interactive method.
### Service Principal Authentication (Recommended)
**Authentication flag:** `--sp-env-auth`
To enable Prowler to authenticate as the **Service Principal Application**, configure the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
```
If these variables are not set or exported, execution using `--sp-env-auth` will fail.
Refer to the [Create Prowler Service Principal](../tutorials/microsoft365/getting-started-m365.md#create-the-service-principal-app) guide for setup instructions.
If the external API permissions described in that section are not added, only checks that work through MS Graph will be executed, so the provider will not run in full.
???+ note
To run all M365 checks, the required permissions must be granted to the service principal application. Refer to the [External API Permissions Assignment](../tutorials/microsoft365/getting-started-m365.md#grant-powershell-modules-permissions) section for more information.
### Service Principal and User Credentials Authentication
Authentication flag: `--env-auth`
???+ warning
This method is no longer recommended; use the **Service Principal Application** authentication method instead.
This method builds upon the Service Principal authentication by adding User Credentials. Configure the following environment variables: `M365_USER` and `M365_PASSWORD`.
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export M365_USER="your_email@example.com"
export M365_PASSWORD="examplepassword"
```
These two new environment variables are **required** in this authentication method to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to Microsoft PowerShell modules.
- `M365_USER` must be your Microsoft account email using the **assigned domain in the tenant**. This means it must look like `example@YourCompany.onmicrosoft.com` or `example@YourCompany.com`, using the exact domain assigned to that user in the tenant.
???+ warning
If the user is newly created, you need to sign in with that account first, as Microsoft will prompt you to change the password. If you don't complete this step, user authentication will fail because Microsoft marks the initial password as expired.
???+ warning
The user must not be MFA capable. Microsoft does not allow MFA capable users to authenticate programmatically. See [Microsoft documentation](https://learn.microsoft.com/en-us/entra/identity-platform/scenario-desktop-acquire-token-username-password?tabs=dotnet) for more information.
???+ warning
Using a tenant domain other than the one assigned, even if it belongs to the same tenant, will cause Prowler to fail because Microsoft authentication will not succeed.
Ensure you are using the right domain for the user you are trying to authenticate with.
![User Domains](../tutorials/microsoft365/img/user-domains.png)
- `M365_PASSWORD` must be the user password.
???+ note
Previously an encrypted password was required; now the plain user password is expected, and Prowler handles the encryption for you.
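As a quick illustration of the domain rule above, a shell-level sanity check could look like this (the `valid_m365_user` helper is hypothetical, not part of Prowler; it cannot verify the tenant-assigned domain, only that the value looks like an email address):

```console
# Hypothetical helper: coarse format check for M365_USER.
valid_m365_user() {
  case "$1" in
    *@*.*) return 0 ;;
    *)     return 1 ;;
  esac
}

export M365_USER="example@YourCompany.onmicrosoft.com"
if valid_m365_user "$M365_USER"; then
  echo "user format looks ok"
else
  echo "user format invalid"
fi
```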
### Interactive Browser Authentication
**Authentication flag:** `--browser-auth`
This authentication method requires the user to authenticate against Azure using the default browser to start the scan. The `--tenant-id` flag is also required.
With these credentials, you will only be able to run checks that rely on Microsoft Graph. This means you won't be able to run the entire provider. To perform a full M365 security scan, use the **recommended authentication method**.
Since this is a **delegated permission** authentication method, necessary permissions should be assigned to the user rather than the application.
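A sketch of the invocation (the tenant ID is a placeholder):

```console
prowler m365 --browser-auth --tenant-id "XXXXXXXXX"
```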
### Required Permissions
To run the full Prowler provider, including PowerShell checks, two types of permission scopes must be set in **Microsoft Entra ID**.
#### For Service Principal Authentication (`--sp-env-auth`) - Recommended
When using service principal authentication, you need to grant the following **Application Permissions**:
**Microsoft Graph API Permissions:**
- `AuditLog.Read.All`: Required for Entra service.
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
**External API Permissions:**
- `Exchange.ManageAsApp` from external API `Office 365 Exchange Online`: Required for Exchange PowerShell module app authentication. You also need to assign the `Global Reader` role to the app.
- `application_access` from external API `Skype and Teams Tenant Admin API`: Required for Teams PowerShell module app authentication.
???+ note
`Directory.Read.All` can be replaced with `Domain.Read.All`, which is a more restrictive permission, but then you won't be able to run the Entra checks related to DirectoryRoles and GetUsers.
> If you do this, you will also need to add the `Organization.Read.All` permission to the service principal application in order to authenticate.
???+ note
This is the **recommended authentication method** because it allows you to run the full M365 provider, including PowerShell checks, providing complete coverage of all available security checks. Service Principal + User Credentials authentication offers the same coverage, but it will be deprecated in September, when Microsoft enforces MFA on all tenants and no longer allows user authentication without an interactive method.
#### For Service Principal + User Credentials Authentication (`--env-auth`)
When using service principal with user credentials authentication, you need **both** sets of permissions:
**1. Service Principal Application Permissions**:
- You **will need** all the Microsoft Graph API permissions listed above.
- You **won't need** the External API permissions listed above.
**2. User-Level Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
- `Global Reader` (recommended): grants read access to everything Prowler needs.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles; with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a `Global Reader`. Since only read access is needed, `Global Reader` is recommended.
#### For Browser Authentication (`--browser-auth`)
When using browser authentication, permissions are delegated to the user, so the user must have the appropriate permissions rather than the application.
???+ warning
With browser authentication, you will only be able to run checks that work through MS Graph API. PowerShell module checks will not be executed.
### Assigning Permissions and Roles
For guidance on assigning the necessary permissions and roles, follow these instructions:
- [Grant API Permissions](../tutorials/microsoft365/getting-started-m365.md#grant-required-graph-api-permissions)
- [Assign Required Roles](../tutorials/microsoft365/getting-started-m365.md#if-using-user-authentication)
### Supported PowerShell Versions
PowerShell is required to run certain M365 checks.
**Supported versions:**
- **PowerShell 7.4 or higher** (7.5 is recommended)
#### Why Is PowerShell 7.4+ Required?
- **PowerShell 5.1** (default on some Windows systems) does not support required cmdlets.
- Older [cross-platform PowerShell versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are **unsupported**, leading to potential errors.
???+ note
Installing PowerShell is only necessary if you install Prowler via **pip or other sources**. **SDK and API containers include PowerShell by default.**
### Installing PowerShell
Installing PowerShell is different depending on your OS.
- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): update PowerShell to 7.4 or later to run Prowler; otherwise some checks will not show findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, Windows Server 2016, and later versions.
```console
winget install --id Microsoft.PowerShell --source winget
```
- [MacOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): installing PowerShell on macOS requires [brew](https://brew.sh/). Once you have it, just run the command below. Pwsh is only supported on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64:
```console
brew install powershell/tap/powershell
```
Once it's installed run `pwsh` on your terminal to verify it's working.
- Linux: installing PowerShell on Linux depends on the distro you are using:
- [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): PowerShell 7.4+ is supported on Ubuntu 22.04 and Ubuntu 24.04. The recommended way to install it is via the package available on PMC:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget apt-transport-https software-properties-common
# Get the version of Ubuntu
source /etc/os-release
# Download the Microsoft repository keys
wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): the only supported version for PowerShell 7.4+ on Alpine is Alpine 3.20. The only installation method is downloading the tar.gz package available on [PowerShell GitHub](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz):
```console
# Install the requirements
sudo apk add --no-cache \
ca-certificates \
less \
ncurses-terminfo-base \
krb5-libs \
libgcc \
libintl \
libssl3 \
libstdc++ \
tzdata \
userspace-rcu \
zlib \
icu-libs \
curl
apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
    lttng-ust \
    openssh-client
# Download the powershell '.tar.gz' archive
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz
# Create the target folder where powershell will be placed
sudo mkdir -p /opt/microsoft/powershell/7
# Expand powershell to the target folder
sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7
# Set execute permissions
sudo chmod +x /opt/microsoft/powershell/7/pwsh
# Create the symbolic link that points to pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
# Start PowerShell
pwsh
```
- [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): PowerShell 7.4+ is supported on Debian 11 and Debian 12. The recommended way to install it is via the package available on PMC:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget
# Get the version of Debian
source /etc/os-release
# Download the Microsoft repository GPG keys
wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository GPG keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): PowerShell 7.4+ is supported on RHEL 8 and RHEL 9. The recommended way to install it is via the package available on PMC:
```console
###################################
# Prerequisites
# Get version of RHEL
source /etc/os-release
if [ ${VERSION_ID%.*} -lt 8 ]
then majorver=7
elif [ ${VERSION_ID%.*} -lt 9 ]
then majorver=8
else majorver=9
fi
# Download the Microsoft RedHat repository package
curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm
# Register the Microsoft RedHat repository
sudo rpm -i packages-microsoft-prod.rpm
# Delete the downloaded package after installing
rm packages-microsoft-prod.rpm
# Update package index files
sudo dnf update
# Install PowerShell
sudo dnf install powershell -y
```
- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): the following command downloads the latest stable .NET SDK image, which includes PowerShell:
```console
docker pull mcr.microsoft.com/dotnet/sdk:9.0
```
To start an interactive PowerShell session, run:
```console
docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
```
### Required PowerShell Modules
Prowler relies on several PowerShell cmdlets to retrieve necessary data.
These cmdlets come from different modules that must be installed.
#### Automatic Installation
The required modules are automatically installed when running Prowler with the `--init-modules` flag.
Example command:
```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```
If the modules are already installed, running this command will not cause issues; it simply verifies that the necessary modules are available.
???+ note
Prowler installs the modules using `-Scope CurrentUser`.
If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
The command needed to install a module manually is:
```powershell
Install-Module -Name "ModuleName" -Scope AllUsers -Force
```
#### Module Versions
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (Minimum version: 3.6.0): Required for checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (Minimum version: 6.6.0): Required for all Teams checks.
- [MSAL.PS](https://www.powershellgallery.com/packages/MSAL.PS/4.32.0): Required by the Exchange module when using application authentication.
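Putting the versions above together, a manual installation of all required modules might look like the following sketch (use `-Scope AllUsers` from an elevated session if needed, as noted earlier; the `-AcceptLicense` flag for MSAL.PS is an assumption to suppress its license prompt):
```powershell
# Sketch: install the required modules at or above their minimum versions
Install-Module -Name ExchangeOnlineManagement -MinimumVersion 3.6.0 -Scope CurrentUser -Force
Install-Module -Name MicrosoftTeams -MinimumVersion 6.6.0 -Scope CurrentUser -Force
# MSAL.PS may require license acceptance on install
Install-Module -Name MSAL.PS -Scope CurrentUser -Force -AcceptLicense
```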
## GitHub
Prowler supports multiple [authentication methods for GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api).
### Supported Authentication Methods
- **Personal Access Token (PAT)**
- **OAuth App Token**
- **GitHub App Credentials**
These options provide flexibility for scanning and analyzing your GitHub account, repositories, organizations, and applications. Choose the authentication method that best suits your security needs.
???+ note
GitHub App Credentials support fewer checks than the other authentication methods.
## Infrastructure as Code (IaC)
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Checkov](https://www.checkov.io/). This provider supports a wide range of IaC frameworks and requires no cloud authentication for local scans.
### Authentication
- For local scans, no authentication is required.
- For remote repository scans, authentication can be provided via:
- [**GitHub Username and Personal Access Token (PAT)**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)
- [**GitHub OAuth App Token**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-fine-grained-personal-access-token)
- [**Git URL**](https://git-scm.com/docs/git-clone#_git_urls)
### Supported Frameworks
The IaC provider leverages Checkov to support multiple frameworks, including:
- Terraform
- CloudFormation
- Kubernetes
- ARM (Azure Resource Manager)
- Serverless
- Dockerfile
- YAML/JSON (generic IaC)
- Bicep
- Helm
- GitHub Actions, GitLab CI, Bitbucket Pipelines, Azure Pipelines, CircleCI, Argo Workflows
- Ansible
- Kustomize
- OpenAPI
- SAST, SCA (Software Composition Analysis)

# What is Prowler?
**Prowler** is the open source cloud security platform trusted by thousands to **automate security and compliance** in any cloud environment. With hundreds of ready-to-use checks and compliance frameworks, Prowler delivers real-time, customizable monitoring and seamless integrations, making cloud security simple, scalable, and cost-effective for organizations of any size.
The officially supported providers are:
- **Kubernetes**
- **M365**
- **Github**
- **IaC**
Unofficially, Prowler also supports NHN.
Prowler supports **auditing, incident response, continuous monitoring, hardening, forensic readiness, and remediation**.
### Prowler Components
- **Prowler CLI** (Command Line Interface), also known as **Prowler Open Source**.
- **Prowler Cloud**: A managed service built on top of Prowler CLI.
More information: [Prowler Cloud](https://prowler.com)
## Prowler App
![Prowler App](img/overview.png)
Prowler App is a web application that simplifies running Prowler. It provides:
- A **user-friendly interface** for configuring and executing scans.
- A dashboard to **view results** and manage **security findings**.
### Installation Guide
Refer to the [Quick Start](#prowler-app-installation) section for installation steps.
## Prowler CLI
```console
prowler <provider>
```
![Prowler CLI Execution](img/short-display.png)
## Prowler Dashboard
```console
prowler dashboard
```
![Prowler Dashboard](img/dashboard.png)
Prowler includes hundreds of security controls aligned with widely recognized industry frameworks and standards, including:
- CIS Benchmarks (AWS, Azure, Microsoft 365, Kubernetes, GitHub)
- NIST SP 800-53 (rev. 4 and 5) and NIST SP 800-171
- NIST Cybersecurity Framework (CSF)
- CISA Guidelines
- FedRAMP Low & Moderate
- PCI DSS v3.2.1 and v4.0
- ISO/IEC 27001:2013 and 2022
- SOC 2
- GDPR (General Data Protection Regulation)
- HIPAA (Health Insurance Portability and Accountability Act)
- FFIEC (Federal Financial Institutions Examination Council)
- ENS RD2022 (Spanish National Security Framework)
- GxP 21 CFR Part 11 and EU Annex 11
- RBI Cybersecurity Framework (Reserve Bank of India)
- KISA ISMS-P (Korean Information Security Management System)
- MITRE ATT&CK
- AWS Well-Architected Framework (Security & Reliability Pillars)
- AWS Foundational Technical Review (FTR)
- Microsoft NIS2 Directive (EU)
- Custom threat scoring frameworks (prowler_threatscore)
- Custom security frameworks for enterprise needs
## Quick Start
### Prowler App Installation
Prowler App supports multiple installation methods based on your environment.
Refer to the [Prowler App Tutorial](tutorials/prowler-app.md) for detailed usage instructions.
=== "Docker Compose"
_Requirements_:
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
_Commands_:
``` bash
curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/master/docker-compose.yml
curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/master/.env
docker compose up -d
```
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
???+ note
You can change the environment variables in the `.env` file. Note that it is not recommended to use the default values in production environments.
???+ note
There is a development mode available, you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose-dev.yml to run the app in development mode.
???+ warning
Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
=== "GitHub"
_Requirements_:
* `git` installed.
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* `npm` installed: [npm installation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
???+ warning
Make sure to have `api/.env` and `ui/.env.local` files with the required environment variables. You can find the required environment variables in the [`api/.env.example`](https://github.com/prowler-cloud/prowler/blob/master/api/.env.example) and [`ui/.env.template`](https://github.com/prowler-cloud/prowler/blob/master/ui/.env.template) files.
_Commands to run the API_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
docker compose up postgres valkey -d
cd src/backend
python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
???+ important
Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
_Commands to run the API Worker_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
cd src/backend
python -m celery -A config.celery worker -l info -E
```
_Commands to run the API Scheduler_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
cd src/backend
python -m celery -A config.celery beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
```
_Commands to run the UI_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/ui
npm install
npm run build
npm start
```
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
???+ warning
Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
### Prowler CLI Installation
Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/). Consequently, it can be installed as a Python package with `Python >= 3.9, <= 3.12`:
=== "pipx"
[pipx](https://pipx.pypa.io/stable/) is a tool to install Python applications in isolated environments. It is recommended to use `pipx` for a global installation.
_Requirements_:
* `Python >= 3.9, <= 3.12`
* `pipx` installed: [pipx installation](https://pipx.pypa.io/stable/installation/).
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
pipx install prowler
prowler -v
```
To upgrade Prowler to the latest version, run:
``` bash
pipx upgrade prowler
```
=== "pip"
???+ warning
This method is not recommended because it will modify the environment which you choose to install. Consider using [pipx](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) for a global installation.
_Requirements_:
* `Python >= 3.9, <= 3.12`
* `Python pip >= 21.0.0`
* AWS, GCP, Azure, M365 and/or Kubernetes credentials
_Commands_:
``` bash
pip install prowler
prowler -v
```
To upgrade Prowler to the latest version, run:
``` bash
pip install --upgrade prowler
```
=== "Docker"
_Requirements_:
* Have `docker` installed: https://docs.docker.com/get-docker/.
* In the command below, change `-v` to your local directory path in order to access the reports.
* AWS, GCP, Azure and/or Kubernetes credentials
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
_Commands_:
``` bash
docker run -ti --rm -v /your/local/dir/prowler-output:/home/prowler/output \
--name prowler \
--env AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY \
--env AWS_SESSION_TOKEN toniblyx/prowler:latest
```
=== "GitHub"
_Requirements for Developers_:
* `git`
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
```
git clone https://github.com/prowler-cloud/prowler
cd prowler
poetry install
poetry run python prowler-cli.py -v
```
???+ note
If you want to clone Prowler from Windows, use `git config core.longpaths true` to allow long file paths.
=== "Amazon Linux 2"
_Requirements_:
* `Python >= 3.9, <= 3.12`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
```
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
prowler -v
```
=== "Ubuntu"
_Requirements_:
* `Ubuntu 23.04` or above, if you are using an older version of Ubuntu check [pipx installation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) and ensure you have `Python >= 3.9, <= 3.12`.
* `Python >= 3.9, <= 3.12`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
sudo apt update
sudo apt install pipx
pipx ensurepath
pipx install prowler
prowler -v
```
=== "Brew"
_Requirements_:
* `Brew` installed in your Mac or Linux
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
brew install prowler
prowler -v
```
=== "AWS CloudShell"
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9 as it is already included in AL2023. Prowler can thus be easily installed following the generic method of installation via pip. Follow the steps below to successfully execute Prowler v4 in AWS CloudShell:
_Requirements_:
* Open AWS CloudShell `bash`.
_Commands_:
```bash
sudo bash
adduser prowler
su prowler
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler aws
```
???+ note
To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/tmp/output/prowler-output-123456789012-20221220191331.csv`
=== "Azure CloudShell"
_Requirements_:
* Open Azure CloudShell `bash`.
_Commands_:
```bash
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler azure --az-cli-auth
```
### Prowler App Update
You have two options to upgrade your Prowler App installation:
#### Option 1: Update the Environment File
Edit your `.env` file and change the version values:
```env
PROWLER_UI_VERSION="5.9.0"
PROWLER_API_VERSION="5.9.0"
```
#### Option 2: Run the following command
```bash
docker compose pull --policy always
```
The `--policy always` flag ensures that Docker pulls the latest images even if they already exist locally.
???+ note "What Gets Preserved During Upgrade"
Everything is preserved; nothing is deleted during the update.
#### Troubleshooting
If containers don't start, check logs for errors:
```bash
# Check logs for errors
docker compose logs
# Verify image versions
docker images | grep prowler
```
If you encounter issues, you can rollback to the previous version by changing the `.env` file back to your previous version and running:
```bash
docker compose pull
docker compose up -d
```
## Prowler container versions
The available versions of Prowler CLI are the following:
- `latest`: in sync with `master` branch (please note that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (please note that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (please note that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always points to the latest release.
- `v4-stable`: this tag always points to the latest release for v4.
- `v3-stable`: this tag always points to the latest release for v3.
The container images are available here:
- Prowler CLI:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
- Prowler App:
- [DockerHub - Prowler UI](https://hub.docker.com/r/prowlercloud/prowler-ui/tags)
- [DockerHub - Prowler API](https://hub.docker.com/r/prowlercloud/prowler-api/tags)
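For example, to pull a specific CLI image by tag from DockerHub:
```console
docker pull toniblyx/prowler:stable
```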
## High level architecture
You can run Prowler from your workstation, a Kubernetes Job, a Google Compute Engine instance, an Azure VM, an EC2 instance, Fargate or any other container, CloudShell, and many more.
![Architecture](img/architecture.png)
### Prowler App
The **Prowler App** consists of three main components:
- **Prowler UI**: A user-friendly web interface for running Prowler and viewing results, powered by Next.js.
- **Prowler API**: The backend API that executes Prowler scans and stores the results, built with Django REST Framework.
- **Prowler SDK**: A Python SDK that integrates with Prowler CLI for advanced functionality.
The app leverages the following supporting infrastructure:
- **PostgreSQL**: Used for persistent storage of scan results.
- **Celery Workers**: Facilitate asynchronous execution of Prowler scans.
- **Valkey**: An in-memory database serving as a message broker for the Celery workers.
![Prowler App Architecture](img/prowler-app-architecture.png)
## Deprecations from v3
The following are the deprecations carried out from v3.
### General
- `Allowlist` now is called `Mutelist`.
- The `--quiet` option has been deprecated. From now on, use the `--status` flag to select the findings status you want to get: PASS, FAIL or MANUAL.
- All findings previously reported with `INFO` status are now reported as `MANUAL`.
- The CSV output format is common for all providers.
Some output formats are now deprecated:
- The native JSON format is replaced by JSON [OCSF](https://schema.ocsf.io/) v1.1.0, common to all providers.
### AWS
- Deprecate the AWS flag `--sts-endpoint-region` since AWS STS regional tokens are used.
- To send only FAILS to AWS Security Hub, now you must use either `--send-sh-only-fails` or `--security-hub --status FAIL`.
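The replacements above map onto commands like these:
```console
# Replacement for the deprecated --quiet: report only failed findings
prowler aws --status FAIL
# Send only FAILs to AWS Security Hub
prowler aws --security-hub --send-sh-only-fails
```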
## Basic Usage
### Prowler App
#### **Access the App**
Go to [http://localhost:3000](http://localhost:3000) after installing the app (see [Quick Start](#prowler-app-installation)). Sign up with your email and password.
<img src="img/sign-up-button.png" alt="Sign Up Button" width="320"/>
<img src="img/sign-up.png" alt="Sign Up" width="285"/>
???+ note "User creation and default tenant behavior"
When creating a new user, the behavior depends on whether an invitation is provided:
- **Without an invitation**:
- A new tenant is automatically created.
- The new user is assigned to this tenant.
- A set of **RBAC admin permissions** is generated and assigned to the user for the newly-created tenant.
- **With an invitation**: The user is added to the specified tenant with the permissions defined in the invitation.
This mechanism ensures that the first user in a newly created tenant has administrative permissions within that tenant.
#### Log In
Log in using your **email and password** to access the Prowler App.
<img src="img/log-in.png" alt="Log In" width="285"/>
#### Add a Cloud Provider
To configure a cloud provider for scanning:
1. Navigate to `Settings > Cloud Providers` and click `Add Account`.
2. Select the cloud provider you wish to scan (**AWS, GCP, Azure, Kubernetes**).
3. Enter the provider's identifier (Optional: Add an alias):
- **AWS**: Account ID
- **GCP**: Project ID
- **Azure**: Subscription ID
- **Kubernetes**: Cluster ID
- **M365**: Domain ID
4. Follow the guided instructions to add and authenticate your credentials.
#### Start a Scan
Once credentials are successfully added and validated, Prowler initiates a scan of your cloud environment.
Click `Go to Scans` to monitor progress.
#### View Results
While the scan is running, you can review findings in the following sections:
- **Overview** Provides a high-level summary of your scans.
<img src="img/overview.png" alt="Overview" width="700"/>
- **Compliance** Displays compliance insights based on security frameworks.
<img src="img/compliance.png" alt="Compliance" width="700"/>
> For detailed usage instructions, refer to the [Prowler App Guide](tutorials/prowler-app.md).
???+ note
Prowler will automatically scan all configured providers every **24 hours**, ensuring your cloud environment stays continuously monitored.
### Prowler CLI
#### Running Prowler
To run Prowler, you need to specify the provider (e.g. `aws`, `gcp`, `azure`, `m365`, `github` or `kubernetes`):
???+ note
If no provider is specified, AWS is used by default for backward compatibility with Prowler v2.
```console
prowler <provider>
```
![Prowler Execution](img/short-display.png)
???+ note
Running the `prowler` command without options uses environment variable credentials. Refer to the [Requirements](./getting-started/requirements.md) section for credential configuration details.
#### Verbose Output
If you prefer the former verbose output, use `--verbose`. By default, minimal output is displayed; enabling verbosity shows more information while Prowler is running.
#### Report Generation
By default, Prowler generates CSV, JSON-OCSF, and HTML reports. To generate a JSON-ASFF report (used by AWS Security Hub), specify `-M` or `--output-modes`:
```console
prowler <provider> -M csv json-asff json-ocsf html
```
The HTML report is saved in the output directory, alongside other reports. It will look like this:
![Prowler Execution](img/html-output.png)
#### Listing Available Checks and Services
To view all available checks or services within a provider, use `-l`/`--list-checks` or `--list-services`.
```console
prowler <provider> --list-checks
prowler <provider> --list-services
```
#### Running Specific Checks or Services
Execute specific checks or services using `-c`/`--checks` or `-s`/`--services`:
```console
prowler azure --checks storage_blob_public_access_level_is_disabled
prowler aws --services s3 ec2
prowler gcp --services iam compute
prowler kubernetes --services etcd apiserver
```
#### Excluding Checks and Services
Checks and services can be excluded with `-e`/`--excluded-checks` or `--excluded-services`:
```console
prowler aws --excluded-checks s3_bucket_public_access
prowler azure --excluded-services defender iam
prowler gcp --excluded-services kms
prowler kubernetes --excluded-services controllermanager
```
#### Additional Options
Explore more advanced time-saving execution methods in the [Miscellaneous](tutorials/misc.md) section.
To access the help menu and view all available options, use: `-h`/`--help`:
```console
prowler --help
```
#### AWS
Use a custom AWS profile with `-p`/`--profile` and/or the AWS regions you want to audit with `-f`/`--filter-region`:
```console
prowler aws --profile custom-profile -f us-east-1 eu-south-2
```
???+ note
By default, `prowler` will scan all AWS regions.
See more details about AWS Authentication in the [Requirements](getting-started/requirements.md#aws) section.
#### Azure
Azure requires specifying the auth method:
```console
# To use service principal authentication
prowler azure --sp-env-auth
# To use az cli authentication
prowler azure --az-cli-auth
# To use browser authentication
prowler azure --browser-auth --tenant-id "XXXXXXXX"
# To use managed identity auth
prowler azure --managed-identity-auth
```
See more details about Azure Authentication in [Requirements](getting-started/requirements.md#azure)
By default, Prowler scans all the subscriptions it has permissions for. To scan one or more specific subscriptions, use the following flag (using az cli auth as an example):
```console
prowler azure --az-cli-auth --subscription-ids <subscription ID 1> <subscription ID 2> ... <subscription ID N>
```
#### Google Cloud
- **User Account Credentials**
By default, Prowler uses **User Account credentials**. You can configure your account using:
- `gcloud init` Set up a new account.
- `gcloud config set account <account>` Switch to an existing account.
Once configured, obtain access credentials using: `gcloud auth application-default login`.
- **Service Account Authentication**
Alternatively, you can use Service Account credentials:
Generate and download Service Account keys in JSON format. Refer to [Google IAM documentation](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) for details.
Provide the key file location using this argument:
```console
prowler gcp --credentials-file path
```
- **Scanning Specific GCP Projects**
By default, Prowler scans all accessible GCP projects. To scan specific projects, use the `--project-ids` flag:
```console
prowler gcp --project-ids <Project ID 1> <Project ID 2> ... <Project ID N>
```
- **GCP Retry Configuration**
To configure the maximum number of retry attempts for Google Cloud SDK API calls, use the `--gcp-retries-max-attempts` flag:
```console
prowler gcp --gcp-retries-max-attempts 5
```
This is useful when experiencing quota exceeded errors (HTTP 429) to increase the number of automatic retry attempts.
#### Kubernetes
Prowler enables security scanning of Kubernetes clusters, supporting both **in-cluster** and **external** execution.
- **Non In-Cluster Execution**
```console
prowler kubernetes --kubeconfig-file path
```
???+ note
If no `--kubeconfig-file` is provided, Prowler will use the default KubeConfig file location (`~/.kube/config`).
- **In-Cluster Execution**
To run Prowler inside the cluster, apply the provided YAML configuration to deploy a job in a new namespace:
```console
kubectl apply -f kubernetes/prowler-sa.yaml
kubectl apply -f kubernetes/job.yaml
kubectl apply -f kubernetes/prowler-role.yaml
kubectl apply -f kubernetes/prowler-rolebinding.yaml
kubectl get pods --namespace prowler-ns --> prowler-XXXXX
kubectl logs prowler-XXXXX --namespace prowler-ns
```
???+ note
By default, Prowler scans all namespaces in the active Kubernetes context. Use the `--context` flag to specify the context to be scanned and `--namespaces` to restrict scanning to specific namespaces.
#### Microsoft 365
Microsoft 365 requires specifying the auth method:
```console
# To use service principal authentication for MSGraph and PowerShell modules
prowler m365 --sp-env-auth
# To use both service principal (for MSGraph) and user credentials (for PowerShell modules)
prowler m365 --env-auth
# To use az cli authentication
prowler m365 --az-cli-auth
# To use browser authentication
prowler m365 --browser-auth --tenant-id "XXXXXXXX"
```
See more details about M365 Authentication in the [Requirements](getting-started/requirements.md#microsoft-365) section.
#### GitHub
Prowler enables security scanning of your **GitHub account**, including **Repositories**, **Organizations** and **Applications**.
- **Supported Authentication Methods**
Authenticate using one of the following methods:
```console
# Personal Access Token (PAT):
prowler github --personal-access-token pat
# OAuth App Token:
prowler github --oauth-app-token oauth_token
# GitHub App Credentials:
prowler github --github-app-id app_id --github-app-key app_key
```
???+ note
If no login method is explicitly provided, Prowler will automatically attempt to authenticate using environment variables in the following order of precedence:
1. `GITHUB_PERSONAL_ACCESS_TOKEN`
2. `OAUTH_APP_TOKEN`
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY`
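That precedence can be sketched as a small shell function (illustrative only; not Prowler's actual resolution logic):
```shell
# Sketch of the documented credential precedence for the GitHub provider
resolve_github_auth() {
  if [ -n "${GITHUB_PERSONAL_ACCESS_TOKEN:-}" ]; then
    echo "personal-access-token"
  elif [ -n "${OAUTH_APP_TOKEN:-}" ]; then
    echo "oauth-app-token"
  elif [ -n "${GITHUB_APP_ID:-}" ] && [ -n "${GITHUB_APP_KEY:-}" ]; then
    echo "github-app"
  else
    echo "no-credentials"
  fi
}
```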
#### Infrastructure as Code (IaC)
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Checkov](https://www.checkov.io/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
```console
# Scan a directory for IaC files
prowler iac --scan-path ./my-iac-directory
# Scan a remote GitHub repository (public or private)
prowler iac --scan-repository-url https://github.com/user/repo.git
# Authenticate to a private repo with GitHub username and PAT
prowler iac --scan-repository-url https://github.com/user/repo.git \
--github-username <username> --personal-access-token <token>
# Authenticate to a private repo with OAuth App Token
prowler iac --scan-repository-url https://github.com/user/repo.git \
--oauth-app-token <oauth_token>
# Specify frameworks to scan (default: all)
prowler iac --scan-path ./my-iac-directory --frameworks terraform kubernetes
# Exclude specific paths
prowler iac --scan-path ./my-iac-directory --exclude-path ./my-iac-directory/test,./my-iac-directory/examples
```
???+ note
- `--scan-path` and `--scan-repository-url` are mutually exclusive; only one can be specified at a time.
- For remote repository scans, authentication can be provided via CLI flags or environment variables (`GITHUB_OAUTH_APP_TOKEN`, `GITHUB_USERNAME`, `GITHUB_PERSONAL_ACCESS_TOKEN`). CLI flags take precedence.
- The IaC provider does not require cloud authentication for local scans.
- It is ideal for CI/CD pipelines and local development environments.
- For more details on supported frameworks and rules, see the [Checkov documentation](https://www.checkov.io/1.Welcome/Quick%20Start.html)
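For instance, the environment-variable alternative mentioned above might look like this (illustrative; `<username>` and `<token>` are placeholders):
```console
export GITHUB_USERNAME=<username>
export GITHUB_PERSONAL_ACCESS_TOKEN=<token>
prowler iac --scan-repository-url https://github.com/user/repo.git
```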
See more details about IaC scanning in the [IaC Tutorial](tutorials/iac/getting-started-iac.md) section.
## Prowler v2 Documentation
For **Prowler v2 Documentation**, refer to the [official repository](https://github.com/prowler-cloud/prowler/blob/8818f47333a0c1c1a457453c87af0ea5b89a385f/README.md).
- **Prowler CLI** (Command Line Interface)
- **Prowler App** (Web Application)
- [**Prowler Cloud**](https://cloud.prowler.com) A managed service built on top of Prowler App.
- [**Prowler Hub**](https://hub.prowler.com) A public library of versioned checks, cloud service artifacts, and compliance frameworks.

### Installation
Prowler App supports multiple installation methods based on your environment.
Refer to the [Prowler App Tutorial](../tutorials/prowler-app.md) for detailed usage instructions.
=== "Docker Compose"
_Requirements_:
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
_Commands_:
``` bash
curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/master/docker-compose.yml
curl -LO https://raw.githubusercontent.com/prowler-cloud/prowler/refs/heads/master/.env
docker compose up -d
```
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
???+ note
You can change the environment variables in the `.env` file. Note that it is not recommended to use the default values in production environments.
???+ note
There is a development mode available, you can use the file https://github.com/prowler-cloud/prowler/blob/master/docker-compose-dev.yml to run the app in development mode.
???+ warning
Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
=== "GitHub"
_Requirements_:
* `git` installed.
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* `npm` installed: [npm installation](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm).
* `Docker Compose` installed: https://docs.docker.com/compose/install/.
???+ warning
Make sure to have `api/.env` and `ui/.env.local` files with the required environment variables. You can find the required environment variables in the [`api/.env.example`](https://github.com/prowler-cloud/prowler/blob/master/api/.env.example) and [`ui/.env.template`](https://github.com/prowler-cloud/prowler/blob/master/ui/.env.template) files.
_Commands to run the API_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
docker compose up postgres valkey -d
cd src/backend
python manage.py migrate --database admin
gunicorn -c config/guniconf.py config.wsgi:application
```
???+ important
Starting from Poetry v2.0.0, `poetry shell` has been deprecated in favor of `poetry env activate`.
If your poetry version is below 2.0.0 you must keep using `poetry shell` to activate your environment.
In case you have any doubts, consult the Poetry environment activation guide: https://python-poetry.org/docs/managing-environments/#activating-the-environment
> Now, you can access the API documentation at http://localhost:8080/api/v1/docs.
_Commands to run the API Worker_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
cd src/backend
python -m celery -A config.celery worker -l info -E
```
_Commands to run the API Scheduler_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/api
poetry install
eval $(poetry env activate)
set -a
source .env
cd src/backend
python -m celery -A config.celery beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
```
_Commands to run the UI_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler/ui
npm install
npm run build
npm start
```
> Enjoy Prowler App at http://localhost:3000 by signing up with your email and password.
???+ warning
Google and GitHub authentication is only available in [Prowler Cloud](https://prowler.com).
### Update Prowler App
Upgrade your Prowler App installation using one of the following two options:
#### Option 1: Update Environment File
Edit the `.env` file and change version values:
```env
PROWLER_UI_VERSION="5.9.0"
PROWLER_API_VERSION="5.9.0"
```
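For example, the version bump can be scripted with `sed`. The sketch below works on a throwaway copy of the file; the paths and target version are illustrative:

```shell
# Create a throwaway .env copy to demonstrate the version bump (illustrative only).
ENV_FILE="$(mktemp)"
printf 'PROWLER_UI_VERSION="5.8.0"\nPROWLER_API_VERSION="5.8.0"\n' > "$ENV_FILE"

# Point both components at the new release.
NEW_VERSION="5.9.0"
sed -i.bak "s/^PROWLER_UI_VERSION=.*/PROWLER_UI_VERSION=\"$NEW_VERSION\"/" "$ENV_FILE"
sed -i.bak "s/^PROWLER_API_VERSION=.*/PROWLER_API_VERSION=\"$NEW_VERSION\"/" "$ENV_FILE"

cat "$ENV_FILE"
```

After editing the real `.env`, run `docker compose up -d` to recreate the containers with the new images.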
#### Option 2: Use Docker Compose Pull
```bash
docker compose pull --policy always
```
The `--policy always` flag ensures that Docker pulls the latest images even if they already exist locally.
???+ note "What Gets Preserved During Upgrade"
All data is preserved; nothing is deleted during the update.
### Troubleshooting
If containers don't start, check logs for errors:
```bash
# Check logs for errors
docker compose logs
# Verify image versions
docker images | grep prowler
```
If you encounter issues, roll back to the previous version by reverting the `.env` file to the earlier version values and running:
```bash
docker compose pull
docker compose up -d
```
### Container versions
The available versions of Prowler App are the following:
- `latest`: in sync with `master` branch (please note that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (please note that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (please note that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases); those are stable releases.
- `stable`: this tag always points to the latest release.
- `v4-stable`: this tag always points to the latest release for v4.
- `v3-stable`: this tag always points to the latest release for v3.
The container images are available here:
- Prowler App:
- [DockerHub - Prowler UI](https://hub.docker.com/r/prowlercloud/prowler-ui/tags)
- [DockerHub - Prowler API](https://hub.docker.com/r/prowlercloud/prowler-api/tags)

## Installation
Prowler is available as a project in [PyPI](https://pypi.org/project/prowler/). Install it as a Python package with `Python >= 3.9, <= 3.12`:
=== "pipx"
[pipx](https://pipx.pypa.io/stable/) installs Python applications in isolated environments. Use `pipx` for global installation.
_Requirements_:
* `Python >= 3.9, <= 3.12`
* `pipx` installed: [pipx installation](https://pipx.pypa.io/stable/installation/).
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
pipx install prowler
prowler -v
```
Upgrade Prowler to the latest version:
``` bash
pipx upgrade prowler
```
=== "pip"
???+ warning
This method modifies the chosen installation environment. Consider using [pipx](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) for global installation.
_Requirements_:
* `Python >= 3.9, <= 3.12`
* `Python pip >= 21.0.0`
* AWS, GCP, Azure, M365 and/or Kubernetes credentials
_Commands_:
``` bash
pip install prowler
prowler -v
```
Upgrade Prowler to the latest version:
``` bash
pip install --upgrade prowler
```
=== "Docker"
_Requirements_:
* Have `docker` installed: https://docs.docker.com/get-docker/.
* In the command below, change the local path in the `-v` volume mount to a directory on your machine in order to access the reports.
* AWS, GCP, Azure and/or Kubernetes credentials
> Containers are built for `linux/amd64`. If your workstation's architecture is different, please set `DOCKER_DEFAULT_PLATFORM=linux/amd64` in your environment or use the `--platform linux/amd64` flag in the docker command.
_Commands_:
``` bash
docker run -ti --rm -v /your/local/dir/prowler-output:/home/prowler/output \
--name prowler \
--env AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY \
--env AWS_SESSION_TOKEN toniblyx/prowler:latest
```
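On Apple Silicon or other non-amd64 hosts, the platform override mentioned in the note above can be set once per shell (a minimal sketch):

```shell
# Force Docker to use the linux/amd64 image variant for subsequent `docker run` calls.
export DOCKER_DEFAULT_PLATFORM=linux/amd64
echo "Default platform: $DOCKER_DEFAULT_PLATFORM"
```

Alternatively, pass `--platform linux/amd64` directly to the `docker run` command above.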
=== "GitHub"
_Requirements for Developers_:
* `git`
* `poetry` installed: [poetry installation](https://python-poetry.org/docs/#installation).
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
git clone https://github.com/prowler-cloud/prowler
cd prowler
poetry install
poetry run python prowler-cli.py -v
```
???+ note
If you are cloning Prowler on Windows, use `git config core.longpaths true` to allow long file paths.
=== "Amazon Linux 2"
_Requirements_:
* `Python >= 3.9, <= 3.12`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
prowler -v
```
=== "Ubuntu"
_Requirements_:
* `Ubuntu 23.04` or above; if you are using an older version of Ubuntu, check [pipx installation](https://docs.prowler.com/projects/prowler-open-source/en/latest/#__tabbed_1_1) and ensure you have `Python >= 3.9, <= 3.12`.
* `Python >= 3.9, <= 3.12`
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
sudo apt update
sudo apt install pipx
pipx ensurepath
pipx install prowler
prowler -v
```
=== "Brew"
_Requirements_:
* `Brew` installed on your Mac or Linux machine
* AWS, GCP, Azure and/or Kubernetes credentials
_Commands_:
``` bash
brew install prowler
prowler -v
```
=== "AWS CloudShell"
After the migration of AWS CloudShell from Amazon Linux 2 to Amazon Linux 2023 [[1]](https://aws.amazon.com/about-aws/whats-new/2023/12/aws-cloudshell-migrated-al2023/) [[2]](https://docs.aws.amazon.com/cloudshell/latest/userguide/cloudshell-AL2023-migration.html), there is no longer a need to manually compile Python 3.9, as it is already included in AL2023. Prowler can therefore be installed following the generic pip installation method. Follow the steps below to run Prowler v4 in AWS CloudShell:
_Requirements_:
* Open AWS CloudShell `bash`.
_Commands_:
```bash
sudo bash
adduser prowler
su prowler
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler aws
```
???+ note
To download the results from AWS CloudShell, select Actions -> Download File and add the full path of each file. For the CSV file it will be something like `/tmp/output/prowler-output-123456789012-20221220191331.csv`
=== "Azure CloudShell"
_Requirements_:
* Open Azure CloudShell `bash`.
_Commands_:
```bash
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install prowler
cd /tmp
prowler azure --az-cli-auth
```
## Container versions
The available versions of Prowler CLI are the following:
- `latest`: in sync with `master` branch (please note that it is not a stable version)
- `v4-latest`: in sync with `v4` branch (please note that it is not a stable version)
- `v3-latest`: in sync with `v3` branch (please note that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases); those are stable releases.
- `stable`: this tag always points to the latest release.
- `v4-stable`: this tag always points to the latest release for v4.
- `v3-stable`: this tag always points to the latest release for v3.
The container images are available here:
- Prowler CLI:
- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)

Prowler App is a web application that simplifies running Prowler. It provides:
- **User-friendly interface** for configuring and executing scans
- Dashboard to **view results** and manage **security findings**
![Prowler App](img/overview.png)
## Components
Prowler App consists of three main components:
- **Prowler UI**: User-friendly web interface for running Prowler and viewing results, powered by Next.js
- **Prowler API**: Backend API that executes Prowler scans and stores results, built with Django REST Framework
- **Prowler SDK**: Python SDK that integrates with Prowler CLI for advanced functionality
Supporting infrastructure includes:
- **PostgreSQL**: Persistent storage of scan results
- **Celery Workers**: Asynchronous execution of Prowler scans
- **Valkey**: In-memory database serving as message broker for Celery workers
![Prowler App Architecture](img/prowler-app-architecture.png)

Prowler CLI is a command-line interface for running Prowler scans from the terminal.
```console
prowler <provider>
```
![Prowler CLI Execution](../img/short-display.png)
## Prowler Dashboard
```console
prowler dashboard
```
![Prowler Dashboard](img/dashboard.png)
Prowler includes hundreds of security controls aligned with widely recognized industry frameworks and standards, including:
- CIS Benchmarks (AWS, Azure, Microsoft 365, Kubernetes, GitHub)
- NIST SP 800-53 (rev. 4 and 5) and NIST SP 800-171
- NIST Cybersecurity Framework (CSF)
- CISA Guidelines
- FedRAMP Low & Moderate
- PCI DSS v3.2.1 and v4.0
- ISO/IEC 27001:2013 and 2022
- SOC 2
- GDPR (General Data Protection Regulation)
- HIPAA (Health Insurance Portability and Accountability Act)
- FFIEC (Federal Financial Institutions Examination Council)
- ENS RD2022 (Spanish National Security Framework)
- GxP 21 CFR Part 11 and EU Annex 11
- RBI Cybersecurity Framework (Reserve Bank of India)
- KISA ISMS-P (Korean Information Security Management System)
- MITRE ATT&CK
- AWS Well-Architected Framework (Security & Reliability Pillars)
- AWS Foundational Technical Review (FTR)
- Microsoft NIS2 Directive (EU)
- Custom threat scoring frameworks (prowler_threatscore)
- Custom security frameworks for enterprise needs

# Security
## Compliance and Trust
We publish our live SOC 2 Type 2 Compliance data at [https://trust.prowler.com](https://trust.prowler.com)
As an **AWS Partner**, we have passed the [AWS Foundation Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/).
## Encryption (Prowler Cloud)
We use encryption everywhere possible. The data and communications used by **Prowler Cloud** are **encrypted at-rest** and **in-transit**.
## Data Retention Policy (Prowler Cloud)
Prowler Cloud is GDPR compliant with regard to personal data and the ["right to be forgotten"](https://gdpr.eu/right-to-be-forgotten/). When a user deletes their account, their user information will be deleted from Prowler Cloud online and backup systems within 10 calendar days.
## Software Security
As an **AWS Partner**, we have passed the [AWS Foundation Technical Review (FTR)](https://aws.amazon.com/partners/foundational-technical-review/), and we use the following tools and automation to make sure our code is secure and dependencies are up to date:
We follow a **security-by-design approach** throughout our software development lifecycle. All changes go through automated checks at every stage, from local development to production deployment.
- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.
We enforce [pre-commit](https://github.com/prowler-cloud/prowler/blob/master/.pre-commit-config.yaml) validations to catch issues early, and [our CI/CD pipelines](https://github.com/prowler-cloud/prowler/tree/master/.github) include multiple security gates to ensure code quality, secure configurations, and compliance with internal standards.
Our container registries are continuously scanned for vulnerabilities, with findings automatically reported to our security team for assessment and remediation. This process evolves alongside our stack as we adopt new languages, frameworks, and technologies, ensuring our security practices remain comprehensive, proactive, and adaptable.
## Reporting Vulnerabilities
At Prowler, we consider the security of our open source software and systems a top priority. But no matter how much effort we put into system security, there can still be vulnerabilities present.
If you discover a vulnerability, we would like to know about it so we can take steps to address it as quickly as possible. We would like to ask you to help us better protect our users, our clients and our systems.
If you would like to report a vulnerability or have a security concern regarding Prowler Open Source or the Prowler Cloud service, please submit the information by contacting us via [**support.prowler.com**](http://support.prowler.com).
We will review the submitted report, and assign it a tracking number. We will then respond to you, acknowledging receipt of the report, and outline the next steps in the process.
You will receive a non-automated response to your initial contact within 24 hours, confirming receipt of your reported vulnerability.
The information you share with the Prowler team as part of this process is kept confidential within Prowler. We will only share this information with a third party if the vulnerability you report is found to affect a third-party product, in which case we will share this information with the third-party product's author or manufacturer. Otherwise, we will only share this information as permitted by you.
We will coordinate public notification of any validated vulnerability with you. Where possible, we prefer that our respective public disclosures be posted simultaneously.
When reporting vulnerabilities, please consider (1) attack scenario / exploitability, and (2) the security impact of the bug. The following issues are considered out of scope:
- Social engineering support or attacks requiring social engineering.
- Clickjacking on pages with no sensitive actions.
- Cross-Site Request Forgery (CSRF) on unauthenticated forms or forms with no sensitive actions.
- Attacks requiring Man-In-The-Middle (MITM) or physical access to a user's device.
- Previously known vulnerable libraries without a working Proof of Concept (PoC).
- Comma Separated Values (CSV) injection without demonstrating a vulnerability.
- Missing best practices in SSL/TLS configuration.
- Any activity that could lead to the disruption of service (DoS).
- Rate limiting or brute force issues on non-authentication endpoints.
- Missing best practices in Content Security Policy (CSP).
- Missing HttpOnly or Secure flags on cookies.
- Configuration of or missing security headers.
- Missing email best practices, such as invalid, incomplete, or missing SPF/DKIM/DMARC records.
- Vulnerabilities only affecting users of outdated or unpatched browsers (less than two stable versions behind).
- Software version disclosure, banner identification issues, or descriptive error messages.
- Tabnabbing.
- Issues that require unlikely user interaction.
- Improper logout functionality and improper session timeout.
- CORS misconfiguration without an exploitation scenario.
- Broken link hijacking.
- Automated scanning results (e.g., sqlmap, Burp active scanner) that have not been manually verified.
- Content spoofing and text injection issues without a clear attack vector.
- Email spoofing without exploiting security flaws.
- Dead links or broken links.
- User enumeration.
Testing guidelines:
- Do not run automated scanners on other customer projects. Running automated scanners can run up costs for our users. Aggressively configured scanners might inadvertently disrupt services, exploit vulnerabilities, lead to system instability or breaches and violate Terms of Service from our upstream providers. Our own security systems won't be able to distinguish hostile reconnaissance from whitehat research. If you wish to run an automated scanner, notify us at support@prowler.com and only run it on your own Prowler app project. Do NOT attack Prowler projects in use by other customers.
- Do not take advantage of the vulnerability or problem you have discovered, for example by downloading more data than necessary to demonstrate the vulnerability or deleting or modifying other people's data.
Reporting guidelines:
- File a report through our Support Desk at https://support.prowler.com
- If it is about a lack of a security functionality, please file a feature request instead at https://github.com/prowler-cloud/prowler/issues
- Do provide sufficient information to reproduce the problem, so we will be able to resolve it as quickly as possible.
- If you have further questions and want direct interaction with the Prowler team, please contact us via our Community Slack at goto.prowler.com/slack.
Disclosure guidelines:
- In order to protect our users and customers, do not reveal the problem to others until we have researched, addressed and informed our affected customers.
- If you want to publicly share your research about Prowler at a conference, in a blog or any other public forum, you should share a draft with us for review and approval at least 30 days prior to the publication date. Please note that the following should not be included:
- Data regarding any Prowler user or customer projects.
- Prowler customers' data.
- Information about Prowler employees, contractors or partners.
What we promise:
- We will respond to your report within 5 business days with our evaluation of the report and an expected resolution date.
- If you have followed the instructions above, we will not take any legal action against you in regard to the report.
- We will handle your report with strict confidentiality, and not pass on your personal details to third parties without your permission.
- We will keep you informed of the progress towards resolving the problem.
- In the public information concerning the problem reported, we will give your name as the discoverer of the problem (unless you desire otherwise).
We strive to resolve all problems as quickly as possible, and we would like to play an active role in the ultimate publication on the problem after it is resolved.

# AWS Authentication in Prowler
Proper authentication is required for Prowler to perform security checks across AWS resources. Ensure that the AWS CLI is correctly configured or manually declare AWS credentials before running scans.
## Required Permissions
To ensure full functionality, attach the following AWS managed policies to the designated user or role:
- `arn:aws:iam::aws:policy/SecurityAudit`
- `arn:aws:iam::aws:policy/job-function/ViewOnlyAccess`
### Additional Permissions
For certain checks, additional read-only permissions are required. Attach the following custom policy to your role: [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json)
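As a hedged sketch, the managed policies listed above can be attached with the AWS CLI. `ProwlerScanRole` is a placeholder role name, and the commands below are printed rather than executed; copy them into a shell with AWS credentials and `iam:AttachRolePolicy` permission to apply them:

```shell
# Print the AWS CLI calls that attach Prowler's required managed policies
# to a role. "ProwlerScanRole" is a placeholder; no AWS call is made here.
ROLE_NAME="ProwlerScanRole"
CMDS=""
for POLICY_ARN in \
    "arn:aws:iam::aws:policy/SecurityAudit" \
    "arn:aws:iam::aws:policy/job-function/ViewOnlyAccess"; do
  CMDS="${CMDS}aws iam attach-role-policy --role-name $ROLE_NAME --policy-arn $POLICY_ARN
"
done
printf '%s' "$CMDS"
```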
## Configure AWS Credentials
``` bash
export AWS_ACCESS_KEY_ID="XXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="XXXXXXXXX"
export AWS_SESSION_TOKEN="XXXXXXXXX"
```
These credentials must be associated with a user or role with the necessary permissions to perform security checks.
???+ note
Some security checks require read-only additional permissions. Attach the following custom policies to the role: [prowler-additions-policy.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-additions-policy.json). If you want Prowler to send findings to [AWS Security Hub](https://aws.amazon.com/security-hub), make sure to also attach the custom policy: [prowler-security-hub.json](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-security-hub.json).
## AWS Profiles and Service Scanning in Prowler
Prowler supports authentication and security assessments using custom AWS profiles and can optionally scan unused services.
**Using Custom AWS Profiles**
Specify a custom AWS profile using the following command:
```console
prowler aws -p/--profile <profile_name>
```
## Multi-Factor Authentication (MFA)
For IAM entities requiring Multi-Factor Authentication (MFA), use the `--mfa` flag. Prowler prompts for the following values to initiate a new session:
- **ARN of your MFA device**
- **TOTP (Time-Based One-Time Password)**

This method grants permanent access and is the recommended setup for production.
![External ID](./img/prowler-cloud-external-id.png)
![Stack Data](./img/fill-stack-data.png)
!!! info
An **External ID** is required when assuming the *ProwlerScan* role to comply with AWS [confused deputy prevention](https://docs.aws.amazon.com/IAM/latest/UserGuide/confused-deputy.html).
6. Acknowledge the IAM resource creation warning and proceed
![Stack Creation Second Step](./img/stack-creation-second-step.png)

# AWS Organizations in Prowler
Prowler can integrate with AWS Organizations to manage the visibility and onboarding of accounts centrally.
When trusted access is enabled with the Organization, Prowler can discover accounts as they are created and even automate deployment of the Prowler Scan IAM Role.
> Trusted access can be enabled in the Management Account from the AWS Console under **AWS Organizations → Settings → Trusted access for AWS CloudFormation StackSets**.
If you are not using StackSets or Prowler Cloud and only need to scan AWS Organization accounts using the CLI, it is possible to assume a role in each account manually or automate that logic with custom scripts.
## Retrieving AWS Account Details
If AWS Organizations is enabled, Prowler can fetch detailed account information during scans, including:
Prowler will scan the AWS account and get the account details from AWS Organizations.
### Handling JSON Output
In Prowler's JSON output, tags are encoded in Base64 to prevent formatting errors in CSV or JSON outputs. This ensures compatibility when exporting findings.
```json
"Account Email": "my-prod-account@domain.com",
```
The additional fields in CSV header output are as follows:
- ACCOUNT\_DETAILS\_ORG
- ACCOUNT\_DETAILS\_TAGS
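For instance, a tag value from the `ACCOUNT_DETAILS_TAGS` field can be decoded with standard tools. The encoded string below is an illustrative sample, not real scan output:

```shell
# Decode a Base64-encoded tag string as produced in Prowler's output.
ENCODED="ZW52PXByb2R1Y3Rpb24="   # sample value; encodes "env=production"
DECODED="$(printf '%s' "$ENCODED" | base64 -d)"
echo "$DECODED"
```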
## Deploying Prowler IAM Roles Across AWS Organizations
When onboarding multiple AWS accounts into Prowler Cloud, it is important to deploy the Prowler Scan IAM Role in each account. The most efficient way to do this across an AWS Organization is by leveraging AWS CloudFormation StackSets, which rolls out infrastructure—like IAM roles—to all accounts centrally from the Management or Delegated Admin account.
When using Infrastructure as Code (IaC), Terraform is recommended to manage this deployment systematically.
### Recommended Approach
- **Use StackSets** from the **Management Account** (or a Delegated Admin/Security Account).
- **Use Terraform** to orchestrate the deployment.
- **Use the official CloudFormation template** provided by Prowler.
- Target specific Organizational Units (OUs) or the entire Organization.
???+ note
The community article on which this implementation is based is available here:
[Deploy IAM Roles Across an AWS Organization as Code (Unicrons)](https://unicrons.cloud/en/2024/10/14/deploy-iam-roles-across-an-aws-organization-as-code/)
This guide has been adapted with permission and aligned with Prowler's IAM role requirements.
---
### Step-by-Step Guide Using Terraform
Below is a ready-to-use Terraform snippet that deploys the [Prowler Scan IAM Role CloudFormation template](https://github.com/prowler-cloud/prowler/blob/master/permissions/templates/cloudformation/prowler-scan-role.yml) across the AWS Organization using StackSets:
```hcl title="main.tf"
data "aws_caller_identity" "this" {}
data "aws_organizations_organization" "this" {}

module "prowler-scan-role" {
  source = "unicrons/organization-iam-role/aws"

  stack_set_name        = "prowler-scan-role"
  stack_set_description = "Deploy Prowler Scan IAM Role across all organization accounts"
  template_path         = "${path.root}/prowler-scan-role.yaml"

  template_parameters = {
    ExternalId = "<< external ID >>" # Replace with the External ID provided by Prowler Cloud
  }

  # Specific OU IDs can be specified instead of root
  organizational_unit_ids = [data.aws_organizations_organization.this.roots[0].id]
}
```
#### `prowler-scan-role.yaml`
Download or reference the official CloudFormation template directly from GitHub:
- [prowler-scan-role.yml](https://github.com/prowler-cloud/prowler/blob/master/permissions/templates/cloudformation/prowler-scan-role.yml)
---
### IAM Role: External ID Support
Include the `ExternalId` parameter in the StackSet if required by the organization's Prowler Cloud setup. This ensures secure cross-account access for scanning.
---
When encountering issues during deployment or needing to target specific OUs or environments (e.g., dev/staging/prod), reach out to the Prowler team via [Slack Community](https://prowler.com/slack) or [Support](mailto:support@prowler.com).
## Running Prowler Across All AWS Organization Accounts by Assuming Roles

AWS Security Hub can be enabled using either of the following methods:
#### Enabling AWS Security Hub for Prowler Integration
If AWS Security Hub is already enabled, you can proceed to the [next section](#enabling-prowler-integration-in-aws-security-hub).
1. Enable AWS Security Hub via Console: Open the **AWS Security Hub** console: https://console.aws.amazon.com/securityhub/.
#### Enabling Prowler Integration in AWS Security Hub
If the Prowler integration is already enabled in AWS Security Hub, you can proceed to the [next section](#sending-findings-to-aws-security-hub) and begin sending findings.
Once **AWS Security Hub** is activated, **Prowler** must be enabled as partner integration to allow security findings to be sent to it.

# Azure Authentication in Prowler
By default, Prowler utilizes the Azure Python SDK identity package for authentication, leveraging the classes `DefaultAzureCredential` and `InteractiveBrowserCredential`. Prowler for Azure supports the following authentication methods:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- Existing **AZ CLI credentials**
- **Interactive browser authentication**
- [**Managed Identity**](https://learn.microsoft.com/en-us/entra/identity/managed-identities-azure-resources/overview) authentication
> ⚠️ **Important:** For Prowler App, only Service Principal authentication is supported.
Before launching the tool, specify the desired method using the following flags:
```console
# Service principal authentication
prowler azure --sp-env-auth
# AZ CLI authentication
prowler azure --az-cli-auth
# Browser authentication
prowler azure --browser-auth --tenant-id "XXXXXXXX"
# Managed identity authentication
prowler azure --managed-identity-auth
```
### Service Principal Application Authentication
Enable Prowler authentication using a Service Principal Application by setting up the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXX"
```
Execution with the `--sp-env-auth` flag fails if these variables are not set or exported.
Refer to the [Create Prowler Service Principal](create-prowler-service-principal.md) guide for detailed setup instructions.
## Permission Configuration
To ensure Prowler can access the required resources within your Azure account, proper permissions must be configured. Refer to the [Requirements](../../getting-started/requirements.md) section for details on setting up necessary privileges.
### Azure Authentication Methods
Prowler for Azure supports the following authentication methods:
- **AZ CLI Authentication (`--az-cli-auth`)**: Automated authentication using stored AZ CLI credentials.
- **Managed Identity Authentication (`--managed-identity-auth`)**: Automated authentication via Azure Managed Identity.
- **Browser Authentication (`--browser-auth`)**: Requires the user to authenticate using the default browser. The `tenant-id` parameter is mandatory for this method.
### Required Permissions
Prowler for Azure requires two types of permission scopes:
#### Microsoft Entra ID Permissions
These permissions allow Prowler to retrieve metadata from the assumed identity and perform specific Entra checks. While not mandatory for execution, they enhance functionality.
Required permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All` (used for Entra multifactor authentication checks)
???+ note
Replace `Directory.Read.All` with `Domain.Read.All` for more restrictive permissions. Note that Entra checks related to DirectoryRoles and GetUsers will not run with this permission.
#### Subscription Scope Permissions
These permissions are required to perform security checks against Azure resources. The following **RBAC roles** must be assigned per subscription to the entity used by Prowler:
- `Reader`: Grants read-only access to Azure resources.
- `ProwlerRole`: A custom role with minimal permissions, defined in the [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json).
???+ note
The `assignableScopes` field in the JSON custom role file must be updated to reflect the correct subscription or management group. Use one of the following formats: `/subscriptions/<subscription-id>` or `/providers/Microsoft.Management/managementGroups/<management-group-id>`.
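For illustration, a subscription-scoped `assignableScopes` entry in the custom role file would look like this (the subscription ID is a placeholder):

```json
"assignableScopes": [
    "/subscriptions/00000000-0000-0000-0000-000000000000"
]
```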
### Assigning Permissions
To properly configure permissions, follow these guides:
- [Microsoft Entra ID permissions](create-prowler-service-principal.md#assigning-proper-permissions)
- [Azure subscription permissions](subscriptions.md)
???+ warning
Some permissions in `ProwlerRole` involve **write access**. If a `ReadOnly` lock is attached to certain resources, you may encounter errors, and findings for those checks will not be available.
#### Checks Requiring `ProwlerRole`
The following security checks require the `ProwlerRole` permissions for execution. Ensure the role is assigned to the identity assumed by Prowler before running these checks:
- `app_function_access_keys_configured`
- `app_function_ftps_deployment_disabled`

View File

@@ -73,7 +73,7 @@ Permissions can be assigned via the Azure Portal or the Azure CLI.
7. Finally, search for "Directory", "Policy", and "UserAuthenticationMethod", then select the following permissions:
- `Domain.Read.All`
- `Directory.Read.All`
- `Policy.Read.All`

View File

@@ -21,7 +21,7 @@ Prowler allows you to specify one or more subscriptions for scanning (up to N),
To perform scans, ensure that the identity assumed by Prowler has the appropriate permissions.
By default, Prowler scans all accessible subscriptions. If you need to audit specific subscriptions, you must assign the necessary role `Reader` for each one. For streamlined and less repetitive role assignments in multi-subscription environments, refer to the [following section](#recommendation-for-managing-multiple-subscriptions).
### Assigning the Reader Role in Azure Portal
@@ -76,7 +76,7 @@ Navigate to the subscription you want to audit with Prowler.
Some read-only permissions required for specific security checks are not included in the built-in Reader role. To support these checks, Prowler utilizes a custom role, defined in [prowler-azure-custom-role](https://github.com/prowler-cloud/prowler/blob/master/permissions/prowler-azure-custom-role.json). Once created, this role can be assigned following the same process as the `Reader` role.
The checks requiring this `ProwlerRole` can be found in this [section](../../tutorials/azure/authentication.md#checks-requiring-prowlerrole).
#### Create ProwlerRole via Azure Portal
@@ -152,7 +152,7 @@ Scanning multiple subscriptions requires creating and assigning roles for each,
![Create management group](../../img/create-management-group.gif)
2. **Assign Roles**: Assign necessary roles to the management group, similar to the [role assignment process](#assigning-permissions-for-subscription-scans).
Role assignment should be done at the management group level instead of per subscription.

View File

@@ -0,0 +1,367 @@
# Bulk Provider Provisioning in Prowler
Prowler enables automated provisioning of multiple cloud providers through the Bulk Provider Provisioning tool. This approach streamlines the onboarding process for organizations managing numerous cloud accounts, subscriptions, and projects across AWS, Azure, GCP, Kubernetes, Microsoft 365, and GitHub.
The tool is available in the Prowler repository at: [util/prowler-bulk-provisioning](https://github.com/prowler-cloud/prowler/tree/master/util/prowler-bulk-provisioning)
![](./img/bulk-provider-provisioning.png)
## Overview
The Bulk Provider Provisioning tool automates the creation of cloud providers in Prowler App or Prowler Cloud by:
* Reading provider configurations from YAML files
* Creating providers with appropriate authentication credentials
* Testing connections to verify successful authentication
* Processing multiple providers concurrently for efficiency
## Prerequisites
### Requirements
* Python 3.7 or higher
* Prowler API token (from Prowler Cloud or self-hosted Prowler App)
* For self-hosted Prowler App, remember to [point to your API base URL](#custom-api-endpoints)
* Authentication credentials for target cloud providers
### Installation
Clone the repository and install the required dependencies:
```bash
git clone https://github.com/prowler-cloud/prowler.git
cd prowler/util/prowler-bulk-provisioning
pip install -r requirements.txt
```
### Authentication Setup
Configure your Prowler API token:
```bash
export PROWLER_API_TOKEN="your-prowler-api-token"
```
To obtain an API token programmatically:
```bash
export PROWLER_API_TOKEN=$(curl --location 'https://api.prowler.com/api/v1/tokens' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{
"data": {
"type": "tokens",
"attributes": {
"email": "your@email.com",
"password": "your-password"
}
}
}' | jq -r .data.attributes.access)
```
## Configuration File Structure
Create a YAML file listing your cloud providers and credentials:
```yaml
# providers.yaml
- provider: aws
uid: "123456789012" # AWS Account ID
alias: "production-account"
auth_method: role
credentials:
role_arn: "arn:aws:iam::123456789012:role/ProwlerScanRole"
external_id: "prowler-external-id"
- provider: azure
uid: "00000000-1111-2222-3333-444444444444" # Subscription ID
alias: "azure-production"
auth_method: service_principal
credentials:
tenant_id: "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
client_id: "ffffffff-1111-2222-3333-444444444444"
client_secret: "your-client-secret"
- provider: gcp
uid: "my-gcp-project" # Project ID
alias: "gcp-production"
auth_method: service_account
credentials:
service_account_key_json_path: "./service-account.json"
```
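As a quick sanity check before running the tool, the required keys of each entry can be validated with a few lines of Python. This is a standalone sketch, not part of the tool itself; the field names follow the structure shown above:

```python
# Minimal validation sketch for provider entries (illustrative, not part of the tool).
REQUIRED_KEYS = {"provider", "uid", "alias", "auth_method", "credentials"}
SUPPORTED_PROVIDERS = {"aws", "azure", "gcp", "kubernetes", "m365", "github"}

def validate_entry(entry):
    """Return a list of problems found in a single provider entry."""
    problems = [f"missing key: {key}" for key in sorted(REQUIRED_KEYS - entry.keys())]
    if entry.get("provider") not in SUPPORTED_PROVIDERS:
        problems.append(f"unsupported provider: {entry.get('provider')!r}")
    return problems

entries = [
    {"provider": "aws", "uid": "123456789012", "alias": "production-account",
     "auth_method": "role",
     "credentials": {"role_arn": "arn:aws:iam::123456789012:role/ProwlerScanRole"}},
    {"provider": "oracle", "uid": "x"},  # invalid: unsupported provider, missing keys
]
for index, entry in enumerate(entries):
    print(index, validate_entry(entry))
```

Loading the YAML file itself (e.g. with PyYAML) yields exactly this list-of-dicts shape, so the same check applies.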
## Running the Bulk Provisioning Tool
### Basic Usage
To provision all providers from your configuration file:
```bash
python prowler_bulk_provisioning.py providers.yaml
```
The tool automatically tests each provider connection after creation (enabled by default).
### Dry Run Mode
Test your configuration without making API calls:
```bash
python prowler_bulk_provisioning.py providers.yaml --dry-run
```
### Skip Connection Testing
To provision providers without testing connections:
```bash
python prowler_bulk_provisioning.py providers.yaml --test-provider false
```
### Test Existing Providers Only
To verify connections for already provisioned providers:
```bash
python prowler_bulk_provisioning.py providers.yaml --test-provider-only
```
## Provider-Specific Configuration
### AWS Provider Configuration
#### Using IAM Role (Recommended)
```yaml
- provider: aws
uid: "123456789012"
alias: "aws-production"
auth_method: role
credentials:
role_arn: "arn:aws:iam::123456789012:role/ProwlerScanRole"
external_id: "optional-external-id"
session_name: "prowler-scan-session" # optional
duration_seconds: 3600 # optional
```
#### Using Access Keys
```yaml
- provider: aws
uid: "123456789012"
alias: "aws-development"
auth_method: credentials
credentials:
access_key_id: "AKIA..."
secret_access_key: "..."
session_token: "..." # optional for temporary credentials
```
### Azure Provider Configuration
```yaml
- provider: azure
uid: "subscription-uuid"
alias: "azure-production"
auth_method: service_principal
credentials:
tenant_id: "tenant-uuid"
client_id: "client-uuid"
client_secret: "client-secret"
```
### GCP Provider Configuration
#### Using Service Account JSON
```yaml
- provider: gcp
uid: "project-id"
alias: "gcp-production"
auth_method: service_account
credentials:
service_account_key_json_path: "/path/to/key.json"
```
#### Using OAuth2 Credentials
```yaml
- provider: gcp
uid: "project-id"
alias: "gcp-production"
auth_method: oauth2
credentials:
client_id: "123456789.apps.googleusercontent.com"
client_secret: "GOCSPX-xxxx"
refresh_token: "1//0exxxxxx"
```
### Kubernetes Provider Configuration
```yaml
- provider: kubernetes
uid: "context-name"
alias: "eks-production"
auth_method: kubeconfig
credentials:
kubeconfig_path: "~/.kube/config"
# OR inline configuration:
# kubeconfig_inline: |
# apiVersion: v1
# clusters: ...
```
### Microsoft 365 Provider Configuration
```yaml
- provider: m365
uid: "domain.onmicrosoft.com"
alias: "m365-tenant"
auth_method: service_principal
credentials:
tenant_id: "tenant-uuid"
client_id: "client-uuid"
client_secret: "client-secret"
```
### GitHub Provider Configuration
#### Using Personal Access Token
```yaml
- provider: github
uid: "organization-name"
alias: "github-org"
auth_method: personal_access_token
credentials:
token: "ghp_..."
```
#### Using GitHub App
```yaml
- provider: github
uid: "organization-name"
alias: "github-org"
auth_method: github_app
credentials:
app_id: "123456"
private_key_path: "/path/to/private-key.pem"
```
## Advanced Configuration
### Concurrent Processing
Adjust the number of concurrent provider creations:
```bash
python prowler_bulk_provisioning.py providers.yaml --concurrency 10
```
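Conceptually, the concurrency flag bounds how many providers are processed at once, along the lines of this sketch (the `create_provider` stub stands in for the real API call and is an assumption for illustration, not the tool's actual code):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def create_provider(entry):
    # Stub standing in for the real Prowler API call (illustrative only).
    return {"alias": entry["alias"], "status": "created"}

entries = [{"alias": f"account-{i}"} for i in range(5)]

# Bound the number of in-flight requests, the way --concurrency does.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(create_provider, e) for e in entries]
    results = [f.result() for f in as_completed(futures)]

print(sorted(r["alias"] for r in results))
# → ['account-0', 'account-1', 'account-2', 'account-3', 'account-4']
```

Raising the worker count speeds up large batches but increases load on the API, so tune it to your environment.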
### Custom API Endpoints
For self-hosted Prowler App installations:
```bash
python prowler_bulk_provisioning.py providers.yaml \
--base-url http://localhost:8080/api/v1
```
### Timeout Configuration
Set custom timeout for API requests:
```bash
python prowler_bulk_provisioning.py providers.yaml --timeout 120
```
## Bulk Provider Management
### Deleting Multiple Providers
To remove all providers from your Prowler account:
```bash
python nuke_providers.py --confirm
```
Filter deletions by provider type:
```bash
python nuke_providers.py --confirm --filter-provider aws
```
Filter deletions by alias pattern:
```bash
python nuke_providers.py --confirm --filter-alias "test-*"
```
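The `"test-*"` pattern above is a shell-style glob; the selection behaves like Python's `fnmatch`. A sketch of that filtering logic under this assumption (not the script's actual implementation):

```python
from fnmatch import fnmatch

providers = [
    {"alias": "test-aws-1", "provider": "aws"},
    {"alias": "test-gcp-1", "provider": "gcp"},
    {"alias": "production", "provider": "aws"},
]

# Keep only providers whose alias matches the glob pattern.
pattern = "test-*"
to_delete = [p for p in providers if fnmatch(p["alias"], pattern)]
print([p["alias"] for p in to_delete])  # → ['test-aws-1', 'test-gcp-1']
```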
## Configuration File Format
The tool uses YAML format for provider configuration files. Each provider entry requires:
* `provider`: The cloud provider type (aws, azure, gcp, kubernetes, m365, github)
* `uid`: Unique identifier for the provider (account ID, subscription ID, project ID, etc.)
* `alias`: A friendly name for the provider
* `auth_method`: Authentication method to use
* `credentials`: Authentication credentials specific to the provider and method
Example YAML structure:
```yaml
- provider: aws
uid: "123456789012"
alias: "production"
auth_method: role
credentials:
role_arn: "arn:aws:iam::123456789012:role/ProwlerScan"
```
## Example Output
Successful provider provisioning:
```
[1] ✅ Created provider (id=db9a8985-f9ec-4dd8-b5a0-e05ab3880bed)
[1] ✅ Created secret (id=466f76c6-5878-4602-a4bc-13f9522c1fd2)
[1] ✅ Connection test: Connected
[2] ✅ Created provider (id=7a99f789-0cf5-4329-8279-2d443a962676)
[2] ✅ Created secret (id=c5702180-f7c4-40fd-be0e-f6433479b126)
[2] ⚠️ Connection test: Not connected
Done. Success: 2 Failures: 0
```
## Troubleshooting
### Invalid API Token
```
Error: 401 Unauthorized
Solution: Verify your PROWLER_API_TOKEN or --token parameter
```
### Network Timeouts
```
Error: Connection timeout
Solution: Increase timeout with --timeout 120
```
### Provider Already Exists
```
Error: Provider with this UID already exists
Solution: Use different UID or delete existing provider first
```
### Authentication Failures
```
Connection test: Not connected
Solution: Verify credentials and IAM permissions
```

View File

@@ -80,8 +80,12 @@ The following list includes all the Azure checks with configurable variables tha
| `app_ensure_python_version_is_latest` | `python_latest_version` | String |
| `app_ensure_java_version_is_latest` | `java_latest_version` | String |
| `sqlserver_recommended_minimal_tls_version` | `recommended_minimal_tls_versions` | List of Strings |
| `vm_sufficient_daily_backup_retention_period` | `vm_backup_min_daily_retention_days` | Integer |
| `vm_desired_sku_size` | `desired_vm_sku_sizes` | List of Strings |
| `defender_attack_path_notifications_properly_configured` | `defender_attack_path_minimal_risk_level` | String |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_threshold` | Float |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_minutes` | Integer |
| `apim_threat_detection_llm_jacking` | `apim_threat_detection_llm_jacking_actions` | List of Strings |
## GCP
@@ -493,6 +497,48 @@ azure:
"Standard_DS3_v2",
"Standard_D4s_v3",
]
# Azure VM Backup Configuration
# azure.vm_sufficient_daily_backup_retention_period
vm_backup_min_daily_retention_days: 7
# Azure API Management Threat Detection Configuration
# azure.apim_threat_detection_llm_jacking
apim_threat_detection_llm_jacking_threshold: 0.1
apim_threat_detection_llm_jacking_minutes: 1440
apim_threat_detection_llm_jacking_actions:
[
# OpenAI API endpoints
"ImageGenerations_Create",
"ChatCompletions_Create",
"Completions_Create",
"Embeddings_Create",
"FineTuning_Jobs_Create",
"Models_List",
# Azure OpenAI endpoints
"Deployments_List",
"Deployments_Get",
"Deployments_Create",
"Deployments_Delete",
# Anthropic endpoints
"Messages_Create",
"Claude_Create",
# Google AI endpoints
"GenerateContent",
"GenerateText",
"GenerateImage",
# Meta AI endpoints
"Llama_Create",
"CodeLlama_Create",
# Other LLM endpoints
"Gemini_Generate",
"Claude_Generate",
"Llama_Generate"
]
# GCP Configuration
gcp:

View File

@@ -1,40 +1,40 @@
# GCP Authentication in Prowler
## Required Permissions
Prowler for Google Cloud requires the following permissions:
### IAM Roles
- **Reader (`roles/reader`)**: Must be granted at the **project, folder, or organization** level to allow scanning of target projects.
### Project-Level Settings
At least one project must have the following configurations:
- **Identity and Access Management (IAM) API (`iam.googleapis.com`)**: Must be enabled via:
    - The [Google Cloud API UI](https://console.cloud.google.com/apis/api/iam.googleapis.com/metrics), or
    - The `gcloud` CLI:
      ```sh
      gcloud services enable iam.googleapis.com --project <your-project-id>
      ```
- **Service Usage Consumer (`roles/serviceusage.serviceUsageConsumer`) IAM Role**: Required for resource scanning.
- **Quota Project Setting**: Define a quota project using either:
    - The `gcloud` CLI:
      ```sh
      gcloud auth application-default set-quota-project <project-id>
      ```
    - Setting an environment variable:
      ```sh
      export GOOGLE_CLOUD_QUOTA_PROJECT=<project-id>
      ```
???+ note
    `prowler` will scan the GCP project associated with the credentials.
## Credentials lookup order
Prowler follows the same credential search process as [Google authentication libraries](https://cloud.google.com/docs/authentication/application-default-credentials#search_order), checking credentials in this order:
@@ -52,26 +52,27 @@ Prowler follows the same credential search process as [Google authentication lib
Prowler will use the enabled Google Cloud APIs to get the information needed to perform the checks.
## Using an Access Token
For existing access tokens (e.g., generated with `gcloud auth print-access-token`), run Prowler with:
```bash
export CLOUDSDK_AUTH_ACCESS_TOKEN=$(gcloud auth print-access-token)
prowler gcp --project-ids <project-id>
```
???+ note
When using this method, also set the default project explicitly:
```bash
export GOOGLE_CLOUD_PROJECT=<project-id>
```
## Impersonating a GCP Service Account
To impersonate a GCP service account, use the `--impersonate-service-account` argument followed by the service account email:

View File

@@ -1,4 +1,4 @@
# GitHub Authentication in Prowler
Prowler supports multiple methods to [authenticate with GitHub](https://docs.github.com/en/rest/authentication/authenticating-to-the-rest-api). These include:
@@ -6,7 +6,7 @@ Prowler supports multiple methods to [authenticate with GitHub](https://docs.git
- **OAuth App Token**
- **GitHub App Credentials**
This flexibility enables scanning and analysis of GitHub accounts, including repositories, organizations, and applications, using the method that best suits the use case.
## Supported Login Methods
@@ -44,4 +44,4 @@ If no login method is explicitly provided, Prowler will automatically attempt to
3. `GITHUB_APP_ID` and `GITHUB_APP_KEY` (where the key is the content of the private key file)
???+ note
When not specifying a login method, ensure the corresponding environment variables are set before running Prowler so it can detect them automatically.

View File

@@ -11,9 +11,12 @@ This guide explains how to set up authentication with GitHub for Prowler. The do
### 1. Personal Access Token (PAT)
Personal Access Tokens provide the simplest GitHub authentication method, but they can only access resources owned by a single user or organization.
#### How to Create a Personal Access Token
???+ warning "Classic Tokens Deprecated"
GitHub has deprecated Personal Access Tokens (classic) in favor of fine-grained Personal Access Tokens. We recommend using fine-grained tokens as they provide better security through more granular permissions and resource-specific access control.
#### **Option 1: Create a Fine-Grained Personal Access Token (Recommended)**
1. **Navigate to GitHub Settings**
- Open [GitHub](https://github.com) and sign in
@@ -41,13 +44,13 @@ Personal Access Tokens provide the simplest GitHub authentication method and sup
To enable Prowler functionality, configure the following permissions:
- **Repository permissions:**
- **Administration**: Read-only access
- **Contents**: Read-only access
- **Metadata**: Read-only access
- **Pull requests**: Read-only access
- **Security advisories**: Read-only access
- **Statuses**: Read-only access
- **Organization permissions:**
- **Administration**: Read-only access
- **Members**: Read-only access
- **Account permissions:**
@@ -80,11 +83,11 @@ Personal Access Tokens provide the simplest GitHub authentication method and sup
4. **Configure Token Permissions**
To enable Prowler functionality, configure the following scopes:
- `repo`: Full control of private repositories (includes `repo:status` and `repo:contents`)
- `read:org`: Read organization and team membership
- `read:user`: Read user profile data
- `read:discussion`: Read discussions
- `security_events`: Access security events (secret scanning and Dependabot alerts)
- `read:enterprise`: Read enterprise data (if using GitHub Enterprise)
5. **Copy and Store the Token**
- Copy the generated token immediately (GitHub displays tokens only once)
@@ -183,7 +186,10 @@ GitHub Apps provide the recommended integration method for accessing multiple re
- **Account permissions**:
- Email addresses (Read)
4. **Where can this GitHub App be installed?**
- Select "Any account" so the GitHub App can be installed in any organization.
5. **Generate Private Key**
- Scroll to the "Private keys" section after app creation
- Click "Generate a private key"
- Download the `.pem` file and store securely

View File

@@ -0,0 +1,11 @@
# IaC Authentication in Prowler
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks and requires no cloud authentication for local scans.
### Authentication
- For local scans, no authentication is required.
- For remote repository scans, authentication can be provided via:
- [**GitHub Username and Personal Access Token (PAT)**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic)
- [**GitHub OAuth App Token**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-fine-grained-personal-access-token)
- [**Git URL**](https://git-scm.com/docs/git-clone#_git_urls)

View File

@@ -1,32 +1,22 @@
# Getting Started with the IaC Provider
Prowler's Infrastructure as Code (IaC) provider enables you to scan local or remote infrastructure code for security and compliance issues using [Trivy](https://trivy.dev/). This provider supports a wide range of IaC frameworks, allowing you to assess your code before deployment.
## Supported Scanners
The IaC provider leverages Trivy to support multiple scanners, including:
- SAST, SCA (Software Composition Analysis)
- Vulnerability
- Misconfiguration
- Secret
- License
## How It Works
- The IaC provider scans your local directory (or a specified path) for supported IaC files, or scans a remote repository.
- No cloud credentials or authentication are required for local scans.
- For remote repository scans, authentication can be provided via [git URL](https://git-scm.com/docs/git-clone#_git_urls), CLI flags or environment variables.
- Mutelist logic is handled by Trivy, not Prowler.
- Results are output in the same formats as other Prowler providers (CSV, JSON, HTML, etc.).
## Usage
@@ -67,12 +57,12 @@ You can provide authentication for private repositories using one of the followi
#### Mutually Exclusive Flags
- `--scan-path` and `--scan-repository-url` are mutually exclusive. Only one can be specified at a time.
### Specify Scanners
Run only the vulnerability and misconfiguration scanners:
```sh
prowler iac --scan-path ./my-iac-directory --scanners vuln misconfig
```
### Exclude Paths
@@ -95,4 +85,4 @@ prowler iac --scan-path ./iac --output-formats csv json html
- For remote repository scans, authentication is optional but required for private repos.
- CLI flags override environment variables for authentication.
- It is ideal for CI/CD pipelines and local development environments.
- For more details on supported scanners, see the [Trivy documentation](https://trivy.dev/latest/docs/scanner/vulnerability/).

View File

@@ -1,29 +1,361 @@
# Microsoft 365 Authentication for Prowler
Prowler for Microsoft 365 (M365) supports the following authentication methods:
- [**Service Principal Application**](https://learn.microsoft.com/en-us/entra/identity-platform/app-objects-and-service-principals?tabs=browser#service-principal-object) (**Recommended**)
- **Service Principal Application with Microsoft User Credentials**
- **Stored AZ CLI credentials**
- **Interactive browser authentication**
???+ warning
Prowler App supports the **Service Principal** authentication method and the **Service Principal with User Credentials** authentication method. The latter will be deprecated in September, once Microsoft enforces MFA in all tenants and no longer allows non-interactive user authentication.
### Service Principal Authentication (Recommended)
**Authentication flag:** `--sp-env-auth`
Enable Prowler authentication as the **Service Principal Application** by configuring the following environment variables:
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
```
If these variables are not set or exported, execution using `--sp-env-auth` will fail.
Refer to the [Create Prowler Service Principal](getting-started-m365.md#create-the-service-principal-app) guide for setup instructions.
???+ note
In order to scan all the checks from M365 required permissions to the service principal application must be added. Refer to the [External API Permissions Assignment](getting-started-m365.md#grant-powershell-modules-permissions) section for more information.
### Service Principal and User Credentials Authentication
**Authentication flag:** `--env-auth`
???+ warning
This method is no longer recommended; use the **Service Principal Application** authentication method instead.
This method builds upon the Service Principal authentication by adding User Credentials. Configure the following environment variables: `M365_USER` and `M365_PASSWORD`.
```console
export AZURE_CLIENT_ID="XXXXXXXXX"
export AZURE_CLIENT_SECRET="XXXXXXXXX"
export AZURE_TENANT_ID="XXXXXXXXX"
export M365_USER="your_email@example.com"
export M365_PASSWORD="examplepassword"
```
Both environment variables are **required** for this authentication method to execute the PowerShell modules needed to retrieve information from M365 services. Prowler uses Service Principal authentication to access Microsoft Graph and user credentials to authenticate to the Microsoft PowerShell modules.
- `M365_USER` should be your Microsoft account email using the **assigned domain in the tenant**. This means it must look like `example@YourCompany.onmicrosoft.com` or `example@YourCompany.com`, but it must be the exact domain assigned to that user in the tenant.
???+ warning
Newly created users must sign in with the account first, as Microsoft prompts for password change. Without completing this step, user authentication fails because Microsoft marks the initial password as expired.
???+ warning
The user must not be MFA capable. Microsoft does not allow MFA capable users to authenticate programmatically. See [Microsoft documentation](https://learn.microsoft.com/en-us/entra/identity-platform/scenario-desktop-acquire-token-username-password?tabs=dotnet) for more information.
???+ warning
Using a tenant domain other than the one assigned — even if it belongs to the same tenant — will cause Prowler to fail, as Microsoft authentication will not succeed.
Ensure the correct domain is used for the authenticating user.
![User Domains](img/user-domains.png)
- `M365_PASSWORD` must be the user password.
???+ note
Previously an encrypted password was required, but now the user password is accepted directly. Prowler handles the password encryption.
### Interactive Browser Authentication
**Authentication flag:** `--browser-auth`
This authentication method requires authentication against Azure using the default browser to start the scan. The `--tenant-id` flag is also required.
These credentials only enable checks that rely on Microsoft Graph. The entire provider cannot be run with this method. To perform a full M365 security scan, use the **recommended authentication method**.
Since this is a **delegated permission** authentication method, necessary permissions should be assigned to the user rather than the application.
### Required Permissions
To run the full Prowler provider, including PowerShell checks, two types of permission scopes must be set in **Microsoft Entra ID**.
#### Service Principal Authentication (`--sp-env-auth`) - Recommended
When using service principal authentication, add the following **Application Permissions**:
**Microsoft Graph API Permissions:**
- `AuditLog.Read.All`: Required for Entra service.
- `Directory.Read.All`: Required for all services.
- `Policy.Read.All`: Required for all services.
- `SharePointTenantSettings.Read.All`: Required for SharePoint service.
- `User.Read` (IMPORTANT: this must be set as **delegated**): Required for the sign-in.
**External API Permissions:**
- `Exchange.ManageAsApp` from external API `Office 365 Exchange Online`: Required for Exchange PowerShell module app authentication. You also need to assign the `Global Reader` role to the app.
- `application_access` from external API `Skype and Teams Tenant Admin API`: Required for Teams PowerShell module app authentication.
???+ note
`Directory.Read.All` can be replaced with `Domain.Read.All`, which is a more restrictive permission, but then you won't be able to run the Entra checks related to DirectoryRoles and GetUsers.
> If you do this, you will also need to add the `Organization.Read.All` permission to the service principal application in order to authenticate.
???+ note
This is the **recommended authentication method** because it allows you to run the full M365 provider, including PowerShell checks, providing complete coverage of all available security checks. Service Principal + User Credentials Authentication offers the same coverage, but it will be deprecated in September, when Microsoft enforces MFA in all tenants and no longer allows user authentication without an interactive method.
#### Service Principal + User Credentials Authentication (`--env-auth`)
When using service principal with user credentials authentication, you need **both** sets of permissions:
**1. Service Principal Application Permissions**:
- You **will need** all the Microsoft Graph API permissions listed above.
- You **won't need** the External API permissions listed above.
**2. User-Level Permissions**: These are set at the `M365_USER` level, so the user used to run Prowler must have one of the following roles:
- `Global Reader` (recommended): this allows you to read all roles needed.
- `Exchange Administrator` and `Teams Administrator`: the user needs both roles, but with these [roles](https://learn.microsoft.com/en-us/exchange/permissions-exo/permissions-exo#microsoft-365-permissions-in-exchange-online) you can access the same information as a Global Reader (since only read access is needed, Global Reader is recommended).
#### Browser Authentication (`--browser-auth`)
When using browser authentication, permissions are delegated to the user, so the user must have the appropriate permissions rather than the application.
???+ warning
With browser authentication, you will only be able to run checks that work through MS Graph API. PowerShell module checks will not be executed.
### Assigning Permissions and Roles
For guidance on assigning the necessary permissions and roles, follow these instructions:
- [Grant API Permissions](getting-started-m365.md#grant-required-graph-api-permissions)
- [Assign Required Roles](getting-started-m365.md#if-using-user-authentication)
### Supported PowerShell Versions
PowerShell is required to run certain M365 checks.
**Supported versions:**
- **PowerShell 7.4 or higher** (7.5 is recommended)
#### Why Is PowerShell 7.4+ Required?
- **PowerShell 5.1** (default on some Windows systems) does not support required cmdlets.
- Older [cross-platform PowerShell versions](https://learn.microsoft.com/en-us/powershell/scripting/install/powershell-support-lifecycle?view=powershell-7.5) are **unsupported**, leading to potential errors.
???+ note
Installing PowerShell is only necessary if you install Prowler via **pip or other sources**. **SDK and API containers include PowerShell by default.**
### Installing PowerShell
Installing PowerShell is different depending on your OS.
- [Windows](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.5#install-powershell-using-winget-recommended): you will need to update PowerShell to version 7.4 or higher to run Prowler; otherwise some checks will not show findings and the provider may not work as expected. This version of PowerShell is [supported](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-windows?view=powershell-7.4#supported-versions-of-windows) on Windows 10, Windows 11, Windows Server 2016 and later versions.
```console
winget install --id Microsoft.PowerShell --source winget
```
- [macOS](https://learn.microsoft.com/es-es/powershell/scripting/install/installing-powershell-on-macos?view=powershell-7.5#install-the-latest-stable-release-of-powershell): installing PowerShell on macOS requires [brew](https://brew.sh/); once it is installed, just run the command below. PowerShell is only supported on macOS 15 (Sequoia) x64 and Arm64, macOS 14 (Sonoma) x64 and Arm64, and macOS 13 (Ventura) x64 and Arm64.
```console
brew install powershell/tap/powershell
```
Once it's installed, run `pwsh` in your terminal to verify it's working.
- Linux: installing PowerShell on Linux depends on the distro you are using:
- [Ubuntu](https://learn.microsoft.com/es-es/powershell/scripting/install/install-ubuntu?view=powershell-7.5#installation-via-package-repository-the-package-repository): PowerShell 7.4+ is supported on Ubuntu 22.04 and Ubuntu 24.04. The recommended way to install it is via the package available on PMC (Microsoft's package repository). Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget apt-transport-https software-properties-common
# Get the version of Ubuntu
source /etc/os-release
# Download the Microsoft repository keys
wget -q https://packages.microsoft.com/config/ubuntu/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [Alpine](https://learn.microsoft.com/es-es/powershell/scripting/install/install-alpine?view=powershell-7.5#installation-steps): The only supported version for PowerShell 7.4+ on Alpine is Alpine 3.20. The only way to install it is by downloading the tar.gz package available on [PowerShell's GitHub releases](https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz). Follow these steps:
```console
# Install the requirements
sudo apk add --no-cache \
ca-certificates \
less \
ncurses-terminfo-base \
krb5-libs \
libgcc \
libintl \
libssl3 \
libstdc++ \
tzdata \
userspace-rcu \
zlib \
icu-libs \
curl
# Install additional dependencies from the edge repository
sudo apk -X https://dl-cdn.alpinelinux.org/alpine/edge/main add --no-cache \
lttng-ust \
openssh-client
# Download the powershell '.tar.gz' archive
curl -L https://github.com/PowerShell/PowerShell/releases/download/v7.5.0/powershell-7.5.0-linux-musl-x64.tar.gz -o /tmp/powershell.tar.gz
# Create the target folder where powershell will be placed
sudo mkdir -p /opt/microsoft/powershell/7
# Expand powershell to the target folder
sudo tar zxf /tmp/powershell.tar.gz -C /opt/microsoft/powershell/7
# Set execute permissions
sudo chmod +x /opt/microsoft/powershell/7/pwsh
# Create the symbolic link that points to pwsh
sudo ln -s /opt/microsoft/powershell/7/pwsh /usr/bin/pwsh
# Start PowerShell
pwsh
```
- [Debian](https://learn.microsoft.com/es-es/powershell/scripting/install/install-debian?view=powershell-7.5#installation-on-debian-11-or-12-via-the-package-repository): PowerShell 7.4+ is supported on Debian 11 and Debian 12. The recommended way to install it is via the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Update the list of packages
sudo apt-get update
# Install pre-requisite packages.
sudo apt-get install -y wget
# Get the version of Debian
source /etc/os-release
# Download the Microsoft repository GPG keys
wget -q https://packages.microsoft.com/config/debian/$VERSION_ID/packages-microsoft-prod.deb
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Delete the Microsoft repository GPG keys file
rm packages-microsoft-prod.deb
# Update the list of packages after we added packages.microsoft.com
sudo apt-get update
###################################
# Install PowerShell
sudo apt-get install -y powershell
# Start PowerShell
pwsh
```
- [RHEL](https://learn.microsoft.com/es-es/powershell/scripting/install/install-rhel?view=powershell-7.5#installation-via-the-package-repository): PowerShell 7.4+ is supported on RHEL 8 and RHEL 9. The recommended way to install it is via the package available on PMC. Follow these steps:
```console
###################################
# Prerequisites
# Get version of RHEL
source /etc/os-release
if [ ${VERSION_ID%.*} -lt 8 ]
then majorver=7
elif [ ${VERSION_ID%.*} -lt 9 ]
then majorver=8
else majorver=9
fi
# Download the Microsoft RedHat repository package
curl -sSL -O https://packages.microsoft.com/config/rhel/$majorver/packages-microsoft-prod.rpm
# Register the Microsoft RedHat repository
sudo rpm -i packages-microsoft-prod.rpm
# Delete the downloaded package after installing
rm packages-microsoft-prod.rpm
# Update package index files
sudo dnf update
# Install PowerShell
sudo dnf install powershell -y
```
- [Docker](https://learn.microsoft.com/es-es/powershell/scripting/install/powershell-in-docker?view=powershell-7.5#use-powershell-in-a-container): The following command downloads a container image that includes the latest stable version of PowerShell:
```console
docker pull mcr.microsoft.com/dotnet/sdk:9.0
```
To start an interactive `pwsh` session, run:
```console
docker run -it mcr.microsoft.com/dotnet/sdk:9.0 pwsh
```
### Required PowerShell Modules
Prowler relies on several PowerShell cmdlets to retrieve necessary data.
These cmdlets come from different modules that must be installed.
#### Automatic Installation
The required modules are automatically installed when running Prowler with the `--init-modules` flag.
Example command:
```console
python3 prowler-cli.py m365 --verbose --log-level ERROR --env-auth --init-modules
```
If the modules are already installed, running this command will not cause issues—it will simply verify that the necessary modules are available.
???+ note
Prowler installs the modules using `-Scope CurrentUser`.
If you encounter any issues with services not working after the automatic installation, try installing the modules manually using `-Scope AllUsers` (administrator permissions are required for this).
The command needed to install a module manually is:
```powershell
Install-Module -Name "ModuleName" -Scope AllUsers -Force
```
#### Module Versions
- [ExchangeOnlineManagement](https://www.powershellgallery.com/packages/ExchangeOnlineManagement/3.6.0) (Minimum version: 3.6.0): Required for checks across Exchange, Defender, and Purview.
- [MicrosoftTeams](https://www.powershellgallery.com/packages/MicrosoftTeams/6.6.0) (Minimum version: 6.6.0): Required for all Teams checks.
- [MSAL.PS](https://www.powershellgallery.com/packages/MSAL.PS/4.32.0): Required for Exchange module via application authentication.
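To confirm the modules above are installed with the expected versions, something like the following can be run from a regular shell (a sketch; it assumes `pwsh` may or may not be on the PATH):

```bash
# List the required modules visible to PowerShell.
# A module missing from the output has not been installed yet.
if command -v pwsh >/dev/null 2>&1; then
  pwsh -NoProfile -Command 'Get-Module -ListAvailable ExchangeOnlineManagement, MicrosoftTeams, MSAL.PS | Select-Object Name, Version'
else
  echo "pwsh is not installed"
fi
```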

View File

@@ -257,7 +257,8 @@ This method is not recommended because it requires a user with MFA enabled and M
- `AZURE_CLIENT_SECRET` from earlier
If you are using user authentication, also add:
- `M365_USER` the user using the correct assigned domain, more info [here](../../tutorials/microsoft365/authentication.md#service-principal-and-user-credentials-authentication)
- `M365_PASSWORD` the password of the user
![Prowler Cloud M365 Credentials](./img/m365-credentials.png)

View File

@@ -4,9 +4,9 @@ PowerShell is required by this provider because it is the only way to retrieve d
If you are using Prowler Cloud, you don't need to worry about PowerShell — it is already installed in our infrastructure.
However, if you want to run Prowler on your own, you must have PowerShell installed to execute the full M365 provider and retrieve all findings.
To learn more about how to install PowerShell and which versions are supported, click [here](../../tutorials/microsoft365/authentication.md#supported-powershell-versions).
## Required Modules
The necessary modules will not be installed automatically by Prowler. Nevertheless, if you want Prowler to install them for you, you can execute the provider with the flag `--init-modules`, which will run the script to install and import them.
If you want to learn more about this process or you are running some issues with this, click [here](../../tutorials/microsoft365/authentication.md#required-powershell-modules).

View File

@@ -116,7 +116,7 @@ Each check must reside in a dedicated subfolder, following this structure:
???+ note
The check name must start with the service name followed by an underscore (e.g., ec2\_instance\_public\_ip).
To see more information about how to write checks, refer to the [Developer Guide](../developer-guide/checks.md#creating-a-check).
???+ note
If you want to run ONLY your custom check(s), import it with -x (--checks-folder) and then run it with -c (--checks), e.g.: `prowler aws -x s3://bucket/prowler/providers/aws/services/s3/s3_bucket_policy/ -c s3_bucket_policy`

View File

@@ -0,0 +1,412 @@
# Amazon S3 Integration
**Prowler App** allows automatic export of scan results to Amazon S3 buckets, providing seamless integration with existing data workflows and storage infrastructure. This comprehensive guide demonstrates configuration and management of Amazon S3 integrations to streamline security finding management and reporting.
When enabled and configured, scan results are automatically stored in the configured bucket. Results are provided in `csv`, `html` and `json-ocsf` formats, offering flexibility for custom integrations:
<!-- TODO: remove the comment once the AWS Security Hub integration is completed -->
<!-- - json-asff -->
<!--
???+ note
The `json-asff` file will be only present in your configured Amazon S3 Bucket if you have the AWS Security Hub integration enabled. You can get more information about that integration here. -->
???+ note
Enabling this integration incurs costs in Amazon S3. Refer to [Amazon S3 pricing](https://aws.amazon.com/s3/pricing/) for more information.
The Amazon S3 Integration provides the following capabilities:
- **Automate scan result exports** to designated S3 buckets after each scan
- **Configure separate bucket destinations** for different cloud providers or use cases
- **Customize export paths** within buckets for organized storage
- **Support multiple authentication methods** including IAM roles and static credentials
- **Verify connection reliability** through built-in connection testing
- **Manage integrations independently** with separate configuration and credential controls
## Required Permissions
Before configuring the Amazon S3 Integration, ensure that AWS credentials and optionally the IAM Role used for S3 access have the necessary permissions to write scan results to the designated S3 bucket. This requirement applies when using static credentials, session credentials, or an IAM role (either self-created or generated using [Prowler's permissions templates](#available-templates)).
### IAM Policy
The S3 integration requires the following permissions. Add these to the IAM role policy, or ensure AWS credentials have these permissions:
```json title="s3:DeleteObject"
{
"Version": "2012-10-17",
"Statement": [
{
"Condition": {
"StringEquals": {
"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"
}
},
"Action": [
"s3:DeleteObject"
],
"Resource": [
"arn:aws:s3:::<BUCKET NAME>/*test-prowler-connection.txt"
],
"Effect": "Allow"
}
]
}
```
`s3:DeleteObject` permission is required for connection testing. When testing the S3 integration, Prowler creates a temporary beacon file, `test-prowler-connection.txt`, to verify write permissions, then deletes it to confirm the connection is working properly.
```json title="s3:PutObject"
{
"Version": "2012-10-17",
"Statement": [
{
"Condition": {
"StringEquals": {
"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"
}
},
"Action": [
"s3:PutObject"
],
"Resource": [
"arn:aws:s3:::<BUCKET NAME>/*"
],
"Effect": "Allow"
}
]
}
```
```json title="s3:ListBucket"
{
"Version": "2012-10-17",
"Statement": [
{
"Condition": {
"StringEquals": {
"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"
}
},
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::<BUCKET NAME>"
],
"Effect": "Allow"
}
]
}
```
???+ note
Replace `<BUCKET AWS ACCOUNT NUMBER>` with the AWS account ID that owns the destination S3 bucket, and `<BUCKET NAME>` with the actual bucket name.
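For convenience, the three policy snippets above can also be combined into a single policy document. The following is a sketch assembled directly from those statements; substitute the placeholders as described in the note above:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": ["arn:aws:s3:::<BUCKET NAME>/*"],
            "Condition": {
                "StringEquals": {"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"}
            }
        },
        {
            "Effect": "Allow",
            "Action": ["s3:DeleteObject"],
            "Resource": ["arn:aws:s3:::<BUCKET NAME>/*test-prowler-connection.txt"],
            "Condition": {
                "StringEquals": {"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"}
            }
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::<BUCKET NAME>"],
            "Condition": {
                "StringEquals": {"s3:ResourceAccount": "<BUCKET AWS ACCOUNT NUMBER>"}
            }
        }
    ]
}
```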
### Cross-Account S3 Bucket
If the S3 destination bucket is in a different AWS account than the one providing the credentials for S3 access, configure a bucket policy on the destination bucket to allow cross-account access.
The following diagrams illustrate the three common S3 integration scenarios:
##### Same Account Setup (No Bucket Policy Required)
When both the IAM credentials and destination S3 bucket are in the same AWS account, no additional bucket policy is required.
![](./img/s3/s3-same-account.png)
##### Cross-Account Setup (Bucket Policy Required)
When the S3 bucket is in a different AWS account, you must configure a bucket policy to allow cross-account access.
![](./img/s3/s3-cross-account.png)
##### Multi-Account Setup (Multiple Principals in Bucket Policy)
When multiple AWS accounts need to write to the same destination bucket, configure the bucket policy with multiple principals.
![](./img/s3/s3-multiple-accounts.png)
#### S3 Bucket Policy
Apply the following bucket policy to the destination S3 bucket:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::<SOURCE ACCOUNT ID>:role/ProwlerScan"
},
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::<BUCKET NAME>/*"
},
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::<SOURCE ACCOUNT ID>:role/ProwlerScan"
},
"Action": "s3:DeleteObject",
"Resource": "arn:aws:s3:::<BUCKET NAME>/*test-prowler-connection.txt"
},
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::<SOURCE ACCOUNT ID>:role/ProwlerScan"
},
"Action": "s3:ListBucket",
"Resource": "arn:aws:s3:::<BUCKET NAME>"
}
]
}
```
???+ note
Replace `<SOURCE ACCOUNT ID>` with the AWS account ID that contains the IAM role and `<BUCKET NAME>` with the destination bucket name. The role name `ProwlerScan` is the default name when using Prowler's permissions templates. If using a custom IAM role or different authentication method, replace `ProwlerScan` with the actual role name.
##### Multi-Account Configuration
For multiple AWS accounts, modify the `Principal` field to an array:
```json
"Principal": {
"AWS": [
"arn:aws:iam::<SOURCE ACCOUNT ID 1>:role/ProwlerScan",
"arn:aws:iam::<SOURCE ACCOUNT ID 2>:role/ProwlerScan"
]
}
```
???+ note
Replace `<SOURCE ACCOUNT ID 1>` and `<SOURCE ACCOUNT ID 2>` with the AWS account IDs that contain the IAM roles. As above, replace `ProwlerScan` with the actual role name if using a custom IAM role.
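Once the bucket policy JSON is prepared, it can be applied with the AWS CLI. A sketch (the helper name is ours; it assumes the policy is saved locally as `bucket-policy.json` and the caller has `s3:PutBucketPolicy` on the destination bucket):

```bash
# Hypothetical helper: attach the prepared bucket policy to the destination bucket.
apply_prowler_bucket_policy() {
  local bucket="$1"
  aws s3api put-bucket-policy --bucket "$bucket" --policy file://bucket-policy.json
}
# Usage: apply_prowler_bucket_policy my-security-findings-bucket
```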
### Available Templates
**Prowler App** provides Infrastructure as Code (IaC) templates to automate IAM role setup with S3 integration permissions.
???+ note
Templates are optional. Custom IAM roles or static credentials can be used instead.
Choose from the following deployment options:
- [CloudFormation](https://prowler-cloud-public.s3.eu-west-1.amazonaws.com/permissions/templates/aws/cloudformation/prowler-scan-role.yml)
- [Terraform](https://github.com/prowler-cloud/prowler/tree/master/permissions/templates/terraform)
#### CloudFormation
##### AWS CLI
When using Prowler's CloudFormation template, execute the following command to update the existing Prowler stack:
```bash
aws cloudformation update-stack \
--capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
--stack-name "Prowler" \
--template-url "https://prowler-cloud-public.s3.eu-west-1.amazonaws.com/permissions/templates/aws/cloudformation/prowler-scan-role.yml" \
--parameters \
ParameterKey=EnableS3Integration,ParameterValue="true" \
ParameterKey=ExternalId,ParameterValue="your-external-id" \
ParameterKey=S3IntegrationBucketName,ParameterValue="your-bucket-name" \
ParameterKey=S3IntegrationBucketAccountId,ParameterValue="your-bucket-aws-account-id-owner"
```
Alternatively, if you don't have the `ProwlerScan` IAM Role, execute the following command to create the CloudFormation stack:
```bash
aws cloudformation create-stack \
--capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
--stack-name "Prowler" \
--template-url "https://prowler-cloud-public.s3.eu-west-1.amazonaws.com/permissions/templates/aws/cloudformation/prowler-scan-role.yml" \
--parameters \
ParameterKey=EnableS3Integration,ParameterValue="true" \
ParameterKey=ExternalId,ParameterValue="your-external-id" \
ParameterKey=S3IntegrationBucketName,ParameterValue="your-bucket-name" \
ParameterKey=S3IntegrationBucketAccountId,ParameterValue="your-bucket-aws-account-id-owner"
```
A CloudFormation Quick Link is also available [here](https://us-east-1.console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/quickcreate?templateURL=https%3A%2F%2Fprowler-cloud-public.s3.eu-west-1.amazonaws.com%2Fpermissions%2Ftemplates%2Faws%2Fcloudformation%2Fprowler-scan-role.yml&stackName=Prowler&param_EnableS3Integration=true).
##### AWS Console
If using Prowler's CloudFormation template, follow these steps to update the existing Prowler stack from the AWS Console:
1. Navigate to CloudFormation service in the AWS region you are using
2. Select the "Prowler" stack, click "Update" and then "Make a direct update"
3. Replace template, uploading the [CloudFormation template](https://prowler-cloud-public.s3.eu-west-1.amazonaws.com/permissions/templates/aws/cloudformation/prowler-scan-role.yml)
4. Configure parameters:
- `ExternalId`: Keep existing value
- `EnableS3Integration`: Select "true"
- `S3IntegrationBucketName`: Your bucket name
- `S3IntegrationBucketAccountId`: Bucket owner's AWS account ID
5. In the "Configure stack options" screen, leave everything as it is and click "Next"
6. Finally, under "Review Prowler", at the bottom click on "Submit"
#### Terraform
1. Download the Terraform code:
```bash
# Clone or download from GitHub
git clone https://github.com/prowler-cloud/prowler.git
cd prowler/permissions/templates/terraform
```
2. Configure your variables:
```bash
cp terraform.tfvars.example terraform.tfvars
```
3. Edit `terraform.tfvars` with your specific values:
```hcl
# Required: External ID from Prowler App
external_id = "your-unique-external-id-here"
# S3 Integration Configuration
enable_s3_integration = true
s3_integration_bucket_name = "your-s3-bucket-name"
s3_integration_bucket_account_id = "123456789012" # Bucket owner's AWS Account ID
```
4. Deploy the infrastructure:
```bash
terraform init
terraform plan # Review the planned changes
terraform apply # Type 'yes' when prompted
```
5. After successful deployment, Terraform will display important values:
```
Outputs:
prowler_role_arn = "arn:aws:iam::123456789012:role/ProwlerScan"
prowler_role_name = "ProwlerScan"
s3_integration_enabled = "true"
```
6. Copy the `prowler_role_arn`, as it's required to complete the S3 integration credentials configuration.
For detailed information, refer to the [Terraform README](https://github.com/prowler-cloud/prowler/blob/master/permissions/templates/terraform/README.md).
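If the ARN is needed again later, it can be re-read from the Terraform state without rerunning `apply`. A sketch (the helper name is ours; run it from `prowler/permissions/templates/terraform` after a successful deployment):

```bash
# Hypothetical helper: read the role ARN back from Terraform outputs.
get_prowler_role_arn() {
  terraform output -raw prowler_role_arn
}
# Usage: ROLE_ARN="$(get_prowler_role_arn)"
```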
---
## Configuration
Once the required permissions are set up, proceed to configure the S3 integration in **Prowler App**.
1. Navigate to "Integrations"
![Navigate to integrations](./img/s3/s3-integration-ui-1.png)
2. Locate the Amazon S3 Integration card and click on the "Configure" button
![Access S3 integration](./img/s3/s3-integration-ui-2.png)
3. Click the "Add Integration" button
![Add integration button](./img/s3/s3-integration-ui-3.png)
4. Complete the configuration form with the following details:
- **Cloud Providers:** Select the providers whose scan results should be exported to this S3 bucket
- **Bucket Name:** Enter the name of the target S3 bucket (e.g., `my-security-findings-bucket`)
- **Output Directory:** Specify the directory path within the bucket (e.g., `/prowler-findings/`, defaults to `output`)
![Configuration form](./img/s3/s3-integration-ui-4.png)
5. Click "Next" to configure credentials
6. Configure AWS authentication using one of the supported methods:
- **AWS SDK Default:** Use default AWS credentials from the environment. For Prowler Cloud users, this is the recommended option as the service has AWS credentials to assume IAM roles with ARNs matching `arn:aws:iam::*:role/Prowler*` or `arn:aws:iam::*:role/prowler*`
- **Access Keys:** Provide AWS access key ID and secret access key
- **IAM Role (optional):** Specify the IAM Role ARN, external ID, and optional session parameters
![Credentials configuration](./img/s3/s3-integration-ui-5.png)
7. Optional - For IAM role authentication, complete the required fields:
- **Role ARN:** The Amazon Resource Name of the IAM role
- **External ID:** Unique identifier for additional security (defaults to Tenant/Organization ID) - mandatory and automatically filled
- **Role Session Name:** Optional - name for the assumed role session
- **Session Duration:** Optional - duration in seconds for the session
8. Click "Create Integration" to verify the connection and complete the setup
???+ success
Once credentials are configured and the connection test passes, the S3 integration will be active. Scan results will automatically be exported to the specified bucket after each scan completes. Run a new scan and check the S3 bucket to verify the integration is working.
???+ note
Scan outputs are processed after scan completion. Depending on scan size and network conditions, exports may take a few minutes to appear in the S3 bucket.
---
### Integration Status
Once the integration is active, monitor its status and make adjustments as needed through the integrations management interface.
1. Review configured integrations in the management interface
2. Each integration displays:
- **Connection Status:** Connected or Disconnected indicator
- **Bucket Information:** Bucket name and output directory
- **Last Checked:** Timestamp of the most recent connection test
![Integration status view](./img/s3/s3-integration-ui-6.png)
#### Actions
![Action buttons](./img/s3/s3-integration-ui-7.png)
Each S3 integration provides several management actions accessible through dedicated buttons:
| Button | Purpose | Available Actions | Notes |
|--------|---------|------------------|-------|
| **Test** | Verify integration connectivity | • Test AWS credential validity<br/>• Check S3 bucket accessibility<br/>• Verify write permissions<br/>• Validate connection setup | Results displayed in notification message |
| **Config** | Modify integration settings | • Update selected cloud providers<br/>• Change bucket name<br/>• Modify output directory path | Click "Update Configuration" to save changes |
| **Credentials** | Update authentication settings | • Modify AWS access keys<br/>• Update IAM role configuration<br/>• Change authentication method | Click "Update Credentials" to save changes |
| **Enable/Disable** | Toggle integration status | • Enable integration to start exporting results<br/>• Disable integration to pause exports | Status change takes effect immediately |
| **Delete** | Remove integration permanently | • Permanently delete integration<br/>• Remove all configuration data | ⚠️ **Cannot be undone** - confirm before deleting |
???+ tip "Management Best Practices"
- Test the integration after any configuration changes
- Use the Enable/Disable toggle for temporary changes instead of deleting
---
## Understanding S3 Export Structure
When the S3 integration is enabled and a scan completes, Prowler creates a folder inside the specified bucket path (using `output` as the default folder name) with subfolders for each output format:
- Regular: `prowler-output-{provider-uid}-{timestamp}.{extension}`
- Compliance: `prowler-output-{provider-uid}-{timestamp}_{compliance_framework}.{extension}`
```
output/
├── compliance/
│ └── prowler-output-111122223333-20250805120000_cis_5.0_aws.csv
├── csv/
│ └── prowler-output-111122223333-20250805120000.csv
├── html/
│ └── prowler-output-111122223333-20250805120000.html
└── json-ocsf/
└── prowler-output-111122223333-20250805120000.ocsf.json
```
![](./img/s3/s3-output-folder.png)
For detailed information about Prowler's reporting formats, refer to the [Prowler reporting documentation](https://docs.prowler.com/projects/prowler-open-source/en/latest/tutorials/reporting/).
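To confirm findings landed where expected, the exported objects can be listed with the AWS CLI. A sketch following the layout above (the function name is ours; the bucket name is a placeholder and `output` is the default directory):

```bash
# Hypothetical helper: list everything Prowler has exported under the prefix.
list_prowler_exports() {
  local bucket="$1" prefix="${2:-output}"
  aws s3 ls "s3://${bucket}/${prefix}/" --recursive
}
# Usage: list_prowler_exports my-security-findings-bucket
```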
## Troubleshooting
**Connection test fails:**
- Check AWS credentials are valid
- If using IAM Role, check its permissions
- Verify bucket permissions and region
- Confirm network access to S3
**No scan results in bucket:**
- Ensure integration shows "Connected"
- Check that provider is associated with integration
- Verify bucket policies allow writes
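As a starting point for the checks above, a hedged sketch using standard AWS CLI calls (the helper name is ours; the bucket name is a placeholder):

```bash
# Hypothetical helper: quick sanity checks when the connection test fails.
check_s3_integration() {
  local bucket="$1"
  aws sts get-caller-identity                       # credentials valid and not expired?
  aws s3api head-bucket --bucket "$bucket"          # bucket reachable with these credentials?
  aws s3api get-bucket-location --bucket "$bucket"  # confirm the bucket's region
}
# Usage: check_s3_integration my-security-findings-bucket
```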

View File

@@ -4,7 +4,7 @@
## Accessing Prowler App and API Documentation
After [installing](../installation/prowler-app.md) **Prowler App**, access it at [http://localhost:3000](http://localhost:3000). To view the auto-generated **Prowler API** documentation, navigate to [http://localhost:8080/api/v1/docs](http://localhost:8080/api/v1/docs). This documentation provides details on available endpoints, parameters, and responses.
???+ note
If you are a [Prowler Cloud](https://cloud.prowler.com/sign-in) user, you can access API docs at [https://api.prowler.com/api/v1/docs](https://api.prowler.com/api/v1/docs)
@@ -109,7 +109,7 @@ For AWS, enter your `AWS Account ID` and choose one of the following methods to
### **Step 4.2: Azure Credentials**:
For Azure, Prowler App uses a service principal application to authenticate. For more information about the process of creating and adding permissions to a service principal refer to this [section](../tutorials/azure/authentication.md). When you finish creating and adding the [Entra](./azure/create-prowler-service-principal.md#assigning-proper-permissions) and [Subscription](./azure/subscriptions.md) scope permissions to the service principal, enter the `Tenant ID`, `Client ID` and `Client Secret` of the service principal application.
<img src="../../img/azure-credentials.png" alt="Azure Credentials" width="700"/>
@@ -201,6 +201,51 @@ For full setup instructions and requirements, check the [Microsoft 365 provider
<img src="../../img/m365-credentials.png" alt="Prowler Cloud M365 Credentials" width="700"/>
### **Step 4.6: GitHub Credentials**
For GitHub, you must enter your Provider ID (username or organization name) and choose the authentication method you want to use:
- **Personal Access Token** (Recommended for individual users)
- **OAuth App Token** (For applications requiring user consent)
- **GitHub App** (Recommended for organizations and production use)
???+ note
For full setup instructions and requirements, check the [GitHub provider requirements](./github/getting-started-github.md).
<img src="../img/github-auth-methods.png" alt="GitHub Authentication Methods" width="700"/>
#### **Step 4.6.1: Personal Access Token**
Personal Access Tokens are the simplest GitHub authentication method and are well suited for individual users and testing scenarios.
- Select `Personal Access Token` and enter your `Personal Access Token`:
<img src="../img/github-pat-credentials.png" alt="GitHub Personal Access Token Credentials" width="700"/>
???+ note
For detailed instructions on creating a Personal Access Token and the exact permissions required, check the [GitHub Personal Access Token tutorial](./github/getting-started-github.md#1-personal-access-token-pat).
#### **Step 4.6.2: OAuth App Token**
OAuth Apps enable applications to act on behalf of users with explicit consent.
- Select `OAuth App Token` and enter your `OAuth App Token`:
<img src="../img/github-oauth-credentials.png" alt="GitHub OAuth App Credentials" width="700"/>
???+ note
To create an OAuth App, go to GitHub Settings → Developer settings → OAuth Apps → New OAuth App. You'll need to exchange an authorization code for an access token using the OAuth flow.
#### **Step 4.6.3: GitHub App**
GitHub Apps provide the recommended integration method for accessing multiple repositories or organizations.
- Select `GitHub App` and enter your `GitHub App ID` and `GitHub App Private Key`:
<img src="../img/github-app-credentials.png" alt="GitHub App Credentials" width="700"/>
???+ note
To create a GitHub App, go to GitHub Settings → Developer settings → GitHub Apps → New GitHub App. Configure the necessary permissions and generate a private key. Install the app to your account or organization and provide the App ID and private key content.
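The recommendations across the three methods above can be summarized in a small sketch (the helper name and context labels are ours, purely illustrative):

```python
def recommended_github_auth(context: str) -> str:
    # Maps a usage context to the authentication method the docs recommend:
    # PATs for individuals, OAuth Apps where user consent is required,
    # GitHub Apps for organizations and production use.
    recommendations = {
        "individual": "Personal Access Token",
        "user-consent-app": "OAuth App Token",
        "organization": "GitHub App",
        "production": "GitHub App",
    }
    return recommendations[context]
```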
## **Step 5: Test Connection**
After adding your cloud provider credentials, click the `Launch` button to verify that Prowler App can successfully connect to your provider:


@@ -46,8 +46,19 @@ repo_name: prowler-cloud/prowler
nav:
- Getting Started:
- Overview: index.md
- Requirements: getting-started/requirements.md
- Overview:
- What is Prowler?: index.md
- Products:
- Prowler App: products/prowler-app.md
- Prowler CLI: products/prowler-cli.md
- Prowler Cloud 🔗: https://cloud.prowler.com
- Prowler Hub 🔗: https://hub.prowler.com
- Installation:
- Prowler App: installation/prowler-app.md
- Prowler CLI: installation/prowler-cli.md
- Basic Usage:
- Prowler App: basic-usage/prowler-app.md
- Prowler CLI: basic-usage/prowler-cli.md
- Tutorials:
- Prowler App:
- Getting Started: tutorials/prowler-app.md
@@ -55,7 +66,9 @@ nav:
- Social Login: tutorials/prowler-app-social-login.md
- SSO with SAML: tutorials/prowler-app-sso.md
- Mute findings: tutorials/prowler-app-mute-findings.md
- Amazon S3 Integration: tutorials/prowler-app-s3-integration.md
- Lighthouse: tutorials/prowler-app-lighthouse.md
- Bulk Provider Provisioning: tutorials/bulk-provider-provisioning.md
- CLI:
- Miscellaneous: tutorials/misc.md
- Reporting: tutorials/reporting.md
@@ -110,12 +123,13 @@ nav:
- Authentication: tutorials/microsoft365/authentication.md
- Use of PowerShell: tutorials/microsoft365/use-of-powershell.md
- GitHub:
- Authentication: tutorials/github/authentication.md
- Getting Started: tutorials/github/getting-started-github.md
- Authentication: tutorials/github/authentication.md
- IaC:
- Getting Started: tutorials/iac/getting-started-iac.md
- Authentication: tutorials/iac/authentication.md
- Developer Guide:
- General Concepts:
- Concepts:
- Introduction: developer-guide/introduction.md
- Providers: developer-guide/provider.md
- Services: developer-guide/services.md
@@ -124,7 +138,7 @@ nav:
- Integrations: developer-guide/integrations.md
- Compliance: developer-guide/security-compliance-framework.md
- Lighthouse: developer-guide/lighthouse.md
- Provider Specific Details:
- Providers:
- AWS: developer-guide/aws-details.md
- Azure: developer-guide/azure-details.md
- Google Cloud: developer-guide/gcp-details.md
@@ -141,8 +155,8 @@ nav:
- Security: security.md
- Contact Us: contact.md
- Troubleshooting: troubleshooting.md
- About: about.md
- Prowler Cloud: https://prowler.com
- About 🔗: https://prowler.com/about#team
- Release Notes 🔗: https://github.com/prowler-cloud/prowler/releases
# Customization
extra:


@@ -1,12 +1,21 @@
## Deployment using Terraform
To deploy the Prowler Scan Role in order to allow scanning your AWS account from Prowler, please run the following commands in your terminal:
This Terraform configuration creates the necessary IAM role and policies to allow Prowler to scan your AWS account, with optional S3 integration for storing scan reports.
1. `terraform init`
2. `terraform plan`
3. `terraform apply`
### Quick Start
During the `terraform plan` and `terraform apply` steps you will be asked for an External ID to be configured in the `ProwlerScan` IAM role.
1. **Configure variables:**
```bash
cp terraform.tfvars.example terraform.tfvars
# Edit terraform.tfvars with your values
```
2. **Deploy:**
```bash
terraform init
terraform plan
terraform apply
```
### Variables
@@ -15,7 +24,7 @@ During the `terraform plan` and `terraform apply` steps you will be asked for an
- `iam_principal` (optional): IAM principal pattern allowed to assume the role (defaults to Prowler Cloud: "role/prowler*")
- `enable_s3_integration` (optional): Enable S3 integration for storing scan reports (default: false)
- `s3_integration_bucket_name` (conditional): S3 bucket name for reports (required if `enable_s3_integration` is true)
- `s3_integration_bucket_account` (conditional): S3 bucket owner account ID (required if `enable_s3_integration` is true)
- `s3_integration_bucket_account_id` (conditional): S3 bucket owner account ID (required if `enable_s3_integration` is true)
### Usage Examples
@@ -30,18 +39,26 @@ terraform apply \
-var="external_id=your-external-id-here" \
-var="enable_s3_integration=true" \
-var="s3_integration_bucket_name=your-s3-bucket-name" \
-var="s3_integration_bucket_account=123456789012"
-var="s3_integration_bucket_account_id=123456789012"
```
#### Using terraform.tfvars file:
Create a `terraform.tfvars` file:
```hcl
external_id = "your-external-id-here"
enable_s3_integration = true
s3_integration_bucket_name = "your-s3-bucket-name"
s3_integration_bucket_account = "123456789012"
#### Using terraform.tfvars file (Recommended):
```bash
cp terraform.tfvars.example terraform.tfvars
# Edit the file with your values
terraform apply
```
Then run: `terraform apply`
#### Command line variables (Alternative):
```bash
terraform apply -var="external_id=your-external-id-here"
```
> Note that Terraform will use the AWS credentials of your default profile.
### Outputs
After successful deployment, you'll get:
- `prowler_role_arn`: The ARN of the created IAM role (use this in Prowler App)
- `prowler_role_name`: The name of the IAM role
- `s3_integration_enabled`: Whether S3 integration is enabled
> **Note:** Terraform will use the AWS credentials of your default profile or AWS_PROFILE environment variable.


@@ -3,15 +3,15 @@
locals {
s3_integration_validation = (
!var.enable_s3_integration ||
(var.enable_s3_integration && var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account != "")
(var.enable_s3_integration && var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account_id != "")
)
}
# Validation check using check block (Terraform 1.5+)
check "s3_integration_requirements" {
assert {
condition = !var.enable_s3_integration || (var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account != "")
error_message = "When enable_s3_integration is true, both s3_integration_bucket_name and s3_integration_bucket_account must be provided and non-empty."
condition = !var.enable_s3_integration || (var.s3_integration_bucket_name != "" && var.s3_integration_bucket_account_id != "")
error_message = "When enable_s3_integration is true, both s3_integration_bucket_name and s3_integration_bucket_account_id must be provided and non-empty."
}
}
@@ -75,7 +75,7 @@ module "s3_integration" {
source = "./s3-integration"
s3_integration_bucket_name = var.s3_integration_bucket_name
s3_integration_bucket_account = var.s3_integration_bucket_account
s3_integration_bucket_account_id = var.s3_integration_bucket_account_id
prowler_role_name = aws_iam_role.prowler_scan.name
}
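The cross-variable rule enforced by the `check` block above can be mirrored in a small Python sketch (the function name is ours, not part of the module): when the integration is disabled the bucket variables are ignored; when enabled, both must be non-empty.

```python
def s3_integration_valid(enable: bool, bucket_name: str, bucket_account_id: str) -> bool:
    # Mirrors the Terraform assertion:
    # !enable || (bucket_name != "" && bucket_account_id != "")
    return (not enable) or (bucket_name != "" and bucket_account_id != "")
```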


@@ -17,7 +17,7 @@ resource "aws_iam_role_policy" "prowler_s3_integration" {
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
"s3:ResourceAccount" = var.s3_integration_bucket_account_id
}
}
},
@@ -31,7 +31,7 @@ resource "aws_iam_role_policy" "prowler_s3_integration" {
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
"s3:ResourceAccount" = var.s3_integration_bucket_account_id
}
}
},
@@ -45,7 +45,7 @@ resource "aws_iam_role_policy" "prowler_s3_integration" {
]
Condition = {
StringEquals = {
"s3:ResourceAccount" = var.s3_integration_bucket_account
"s3:ResourceAccount" = var.s3_integration_bucket_account_id
}
}
}
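Each statement in the policy above pins `s3:ResourceAccount` so the role can only reach buckets owned by the expected account (a guard against confused-deputy style bucket substitution). A minimal sketch of the statement shape (the helper and ARN below are illustrative, not part of the module):

```python
def s3_statement(actions: list[str], resource_arn: str, bucket_account_id: str) -> dict:
    # Builds one Allow statement whose Condition restricts access to
    # resources owned by bucket_account_id, mirroring the policy above.
    return {
        "Effect": "Allow",
        "Action": actions,
        "Resource": resource_arn,
        "Condition": {"StringEquals": {"s3:ResourceAccount": bucket_account_id}},
    }

stmt = s3_statement(["s3:PutObject"], "arn:aws:s3:::my-prowler-reports-bucket/*", "123456789012")
```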


@@ -8,13 +8,13 @@ variable "s3_integration_bucket_name" {
}
}
variable "s3_integration_bucket_account" {
variable "s3_integration_bucket_account_id" {
type = string
description = "The AWS Account ID owner of the S3 Bucket."
validation {
condition = length(var.s3_integration_bucket_account) == 12 && can(tonumber(var.s3_integration_bucket_account))
error_message = "s3_integration_bucket_account must be a valid 12-digit AWS Account ID."
condition = length(var.s3_integration_bucket_account_id) == 12 && can(tonumber(var.s3_integration_bucket_account_id))
error_message = "s3_integration_bucket_account_id must be a valid 12-digit AWS Account ID."
}
}


@@ -0,0 +1,30 @@
# =============================================================================
# Prowler Terraform Configuration
# =============================================================================
# REQUIRED: External ID from your Prowler App
external_id = "your-unique-external-id-here"
# =============================================================================
# Optional Variables (uncomment and modify as needed)
# =============================================================================
# Prowler Cloud Service Account (leave default unless using self-hosted)
# account_id = "232136659152"
# IAM Principal Pattern (leave default unless using self-hosted)
# iam_principal = "role/prowler*"
# =============================================================================
# S3 Integration Configuration
# =============================================================================
# Uncomment the following lines to enable S3 integration for storing scan reports
# Enable S3 integration
# enable_s3_integration = true
# S3 bucket name where reports will be stored
# s3_integration_bucket_name = "my-prowler-reports-bucket"
# AWS Account ID that owns the S3 bucket (usually your account)
# s3_integration_bucket_account_id = "123456789012"


@@ -1,13 +1,31 @@
# Required variable
# =============================================================================
# Prowler Terraform Configuration
# =============================================================================
# REQUIRED: External ID from your Prowler App setup
# This must match exactly what you configured in Prowler App
external_id = "your-unique-external-id-here"
# Optional Variables
# Prowler Cloud Account
# =============================================================================
# Optional Variables (uncomment and modify as needed)
# =============================================================================
# Prowler Cloud Service Account (leave default unless using self-hosted)
# account_id = "232136659152"
# Prowler Cloud Role
# IAM Principal Pattern (leave default unless using self-hosted)
# iam_principal = "role/prowler*"
# S3 Integration (optional)
# =============================================================================
# S3 Integration Configuration
# =============================================================================
# Uncomment the following lines to enable S3 integration for storing scan reports
# Enable S3 integration
# enable_s3_integration = true
# s3_bucket_name = "your-prowler-reports-bucket"
# s3_bucket_account = "123456789012"
# S3 bucket name where reports will be stored
# s3_integration_bucket_name = "my-prowler-reports-bucket"
# AWS Account ID that owns the S3 bucket (usually your account)
# s3_integration_bucket_account_id = "123456789012"


@@ -44,13 +44,13 @@ variable "s3_integration_bucket_name" {
}
}
variable "s3_integration_bucket_account" {
variable "s3_integration_bucket_account_id" {
type = string
description = "The AWS Account ID owner of the S3 Bucket. Required if enable_s3_integration is true."
default = ""
validation {
condition = var.s3_integration_bucket_account == "" || (length(var.s3_integration_bucket_account) == 12 && can(tonumber(var.s3_integration_bucket_account)))
error_message = "s3_integration_bucket_account must be a valid 12-digit AWS Account ID or empty."
condition = var.s3_integration_bucket_account_id == "" || (length(var.s3_integration_bucket_account_id) == 12 && can(tonumber(var.s3_integration_bucket_account_id)))
error_message = "s3_integration_bucket_account_id must be a valid 12-digit AWS Account ID or empty."
}
}
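The validation above amounts to: empty string (integration disabled), or a 12-digit numeric AWS account ID. A slightly stricter Python mirror for illustration (`isdigit` rejects signs and decimals that Terraform's `can(tonumber(...))` would tolerate):

```python
def valid_bucket_account_id(value: str) -> bool:
    # Empty is allowed when S3 integration is disabled; otherwise the
    # value must be exactly 12 decimal digits, e.g. "123456789012".
    return value == "" or (len(value) == 12 and value.isdigit())
```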

poetry.lock (generated)

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -394,6 +394,24 @@ cryptography = ">=2.1.4"
isodate = ">=0.6.1"
typing-extensions = ">=4.0.1"
[[package]]
name = "azure-mgmt-apimanagement"
version = "5.0.0"
description = "Microsoft Azure API Management Client Library for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "azure_mgmt_apimanagement-5.0.0-py3-none-any.whl", hash = "sha256:b88c42a392333b60722fb86f15d092dfc19a8d67510dccd15c217381dff4e6ec"},
{file = "azure_mgmt_apimanagement-5.0.0.tar.gz", hash = "sha256:0ab7fe17e70fe3154cd840ff47d19d7a4610217003eaa7c21acf3511a6e57999"},
]
[package.dependencies]
azure-common = ">=1.1"
azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-applicationinsights"
version = "4.1.0"
@@ -551,6 +569,23 @@ azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-mgmt-loganalytics"
version = "12.0.0"
description = "Microsoft Azure Log Analytics Management Client Library for Python"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "azure-mgmt-loganalytics-12.0.0.zip", hash = "sha256:da128a7e0291be7fa2063848df92a9180cf5c16d42adc09d2bc2efd711536bfb"},
{file = "azure_mgmt_loganalytics-12.0.0-py2.py3-none-any.whl", hash = "sha256:75ac1d47dd81179905c40765be8834643d8994acff31056ddc1863017f3faa02"},
]
[package.dependencies]
azure-common = ">=1.1,<2.0"
azure-mgmt-core = ">=1.2.0,<2.0.0"
msrest = ">=0.6.21"
[[package]]
name = "azure-mgmt-monitor"
version = "6.0.2"
@@ -761,6 +796,23 @@ azure-mgmt-core = ">=1.3.2"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-monitor-query"
version = "2.0.0"
description = "Microsoft Corporation Azure Monitor Query Client Library for Python"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "azure_monitor_query-2.0.0-py3-none-any.whl", hash = "sha256:8f52d581271d785e12f49cd5aaa144b8910fb843db2373855a7ef94c7fc462ea"},
{file = "azure_monitor_query-2.0.0.tar.gz", hash = "sha256:7b05f2fcac4fb67fc9f77a7d4c5d98a0f3099fb73b57c69ec1b080773994671b"},
]
[package.dependencies]
azure-core = ">=1.30.0"
isodate = ">=0.6.1"
typing-extensions = ">=4.6.0"
[[package]]
name = "azure-storage-blob"
version = "12.24.1"
@@ -2060,14 +2112,14 @@ files = [
[[package]]
name = "h2"
version = "4.2.0"
version = "4.3.0"
description = "Pure-Python HTTP/2 protocol implementation"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "h2-4.2.0-py3-none-any.whl", hash = "sha256:479a53ad425bb29af087f3458a61d30780bc818e4ebcf01f0b536ba916462ed0"},
{file = "h2-4.2.0.tar.gz", hash = "sha256:c8a52129695e88b1a0578d8d2cc6842bbd79128ac685463b887ee278126ad01f"},
{file = "h2-4.3.0-py3-none-any.whl", hash = "sha256:c438f029a25f7945c69e0ccf0fb951dc3f73a5f6412981daee861431b70e2bdd"},
{file = "h2-4.3.0.tar.gz", hash = "sha256:6c59efe4323fa18b47a632221a1888bd7fde6249819beda254aeca909f221bf1"},
]
[package.dependencies]
@@ -5839,4 +5891,4 @@ type = ["pytest-mypy"]
[metadata]
lock-version = "2.1"
python-versions = ">3.9.1,<3.13"
content-hash = "a0635a7bb99427a5169b126b429d603079fc24c39ce6759a648fdffe74e50d6c"
content-hash = "aea38b0311bfabac00d4bf9ee5d2fa0a7f3e32dd2ee5c5d27eb54c69a80b35e9"


@@ -2,29 +2,34 @@
All notable changes to the **Prowler SDK** are documented in this file.
## [v5.11.0] (Prowler UNRELEASED)
## [v5.11.0] (Prowler v5.11.0)
### Added
- Certificate authentication for M365 provider [(#8404)](https://github.com/prowler-cloud/prowler/pull/8404)
- `vm_sufficient_daily_backup_retention_period` check for Azure provider [(#8200)](https://github.com/prowler-cloud/prowler/pull/8200)
- `vm_jit_access_enabled` check for Azure provider [(#8202)](https://github.com/prowler-cloud/prowler/pull/8202)
- Bedrock AgentCore privilege escalation combination for AWS provider [(#8526)](https://github.com/prowler-cloud/prowler/pull/8526)
- Add User Email and APP name/installations information in GitHub provider [(#8501)](https://github.com/prowler-cloud/prowler/pull/8501)
- Remove standalone iam:PassRole from privesc detection and add missing patterns [(#8530)](https://github.com/prowler-cloud/prowler/pull/8530)
- Support session/profile/role/static credentials in Security Hub integration [(#8539)](https://github.com/prowler-cloud/prowler/pull/8539)
- `eks_cluster_deletion_protection_enabled` check for AWS provider [(#8536)](https://github.com/prowler-cloud/prowler/pull/8536)
- ECS privilege escalation patterns (StartTask and RunTask) for AWS provider [(#8541)](https://github.com/prowler-cloud/prowler/pull/8541)
- Resource Explorer enumeration v2 API actions in `cloudtrail_threat_detection_enumeration` check [(#8557)](https://github.com/prowler-cloud/prowler/pull/8557)
- `apim_threat_detection_llm_jacking` check for Azure provider [(#8571)](https://github.com/prowler-cloud/prowler/pull/8571)
- GCP `--skip-api-check` command line flag [(#8575)](https://github.com/prowler-cloud/prowler/pull/8575)
### Changed
- Refine kisa isms-p compliance mapping [(#8479)](https://github.com/prowler-cloud/prowler/pull/8479)
- Improve AWS Security Hub region check using multiple threads [(#8365)](https://github.com/prowler-cloud/prowler/pull/8365)
### Fixed
---
## [v5.10.3] (Prowler UNRELEASED)
### Fixed
- Resource metadata error in `s3_bucket_shadow_resource_vulnerability` check [(#8572)](https://github.com/prowler-cloud/prowler/pull/8572)
- GitHub App authentication through API fails with auth_method validation error [(#8587)](https://github.com/prowler-cloud/prowler/pull/8587)
- AWS resource-arn filtering [(#8533)](https://github.com/prowler-cloud/prowler/pull/8533)
- GitHub App authentication for GitHub provider [(#8529)](https://github.com/prowler-cloud/prowler/pull/8529)
- List all accessible organizations in GitHub provider [(#8535)](https://github.com/prowler-cloud/prowler/pull/8535)
- Only evaluate enabled accounts in `entra_users_mfa_capable` check [(#8544)](https://github.com/prowler-cloud/prowler/pull/8544)
- GitHub Personal Access Token authentication fails without `user:email` scope [(#8580)](https://github.com/prowler-cloud/prowler/pull/8580)
---
@@ -61,6 +66,7 @@ All notable changes to the **Prowler SDK** are documented in this file.
- GitHub repository and organization scoping support with `--repository/--repositories` and `--organization/--organizations` flags [(#8329)](https://github.com/prowler-cloud/prowler/pull/8329)
- GCP provider retry configuration [(#8412)](https://github.com/prowler-cloud/prowler/pull/8412)
- `s3_bucket_shadow_resource_vulnerability` check for AWS provider [(#8398)](https://github.com/prowler-cloud/prowler/pull/8398)
- Use `trivy` as engine for IaC provider [(#8466)](https://github.com/prowler-cloud/prowler/pull/8466)
### Changed
- Handle some AWS errors as warnings instead of errors [(#8347)](https://github.com/prowler-cloud/prowler/pull/8347)

File diff suppressed because it is too large.

Some files were not shown because too many files have changed in this diff.