Compare commits


64 Commits

Author SHA1 Message Date
github-actions
fd912b2c18 chore(release): 3.16.0 2024-04-04 08:52:09 +00:00
Rubén De la Torre Vico
dd843cfb98 docs(azure): Add new permissions necessary from Microsoft Entra ID (#3648) 2024-04-03 17:49:22 +02:00
Pedro Martín
5dd3c30d04 fix(azure): add DefaultValue to Azure CIS compliance (#3652) 2024-04-03 17:46:11 +02:00
Rubén De la Torre Vico
8b085c4c68 chore(azure): Fix AKS and App tests to new format (#3651) 2024-04-03 14:17:23 +02:00
Rubén De la Torre Vico
455343b5c1 chore(entra): Moving constants from checks and services to config file (#3645) 2024-04-03 14:15:12 +02:00
Nacho Rivera
e11a2d6790 chore(regions_update): Changes in regions for AWS services. (#3647)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-04-03 14:14:34 +02:00
Pedro Martín
3577277cc1 fix(cis_2.0_azure): add remaining requirement with id 1.25 (#3646) 2024-04-03 14:13:51 +02:00
Sergio Garcia
88f8c1ab6d fix(azure): normalize tenant domain set in checks (#3641) 2024-04-02 16:59:47 +02:00
Pedro Martín
5c298086de feat(compliance): Add new CIS 2.0 / 2.1 compliance framework for Azure (#3626)
Co-authored-by: Sergio <sergio@prowler.com>
2024-04-02 16:38:52 +02:00
Hugo966
be19ec53bd feat(azure): Check related with roles and vm access with MFA (#3638)
Co-authored-by: Hugo Gálvez Ureña <hugogalvezu96@gmail.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-04-02 13:00:42 +02:00
Pepe Fagoaga
5839d8c49b docs: Update number of Azure checks (#3639) 2024-04-02 11:57:01 +02:00
Pepe Fagoaga
cd54919ca6 chore(action): Prepare containers release for v4 (#3597) 2024-04-02 11:38:35 +02:00
Nacho Rivera
229409de8c chore(regions_update): Changes in regions for AWS services. (#3637)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-04-02 11:22:25 +02:00
dependabot[bot]
abf0447171 build(deps-dev): bump moto from 5.0.3 to 5.0.4 (#3629)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Sergio <sergio@prowler.com>
2024-04-02 10:21:49 +02:00
dependabot[bot]
b9c5634b19 build(deps-dev): bump safety from 3.0.1 to 3.1.0 (#3632)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-02 08:16:17 +02:00
dependabot[bot]
fdef043e21 build(deps-dev): bump mkdocs-material from 9.5.15 to 9.5.17
Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.5.15 to 9.5.17.
- [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
- [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
- [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.5.15...9.5.17)

---
updated-dependencies:
- dependency-name: mkdocs-material
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-02 05:50:05 +00:00
dependabot[bot]
559c585c22 build(deps): bump google-api-python-client from 2.123.0 to 2.124.0 (#3630)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-02 07:32:50 +02:00
dependabot[bot]
4b5a3dc2dd build(deps): bump trufflesecurity/trufflehog from 3.71.0 to 3.71.2 (#3628)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-02 07:31:26 +02:00
dependabot[bot]
ab47c2e519 build(deps): bump tj-actions/changed-files from 43 to 44 (#3627)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-02 07:31:10 +02:00
Rubén De la Torre Vico
6c7c36b856 feat(entra): Manage 403 error for getting user authentication methods (#3624) 2024-04-01 11:27:55 +02:00
Pepe Fagoaga
d91ad9e25b chore(apigateway): Handle NotFoundException (#3623) 2024-04-01 11:09:12 +02:00
Nacho Rivera
289687e393 chore(regions_update): Changes in regions for AWS services. (#3621)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-04-01 07:51:42 +02:00
Rubén De la Torre Vico
ac60b6f08d feat(azure) New Microsoft Entra ID checks (#3610) 2024-03-27 14:17:15 +01:00
Kay Agahd
dabb2acfcc fix(aws): break loop after FAIL in SQS and SNS checks (#3618) 2024-03-27 13:03:04 +01:00
Hugo966
98cb4fa2cb fix(azure): fixed check vm_ensure_using_managed_disks metadata (#3617) 2024-03-27 12:35:13 +01:00
Hugo966
20abfc87e8 feat(azure): New check related with trusted launch in vm (#3616) 2024-03-27 12:32:42 +01:00
Nacho Rivera
e2bb4d885c chore(regions_update): Changes in regions for AWS services. (#3615)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-27 11:27:04 +01:00
Sergio Garcia
3015381f43 fix(sts): handle China STS regions (#3613) 2024-03-27 11:18:44 +01:00
Sergio Garcia
5b46bf4b67 chore(version): update Prowler version (#3614) 2024-03-26 14:41:00 +01:00
dependabot[bot]
0e8ffb09bb build(deps): bump google-api-python-client from 2.122.0 to 2.123.0 (#3608)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-26 13:51:58 +01:00
dependabot[bot]
acbd2a85a1 build(deps-dev): bump pytest-cov from 4.1.0 to 5.0.0 (#3607)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-26 13:28:50 +01:00
dependabot[bot]
8778bad2c9 build(deps-dev): bump mkdocs-material from 9.5.14 to 9.5.15 (#3606)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-26 13:00:20 +01:00
Nacho Rivera
865a64a47e chore(regions_update): Changes in regions for AWS services. (#3609)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-26 12:59:52 +01:00
dependabot[bot]
7bbb164a33 build(deps): bump crazy-max/ghaction-import-gpg from 4 to 6 (#3604)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-26 12:59:35 +01:00
dependabot[bot]
bc0c9780be build(deps): bump trufflesecurity/trufflehog from 3.70.2 to 3.71.0 (#3603)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-26 12:58:57 +01:00
Gabriel Soltz
4932cccdce fix(metadata): change ResourceType Type for AWS Inline Policy Check (#3599) 2024-03-25 09:41:21 +01:00
Nacho Rivera
1a3f8c0277 chore(regions_update): Changes in regions for AWS services. (#3598)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-25 09:04:23 +01:00
Pepe Fagoaga
41e0a218de fix(securityhub): Remove region from exception match (#3593) 2024-03-22 10:33:55 +01:00
Sergio Garcia
417da2545d fix(apigatewayv2): handle empty names (#3592) 2024-03-22 10:27:31 +01:00
Pepe Fagoaga
c12080b177 chore(release): update Prowler Version to 3.15.2. (#3591) 2024-03-22 10:04:12 +01:00
Pepe Fagoaga
b8869e617f fix(json-asff): Remediation.Recommendation.Text < 512 chars (#3589) 2024-03-22 10:03:40 +01:00
Sergio Garcia
e37edee276 chore(gcp): remove unnecessary default project id (#3586) 2024-03-21 17:20:26 +01:00
Rubén De la Torre Vico
2d58d1bdc7 feat(entra): New 11 checks related with Microsoft Entra ID (#3585) 2024-03-21 17:17:45 +01:00
Pedro Martín
170d555ab4 fix(compliance): fix csv output for framework Mitre Attack v3 (#3584) 2024-03-21 13:09:58 +01:00
Pepe Fagoaga
35d024822d chore(actions): Set branch based on version (#3580) 2024-03-21 11:01:21 +01:00
Nacho Rivera
1c96cb5120 chore(regions_update): Changes in regions for AWS services. (#3581)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-21 10:23:41 +01:00
Sergio Garcia
64c7de93b6 fix(cloudtrail): use dictionary instead of list (#3579) 2024-03-20 19:05:34 +01:00
Pepe Fagoaga
9109bf9213 chore(release): update Prowler Version to 3.15.1 (#3578) 2024-03-20 16:06:27 +01:00
Pepe Fagoaga
c8b7fc7857 fix(actions): Remove indent (#3577) 2024-03-20 16:06:05 +01:00
Pepe Fagoaga
cd11bd6cc2 fix(action): Release on whatever branch (#3576) 2024-03-20 14:50:14 +01:00
Hugo966
e224215fa3 feat(azure):App check related with http logs (#3568)
Co-authored-by: Sergio Garcia <sergargar1@gmail.com>
2024-03-20 11:11:52 +01:00
Nacho Rivera
3bab7552b2 chore(regions_update): Changes in regions for AWS services. (#3571)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-20 11:05:03 +01:00
Nacho Rivera
7920dccbe2 chore(regions_update): Changes in regions for AWS services. (#3566)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-19 10:23:39 +01:00
dependabot[bot]
6498b6917d build(deps-dev): bump mkdocs-material from 9.5.12 to 9.5.14
Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.5.12 to 9.5.14.
- [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
- [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
- [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.5.12...9.5.14)

---
updated-dependencies:
- dependency-name: mkdocs-material
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-19 07:44:13 +00:00
dependabot[bot]
691023ec6c build(deps): bump azure-mgmt-compute from 30.5.0 to 30.6.0 (#3559)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-19 08:26:00 +01:00
dependabot[bot]
fec1c41320 build(deps-dev): bump black from 24.2.0 to 24.3.0
Bumps [black](https://github.com/psf/black) from 24.2.0 to 24.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.2.0...24.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-19 07:07:10 +00:00
dependabot[bot]
3c1fe72708 build(deps): bump trufflesecurity/trufflehog from 3.69.0 to 3.70.2 (#3561)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-19 07:49:30 +01:00
dependabot[bot]
e4f3329b90 build(deps): bump tj-actions/changed-files from 42 to 43 (#3560)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-19 07:49:17 +01:00
dependabot[bot]
ae835b85dc build(deps-dev): bump coverage from 7.4.3 to 7.4.4 (#3558)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-19 07:47:38 +01:00
Sergio Garcia
1c728af2b9 fix(iam): handle KeyError in service_last_accessed (#3555) 2024-03-18 10:10:49 +01:00
Sergio Garcia
62d37caa1c chore(compliance): rename AWS FTR compliance (#3550) 2024-03-18 09:38:08 +01:00
Nacho Rivera
bfda613a82 chore(regions_update): Changes in regions for AWS services. (#3552)
Co-authored-by: sergargar <38561120+sergargar@users.noreply.github.com>
2024-03-18 08:36:13 +01:00
Hugo966
b240c46973 feat(azure): New check related with logging in Azure Key Vault (#3496)
Co-authored-by: Hugo Gálvez Ureña <hugogalvezu96@gmail.com>
Co-authored-by: Sergio Garcia <38561120+sergargar@users.noreply.github.com>
2024-03-15 15:51:58 +01:00
Hugo966
d7fbcce48c feat(azure): New check related with diagnostics settings in subscriptions (#3539)
Co-authored-by: Hugo Gálvez Ureña <hugogalvezu96@gmail.com>
2024-03-15 14:09:32 +01:00
163 changed files with 12542 additions and 914 deletions


@@ -3,6 +3,8 @@ name: build-lint-push-containers
on:
push:
branches:
# TODO: update it for v3 and v4
# - "v3"
- "master"
paths-ignore:
- ".github/**"
@@ -13,15 +15,27 @@ on:
types: [published]
env:
# AWS Configuration
AWS_REGION_STG: eu-west-1
AWS_REGION_PLATFORM: eu-west-1
AWS_REGION: us-east-1
# Container's configuration
IMAGE_NAME: prowler
DOCKERFILE_PATH: ./Dockerfile
# Tags
LATEST_TAG: latest
STABLE_TAG: stable
TEMPORARY_TAG: temporary
DOCKERFILE_PATH: ./Dockerfile
PYTHON_VERSION: 3.9
# The RELEASE_TAG is set during runtime in releases
RELEASE_TAG: ""
# The PROWLER_VERSION and PROWLER_VERSION_MAJOR are set during runtime in releases
PROWLER_VERSION: ""
PROWLER_VERSION_MAJOR: ""
# TEMPORARY_TAG: temporary
# Python configuration
PYTHON_VERSION: 3.11
jobs:
# Build Prowler OSS container
@@ -30,26 +44,57 @@ jobs:
runs-on: ubuntu-latest
env:
POETRY_VIRTUALENVS_CREATE: "false"
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup python (release)
if: github.event_name == 'release'
- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: Install dependencies (release)
if: github.event_name == 'release'
- name: Install Poetry
run: |
pipx install poetry
pipx inject poetry poetry-bumpversion
- name: Get Prowler version
run: |
PROWLER_VERSION="$(poetry version -s 2>/dev/null)"
# Store prowler version major just for the release
PROWLER_VERSION_MAJOR="${PROWLER_VERSION%%.*}"
echo "PROWLER_VERSION_MAJOR=${PROWLER_VERSION_MAJOR}" >> "${GITHUB_ENV}"
case ${PROWLER_VERSION_MAJOR} in
3)
# TODO: update it for v3 and v4
# echo "LATEST=v3-latest" >> "${GITHUB_ENV}"
# echo "STABLE_TAG=v3-stable" >> "${GITHUB_ENV}"
echo "LATEST=latest" >> "${GITHUB_ENV}"
echo "STABLE_TAG=stable" >> "${GITHUB_ENV}"
;;
# TODO: uncomment for v3 and v4
# 4)
# echo "LATEST=latest" >> "${GITHUB_ENV}"
# echo "STABLE_TAG=stable" >> "${GITHUB_ENV}"
# ;;
*)
# Fallback if any other version is present
echo "Releasing another Prowler major version, aborting..."
exit 1
;;
esac
- name: Update Prowler version (release)
if: github.event_name == 'release'
run: |
poetry version ${{ github.event.release.tag_name }}
PROWLER_VERSION="${{ github.event.release.tag_name }}"
poetry version "${PROWLER_VERSION}"
echo "PROWLER_VERSION=${PROWLER_VERSION}" >> "${GITHUB_ENV}"
- name: Login to DockerHub
uses: docker/login-action@v3
@@ -90,9 +135,9 @@ jobs:
context: .
push: true
tags: |
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.PROWLER_VERSION }}
${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
file: ${{ env.DOCKERFILE_PATH }}
cache-from: type=gha
@@ -102,16 +147,26 @@ jobs:
needs: container-build-push
runs-on: ubuntu-latest
steps:
- name: Get latest commit info
- name: Get latest commit info (latest)
if: github.event_name == 'push'
run: |
LATEST_COMMIT_HASH=$(echo ${{ github.event.after }} | cut -b -7)
echo "LATEST_COMMIT_HASH=${LATEST_COMMIT_HASH}" >> $GITHUB_ENV
- name: Dispatch event for latest
if: github.event_name == 'push'
- name: Dispatch event (latest)
if: github.event_name == 'push' && ${{ env.PROWLER_VERSION_MAJOR }} == '3'
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
- name: Dispatch event for release
if: github.event_name == 'release'
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
- name: Dispatch event (release)
if: github.event_name == 'release' && ${{ env.PROWLER_VERSION_MAJOR }} == '3'
run: |
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ github.event.release.tag_name }}"}}'
curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
--data '{"event_type":"dispatch","client_payload":{"version":"release", "tag":"${{ env.PROWLER_VERSION }}"}}'
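The version gating added in this workflow (the `case` on `PROWLER_VERSION_MAJOR`) can be sketched as a standalone function. This is a minimal sketch, not the workflow itself: `select_tags` is a hypothetical name, and only the v3 branch is active, mirroring the commented-out v4 TODO above.

```shell
# Sketch of the tag-selection logic: derive the major version from a semver
# string and pick container tags; any major other than 3 is rejected, as in
# the workflow's fallback branch.
select_tags() {
  local version="$1"
  local major="${version%%.*}"   # "3.16.0" -> "3"
  case "${major}" in
    3) printf 'LATEST=latest STABLE_TAG=stable\n' ;;
    *) printf 'unsupported major version: %s\n' "${major}" >&2; return 1 ;;
  esac
}

select_tags "3.16.0"  # prints: LATEST=latest STABLE_TAG=stable
```

In the real workflow the chosen values are appended to `${GITHUB_ENV}` instead of printed, so later steps see them as environment variables.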


@@ -11,7 +11,7 @@ jobs:
with:
fetch-depth: 0
- name: TruffleHog OSS
uses: trufflesecurity/trufflehog@v3.70.2
uses: trufflesecurity/trufflehog@v3.71.2
with:
path: ./
base: ${{ github.event.repository.default_branch }}


@@ -20,7 +20,7 @@ jobs:
- uses: actions/checkout@v4
- name: Test if changes are in not ignored paths
id: are-non-ignored-files-changed
uses: tj-actions/changed-files@v43
uses: tj-actions/changed-files@v44
with:
files: ./**
files_ignore: |


@@ -8,10 +8,7 @@ env:
RELEASE_TAG: ${{ github.event.release.tag_name }}
PYTHON_VERSION: 3.11
CACHE: "poetry"
# This base branch is used to create a PR with the updated version
# We'd need to handle the base branch for v4 and v3, since they will be
# `master` and `3.0-dev`, respectively
GITHUB_BASE_BRANCH: "master"
# TODO: create a bot user for this kind of tasks, like prowler-bot
GIT_COMMITTER_EMAIL: "sergio@prowler.com"
jobs:
@@ -21,6 +18,23 @@ jobs:
POETRY_VIRTUALENVS_CREATE: "false"
name: Release Prowler to PyPI
steps:
- name: Get Prowler version
run: |
PROWLER_VERSION="${{ env.RELEASE_TAG }}"
case ${PROWLER_VERSION%%.*} in
3)
echo "Releasing Prowler v3 with tag ${PROWLER_VERSION}"
;;
4)
echo "Releasing Prowler v4 with tag ${PROWLER_VERSION}"
;;
*)
echo "Releasing another Prowler major version, aborting..."
exit 1
;;
esac
- uses: actions/checkout@v4
- name: Install dependencies
@@ -39,7 +53,7 @@ jobs:
poetry version ${{ env.RELEASE_TAG }}
- name: Import GPG key
uses: crazy-max/ghaction-import-gpg@v4
uses: crazy-max/ghaction-import-gpg@v6
with:
gpg_private_key: ${{ secrets.GPG_PRIVATE_KEY }}
passphrase: ${{ secrets.GPG_PASSPHRASE }}
@@ -62,12 +76,6 @@ jobs:
# Push the tag
git push -f origin ${{ env.RELEASE_TAG }}
- name: Create new branch for the version update
run: |
git switch -c release-${{ env.RELEASE_TAG }}
git push --set-upstream origin release-${{ env.RELEASE_TAG }}
- name: Build Prowler package
run: |
poetry build
@@ -77,23 +85,6 @@ jobs:
poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
poetry publish
- name: Create PR to update version in the branch
run: |
echo "### Description
This PR updates Prowler Version to ${{ env.RELEASE_TAG }}.
### License
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license." |\
gh pr create \
--base ${{ env.GITHUB_BASE_BRANCH }} \
--head release-${{ env.RELEASE_TAG }} \
--title "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}." \
--body-file -
env:
GH_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}
- name: Replicate PyPI package
run: |
rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info


@@ -8,7 +8,7 @@
<p align="center">
<b>Learn more at <a href="https://prowler.com">prowler.com</a></b>
</p>
<p align="center">
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img width="30" height="30" alt="Prowler community on Slack" src="https://github.com/prowler-cloud/prowler/assets/3985464/3617e470-670c-47c9-9794-ce895ebdb627"></a>
<br>
@@ -49,7 +49,7 @@ It contains hundreds of controls covering CIS, NIST 800, NIST CSF, CISA, RBI, Fe
|---|---|---|---|---|
| AWS | 304 | 61 -> `prowler aws --list-services` | 28 -> `prowler aws --list-compliance` | 6 -> `prowler aws --list-categories` |
| GCP | 75 | 11 -> `prowler gcp --list-services` | 1 -> `prowler gcp --list-compliance` | 2 -> `prowler gcp --list-categories`|
| Azure | 109 | 16 -> `prowler azure --list-services` | CIS soon | 2 -> `prowler azure --list-categories` |
| Azure | 126 | 16 -> `prowler azure --list-services` | 2 -> `prowler azure --list-compliance` | 2 -> `prowler azure --list-categories` |
| Kubernetes | Work In Progress | - | CIS soon | - |
# 📖 Documentation


@@ -64,16 +64,17 @@ The other three cases do not need additional configuration, `--az-cli-auth` an
To use each one you need to pass the proper flag to the execution. Prowler for Azure handles two types of permission scopes, which are:
- **Azure Active Directory permissions**: Used to retrieve metadata from the identity assumed by Prowler and future AAD checks (not mandatory to have access to execute the tool)
- **Microsoft Entra ID permissions**: Used to retrieve metadata from the identity assumed by Prowler (not mandatory to have access to execute the tool).
- **Subscription scope permissions**: Required to launch the checks against your resources, mandatory to launch the tool.
#### Azure Active Directory scope
#### Microsoft Entra ID scope
Microsoft Entra ID (formerly AAD) permissions required by the tool are the following:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
The best way to assign it is through the Azure web console:
@@ -86,9 +87,10 @@ The best way to assign it is through the Azure web console:
5. In the left menu bar, select "API permissions"
6. Then click on "+ Add a permission" and select "Microsoft Graph"
7. Once in the "Microsoft Graph" view, select "Application permissions"
8. Finally, search for "Directory" and "Policy" and select the following permissions:
8. Finally, search for "Directory", "Policy" and "UserAuthenticationMethod" and select the following permissions:
- `Directory.Read.All`
- `Policy.Read.All`
- `UserAuthenticationMethod.Read.All`
![EntraID Permissions](../img/AAD-permissions.png)
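The portal steps above can also be done from the Azure CLI. The sketch below is an assumption, not part of the Prowler docs: `APP_ID` is a placeholder for your app registration's id, `00000003-0000-0000-c000-000000000000` is Microsoft Graph's well-known application id, and an authenticated `az` session is required. Admin consent must still be granted afterwards.

```shell
# Hypothetical CLI alternative to the portal steps above.
GRAPH_APP_ID="00000003-0000-0000-c000-000000000000"  # Microsoft Graph

grant_permission() {
  # Resolve the appRole id for a permission name on the Graph service
  # principal, then attach it as an application (Role) permission.
  local perm="$1" role_id
  role_id=$(az ad sp show --id "$GRAPH_APP_ID" \
    --query "appRoles[?value=='$perm'].id" -o tsv)
  az ad app permission add --id "$APP_ID" \
    --api "$GRAPH_APP_ID" --api-permissions "${role_id}=Role"
}

# Only run when az is available and APP_ID has been set by the caller.
if command -v az >/dev/null 2>&1 && [ -n "${APP_ID:-}" ]; then
  for perm in Directory.Read.All Policy.Read.All UserAuthenticationMethod.Read.All; do
    grant_permission "$perm"
  done
  az ad app permission admin-consent --id "$APP_ID"
fi
```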

Binary image file changed (358 KiB before, 376 KiB after); contents not shown.


@@ -17,6 +17,8 @@ Currently, the available frameworks are:
- `cis_1.5_aws`
- `cis_2.0_aws`
- `cis_2.0_gcp`
- `cis_2.0_azure`
- `cis_2.1_azure`
- `cis_3.0_aws`
- `cisa_aws`
- `ens_rd2022_aws`
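Each id in the list above is passed to Prowler's `--compliance` flag. A wrapper script might validate the id first; this is a sketch under the assumption that the list above is the full set (newer Prowler versions may add more):

```shell
# Framework ids from the docs above, as a simple membership check before
# invoking `prowler <provider> --compliance <id>`.
FRAMEWORKS="cis_1.5_aws cis_2.0_aws cis_2.0_gcp cis_2.0_azure cis_2.1_azure cis_3.0_aws cisa_aws ens_rd2022_aws"

is_supported() {
  local id="$1" f
  for f in $FRAMEWORKS; do
    [ "$f" = "$id" ] && return 0
  done
  return 1
}

if is_supported "cis_2.0_azure"; then
  echo "would run: prowler azure --compliance cis_2.0_azure"
fi
```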

output.zip: new binary file; contents not shown.

poetry.lock (generated): 113 lines changed

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
[[package]]
name = "about-time"
@@ -153,6 +153,17 @@ files = [
about-time = "4.2.1"
grapheme = "0.6.0"
[[package]]
name = "antlr4-python3-runtime"
version = "4.13.1"
description = "ANTLR 4.13.1 runtime for Python 3"
optional = false
python-versions = "*"
files = [
{file = "antlr4-python3-runtime-4.13.1.tar.gz", hash = "sha256:3cd282f5ea7cfb841537fe01f143350fdb1c0b1ce7981443a2fa8513fddb6d1a"},
{file = "antlr4_python3_runtime-4.13.1-py3-none-any.whl", hash = "sha256:78ec57aad12c97ac039ca27403ad61cb98aaec8a3f9bb8144f889aa0fa28b943"},
]
[[package]]
name = "anyio"
version = "4.3.0"
@@ -1446,20 +1457,20 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
version = "2.122.0"
version = "2.124.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "google-api-python-client-2.122.0.tar.gz", hash = "sha256:77447bf2d6b6ea9e686fd66fc2f12ee7a63e3889b7427676429ebf09fcb5dcf9"},
{file = "google_api_python_client-2.122.0-py2.py3-none-any.whl", hash = "sha256:a5953e60394b77b98bcc7ff7c4971ed784b3b693e9a569c176eaccb1549330f2"},
{file = "google-api-python-client-2.124.0.tar.gz", hash = "sha256:f6d3258420f7c76b0f5266b5e402e6f804e30351b018a10083f4a46c3ec33773"},
{file = "google_api_python_client-2.124.0-py2.py3-none-any.whl", hash = "sha256:07dc674449ed353704b1169fdee792f74438d024261dad71b6ce7bb9c683d51f"},
]
[package.dependencies]
google-api-core = ">=1.31.5,<2.0.dev0 || >2.3.0,<3.0.0.dev0"
google-auth = ">=1.19.0,<3.0.0.dev0"
google-auth-httplib2 = ">=0.1.0"
httplib2 = ">=0.15.0,<1.dev0"
google-auth = ">=1.32.0,<2.24.0 || >2.24.0,<2.25.0 || >2.25.0,<3.0.0.dev0"
google-auth-httplib2 = ">=0.2.0,<1.0.0"
httplib2 = ">=0.19.0,<1.dev0"
uritemplate = ">=3.0.1,<5"
[[package]]
@@ -1804,6 +1815,20 @@ files = [
[package.dependencies]
jsonpointer = ">=1.9"
[[package]]
name = "jsonpath-ng"
version = "1.6.1"
description = "A final implementation of JSONPath for Python that aims to be standard compliant, including arithmetic and binary comparison operators and providing clear AST for metaprogramming."
optional = false
python-versions = "*"
files = [
{file = "jsonpath-ng-1.6.1.tar.gz", hash = "sha256:086c37ba4917304850bd837aeab806670224d3f038fe2833ff593a672ef0a5fa"},
{file = "jsonpath_ng-1.6.1-py3-none-any.whl", hash = "sha256:8f22cd8273d7772eea9aaa84d922e0841aa36fdb8a2c6b7f6c3791a16a9bc0be"},
]
[package.dependencies]
ply = "*"
[[package]]
name = "jsonpickle"
version = "3.0.3"
@@ -2237,13 +2262,13 @@ pytz = "*"
[[package]]
name = "mkdocs-material"
version = "9.5.14"
version = "9.5.17"
description = "Documentation that simply works"
optional = false
python-versions = ">=3.8"
files = [
{file = "mkdocs_material-9.5.14-py3-none-any.whl", hash = "sha256:a45244ac221fda46ecf8337f00ec0e5cb5348ab9ffb203ca2a0c313b0d4dbc27"},
{file = "mkdocs_material-9.5.14.tar.gz", hash = "sha256:2a1f8e67cda2587ab93ecea9ba42d0ca61d1d7b5fad8cf690eeaeb39dcd4b9af"},
{file = "mkdocs_material-9.5.17-py3-none-any.whl", hash = "sha256:14a2a60119a785e70e765dd033e6211367aca9fc70230e577c1cf6a326949571"},
{file = "mkdocs_material-9.5.17.tar.gz", hash = "sha256:06ae1275a72db1989cf6209de9e9ecdfbcfdbc24c58353877b2bb927dbe413e4"},
]
[package.dependencies]
@@ -2293,16 +2318,17 @@ test = ["pytest", "pytest-cov"]
[[package]]
name = "moto"
version = "5.0.3"
version = "5.0.4"
description = ""
optional = false
python-versions = ">=3.8"
files = [
{file = "moto-5.0.3-py2.py3-none-any.whl", hash = "sha256:261d312d1d69c2afccb450a0566666d7b75d76ed6a7d00aac278a9633b073ff0"},
{file = "moto-5.0.3.tar.gz", hash = "sha256:070ac2edf89ad7aee28534481ce68e2f344c8a6a8fefec5427eea0d599bfdbdb"},
{file = "moto-5.0.4-py2.py3-none-any.whl", hash = "sha256:4054360b882b6e7bab25d52d057e196b978b8d15f1921333f534c4d8f6510bbb"},
{file = "moto-5.0.4.tar.gz", hash = "sha256:8d19125d40c919cb40df62f4576904c2647c4e9a0e1ebc42491dd7787d09e107"},
]
[package.dependencies]
antlr4-python3-runtime = {version = "*", optional = true, markers = "extra == \"all\""}
aws-xray-sdk = {version = ">=0.93,<0.96 || >0.96", optional = true, markers = "extra == \"all\""}
boto3 = ">=1.9.201"
botocore = ">=1.14.0"
@@ -2313,9 +2339,10 @@ graphql-core = {version = "*", optional = true, markers = "extra == \"all\""}
Jinja2 = ">=2.10.1"
joserfc = {version = ">=0.9.0", optional = true, markers = "extra == \"all\""}
jsondiff = {version = ">=1.1.2", optional = true, markers = "extra == \"all\""}
jsonpath-ng = {version = "*", optional = true, markers = "extra == \"all\""}
multipart = {version = "*", optional = true, markers = "extra == \"all\""}
openapi-spec-validator = {version = ">=0.5.0", optional = true, markers = "extra == \"all\""}
py-partiql-parser = {version = "0.5.1", optional = true, markers = "extra == \"all\""}
py-partiql-parser = {version = "0.5.2", optional = true, markers = "extra == \"all\""}
pyparsing = {version = ">=3.0.7", optional = true, markers = "extra == \"all\""}
python-dateutil = ">=2.1,<3.0.0"
PyYAML = {version = ">=5.1", optional = true, markers = "extra == \"all\""}
@@ -2326,24 +2353,25 @@ werkzeug = ">=0.5,<2.2.0 || >2.2.0,<2.2.1 || >2.2.1"
xmltodict = "*"
[package.extras]
all = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "multipart", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.1)", "pyparsing (>=3.0.7)", "setuptools"]
all = ["PyYAML (>=5.1)", "antlr4-python3-runtime", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "jsonpath-ng", "multipart", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.2)", "pyparsing (>=3.0.7)", "setuptools"]
apigateway = ["PyYAML (>=5.1)", "joserfc (>=0.9.0)", "openapi-spec-validator (>=0.5.0)"]
apigatewayv2 = ["PyYAML (>=5.1)", "openapi-spec-validator (>=0.5.0)"]
appsync = ["graphql-core"]
awslambda = ["docker (>=3.0.0)"]
batch = ["docker (>=3.0.0)"]
cloudformation = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.1)", "pyparsing (>=3.0.7)", "setuptools"]
cloudformation = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.2)", "pyparsing (>=3.0.7)", "setuptools"]
cognitoidp = ["joserfc (>=0.9.0)"]
dynamodb = ["docker (>=3.0.0)", "py-partiql-parser (==0.5.1)"]
dynamodbstreams = ["docker (>=3.0.0)", "py-partiql-parser (==0.5.1)"]
dynamodb = ["docker (>=3.0.0)", "py-partiql-parser (==0.5.2)"]
dynamodbstreams = ["docker (>=3.0.0)", "py-partiql-parser (==0.5.2)"]
glue = ["pyparsing (>=3.0.7)"]
iotdata = ["jsondiff (>=1.1.2)"]
proxy = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=2.5.1)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "multipart", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.1)", "pyparsing (>=3.0.7)", "setuptools"]
resourcegroupstaggingapi = ["PyYAML (>=5.1)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.1)", "pyparsing (>=3.0.7)"]
s3 = ["PyYAML (>=5.1)", "py-partiql-parser (==0.5.1)"]
s3crc32c = ["PyYAML (>=5.1)", "crc32c", "py-partiql-parser (==0.5.1)"]
server = ["PyYAML (>=5.1)", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "flask (!=2.2.0,!=2.2.1)", "flask-cors", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.1)", "pyparsing (>=3.0.7)", "setuptools"]
proxy = ["PyYAML (>=5.1)", "antlr4-python3-runtime", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=2.5.1)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "jsonpath-ng", "multipart", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.2)", "pyparsing (>=3.0.7)", "setuptools"]
resourcegroupstaggingapi = ["PyYAML (>=5.1)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.2)", "pyparsing (>=3.0.7)"]
s3 = ["PyYAML (>=5.1)", "py-partiql-parser (==0.5.2)"]
s3crc32c = ["PyYAML (>=5.1)", "crc32c", "py-partiql-parser (==0.5.2)"]
server = ["PyYAML (>=5.1)", "antlr4-python3-runtime", "aws-xray-sdk (>=0.93,!=0.96)", "cfn-lint (>=0.40.0)", "docker (>=3.0.0)", "flask (!=2.2.0,!=2.2.1)", "flask-cors", "graphql-core", "joserfc (>=0.9.0)", "jsondiff (>=1.1.2)", "jsonpath-ng", "openapi-spec-validator (>=0.5.0)", "py-partiql-parser (==0.5.2)", "pyparsing (>=3.0.7)", "setuptools"]
ssm = ["PyYAML (>=5.1)"]
stepfunctions = ["antlr4-python3-runtime", "jsonpath-ng"]
xray = ["aws-xray-sdk (>=0.93,!=0.96)", "setuptools"]
[[package]]
@@ -2894,6 +2922,17 @@ files = [
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "ply"
version = "3.11"
description = "Python Lex & Yacc"
optional = false
python-versions = "*"
files = [
{file = "ply-3.11-py2.py3-none-any.whl", hash = "sha256:096f9b8350b65ebd2fd1346b12452efe5b9607f7482813ffca50c22722a807ce"},
{file = "ply-3.11.tar.gz", hash = "sha256:00c7c1aaa88358b9c765b6d3000c6eec0ba42abca5351b095321aef446081da3"},
]
[[package]]
name = "portalocker"
version = "2.8.2"
@@ -2935,13 +2974,13 @@ files = [
[[package]]
name = "py-partiql-parser"
version = "0.5.1"
version = "0.5.2"
description = "Pure Python PartiQL Parser"
optional = false
python-versions = "*"
files = [
{file = "py-partiql-parser-0.5.1.tar.gz", hash = "sha256:aeac8f46529d8651bbae88a1a6c14dc3aa38ebc4bc6bd1eb975044c0564246c6"},
{file = "py_partiql_parser-0.5.1-py3-none-any.whl", hash = "sha256:53053e70987dea2983e1990ad85f87a7d8cec13dd4a4b065a740bcfd661f5a6b"},
{file = "py-partiql-parser-0.5.2.tar.gz", hash = "sha256:bdec65fe17d6093c05e9bc1742a99a041ef810b50a71cc0d9e74a88218d938cf"},
{file = "py_partiql_parser-0.5.2-py3-none-any.whl", hash = "sha256:9c79b59bbe0cb50daa8090020f2e7f3e5a0e33f7846b48924f19a8f7704f4877"},
]
[package.extras]
@@ -3178,13 +3217,13 @@ testing = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygm
[[package]]
name = "pytest-cov"
version = "4.1.0"
version = "5.0.0"
description = "Pytest plugin for measuring coverage."
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
files = [
{file = "pytest-cov-4.1.0.tar.gz", hash = "sha256:3904b13dfbfec47f003b8e77fd5b589cd11904a21ddf1ab38a64f204d6a10ef6"},
{file = "pytest_cov-4.1.0-py3-none-any.whl", hash = "sha256:6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a"},
{file = "pytest-cov-5.0.0.tar.gz", hash = "sha256:5837b58e9f6ebd335b0f8060eecce69b662415b16dc503883a02f45dfeb14857"},
{file = "pytest_cov-5.0.0-py3-none-any.whl", hash = "sha256:4f0764a1219df53214206bf1feea4633c3b558a2925c8b59f144f682861ce652"},
]
[package.dependencies]
@@ -3192,7 +3231,7 @@ coverage = {version = ">=5.2.1", extras = ["toml"]}
pytest = ">=4.6"
[package.extras]
testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"]
testing = ["fields", "hunter", "process-tests", "pytest-xdist", "virtualenv"]
[[package]]
name = "pytest-env"
@@ -3808,13 +3847,13 @@ crt = ["botocore[crt] (>=1.20.29,<2.0a.0)"]
[[package]]
name = "safety"
version = "3.0.1"
version = "3.1.0"
description = "Checks installed dependencies for known vulnerabilities and licenses."
optional = false
python-versions = ">=3.7"
files = [
{file = "safety-3.0.1-py3-none-any.whl", hash = "sha256:1ed058bc4bef132b974e58d7fcad020fb897cd255328016f8a5a194b94ca91d2"},
{file = "safety-3.0.1.tar.gz", hash = "sha256:1f2000f03652f3a0bfc67f8fd1e98bc5723ccb76e15cb1bdd68545c3d803df01"},
{file = "safety-3.1.0-py3-none-any.whl", hash = "sha256:f2ba2d36f15ac1e24751547a73b854509a7d6db31efd30b57f64ffdf9d021934"},
{file = "safety-3.1.0.tar.gz", hash = "sha256:71f47b82ece153ec2f240e277f7cbfa70d5da2e0d143162c67f63b2f7459a1aa"},
]
[package.dependencies]
@@ -3824,11 +3863,11 @@ dparse = ">=0.6.4b0"
jinja2 = ">=3.1.0"
marshmallow = ">=3.15.0"
packaging = ">=21.0"
pydantic = ">=1.10.12,<2.0"
pydantic = ">=1.10.12"
requests = "*"
rich = "*"
"ruamel.yaml" = ">=0.17.21"
safety-schemas = ">=0.0.1"
safety-schemas = ">=0.0.2"
setuptools = ">=65.5.1"
typer = "*"
typing-extensions = ">=4.7.1"
@@ -4426,4 +4465,4 @@ testing = ["big-O", "jaraco.functools", "jaraco.itertools", "more-itertools", "p
[metadata]
lock-version = "2.0"
python-versions = ">=3.9,<3.13"
content-hash = "1e1b7b3f2d7388580096bab9889f44901a3bde26c71bddd5e9f4b275a0e7e116"
content-hash = "4c3d437bd810c2186cff45dee54e0a97a4c58d1d39ffb956399fe686b8b00a13"

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long


@@ -11,7 +11,7 @@ from prowler.lib.logger import logger
timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "3.15.1"
prowler_version = "3.16.0"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
html_logo_img = "https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png"
square_logo_img = "https://user-images.githubusercontent.com/38561120/235905862-9ece5bd7-9aa3-4e48-807a-3a9035eb8bfb.png"


@@ -87,6 +87,7 @@ class CIS_Requirement_Attribute(BaseModel):
RemediationProcedure: str
AuditProcedure: str
AdditionalInformation: str
DefaultValue: Optional[str]
References: str
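The hunk above makes `DefaultValue` an optional field on the CIS requirement model. A stdlib sketch of the same idea (the real model is a pydantic `BaseModel`, so `Optional[str]` there means the field may be absent or null in the compliance JSON):

```python
from dataclasses import dataclass
from typing import Optional

# Stdlib stand-in for the pydantic model change: DefaultValue becomes
# optional so CIS requirement entries that omit it still validate, and the
# attribute defaults to None.
@dataclass
class CISRequirementAttribute:
    Section: str
    DefaultValue: Optional[str] = None

attr = CISRequirementAttribute(Section="1.25")
```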


@@ -11,6 +11,7 @@ from prowler.lib.outputs.models import (
Check_Output_CSV_AWS_CIS,
Check_Output_CSV_AWS_ISO27001_2013,
Check_Output_CSV_AWS_Well_Architected,
Check_Output_CSV_AZURE_CIS,
Check_Output_CSV_ENS_RD2022,
Check_Output_CSV_GCP_CIS,
Check_Output_CSV_Generic_Compliance,
@@ -35,6 +36,7 @@ def add_manual_controls(output_options, audit_info, file_descriptors):
manual_finding.region = ""
manual_finding.location = ""
manual_finding.project_id = ""
manual_finding.subscription = ""
fill_compliance(
output_options, manual_finding, audit_info, file_descriptors
)
@@ -161,7 +163,36 @@ def fill_compliance(output_options, finding, audit_info, file_descriptors):
csv_header = generate_csv_fields(
Check_Output_CSV_GCP_CIS
)
elif compliance.Provider == "Azure":
compliance_row = Check_Output_CSV_AZURE_CIS(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
Subscription=finding.subscription,
AssessmentDate=outputs_unix_timestamp(
output_options.unix_timestamp, timestamp
),
Requirements_Id=requirement_id,
Requirements_Description=requirement_description,
Requirements_Attributes_Section=attribute.Section,
Requirements_Attributes_Profile=attribute.Profile,
Requirements_Attributes_AssessmentStatus=attribute.AssessmentStatus,
Requirements_Attributes_Description=attribute.Description,
Requirements_Attributes_RationaleStatement=attribute.RationaleStatement,
Requirements_Attributes_ImpactStatement=attribute.ImpactStatement,
Requirements_Attributes_RemediationProcedure=attribute.RemediationProcedure,
Requirements_Attributes_AuditProcedure=attribute.AuditProcedure,
Requirements_Attributes_AdditionalInformation=attribute.AdditionalInformation,
Requirements_Attributes_DefaultValue=attribute.DefaultValue,
Requirements_Attributes_References=attribute.References,
Status=finding.status,
StatusExtended=finding.status_extended,
ResourceId=finding.resource_id,
ResourceName=finding.resource_name,
CheckId=finding.check_metadata.CheckID,
)
csv_header = generate_csv_fields(
Check_Output_CSV_AZURE_CIS
)
elif (
"AWS-Well-Architected-Framework" in compliance.Framework
and compliance.Provider == "AWS"
@@ -269,11 +300,19 @@ def fill_compliance(output_options, finding, audit_info, file_descriptors):
attributes_categories = ""
attributes_values = ""
attributes_comments = ""
for attribute in requirement.Attributes:
attributes_aws_services += attribute.AWSService + "\n"
attributes_categories += attribute.Category + "\n"
attributes_values += attribute.Value + "\n"
attributes_comments += attribute.Comment + "\n"
attributes_aws_services = ", ".join(
attribute.AWSService for attribute in requirement.Attributes
)
attributes_categories = ", ".join(
attribute.Category for attribute in requirement.Attributes
)
attributes_values = ", ".join(
attribute.Value for attribute in requirement.Attributes
)
attributes_comments = ", ".join(
attribute.Comment for attribute in requirement.Attributes
)
compliance_row = Check_Output_MITRE_ATTACK(
Provider=finding.check_metadata.Provider,
Description=compliance.Description,
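The attribute-aggregation refactor above can be sketched in isolation. The `Attribute` and `Requirement` shapes below are minimal stand-ins modeling only the fields being joined:

```python
from dataclasses import dataclass, field
from typing import List

# Hedged stand-ins for the MITRE ATT&CK requirement/attribute shapes.
@dataclass
class Attribute:
    AWSService: str
    Category: str

@dataclass
class Requirement:
    Attributes: List[Attribute] = field(default_factory=list)

requirement = Requirement([Attribute("iam", "identity"), Attribute("s3", "storage")])

# The old loop appended "\n" after every value, leaving a trailing separator;
# ", ".join over a generator expression yields a clean comma-separated string.
attributes_aws_services = ", ".join(a.AWSService for a in requirement.Attributes)
attributes_categories = ", ".join(a.Category for a in requirement.Attributes)
```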


@@ -15,6 +15,7 @@ from prowler.lib.outputs.models import (
Check_Output_CSV_AWS_CIS,
Check_Output_CSV_AWS_ISO27001_2013,
Check_Output_CSV_AWS_Well_Architected,
Check_Output_CSV_AZURE_CIS,
Check_Output_CSV_ENS_RD2022,
Check_Output_CSV_GCP_CIS,
Check_Output_CSV_Generic_Compliance,
@@ -23,6 +24,7 @@ from prowler.lib.outputs.models import (
)
from prowler.lib.utils.utils import file_exists, open_file
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.common.outputs import get_provider_output_model
from prowler.providers.gcp.lib.audit_info.models import GCP_Audit_Info
@@ -113,7 +115,16 @@ def fill_file_descriptors(output_modes, output_directory, output_filename, audit
filename, output_mode, audit_info, Check_Output_CSV_GCP_CIS
)
file_descriptors.update({output_mode: file_descriptor})
elif isinstance(audit_info, Azure_Audit_Info):
filename = f"{output_directory}/{output_filename}_{output_mode}{csv_file_suffix}"
if "cis_" in output_mode:
file_descriptor = initialize_file_descriptor(
filename,
output_mode,
audit_info,
Check_Output_CSV_AZURE_CIS,
)
file_descriptors.update({output_mode: file_descriptor})
elif isinstance(audit_info, AWS_Audit_Info):
if output_mode == "json-asff":
filename = f"{output_directory}/{output_filename}{json_asff_file_suffix}"


@@ -100,7 +100,17 @@ def fill_json_asff(finding_output, audit_info, finding, output_options):
if not finding.check_metadata.Remediation.Recommendation.Url:
finding.check_metadata.Remediation.Recommendation.Url = "https://docs.aws.amazon.com/securityhub/latest/userguide/what-is-securityhub.html"
finding_output.Remediation = {
"Recommendation": finding.check_metadata.Remediation.Recommendation
"Recommendation": {
"Text": (
(
finding.check_metadata.Remediation.Recommendation.Text[:509]
+ "..."
)
if len(finding.check_metadata.Remediation.Recommendation.Text) > 512
else finding.check_metadata.Remediation.Recommendation.Text
),
"Url": finding.check_metadata.Remediation.Recommendation.Url,
}
}
return finding_output
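The truncation logic added above keeps `Recommendation.Text` within Security Hub's 512-character field limit: texts longer than 512 characters are cut at 509 and suffixed with `"..."` so the result lands exactly on the limit. Extracted as a standalone helper (the function name is illustrative, not from the diff):

```python
# Sketch of the ASFF field-length handling from the hunk above.
def truncate_recommendation_text(text: str) -> str:
    return text[:509] + "..." if len(text) > 512 else text
```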


@@ -599,6 +599,35 @@ class Check_Output_CSV_GCP_CIS(BaseModel):
CheckId: str
class Check_Output_CSV_AZURE_CIS(BaseModel):
"""
Check_Output_CSV_AZURE_CIS generates a finding's output in CSV CIS format.
"""
Provider: str
Description: str
Subscription: str
AssessmentDate: str
Requirements_Id: str
Requirements_Description: str
Requirements_Attributes_Section: str
Requirements_Attributes_Profile: str
Requirements_Attributes_AssessmentStatus: str
Requirements_Attributes_Description: str
Requirements_Attributes_RationaleStatement: str
Requirements_Attributes_ImpactStatement: str
Requirements_Attributes_RemediationProcedure: str
Requirements_Attributes_AuditProcedure: str
Requirements_Attributes_AdditionalInformation: str
Requirements_Attributes_DefaultValue: str
Requirements_Attributes_References: str
Status: str
StatusExtended: str
ResourceId: str
ResourceName: str
CheckId: str
class Check_Output_CSV_Generic_Compliance(BaseModel):
"""
Check_Output_CSV_Generic_Compliance generates a finding's output in CSV Generic Compliance format.


@@ -234,6 +234,7 @@
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
@@ -260,6 +261,7 @@
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
@@ -286,6 +288,7 @@
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
@@ -1415,8 +1418,14 @@
"chime-sdk-media-pipelines": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-central-1",
"eu-west-2",
"us-east-1",
"us-west-2"
],
@@ -2262,6 +2271,7 @@
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -2296,6 +2306,7 @@
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -2332,6 +2343,7 @@
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-central-2",
@@ -2825,6 +2837,22 @@
"aws-us-gov": []
}
},
"deadline-cloud": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"us-east-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"deepcomposer": {
"regions": {
"aws": [
@@ -5992,7 +6020,9 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"lexv2-models": {
@@ -6011,7 +6041,9 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
"aws-us-gov": [
"us-gov-west-1"
]
}
},
"license-manager": {
@@ -6576,6 +6608,7 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -6597,6 +6630,7 @@
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
@@ -6606,6 +6640,7 @@
"eu-west-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"sa-east-1",
"us-east-1",
"us-east-2",
@@ -7046,6 +7081,7 @@
"ap-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
@@ -7143,6 +7179,7 @@
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-north-1",
@@ -9807,6 +9844,7 @@
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
@@ -9816,6 +9854,7 @@
"eu-central-2",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-2",
"eu-west-3",


@@ -69,6 +69,9 @@ Caller Identity ARN: {Fore.YELLOW}[{audit_info.audited_identity_arn}]{Style.RESE
def create_sts_session(
session: session.Session, aws_region: str
) -> session.Session.client:
return session.client(
"sts", aws_region, endpoint_url=f"https://sts.{aws_region}.amazonaws.com"
sts_endpoint_url = (
f"https://sts.{aws_region}.amazonaws.com"
if "cn-" not in aws_region
else f"https://sts.{aws_region}.amazonaws.com.cn"
)
return session.client("sts", aws_region, endpoint_url=sts_endpoint_url)
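The change above makes the STS endpoint partition-aware: China regions (`cn-north-1`, `cn-northwest-1`) live under the `amazonaws.com.cn` domain, so the previously hardcoded `amazonaws.com` URL failed there. The selection logic in isolation:

```python
# Sketch of the partition-aware STS endpoint selection from the hunk above.
def sts_endpoint_url(aws_region: str) -> str:
    return (
        f"https://sts.{aws_region}.amazonaws.com"
        if "cn-" not in aws_region
        else f"https://sts.{aws_region}.amazonaws.com.cn"
    )
```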


@@ -86,7 +86,7 @@ def verify_security_hub_integration_enabled_per_region(
error_message = client_error.response["Error"]["Message"]
if (
error_code == "InvalidAccessException"
and f"Account {aws_account_number} is not subscribed to AWS Security Hub in region {region}"
and f"Account {aws_account_number} is not subscribed to AWS Security Hub"
in error_message
):
logger.warning(
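The relaxed match above drops the trailing `in region {region}` from the expected error message, so the not-subscribed case is still recognized if Security Hub varies or omits the region suffix. A sketch of the predicate (the function name is illustrative):

```python
# Substring match on the stable prefix of the Security Hub error message.
def is_not_subscribed_error(error_code: str, error_message: str, account: str) -> bool:
    return (
        error_code == "InvalidAccessException"
        and f"Account {account} is not subscribed to AWS Security Hub" in error_message
    )
```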


@@ -1,5 +1,6 @@
from typing import Optional
from botocore.exceptions import ClientError
from pydantic import BaseModel
from prowler.lib.logger import logger
@@ -47,12 +48,28 @@ class APIGateway(AWSService):
logger.info("APIGateway - Getting Rest APIs authorizer...")
try:
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
authorizers = regional_client.get_authorizers(restApiId=rest_api.id)[
"items"
]
if authorizers:
rest_api.authorizer = True
try:
regional_client = self.regional_clients[rest_api.region]
authorizers = regional_client.get_authorizers(
restApiId=rest_api.id
)["items"]
if authorizers:
rest_api.authorizer = True
except ClientError as error:
if error.response["Error"]["Code"] == "NotFoundException":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
@@ -62,10 +79,25 @@ class APIGateway(AWSService):
logger.info("APIGateway - Describing Rest API...")
try:
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
rest_api_info = regional_client.get_rest_api(restApiId=rest_api.id)
if rest_api_info["endpointConfiguration"]["types"] == ["PRIVATE"]:
rest_api.public_endpoint = False
try:
regional_client = self.regional_clients[rest_api.region]
rest_api_info = regional_client.get_rest_api(restApiId=rest_api.id)
if rest_api_info["endpointConfiguration"]["types"] == ["PRIVATE"]:
rest_api.public_endpoint = False
except ClientError as error:
if error.response["Error"]["Code"] == "NotFoundException":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
@@ -75,29 +107,44 @@ class APIGateway(AWSService):
logger.info("APIGateway - Getting stages for Rest APIs...")
try:
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
stages = regional_client.get_stages(restApiId=rest_api.id)
for stage in stages["item"]:
waf = None
logging = False
client_certificate = False
if "webAclArn" in stage:
waf = stage["webAclArn"]
if "methodSettings" in stage:
if stage["methodSettings"]:
logging = True
if "clientCertificateId" in stage:
client_certificate = True
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/restapis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
try:
regional_client = self.regional_clients[rest_api.region]
stages = regional_client.get_stages(restApiId=rest_api.id)
for stage in stages["item"]:
waf = None
logging = False
client_certificate = False
if "webAclArn" in stage:
waf = stage["webAclArn"]
if "methodSettings" in stage:
if stage["methodSettings"]:
logging = True
if "clientCertificateId" in stage:
client_certificate = True
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/restapis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
)
)
except ClientError as error:
if error.response["Error"]["Code"] == "NotFoundException":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
@@ -108,33 +155,50 @@ class APIGateway(AWSService):
logger.info("APIGateway - Getting API resources...")
try:
for rest_api in self.rest_apis:
regional_client = self.regional_clients[rest_api.region]
get_resources_paginator = regional_client.get_paginator("get_resources")
for page in get_resources_paginator.paginate(restApiId=rest_api.id):
for resource in page["items"]:
id = resource["id"]
resource_methods = []
methods_auth = {}
for resource_method in resource.get(
"resourceMethods", {}
).keys():
resource_methods.append(resource_method)
try:
regional_client = self.regional_clients[rest_api.region]
get_resources_paginator = regional_client.get_paginator(
"get_resources"
)
for page in get_resources_paginator.paginate(restApiId=rest_api.id):
for resource in page["items"]:
id = resource["id"]
resource_methods = []
methods_auth = {}
for resource_method in resource.get(
"resourceMethods", {}
).keys():
resource_methods.append(resource_method)
for resource_method in resource_methods:
if resource_method != "OPTIONS":
method_config = regional_client.get_method(
restApiId=rest_api.id,
resourceId=id,
httpMethod=resource_method,
for resource_method in resource_methods:
if resource_method != "OPTIONS":
method_config = regional_client.get_method(
restApiId=rest_api.id,
resourceId=id,
httpMethod=resource_method,
)
auth_type = method_config["authorizationType"]
methods_auth.update({resource_method: auth_type})
rest_api.resources.append(
PathResourceMethods(
path=resource["path"], resource_methods=methods_auth
)
auth_type = method_config["authorizationType"]
methods_auth.update({resource_method: auth_type})
rest_api.resources.append(
PathResourceMethods(
path=resource["path"], resource_methods=methods_auth
)
except ClientError as error:
if error.response["Error"]["Code"] == "NotFoundException":
logger.warning(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
else:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.error(
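The per-API `try`/`except` blocks added throughout this service follow one pattern: a `NotFoundException` (for example, a REST API deleted mid-scan) is downgraded to a warning so one missing resource does not abort the loop, while any other client error is logged as an error. A self-contained sketch, using a minimal stand-in for `botocore.exceptions.ClientError`:

```python
# Minimal stand-in for botocore.exceptions.ClientError, modeling only the
# .response attribute the handler inspects.
class ClientError(Exception):
    def __init__(self, response):
        super().__init__(response)
        self.response = response

def log_level_for(error: Exception) -> str:
    # NotFoundException is expected during a scan and only warrants a warning.
    if isinstance(error, ClientError):
        if error.response["Error"]["Code"] == "NotFoundException":
            return "warning"
    return "error"
```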


@@ -32,7 +32,7 @@ class ApiGatewayV2(AWSService):
arn=arn,
id=apigw["ApiId"],
region=regional_client.region,
name=apigw["Name"],
name=apigw.get("Name", ""),
tags=[apigw.get("Tags")],
)
)
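The switch from `apigw["Name"]` to `apigw.get("Name", "")` tolerates API descriptions returned without a `Name` key. A sketch with a hypothetical payload:

```python
# Hypothetical API Gateway v2 description missing the "Name" key.
apigw = {"ApiId": "abc123", "Tags": {"env": "dev"}}

# dict.get with a default avoids the KeyError the old subscript raised.
name = apigw.get("Name", "")
```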


@@ -20,7 +20,7 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check)
f"Lambda function {function.name} is not recorded by CloudTrail."
)
lambda_recorded_cloudtrail = False
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
for data_event in trail.data_events:
# classic event selectors
if not data_event.is_advanced:


@@ -8,7 +8,7 @@ from prowler.providers.aws.services.s3.s3_client import s3_client
class cloudtrail_bucket_requires_mfa_delete(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.is_logging:
trail_bucket_is_in_account = False
trail_bucket = trail.s3_bucket


@@ -11,7 +11,7 @@ maximum_time_without_logging = 1
class cloudtrail_cloudwatch_logging_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.name:
report = Check_Report_AWS(self.metadata())
report.region = trail.region


@@ -7,7 +7,7 @@ from prowler.providers.aws.services.cloudtrail.cloudtrail_client import (
class cloudtrail_insights_exist(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.is_logging:
report = Check_Report_AWS(self.metadata())
report.region = trail.region


@@ -7,7 +7,7 @@ from prowler.providers.aws.services.cloudtrail.cloudtrail_client import (
class cloudtrail_kms_encryption_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.name:
report = Check_Report_AWS(self.metadata())
report.region = trail.region


@@ -7,7 +7,7 @@ from prowler.providers.aws.services.cloudtrail.cloudtrail_client import (
class cloudtrail_log_file_validation_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.name:
report = Check_Report_AWS(self.metadata())
report.region = trail.region


@@ -8,7 +8,7 @@ from prowler.providers.aws.services.s3.s3_client import s3_client
class cloudtrail_logs_s3_bucket_access_logging_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.name:
trail_bucket_is_in_account = False
trail_bucket = trail.s3_bucket


@@ -8,7 +8,7 @@ from prowler.providers.aws.services.s3.s3_client import s3_client
class cloudtrail_logs_s3_bucket_is_not_publicly_accessible(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.name:
trail_bucket_is_in_account = False
trail_bucket = trail.s3_bucket


@@ -10,8 +10,8 @@ class cloudtrail_multi_region_enabled(Check):
for region in cloudtrail_client.regional_clients.keys():
report = Check_Report_AWS(self.metadata())
report.region = region
for trail in cloudtrail_client.trails:
if trail.region == region:
for trail in cloudtrail_client.trails.values():
if trail.region == region or trail.is_multiregion:
if trail.is_logging:
report.status = "PASS"
report.resource_id = trail.name
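The widened condition above means a multi-region trail now satisfies the check for every audited region, not only the region it was created in. As a standalone predicate (the function name is illustrative):

```python
# Sketch of the region-coverage test from cloudtrail_multi_region_enabled.
def trail_covers_region(trail_region: str, is_multiregion: bool, region: str) -> bool:
    return trail_region == region or is_multiregion
```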


@@ -16,7 +16,7 @@ class cloudtrail_multi_region_enabled_logging_management_events(Check):
report.resource_id = cloudtrail_client.audited_account
report.resource_arn = cloudtrail_client.trail_arn_template
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
if trail.is_logging:
if trail.is_multiregion:
for event in trail.data_events:


@@ -8,7 +8,7 @@ from prowler.providers.aws.services.s3.s3_client import s3_client
class cloudtrail_s3_dataevents_read_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
for data_event in trail.data_events:
# classic event selectors
if not data_event.is_advanced:


@@ -8,7 +8,7 @@ from prowler.providers.aws.services.s3.s3_client import s3_client
class cloudtrail_s3_dataevents_write_enabled(Check):
def execute(self):
findings = []
for trail in cloudtrail_client.trails:
for trail in cloudtrail_client.trails.values():
for data_event in trail.data_events:
# Classic event selectors
if not data_event.is_advanced:


@@ -15,7 +15,7 @@ class Cloudtrail(AWSService):
# Call AWSService's __init__
super().__init__(__class__.__name__, audit_info)
self.trail_arn_template = f"arn:{self.audited_partition}:cloudtrail:{self.region}:{self.audited_account}:trail"
self.trails = []
self.trails = {}
self.__threading_call__(self.__get_trails__)
self.__get_trail_status__()
self.__get_insight_selectors__()
@@ -45,27 +45,23 @@ class Cloudtrail(AWSService):
kms_key_id = trail["KmsKeyId"]
if "CloudWatchLogsLogGroupArn" in trail:
log_group_arn = trail["CloudWatchLogsLogGroupArn"]
self.trails.append(
Trail(
name=trail["Name"],
is_multiregion=trail["IsMultiRegionTrail"],
home_region=trail["HomeRegion"],
arn=trail["TrailARN"],
region=regional_client.region,
is_logging=False,
log_file_validation_enabled=trail[
"LogFileValidationEnabled"
],
latest_cloudwatch_delivery_time=None,
s3_bucket=trail["S3BucketName"],
kms_key=kms_key_id,
log_group_arn=log_group_arn,
data_events=[],
has_insight_selectors=trail.get("HasInsightSelectors"),
)
self.trails[trail["TrailARN"]] = Trail(
name=trail["Name"],
is_multiregion=trail["IsMultiRegionTrail"],
home_region=trail["HomeRegion"],
arn=trail["TrailARN"],
region=regional_client.region,
is_logging=False,
log_file_validation_enabled=trail["LogFileValidationEnabled"],
latest_cloudwatch_delivery_time=None,
s3_bucket=trail["S3BucketName"],
kms_key=kms_key_id,
log_group_arn=log_group_arn,
data_events=[],
has_insight_selectors=trail.get("HasInsightSelectors"),
)
if trails_count == 0:
self.trails.append(
self.trails[self.__get_trail_arn_template__(regional_client.region)] = (
Trail(
region=regional_client.region,
)
@@ -79,7 +75,7 @@ class Cloudtrail(AWSService):
def __get_trail_status__(self):
logger.info("Cloudtrail - Getting trail status")
try:
for trail in self.trails:
for trail in self.trails.values():
for region, client in self.regional_clients.items():
if trail.region == region and trail.name:
status = client.get_trail_status(Name=trail.arn)
@@ -97,7 +93,7 @@ class Cloudtrail(AWSService):
def __get_event_selectors__(self):
logger.info("Cloudtrail - Getting event selector")
try:
for trail in self.trails:
for trail in self.trails.values():
for region, client in self.regional_clients.items():
if trail.region == region and trail.name:
data_events = client.get_event_selectors(TrailName=trail.arn)
@@ -131,7 +127,7 @@ class Cloudtrail(AWSService):
logger.info("Cloudtrail - Getting trail insight selectors...")
try:
for trail in self.trails:
for trail in self.trails.values():
for region, client in self.regional_clients.items():
if trail.region == region and trail.name:
insight_selectors = None
@@ -180,7 +176,7 @@ class Cloudtrail(AWSService):
def __list_tags_for_resource__(self):
logger.info("CloudTrail - List Tags...")
try:
for trail in self.trails:
for trail in self.trails.values():
# Check if trails are in this account and region
if (
trail.region == trail.home_region
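The container change in this service (from `self.trails = []` to `self.trails = {}` keyed by `TrailARN`) explains the `.values()` updates in all the checks above: `describe_trails` returns a multi-region trail in every region, so appending to a list accumulated duplicates, while dict insertion by ARN is idempotent. A sketch with hypothetical responses:

```python
# Hypothetical describe_trails results from two regional clients, both
# returning the same multi-region trail.
described = [
    {"TrailARN": "arn:aws:cloudtrail:us-east-1:123456789012:trail/main"},
    {"TrailARN": "arn:aws:cloudtrail:us-east-1:123456789012:trail/main"},
]

trails = {}
for trail in described:
    trails[trail["TrailARN"]] = trail  # last write wins; no duplicates
```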


@@ -12,7 +12,7 @@ def check_cloudwatch_log_metric_filter(
):
# 1. Iterate for CloudWatch Log Group in CloudTrail trails
log_groups = []
for trail in trails:
for trail in trails.values():
if trail.log_group_arn:
log_groups.append(trail.log_group_arn.split(":")[6])
# 2. Describe metric filters for previous log groups


@@ -11,7 +11,7 @@
"SubServiceName": "",
"ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
"Severity": "high",
"ResourceType": "AwsIamPolicy",
"ResourceType": "AwsIamRole",
"Description": "Ensure inline policies that allow full \"*:*\" administrative privileges are not associated to IAM identities",
"Risk": "IAM policies are the means by which privileges are granted to users, groups or roles. It is recommended and considered a standard security advice to grant least privilege—that is; granting only the permissions required to perform a task. Determine what users need to do and then craft policies for them that let the users perform only those tasks instead of allowing full administrative privileges. Providing full administrative privileges instead of restricting to the minimum set of permissions that the user is required to do exposes the resources to potentially unwanted actions.",
"RelatedUrl": "",


@@ -43,6 +43,7 @@ class sns_topics_not_publicly_accessible(Check):
else:
report.status = "FAIL"
report.status_extended = f"SNS topic {topic.name} is public because its policy allows public access."
break
findings.append(report)


@@ -41,9 +41,11 @@ class sqs_queues_not_publicly_accessible(Check):
else:
report.status = "FAIL"
report.status_extended = f"SQS queue {queue.id} is public because its policy allows public access, and the condition does not limit access to resources within the same account."
break
else:
report.status = "FAIL"
report.status_extended = f"SQS queue {queue.id} is public because its policy allows public access."
break
findings.append(report)
return findings
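The `break` added to both public-access checks stops statement scanning once one policy statement marks the topic or queue public, so the FAIL status is not re-evaluated and the finding is appended exactly once. A minimal sketch with hypothetical pre-parsed statements:

```python
# Hypothetical policy statements, already evaluated for public access.
statements = [
    {"allows_public": False},
    {"allows_public": True},
    {"allows_public": True},
]

status = "PASS"
checked = 0
for statement in statements:
    checked += 1
    if statement["allows_public"]:
        status = "FAIL"
        break  # first public statement decides the finding
```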


@@ -0,0 +1,21 @@
from uuid import UUID
# Service management API
WINDOWS_AZURE_SERVICE_MANAGEMENT_API = "797f4846-ba00-4fd7-ba43-dac1f8f63013"
# Authorization policy roles
GUEST_USER_ACCESS_NO_RESTRICTICTED = UUID("a0b1b346-4d3e-4e8b-98f8-753987be4970")
GUEST_USER_ACCESS_RESTRICTICTED = UUID("2af84b1e-32c8-42b7-82bc-daa82404023b")
# General built-in roles
CONTRIBUTOR_ROLE_ID = "b24988ac-6180-42a0-ab88-20f7382dd24c"
OWNER_ROLE_ID = "8e3af657-a8ff-443c-a75c-2fe8c4bcb635"
# Compute roles
VIRTUAL_MACHINE_CONTRIBUTOR_ROLE_ID = "9980e02c-c2be-4d73-94e8-173b1dc7cf3c"
VIRTUAL_MACHINE_ADMINISTRATOR_LOGIN_ROLE_ID = "1c0163c0-47e6-4577-8991-ea5c82e286e4"
VIRTUAL_MACHINE_USER_LOGIN_ROLE_ID = "fb879df8-f326-4884-b1cf-06f3ad86be52"
VIRTUAL_MACHINE_LOCAL_USER_LOGIN_ROLE_ID = "602da2ba-a5c2-41da-b01d-5360126ab525"
WINDOWS_ADMIN_CENTER_ADMINISTRATOR_LOGIN_ROLE_ID = (
"a6333a3e-0164-44c3-b281-7a577aff287f"
)


@@ -1,3 +1,5 @@
from typing import Any
from prowler.lib.logger import logger
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
@@ -5,7 +7,7 @@ from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
class AzureService:
def __init__(
self,
service: str,
service: Any,
audit_info: Azure_Audit_Info,
):
self.clients = self.__set_clients__(

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "app_http_logs_enabled",
"CheckTitle": "Ensure that logging for Azure AppService 'HTTP logs' is enabled",
"CheckType": [],
"ServiceName": "app",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "low",
"ResourceType": "Microsoft.Web/sites/config",
"Description": "Enable AppServiceHTTPLogs diagnostic log category for Azure App Service instances to ensure all http requests are captured and centrally logged.",
"Risk": "Capturing web requests can be important supporting information for security analysts performing monitoring and incident response activities. Once logging, these logs can be ingested into SIEM or other central aggregation point for the organization.",
"RelatedUrl": "https://learn.microsoft.com/en-us/security/benchmark/azure/mcsb-logging-threat-detection#lt-3-enable-logging-for-security-investigation",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": "https://docs.bridgecrew.io/docs/ensure-that-app-service-enables-http-logging#terraform"
},
"Recommendation": {
"Text": "1. Go to App Services For each App Service: 2. Go to Diagnostic Settings 3. Click Add Diagnostic Setting 4. Check the checkbox next to 'HTTP logs' 5. Configure a destination based on your specific logging consumption capability (for example Stream to an event hub and then consuming with SIEM integration for Event Hub logging).",
"Url": "https://docs.microsoft.com/en-us/azure/app-service/troubleshoot-diagnostic-logs"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Log consumption and processing will incur additional cost."
}

View File

@@ -0,0 +1,29 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.app.app_client import app_client


class app_http_logs_enabled(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for subscription_name, apps in app_client.apps.items():
            for app_name, app in apps.items():
                if "functionapp" not in app.kind:
                    report = Check_Report_Azure(self.metadata())
                    report.status = "FAIL"
                    report.subscription = subscription_name
                    report.resource_name = app_name
                    report.resource_id = app.resource_id
                    if not app.monitor_diagnostic_settings:
                        report.status_extended = f"App {app_name} does not have a diagnostic setting in subscription {subscription_name}."
                    else:
                        report.status_extended = f"App {app_name} does not have HTTP Logs enabled in any diagnostic setting in subscription {subscription_name}."
                        for diagnostic_setting in app.monitor_diagnostic_settings:
                            if any(
                                log.category == "AppServiceHTTPLogs" and log.enabled
                                for log in diagnostic_setting.logs
                            ):
                                report.status = "PASS"
                                report.status_extended = f"App {app_name} has HTTP Logs enabled in diagnostic setting {diagnostic_setting.name} in subscription {subscription_name}."
                                break
                    findings.append(report)
        return findings
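The log-scanning condition in this check can be expressed as a standalone predicate, which makes it easy to unit-test against plain data (a sketch; the dict shapes below only mimic the monitor service models, they are not Prowler types):

```python
def http_logs_enabled(diagnostic_settings) -> bool:
    # PASS when any diagnostic setting has the AppServiceHTTPLogs
    # category present and enabled.
    return any(
        log["category"] == "AppServiceHTTPLogs" and log["enabled"]
        for setting in diagnostic_settings
        for log in setting["logs"]
    )


enabled = [{"name": "ds1", "logs": [{"category": "AppServiceHTTPLogs", "enabled": True}]}]
disabled = [{"name": "ds1", "logs": [{"category": "AppServiceHTTPLogs", "enabled": False}]}]
print(http_logs_enabled(enabled), http_logs_enabled(disabled))  # True False
```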

View File

@@ -6,6 +6,8 @@ from azure.mgmt.web.models import ManagedServiceIdentity, SiteConfigResource
from prowler.lib.logger import logger
from prowler.providers.azure.lib.audit_info.models import Azure_Audit_Info
from prowler.providers.azure.lib.service.service import AzureService
from prowler.providers.azure.services.monitor.monitor_client import monitor_client
from prowler.providers.azure.services.monitor.monitor_service import DiagnosticSetting
########################## App
@@ -49,8 +51,12 @@ class App(AzureService):
getattr(app, "client_cert_enabled", False),
getattr(app, "client_cert_mode", "Ignore"),
),
monitor_diagnostic_settings=self.__get_app_monitor_settings__(
app.name, app.resource_group, subscription_name
),
https_only=getattr(app, "https_only", False),
identity=getattr(app, "identity", None),
kind=getattr(app, "kind", "app"),
)
}
)
@@ -78,6 +84,21 @@ class App(AzureService):
return cert_mode
def __get_app_monitor_settings__(self, app_name, resource_group, subscription):
logger.info(f"App - Getting monitor diagnostics settings for {app_name}...")
monitor_diagnostics_settings = []
try:
monitor_diagnostics_settings = monitor_client.diagnostic_settings_with_uri(
self.subscriptions[subscription],
f"subscriptions/{self.subscriptions[subscription]}/resourceGroups/{resource_group}/providers/Microsoft.Web/sites/{app_name}",
monitor_client.clients[subscription],
)
except Exception as error:
logger.error(
f"Subscription name: {self.subscription} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return monitor_diagnostics_settings
@dataclass
class WebApp:
@@ -87,3 +108,5 @@ class WebApp:
client_cert_mode: str = "Ignore"
auth_enabled: bool = False
https_only: bool = False
monitor_diagnostic_settings: list[DiagnosticSetting] = None
kind: str = "app"

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_conditional_access_policy_require_mfa_for_management_api",
"CheckTitle": "Ensure Multifactor Authentication is Required for Windows Azure Service Management API",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "#microsoft.graph.conditionalAccess",
"Description": "This recommendation ensures that users accessing the Windows Azure Service Management API (i.e. Azure Powershell, Azure CLI, Azure Resource Manager API, etc.) are required to use multifactor authentication (MFA) credentials when accessing resources through the Windows Azure Service Management API.",
"Risk": "Administrative access to the Windows Azure Service Management API should be secured with a higher level of scrutiny to authenticating mechanisms. Enabling multifactor authentication is recommended to reduce the potential for abuse of Administrative actions, and to prevent intruders or compromised admin credentials from changing administrative settings.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/conditional-access/howto-conditional-access-policy-azure-management",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From the Azure Admin Portal dashboard, open Microsoft Entra ID. 2. Click Security in the Entra ID blade. 3. Click Conditional Access in the Security blade. 4. Click Policies in the Conditional Access blade. 5. Click + New policy. 6. Enter a name for the policy. 7. Click the blue text under Users. 8. Under Include, select All users. 9. Under Exclude, check Users and groups. 10. Select users or groups to be exempted from this policy (e.g. break-glass emergency accounts, and non-interactive service accounts) then click the Select button. 11. Click the blue text under Target Resources. 12. Under Include, click the Select apps radio button. 13. Click the blue text under Select. 14. Check the box next to Windows Azure Service Management APIs then click the Select button. 15. Click the blue text under Grant. 16. Under Grant access check the box for Require multifactor authentication then click the Select button. 17. Before creating, set Enable policy to Report-only. 18. Click Create. After testing the policy in report-only mode, update the Enable policy setting from Report-only to On.",
"Url": "https://learn.microsoft.com/en-us/entra/identity/conditional-access/concept-conditional-access-cloud-apps"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Conditional Access policies require Microsoft Entra ID P1 or P2 licenses. Similarly, they may require additional overhead to maintain if users lose access to their MFA. Any users or groups which are granted an exception to this policy should be carefully tracked, be granted only minimal necessary privileges, and conditional access exceptions should be regularly reviewed or investigated."
}

View File

@@ -0,0 +1,44 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.config import WINDOWS_AZURE_SERVICE_MANAGEMENT_API
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_conditional_access_policy_require_mfa_for_management_api(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for (
            tenant_name,
            conditional_access_policies,
        ) in entra_client.conditional_access_policy.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_name}"
            report.resource_name = "Conditional Access Policy"
            report.resource_id = "Conditional Access Policy"
            report.status_extended = (
                "Conditional Access Policy does not require MFA for management API."
            )
            for policy_id, policy in conditional_access_policies.items():
                if (
                    policy.state == "enabled"
                    and "All" in policy.users["include"]
                    and WINDOWS_AZURE_SERVICE_MANAGEMENT_API
                    in policy.target_resources["include"]
                    and any(
                        "mfa" in access_control.lower()
                        for access_control in policy.access_controls["grant"]
                    )
                ):
                    report.status = "PASS"
                    report.status_extended = (
                        "Conditional Access Policy requires MFA for management API."
                    )
                    report.resource_id = policy_id
                    report.resource_name = policy.name
                    break
            findings.append(report)
        return findings
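A policy passes this check only when four conditions hold at once: it is enabled, scoped to all users, targets the Service Management API, and grants access with an MFA control. Pulled out as a pure function over plain dicts (a sketch for illustration, not the Prowler model types):

```python
# Well-known application id of the Windows Azure Service Management API.
MANAGEMENT_API = "797f4846-ba00-4fd7-ba43-dac1f8f63013"


def requires_mfa_for_management_api(policy: dict) -> bool:
    # All four conditions must hold for the policy to satisfy the check.
    return (
        policy.get("state") == "enabled"
        and "All" in policy.get("users", {}).get("include", [])
        and MANAGEMENT_API in policy.get("target_resources", {}).get("include", [])
        and any(
            "mfa" in control.lower()
            for control in policy.get("access_controls", {}).get("grant", [])
        )
    )


policy = {
    "state": "enabled",
    "users": {"include": ["All"]},
    "target_resources": {"include": [MANAGEMENT_API]},
    "access_controls": {"grant": ["Mfa"]},
}
print(requires_mfa_for_management_api(policy))  # True
```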

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_global_admin_in_less_than_five_users",
"CheckTitle": "Ensure fewer than 5 users have global administrator assignment",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.directoryRole",
"Description": "This recommendation aims to maintain a balance between security and operational efficiency by ensuring that a minimum of 2 and a maximum of 4 users are assigned the Global Administrator role in Microsoft Entra ID. Having at least two Global Administrators ensures redundancy, while limiting the number to four reduces the risk of excessive privileged access.",
"Risk": "The Global Administrator role has extensive privileges across all services in Microsoft Entra ID. The Global Administrator role should never be used in regular daily activities; administrators should have a regular user account for daily activities, and a separate account for administrative responsibilities. Limiting the number of Global Administrators helps mitigate the risk of unauthorized access, reduces the potential impact of human error, and aligns with the principle of least privilege to reduce the attack surface of an Azure tenant. Conversely, having at least two Global Administrators ensures that administrative functions can be performed without interruption in case of unavailability of a single admin.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/role-based-access-control/best-practices#5-limit-the-number-of-global-administrators-to-less-than-5",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Select Roles and Administrators 4. Select Global Administrator 5. Ensure less than 5 users are actively assigned the role. 6. Ensure that at least 2 users are actively assigned the role.",
"Url": "https://learn.microsoft.com/en-us/microsoft-365/admin/add-users/about-admin-roles?view=o365-worldwide#security-guidelines-for-assigning-roles"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Implementing this recommendation may require changes in administrative workflows or the redistribution of roles and responsibilities. Adequate training and awareness should be provided to all Global Administrators."
}

View File

@@ -0,0 +1,36 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_global_admin_in_less_than_five_users(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, directory_roles in entra_client.directory_roles.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_domain}"
            report.resource_name = "Global Administrator"
            if "Global Administrator" in directory_roles:
                report.resource_id = getattr(
                    directory_roles["Global Administrator"],
                    "id",
                    "Global Administrator",
                )
                num_global_admins = len(
                    getattr(directory_roles["Global Administrator"], "members", [])
                )
                if num_global_admins < 5:
                    report.status = "PASS"
                    report.status_extended = (
                        f"There are {num_global_admins} global administrators."
                    )
                else:
                    report.status_extended = f"There are {num_global_admins} global administrators. It should be less than five."
            findings.append(report)
        return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_non_privileged_user_has_mfa",
"CheckTitle": "Ensure that 'Multi-Factor Auth Status' is 'Enabled' for all Non-Privileged Users",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.users",
"Description": "Enable multi-factor authentication for all non-privileged users.",
"Risk": "Multi-factor authentication requires an individual to present a minimum of two separate forms of authentication before access is granted. Multi-factor authentication provides additional assurance that the individual attempting to gain access is who they claim to be. With multi-factor authentication, an attacker would need to compromise at least two different authentication mechanisms, increasing the difficulty of compromise and thus reducing the risk.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/authentication/concept-mfa-howitworks",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/multi-factor-authentication-for-all-non-privileged-users.html#",
"Terraform": ""
},
"Recommendation": {
"Text": "Activate one of the available multi-factor authentication methods for users in Microsoft Entra ID.",
"Url": "https://learn.microsoft.com/en-us/entra/identity/authentication/tutorial-enable-azure-mfa"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Users would require two forms of authentication before any access is granted. Also, this requires an overhead for managing dual forms of authentication."
}

View File

@@ -0,0 +1,34 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
from prowler.providers.azure.services.entra.lib.user_privileges import (
    is_privileged_user,
)


class entra_non_privileged_user_has_mfa(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, users in entra_client.users.items():
            for user_domain_name, user in users.items():
                if not is_privileged_user(
                    user, entra_client.directory_roles[tenant_domain]
                ):
                    report = Check_Report_Azure(self.metadata())
                    report.status = "FAIL"
                    report.subscription = f"Tenant: {tenant_domain}"
                    report.resource_name = user_domain_name
                    report.resource_id = user.id
                    report.status_extended = (
                        f"Non-privileged user {user.name} does not have MFA."
                    )
                    if len(user.authentication_methods) > 1:
                        report.status = "PASS"
                        report.status_extended = (
                            f"Non-privileged user {user.name} has MFA."
                        )
                    findings.append(report)
        return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_policy_default_users_cannot_create_security_groups",
"CheckTitle": "Ensure that 'Users can create security groups in Azure portals, API or PowerShell' is set to 'No'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.authorizationPolicy",
"Description": "Restrict security group creation to administrators only.",
"Risk": "When creating security groups is enabled, all users in the directory are allowed to create new security groups and add members to those groups. Unless a business requires this day-to-day delegation, security group creation should be restricted to administrators only.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/users/groups-self-service-management",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/users-can-create-security-groups.html",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Select Groups 4. Select General under Settings 5. Set Users can create security groups in Azure portals, API or PowerShell to No",
"Url": ""
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Enabling this setting could create a number of requests that would need to be managed by an administrator."
}

View File

@@ -0,0 +1,30 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_policy_default_users_cannot_create_security_groups(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, auth_policy in entra_client.authorization_policy.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_domain}"
            report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
            report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
            report.status_extended = "Non-privileged users are able to create security groups via the Access Panel and the Azure administration portal."
            if getattr(
                auth_policy, "default_user_role_permissions", None
            ) and not getattr(
                auth_policy.default_user_role_permissions,
                "allowed_to_create_security_groups",
                True,
            ):
                report.status = "PASS"
                report.status_extended = "Non-privileged users are not able to create security groups via the Access Panel and the Azure administration portal."
            findings.append(report)
        return findings

View File

@@ -15,7 +15,7 @@
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/users-can-register-applications.html",
"Terraform": ""
},
"Recommendation": {

View File

@@ -10,12 +10,14 @@ class entra_policy_ensure_default_user_cannot_create_apps(Check):
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"All from tenant '{tenant_domain}'"
report.resource_name = auth_policy.name
report.resource_id = auth_policy.id
report.subscription = f"Tenant: {tenant_domain}"
report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
report.status_extended = "App creation is not disabled for non-admin users."
if auth_policy.default_user_role_permissions and not getattr(
if getattr(
auth_policy, "default_user_role_permissions", None
) and not getattr(
auth_policy.default_user_role_permissions,
"allowed_to_create_apps",
True,

View File

@@ -7,17 +7,18 @@ class entra_policy_ensure_default_user_cannot_create_tenants(Check):
findings = []
for tenant_domain, auth_policy in entra_client.authorization_policy.items():
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"All from tenant '{tenant_domain}'"
report.resource_name = auth_policy.name
report.resource_id = auth_policy.id
report.subscription = f"Tenant: {tenant_domain}"
report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
report.status_extended = (
"Tenants creation is not disabled for non-admin users."
)
if auth_policy.default_user_role_permissions and not getattr(
if getattr(
auth_policy, "default_user_role_permissions", None
) and not getattr(
auth_policy.default_user_role_permissions,
"allowed_to_create_tenants",
True,

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_policy_guest_invite_only_for_admin_roles",
"CheckTitle": "Ensure that 'Guest invite restrictions' is set to 'Only users assigned to specific admin roles can invite guest users'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "#microsoft.graph.authorizationPolicy",
"Description": "Restrict invitations to users with specific administrative roles only.",
"Risk": "Restricting invitations to users with specific administrator roles ensures that only authorized accounts have access to cloud resources. This helps to maintain 'Need to Know' permissions and prevents inadvertent access to data. By default the setting Guest invite restrictions is set to Anyone in the organization can invite guest users including guests and non-admins. This would allow anyone within the organization to invite guests and non-admins to the tenant, posing a security risk.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/external-id/external-collaboration-settings-configure",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Then External Identities 4. Select External collaboration settings 5. Under Guest invite settings, for Guest invite restrictions, ensure that Only users assigned to specific admin roles can invite guest users is selected",
"Url": "https://learn.microsoft.com/en-us/answers/questions/685101/how-to-allow-only-admins-to-add-guests"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "With the option of Only users assigned to specific admin roles can invite guest users selected, users with specific admin roles will be in charge of sending invitations to the external users, requiring additional overhead by them to manage user accounts. This will mean coordinating with other departments as they are onboarding new users."
}

View File

@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_policy_guest_invite_only_for_admin_roles(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, auth_policy in entra_client.authorization_policy.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_domain}"
            report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
            report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
            report.status_extended = "Guest invitations are not restricted to users with specific administrative roles only."
            if getattr(auth_policy, "guest_invite_settings", "everyone") in (
                "adminsAndGuestInviters",
                "none",
            ):
                report.status = "PASS"
                report.status_extended = "Guest invitations are restricted to users with specific administrative roles only."
            findings.append(report)
        return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_policy_guest_users_access_restrictions",
"CheckTitle": "Ensure That 'Guest users access restrictions' is set to 'Guest user access is restricted to properties and memberships of their own directory objects'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "#microsoft.graph.authorizationPolicy",
"Description": "Limit guest user permissions.",
"Risk": "Limiting guest access ensures that guest accounts do not have permission for certain directory tasks, such as enumerating users, groups or other directory resources, and cannot be assigned to administrative roles in your directory. Guest access has three levels of restriction. 1. Guest users have the same access as members (most inclusive), 2. Guest users have limited access to properties and memberships of directory objects (default value), 3. Guest user access is restricted to properties and memberships of their own directory objects (most restrictive). The recommended option is the 3rd, most restrictive: 'Guest user access is restricted to their own directory object'.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/users/users-restrict-guest-permissions",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Then External Identities 4. Select External collaboration settings 5. Under Guest user access, change Guest user access restrictions to be Guest user access is restricted to properties and memberships of their own directory objects",
"Url": "https://learn.microsoft.com/en-us/entra/fundamentals/users-default-permissions#member-and-guest-users"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "This may create additional requests for permissions to access resources that administrators will need to approve. According to https://learn.microsoft.com/en-us/azure/active-directory/enterprise- users/users-restrict-guest-permissions#services-currently-not-supported Service without current support might have compatibility issues with the new guest restriction setting."
}

View File

@@ -0,0 +1,27 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.config import GUEST_USER_ACCESS_RESTRICTICTED
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_policy_guest_users_access_restrictions(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, auth_policy in entra_client.authorization_policy.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_domain}"
            report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
            report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
            report.status_extended = "Guest user access is not restricted to properties and memberships of their own directory objects."
            if (
                getattr(auth_policy, "guest_user_role_id", None)
                == GUEST_USER_ACCESS_RESTRICTICTED
            ):
                report.status = "PASS"
                report.status_extended = "Guest user access is restricted to properties and memberships of their own directory objects."
            findings.append(report)
        return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_policy_restricts_user_consent_for_apps",
"CheckTitle": "Ensure 'User consent for applications' is set to 'Do not allow user consent'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.authorizationPolicy",
"Description": "Require administrators to provide consent for applications before use.",
"Risk": "If Microsoft Entra ID is running as an identity provider for third-party applications, permissions and consent should be limited to administrators or pre-approved. Malicious applications may attempt to exfiltrate data or abuse privileged user accounts.",
"RelatedUrl": "https://learn.microsoft.com/en-gb/entra/identity/enterprise-apps/configure-user-consent?pivots=portal",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/users-can-consent-to-apps-accessing-company-data-on-their-behalf.html#",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Select Enterprise Applications 4. Select Consent and permissions 5. Select User consent settings 6. Set User consent for applications to Do not allow user consent 7. Click save",
"Url": "https://learn.microsoft.com/en-us/security/benchmark/azure/mcsb-privileged-access#pa-1-separate-and-limit-highly-privilegedadministrative-users"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Enforcing this setting may create additional requests that administrators need to review."
}

View File

@@ -0,0 +1,30 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client


class entra_policy_restricts_user_consent_for_apps(Check):
    def execute(self) -> list[Check_Report_Azure]:
        findings = []
        for tenant_domain, auth_policy in entra_client.authorization_policy.items():
            report = Check_Report_Azure(self.metadata())
            report.status = "FAIL"
            report.subscription = f"Tenant: {tenant_domain}"
            report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
            report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
            report.status_extended = "Entra allows users to consent to apps accessing company data on their behalf."
            if getattr(auth_policy, "default_user_role_permissions", None) and not any(
                "ManagePermissionGrantsForSelf" in policy_assigned
                for policy_assigned in getattr(
                    auth_policy.default_user_role_permissions,
                    "permission_grant_policies_assigned",
                    ["ManagePermissionGrantsForSelf.microsoft-user-default-legacy"],
                )
            ):
                report.status = "PASS"
                report.status_extended = "Entra does not allow users to consent to apps accessing company data on their behalf."
            findings.append(report)
        return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_policy_user_consent_for_verified_apps",
"CheckTitle": "Ensure 'User consent for applications' Is Set To 'Allow for Verified Publishers'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.authorizationPolicy",
"Description": "Allow users to provide consent for selected permissions when a request is coming from a verified publisher.",
"Risk": "If Microsoft Entra ID is running as an identity provider for third-party applications, permissions and consent should be limited to administrators or pre-approved. Malicious applications may attempt to exfiltrate data or abuse privileged user accounts.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/enterprise-apps/configure-user-consent?pivots=portal#configure-user-consent-to-applications",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Select Enterprise Applications 4. Select Consent and permissions 5. Select User consent settings 6. Under User consent for applications, select Allow user consent for apps from verified publishers, for selected permissions 7. Select Save",
"Url": "https://learn.microsoft.com/en-us/security/benchmark/azure/mcsb-privileged-access#pa-1-separate-and-limit-highly-privilegedadministrative-users"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Enforcing this setting may create additional requests that administrators need to review."
}

View File

@@ -0,0 +1,31 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
class entra_policy_user_consent_for_verified_apps(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for tenant_domain, auth_policy in entra_client.authorization_policy.items():
report = Check_Report_Azure(self.metadata())
report.status = "PASS"
report.subscription = f"Tenant: {tenant_domain}"
report.resource_name = getattr(auth_policy, "name", "Authorization Policy")
report.resource_id = getattr(auth_policy, "id", "authorizationPolicy")
report.status_extended = "Entra does not allow users to consent non-verified apps accessing company data on their behalf."
if getattr(auth_policy, "default_user_role_permissions", None) and any(
"ManagePermissionGrantsForSelf.microsoft-user-default-legacy"
in policy_assigned
for policy_assigned in getattr(
auth_policy.default_user_role_permissions,
"permission_grant_policies_assigned",
["ManagePermissionGrantsForSelf.microsoft-user-default-legacy"],
)
):
report.status = "FAIL"
report.status_extended = "Entra allows users to consent apps accessing company data on their behalf."
findings.append(report)
return findings
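The FAIL branch above hinges on whether the legacy `ManagePermissionGrantsForSelf.microsoft-user-default-legacy` grant policy is still assigned to default user role permissions. A minimal standalone sketch of that predicate (the policy lists below are illustrative, not pulled from a real tenant):

```python
LEGACY_CONSENT_POLICY = "ManagePermissionGrantsForSelf.microsoft-user-default-legacy"


def allows_unverified_app_consent(permission_grant_policies_assigned):
    """Return True when the tenant still allows users to consent to any app
    (the legacy default), mirroring the check's FAIL condition."""
    return any(
        LEGACY_CONSENT_POLICY in policy
        for policy in permission_grant_policies_assigned
    )


# Hypothetical assigned-policy lists:
print(allows_unverified_app_consent(
    ["ManagePermissionGrantsForSelf.microsoft-user-default-legacy"]))  # True -> FAIL
print(allows_unverified_app_consent(
    ["ManagePermissionGrantsForSelf.microsoft-user-default-low"]))     # False -> PASS
```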

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_privileged_user_has_mfa",
"CheckTitle": "Ensure that 'Multi-Factor Auth Status' is 'Enabled' for all Privileged Users",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.users",
"Description": "Enable multi-factor authentication for all roles, groups, and users that have write access or permissions to Azure resources. These include custom-created objects or built-in roles such as: - Service Co-Administrators - Subscription Owners - Contributors",
"Risk": "Multi-factor authentication requires an individual to present a minimum of two separate forms of authentication before access is granted. Multi-factor authentication provides additional assurance that the individual attempting to gain access is who they claim to be. With multi-factor authentication, an attacker would need to compromise at least two different authentication mechanisms, increasing the difficulty of compromise and thus reducing the risk.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/authentication/concept-mfa-howitworks",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/multi-factor-authentication-for-all-privileged-users.html#",
"Terraform": ""
},
"Recommendation": {
"Text": "Activate one of the available multi-factor authentication methods for users in Microsoft Entra ID.",
"Url": "https://learn.microsoft.com/en-us/entra/identity/authentication/tutorial-enable-azure-mfa"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Users would require two forms of authentication before any access is granted. Additional administrative time will be required for managing dual forms of authentication when enabling multi-factor authentication."
}

View File

@@ -0,0 +1,32 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
from prowler.providers.azure.services.entra.lib.user_privileges import (
is_privileged_user,
)
class entra_privileged_user_has_mfa(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for tenant_domain, users in entra_client.users.items():
for user_domain_name, user in users.items():
if is_privileged_user(
user, entra_client.directory_roles[tenant_domain]
):
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"Tenant: {tenant_domain}"
report.resource_name = user_domain_name
report.resource_id = user.id
report.status_extended = (
f"Privileged user {user.name} does not have MFA."
)
if len(user.authentication_methods) > 1:
report.status = "PASS"
report.status_extended = f"Privileged user {user.name} has MFA."
findings.append(report)
return findings
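The PASS decision above treats a user with more than one registered authentication method as having MFA (one method beyond the primary password). Sketched as a standalone predicate, with illustrative method names:

```python
def has_mfa(authentication_methods) -> bool:
    """More than one registered authentication method implies a second
    factor beyond the password, mirroring the check's PASS condition."""
    return len(authentication_methods) > 1


# Hypothetical registered-method lists:
print(has_mfa(["password"]))                            # False -> FAIL
print(has_mfa(["password", "microsoftAuthenticator"]))  # True  -> PASS
```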

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_security_defaults_enabled",
"CheckTitle": "Ensure Security Defaults is enabled on Microsoft Entra ID",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "#microsoft.graph.identitySecurityDefaultsEnforcementPolicy",
"Description": "Security defaults in Microsoft Entra ID make it easier to be secure and help protect your organization. Security defaults contain preconfigured security settings for common attacks. Security defaults is available to everyone. The goal is to ensure that all organizations have a basic level of security enabled at no extra cost. You may turn on security defaults in the Azure portal.",
"Risk": "Security defaults provide secure default settings that we manage on behalf of organizations to keep customers safe until they are ready to manage their own identity security settings. For example, doing the following: - Requiring all users and admins to register for MFA. - Challenging users with MFA - when necessary, based on factors such as location, device, role, and task. - Disabling authentication from legacy authentication clients, which can't do MFA.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/fundamentals/security-defaults",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/security-defaults-enabled.html#",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu. 2. Browse to Microsoft Entra ID > Properties 3. Select Manage security defaults 4. Set the Enable security defaults to Enabled 5. Select Save",
"Url": "https://techcommunity.microsoft.com/t5/microsoft-entra-blog/introducing-security-defaults/ba-p/1061414"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "This recommendation should be implemented initially and then may be overridden by other service/product specific CIS Benchmarks. Administrators should also be aware that certain configurations in Microsoft Entra ID may impact other Microsoft services such as Microsoft 365."
}

View File

@@ -0,0 +1,26 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
class entra_security_defaults_enabled(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for (
tenant,
security_default,
) in entra_client.security_default.items():
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"Tenant: {tenant}"
report.resource_name = getattr(security_default, "name", "Security Default")
report.resource_id = getattr(security_default, "id", "Security Default")
report.status_extended = "Entra security defaults is disabled."
if getattr(security_default, "is_enabled", False):
report.status = "PASS"
report.status_extended = "Entra security defaults is enabled."
findings.append(report)
return findings

View File

@@ -1,18 +1,20 @@
import asyncio
from dataclasses import dataclass
from typing import Optional
from typing import Any, List, Optional
from uuid import UUID
from msgraph import GraphServiceClient
from msgraph.generated.models.default_user_role_permissions import (
DefaultUserRolePermissions,
)
from msgraph.generated.models.setting_value import SettingValue
from pydantic import BaseModel
from prowler.lib.logger import logger
from prowler.providers.azure.config import GUEST_USER_ACCESS_NO_RESTRICTICTED
from prowler.providers.azure.lib.service.service import AzureService
########################## Entra
class Entra(AzureService):
def __init__(self, azure_audit_info):
super().__init__(GraphServiceClient, azure_audit_info)
@@ -20,10 +22,26 @@ class Entra(AzureService):
self.authorization_policy = asyncio.get_event_loop().run_until_complete(
self.__get_authorization_policy__()
)
self.group_settings = asyncio.get_event_loop().run_until_complete(
self.__get_group_settings__()
)
self.security_default = asyncio.get_event_loop().run_until_complete(
self.__get_security_default__()
)
self.named_locations = asyncio.get_event_loop().run_until_complete(
self.__get_named_locations__()
)
self.directory_roles = asyncio.get_event_loop().run_until_complete(
self.__get_directory_roles__()
)
self.conditional_access_policy = asyncio.get_event_loop().run_until_complete(
self.__get_conditional_access_policy__()
)
async def __get_users__(self):
logger.info("Entra - Getting users...")
users = {}
try:
users = {}
for tenant, client in self.clients.items():
users_list = await client.users.get()
users.update({tenant: {}})
@@ -31,20 +49,36 @@ class Entra(AzureService):
users[tenant].update(
{
user.user_principal_name: User(
id=user.id, name=user.display_name
id=user.id,
name=user.display_name,
authentication_methods=(
await client.users.by_user_id(
user.id
).authentication.methods.get()
).value,
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
if (
error.__class__.__name__ == "ODataError"
and error.__dict__.get("response_status_code", None) == 403
):
logger.error(
"You need 'UserAuthenticationMethod.Read.All' permission to access this information. It only can be granted through Service Principal authentication."
)
else:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return users
async def __get_authorization_policy__(self):
logger.info("Entra - Getting authorization policy...")
authorization_policy = {}
try:
authorization_policy = {}
for tenant, client in self.clients.items():
auth_policy = await client.policies.authorization_policy.get()
authorization_policy.update(
@@ -56,6 +90,16 @@ class Entra(AzureService):
default_user_role_permissions=getattr(
auth_policy, "default_user_role_permissions", None
),
guest_invite_settings=(
auth_policy.allow_invites_from.value
if getattr(auth_policy, "allow_invites_from", None)
else "everyone"
),
guest_user_role_id=getattr(
auth_policy,
"guest_user_role_id",
GUEST_USER_ACCESS_NO_RESTRICTICTED,
),
)
}
)
@@ -66,10 +110,202 @@ class Entra(AzureService):
return authorization_policy
async def __get_group_settings__(self):
logger.info("Entra - Getting group settings...")
group_settings = {}
try:
for tenant, client in self.clients.items():
group_settings_list = await client.group_settings.get()
group_settings.update({tenant: {}})
for group_setting in group_settings_list.value:
group_settings[tenant].update(
{
group_setting.id: GroupSetting(
name=getattr(group_setting, "display_name", None),
template_id=getattr(group_setting, "template_id", None),
settings=getattr(group_setting, "values", []),
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return group_settings
async def __get_security_default__(self):
logger.info("Entra - Getting security default...")
try:
security_defaults = {}
for tenant, client in self.clients.items():
security_default = (
await client.policies.identity_security_defaults_enforcement_policy.get()
)
security_defaults.update(
{
tenant: SecurityDefault(
id=security_default.id,
name=security_default.display_name,
is_enabled=security_default.is_enabled,
),
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return security_defaults
async def __get_named_locations__(self):
logger.info("Entra - Getting named locations...")
named_locations = {}
try:
for tenant, client in self.clients.items():
named_locations_list = (
await client.identity.conditional_access.named_locations.get()
)
named_locations.update({tenant: {}})
for named_location in getattr(named_locations_list, "value", []):
named_locations[tenant].update(
{
named_location.id: NamedLocation(
name=named_location.display_name,
ip_ranges_addresses=[
getattr(ip_range, "cidr_address", None)
for ip_range in getattr(
named_location, "ip_ranges", []
)
],
is_trusted=getattr(named_location, "is_trusted", False),
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return named_locations
async def __get_directory_roles__(self):
logger.info("Entra - Getting directory roles...")
directory_roles_with_members = {}
try:
for tenant, client in self.clients.items():
directory_roles_with_members.update({tenant: {}})
directory_roles = await client.directory_roles.get()
for directory_role in directory_roles.value:
directory_role_members = (
await client.directory_roles.by_directory_role_id(
directory_role.id
).members.get()
)
directory_roles_with_members[tenant].update(
{
directory_role.display_name: DirectoryRole(
id=directory_role.id,
members=[
self.users[tenant][member.user_principal_name]
for member in directory_role_members.value
if self.users[tenant].get(
member.user_principal_name, None
)
],
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return directory_roles_with_members
async def __get_conditional_access_policy__(self):
logger.info("Entra - Getting conditional access policy...")
conditional_access_policy = {}
try:
for tenant, client in self.clients.items():
conditional_access_policies = (
await client.identity.conditional_access.policies.get()
)
conditional_access_policy.update({tenant: {}})
for policy in getattr(conditional_access_policies, "value", []):
conditions = getattr(policy, "conditions", None)
included_apps = []
excluded_apps = []
if getattr(conditions, "applications", None):
if getattr(conditions.applications, "include_applications", []):
included_apps = conditions.applications.include_applications
elif getattr(
conditions.applications, "include_user_actions", []
):
included_apps = conditions.applications.include_user_actions
if getattr(conditions.applications, "exclude_applications", []):
excluded_apps = conditions.applications.exclude_applications
elif getattr(
conditions.applications, "exclude_user_actions", []
):
excluded_apps = conditions.applications.exclude_user_actions
grant_access_controls = []
block_access_controls = []
for access_control in (
getattr(policy.grant_controls, "built_in_controls")
if policy.grant_controls
else []
):
if "Grant" in str(access_control):
grant_access_controls.append(str(access_control))
else:
block_access_controls.append(str(access_control))
conditional_access_policy[tenant].update(
{
policy.id: ConditionalAccessPolicy(
name=policy.display_name,
state=getattr(policy, "state", "None"),
users={
"include": (
getattr(conditions.users, "include_users", [])
if getattr(conditions, "users", None)
else []
),
"exclude": (
getattr(conditions.users, "exclude_users", [])
if getattr(conditions, "users", None)
else []
),
},
target_resources={
"include": included_apps,
"exclude": excluded_apps,
},
access_controls={
"grant": grant_access_controls,
"block": block_access_controls,
},
)
}
)
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return conditional_access_policy
class User(BaseModel):
id: str
name: str
authentication_methods: List[Any] = []
@dataclass
@@ -78,3 +314,37 @@ class AuthorizationPolicy:
name: str
description: str
default_user_role_permissions: Optional[DefaultUserRolePermissions]
guest_invite_settings: str
guest_user_role_id: UUID
@dataclass
class GroupSetting:
name: Optional[str]
template_id: Optional[str]
settings: List[SettingValue]
class SecurityDefault(BaseModel):
id: str
name: str
is_enabled: bool
class NamedLocation(BaseModel):
name: str
ip_ranges_addresses: List[str]
is_trusted: bool
class DirectoryRole(BaseModel):
id: str
members: List[User]
class ConditionalAccessPolicy(BaseModel):
name: str
state: str
users: dict[str, List[str]]
target_resources: dict[str, List[str]]
access_controls: dict[str, List[str]]

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_trusted_named_locations_exists",
"CheckTitle": "Ensure Trusted Locations Are Defined",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "#microsoft.graph.ipNamedLocation",
"Description": "Microsoft Entra ID Conditional Access allows an organization to configure Named locations and configure whether those locations are trusted or untrusted. These settings provide organizations the means to specify Geographical locations for use in conditional access policies, or define actual IP addresses and IP ranges and whether or not those IP addresses and/or ranges are trusted by the organization.",
"Risk": "Defining trusted source IP addresses or ranges helps organizations create and enforce Conditional Access policies around those trusted or untrusted IP addresses and ranges. Users authenticating from trusted IP addresses and/or ranges may have less access restrictions or access requirements when compared to users that try to authenticate to Microsoft Entra ID from untrusted locations or untrusted source IP addresses/ranges.",
"RelatedUrl": "https://learn.microsoft.com/en-us/entra/identity/conditional-access/location-condition",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. Navigate to the Microsoft Entra ID Conditional Access Blade 2. Click on the Named locations blade 3. Within the Named locations blade, click on IP ranges location 4. Enter a name for this location setting in the Name text box 5. Click on the + sign 6. Add an IP Address Range in CIDR notation inside the text box that appears 7. Click on the Add button 8. Repeat steps 5 through 7 for each IP Range that needs to be added 9. If the information entered are trusted ranges, select the Mark as trusted location check box 10. Once finished, click on Create",
"Url": "https://learn.microsoft.com/en-us/security/benchmark/azure/mcsb-identity-management#im-7-restrict-resource-access-based-on--conditions"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "When configuring Named locations, the organization can create locations using Geographical location data or by defining source IP addresses or ranges. Configuring Named locations using a Country location does not provide the organization the ability to mark those locations as trusted, and any Conditional Access policy relying on those Countries location setting will not be able to use the All trusted locations setting within the Conditional Access policy. They instead will have to rely on the Select locations setting. This may add additional resource requirements when configuring, and will require thorough organizational testing. In general, Conditional Access policies may completely prevent users from authenticating to Microsoft Entra ID, and thorough testing is recommended. To avoid complete lockout, a 'Break Glass' account with full Global Administrator rights is recommended in the event all other administrators are locked out of authenticating to Microsoft Entra ID. This 'Break Glass' account should be excluded from Conditional Access Policies and should be configured with the longest pass phrase feasible. This account should only be used in the event of an emergency and complete administrator lockout."
}

View File

@@ -0,0 +1,29 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
class entra_trusted_named_locations_exists(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for tenant, named_locations in entra_client.named_locations.items():
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"Tenant: {tenant}"
report.resource_name = "Named Locations"
report.resource_id = "Named Locations"
report.status_extended = (
"There is no trusted location with IP ranges defined."
)
for named_location_id, named_location in named_locations.items():
report.resource_name = named_location.name
report.resource_id = named_location_id
if named_location.ip_ranges_addresses and named_location.is_trusted:
report.status = "PASS"
report.status_extended = f"There is a trusted location with trusted IP ranges: {[ip_range for ip_range in named_location.ip_ranges_addresses if ip_range]}"
break
findings.append(report)
return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_user_with_vm_access_has_mfa",
"CheckTitle": "Ensure only MFA enabled identities can access privileged Virtual Machine",
"CheckType": [],
"ServiceName": "iam",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "#microsoft.graph.users",
"Description": "Identify identities without MFA that can log in to a privileged virtual machine using separate login credentials. An adversary can leverage the access to move laterally and perform actions with the virtual machine's managed identity. Make sure the virtual machine only has necessary permissions, and revoke the admin-level permissions according to the least privilege principle.",
"Risk": "Managed disks are by default encrypted on the underlying hardware, so no additional encryption is required for basic protection. It is available if additional encryption is required. Managed disks are by design more resilient than storage accounts. For ARM-deployed Virtual Machines, Azure Adviser will at some point recommend moving VHDs to managed disks both from a security and cost management perspective.",
"RelatedUrl": "",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "",
"Terraform": ""
},
"Recommendation": {
"Text": "1. Log in to the Azure portal. 2. Remediate by enabling MFA for the user, removing user access, or reducing access of managed identities attached to virtual machines. Case I: Enable MFA for users with access to virtual machines. 1. Navigate to Azure AD from the left pane and select Users from the Manage section. 2. Click on Per-User MFA from the top menu and, for each user whose MULTI-FACTOR AUTH STATUS is Disabled and who can log in to virtual machines, select Enable from the quick steps on the right, click Enable multi-factor auth, and share the link with the user to set up MFA as required. Case II: Remove user access on a virtual machine. 1. Select the Subscription, then click on Access control (IAM). 2. Select Role assignments and search for Virtual Machine Administrator Login, Virtual Machine User Login, or any role that provides access to log in to virtual machines. 3. Click on the Role Name, select Assignments, and remove identities with no MFA configured. Case III: Reduce access of managed identities attached to virtual machines. 1. Select the Subscription, then click on Access control (IAM). 2. Select Role Assignments from the top menu and filter on Assignment type as Privileged administrator roles and Type as Virtual Machines. 3. Click on the Role Name, select Assignments, and remove identity access, making sure this follows the least privilege principle.",
"Url": ""
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "This recommendation requires an Azure AD P2 License to implement. Ensure that identities that are provisioned to a virtual machine utilizes an RBAC/ABAC group and is allocated a role using Azure PIM, and the Role settings require MFA or use another PAM solution (like CyberArk) for accessing Virtual Machines."
}

View File

@@ -0,0 +1,53 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.config import (
CONTRIBUTOR_ROLE_ID,
OWNER_ROLE_ID,
VIRTUAL_MACHINE_ADMINISTRATOR_LOGIN_ROLE_ID,
VIRTUAL_MACHINE_CONTRIBUTOR_ROLE_ID,
VIRTUAL_MACHINE_LOCAL_USER_LOGIN_ROLE_ID,
VIRTUAL_MACHINE_USER_LOGIN_ROLE_ID,
WINDOWS_ADMIN_CENTER_ADMINISTRATOR_LOGIN_ROLE_ID,
)
from prowler.providers.azure.services.entra.entra_client import entra_client
from prowler.providers.azure.services.iam.iam_client import iam_client
class entra_user_with_vm_access_has_mfa(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for users in entra_client.users.values():
for user_domain_name, user in users.items():
for (
subscription_name,
role_assigns,
) in iam_client.role_assignments.items():
for assignment in role_assigns.values():
if (
assignment.agent_type == "User"
and assignment.role_id
in [
CONTRIBUTOR_ROLE_ID,
OWNER_ROLE_ID,
VIRTUAL_MACHINE_CONTRIBUTOR_ROLE_ID,
VIRTUAL_MACHINE_ADMINISTRATOR_LOGIN_ROLE_ID,
VIRTUAL_MACHINE_USER_LOGIN_ROLE_ID,
VIRTUAL_MACHINE_LOCAL_USER_LOGIN_ROLE_ID,
WINDOWS_ADMIN_CENTER_ADMINISTRATOR_LOGIN_ROLE_ID,
]
and assignment.agent_id == user.id
):
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.status_extended = f"User {user.name} without MFA can access VMs in subscription {subscription_name}."
report.subscription = subscription_name
report.resource_name = user_domain_name
report.resource_id = user.id
if len(user.authentication_methods) > 1:
report.status = "PASS"
report.status_extended = f"User {user.name} can access VMs in subscription {subscription_name} but it has MFA."
findings.append(report)
return findings

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "entra_users_cannot_create_microsoft_365_groups",
"CheckTitle": "Ensure that 'Users can create Microsoft 365 groups in Azure portals, API or PowerShell' is set to 'No'",
"CheckType": [],
"ServiceName": "entra",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "high",
"ResourceType": "Microsoft.Users/Settings",
"Description": "Restrict Microsoft 365 group creation to administrators only.",
"Risk": "Restricting Microsoft 365 group creation to administrators only ensures that creation of Microsoft 365 groups is controlled by the administrator. Appropriate groups should be created and managed by the administrator and group creation rights should not be delegated to any other user.",
"RelatedUrl": "https://learn.microsoft.com/en-us/microsoft-365/community/all-about-groups#microsoft-365-groups",
"Remediation": {
"Code": {
"CLI": "",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity/knowledge-base/azure/ActiveDirectory/users-can-create-office-365-groups.html#",
"Terraform": ""
},
"Recommendation": {
"Text": "1. From Azure Home select the Portal Menu 2. Select Microsoft Entra ID 3. Then Groups 4. Select General in settings 5. Set Users can create Microsoft 365 groups in Azure portals, API or PowerShell to No",
"Url": "https://learn.microsoft.com/en-us/microsoft-365/solutions/manage-creation-of-groups?view=o365-worldwide&redirectSourcePath=%252fen-us%252farticle%252fControl-who-can-create-Office-365-Groups-4c46c8cb-17d0-44b5-9776-005fced8e618"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "Enabling this setting could create a number of requests that would need to be managed by an administrator."
}

View File

@@ -0,0 +1,32 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.entra.entra_client import entra_client
class entra_users_cannot_create_microsoft_365_groups(Check):
def execute(self) -> Check_Report_Azure:
findings = []
for tenant_domain, group_settings in entra_client.group_settings.items():
report = Check_Report_Azure(self.metadata())
report.status = "FAIL"
report.subscription = f"Tenant: {tenant_domain}"
report.resource_name = "Microsoft365 Groups"
report.resource_id = "Microsoft365 Groups"
report.status_extended = "Users can create Microsoft 365 groups."
for group_setting in group_settings.values():
if group_setting.name == "Group.Unified":
for setting_value in group_setting.settings:
if (
getattr(setting_value, "name", "") == "EnableGroupCreation"
and setting_value.value != "true"
):
report.status = "PASS"
report.status_extended = (
"Users cannot create Microsoft 365 groups."
)
break
findings.append(report)
return findings

View File

@@ -0,0 +1,25 @@
"""
This module contains functions with user privileges in Azure.
"""
def is_privileged_user(user, privileged_roles) -> bool:
"""
Checks if a user is a privileged user.
Args:
user: An object representing the user to be checked.
privileged_roles: A dictionary containing privileged roles.
Returns:
A boolean value indicating whether the user is a privileged user.
"""
is_privileged = False
for role in privileged_roles.values():
if user in role.members:
is_privileged = True
break
return is_privileged
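A minimal usage sketch of `is_privileged_user`, with stand-in `User` and `DirectoryRole` dataclasses (the role and user values are hypothetical; the real check passes the Pydantic models from the Entra service):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class User:
    id: str
    name: str


@dataclass
class DirectoryRole:
    id: str
    members: List[User] = field(default_factory=list)


def is_privileged_user(user, privileged_roles) -> bool:
    """True if the user is a member of any of the given directory roles."""
    for role in privileged_roles.values():
        if user in role.members:  # dataclass equality compares by fields
            return True
    return False


admin = User(id="u1", name="Alice")
reader = User(id="u2", name="Bob")
roles = {"Global Administrator": DirectoryRole(id="r1", members=[admin])}
print(is_privileged_user(admin, roles))   # True
print(is_privileged_user(reader, roles))  # False
```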

View File

@@ -12,6 +12,7 @@ class IAM(AzureService):
def __init__(self, audit_info):
super().__init__(AuthorizationManagementClient, audit_info)
self.roles, self.custom_roles = self.__get_roles__()
self.role_assignments = self.__get_role_assignments__()
def __get_roles__(self):
logger.info("IAM - Getting roles...")
@@ -52,6 +53,34 @@ class IAM(AzureService):
)
return builtin_roles, custom_roles
def __get_role_assignments__(self):
logger.info("IAM - Getting role assignments...")
role_assignments = {}
for subscription, client in self.clients.items():
try:
role_assignments.update({subscription: {}})
all_role_assignments = client.role_assignments.list_for_subscription(
filter="atScope()"
)
for role_assignment in all_role_assignments:
role_assignments[subscription].update(
{
role_assignment.id: RoleAssignment(
agent_id=role_assignment.principal_id,
agent_type=role_assignment.principal_type,
role_id=role_assignment.role_definition_id.split("/")[
-1
],
)
}
)
except Exception as error:
logger.error(f"Subscription name: {subscription}")
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
return role_assignments
@dataclass
class Role:
@@ -60,3 +89,10 @@ class Role:
type: str
assignable_scopes: list[str]
permissions: list[Permission]
@dataclass
class RoleAssignment:
agent_id: str
agent_type: str
role_id: str
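The `role_id` stored in `RoleAssignment` above is the trailing GUID of the full ARM `role_definition_id` path, obtained with `split("/")[-1]`. A quick standalone sketch (the subscription ID is made up; the GUID shown is the well-known Contributor role definition ID):

```python
def role_id_from_definition(role_definition_id: str) -> str:
    """Extract the bare role GUID from a full ARM role definition path."""
    return role_definition_id.split("/")[-1]


path = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/providers/Microsoft.Authorization/roleDefinitions"
    "/b24988ac-6180-42a0-ab88-20f7382dd24c"
)
print(role_id_from_definition(path))  # b24988ac-6180-42a0-ab88-20f7382dd24c
```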

View File

@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "keyvault_logging_enabled",
"CheckTitle": "Ensure that logging for Azure Key Vault is 'Enabled'",
"CheckType": [],
"ServiceName": "keyvault",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "KeyVault",
"Description": "Enable AuditEvent logging for key vault instances to ensure interactions with key vaults are logged and available.",
"Risk": "Monitoring how and when key vaults are accessed, and by whom, enables an audit trail of interactions with confidential information, keys, and certificates managed by Azure Keyvault. Enabling logging for Key Vault saves information in an Azure storage account which the user provides. This creates a new container named insights-logs-auditevent automatically for the specified storage account. This same storage account can be used for collecting logs for multiple key vaults.",
"RelatedUrl": "https://docs.microsoft.com/en-us/azure/key-vault/key-vault-logging",
"Remediation": {
"Code": {
"CLI": "az monitor diagnostic-settings create --name <diagnostic settings name> --resource <key vault resource ID> --logs '[{category:AuditEvents,enabled:true,retention-policy:{enabled:true,days:180}}]' --metrics '[{category:AllMetrics,enabled:true,retention-policy:{enabled:true,days:180}}]' <[--event-hub <event hub ID> --event-hub-rule <event hub auth rule ID> | --storage-account <storage account ID> | --workspace <log analytics workspace ID> | --marketplace-partner-id <full resource ID of third-party solution>]>",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity-staging/knowledge-base/azure/KeyVault/enable-audit-event-logging-for-azure-key-vaults.html",
"Terraform": ""
},
"Recommendation": {
"Text": "1. Go to Key vaults 2. For each Key vault 3. Go to Diagnostic settings 4. Click on Edit Settings 5. Ensure that Archive to a storage account is Enabled 6. Ensure that AuditEvent is checked, and the retention days is set to 180 days or as appropriate",
"Url": "https://docs.microsoft.com/en-us/security/benchmark/azure/security-controls-v3-data-protection#dp-8-ensure-security-of-key-and-certificate-repository"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "By default, Diagnostic AuditEvent logging is not enabled for Key Vault instances."
}

View File

@@ -0,0 +1,43 @@
from prowler.lib.check.models import Check, Check_Report_Azure
from prowler.providers.azure.services.keyvault.keyvault_client import keyvault_client


class keyvault_logging_enabled(Check):
    def execute(self) -> Check_Report_Azure:
        findings = []
        for subscription, key_vaults in keyvault_client.key_vaults.items():
            for keyvault in key_vaults:
                keyvault_name = keyvault.name
                subscription_name = subscription
                if not keyvault.monitor_diagnostic_settings:
                    report = Check_Report_Azure(self.metadata())
                    report.status = "FAIL"
                    report.subscription = subscription_name
                    report.resource_name = keyvault.name
                    report.resource_id = keyvault.id
                    report.status_extended = f"There are no diagnostic settings capturing audit logs for Key Vault {keyvault_name} in subscription {subscription_name}."
                    findings.append(report)
                else:
                    for diagnostic_setting in keyvault.monitor_diagnostic_settings:
                        report = Check_Report_Azure(self.metadata())
                        report.subscription = subscription_name
                        report.resource_name = diagnostic_setting.name
                        report.resource_id = diagnostic_setting.id
                        report.status = "FAIL"
                        report.status_extended = f"Diagnostic setting {diagnostic_setting.name} for Key Vault {keyvault_name} in subscription {subscription_name} does not have audit logging."
                        audit = False
                        allLogs = False
                        for log in diagnostic_setting.logs:
                            if log.category_group == "audit" and log.enabled:
                                audit = True
                            if log.category_group == "allLogs" and log.enabled:
                                allLogs = True
                        if audit and allLogs:
                            report.status = "PASS"
                            report.status_extended = f"Diagnostic setting {diagnostic_setting.name} for Key Vault {keyvault_name} in subscription {subscription_name} has audit logging."
                            break
                    findings.append(report)
        return findings
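The check above only reports PASS for a diagnostic setting when both the `audit` and the `allLogs` category groups are enabled. A self-contained sketch of that decision, where the `Log` tuple is a hypothetical stand-in for the monitor service's log-settings model:

```python
from collections import namedtuple

# Hypothetical stand-in for the DiagnosticSetting log entries the check iterates.
Log = namedtuple("Log", ["category_group", "enabled"])


def has_audit_logging(logs) -> bool:
    """Return True only when both 'audit' and 'allLogs' category groups are enabled."""
    audit = any(log.category_group == "audit" and log.enabled for log in logs)
    all_logs = any(log.category_group == "allLogs" and log.enabled for log in logs)
    return audit and all_logs


# A setting with only the audit group enabled still fails the check.
assert not has_audit_logging([Log("audit", True), Log("allLogs", False)])
assert has_audit_logging([Log("audit", True), Log("allLogs", True)])
```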


@@ -11,9 +11,11 @@ from azure.mgmt.keyvault.v2023_07_01.models import (
from prowler.lib.logger import logger
from prowler.providers.azure.lib.service.service import AzureService
from prowler.providers.azure.services.monitor.monitor_client import monitor_client
from prowler.providers.azure.services.monitor.monitor_service import DiagnosticSetting
########################## Storage
########################## KeyVault
class KeyVault(AzureService):
    def __init__(self, audit_info):
        super().__init__(KeyVaultManagementClient, audit_info)
@@ -47,6 +49,9 @@ class KeyVault(AzureService):
                        properties=keyvault_properties,
                        keys=keys,
                        secrets=secrets,
                        monitor_diagnostic_settings=self.__get_vault_monitor_settings__(
                            keyvault_name, resource_group, subscription
                        ),
                    )
                )
        except Exception as error:
@@ -116,6 +121,25 @@ class KeyVault(AzureService):
            )
        return secrets

    def __get_vault_monitor_settings__(
        self, keyvault_name, resource_group, subscription
    ):
        logger.info(
            f"KeyVault - Getting monitor diagnostics settings for {keyvault_name}..."
        )
        monitor_diagnostics_settings = []
        try:
            monitor_diagnostics_settings = monitor_client.diagnostic_settings_with_uri(
                self.subscriptions[subscription],
                f"subscriptions/{self.subscriptions[subscription]}/resourceGroups/{resource_group}/providers/Microsoft.KeyVault/vaults/{keyvault_name}",
                monitor_client.clients[subscription],
            )
        except Exception as error:
            logger.error(
                f"Subscription name: {self.subscription} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
        return monitor_diagnostics_settings
@dataclass
class Key:
@@ -145,3 +169,4 @@ class KeyVaultInfo:
    properties: VaultProperties
    keys: list[Key] = None
    secrets: list[Secret] = None
    monitor_diagnostic_settings: list[DiagnosticSetting] = None
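`__get_vault_monitor_settings__` above builds a standard ARM resource path for the vault before asking the monitor client for its diagnostic settings. A minimal helper isolating just that URI construction (the function name is illustrative, not part of the service):

```python
def keyvault_resource_uri(
    subscription_id: str, resource_group: str, vault_name: str
) -> str:
    """Build the ARM resource path for a Key Vault, as passed to the monitor client."""
    return (
        f"subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.KeyVault/vaults/{vault_name}"
    )
```

This mirrors the f-string inside the service method; the monitor client then lists diagnostic settings scoped to that URI.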


@@ -29,5 +29,4 @@ class monitor_alert_create_update_public_ip_address_rule(Check):
                            break
                    findings.append(report)
        return findings


@@ -0,0 +1,30 @@
{
"Provider": "azure",
"CheckID": "monitor_diagnostic_settings_exists",
"CheckTitle": "Ensure that a 'Diagnostic Setting' exists for Subscription Activity Logs ",
"CheckType": [],
"ServiceName": "monitor",
"SubServiceName": "",
"ResourceIdTemplate": "",
"Severity": "medium",
"ResourceType": "Monitor",
"Description": "Enable Diagnostic settings for exporting activity logs. Diagnostic settings are available for each individual resource within a subscription. Settings should be configured for all appropriate resources for your environment.",
"Risk": "A diagnostic setting controls how a diagnostic log is exported. By default, logs are retained only for 90 days. Diagnostic settings should be defined so that logs can be exported and stored for a longer duration in order to analyze security activities within an Azure subscription.",
"RelatedUrl": "https://learn.microsoft.com/en-us/cli/azure/monitor/diagnostic-settings?view=azure-cli-latest",
"Remediation": {
"Code": {
"CLI": "az monitor diagnostic-settings subscription create --subscription <subscription id> --name <diagnostic settings name> --location <location> <[- -event-hub <event hub ID> --event-hub-auth-rule <event hub auth rule ID>] [-- storage-account <storage account ID>] [--workspace <log analytics workspace ID>] --logs '<JSON encoded categories>' (e.g. [{category:Security,enabled:true},{category:Administrative,enabled:true},{cat egory:Alert,enabled:true},{category:Policy,enabled:true}])",
"NativeIaC": "",
"Other": "https://www.trendmicro.com/cloudoneconformity-staging/knowledge-base/azure/Monitor/subscription-activity-log-diagnostic-settings.html#trendmicro",
"Terraform": ""
},
"Recommendation": {
"Text": "To enable Diagnostic Settings on a Subscription: 1. Go to Monitor 2. Click on Activity Log 3. Click on Export Activity Logs 4. Click + Add diagnostic setting 5. Enter a Diagnostic setting name 6. Select Categories for the diagnostic settings 7. Select the appropriate Destination details (this may be Log Analytics, Storage Account, Event Hub, or Partner solution) 8. Click Save To enable Diagnostic Settings on a specific resource: 1. Go to Monitor 2. Click Diagnostic settings 3. Click on the resource that has a diagnostics status of disabled 4. Select Add Diagnostic Setting 5. Enter a Diagnostic setting name 6. Select the appropriate log, metric, and destination. (this may be Log Analytics, Storage Account, Event Hub, or Partner solution) 7. Click save Repeat these step for all resources as needed.",
"Url": "https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs#export-the-activity-log-with-a-log-profile"
}
},
"Categories": [],
"DependsOn": [],
"RelatedTo": [],
"Notes": "By default, diagnostic setting is not set."
}
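Check metadata blocks like the one above are plain JSON files. A quick sketch of validating that a metadata document carries the fields shown in this diff (the field list below is taken from these blocks, not from an authoritative Prowler schema):

```python
import json

# Fields every metadata block in this diff carries (illustrative, not a full schema).
REQUIRED_FIELDS = {
    "Provider",
    "CheckID",
    "CheckTitle",
    "ServiceName",
    "Severity",
    "Description",
    "Risk",
    "Remediation",
}


def missing_fields(metadata_json: str) -> set:
    """Return the required fields absent from a metadata JSON document."""
    return REQUIRED_FIELDS - json.loads(metadata_json).keys()


incomplete = '{"Provider": "azure", "CheckID": "monitor_diagnostic_settings_exists"}'
# missing_fields(incomplete) reports every required field except Provider and CheckID
```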

Some files were not shown because too many files have changed in this diff.