Mirror of https://github.com/prowler-cloud/prowler.git, synced 2026-01-25 02:08:11 +00:00

Compare commits (1 commit)
| Author | SHA1 | Date |
|---|---|---|
|  | bffe2a2c63 |  |
2
.github/CODEOWNERS
vendored
@@ -1 +1 @@
* @prowler-cloud/prowler-oss
* @prowler-cloud/prowler-team
52
.github/ISSUE_TEMPLATE/bug_report.md
vendored
Normal file
@@ -0,0 +1,52 @@
---
name: Bug report
about: Create a report to help us improve
title: "[Bug]: "
labels: bug, status/needs-triage
assignees: ''

---

<!--
Please use this template to create your bug report. By providing as much info as possible you help us understand the issue, reproduce it and resolve it for you quicker. Therefore, take a couple of extra minutes to make sure you have provided all info needed.

PROTIP: record your screen and attach it as a gif to showcase the issue.

- How to record and attach gif: https://bit.ly/2Mi8T6K
-->

**What happened?**
A clear and concise description of what the bug is or what is not working as expected


**How to reproduce it**
Steps to reproduce the behavior:
1. What command are you running?
2. Cloud provider you are launching
3. Environment you have like single account, multi-account, organizations, multi or single subscription, etc.
4. See error


**Expected behavior**
A clear and concise description of what you expected to happen.


**Screenshots or Logs**
If applicable, add screenshots to help explain your problem.
Also, you can add logs (anonymize them first!). Here is a command that may help to share a log:
`prowler <your arguments> --log-level DEBUG --log-file $(date +%F)_debug.log` then attach here the log file.


**From where are you running Prowler?**
Please, complete the following information:
- Resource: (e.g. EC2 instance, Fargate task, Docker container manually, EKS, Cloud9, CodeBuild, workstation, etc.)
- OS: [e.g. Amazon Linux 2, Mac, Alpine, Windows, etc. ]
- Prowler Version [`prowler --version`]:
- Python version [`python --version`]:
- Pip version [`pip --version`]:
- Installation method (Are you running it from pip package or cloning the github repo?):
- Others:


**Additional context**
Add any other context about the problem here.
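A side note on the log command suggested in the template above: `$(date +%F)` expands to the ISO `YYYY-MM-DD` date, so each day's debug log lands in its own file. A quick sketch of just the filename expansion (no Prowler invocation):

```shell
# Build the log filename the template's suggested command would write to.
LOG_FILE="$(date +%F)_debug.log"
echo "$LOG_FILE"   # e.g. 2023-03-15_debug.log
```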
97
.github/ISSUE_TEMPLATE/bug_report.yml
vendored
@@ -1,97 +0,0 @@
name: 🐞 Bug Report
description: Create a report to help us improve
title: "[Bug]: "
labels: ["bug", "status/needs-triage"]

body:
  - type: textarea
    id: reproduce
    attributes:
      label: Steps to Reproduce
      description: Steps to reproduce the behavior
      placeholder: |-
        1. What command are you running?
        2. Cloud provider you are launching
        3. Environment you have, like single account, multi-account, organizations, multi or single subscription, etc.
        4. See error
    validations:
      required: true
  - type: textarea
    id: expected
    attributes:
      label: Expected behavior
      description: A clear and concise description of what you expected to happen.
    validations:
      required: true
  - type: textarea
    id: actual
    attributes:
      label: Actual Result with Screenshots or Logs
      description: If applicable, add screenshots to help explain your problem. Also, you can add logs (anonymize them first!). Here a command that may help to share a log `prowler <your arguments> --log-level DEBUG --log-file $(date +%F)_debug.log` then attach here the log file.
    validations:
      required: true
  - type: dropdown
    id: type
    attributes:
      label: How did you install Prowler?
      options:
        - Cloning the repository from github.com (git clone)
        - From pip package (pip install prowler)
        - From brew (brew install prowler)
        - Docker (docker pull toniblyx/prowler)
    validations:
      required: true
  - type: textarea
    id: environment
    attributes:
      label: Environment Resource
      description: From where are you running Prowler?
      placeholder: |-
        1. EC2 instance
        2. Fargate task
        3. Docker container locally
        4. EKS
        5. Cloud9
        6. CodeBuild
        7. Workstation
        8. Other(please specify)
    validations:
      required: true
  - type: textarea
    id: os
    attributes:
      label: OS used
      description: Which OS are you using?
      placeholder: |-
        1. Amazon Linux 2
        2. MacOS
        3. Alpine Linux
        4. Windows
        5. Other(please specify)
    validations:
      required: true
  - type: input
    id: prowler-version
    attributes:
      label: Prowler version
      description: Which Prowler version are you using?
      placeholder: |-
        prowler --version
    validations:
      required: true
  - type: input
    id: pip-version
    attributes:
      label: Pip version
      description: Which pip version are you using?
      placeholder: |-
        pip --version
    validations:
      required: true
  - type: textarea
    id: additional
    attributes:
      description: Additional context
      label: Context
    validations:
      required: false
36
.github/ISSUE_TEMPLATE/feature-request.yml
vendored
@@ -1,36 +0,0 @@
name: 💡 Feature Request
description: Suggest an idea for this project
labels: ["enhancement", "status/needs-triage"]


body:
  - type: textarea
    id: Problem
    attributes:
      label: New feature motivation
      description: Is your feature request related to a problem? Please describe
      placeholder: |-
        1. A clear and concise description of what the problem is. Ex. I'm always frustrated when
    validations:
      required: true
  - type: textarea
    id: Solution
    attributes:
      label: Solution Proposed
      description: A clear and concise description of what you want to happen.
    validations:
      required: true
  - type: textarea
    id: Alternatives
    attributes:
      label: Describe alternatives you've considered
      description: A clear and concise description of any alternative solutions or features you've considered.
    validations:
      required: true
  - type: textarea
    id: Context
    attributes:
      label: Additional context
      description: Add any other context or screenshots about the feature request here.
    validations:
      required: false
20
.github/ISSUE_TEMPLATE/feature_request.md
vendored
Normal file
@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement, status/needs-triage
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
2
.github/dependabot.yml
vendored
@@ -8,7 +8,7 @@ updates:
  - package-ecosystem: "pip" # See documentation for possible values
    directory: "/" # Location of package manifests
    schedule:
      interval: "weekly"
      interval: "daily"
    target-branch: master
    labels:
      - "dependencies"
178
.github/workflows/build-lint-push-containers.yml
vendored
@@ -15,17 +15,36 @@ on:
env:
  AWS_REGION_STG: eu-west-1
  AWS_REGION_PLATFORM: eu-west-1
  AWS_REGION: us-east-1
  AWS_REGION_PRO: us-east-1
  IMAGE_NAME: prowler
  LATEST_TAG: latest
  STABLE_TAG: stable
  TEMPORARY_TAG: temporary
  DOCKERFILE_PATH: ./Dockerfile
  PYTHON_VERSION: 3.9

jobs:
  # Lint Dockerfile using Hadolint
  # dockerfile-linter:
  #   runs-on: ubuntu-latest
  #   steps:
  #     -
  #       name: Checkout
  #       uses: actions/checkout@v3
  #     -
  #       name: Install Hadolint
  #       run: |
  #         VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
  #           grep '"tag_name":' | \
  #           sed -E 's/.*"v([^"]+)".*/\1/' \
  #         ) && curl -L -o /tmp/hadolint https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64 \
  #         && chmod +x /tmp/hadolint
  #     -
  #       name: Run Hadolint
  #       run: |
  #         /tmp/hadolint util/Dockerfile

  # Build Prowler OSS container
  container-build-push:
  container-build:
    # needs: dockerfile-linter
    runs-on: ubuntu-latest
    env:
@@ -33,30 +52,87 @@ jobs:
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup python (release)
      - name: setup python (release)
        if: github.event_name == 'release'
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.PYTHON_VERSION }}

          python-version: 3.9 #install the python needed
      - name: Install dependencies (release)
        if: github.event_name == 'release'
        run: |
          pipx install poetry
          pipx inject poetry poetry-bumpversion

      - name: Update Prowler version (release)
        if: github.event_name == 'release'
        run: |
          poetry version ${{ github.event.release.tag_name }}
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Build
        uses: docker/build-push-action@v2
        with:
          # Without pushing to registries
          push: false
          tags: ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}
          file: ${{ env.DOCKERFILE_PATH }}
          outputs: type=docker,dest=/tmp/${{ env.IMAGE_NAME }}.tar
      - name: Share image between jobs
        uses: actions/upload-artifact@v2
        with:
          name: ${{ env.IMAGE_NAME }}.tar
          path: /tmp/${{ env.IMAGE_NAME }}.tar

  # Lint Prowler OSS container using Dockle
  # container-linter:
  #   needs: container-build
  #   runs-on: ubuntu-latest
  #   steps:
  #     -
  #       name: Get container image from shared
  #       uses: actions/download-artifact@v2
  #       with:
  #         name: ${{ env.IMAGE_NAME }}.tar
  #         path: /tmp
  #     -
  #       name: Load Docker image
  #       run: |
  #         docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
  #         docker image ls -a
  #     -
  #       name: Install Dockle
  #       run: |
  #         VERSION=$(curl --silent "https://api.github.com/repos/goodwithtech/dockle/releases/latest" | \
  #           grep '"tag_name":' | \
  #           sed -E 's/.*"v([^"]+)".*/\1/' \
  #         ) && curl -L -o dockle.deb https://github.com/goodwithtech/dockle/releases/download/v${VERSION}/dockle_${VERSION}_Linux-64bit.deb \
  #         && sudo dpkg -i dockle.deb && rm dockle.deb
  #     -
  #       name: Run Dockle
  #       run: dockle ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }}

  # Push Prowler OSS container to registries
  container-push:
    # needs: container-linter
    needs: container-build
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read # This is required for actions/checkout
    steps:
      - name: Get container image from shared
        uses: actions/download-artifact@v2
        with:
          name: ${{ env.IMAGE_NAME }}.tar
          path: /tmp
      - name: Load Docker image
        run: |
          docker load --input /tmp/${{ env.IMAGE_NAME }}.tar
          docker image ls -a
      - name: Login to DockerHub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Login to Public ECR
        uses: docker/login-action@v2
        with:
@@ -64,53 +140,55 @@ jobs:
          username: ${{ secrets.PUBLIC_ECR_AWS_ACCESS_KEY_ID }}
          password: ${{ secrets.PUBLIC_ECR_AWS_SECRET_ACCESS_KEY }}
        env:
          AWS_REGION: ${{ env.AWS_REGION }}
          AWS_REGION: ${{ env.AWS_REGION_PRO }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Build and push container image (latest)
        if: github.event_name == 'push'
        uses: docker/build-push-action@v2
        with:
          push: true
          tags: |
            ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
            ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
          file: ${{ env.DOCKERFILE_PATH }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Build and push container image (release)
        if: github.event_name == 'release'
        uses: docker/build-push-action@v2
        with:
          # Use local context to get changes
          # https://github.com/docker/build-push-action#path-context
          context: .
          push: true
          tags: |
            ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
            ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
            ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
            ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
          file: ${{ env.DOCKERFILE_PATH }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  dispatch-action:
    needs: container-build-push
    runs-on: ubuntu-latest
    steps:
      - name: Get latest commit info
      - name: Tag (latest)
        if: github.event_name == 'push'
        run: |
          LATEST_COMMIT_HASH=$(echo ${{ github.event.after }} | cut -b -7)
          echo "LATEST_COMMIT_HASH=${LATEST_COMMIT_HASH}" >> $GITHUB_ENV
          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}

      - # Push to master branch - push "latest" tag
        name: Push (latest)
        if: github.event_name == 'push'
        run: |
          docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}
          docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.LATEST_TAG }}

      - # Tag the new release (stable and release tag)
        name: Tag (release)
        if: github.event_name == 'release'
        run: |
          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}

          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
          docker tag ${{ env.IMAGE_NAME }}:${{ env.TEMPORARY_TAG }} ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}

      - # Push the new release (stable and release tag)
        name: Push (release)
        if: github.event_name == 'release'
        run: |
          docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}
          docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ github.event.release.tag_name }}

          docker push ${{ secrets.DOCKER_HUB_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}
          docker push ${{ secrets.PUBLIC_ECR_REPOSITORY }}/${{ env.IMAGE_NAME }}:${{ env.STABLE_TAG }}

      - name: Delete artifacts
        if: always()
        uses: geekyeggo/delete-artifact@v1
        with:
          name: ${{ env.IMAGE_NAME }}.tar

  dispatch-action:
    needs: container-push
    runs-on: ubuntu-latest
    steps:
      - name: Dispatch event for latest
        if: github.event_name == 'push'
        run: |
          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest", "tag": "${{ env.LATEST_COMMIT_HASH }}"}}'
          curl https://api.github.com/repos/${{ secrets.DISPATCH_OWNER }}/${{ secrets.DISPATCH_REPO }}/dispatches -H "Accept: application/vnd.github+json" -H "Authorization: Bearer ${{ secrets.ACCESS_TOKEN }}" -H "X-GitHub-Api-Version: 2022-11-28" --data '{"event_type":"dispatch","client_payload":{"version":"latest"}'
      - name: Dispatch event for release
        if: github.event_name == 'release'
        run: |
10
.github/workflows/pull-request.yml
vendored
@@ -17,17 +17,14 @@ jobs:

    steps:
      - uses: actions/checkout@v3
      - name: Install poetry
        run: |
          python -m pip install --upgrade pip
          pipx install poetry
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          cache: 'poetry'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install poetry
          poetry install
          poetry run pip list
          VERSION=$(curl --silent "https://api.github.com/repos/hadolint/hadolint/releases/latest" | \
@@ -35,9 +32,6 @@ jobs:
            sed -E 's/.*"v([^"]+)".*/\1/' \
          ) && curl -L -o /tmp/hadolint "https://github.com/hadolint/hadolint/releases/download/v${VERSION}/hadolint-Linux-x86_64" \
          && chmod +x /tmp/hadolint
      - name: Poetry check
        run: |
          poetry lock --check
      - name: Lint with flake8
        run: |
          poetry run flake8 . --ignore=E266,W503,E203,E501,W605,E128 --exclude contrib
42
.github/workflows/pypi-release.yml
vendored
@@ -6,6 +6,7 @@ on:

env:
  RELEASE_TAG: ${{ github.event.release.tag_name }}
  GITHUB_BRANCH: master

jobs:
  release-prowler-job:
@@ -16,16 +17,16 @@ jobs:
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v3

        with:
          ref: ${{ env.GITHUB_BRANCH }}
      - name: setup python
        uses: actions/setup-python@v2
        with:
          python-version: 3.9 #install the python needed
      - name: Install dependencies
        run: |
          pipx install poetry
          pipx inject poetry poetry-bumpversion
      - name: setup python
        uses: actions/setup-python@v4
        with:
          python-version: 3.9
          cache: 'poetry'
      - name: Change version and Build package
        run: |
          poetry version ${{ env.RELEASE_TAG }}
@@ -40,11 +41,20 @@ jobs:
        run: |
          poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
          poetry publish
      - name: Replicate PyPi Package
        run: |
          rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
          python util/replicate_pypi_package.py
          poetry build
      - name: Publish prowler-cloud package to PyPI
        run: |
          poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
          poetry publish
      # Create pull request with new version
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v4
        with:
          token: ${{ secrets.PROWLER_ACCESS_TOKEN }}
          token: ${{ secrets.GITHUB_TOKEN }}
          commit-message: "chore(release): update Prowler Version to ${{ env.RELEASE_TAG }}."
          branch: release-${{ env.RELEASE_TAG }}
          labels: "status/waiting-for-revision, severity/low"
@@ -57,21 +67,3 @@ jobs:
          ### License

          By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
      - name: Replicate PyPi Package
        run: |
          rm -rf ./dist && rm -rf ./build && rm -rf prowler.egg-info
          pip install toml
          python util/replicate_pypi_package.py
          poetry build
      - name: Publish prowler-cloud package to PyPI
        run: |
          poetry config pypi-token.pypi ${{ secrets.PYPI_API_TOKEN }}
          poetry publish
      # Create pull request to github.com/Homebrew/homebrew-core to update prowler formula
      - name: Bump Homebrew formula
        uses: mislav/bump-homebrew-formula-action@v2
        with:
          formula-name: prowler
          base-branch: release-${{ env.RELEASE_TAG }}
        env:
          COMMITTER_TOKEN: ${{ secrets.PROWLER_ACCESS_TOKEN }}
@@ -19,7 +19,6 @@ repos:
    hooks:
      - id: pretty-format-toml
        args: [--autofix]
        files: pyproject.toml

## BASH
  - repo: https://github.com/koalaman/shellcheck-precommit
@@ -56,11 +55,10 @@ repos:
        exclude: contrib
        args: ["--ignore=E266,W503,E203,E501,W605"]

  - repo: https://github.com/python-poetry/poetry
    rev: 1.4.0 # add version here
  - repo: https://github.com/haizaar/check-pipfile-lock
    rev: v0.0.5
    hooks:
      - id: poetry-check
      - id: poetry-lock
      - id: check-pipfile-lock

  - repo: https://github.com/hadolint/hadolint
    rev: v2.12.1-beta
22
README.md
@@ -13,13 +13,12 @@
<a href="https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog"><img alt="Slack Shield" src="https://img.shields.io/badge/slack-prowler-brightgreen.svg?logo=slack"></a>
<a href="https://pypi.org/project/prowler-cloud/"><img alt="Python Version" src="https://img.shields.io/pypi/v/prowler.svg"></a>
<a href="https://pypi.python.org/pypi/prowler-cloud/"><img alt="Python Version" src="https://img.shields.io/pypi/pyversions/prowler.svg"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg?label=prowler%20downloads"></a>
<a href="https://pypistats.org/packages/prowler-cloud"><img alt="PyPI Prowler-Cloud Downloads" src="https://img.shields.io/pypi/dw/prowler-cloud.svg?label=prowler-cloud%20downloads"></a>
<a href="https://formulae.brew.sh/formula/prowler#default"><img alt="Brew Prowler Downloads" src="https://img.shields.io/homebrew/installs/dm/prowler?label=brew%20downloads"></a>
<a href="https://pypistats.org/packages/prowler"><img alt="PyPI Prowler Downloads" src="https://img.shields.io/pypi/dw/prowler.svg"></a>
<a href="https://pypistats.org/packages/prowler-cloud"><img alt="PyPI Prowler-Cloud Downloads" src="https://img.shields.io/pypi/dw/prowler-cloud.svg"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/cloud/build/toniblyx/prowler"></a>
<a href="https://hub.docker.com/r/toniblyx/prowler"><img alt="Docker" src="https://img.shields.io/docker/image-size/toniblyx/prowler"></a>
<a href="https://gallery.ecr.aws/prowler-cloud/prowler"><img width="120" height=19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
<a href="https://gallery.ecr.aws/o4g1s5r6/prowler"><img width="120" height=19" alt="AWS ECR Gallery" src="https://user-images.githubusercontent.com/3985464/151531396-b6535a68-c907-44eb-95a1-a09508178616.png"></a>
</p>
<p align="center">
<a href="https://github.com/prowler-cloud/prowler"><img alt="Repo size" src="https://img.shields.io/github/repo-size/prowler-cloud/prowler"></a>
@@ -38,13 +37,8 @@

It contains hundreds of controls covering CIS, PCI-DSS, ISO27001, GDPR, HIPAA, FFIEC, SOC2, AWS FTR, ENS and custom security frameworks.

# 📖 Documentation

The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud)

## Looking for Prowler v2 documentation?
For Prowler v2 Documentation, please go to https://github.com/prowler-cloud/prowler/tree/2.12.1.

# ⚙️ Install

## Pip package
@@ -54,7 +48,6 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
pip install prowler
prowler -v
```
More details at https://docs.prowler.cloud

## Containers

@@ -63,11 +56,11 @@ The available versions of Prowler are the following:
- `latest`: in sync with master branch (bear in mind that it is not a stable version)
- `<x.y.z>` (release): you can find the releases [here](https://github.com/prowler-cloud/prowler/releases), those are stable releases.
- `stable`: this tag always point to the latest release.


The container images are available here:

- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
- [AWS Public ECR](https://gallery.ecr.aws/o4g1s5r6/prowler)

## From Github

@@ -81,6 +74,11 @@ poetry install
python prowler.py -v
```

# 📖 Documentation

The full documentation can now be found at [https://docs.prowler.cloud](https://docs.prowler.cloud)


# 📐✏️ High level architecture

You can run Prowler from your workstation, an EC2 instance, Fargate or any other container, Codebuild, CloudShell and Cloud9.
Binary file not shown. (Before: 631 KiB)
Binary file not shown. (Before: 320 KiB)

BIN docs/img/quick-inventory.png (Normal file)
Binary file not shown. (After: 220 KiB)
@@ -87,23 +87,6 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
    prowler -v
    ```

=== "GitHub"

    _Requirements for Developers_:

    * AWS and/or Azure credentials
    * `git`, `Python >= 3.9`, `pip` and `poetry` installed (`pip install poetry`)

    _Commands_:

    ```
    git clone https://github.com/prowler-cloud/prowler
    cd prowler
    poetry shell
    poetry install
    python prowler.py -v
    ```

=== "Amazon Linux 2"

    _Requirements_:
@@ -120,20 +103,6 @@ Prowler is available as a project in [PyPI](https://pypi.org/project/prowler-clo
    prowler -v
    ```

=== "Brew"

    _Requirements_:

    * `Brew` installed in your Mac or Linux
    * AWS and/or Azure credentials

    _Commands_:

    ``` bash
    brew install prowler
    prowler -v
    ```

=== "AWS CloudShell"

    Prowler can be easily executed in AWS CloudShell, but it has some prerequisites to be able to do so. AWS CloudShell is a container running with `Amazon Linux release 2 (Karoo)` that comes with Python 3.7; since Prowler requires Python >= 3.9, we need to first install a newer version of Python. Follow the steps below to successfully execute Prowler v3 in AWS CloudShell:
@@ -185,7 +154,7 @@ The available versions of Prowler are the following:
The container images are available here:

- [DockerHub](https://hub.docker.com/r/toniblyx/prowler/tags)
- [AWS Public ECR](https://gallery.ecr.aws/prowler-cloud/prowler)
- [AWS Public ECR](https://gallery.ecr.aws/o4g1s5r6/prowler)

## High level architecture
@@ -1,6 +1,6 @@
# Scan Multiple AWS Accounts

Prowler can scan multiple accounts when it is executed from one account that can assume a role in those given accounts to scan using [Assume Role feature](role-assumption.md) and [AWS Organizations integration feature](organizations.md).
Prowler can scan multiple accounts when it is ejecuted from one account that can assume a role in those given accounts to scan using [Assume Role feature](role-assumption.md) and [AWS Organizations integration feature](organizations.md).


## Scan multiple specific accounts sequentially
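The sequential pattern described above can be sketched as a shell loop that assumes the cross-account role in each target account. The account IDs and role name below are placeholders, and the `-R` role flag is an assumption based on Prowler v3's assume-role options; here the loop only prints the commands it would run (a dry run):

```shell
#!/bin/sh
# Sketch: one scan per target account via an assumed cross-account role.
# ROLE_NAME and the account IDs are hypothetical placeholder values.
ROLE_NAME="ProwlerExecRole"
for ACCOUNT_ID in 111111111111 222222222222; do
  # Print the command instead of executing it (dry run).
  echo "prowler aws -R arn:aws:iam::${ACCOUNT_ID}:role/${ROLE_NAME}"
done
```

Dropping the `echo` would run the scans one after another; the calling credentials must be allowed to assume the role in every target account.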
@@ -18,13 +18,13 @@ Before sending findings to Prowler, you will need to perform next steps:
Once it is enabled, it is as simple as running the command below (for all regions):

```sh
prowler aws -S
./prowler aws -S
```

or for only one filtered region like eu-west-1:

```sh
prowler -S -f eu-west-1
./prowler -S -f eu-west-1
```

> **Note 1**: It is recommended to send only fails to Security Hub and that is possible adding `-q` to the command.
@@ -43,5 +43,5 @@ By default, Prowler archives all its findings in Security Hub that have not appeared
You can skip this logic by using the option `--skip-sh-update` so Prowler will not archive older findings:

```sh
prowler -S --skip-sh-update
./prowler -S --skip-sh-update
```
@@ -31,7 +31,7 @@ checks_v3_to_v2_mapping = {
    "awslambda_function_url_cors_policy": "extra7180",
    "awslambda_function_url_public": "extra7179",
    "awslambda_function_using_supported_runtimes": "extra762",
    "cloudformation_stack_outputs_find_secrets": "extra742",
    "cloudformation_outputs_find_secrets": "extra742",
    "cloudformation_stacks_termination_protection_enabled": "extra7154",
    "cloudfront_distributions_field_level_encryption_enabled": "extra767",
    "cloudfront_distributions_geo_restrictions_enabled": "extra732",
@@ -113,6 +113,7 @@ checks_v3_to_v2_mapping = {
    "ec2_securitygroup_allow_wide_open_public_ipv4": "extra778",
    "ec2_securitygroup_default_restrict_traffic": "check43",
    "ec2_securitygroup_from_launch_wizard": "extra7173",
    "ec2_securitygroup_in_use_without_ingress_filtering": "extra74",
    "ec2_securitygroup_not_used": "extra75",
    "ec2_securitygroup_with_many_ingress_egress_rules": "extra777",
    "ecr_repositories_lifecycle_policy_enabled": "extra7194",
@@ -137,6 +138,7 @@ checks_v3_to_v2_mapping = {
    "elbv2_internet_facing": "extra79",
    "elbv2_listeners_underneath": "extra7158",
    "elbv2_logging_enabled": "extra717",
    "elbv2_request_smugling": "extra7142",
    "elbv2_ssl_listeners": "extra793",
    "elbv2_waf_acl_attached": "extra7129",
    "emr_cluster_account_public_block_enabled": "extra7178",
@@ -81,4 +81,36 @@ Standard results will be shown and additionally the framework information as the

## Create and contribute adding other Security Frameworks

This information is part of the Developer Guide and can be found here: https://docs.prowler.cloud/en/latest/tutorials/developer-guide/.
If you want to create your own security framework for Prowler, or contribute a public one, first make sure the required checks are available; if they are not, you will have to create them. Then create a compliance file per provider, as in `prowler/compliance/aws/`, name it `<framework>_<version>_<provider>.json`, and follow the format below.

Each version of a framework file has the following high-level structure, so that each framework can be generally identified. One requirement can also be called one control, and one requirement can be linked to multiple Prowler checks:

- `Framework`: string. Distinctive name of the framework, like CIS.
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI, etc.
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Includes all requirements or controls with the mapping to Prowler.
    - `Requirements_Id`: string. Unique identifier per requirement in the specific framework.
    - `Requirements_Description`: string. Description as in the framework.
    - `Requirements_Attributes`: array of objects. Includes all the attributes needed per requirement, like levels, sections, etc. Whatever helps to create a dedicated report with the result of the findings. Attributes should be taken as closely as possible from the framework's own terminology.
    - `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. If no automation is possible, this can be empty.

```
{
  "Framework": "<framework>-<provider>",
  "Version": "<version>",
  "Requirements": [
    {
      "Id": "<unique-id>",
      "Description": "Requirement full description",
      "Checks": [
        "Here go the Prowler check or checks that are going to be executed"
      ],
      "Attributes": [
        {
          <Add here your custom attributes.>
        }
      ]
    }
  ]
}
```

Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.
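
The structure above can be sanity-checked before contributing a new framework file. This is a minimal sketch (not part of Prowler; `validate_framework` and the sample content are invented for illustration) that verifies a compliance document has the high-level keys of the template:

```python
import json

# Expected high-level shape, per the JSON template above.
REQUIRED_TOP_KEYS = {"Framework", "Version", "Requirements"}
REQUIRED_REQUIREMENT_KEYS = {"Id", "Description", "Checks", "Attributes"}


def validate_framework(doc):
    """Return a list of problems; an empty list means the shape looks right."""
    problems = [
        f"missing top-level key: {key}" for key in sorted(REQUIRED_TOP_KEYS - doc.keys())
    ]
    for index, requirement in enumerate(doc.get("Requirements", [])):
        problems += [
            f"requirement {index} missing key: {key}"
            for key in sorted(REQUIRED_REQUIREMENT_KEYS - requirement.keys())
        ]
    return problems


sample = json.loads("""{
  "Framework": "CIS-aws",
  "Version": "1.5",
  "Requirements": [
    {
      "Id": "2.2.1",
      "Description": "Ensure EBS volume encryption is enabled",
      "Checks": ["ec2_ebs_volume_encryption"],
      "Attributes": [{"Section": "Storage"}]
    }
  ]
}""")

print(validate_framework(sample))  # → []
```

A broken file reports each missing key, which makes review of contributed frameworks faster.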

@@ -1,281 +0,0 @@
# Developer Guide

You can extend Prowler in many different ways. In most cases you will want to create your own checks and compliance security frameworks; here is where you can learn how to get started with it. We also cover how to create custom outputs, integrations and more.

## Get the code and install all dependencies

First of all, you need Python 3.9 or higher and pip installed to be able to install all the required dependencies. Once that is satisfied, go ahead and clone the repo:

```
git clone https://github.com/prowler-cloud/prowler
cd prowler
```
For isolation and to avoid conflicts with other environments, we recommend using `poetry`:
```
pip install poetry
```
Then install all dependencies, including the ones for developers:
```
poetry install
poetry shell
```

## Contributing with your code or fixes to Prowler

This repo has git pre-commit hooks managed via the pre-commit tool. Install it however you like, then in the root of this repo run:
```
pre-commit install
```
You should get an output like the following:
```
pre-commit installed at .git/hooks/pre-commit
```

Before we merge any of your pull requests we run checks on the code. We use the following tools and automation to make sure the code is secure and dependencies are up to date (these should already be installed if you ran `poetry install`):

- `bandit` for code security review.
- `safety` and `dependabot` for dependencies.
- `hadolint` and `dockle` for our containers security.
- `snyk` in Docker Hub.
- `clair` in Amazon ECR.
- `vulture`, `flake8`, `black` and `pylint` for formatting and best practices.

You can see all dependencies in the `pyproject.toml` file.

## Create a new check for a Provider

### If the check you want to create belongs to an existing service

To create a new check, you will need to create a folder inside the specific service, i.e. `prowler/providers/<provider>/services/<service>/<check_name>/`, with the name of the check following the pattern `service_subservice_action`.
Inside that folder, create the following files:

- An empty `__init__.py`: to make Python treat this check folder as a package.
- A `check_name.py` containing the check's logic, for example:
```
# Import the Check_Report of the specific provider
from prowler.lib.check.models import Check, Check_Report_AWS
# Import the client of the specific service
from prowler.providers.aws.services.ec2.ec2_client import ec2_client


# Create the class for the check
class ec2_ebs_volume_encryption(Check):
    def execute(self):
        findings = []
        # Iterate over the service's assets to be analyzed
        for volume in ec2_client.volumes:
            # Initialize a Check Report for each item and assign the region, resource_id, resource_arn and resource_tags
            report = Check_Report_AWS(self.metadata())
            report.region = volume.region
            report.resource_id = volume.id
            report.resource_arn = volume.arn
            report.resource_tags = volume.tags
            # Write the logic with conditions and create a PASS and a FAIL with a status and a status_extended
            if volume.encrypted:
                report.status = "PASS"
                report.status_extended = f"EBS volume {volume.id} is encrypted."
            else:
                report.status = "FAIL"
                report.status_extended = f"EBS volume {volume.id} is unencrypted."
            findings.append(report)  # Append a report for each item

        return findings
```
- A `check_name.metadata.json` containing the check's metadata, for example:
```
{
  "Provider": "aws",
  "CheckID": "ec2_ebs_volume_encryption",
  "CheckTitle": "Ensure there are no EBS Volumes unencrypted.",
  "CheckType": [
    "Data Protection"
  ],
  "ServiceName": "ec2",
  "SubServiceName": "volume",
  "ResourceIdTemplate": "arn:partition:service:region:account-id:resource-id",
  "Severity": "medium",
  "ResourceType": "AwsEc2Volume",
  "Description": "Ensure there are no EBS Volumes unencrypted.",
  "Risk": "Data encryption at rest prevents data visibility in the event of its unauthorized access or theft.",
  "RelatedUrl": "",
  "Remediation": {
    "Code": {
      "CLI": "",
      "NativeIaC": "",
      "Other": "",
      "Terraform": ""
    },
    "Recommendation": {
      "Text": "Encrypt all EBS volumes and enable encryption by default. You can configure your AWS account to enforce the encryption of the new EBS volumes and snapshot copies that you create. For example, Amazon EBS encrypts the EBS volumes created when you launch an instance and the snapshots that you copy from an unencrypted snapshot.",
      "Url": "https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSEncryption.html"
    }
  },
  "Categories": [
    "encryption"
  ],
  "DependsOn": [],
  "RelatedTo": [],
  "Notes": ""
}
```
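
The check pattern above can be exercised locally without AWS by stubbing the service client. A minimal sketch of the same PASS/FAIL loop (the `FakeVolume` and `FakeReport` names are invented for illustration; real checks use `Check`, `Check_Report_AWS` and the real service client):

```python
from dataclasses import dataclass


@dataclass
class FakeVolume:
    """Stands in for an EC2 volume gathered by the service class."""
    id: str
    region: str
    encrypted: bool


@dataclass
class FakeReport:
    """Stands in for Check_Report_AWS."""
    region: str = ""
    resource_id: str = ""
    status: str = ""
    status_extended: str = ""


def execute(volumes):
    """Mirror the execute() loop above: one report per resource, PASS or FAIL."""
    findings = []
    for volume in volumes:
        report = FakeReport(region=volume.region, resource_id=volume.id)
        if volume.encrypted:
            report.status = "PASS"
            report.status_extended = f"EBS volume {volume.id} is encrypted."
        else:
            report.status = "FAIL"
            report.status_extended = f"EBS volume {volume.id} is unencrypted."
        findings.append(report)
    return findings


volumes = [FakeVolume("vol-1", "eu-west-1", True), FakeVolume("vol-2", "eu-west-1", False)]
print([finding.status for finding in execute(volumes)])  # → ['PASS', 'FAIL']
```

Writing a quick harness like this for your own check is a cheap way to verify the PASS/FAIL logic before wiring it to a real client.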

### If the check you want to create belongs to a service not supported already by Prowler you will need to create a new service first

To create a new service, you will need to create a folder inside the specific provider, i.e. `prowler/providers/<provider>/services/<service>/`.
Inside that folder, create the following files:

- An empty `__init__.py`: to make Python treat this service folder as a package.
- A `<service>_service.py`, containing all the service's logic and API calls:
```
# You must import the following libraries
import threading
from typing import Optional

from pydantic import BaseModel

from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.aws_provider import generate_regional_clients


# Create a class for the Service
################## <Service>
class <Service>:
    def __init__(self, audit_info):
        self.service = "<service>"  # The name of the service boto3 client
        self.session = audit_info.audit_session
        self.audited_account = audit_info.audited_account
        self.audit_resources = audit_info.audit_resources
        self.regional_clients = generate_regional_clients(self.service, audit_info)
        self.<items> = []  # Create an empty list of the items to be gathered, e.g., instances
        self.__threading_call__(self.__describe_<items>__)
        self.__describe_<item>__()  # Optionally you can create another function to retrieve more data about each item

    def __get_session__(self):
        return self.session

    def __threading_call__(self, call):
        threads = []
        for regional_client in self.regional_clients.values():
            threads.append(threading.Thread(target=call, args=(regional_client,)))
        for t in threads:
            t.start()
        for t in threads:
            t.join()

    def __describe_<items>__(self, regional_client):
        """Get ALL <Service> <Items>"""
        logger.info("<Service> - Describing <Items>...")
        try:
            describe_<items>_paginator = regional_client.get_paginator("describe_<items>")  # Paginator to get every item
            for page in describe_<items>_paginator.paginate():
                for <item> in page["<Items>"]:
                    if not self.audit_resources or (
                        is_resource_filtered(<item>["<item_arn>"], self.audit_resources)
                    ):
                        self.<items>.append(
                            <Item>(
                                arn=<item>["<item_arn>"],
                                name=<item>["<item_name>"],
                                tags=<item>.get("Tags", []),
                                region=regional_client.region,
                            )
                        )
        except Exception as error:
            logger.error(
                f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )

    def __describe_<item>__(self):
        """Get Details for a <Service> <Item>"""
        logger.info("<Service> - Describing <Item> to get specific details...")
        try:
            for <item> in self.<items>:
                <item>_details = self.regional_clients[<item>.region].describe_<item>(
                    <Attribute>=<item>.name
                )
                # For example, check if the item is Public
                <item>.public = <item>_details.get("Public", False)
        except Exception as error:
            logger.error(
                f"{<item>.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )


class <Item>(BaseModel):
    """<Item> holds a <Service> <Item>"""

    arn: str
    """<Items>[].Arn"""
    name: str
    """<Items>[].Name"""
    public: bool
    """<Items>[].Public"""
    tags: Optional[list] = []
    region: str
```
- A `<service>_client.py`, containing the initialization of the service's class we have just created so the service's checks can use it:
```
from prowler.providers.aws.lib.audit_info.audit_info import current_audit_info
from prowler.providers.aws.services.<service>.<service>_service import <Service>

<service>_client = <Service>(current_audit_info)
```
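
The `__threading_call__` helper in the template above simply fans one function out across every regional client and waits for all the threads to finish. A self-contained sketch of the same pattern, with plain region names and an invented `describe_items` standing in for the real regional clients and API calls:

```python
import threading

# Pretend inventory per region, standing in for paginated API responses.
regional_items = {"eu-west-1": ["i-1", "i-2"], "us-east-1": ["i-3"]}
collected = []
lock = threading.Lock()


def describe_items(region):
    # Each thread gathers the items of one region, like __describe_<items>__
    for item in regional_items[region]:
        with lock:  # appends come from several threads, so guard the shared list
            collected.append((region, item))


def threading_call(call, regions):
    # Same shape as __threading_call__: start one thread per region, join all
    threads = [threading.Thread(target=call, args=(region,)) for region in regions]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()


threading_call(describe_items, regional_items)
print(sorted(collected))
```

After `threading_call` returns, all regions have been scanned, which is why the service's `__init__` can safely hand the populated list to the checks.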

## Create a new security compliance framework

If you want to create your own security framework for Prowler, or contribute a public one, first make sure the required checks are available; if they are not, you will have to create them. Then create a compliance file per provider, as in `prowler/compliance/aws/`, name it `<framework>_<version>_<provider>.json`, and follow the format below.

Each version of a framework file has the following high-level structure, so that each framework can be generally identified. One requirement can also be called one control, and one requirement can be linked to multiple Prowler checks:

- `Framework`: string. Distinctive name of the framework, like CIS.
- `Provider`: string. Provider where the framework applies, such as AWS, Azure, OCI, etc.
- `Version`: string. Version of the framework itself, like 1.4 for CIS.
- `Requirements`: array of objects. Includes all requirements or controls with the mapping to Prowler.
    - `Requirements_Id`: string. Unique identifier per requirement in the specific framework.
    - `Requirements_Description`: string. Description as in the framework.
    - `Requirements_Attributes`: array of objects. Includes all the attributes needed per requirement, like levels, sections, etc. Whatever helps to create a dedicated report with the result of the findings. Attributes should be taken as closely as possible from the framework's own terminology.
    - `Requirements_Checks`: array. Prowler checks that are needed to prove this requirement. It can be one or multiple checks. If no automation is possible, this can be empty.

```
{
  "Framework": "<framework>-<provider>",
  "Version": "<version>",
  "Requirements": [
    {
      "Id": "<unique-id>",
      "Description": "Requirement full description",
      "Checks": [
        "Here go the Prowler check or checks that are going to be executed"
      ],
      "Attributes": [
        {
          <Add here your custom attributes.>
        }
      ]
    },
    ...
  ]
}
```

Finally, to have a proper output file for your reports, your framework data model has to be created in `prowler/lib/outputs/models.py` and also the CLI table output in `prowler/lib/outputs/compliance.py`.


## Create a custom output format

## Create a new integration

## Contribute with documentation

We use `mkdocs` to build this Prowler documentation site so you can easily contribute back with new docs or improvements to existing ones.

1. Install `mkdocs` with your favorite package manager.
2. Inside the `prowler` repository folder run `mkdocs serve`, then point your browser to `http://localhost:8000`: you will see live changes to your local copy of this documentation site.
3. Make all needed changes to the docs or add new documents. To do so, just edit the existing md files inside `prowler/docs`; if you are adding a new section or file, please make sure you add it to the `mkdocs.yaml` file in the root folder of the Prowler repo.
4. Once you are done with your changes, please send us a pull request for review and merge. Thank you in advance!

## Want some swag as appreciation for your contribution?

If you are like us and you love swag, we are happy to thank you for your contribution with some laptop stickers or whatever other swag we may have at that time. Please tell us more details and share your pull request link in our [Slack workspace here](https://join.slack.com/t/prowler-workspace/shared_invite/zt-1hix76xsl-2uq222JIXrC7Q8It~9ZNog). You can also reach out to Toni de la Fuente on Twitter [here](https://twitter.com/ToniBlyx), his DMs are open.
@@ -52,15 +52,14 @@ prowler <provider> -e/--excluded-checks ec2 rds
prowler <provider> -C/--checks-file <checks_list>.json
```

## Severities
Each of Prowler's checks has a severity, which can be:
- informational
- low
- medium
- high
- critical
## Severities
Each check of Prowler has a severity; there are options related to it:

To execute specific severities:
- List the available severities in the provider:
```console
prowler <provider> --list-severities
```
- Execute specific severities:
```console
prowler <provider> --severity critical high
```
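
Conceptually, severity filtering is just a membership test on each check's metadata. A rough sketch of the idea (the metadata dicts here are illustrative, not Prowler's real models):

```python
# Tiny stand-in for the bulk checks metadata keyed by check name.
checks_metadata = {
    "ec2_ebs_volume_encryption": {"Severity": "medium"},
    "iam_root_mfa_enabled": {"Severity": "critical"},
    "s3_bucket_public_access": {"Severity": "high"},
}


def filter_by_severity(metadata, severities):
    """Keep only the checks whose Severity is in the requested set."""
    wanted = set(severities)
    return sorted(
        check for check, meta in metadata.items() if meta["Severity"] in wanted
    )


print(filter_by_severity(checks_metadata, ["critical", "high"]))
# → ['iam_root_mfa_enabled', 's3_bucket_public_access']
```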

@@ -33,8 +33,9 @@ Several checks analyse resources that are exposed to the Internet, these are:
- ec2_instance_internet_facing_with_instance_profile
- ec2_instance_public_ip
- ec2_networkacl_allow_ingress_any_port
- ec2_securitygroup_allow_wide_open_public_ipv4
- ec2_securitygroup_allow_ingress_from_internet_to_any_port
- ec2_securitygroup_allow_wide_open_public_ipv4
- ec2_securitygroup_in_use_without_ingress_filtering
- ecr_repositories_not_publicly_accessible
- eks_control_plane_endpoint_access_restricted
- eks_endpoints_not_publicly_accessible

@@ -14,6 +14,4 @@ prowler <provider> -i

- Also, by default it creates a CSV and a JSON file with detailed information about the extracted resources.

![Prowler Quick Inventory](https://user-images.githubusercontent.com/38561120/191261390-f7a3b73b-d177-42d0-9eba-45d9f1a34bc6.png)

> The inventorying process is done with `resourcegroupstaggingapi` calls (except for the IAM resources, which are retrieved with Boto3 API calls).

![Prowler Quick Inventory output files](https://user-images.githubusercontent.com/38561120/191260204-fcaa7c67-6ad0-4b48-9a8d-9e43b9014031.png)

@@ -46,11 +46,9 @@ Prowler supports natively the following output formats:

Hereunder is the structure for each of the supported report formats by Prowler:

### HTML

![HTML Output](https://user-images.githubusercontent.com/38561120/193997685-018aa266-92ee-4eef-b1d9-b2d2feaab25f.png)
### CSV
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | COMPLIANCE | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |
| ASSESSMENT_START_TIME | FINDING_UNIQUE_ID | PROVIDER | PROFILE | ACCOUNT_ID | ACCOUNT_NAME | ACCOUNT_EMAIL | ACCOUNT_ARN | ACCOUNT_ORG | ACCOUNT_TAGS | REGION | CHECK_ID | CHECK_TITLE | CHECK_TYPE | STATUS | STATUS_EXTENDED | SERVICE_NAME | SUBSERVICE_NAME | SEVERITY | RESOURCE_ID | RESOURCE_ARN | RESOURCE_TYPE | RESOURCE_DETAILS | RESOURCE_TAGS | DESCRIPTION | RISK | RELATED_URL | REMEDIATION_RECOMMENDATION_TEXT | REMEDIATION_RECOMMENDATION_URL | REMEDIATION_RECOMMENDATION_CODE_NATIVEIAC | REMEDIATION_RECOMMENDATION_CODE_TERRAFORM | REMEDIATION_RECOMMENDATION_CODE_CLI | REMEDIATION_RECOMMENDATION_CODE_OTHER | CATEGORIES | DEPENDS_ON | RELATED_TO | NOTES |
| ------- | ----------- | ------ | -------- | ------------ | ----------- | ---------- | ---------- | --------------------- | -------------------------- | -------------- | ----------------- | ------------------------ | --------------- | ---------- | ----------------- | --------- | -------------- | ----------------- | ------------------ | --------------------- | -------------------- | ------------------- | ------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- | -------------------- |

### JSON

@@ -73,10 +71,6 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"Severity": "low",
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceTags": {
"test": "test",
"environment": "dev"
},
"ResourceType": "AwsRdsDbInstance",
"ResourceDetails": "",
"Description": "Ensure RDS instances have minor version upgrade enabled.",
@@ -95,15 +89,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": "",
"Compliance": {
"CIS-1.4": [
"1.20"
],
"CIS-1.5": [
"1.20"
]
}
"Notes": ""
},{
"AssessmentStartTime": "2022-12-01T14:16:57.354413",
"FindingUniqueId": "",
@@ -123,7 +109,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
"ResourceId": "rds-instance-id",
"ResourceArn": "",
"ResourceType": "AwsRdsDbInstance",
"ResourceTags": {},
"ResourceDetails": "",
"Description": "Ensure RDS instances have minor version upgrade enabled.",
"Risk": "Auto Minor Version Upgrade is a feature that you can enable to have your database automatically upgraded when a new minor database engine version is available. Minor version upgrades often patch security vulnerabilities and fix bugs and therefore should be applied.",
"RelatedUrl": "https://aws.amazon.com/blogs/database/best-practices-for-upgrading-amazon-rds-to-major-and-minor-versions-of-postgresql/",
@@ -140,8 +126,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
}
},
"Categories": [],
"Notes": "",
"Compliance": {}
"Notes": ""
}]
```

@@ -181,30 +166,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
"RelatedRequirements": []
},
"Remediation": {
"Recommendation": {
@@ -243,30 +205,7 @@ Hereunder is the structure for each of the supported report formats by Prowler:
],
"Compliance": {
"Status": "PASSED",
"RelatedRequirements": [
"CISA your-systems-2 booting-up-thing-to-do-first-3",
"CIS-1.5 2.3.2",
"AWS-Foundational-Security-Best-Practices rds",
"RBI-Cyber-Security-Framework annex_i_6",
"FFIEC d3-cc-pm-b-1 d3-cc-pm-b-3"
],
"AssociatedStandards": [
{
"StandardsId": "CISA"
},
{
"StandardsId": "CIS-1.5"
},
{
"StandardsId": "AWS-Foundational-Security-Best-Practices"
},
{
"StandardsId": "RBI-Cyber-Security-Framework"
},
{
"StandardsId": "FFIEC"
}
]
"RelatedRequirements": []
},
"Remediation": {
"Recommendation": {

@@ -37,7 +37,6 @@ nav:
- Logging: tutorials/logging.md
- Allowlist: tutorials/allowlist.md
- Pentesting: tutorials/pentesting.md
- Developer Guide: tutorials/developer-guide.md
- AWS:
- Assume Role: tutorials/aws/role-assumption.md
- AWS Security Hub: tutorials/aws/securityhub.md
@@ -51,7 +50,6 @@ nav:
- Azure:
- Authentication: tutorials/azure/authentication.md
- Subscriptions: tutorials/azure/subscriptions.md
- Developer Guide: tutorials/developer-guide.md
- Security: security.md
- Contact Us: contact.md
- Troubleshooting: troubleshooting.md

@@ -4,7 +4,7 @@ AWSTemplateFormatVersion: '2010-09-09'
# aws cloudformation create-stack \
# --capabilities CAPABILITY_IAM --capabilities CAPABILITY_NAMED_IAM \
# --template-body "file://create_role_to_assume_cfn.yaml" \
# --stack-name "ProwlerScanRole" \
# --stack-name "ProwlerExecRole" \
# --parameters "ParameterKey=AuthorisedARN,ParameterValue=arn:aws:iam::123456789012:root"
#
Description: |
@@ -13,7 +13,7 @@ Description: |
account to assume that role. The role name and the ARN of the trusted user can all be passed
to the CloudFormation stack as parameters. Then you can run Prowler to perform a security
assessment with a command like:
prowler --role ProwlerScanRole.ARN
./prowler -A <THIS_ACCOUNT_ID> -R ProwlerExecRole
Parameters:
AuthorisedARN:
Description: |
@@ -22,12 +22,12 @@ Parameters:
Type: String
ProwlerRoleName:
Description: |
Name of the IAM role that will have these policies attached. Default: ProwlerScanRole
Name of the IAM role that will have these policies attached. Default: ProwlerExecRole
Type: String
Default: 'ProwlerScanRole'
Default: 'ProwlerExecRole'

Resources:
ProwlerScanRole:
ProwlerExecRole:
Type: AWS::IAM::Role
Properties:
AssumeRolePolicyDocument:
@@ -42,49 +42,31 @@ Resources:
# Bool:
# 'aws:MultiFactorAuthPresent': true
# This is 12h that is maximum allowed, Minimum is 3600 = 1h
# to take advantage of this use -T like in './prowler --role ProwlerScanRole.ARN -T 43200'
# to take advantage of this use -T like in './prowler -A <ACCOUNT_ID_TO_ASSUME> -R ProwlerExecRole -T 43200 -M text,html'
MaxSessionDuration: 43200
ManagedPolicyArns:
- 'arn:aws:iam::aws:policy/SecurityAudit'
- 'arn:aws:iam::aws:policy/job-function/ViewOnlyAccess'
RoleName: !Sub ${ProwlerRoleName}
Policies:
- PolicyName: ProwlerScanRoleAdditionalViewPrivileges
- PolicyName: ProwlerExecRoleAdditionalViewPrivileges
PolicyDocument:
Version : '2012-10-17'
Statement:
- Effect: Allow
Action:
- 'account:Get*'
- 'appstream:Describe*'
- 'appstream:List*'
- 'codeartifact:List*'
- 'codebuild:BatchGet*'
- 'ds:Get*'
- 'ds:Describe*'
- 'ds:List*'
- 'ds:ListAuthorizedApplications'
- 'ec2:GetEbsEncryptionByDefault'
- 'ecr:Describe*'
- 'elasticfilesystem:DescribeBackupPolicy'
- 'glue:GetConnections'
- 'glue:GetSecurityConfiguration*'
- 'glue:GetSecurityConfiguration'
- 'glue:SearchTables'
- 'lambda:GetFunction*'
- 'macie2:GetMacieSession'
- 'lambda:GetFunction'
- 's3:GetAccountPublicAccessBlock'
- 'shield:DescribeProtection'
- 'shield:GetSubscriptionState'
- 'securityhub:BatchImportFindings'
- 'securityhub:GetFindings'
- 'ssm:GetDocument'
- 'support:Describe*'
- 'tag:GetTagKeys'
Resource: '*'
- PolicyName: ProwlerScanRoleAdditionalViewPrivilegesApiGateway
PolicyDocument:
Version : '2012-10-17'
Statement:
- Effect: Allow
Action:
- 'apigateway:GET'
Resource: 'arn:aws:apigateway:*::/restapis/*'

@@ -3,9 +3,7 @@
"Statement": [
{
"Action": [
"account:Get*",
"appstream:Describe*",
"appstream:List*",
"codeartifact:List*",
"codebuild:BatchGet*",
"ds:Describe*",

poetry.lock (generated, 2981 lines): file diff suppressed because it is too large.

@@ -10,6 +10,8 @@ from prowler.lib.check.check import (
|
||||
exclude_checks_to_run,
|
||||
exclude_services_to_run,
|
||||
execute_checks,
|
||||
get_checks_from_input_arn,
|
||||
get_regions_from_audit_resources,
|
||||
list_categories,
|
||||
list_services,
|
||||
print_categories,
|
||||
@@ -27,16 +29,13 @@ from prowler.lib.outputs.html import add_html_footer, fill_html_overview_statist
|
||||
from prowler.lib.outputs.json import close_json
|
||||
from prowler.lib.outputs.outputs import extract_findings_statistics, send_to_s3_bucket
|
||||
from prowler.lib.outputs.summary_table import display_summary_table
|
||||
from prowler.providers.aws.lib.allowlist.allowlist import parse_allowlist_file
|
||||
from prowler.providers.aws.lib.quick_inventory.quick_inventory import quick_inventory
|
||||
from prowler.providers.aws.lib.security_hub.security_hub import (
    resolve_security_hub_previous_findings,
)
from prowler.providers.common.allowlist import set_provider_allowlist
from prowler.providers.common.audit_info import (
    set_provider_audit_info,
    set_provider_execution_parameters,
)
from prowler.providers.common.audit_info import set_provider_audit_info
from prowler.providers.common.outputs import set_provider_output_options
from prowler.providers.common.quick_inventory import run_provider_quick_inventory


def prowler():
@@ -81,19 +80,26 @@ def prowler():
    # Load compliance frameworks
    logger.debug("Loading compliance frameworks from .json files")

    bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
    # Complete checks metadata with the compliance framework specification
    update_checks_metadata_with_compliance(
        bulk_compliance_frameworks, bulk_checks_metadata
    )
    if args.list_compliance:
        print_compliance_frameworks(bulk_compliance_frameworks)
        sys.exit()
    if args.list_compliance_requirements:
        print_compliance_requirements(
            bulk_compliance_frameworks, args.list_compliance_requirements
    # Load the compliance framework if specified with --compliance
    # If some compliance argument is specified we have to load it
    if (
        args.list_compliance
        or args.list_compliance_requirements
        or compliance_framework
    ):
        bulk_compliance_frameworks = bulk_load_compliance_frameworks(provider)
        # Complete checks metadata with the compliance framework specification
        update_checks_metadata_with_compliance(
            bulk_compliance_frameworks, bulk_checks_metadata
        )
        sys.exit()
    if args.list_compliance:
        print_compliance_frameworks(bulk_compliance_frameworks)
        sys.exit()
    if args.list_compliance_requirements:
        print_compliance_requirements(
            bulk_compliance_frameworks, args.list_compliance_requirements
        )
        sys.exit()

    # Load checks to execute
    checks_to_execute = load_checks_to_execute(
@@ -129,22 +135,29 @@ def prowler():
    # Set the audit info based on the selected provider
    audit_info = set_provider_audit_info(provider, args.__dict__)

    # Once the audit_info is set and we have the eventual checks based on the resource identifier,
    # it is time to check what Prowler's checks are going to be executed
    # Once the audit_info is set and we have the eventual checks from arn, it is time to exclude the others
    if audit_info.audit_resources:
        checks_to_execute = set_provider_execution_parameters(provider, audit_info)
        audit_info.audited_regions = get_regions_from_audit_resources(
            audit_info.audit_resources
        )
        checks_to_execute = get_checks_from_input_arn(
            audit_info.audit_resources, provider
        )

    # Parse Allowlist
    allowlist_file = set_provider_allowlist(provider, audit_info, args)
    # Parse content from Allowlist file and get it, if necessary, from S3
    if provider == "aws" and args.allowlist_file:
        allowlist_file = parse_allowlist_file(audit_info, args.allowlist_file)
    else:
        allowlist_file = None

    # Set output options based on the selected provider
    # Setting output options based on the selected provider
    audit_output_options = set_provider_output_options(
        provider, args, audit_info, allowlist_file, bulk_checks_metadata
    )

    # Run the quick inventory for the provider if available
    if hasattr(args, "quick_inventory") and args.quick_inventory:
        run_provider_quick_inventory(provider, audit_info, args.output_directory)
    # Quick Inventory for AWS
    if provider == "aws" and args.quick_inventory:
        quick_inventory(audit_info, args.output_directory)
        sys.exit()

    # Execute checks

@@ -533,7 +533,6 @@
            "Id": "2.1.5",
            "Description": "Ensure that S3 Buckets are configured with 'Block public access (bucket settings)'",
            "Checks": [
                "s3_bucket_level_public_access_block",
                "s3_account_level_public_access_blocks"
            ],
            "Attributes": [

@@ -533,7 +533,6 @@
            "Id": "2.1.5",
            "Description": "Ensure that S3 Buckets are configured with 'Block public access (bucket settings)'",
            "Checks": [
                "s3_bucket_level_public_access_block",
                "s3_account_level_public_access_blocks"
            ],
            "Attributes": [

@@ -1626,7 +1626,7 @@
        }
    ],
    "Checks": [
        "ec2_securitygroup_allow_ingress_from_internet_to_any_port"
        "ec2_securitygroup_in_use_without_ingress_filtering"
    ]
},
{

@@ -1,17 +1,15 @@
import json
import os
import pathlib
from datetime import datetime, timezone
from os import getcwd

import requests
import yaml

from prowler.lib.logger import logger

timestamp = datetime.today()
timestamp_utc = datetime.now(timezone.utc).replace(tzinfo=timezone.utc)
prowler_version = "3.3.4"
prowler_version = "3.2.2"
html_logo_url = "https://github.com/prowler-cloud/prowler/"
html_logo_img = "https://user-images.githubusercontent.com/3985464/113734260-7ba06900-96fb-11eb-82bc-d4f68a1e2710.png"

@@ -19,18 +17,29 @@ orange_color = "\033[38;5;208m"
banner_color = "\033[1;92m"

# Compliance
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
compliance_aws_dir = f"{actual_directory}/../compliance/aws"
available_compliance_frameworks = []
with os.scandir(compliance_aws_dir) as files:
    files = [
        file.name
        for file in files
        if file.is_file()
        and file.name.endswith(".json")
        and available_compliance_frameworks.append(file.name.removesuffix(".json"))
    ]

compliance_specification_dir = "./compliance"
available_compliance_frameworks = [
    "ens_rd2022_aws",
    "cis_1.4_aws",
    "cis_1.5_aws",
    "aws_audit_manager_control_tower_guardrails_aws",
    "aws_foundational_security_best_practices_aws",
    "cisa_aws",
    "fedramp_low_revision_4_aws",
    "fedramp_moderate_revision_4_aws",
    "ffiec_aws",
    "gdpr_aws",
    "gxp_eu_annex_11_aws",
    "gxp_21_cfr_part_11_aws",
    "hipaa_aws",
    "nist_800_53_revision_4_aws",
    "nist_800_53_revision_5_aws",
    "nist_800_171_revision_2_aws",
    "nist_csf_1.1_aws",
    "pci_3.2.1_aws",
    "rbi_cyber_security_framework_aws",
    "soc2_aws",
]
# AWS services-regions matrix json
aws_services_json_file = "aws_regions_by_service.json"

@@ -45,20 +54,6 @@ html_file_suffix = ".html"
config_yaml = f"{pathlib.Path(os.path.dirname(os.path.realpath(__file__)))}/config.yaml"


def check_current_version(prowler_version):
    try:
        release_response = requests.get(
            "https://api.github.com/repos/prowler-cloud/prowler/tags"
        )
        latest_version = json.loads(release_response)[0]["name"]
        if latest_version != prowler_version:
            return f"(latest is {latest_version}, upgrade for the latest features)"
        else:
            return "(it is the latest version, yay!)"
    except Exception:
        return ""
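The `check_current_version` function above compares the running version with the latest GitHub tag as plain strings (note that `json.loads(release_response)` would need the response body rather than the response object). A minimal offline sketch of just the message logic, with no network call and an assumed helper name:

```python
def version_message(current: str, latest: str) -> str:
    # Mirrors the message selection in check_current_version, without the HTTP request
    if latest != current:
        return f"(latest is {latest}, upgrade for the latest features)"
    return "(it is the latest version, yay!)"

print(version_message("3.2.2", "3.3.4"))
```

Because the comparison is a plain string inequality, any tag that differs from the local version (even an older one) would trigger the upgrade message.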


def change_config_var(variable, value):
    try:
        with open(config_yaml) as f:

@@ -108,8 +108,8 @@ def exclude_services_to_run(
# Load checks from checklist.json
def parse_checks_from_file(input_file: str, provider: str) -> set:
    checks_to_execute = set()
    with open_file(input_file) as f:
        json_file = parse_json_file(f)
    f = open_file(input_file)
    json_file = parse_json_file(f)

    for check_name in json_file[provider]:
        checks_to_execute.add(check_name)
@@ -122,10 +122,7 @@ def list_services(provider: str) -> set():
    checks_tuple = recover_checks_from_provider(provider)
    for _, check_path in checks_tuple:
        # Format: /absolute_path/prowler/providers/{provider}/services/{service_name}/{check_name}
        if os.name == "nt":
            service_name = check_path.split("\\")[-2]
        else:
            service_name = check_path.split("/")[-2]
        service_name = check_path.split("/")[-2]
        available_services.add(service_name)
    return sorted(available_services)

@@ -356,22 +353,6 @@ def execute_checks(
        audit_progress=0,
    )

    if os.name != "nt":
        try:
            from resource import RLIMIT_NOFILE, getrlimit

            # Check ulimit for the maximum system open files
            soft, _ = getrlimit(RLIMIT_NOFILE)
            if soft < 4096:
                logger.warning(
                    f"Your session file descriptors limit ({soft} open files) is below 4096. We recommend to increase it to avoid errors. Solve it running this command `ulimit -n 4096`. For more info visit https://docs.prowler.cloud/en/latest/troubleshooting/"
                )
        except Exception as error:
            logger.error("Unable to retrieve ulimit default settings")
            logger.error(
                f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
            )
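The hunk above touches a guard that warns when the soft file-descriptor limit is too low for a large parallel scan. A self-contained sketch of that check (assumed function name; the `resource` module is Unix-only, hence the `os.name` guard as in the original):

```python
import os
from typing import Optional


def fd_limit_warning(threshold: int = 4096) -> Optional[str]:
    """Return a warning string when the soft RLIMIT_NOFILE is below threshold, else None."""
    if os.name == "nt":
        return None  # the resource module does not exist on Windows
    from resource import RLIMIT_NOFILE, getrlimit

    soft, _hard = getrlimit(RLIMIT_NOFILE)
    if soft < threshold:
        return (
            f"file descriptor limit ({soft} open files) is below {threshold}; "
            f"raise it with `ulimit -n {threshold}`"
        )
    return None
```

Raising the soft limit only affects the current shell session; a permanent change belongs in `limits.conf` or the service unit.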

    # Execution with the --only-logs flag
    if audit_output_options.only_logs:
        for check_name in checks_to_execute:
@@ -523,3 +504,68 @@ def recover_checks_from_service(service_list: list, provider: str) -> list:
            # if service_name in group_list: checks_from_arn.add(check_name)
            checks.add(check_name)
    return checks


def get_checks_from_input_arn(audit_resources: list, provider: str) -> set:
    """get_checks_from_input_arn gets the list of checks from the input arns"""
    checks_from_arn = set()
    # Handle if there are audit resources so only their services are executed
    if audit_resources:
        services_without_subservices = ["guardduty", "kms", "s3", "elb"]
        service_list = set()
        sub_service_list = set()
        for resource in audit_resources:
            service = resource.split(":")[2]
            sub_service = resource.split(":")[5].split("/")[0].replace("-", "_")

            if (
                service != "wafv2" and service != "waf"
            ):  # WAF Services does not have checks
                # Parse services when they are different in the ARNs
                if service == "lambda":
                    service = "awslambda"
                if service == "elasticloadbalancing":
                    service = "elb"
                elif service == "logs":
                    service = "cloudwatch"
                service_list.add(service)

                # Get subservices to execute only applicable checks
                if service not in services_without_subservices:
                    # Parse some specific subservices
                    if service == "ec2":
                        if sub_service == "security_group":
                            sub_service = "securitygroup"
                        if sub_service == "network_acl":
                            sub_service = "networkacl"
                        if sub_service == "image":
                            sub_service = "ami"
                    if service == "rds":
                        if sub_service == "cluster_snapshot":
                            sub_service = "snapshot"
                    sub_service_list.add(sub_service)
                else:
                    sub_service_list.add(service)

        checks = recover_checks_from_service(service_list, provider)

        # Filter only checks with audited subservices
        for check in checks:
            if any(sub_service in check for sub_service in sub_service_list):
                if not (sub_service == "policy" and "password_policy" in check):
                    checks_from_arn.add(check)

    # Return final checks list
    return sorted(checks_from_arn)


def get_regions_from_audit_resources(audit_resources: list) -> list:
    """get_regions_from_audit_resources gets the regions from the audit resources arns"""
    audited_regions = []
    for resource in audit_resources:
        region = resource.split(":")[3]
        if region and region not in audited_regions:  # Check if arn has a region
            audited_regions.append(region)
    if audited_regions:
        return audited_regions
    return None
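Both helpers above slice fields out of ARN strings by position (`arn:partition:service:region:account-id:resource`). A compact stand-alone sketch of that colon-splitting, under a hypothetical helper name:

```python
def parse_arn_fields(arn: str) -> dict:
    # Same split(":") indices used by get_checks_from_input_arn / get_regions_from_audit_resources
    parts = arn.split(":")
    return {
        "service": parts[2],
        "region": parts[3],
        "account": parts[4],
        # sub-service is the first path segment of the resource part, dashes normalized
        "sub_service": parts[5].split("/")[0].replace("-", "_"),
    }


fields = parse_arn_fields("arn:aws:ec2:eu-west-1:123456789012:security-group/sg-123")
```

Note that region-less ARNs (for example IAM or S3) leave `parts[3]` empty, which is why the region collector above checks `if region` before appending.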

@@ -1,12 +1,10 @@
import sys

from pydantic import parse_obj_as

from prowler.lib.check.compliance_models import (
    Compliance_Base_Model,
    Compliance_Requirement,
)
from prowler.lib.check.models import Check_Metadata_Model
from prowler.lib.check.models import Check_Report_AWS
from prowler.lib.logger import logger


@@ -64,33 +62,44 @@ def update_checks_metadata_with_compliance(
                # Include the compliance framework for the check
                check_compliance.append(compliance)
        # Create metadata for Manual Control
        manual_check_metadata = {
            "Provider": "aws",
            "CheckID": "manual_check",
            "CheckTitle": "Manual Check",
            "CheckType": [],
            "ServiceName": "",
            "SubServiceName": "",
            "ResourceIdTemplate": "",
            "Severity": "",
            "ResourceType": "",
            "Description": "",
            "Risk": "",
            "RelatedUrl": "",
        manual_check_metadata = """{
            "Provider" : "aws",
            "CheckID" : "manual_check",
            "CheckTitle" : "Manual Check",
            "CheckType" : [],
            "ServiceName" : "",
            "SubServiceName" : "",
            "ResourceIdTemplate" : "",
            "Severity" : "",
            "ResourceType" : "",
            "Description" : "",
            "Risk" : "",
            "RelatedUrl" : "",
            "Remediation": {
                "Code": {"CLI": "", "NativeIaC": "", "Other": "", "Terraform": ""},
                "Recommendation": {"Text": "", "Url": ""},
                "Code": {
                    "CLI": "",
                    "NativeIaC": "",
                    "Other": "",
                    "Terraform": ""
                },
                "Recommendation": {
                    "Text": "",
                    "Url": ""
                }
            },
            "Categories": [],
            "Tags": {},
            "DependsOn": [],
            "RelatedTo": [],
            "Notes": "",
        }
        manual_check = parse_obj_as(Check_Metadata_Model, manual_check_metadata)
            "Categories" : [],
            "Tags" : {},
            "DependsOn" : [],
            "RelatedTo" : [],
            "Notes" : ""
        }"""
        manual_check = Check_Report_AWS(manual_check_metadata)
        manual_check.status = "INFO"
        manual_check.status_extended = "Manual check"
        manual_check.resource_id = "manual_check"
        manual_check.Compliance = check_compliance
        # Save it into the check's metadata
        bulk_checks_metadata["manual_check"] = manual_check
        bulk_checks_metadata["manual_check"].Compliance = check_compliance

        return bulk_checks_metadata
    except Exception as e:

@@ -48,6 +48,7 @@ class Check_Metadata_Model(BaseModel):
    RelatedUrl: str
    Remediation: Remediation
    Categories: list[str]
    Tags: dict
    DependsOn: list[str]
    RelatedTo: list[str]
    Notes: str

@@ -4,7 +4,6 @@ from argparse import RawTextHelpFormatter

from prowler.config.config import (
    available_compliance_frameworks,
    check_current_version,
    default_output_directory,
    prowler_version,
)
@@ -37,7 +36,7 @@ Detailed documentation at https://docs.prowler.cloud
    "-v",
    "--version",
    action="version",
    version=f"Prowler {prowler_version} {check_current_version(prowler_version)}",
    version=f"Prowler {prowler_version}",
    help="show Prowler version",
)
# Common arguments parser

@@ -5,7 +5,6 @@ from colorama import Fore, Style
from tabulate import tabulate

from prowler.config.config import orange_color, timestamp
from prowler.lib.check.models import Check_Report
from prowler.lib.logger import logger
from prowler.lib.outputs.models import (
    Check_Output_CSV_CIS,
@@ -19,13 +18,7 @@ def add_manual_controls(output_options, audit_info, file_descriptors):
    try:
        # Check if MANUAL control was already added to output
        if "manual_check" in output_options.bulk_checks_metadata:
            manual_finding = Check_Report(
                output_options.bulk_checks_metadata["manual_check"].json()
            )
            manual_finding.status = "INFO"
            manual_finding.status_extended = "Manual check"
            manual_finding.resource_id = "manual_check"
            manual_finding.region = ""
            manual_finding = output_options.bulk_checks_metadata["manual_check"]
            fill_compliance(
                output_options, manual_finding, audit_info, file_descriptors
            )

@@ -9,12 +9,6 @@ from prowler.config.config import (
    timestamp,
)
from prowler.lib.logger import logger
from prowler.lib.outputs.models import (
    get_check_compliance,
    parse_html_string,
    unroll_dict,
    unroll_tags,
)
from prowler.lib.utils.utils import open_file


@@ -189,16 +183,16 @@ def add_html_header(file_descriptor, audit_info):
                <tr>
                    <th scope="col">Status</th>
                    <th scope="col">Severity</th>
                    <th scope="col">Service Name</th>
                    <th style="width:5%" scope="col">Service Name</th>
                    <th scope="col">Region</th>
                    <th style="width:20%" scope="col">Check ID</th>
                    <th style="width:20%" scope="col">Check Title</th>
                    <th scope="col">Resource ID</th>
                    <th scope="col">Resource Tags</th>
                    <th style="width:15%" scope="col">Check Description</th>
                    <th scope="col">Check ID</th>
                    <th scope="col">Status Extended</th>
                    <th scope="col">Risk</th>
                    <th scope="col">Recomendation</th>
                    <th scope="col">Compliance</th>
                    <th style="width:5%" scope="col">Recomendation URL</th>
                </tr>
            </thead>
            <tbody>
@@ -210,7 +204,7 @@ def add_html_header(file_descriptor, audit_info):
    )


def fill_html(file_descriptor, finding, output_options):
def fill_html(file_descriptor, finding):
    row_class = "p-3 mb-2 bg-success-custom"
    if finding.status == "INFO":
        row_class = "table-info"
@@ -225,14 +219,14 @@ def fill_html(file_descriptor, finding, output_options):
            <td>{finding.check_metadata.Severity}</td>
            <td>{finding.check_metadata.ServiceName}</td>
            <td>{finding.region}</td>
            <td>{finding.check_metadata.CheckID.replace("_", "<wbr>_")}</td>
            <td>{finding.check_metadata.CheckTitle}</td>
            <td>{finding.resource_id.replace("<", "&lt;").replace(">", "&gt;").replace("_", "<wbr>_")}</td>
            <td>{parse_html_string(unroll_tags(finding.resource_tags))}</td>
            <td>{finding.check_metadata.Description}</td>
            <td>{finding.check_metadata.CheckID.replace("_", "<wbr>_")}</td>
            <td>{finding.status_extended.replace("<", "&lt;").replace(">", "&gt;").replace("_", "<wbr>_")}</td>
            <td><p class="show-read-more">{finding.check_metadata.Risk}</p></td>
            <td><p class="show-read-more">{finding.check_metadata.Remediation.Recommendation.Text}</p> <a class="read-more" href="{finding.check_metadata.Remediation.Recommendation.Url}"><i class="fas fa-external-link-alt"></i></a></td>
            <td><p class="show-read-more">{parse_html_string(unroll_dict(get_check_compliance(finding, finding.check_metadata.Provider, output_options)))}</p></td>
            <td><p class="show-read-more">{finding.check_metadata.Remediation.Recommendation.Text}</p></td>
            <td><a class="read-more" href="{finding.check_metadata.Remediation.Recommendation.Url}"><i class="fas fa-external-link-alt"></i></a></td>
        </tr>
    """
    )

@@ -8,17 +8,11 @@ from prowler.config.config import (
    timestamp_utc,
)
from prowler.lib.logger import logger
from prowler.lib.outputs.models import (
    Compliance,
    ProductFields,
    Resource,
    Severity,
    get_check_compliance,
)
from prowler.lib.outputs.models import Compliance, ProductFields, Resource, Severity
from prowler.lib.utils.utils import hash_sha512, open_file


def fill_json_asff(finding_output, audit_info, finding, output_options):
def fill_json_asff(finding_output, audit_info, finding):
    # Check if there are no resources in the finding
    if finding.resource_arn == "":
        if finding.resource_id == "":
@@ -46,22 +40,14 @@ def fill_json_asff(finding_output, audit_info, finding, output_options):
            Region=finding.region,
        )
    ]
    # Iterate for each compliance framework
    compliance_summary = []
    associated_standards = []
    check_compliance = get_check_compliance(finding, "aws", output_options)
    for key, value in check_compliance.items():
        associated_standards.append({"StandardsId": key})
        item = f"{key} {' '.join(value)}"
        if len(item) > 64:
            item = item[0:63]
        compliance_summary.append(item)

    # Check if any Requirement has > 64 characters
    check_types = []
    for type in finding.check_metadata.CheckType:
        check_types.extend(type.split("/"))
    # Add ED to PASS or FAIL (PASSED/FAILED)
    finding_output.Compliance = Compliance(
        Status=finding.status + "ED",
        AssociatedStandards=associated_standards,
        RelatedRequirements=compliance_summary,
        RelatedRequirements=check_types,
    )
    finding_output.Remediation = {
        "Recommendation": finding.check_metadata.Remediation.Recommendation

@@ -11,26 +11,7 @@ from prowler.lib.logger import logger
from prowler.providers.aws.lib.audit_info.models import AWS_Organizations_Info


def get_check_compliance(finding, provider, output_options):
    check_compliance = {}
    # We have to retrieve all the check's compliance requirements
    for compliance in output_options.bulk_checks_metadata[
        finding.check_metadata.CheckID
    ].Compliance:
        compliance_fw = compliance.Framework
        if compliance.Version:
            compliance_fw = f"{compliance_fw}-{compliance.Version}"
        if compliance.Provider == provider.upper():
            if compliance_fw not in check_compliance:
                check_compliance[compliance_fw] = []
            for requirement in compliance.Requirements:
                check_compliance[compliance_fw].append(requirement.Id)
    return check_compliance
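`get_check_compliance` above folds a check's compliance metadata into a `{framework-version: [requirement ids]}` map. A sketch of the same fold with plain dicts standing in for the pydantic compliance models (the item shape here is an assumption, not the repository's API):

```python
def map_compliance(compliance_items, provider="AWS"):
    """Build {framework[-version]: [requirement ids]} for one provider."""
    check_compliance = {}
    for item in compliance_items:
        framework = item["Framework"]
        if item.get("Version"):
            framework = f"{framework}-{item['Version']}"
        if item["Provider"] == provider:
            # setdefault plays the role of the "if not in dict: []" dance above
            check_compliance.setdefault(framework, []).extend(item["Requirements"])
    return check_compliance


result = map_compliance([
    {"Framework": "CIS", "Version": "1.5", "Provider": "AWS", "Requirements": ["2.1.5"]},
    {"Framework": "GDPR", "Version": "", "Provider": "AWS", "Requirements": ["article_32"]},
])
```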


def generate_provider_output_csv(
    provider: str, finding, audit_info, mode: str, fd, output_options
):
def generate_provider_output_csv(provider: str, finding, audit_info, mode: str, fd):
    """
    set_provider_output_options configures automatically the outputs based on the selected provider and returns the Provider_Output_Options object.
    """
@@ -51,9 +32,6 @@ def generate_provider_output_csv(
        data[
            "finding_unique_id"
        ] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.subscription}-{finding.resource_id}"
        data["compliance"] = unroll_dict(
            get_check_compliance(finding, provider, output_options)
        )
        finding_output = output_model(**data)

    if provider == "aws":
@@ -65,9 +43,6 @@ def generate_provider_output_csv(
        data[
            "finding_unique_id"
        ] = f"prowler-{provider}-{finding.check_metadata.CheckID}-{audit_info.audited_account}-{finding.region}-{finding.resource_id}"
        data["compliance"] = unroll_dict(
            get_check_compliance(finding, provider, output_options)
        )
        finding_output = output_model(**data)

        if audit_info.organizations_metadata:
@@ -116,7 +91,7 @@ def fill_common_data_csv(finding: dict) -> dict:
        "severity": finding.check_metadata.Severity,
        "resource_type": finding.check_metadata.ResourceType,
        "resource_details": finding.resource_details,
        "resource_tags": unroll_tags(finding.resource_tags),
        "resource_tags": finding.resource_tags,
        "description": finding.check_metadata.Description,
        "risk": finding.check_metadata.Risk,
        "related_url": finding.check_metadata.RelatedUrl,
@@ -138,99 +113,26 @@ def fill_common_data_csv(finding: dict) -> dict:
        "remediation_recommendation_code_other": (
            finding.check_metadata.Remediation.Code.Other
        ),
        "categories": unroll_list(finding.check_metadata.Categories),
        "depends_on": unroll_list(finding.check_metadata.DependsOn),
        "related_to": unroll_list(finding.check_metadata.RelatedTo),
        "categories": __unroll_list__(finding.check_metadata.Categories),
        "depends_on": __unroll_list__(finding.check_metadata.DependsOn),
        "related_to": __unroll_list__(finding.check_metadata.RelatedTo),
        "notes": finding.check_metadata.Notes,
    }
    return data


def unroll_list(listed_items: list):
def __unroll_list__(listed_items: list):
    unrolled_items = ""
    separator = "|"
    if listed_items:
        for item in listed_items:
            if not unrolled_items:
                unrolled_items = f"{item}"
            else:
                unrolled_items = f"{unrolled_items} {separator} {item}"

    return unrolled_items


def unroll_tags(tags: list):
    unrolled_items = ""
    separator = "|"
    if tags:
        for item in tags:
            # Check if there are tags in list
            if type(item) == dict:
                for key, value in item.items():
                    if not unrolled_items:
                        # Check the pattern of tags (Key:Value or Key:key/Value:value)
                        if "Key" != key and "Value" != key:
                            unrolled_items = f"{key}={value}"
                        else:
                            if "Key" == key:
                                unrolled_items = f"{value}="
                            else:
                                unrolled_items = f"{value}"
                    else:
                        if "Key" != key and "Value" != key:
                            unrolled_items = (
                                f"{unrolled_items} {separator} {key}={value}"
                            )
                        else:
                            if "Key" == key:
                                unrolled_items = (
                                    f"{unrolled_items} {separator} {value}="
                                )
                            else:
                                unrolled_items = f"{unrolled_items}{value}"
            elif not unrolled_items:
                unrolled_items = f"{item}"
            else:
                unrolled_items = f"{unrolled_items} {separator} {item}"

    return unrolled_items


def unroll_dict(dict: dict):
    unrolled_items = ""
    separator = "|"
    for key, value in dict.items():
        if type(value) == list:
            value = ", ".join(value)
    for item in listed_items:
        if not unrolled_items:
            unrolled_items = f"{key}: {value}"
            unrolled_items = f"{item}"
        else:
            unrolled_items = f"{unrolled_items} {separator} {key}: {value}"
            unrolled_items = f"{unrolled_items}{separator}{item}"

    return unrolled_items
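Both generations of the unroll helpers above flatten lists and dicts into one pipe-separated string so they fit a single CSV cell. A minimal stand-alone version of that behaviour (assumed names, built on `str.join` instead of the accumulator loop):

```python
def unroll(values, separator=" | "):
    # Join list items into one CSV-friendly cell value
    return separator.join(str(v) for v in values)


def unroll_mapping(mapping, separator=" | "):
    # Join "key: value" pairs, expanding list values first
    parts = []
    for key, value in mapping.items():
        if isinstance(value, list):
            value = ", ".join(value)
        parts.append(f"{key}: {value}")
    return separator.join(parts)
```

`str.join` sidesteps the "is this the first item?" branching that both original implementations carry.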


def parse_html_string(str: str):
    string = ""
    for elem in str.split(" | "):
        if elem:
            string += f"\n•{elem}\n"

    return string


def parse_json_tags(tags: list):
    dict_tags = {}
    if tags and tags != [{}] and tags != [None]:
        for tag in tags:
            if "Key" in tag and "Value" in tag:
                dict_tags[tag["Key"]] = tag["Value"]
            else:
                dict_tags.update(tag)

    return dict_tags


def generate_csv_fields(format: Any) -> list[str]:
    """Generates the CSV headers for the given class"""
    csv_fields = []
@@ -260,7 +162,7 @@ class Check_Output_CSV(BaseModel):
    severity: str
    resource_type: str
    resource_details: str
    resource_tags: str
    resource_tags: list
    description: str
    risk: str
    related_url: str
@@ -270,7 +172,6 @@ class Check_Output_CSV(BaseModel):
    remediation_recommendation_code_terraform: str
    remediation_recommendation_code_cli: str
    remediation_recommendation_code_other: str
    compliance: str
    categories: str
    depends_on: str
    related_to: str
@@ -305,9 +206,7 @@ class Azure_Check_Output_CSV(Check_Output_CSV):
    resource_name: str = ""


def generate_provider_output_json(
    provider: str, finding, audit_info, mode: str, output_options
):
def generate_provider_output_json(provider: str, finding, audit_info, mode: str, fd):
    """
    generate_provider_output_json configures automatically the outputs based on the selected provider and returns the Check_Output_JSON object.
    """
@@ -329,9 +228,6 @@ def generate_provider_output_json(
        finding_output.ResourceId = finding.resource_id
        finding_output.ResourceName = finding.resource_name
        finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{finding.subscription}-{finding.resource_id}"
        finding_output.Compliance = get_check_compliance(
            finding, provider, output_options
        )

    if provider == "aws":
        finding_output.Profile = audit_info.profile
@@ -339,11 +235,7 @@
        finding_output.Region = finding.region
        finding_output.ResourceId = finding.resource_id
        finding_output.ResourceArn = finding.resource_arn
        finding_output.ResourceTags = parse_json_tags(finding.resource_tags)
        finding_output.FindingUniqueId = f"prowler-{provider}-{finding.check_metadata.CheckID}-{audit_info.audited_account}-{finding.region}-{finding.resource_id}"
        finding_output.Compliance = get_check_compliance(
            finding, provider, output_options
        )

        if audit_info.organizations_metadata:
            finding_output.OrganizationsInfo = (
@@ -379,11 +271,11 @@ class Check_Output_JSON(BaseModel):
    Severity: str
    ResourceType: str
    ResourceDetails: str = ""
    Tags: dict
    Description: str
    Risk: str
    RelatedUrl: str
    Remediation: Remediation
    Compliance: Optional[dict]
    Categories: List[str]
    DependsOn: List[str]
    RelatedTo: List[str]
@@ -401,7 +293,6 @@ class Aws_Check_Output_JSON(Check_Output_JSON):
    Region: str = ""
    ResourceId: str = ""
    ResourceArn: str = ""
    ResourceTags: list = []

    def __init__(self, **metadata):
        super().__init__(**metadata)
@@ -409,7 +300,7 @@ class Aws_Check_Output_JSON(Check_Output_JSON):

class Azure_Check_Output_JSON(Check_Output_JSON):
    """
    Azure_Check_Output_JSON generates a finding's output in JSON format for the AWS provider.
    Aws_Check_Output_JSON generates a finding's output in JSON format for the AWS provider.
    """

    Tenant_Domain: str = ""
@@ -518,7 +409,6 @@ class Resource(BaseModel):
class Compliance(BaseModel):
    Status: str
    RelatedRequirements: List[str]
    AssociatedStandards: List[dict]


class Check_Output_JSON_ASFF(BaseModel):

@@ -101,16 +101,12 @@ def report(check_findings, output_options, audit_info):
                    )

                    if "html" in file_descriptors:
                        fill_html(
                            file_descriptors["html"], finding, output_options
                        )
                        fill_html(file_descriptors["html"], finding)
                        file_descriptors["html"].write("")

                    if "json-asff" in file_descriptors:
                        finding_output = Check_Output_JSON_ASFF()
                        fill_json_asff(
                            finding_output, audit_info, finding, output_options
                        )
                        fill_json_asff(finding_output, audit_info, finding)

                        json.dump(
                            finding_output.dict(),
@@ -140,7 +136,6 @@ def report(check_findings, output_options, audit_info):
                            audit_info,
                            "csv",
                            file_descriptors["csv"],
                            output_options,
                        )
                        csv_writer.writerow(finding_output.__dict__)

@@ -150,7 +145,7 @@ def report(check_findings, output_options, audit_info):
                            finding,
                            audit_info,
                            "json",
                            output_options,
                            file_descriptors["json"],
                        )
                        json.dump(
                            finding_output.dict(),
@@ -208,8 +203,6 @@ def send_to_s3_bucket(
        filename = f"{output_filename}{json_asff_file_suffix}"
    elif output_mode == "html":
        filename = f"{output_filename}{html_file_suffix}"
    else:  # Compliance output mode
        filename = f"{output_filename}_{output_mode}{csv_file_suffix}"
    logger.info(f"Sending outputs to S3 bucket {output_bucket}")
    bucket_remote_dir = output_directory
    while "prowler/" in bucket_remote_dir:  # Check if it is not a custom directory

@@ -1,26 +1,16 @@
import json
import os
import sys
import tempfile
from hashlib import sha512
from io import TextIOWrapper
from os.path import exists
from typing import Any

from detect_secrets import SecretsCollection
from detect_secrets.settings import default_settings

from prowler.lib.logger import logger


def open_file(input_file: str, mode: str = "r") -> TextIOWrapper:
    try:
        f = open(input_file, mode)
    except OSError:
        logger.critical(
            "Ooops! You reached your user session maximum open files. To solve this issue, increase the shell session limit by running this command `ulimit -n 4096`. For more info visit https://docs.prowler.cloud/en/latest/troubleshooting/"
        )
        sys.exit(1)
    except Exception as e:
        logger.critical(
            f"{input_file}: {e.__class__.__name__}[{e.__traceback__.tb_lineno}]"
@@ -59,20 +49,3 @@ def file_exists(filename: str):
# create sha512 hash for string
def hash_sha512(string: str) -> str:
    return sha512(string.encode("utf-8")).hexdigest()[0:9]
|
||||
|
||||
|
||||
def detect_secrets_scan(data):
|
||||
temp_data_file = tempfile.NamedTemporaryFile(delete=False)
|
||||
temp_data_file.write(bytes(data, encoding="raw_unicode_escape"))
|
||||
temp_data_file.close()
|
||||
|
||||
secrets = SecretsCollection()
|
||||
with default_settings():
|
||||
secrets.scan_file(temp_data_file.name)
|
||||
os.remove(temp_data_file.name)
|
||||
|
||||
detect_secrets_output = secrets.json()
|
||||
if detect_secrets_output:
|
||||
return detect_secrets_output[temp_data_file.name]
|
||||
else:
|
||||
return None
|
||||
|
||||
@@ -7,7 +7,6 @@ from botocore.credentials import RefreshableCredentials
from botocore.session import get_session

from prowler.config.config import aws_services_json_file
from prowler.lib.check.check import list_modules, recover_checks_from_service
from prowler.lib.logger import logger
from prowler.lib.utils.utils import open_file, parse_json_file
from prowler.providers.aws.lib.audit_info.models import AWS_Assume_Role, AWS_Audit_Info
@@ -85,7 +84,7 @@ def assume_role(session: session.Session, assumed_role_info: AWS_Assume_Role) ->
if assumed_role_info.external_id:
assumed_credentials = sts_client.assume_role(
RoleArn=assumed_role_info.role_arn,
RoleSessionName="ProwlerAsessmentSession",
RoleSessionName="ProwlerProAsessmentSession",
DurationSeconds=assumed_role_info.session_duration,
ExternalId=assumed_role_info.external_id,
)
@@ -111,8 +110,8 @@ def generate_regional_clients(
regional_clients = {}
# Get json locally
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
f = open_file(f"{actual_directory}/{aws_services_json_file}")
data = parse_json_file(f)
# Check if it is a subservice
json_regions = data["services"][service]["regions"][
audit_info.audited_partition
@@ -145,8 +144,8 @@ def generate_regional_clients(
def get_aws_available_regions():
try:
actual_directory = pathlib.Path(os.path.dirname(os.path.realpath(__file__)))
with open_file(f"{actual_directory}/{aws_services_json_file}") as f:
data = parse_json_file(f)
f = open_file(f"{actual_directory}/{aws_services_json_file}")
data = parse_json_file(f)

regions = set()
for service in data["services"].values():
@@ -157,73 +156,3 @@ def get_aws_available_regions():
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")
return []


def get_checks_from_input_arn(audit_resources: list, provider: str) -> set:
"""get_checks_from_input_arn gets the list of checks from the input arns"""
checks_from_arn = set()
# Handle if there are audit resources so only their services are executed
if audit_resources:
services_without_subservices = ["guardduty", "kms", "s3", "elb"]
service_list = set()
sub_service_list = set()
for resource in audit_resources:
service = resource.split(":")[2]
sub_service = resource.split(":")[5].split("/")[0].replace("-", "_")
# WAF Services does not have checks
if service != "wafv2" and service != "waf":
# Parse services when they are different in the ARNs
if service == "lambda":
service = "awslambda"
if service == "elasticloadbalancing":
service = "elb"
elif service == "logs":
service = "cloudwatch"
# Check if Prowler has checks in service
try:
list_modules(provider, service)
except ModuleNotFoundError:
# Service is not supported
pass
else:
service_list.add(service)

# Get subservices to execute only applicable checks
if service not in services_without_subservices:
# Parse some specific subservices
if service == "ec2":
if sub_service == "security_group":
sub_service = "securitygroup"
if sub_service == "network_acl":
sub_service = "networkacl"
if sub_service == "image":
sub_service = "ami"
if service == "rds":
if sub_service == "cluster_snapshot":
sub_service = "snapshot"
sub_service_list.add(sub_service)
else:
sub_service_list.add(service)

checks = recover_checks_from_service(service_list, provider)

# Filter only checks with audited subservices
for check in checks:
if any(sub_service in check for sub_service in sub_service_list):
if not (sub_service == "policy" and "password_policy" in check):
checks_from_arn.add(check)

# Return final checks list
return sorted(checks_from_arn)


def get_regions_from_audit_resources(audit_resources: list) -> list:
"""get_regions_from_audit_resources gets the regions from the audit resources arns"""
audited_regions = []
for resource in audit_resources:
region = resource.split(":")[3]
if region and region not in audited_regions: # Check if arn has a region
audited_regions.append(region)
if audited_regions:
return audited_regions
return None

@@ -123,9 +123,9 @@
"acm-pca": {
"regions": {
"aws": [
"ap-east-1",
"ap-northeast-3",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-south-1",
"eu-west-2",
@@ -133,18 +133,15 @@
"me-south-1",
"us-east-2",
"us-west-1",
"ap-east-1",
"af-south-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-south-2",
"eu-west-1",
"us-east-1",
"af-south-1",
"eu-central-2",
"eu-north-1",
"eu-west-3",
"sa-east-1",
@@ -152,15 +149,14 @@
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
"us-gov-west-1",
"us-gov-east-1"
]
}
},
"ahl": {
"regions": {
"aws": [
"ap-south-1",
"us-east-1",
"us-east-2",
"us-west-2"
@@ -461,29 +457,23 @@
"ap-east-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-west-1",
"eu-west-2",
"me-south-1",
"us-east-2",
"ap-northeast-2",
"ap-northeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-2",
"me-central-1",
"sa-east-1",
"us-west-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-northeast-3",
"ca-central-1",
"eu-south-1",
"sa-east-1",
"us-east-1",
"us-west-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-south-2",
"us-east-1",
"eu-west-3"
],
"aws-cn": [
@@ -491,8 +481,8 @@
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
"us-gov-west-1",
"us-gov-east-1"
]
}
},
@@ -656,9 +646,6 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"us-east-1",
"us-east-2",
@@ -671,6 +658,7 @@
"appstream": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
@@ -680,14 +668,12 @@
"eu-west-2",
"sa-east-1",
"us-west-2",
"ap-northeast-1",
"ap-southeast-2",
"us-east-1",
"us-east-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-east-1",
"us-gov-west-1"
]
}
@@ -746,22 +732,16 @@
"arc-zonal-shift": {
"regions": {
"aws": [
"af-south-1",
"ap-south-1",
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-1",
"sa-east-1",
"us-east-1",
"us-west-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-3",
"eu-west-2",
"eu-west-3",
"us-east-2"
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -1722,29 +1702,26 @@
"codebuild": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"eu-west-3",
"me-south-1",
"us-west-1",
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"ap-southeast-2",
"eu-north-1",
"eu-west-2",
"us-east-1",
"us-west-2",
"ap-south-2",
"ap-southeast-3",
"ca-central-1",
"eu-central-2",
"eu-north-1",
"me-central-1",
"sa-east-1",
"us-east-2"
@@ -1781,19 +1758,16 @@
"us-east-1",
"us-east-2",
"ap-east-1",
"ap-south-2",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"me-central-1",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"eu-north-1",
"us-west-2"
],
"aws-cn": [
@@ -1893,24 +1867,21 @@
"eu-north-1",
"eu-south-1",
"eu-west-1",
"me-south-1",
"us-west-2",
"af-south-1",
"ap-northeast-1",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-west-2",
"eu-west-3",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-1",
"ap-southeast-1",
"ca-central-1",
"us-east-2"
"ap-southeast-1"
],
"aws-cn": [
"cn-northwest-1",
"cn-north-1"
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-west-1"
@@ -3104,14 +3075,13 @@
"us-west-2",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-4",
"ca-central-1",
"eu-central-2",
"eu-west-3",
"me-central-1",
"us-east-1",
"ap-south-2"
"us-east-1"
],
"aws-cn": [
"cn-northwest-1",
@@ -3643,9 +3613,9 @@
"ap-southeast-3",
"eu-central-2",
"eu-west-3",
"af-south-1",
"ap-east-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-central-1",
"eu-north-1",
"eu-west-1",
@@ -3653,15 +3623,14 @@
"me-central-1",
"us-east-2",
"us-west-1",
"af-south-1",
"ap-southeast-1",
"ca-central-1",
"eu-south-1",
"eu-south-2",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-west-2",
"sa-east-1"
"us-west-2"
],
"aws-cn": [
"cn-northwest-1",
@@ -3913,10 +3882,9 @@
"eu-south-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"sa-east-1",
"us-east-1",
"eu-north-1",
"sa-east-1"
"eu-north-1"
],
"aws-cn": [],
"aws-us-gov": [
@@ -4465,34 +4433,6 @@
]
}
},
"internetmonitor": {
"regions": {
"aws": [
"af-south-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"eu-west-3",
"sa-east-1",
"ap-east-1",
"ap-northeast-2",
"ap-southeast-2",
"ca-central-1",
"eu-north-1",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"iot": {
"regions": {
"aws": [
@@ -4832,9 +4772,7 @@
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1"
]
"aws-us-gov": []
}
},
"iotwireless": {
@@ -4890,7 +4828,7 @@
"ap-southeast-2",
"ap-southeast-3",
"eu-west-3",
"me-central-1",
"us-east-1",
"us-west-2",
"ap-east-1",
"ap-northeast-1",
@@ -4900,7 +4838,6 @@
"eu-south-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"ap-southeast-1",
"ca-central-1",
@@ -5201,38 +5138,6 @@
]
}
},
"launchwizard": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"eu-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-1",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-south-1",
"sa-east-1",
"us-west-2",
"ap-northeast-2",
"ca-central-1",
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
]
}
},
"lex-models": {
"regions": {
"aws": [
@@ -6027,7 +5932,6 @@
"sa-east-1",
"us-east-1",
"eu-central-1",
"me-central-1",
"us-west-2"
],
"aws-cn": [],
@@ -6215,10 +6119,9 @@
"eu-south-1",
"eu-west-1",
"eu-west-2",
"me-central-1",
"us-east-1",
"eu-central-1",
"sa-east-1",
"us-east-1",
"us-west-2"
],
"aws-cn": [],
@@ -6295,6 +6198,7 @@
"eu-central-2",
"eu-south-1",
"me-south-1",
"sa-east-1",
"us-east-2",
"ap-northeast-1",
"ap-south-2",
@@ -6302,13 +6206,9 @@
"ap-southeast-4",
"eu-north-1",
"eu-west-3",
"me-central-1",
"sa-east-1"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
"me-central-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
@@ -7141,9 +7041,7 @@
"us-west-1",
"ap-northeast-1",
"ap-south-1",
"ap-south-2",
"eu-central-1",
"eu-central-2",
"eu-south-2",
"eu-west-2",
"us-east-1",
@@ -7343,27 +7241,24 @@
"ap-south-1",
"ap-southeast-3",
"ca-central-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"ap-northeast-1",
"ap-northeast-2",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-2",
"us-west-1",
"ap-southeast-2",
"eu-west-3",
"us-west-1",
"sa-east-1",
"us-west-2"
],
"aws-cn": [
"cn-northwest-1",
"cn-north-1"
],
"aws-cn": [],
"aws-us-gov": []
}
},
@@ -7435,15 +7330,6 @@
"aws-us-gov": []
}
},
"route53-recovery-readiness": {
"regions": {
"aws": [
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"route53domains": {
"regions": {
"aws": [
@@ -7466,7 +7352,7 @@
"us-east-2",
"ap-southeast-3",
"eu-central-1",
"eu-central-2",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
@@ -7478,7 +7364,6 @@
"ca-central-1",
"eu-north-1",
"eu-west-1",
"eu-west-2",
"us-west-1"
],
"aws-cn": [
@@ -7626,9 +7511,9 @@
"us-west-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-southeast-3",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-west-1",
"eu-west-3",
@@ -7636,7 +7521,6 @@
"me-south-1",
"us-east-2",
"ap-east-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"ca-central-1",
@@ -7695,40 +7579,9 @@
},
"sagemaker-metrics": {
"regions": {
"aws": [
"af-south-1",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-1",
"ap-southeast-2",
"eu-south-2",
"sa-east-1",
"us-west-1",
"ap-northeast-1",
"ap-south-1",
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-central-2",
"eu-west-1",
"me-south-1",
"us-east-1",
"ap-northeast-2",
"eu-north-1",
"eu-south-1",
"eu-west-2",
"me-central-1",
"us-east-2",
"us-west-2"
],
"aws-cn": [
"cn-north-1",
"cn-northwest-1"
],
"aws-us-gov": [
"us-gov-west-1",
"us-gov-east-1"
]
"aws": [],
"aws-cn": [],
"aws-us-gov": []
}
},
"sagemaker-runtime": {
@@ -7744,19 +7597,16 @@
"us-east-2",
"af-south-1",
"ap-northeast-1",
"ap-south-2",
"ap-south-1",
"ap-southeast-1",
"ap-southeast-3",
"ca-central-1",
"eu-south-2",
"eu-west-2",
"me-central-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-2",
"eu-central-2",
"eu-north-1",
"me-central-1",
"sa-east-1",
"us-west-1",
"us-west-2"
@@ -7903,29 +7753,26 @@
"aws": [
"ap-east-1",
"ap-northeast-2",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ca-central-1",
"eu-north-1",
"eu-west-2",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-northeast-3",
"ap-southeast-3",
"eu-north-1",
"eu-south-1",
"eu-west-1",
"eu-west-3",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"ap-northeast-1",
"ap-south-1",
"eu-central-1",
"eu-central-2",
"eu-south-2",
"me-south-1",
"us-west-1"
],
"aws-cn": [
@@ -7942,12 +7789,9 @@
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-central-1",
"eu-west-1",
"eu-west-2",
"sa-east-1",
"us-east-1",
"us-east-2",
"us-west-2"
@@ -7996,28 +7840,24 @@
"ap-south-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-south-2",
"ca-central-1",
"eu-west-1",
"eu-west-2",
"us-east-2",
"ap-east-1",
"ap-northeast-3",
"ap-southeast-4",
"ca-central-1",
"eu-central-1",
"eu-north-1",
"me-south-1",
"sa-east-1",
"us-west-1",
"us-west-2",
"af-south-1",
"ap-south-2",
"ap-southeast-1",
"eu-central-2",
"eu-south-1",
"eu-west-3",
"me-south-1",
"us-east-1",
"me-central-1"
"me-central-1",
"us-east-1"
],
"aws-cn": [
"cn-northwest-1",
@@ -8042,20 +7882,16 @@
"af-south-1",
"ap-east-1",
"ap-northeast-3",
"ap-south-2",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-3",
"me-central-1",
"eu-central-1",
"me-south-1",
"sa-east-1",
"ap-northeast-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-1",
"eu-central-1",
"eu-central-2",
"eu-south-1",
"eu-south-2",
"eu-west-1",
"us-west-1"
],
@@ -8109,9 +7945,9 @@
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-south-1",
"eu-south-2",
"eu-west-2",
"eu-west-3",
"us-west-1",
"af-south-1",
@@ -8119,7 +7955,7 @@
"ap-southeast-3",
"ca-central-1",
"eu-central-1",
"eu-west-2",
"eu-north-1",
"us-east-2",
"us-west-2",
"ap-east-1",
@@ -8127,12 +7963,11 @@
"ap-south-1",
"ap-south-2",
"eu-central-2",
"eu-north-1",
"eu-west-1",
"me-central-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"me-central-1"
"us-east-1"
],
"aws-cn": [
"cn-north-1",
@@ -8237,6 +8072,22 @@
"aws-us-gov": []
}
},
"simpledb": {
"regions": {
"aws": [
"ap-northeast-1",
"ap-southeast-1",
"ap-southeast-2",
"eu-west-1",
"sa-east-1",
"us-east-1",
"us-west-1",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
}
},
"simspaceweaver": {
"regions": {
"aws": [
@@ -8685,24 +8536,22 @@
"me-central-1",
"us-west-2",
"af-south-1",
"ap-northeast-2",
"ap-northeast-3",
"ap-south-2",
"ap-southeast-1",
"ca-central-1",
"eu-central-2",
"eu-west-2",
"us-west-1",
"ap-northeast-2",
"ap-south-1",
"ap-southeast-2",
"ap-southeast-4",
"eu-central-1",
"eu-north-1",
"eu-south-1",
"me-south-1",
"sa-east-1",
"us-east-1",
"us-east-2",
"eu-south-1"
"us-east-2"
],
"aws-cn": [
"cn-northwest-1",
@@ -9022,7 +8871,8 @@
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2"
"us-east-2",
"us-west-2"
],
"aws-cn": [],
"aws-us-gov": []
@@ -9043,15 +8893,14 @@
"us-west-1",
"ap-northeast-2",
"ap-northeast-3",
"eu-north-1",
"eu-west-2",
"eu-west-3",
"me-central-1",
"me-south-1",
"us-east-1",
"us-west-2",
"ap-northeast-1",
"ca-central-1",
"eu-north-1",
"eu-south-1",
"eu-west-1"
],
@@ -9082,13 +8931,12 @@
"ap-south-1",
"ap-southeast-1",
"ca-central-1",
"eu-central-2",
"eu-north-1",
"eu-west-1",
"us-east-2",
"us-west-1",
"ap-southeast-2",
"ap-southeast-3",
"eu-north-1",
"eu-south-1",
"me-south-1",
"us-west-2"
@@ -9179,20 +9027,19 @@
"eu-central-1",
"eu-south-1",
"eu-west-2",
"me-south-1",
"us-east-1",
"us-east-2",
"us-west-2",
"af-south-1",
"ap-east-1",
"ap-northeast-1",
"ap-northeast-3",
"ap-south-1",
"ap-southeast-2",
"eu-north-1",
"eu-west-1",
"eu-west-3",
"sa-east-1",
"us-west-2",
"ap-southeast-2",
"us-west-1"
],
"aws-cn": [],
@@ -9456,9 +9303,7 @@
"us-east-1"
],
"aws-cn": [],
"aws-us-gov": [
"us-gov-west-1"
]
"aws-us-gov": []
}
},
"wisdom": {

@@ -1,9 +1,7 @@
import csv
import json
from copy import deepcopy

from alive_progress import alive_bar
from botocore.client import ClientError
from colorama import Fore, Style
from tabulate import tabulate

@@ -18,10 +16,10 @@ from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info


def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
print(
f"-=- Running Quick Inventory for AWS Account {Fore.YELLOW}{audit_info.audited_account}{Style.RESET_ALL} -=-\n"
)
resources = []
global_resources = []
total_resources_per_region = {}
iam_was_scanned = False
# If not inputed regions, check all of them
if not audit_info.audited_regions:
# EC2 client for describing all regions
@@ -42,20 +40,42 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
enrich_print=False,
) as bar:
for region in sorted(audit_info.audited_regions):
bar.title = f"Inventorying AWS Account {orange_color}{audit_info.audited_account}{Style.RESET_ALL}"
bar.title = f"-> Scanning {orange_color}{region}{Style.RESET_ALL} region"
resources_in_region = []
# {
#   eu-west-1: 100,...
# }

try:
# Scan IAM only once
if not iam_was_scanned:
global_resources.extend(get_iam_resources(audit_info.audit_session))
iam_was_scanned = True
# If us-east-1 get IAM resources from there otherwise see if it is US GovCloud or China
iam_client = audit_info.audit_session.client("iam")
if (
region == "us-east-1"
or region == "us-gov-west-1"
or region == "cn-north-1"
):
get_roles_paginator = iam_client.get_paginator("list_roles")
for page in get_roles_paginator.paginate():
for role in page["Roles"]:
# Avoid aws-service-role roles
if "aws-service-role" not in role["Arn"]:
resources_in_region.append(role["Arn"])

# Get regional S3 buckets since none-tagged buckets are not supported by the resourcegroupstaggingapi
resources_in_region.extend(get_regional_buckets(audit_info, region))
get_users_paginator = iam_client.get_paginator("list_users")
for page in get_users_paginator.paginate():
for user in page["Users"]:
resources_in_region.append(user["Arn"])

get_groups_paginator = iam_client.get_paginator("list_groups")
for page in get_groups_paginator.paginate():
for group in page["Groups"]:
resources_in_region.append(group["Arn"])

get_policies_paginator = iam_client.get_paginator("list_policies")
for page in get_policies_paginator.paginate(Scope="Local"):
for policy in page["Policies"]:
resources_in_region.append(policy["Arn"])

for saml_provider in iam_client.list_saml_providers()[
"SAMLProviderList"
]:
resources_in_region.append(saml_provider["Arn"])

client = audit_info.audit_session.client(
"resourcegroupstaggingapi", region_name=region
@@ -66,27 +86,12 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
for page in get_resources_paginator.paginate():
resources_count += len(page["ResourceTagMappingList"])
for resource in page["ResourceTagMappingList"]:
# Avoid adding S3 buckets again:
if resource["ResourceARN"].split(":")[2] != "s3":
# Check if region is not in ARN --> Global service
if not resource["ResourceARN"].split(":")[3]:
global_resources.append(
{
"arn": resource["ResourceARN"],
"tags": resource["Tags"],
}
)
else:
resources_in_region.append(
{
"arn": resource["ResourceARN"],
"tags": resource["Tags"],
}
)
resources_in_region.append(resource["ResourceARN"])
bar()
if len(resources_in_region) > 0:
total_resources_per_region[region] = len(resources_in_region)
bar.text = f"-> Found {Fore.GREEN}{len(resources_in_region)}{Style.RESET_ALL} resources in {region}"
print(
f"Found {Fore.GREEN}{len(resources_in_region)}{Style.RESET_ALL} resources in region {Fore.YELLOW}{region}{Style.RESET_ALL}"
)
print("\n")

except Exception as error:
logger.error(
@@ -97,26 +102,20 @@ def quick_inventory(audit_info: AWS_Audit_Info, output_directory: str):
resources.extend(resources_in_region)
bar.title = f"-> {Fore.GREEN}Quick Inventory completed!{Style.RESET_ALL}"

resources.extend(global_resources)
total_resources_per_region["global"] = len(global_resources)
inventory_table = create_inventory_table(resources, total_resources_per_region)
inventory_table = create_inventory_table(resources)

print(
f"\nQuick Inventory of AWS Account {Fore.YELLOW}{audit_info.audited_account}{Style.RESET_ALL}:"
)

print(
tabulate(
inventory_table, headers="keys", tablefmt="rounded_grid", stralign="left"
)
)
print(tabulate(inventory_table, headers="keys", tablefmt="rounded_grid"))

print(f"\nTotal resources found: {Fore.GREEN}{len(resources)}{Style.RESET_ALL}")

create_output(resources, audit_info, output_directory)


def create_inventory_table(resources: list, resources_in_region: dict) -> dict:
regions_with_resources = list(resources_in_region.keys())
def create_inventory_table(resources: list) -> dict:
services = {}
# { "S3":
#     123,
@@ -125,30 +124,13 @@ def create_inventory_table(resources: list, resources_in_region: dict) -> dict:
# }
resources_type = {}
# { "S3":
#     "Buckets":
#        eu-west-1: 10,
#        eu-west-2: 3,
#     "Buckets": 13,
#   "IAM":
#     "Roles":
#        us-east-1: 143,
#     "Users":
#        us-west-2: 22,
#     "Roles": 143,
#     "Users": 22,
# }

inventory_table = {
"Service": [],
f"Total\n({Fore.GREEN}{str(len(resources))}{Style.RESET_ALL})": [],
"Total per\nresource type": [],
}

for region, count in resources_in_region.items():
inventory_table[f"{region}\n({Fore.GREEN}{str(count)}{Style.RESET_ALL})"] = []

for resource in sorted(resources, key=lambda d: d["arn"]):
service = resource["arn"].split(":")[2]
region = resource["arn"].split(":")[3]
if not region:
region = "global"
for resource in sorted(resources):
service = resource.split(":")[2]
if service not in services:
services[service] = 0
services[service] += 1
@@ -160,58 +142,36 @@ def create_inventory_table(resources: list, resources_in_region: dict) -> dict:
elif service == "sqs":
resource_type = "queue"
elif service == "apigateway":
split_parts = resource["arn"].split(":")[5].split("/")
split_parts = resource.split(":")[5].split("/")
if "integration" in split_parts and "responses" in split_parts:
resource_type = "restapis-resources-methods-integration-response"
elif "documentation" in split_parts and "parts" in split_parts:
resource_type = "restapis-documentation-parts"
else:
resource_type = resource["arn"].split(":")[5].split("/")[1]
resource_type = resource.split(":")[5].split("/")[1]
else:
resource_type = resource["arn"].split(":")[5].split("/")[0]
resource_type = resource.split(":")[5].split("/")[0]
if service not in resources_type:
resources_type[service] = {}
if resource_type not in resources_type[service]:
resources_type[service][resource_type] = {}
if region not in resources_type[service][resource_type]:
resources_type[service][resource_type][region] = 0
resources_type[service][resource_type][region] += 1
resources_type[service][resource_type] = 0
resources_type[service][resource_type] += 1

# Add results to inventory table
inventory_table = {
"Service": [],
"Total": [],
"Count per resource types": [],
}
for service in services:
pending_regions = deepcopy(regions_with_resources)
aux = {}
# {
#   "region": summary,
# }
summary = ""
inventory_table["Service"].append(f"{service}")
inventory_table[
f"Total\n({Fore.GREEN}{str(len(resources))}{Style.RESET_ALL})"
].append(f"{Fore.GREEN}{services[service]}{Style.RESET_ALL}")
for resource_type, regions in resources_type[service].items():
summary += f"{resource_type} {Fore.GREEN}{str(sum(regions.values()))}{Style.RESET_ALL}\n"
# Check if region does not have resource type
for region in pending_regions:
if region not in aux:
aux[region] = ""
if region not in regions:
aux[region] += "-\n"
for region, count in regions.items():
aux[region] += f"{Fore.GREEN}{str(count)}{Style.RESET_ALL}\n"
# Add Total per resource type
inventory_table["Total per\nresource type"].append(summary)
# Add Total per region
for region, text in aux.items():
inventory_table[
f"{region}\n({Fore.GREEN}{str(resources_in_region[region])}{Style.RESET_ALL})"
].append(text)
if region in pending_regions:
pending_regions.remove(region)
for region_without_resource in pending_regions:
inventory_table[
f"{region_without_resource}\n ({Fore.GREEN}{str(resources_in_region[region_without_resource])}{Style.RESET_ALL})"
].append("-")
inventory_table["Total"].append(
f"{Fore.GREEN}{services[service]}{Style.RESET_ALL}"
)
for resource_type in resources_type[service]:
summary += f"{resource_type} {Fore.GREEN}{resources_type[service][resource_type]}{Style.RESET_ALL}\n"
inventory_table["Count per resource types"].append(summary)

return inventory_table
|
||||
|
||||
@@ -220,37 +180,30 @@ def create_output(resources: list, audit_info: AWS_Audit_Info, output_directory:
json_output = []
output_file = f"{output_directory}/prowler-inventory-{audit_info.audited_account}-{output_file_timestamp}"

for item in sorted(resources, key=lambda d: d["arn"]):
for item in sorted(resources):
resource = {}
resource["AWS_AccountID"] = audit_info.audited_account
resource["AWS_Region"] = item["arn"].split(":")[3]
resource["AWS_Partition"] = item["arn"].split(":")[1]
resource["AWS_Service"] = item["arn"].split(":")[2]
resource["AWS_ResourceType"] = item["arn"].split(":")[5].split("/")[0]
resource["AWS_Region"] = item.split(":")[3]
resource["AWS_Partition"] = item.split(":")[1]
resource["AWS_Service"] = item.split(":")[2]
resource["AWS_ResourceType"] = item.split(":")[5].split("/")[0]
resource["AWS_ResourceID"] = ""
if len(item["arn"].split("/")) > 1:
resource["AWS_ResourceID"] = item["arn"].split("/")[-1]
elif len(item["arn"].split(":")) > 6:
resource["AWS_ResourceID"] = item["arn"].split(":")[-1]
resource["AWS_ResourceARN"] = item["arn"]
if len(item.split("/")) > 1:
resource["AWS_ResourceID"] = item.split("/")[-1]
elif len(item.split(":")) > 6:
resource["AWS_ResourceID"] = item.split(":")[-1]
resource["AWS_ResourceARN"] = item
# Cover S3 case
if resource["AWS_Service"] == "s3":
resource["AWS_ResourceType"] = "bucket"
resource["AWS_ResourceID"] = item["arn"].split(":")[-1]
resource["AWS_ResourceID"] = item.split(":")[-1]
# Cover WAFv2 case
if resource["AWS_Service"] == "wafv2":
resource["AWS_ResourceType"] = "/".join(
item["arn"].split(":")[-1].split("/")[:-2]
)
resource["AWS_ResourceID"] = "/".join(
item["arn"].split(":")[-1].split("/")[2:]
)
resource["AWS_ResourceType"] = "/".join(item.split(":")[-1].split("/")[:-2])
resource["AWS_ResourceID"] = "/".join(item.split(":")[-1].split("/")[2:])
# Cover Config case
if resource["AWS_Service"] == "config":
resource["AWS_ResourceID"] = "/".join(
item["arn"].split(":")[-1].split("/")[1:]
)
resource["AWS_Tags"] = item["tags"]
resource["AWS_ResourceID"] = "/".join(item.split(":")[-1].split("/")[1:])
json_output.append(resource)

# Serializing json
@@ -276,64 +229,3 @@ def create_output(resources: list, audit_info: AWS_Audit_Info, output_directory:
print("\nMore details in files:")
print(f" - CSV: {output_file+csv_file_suffix}")
print(f" - JSON: {output_file+json_file_suffix}")


def get_regional_buckets(audit_info: AWS_Audit_Info, region: str) -> list:
regional_buckets = []
s3_client = audit_info.audit_session.client("s3", region_name=region)
buckets = s3_client.list_buckets()
for bucket in buckets["Buckets"]:
bucket_region = s3_client.get_bucket_location(Bucket=bucket["Name"])[
"LocationConstraint"
]
if bucket_region == "EU": # If EU, bucket_region is eu-west-1
bucket_region = "eu-west-1"
if not bucket_region: # If None, bucket_region is us-east-1
bucket_region = "us-east-1"
if bucket_region == region: # Only add bucket if is in current region
try:
bucket_tags = s3_client.get_bucket_tagging(Bucket=bucket["Name"])[
"TagSet"
]
except ClientError as error:
bucket_tags = []
if error.response["Error"]["Code"] != "NoSuchTagSet":
logger.error(
f"{region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
bucket_arn = (
f"arn:{audit_info.audited_partition}:s3:{region}::{bucket['Name']}"
)
regional_buckets.append({"arn": bucket_arn, "tags": bucket_tags})
return regional_buckets


def get_iam_resources(session) -> list:
iam_resources = []
iam_client = session.client("iam")
get_roles_paginator = iam_client.get_paginator("list_roles")
for page in get_roles_paginator.paginate():
for role in page["Roles"]:
# Avoid aws-service-role roles
if "aws-service-role" not in role["Arn"]:
iam_resources.append({"arn": role["Arn"], "tags": role.get("Tags")})

get_users_paginator = iam_client.get_paginator("list_users")
for page in get_users_paginator.paginate():
for user in page["Users"]:
iam_resources.append({"arn": user["Arn"], "tags": user.get("Tags")})

get_groups_paginator = iam_client.get_paginator("list_groups")
for page in get_groups_paginator.paginate():
for group in page["Groups"]:
iam_resources.append({"arn": group["Arn"], "tags": []})

get_policies_paginator = iam_client.get_paginator("list_policies")
for page in get_policies_paginator.paginate(Scope="Local"):
for policy in page["Policies"]:
iam_resources.append({"arn": policy["Arn"], "tags": policy.get("Tags")})

for saml_provider in iam_client.list_saml_providers()["SAMLProviderList"]:
iam_resources.append({"arn": saml_provider["Arn"], "tags": []})

return iam_resources

@@ -1,38 +0,0 @@
import sys

from prowler.lib.logger import logger
from prowler.providers.aws.aws_provider import generate_regional_clients
from prowler.providers.aws.lib.audit_info.models import AWS_Audit_Info


def get_tagged_resources(input_resource_tags: list, current_audit_info: AWS_Audit_Info):
"""
get_tagged_resources returns a list of the resources that are going to be scanned based on the given input tags
"""
try:
resource_tags = []
tagged_resources = []
for tag in input_resource_tags:
key = tag.split("=")[0]
value = tag.split("=")[1]
resource_tags.append({"Key": key, "Values": [value]})
# Get Resources with resource_tags for all regions
for regional_client in generate_regional_clients(
"resourcegroupstaggingapi", current_audit_info
).values():
try:
get_resources_paginator = regional_client.get_paginator("get_resources")
for page in get_resources_paginator.paginate(TagFilters=resource_tags):
for resource in page["ResourceTagMappingList"]:
tagged_resources.append(resource["ResourceARN"])
except Exception as error:
logger.error(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
except Exception as error:
logger.critical(
f"{error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)
sys.exit(1)
else:
return tagged_resources
@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -17,7 +17,6 @@ class accessanalyzer_enabled(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags

elif analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
@@ -32,7 +31,6 @@ class accessanalyzer_enabled(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
findings.append(report)

return findings

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -17,7 +17,6 @@ class accessanalyzer_enabled_without_findings(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
if len(analyzer.findings) != 0:
active_finding_counter = 0
for finding in analyzer.findings:
@@ -29,7 +28,6 @@ class accessanalyzer_enabled_without_findings(Check):
report.status_extended = f"IAM Access Analyzer {analyzer.name} has {active_finding_counter} active findings"
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
elif analyzer.status == "NOT_AVAILABLE":
report.status = "FAIL"
report.status_extended = (
@@ -43,7 +41,6 @@ class accessanalyzer_enabled_without_findings(Check):
)
report.resource_id = analyzer.name
report.resource_arn = analyzer.arn
report.resource_tags = analyzer.tags
findings.append(report)

return findings

@@ -1,5 +1,4 @@
import threading
from typing import Optional

from pydantic import BaseModel

@@ -49,7 +48,7 @@ class AccessAnalyzer:
arn=analyzer["arn"],
name=analyzer["name"],
status=analyzer["status"],
tags=[analyzer.get("tags")],
tags=str(analyzer["tags"]),
type=analyzer["type"],
region=regional_client.region,
)
@@ -61,7 +60,7 @@ class AccessAnalyzer:
arn="",
name=self.audited_account,
status="NOT_AVAILABLE",
tags=[],
tags="",
type="",
region=regional_client.region,
)
@@ -120,6 +119,6 @@ class Analyzer(BaseModel):
name: str
status: str
findings: list[Finding] = []
tags: Optional[list] = []
tags: str
type: str
region: str

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -15,13 +15,11 @@ class acm_certificates_expiration_check(Check):
report.status_extended = f"ACM Certificate for {certificate.name} expires in {certificate.expiration_days} days."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
report.status = "FAIL"
report.status_extended = f"ACM Certificate for {certificate.name} is about to expire in {DAYS_TO_EXPIRE_THRESHOLD} days."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags

findings.append(report)
return findings

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -15,19 +15,16 @@ class acm_certificates_transparency_logs_enabled(Check):
)
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
if not certificate.transparency_logging:
report.status = "FAIL"
report.status_extended = f"ACM Certificate for {certificate.name} has Certificate Transparency logging disabled."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
else:
report.status = "PASS"
report.status_extended = f"ACM Certificate for {certificate.name} has Certificate Transparency logging enabled."
report.resource_id = certificate.name
report.resource_arn = certificate.arn
report.resource_tags = certificate.tags
findings.append(report)
return findings

@@ -1,9 +1,7 @@
import threading
from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from dataclasses import dataclass

from prowler.config.config import timestamp_utc
from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
from prowler.providers.aws.aws_provider import generate_regional_clients
@@ -20,7 +18,6 @@ class ACM:
self.certificates = []
self.__threading_call__(self.__list_certificates__)
self.__describe_certificates__()
self.__list_tags_for_certificate__()

def __get_session__(self):
return self.session
@@ -47,26 +44,12 @@ class ACM:
certificate["CertificateArn"], self.audit_resources
)
):
if "NotAfter" in certificate:
# We need to get the TZ info to be able to do the math
certificate_expiration_time = (
certificate["NotAfter"]
- datetime.now(
certificate["NotAfter"].tzinfo
if hasattr(certificate["NotAfter"], "tzinfo")
else None
)
).days
else:
certificate_expiration_time = 0
self.certificates.append(
Certificate(
arn=certificate["CertificateArn"],
name=certificate["DomainName"],
type=certificate["Type"],
expiration_days=certificate_expiration_time,
transparency_logging=False,
region=regional_client.region,
certificate["CertificateArn"],
certificate["DomainName"],
False,
regional_client.region,
)
)
except Exception as error:
@@ -82,6 +65,13 @@ class ACM:
response = regional_client.describe_certificate(
CertificateArn=certificate.arn
)["Certificate"]
certificate.type = response["Type"]
if "NotAfter" in response:
certificate.expiration_days = (
response["NotAfter"] - timestamp_utc
).days
else:
certificate.expiration_days = 0
if (
response["Options"]["CertificateTransparencyLoggingPreference"]
== "ENABLED"
@@ -92,26 +82,24 @@ class ACM:
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)

def __list_tags_for_certificate__(self):
logger.info("ACM - List Tags...")
try:
for certificate in self.certificates:
regional_client = self.regional_clients[certificate.region]
response = regional_client.list_tags_for_certificate(
CertificateArn=certificate.arn
)["Tags"]
certificate.tags = response
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)


class Certificate(BaseModel):
@dataclass
class Certificate:
arn: str
name: str
type: str
tags: Optional[list] = []
expiration_days: int
transparency_logging: Optional[bool]
transparency_logging: bool
region: str

def __init__(
self,
arn,
name,
transparency_logging,
region,
):
self.arn = arn
self.name = name
self.transparency_logging = transparency_logging
self.region = region

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -12,7 +12,6 @@ class apigateway_authorizers_enabled(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = rest_api.arn
report.resource_tags = rest_api.tags
if rest_api.authorizer:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} has authorizer configured."

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -13,7 +13,6 @@ class apigateway_client_certificate_enabled(Check):
report.resource_id = rest_api.name
report.region = rest_api.region
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.client_certificate:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has client certificate enabled."

@@ -28,6 +28,10 @@
"Categories": [
"internet-exposed"
],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -12,7 +12,6 @@ class apigateway_endpoint_public(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = rest_api.arn
report.resource_tags = rest_api.tags
if rest_api.public_endpoint:
report.status = "FAIL"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} is internet accesible."

@@ -28,6 +28,10 @@
"Categories": [
"forensics-ready"
],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -13,7 +13,6 @@ class apigateway_logging_enabled(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.logging:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has logging enabled."

@@ -1,7 +1,5 @@
import threading
from typing import Optional

from pydantic import BaseModel
from dataclasses import dataclass

from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -47,11 +45,10 @@ class APIGateway:
):
self.rest_apis.append(
RestAPI(
id=apigw["id"],
arn=arn,
region=regional_client.region,
name=apigw["name"],
tags=[apigw.get("tags")],
apigw["id"],
arn,
regional_client.region,
apigw["name"],
)
)
except Exception as error:
@@ -103,33 +100,61 @@ class APIGateway:
arn = f"arn:{self.audited_partition}:apigateway:{regional_client.region}::/apis/{rest_api.id}/stages/{stage['stageName']}"
rest_api.stages.append(
Stage(
name=stage["stageName"],
arn=arn,
logging=logging,
client_certificate=client_certificate,
waf=waf,
tags=[stage.get("tags")],
stage["stageName"],
arn,
logging,
client_certificate,
waf,
)
)
except Exception as error:
logger.error(f"{error.__class__.__name__}: {error}")


class Stage(BaseModel):
@dataclass
class Stage:
name: str
arn: str
logging: bool
client_certificate: bool
waf: Optional[str]
tags: Optional[list] = []
waf: str

def __init__(
self,
name,
arn,
logging,
client_certificate,
waf,
):
self.name = name
self.arn = arn
self.logging = logging
self.client_certificate = client_certificate
self.waf = waf


class RestAPI(BaseModel):
@dataclass
class RestAPI:
id: str
arn: str
region: str
name: str
authorizer: bool = False
public_endpoint: bool = True
stages: list[Stage] = []
tags: Optional[list] = []
authorizer: bool
public_endpoint: bool
stages: list[Stage]

def __init__(
self,
id,
arn,
region,
name,
):
self.id = id
self.arn = arn
self.region = region
self.name = name
self.authorizer = False
self.public_endpoint = True
self.stages = []

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -13,7 +13,6 @@ class apigateway_waf_acl_attached(Check):
report.region = rest_api.region
report.resource_id = rest_api.name
report.resource_arn = stage.arn
report.resource_tags = stage.tags
if stage.waf:
report.status = "PASS"
report.status_extended = f"API Gateway {rest_api.name} ID {rest_api.id} in stage {stage.name} has {stage.waf} WAF ACL attached."

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -15,12 +15,10 @@ class apigatewayv2_access_logging_enabled(Check):
report.status = "PASS"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} in stage {stage.name} has access logging enabled."
report.resource_id = api.name
report.resource_tags = api.tags
else:
report.status = "FAIL"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} in stage {stage.name} has access logging disabled."
report.resource_id = api.name
report.resource_tags = api.tags
findings.append(report)

return findings

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": ""

@@ -16,12 +16,10 @@ class apigatewayv2_authorizers_enabled(Check):
f"API Gateway V2 {api.name} ID {api.id} has authorizer configured."
)
report.resource_id = api.name
report.resource_tags = api.tags
else:
report.status = "FAIL"
report.status_extended = f"API Gateway V2 {api.name} ID {api.id} has not authorizer configured."
report.resource_id = api.name
report.resource_tags = api.tags
findings.append(report)

return findings

@@ -1,7 +1,5 @@
import threading
from typing import Optional

from pydantic import BaseModel
from dataclasses import dataclass

from prowler.lib.logger import logger
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
@@ -44,10 +42,9 @@ class ApiGatewayV2:
):
self.apis.append(
API(
id=apigw["ApiId"],
region=regional_client.region,
name=apigw["Name"],
tags=[apigw.get("Tags")],
apigw["ApiId"],
regional_client.region,
apigw["Name"],
)
)
except Exception as error:
@@ -80,9 +77,8 @@ class ApiGatewayV2:
logging = True
api.stages.append(
Stage(
name=stage["StageName"],
logging=logging,
tags=[stage.get("Tags")],
stage["StageName"],
logging,
)
)
except Exception as error:
@@ -91,16 +87,36 @@ class ApiGatewayV2:
)


class Stage(BaseModel):
@dataclass
class Stage:
name: str
logging: bool
tags: Optional[list] = []

def __init__(
self,
name,
logging,
):
self.name = name
self.logging = logging


class API(BaseModel):
@dataclass
class API:
id: str
region: str
name: str
authorizer: bool = False
stages: list[Stage] = []
tags: Optional[list] = []
authorizer: bool
stages: list[Stage]

def __init__(
self,
id,
region,
name,
):
self.id = id
self.region = region
self.name = name
self.authorizer = False
self.stages = []

@@ -28,6 +28,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

@@ -14,7 +14,6 @@ class appstream_fleet_default_internet_access_disabled(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags

if fleet.enable_default_internet_access:
report.status = "FAIL"

@@ -26,6 +26,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

@@ -17,7 +17,6 @@ class appstream_fleet_maximum_session_duration(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags

if fleet.max_user_duration_in_seconds < max_session_duration_seconds:
report.status = "PASS"

@@ -28,6 +28,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

@@ -17,7 +17,6 @@ class appstream_fleet_session_disconnect_timeout(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags

if fleet.disconnect_timeout_in_seconds <= max_disconnect_timeout_in_seconds:
report.status = "PASS"

@@ -28,6 +28,10 @@
}
},
"Categories": [],
"Tags": {
"Tag1Key": "value",
"Tag2Key": "value"
},
"DependsOn": [],
"RelatedTo": [],
"Notes": "Infrastructure Security"

@@ -19,7 +19,6 @@ class appstream_fleet_session_idle_disconnect_timeout(Check):
report.region = fleet.region
report.resource_id = fleet.name
report.resource_arn = fleet.arn
report.resource_tags = fleet.tags

if (
fleet.idle_disconnect_timeout_in_seconds

@@ -18,7 +18,6 @@ class AppStream:
self.regional_clients = generate_regional_clients(self.service, audit_info)
self.fleets = []
self.__threading_call__(self.__describe_fleets__)
self.__list_tags_for_resource__()

def __get_session__(self):
return self.session
@@ -66,20 +65,6 @@ class AppStream:
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)

def __list_tags_for_resource__(self):
logger.info("AppStream - List Tags...")
try:
for fleet in self.fleets:
regional_client = self.regional_clients[fleet.region]
response = regional_client.list_tags_for_resource(
ResourceArn=fleet.arn
)["Tags"]
fleet.tags = [response]
except Exception as error:
logger.error(
f"{regional_client.region} -- {error.__class__.__name__}[{error.__traceback__.tb_lineno}]: {error}"
)


class Fleet(BaseModel):
arn: str
@@ -89,4 +74,3 @@ class Fleet(BaseModel):
idle_disconnect_timeout_in_seconds: Optional[int]
enable_default_internet_access: bool
region: str
tags: Optional[list] = []

@@ -28,6 +28,10 @@
|
||||
"Categories": [
|
||||
"secrets"
|
||||
],
|
||||
"Tags": {
|
||||
"Tag1Key": "value",
|
||||
"Tag2Key": "value"
|
||||
},
|
||||
"DependsOn": [],
|
||||
"RelatedTo": [],
|
||||
"Notes": ""
|
||||
|
||||
@@ -1,6 +1,5 @@
|
||||
import threading
|
||||
|
||||
from pydantic import BaseModel
|
||||
from dataclasses import dataclass
|
||||
|
||||
from prowler.lib.logger import logger
|
||||
from prowler.lib.scan_filters.scan_filters import is_resource_filtered
|
||||
@@ -46,11 +45,11 @@ class AutoScaling:
             ):
                 self.launch_configurations.append(
                     LaunchConfiguration(
-                        arn=configuration["LaunchConfigurationARN"],
-                        name=configuration["LaunchConfigurationName"],
-                        user_data=configuration["UserData"],
-                        image_id=configuration["ImageId"],
-                        region=regional_client.region,
+                        configuration["LaunchConfigurationARN"],
+                        configuration["LaunchConfigurationName"],
+                        configuration["UserData"],
+                        configuration["ImageId"],
+                        regional_client.region,
                     )
                 )

@@ -60,9 +59,24 @@ class AutoScaling:
         )
 
 
-class LaunchConfiguration(BaseModel):
+@dataclass
+class LaunchConfiguration:
     arn: str
     name: str
     user_data: str
-    image_id: str
+    image_id: int
     region: str
+
+    def __init__(
+        self,
+        arn,
+        name,
+        user_data,
+        image_id,
+        region,
+    ):
+        self.arn = arn
+        self.name = name
+        self.image_id = image_id
+        self.user_data = user_data
+        self.region = region

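The hunk above swaps a pydantic `BaseModel` for a plain dataclass with a hand-written `__init__`. The practical difference: pydantic validates and coerces field types at construction time, while a dataclass stores whatever it is given and its type hints are documentation only. A dependency-free sketch using the stdlib `dataclass` (field values are illustrative; pydantic is left out so the example runs without third-party packages):

```python
from dataclasses import dataclass

# Dataclass equivalent of the LaunchConfiguration model in the diff.
# Note: type hints on a plain dataclass are NOT enforced at runtime --
# runtime validation is exactly what pydantic's BaseModel would add.

@dataclass
class LaunchConfiguration:
    arn: str
    name: str
    user_data: str
    image_id: str
    region: str


config = LaunchConfiguration(
    arn="arn:aws:autoscaling:us-east-1:123456789012:launchConfiguration:example",
    name="example-launch-config",
    user_data="",
    image_id="ami-12345678",
    region="us-east-1",
)
```

The `@dataclass` decorator also generates the `__init__` that the diff writes by hand, so the explicit constructor in the hunk is redundant boilerplate under either approach.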
@@ -26,6 +26,10 @@
     "Categories": [
         "forensics-ready"
     ],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -13,7 +13,6 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
 
             report.status = "FAIL"
             report.status_extended = (

@@ -22,31 +21,20 @@ class awslambda_function_invoke_api_operations_cloudtrail_logging_enabled(Check):
             lambda_recorded_cloudtrail = False
             for trail in cloudtrail_client.trails:
                 for data_event in trail.data_events:
-                    # classic event selectors
-                    if not data_event.is_advanced:
-                        if "DataResources" in data_event.event_selector:
-                            for resource in data_event.event_selector["DataResources"]:
-                                if resource["Type"] == "AWS::Lambda::Function" and (
-                                    function.arn in resource["Values"]
-                                    or "arn:aws:lambda" in resource["Values"]
-                                ):
-                                    lambda_recorded_cloudtrail = True
-                                    break
-                    elif data_event.is_advanced:
-                        for field_selector in data_event.event_selector[
-                            "FieldSelectors"
-                        ]:
+                    if "DataResources" in data_event.event_selector:
+                        for resource in data_event.event_selector["DataResources"]:
                             if (
-                                field_selector["Field"] == "resources.type"
-                                and "AWS::Lambda::Function" in field_selector["Equals"]
+                                resource["Type"] == "AWS::Lambda::Function"
+                                and function.arn in resource["Values"]
                             ):
                                 lambda_recorded_cloudtrail = True
                                 break
 
                 if lambda_recorded_cloudtrail:
                     break
             if lambda_recorded_cloudtrail:
                 report.status = "PASS"
-                report.status_extended = f"Lambda function {function.name} is recorded by CloudTrail trail {trail.name}"
+                report.status_extended = f"Lambda function {function.name} is recorded by CloudTrail {trail.name}"
                 break
         findings.append(report)

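The richer side of the hunk above matches a Lambda function against both kinds of CloudTrail event selectors: classic selectors carry a `DataResources` list, while advanced selectors carry `FieldSelectors`. A standalone sketch of the two matching rules, using plain dicts shaped like (a simplified view of) CloudTrail's `get_event_selectors` output — helper names are illustrative:

```python
def lambda_recorded_by_classic_selector(event_selector, function_arn):
    """Classic selectors: match the exact function ARN or the
    all-Lambda wildcard value 'arn:aws:lambda'."""
    for resource in event_selector.get("DataResources", []):
        if resource["Type"] == "AWS::Lambda::Function" and (
            function_arn in resource["Values"]
            or "arn:aws:lambda" in resource["Values"]
        ):
            return True
    return False


def lambda_recorded_by_advanced_selector(event_selector):
    """Advanced selectors: a FieldSelector pinning resources.type
    to AWS::Lambda::Function records every Lambda data event."""
    for field_selector in event_selector.get("FieldSelectors", []):
        if (
            field_selector["Field"] == "resources.type"
            and "AWS::Lambda::Function" in field_selector["Equals"]
        ):
            return True
    return False
```

The check sets PASS as soon as either rule fires for any trail, which is why the loops in the diff break out at the first match.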
@@ -26,6 +26,10 @@
     "Categories": [
         "secrets"
     ],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -17,7 +17,6 @@ class awslambda_function_no_secrets_in_code(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
 
             report.status = "PASS"
             report.status_extended = (

@@ -26,6 +26,10 @@
     "Categories": [
         "secrets"
     ],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -17,7 +17,6 @@ class awslambda_function_no_secrets_in_variables(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
 
             report.status = "PASS"
             report.status_extended = (

@@ -26,6 +26,10 @@
     "Categories": [
         "internet-exposed"
     ],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -10,7 +10,6 @@ class awslambda_function_not_publicly_accessible(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
 
             report.status = "PASS"
             report.status_extended = f"Lambda function {function.name} does not have a public resource-based policy"

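The check above passes when the function's resource-based policy grants no public access. In IAM terms, "public" usually means an `Allow` statement whose principal is the wildcard `"*"` (or `{"AWS": "*"}`). A minimal sketch of that test — simplified relative to Prowler's full policy parsing, which also handles conditions:

```python
def policy_is_public(policy_document):
    """Return True if any Allow statement grants access to everyone.

    Simplified: only inspects the Principal element; real-world checks
    also weigh Condition blocks that can narrow a wildcard principal.
    """
    statements = policy_document.get("Statement", [])
    if isinstance(statements, dict):  # a single statement may be a bare dict
        statements = [statements]
    for statement in statements:
        if statement.get("Effect") != "Allow":
            continue
        principal = statement.get("Principal", {})
        if principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        ):
            return True
    return False
```

A policy scoped to a specific account principal, e.g. `arn:aws:iam::123456789012:root`, is not flagged.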
@@ -24,6 +24,10 @@
         }
     },
     "Categories": [],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -10,7 +10,6 @@ class awslambda_function_url_cors_policy(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
             if function.url_config:
                 if "*" in function.url_config.cors_config.allow_origins:
                     report.status = "FAIL"

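The CORS check above reduces to one rule: a `"*"` entry in the function URL's allowed origins opens it to every website. A sketch of the decision, using the `AllowOrigins` key from the Lambda `Cors` configuration shape (the helper name and dict-based config are illustrative):

```python
def check_function_url_cors(url_config):
    """Return 'FAIL' when a function URL's CORS policy allows any origin,
    'PASS' when origins are restricted, None when there is no URL at all."""
    if url_config is None:
        return None  # no function URL configured -- nothing to check
    if "*" in url_config.get("AllowOrigins", []):
        return "FAIL"
    return "PASS"
```

Listing explicit origins such as `https://example.com` passes; note that the wildcard test is on whole entries, not substrings, matching the `in`-on-a-list semantics in the diff.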
@@ -24,6 +24,10 @@
         }
     },
     "Categories": [],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""

@@ -11,7 +11,6 @@ class awslambda_function_url_public(Check):
             report.region = function.region
             report.resource_id = function.name
             report.resource_arn = function.arn
-            report.resource_tags = function.tags
             if function.url_config:
                 if function.url_config.auth_type == AuthType.AWS_IAM:
                     report.status = "PASS"

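This check keys on the function URL's `AuthType`: `AWS_IAM` means callers must sign requests with IAM credentials, while `NONE` lets anyone on the internet invoke the URL. A sketch with string constants standing in for Prowler's `AuthType` enum:

```python
def check_function_url_auth(auth_type):
    """IAM-protected function URLs pass; AuthType NONE means the URL
    is publicly invokable, so the check fails."""
    return "PASS" if auth_type == "AWS_IAM" else "FAIL"
```

These are the only two values the Lambda function-URL API accepts for `AuthType`, so anything other than `AWS_IAM` is treated as public here.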
@@ -24,6 +24,10 @@
         }
     },
     "Categories": [],
+    "Tags": {
+        "Tag1Key": "value",
+        "Tag2Key": "value"
+    },
     "DependsOn": [],
     "RelatedTo": [],
     "Notes": ""